CN101115131A - Pixel space relativity based image syncretizing effect real-time estimating method and apparatus - Google Patents


Info

Publication number
CN101115131A
Authority
CN
China
Prior art keywords
image
evaluation
gray level
video
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CNA2006100408336A
Other languages
Chinese (zh)
Other versions
CN100571335C (en)
Inventor
柏连发
张闯
张毅
张保民
钱惟贤
顾国华
陈钱
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Science and Technology
Original Assignee
Nanjing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Science and Technology filed Critical Nanjing University of Science and Technology
Priority to CNB2006100408336A priority Critical patent/CN100571335C/en
Publication of CN101115131A publication Critical patent/CN101115131A/en
Application granted granted Critical
Publication of CN100571335C publication Critical patent/CN100571335C/en
Expired - Fee Related

Abstract

The invention discloses a real-time image fusion effect evaluation method and a device based on pixel spatial correlation. In the evaluation method, the video signals taking part in fusion and the fused video signal are sent through the video input end to an image fusion effect evaluation board; under the control of an evaluation control keyboard, the evaluation board completes acquisition of the video images, preprocessing of the video signals, image fusion calculation, computation of the gray level co-occurrence matrices and their standard deviations, display of the evaluation results and data storage, and the results are shown on the evaluation result display through the output end. The evaluation device comprises a video input end, an image fusion effect evaluation board, an evaluation control keyboard, a video output end and an evaluation result display connected in sequence. The invention can evaluate the improvement of the fused image relative to the source images from the angle of image definition, performs real-time evaluation of fused image quality, and gives evaluation results promptly as the scene changes.

Description

Image fusion effect real-time evaluation method and device based on pixel spatial correlation
Technical field
The invention relates to an image quality evaluation technology, in particular to a real-time evaluation method and a real-time evaluation device for an image fusion effect based on pixel space correlation.
Background art
Image fusion refers to the integration of information from two or more source images to obtain a more accurate, more comprehensive and more reliable image description of the same scene. Fusion processing aims at making up for the deficiencies of the individual spectral band image characteristics, optimizing image quality, improving image definition and thereby raising the probability of discovering and identifying a target. Existing image fusion effect evaluation methods mainly adopt indexes such as information entropy, cross entropy, average mutual information, root mean square error and peak signal to noise ratio, together with subjective evaluation based on the measured detection and identification probability of targets in the field. Information entropy gives the average information content of an image, but cannot explain the improvement of the fused image in definition; similarly, cross entropy, mutual information, root mean square error and peak signal to noise ratio only give the degree of approximation of the fused image to a reference (standard) image and cannot reflect its definition. Subjective evaluation requires at least 20 observers to ensure statistical significance, and its result is often influenced by the image type, the test environment and so on.
Regarding evaluation of the spatial resolution of fused images, V. Buntilov and T. Bretschneider of Nanyang Technological University proposed, in "A Fusion Evaluation Approach with Region Relating Objective Function for Multispectral Image Sharpening" (2005 IEEE International Geoscience and Remote Sensing Symposium, volume 4, pages 2830-2833), a fused image spatial quality evaluation based on distinguishing the region type expected for each pixel from the region type of the actual fusion result. The method divides each source image into three regions of different types according to a gradient threshold applied to its gradient image, and then assigns each pixel to one of nine expected region types according to the region types to which the pixels at the same position in the source images belong; the fused image is likewise divided into three regions of different types according to the gradient threshold of its gradient image, and each pixel is assigned to one of nine actual region types by comparison with the region types of the pixels at the same position in the source images; finally, the differences between the actual and expected region types of the pixels in the fusion result image are counted to evaluate the spatial quality of the fusion result. This method needs histogram equalization and edge detection when generating the gradient image, and the quality of the gradient image is affected by the choice of the edge detection operator; when classifying the pixels of the gradient image by gradient threshold, the threshold must also be adjusted for different image types, so practicability is poor; meanwhile, the method requires strong correlation between the source images participating in fusion.
Disclosure of the invention
The invention aims to provide a real-time image fusion effect evaluation method and device based on pixel spatial correlation, which can effectively evaluate the improvement condition of a fusion image in definition relative to a source image when a fusion system works in different scenes in real time.
The technical solution for realizing the purpose of the invention is as follows: a method for real-time evaluation of the image fusion effect based on pixel spatial correlation comprises the following steps:
1.1 The video signals participating in the evaluation of the image fusion effect are connected from the output ends of the video cameras to the video input end, completing input of the video signals with several channels input simultaneously, the input signals being standard video signals of the PAL or NTSC system;
1.2 The video signals passing through the video input end are each converted into digital image signals by a video acquisition part, namely the standard PAL or NTSC video signals are respectively converted into digital color difference signals in the ITU-R BT.656 output format and stored frame by frame;
1.3 The brightness signal in each frame of digital color difference signal is subjected to image preprocessing, namely a multi-frame accumulation average image smoothing method is adopted to improve the effect of the fused image;
1.4 Image fusion is carried out on the smoothed multi-channel brightness images, a gray level modulation method being adopted as the fusion method;
1.5 The gray level co-occurrence matrices of the smoothed multi-channel brightness image data and of the fused image are obtained by a fast solution method, the gray level co-occurrence matrix recording the number of occurrences of each gray level pair;
1.6 With any one gray level fixed, the standard deviation of the spread of the other gray level is calculated from the gray level co-occurrence matrix, and the standard deviations are averaged;
1.7 The standard deviation averages of the gray level co-occurrence matrices are input to an evaluation result display part, which calculates the values of the improvement factors and superposes the results on the fused image;
1.8 The fused image with the displayed results is adjusted to the standard ITU-R BT.656 format by the evaluation result display part, encoded into a standard video signal by a video encoder and output to the result display.
The image fusion effect real-time evaluation method based on pixel spatial correlation of the invention adopts a fast solution method of the gray level co-occurrence matrix, namely: first a 256 × 256 byte storage space is established for storing the gray level co-occurrence matrix; when the image file is ready, the gray value of the current pixel, starting at the upper left corner of the image, is taken out; then the position of the dual pixel is calculated and the gray value of the dual pixel is taken out; 1 is added to the value stored at the position of the gray level co-occurrence matrix corresponding to the gray value of the current pixel and the gray value of the dual pixel; it is then judged whether the position of the dual pixel has reached the end of the image file; if not, the position of the current pixel is shifted right and the calculation continues; if the end of the image file is reached, the operation ends.
A device for realizing the image fusion effect real-time evaluation method based on the pixel space correlation comprises a video input end and an evaluation result display, wherein the video input end is connected with a video acquisition module in an image fusion effect evaluation board, the video acquisition module is connected with a gray level co-occurrence matrix calculation module in the image fusion effect evaluation board, the gray level co-occurrence matrix calculation module is connected with a standard deviation calculation module, the standard deviation calculation module is connected with an evaluation result display module, the video acquisition module, the gray level co-occurrence matrix calculation module, the standard deviation calculation module and the evaluation result display module are respectively connected with a data storage module, the evaluation result display module is connected with a video output end outside the image fusion effect evaluation board, and the video output end is connected with the evaluation result display; the evaluation control keyboard is respectively connected with the video acquisition module, the gray level co-occurrence matrix solving module and the evaluation result display module in the image fusion effect evaluation board.
The invention relates to an image fusion effect real-time evaluation device based on pixel space correlation.A pre-processing module and a fusion module are added in an image fusion effect evaluation board, the image pre-processing module is respectively connected with a video acquisition module and a fusion module, the fusion module is connected with a gray level co-occurrence matrix solving module, and the image pre-processing module and the fusion module are respectively connected with a data storage module and an evaluation control keyboard.
In the image fusion effect evaluation board of the image fusion effect real-time evaluation device based on pixel spatial correlation, four video decoders are respectively connected to four video ports of a digital signal processing chip and to the control bus of the digital signal processing chip, and the extended data bus of the digital signal processing chip is respectively connected with the serial port controller, the SDRAM and the FLASH; the digital signal processing chip is provided with a CPU which carries out the image preprocessing, fusion calculation, gray level co-occurrence matrix calculation, standard deviation calculation and evaluation result display data calculation; data exchange between the CPU and the data storage module is completed by the external memory interface data bus, and the evaluation control keyboard exchanges data with the image fusion effect evaluation board through the serial port.
Compared with the prior art, the invention has the following remarkable advantages: (1) the degree of improvement of the fused image relative to the source images can be evaluated from the perspective of image definition; (2) fused image quality can be evaluated in real time, with evaluation results given promptly as the scene changes; (3) fused images can be evaluated for video images of different wave bands participating in fusion, without restriction on the correlation between the source images; (4) the evaluation algorithm is efficient in hardware implementation and needs no reference standard image.
Description of the figures
FIG. 1 is a flow chart of a fast gray level co-occurrence matrix solving method in the image fusion effect real-time evaluation method based on pixel spatial correlation.
Fig. 2 is a schematic structural diagram of the image fusion effect real-time evaluation device based on the pixel spatial correlation according to the present invention.
Fig. 3 is a structural diagram of an example of an image fusion effect evaluation board in the image fusion effect real-time evaluation device based on pixel spatial correlation according to the present invention.
FIG. 4 is the low-light-level image input when the fusion effect of the low-light-level image and the ultraviolet image is evaluated by the image fusion effect real-time evaluation device based on pixel spatial correlation of the invention.
FIG. 5 is an ultraviolet image input when the fusion effect evaluation is performed on the low-light-level image and the ultraviolet image by using the image fusion effect real-time evaluation device based on the pixel space correlation of the invention.
FIG. 6 is a fused image obtained when the fusion effect of the low-light level image and the ultraviolet image is evaluated by using the image fusion effect real-time evaluation device based on the pixel space correlation.
FIG. 7 is an evaluation result obtained when the fusion effect of the low-light-level image and the ultraviolet image is evaluated by using the image fusion effect real-time evaluation device based on the pixel space correlation.
Detailed description of the preferred embodiments
The present invention will be described in further detail with reference to the accompanying drawings.
The invention discloses a real-time evaluation method for the image fusion effect based on pixel spatial correlation, which comprises the following steps:
1.1 The video signals participating in the evaluation of the image fusion effect are connected from the output ends of the video cameras to the video input end, completing input of the video signals with several channels input simultaneously; for example, input video signal 1 and input video signal 2 are the low-light-level signal and the ultraviolet signal shown in fig. 4 and fig. 5, both standard video signals of the PAL or NTSC system.
1.2 the video signals through the video input end are converted into digital image signals by the video collecting part, namely, the standard PAL system or NTSC system video signals are converted into digital color difference signals respectively, the output format is ITU-R BT.656, and the digital color difference signals are stored according to frames.
1.3 The brightness signal in each frame of digital color difference signal is subjected to image preprocessing, namely a multi-frame accumulation average image smoothing method is adopted to improve the effect of the fused image.
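As an illustrative sketch of this multi-frame accumulation average smoothing, the following assumes 8-bit luminance frames and a caller-chosen frame count, neither of which is fixed by the patent:

```python
import numpy as np

def accumulate_average(frames):
    """Multi-frame accumulation average: smooth temporal noise by averaging
    N successive luminance frames of identical shape."""
    acc = np.zeros(frames[0].shape, dtype=np.float64)
    for frame in frames:
        acc += frame                  # accumulate in float64 to avoid uint8 overflow
    return (acc / len(frames)).astype(np.uint8)
```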
1.4 Image fusion is carried out on the smoothed multi-channel brightness images, a gray level modulation method being adopted as the fusion method; the fused image is shown in fig. 6.
1.5 The gray level co-occurrence matrix of each channel of smoothed brightness image data and of the fused image is obtained by the fast solution method. The gray level co-occurrence matrix records, for pairs of adjacent pixels (m, n) and (m, n + l) at a fixed interval l, the number of occurrences of each gray level pair [G_1, G_2] over the whole image; the gray level co-occurrence matrix is represented as H_2D:

H_2D = f(G_1, G_2)
In general, the two-dimensional histogram of an image characterized by the gray level co-occurrence matrix is distributed roughly along the 45-degree diagonal of the (G_1, G_2) plane. It clearly and quantitatively represents the degree of gray level correlation between adjacent pixels in the image, and this spatial correlation of adjacent pixel gray levels is precisely a statistical representation of image definition: an image with weak correlation (good definition) has a wide spread, while an image with strong correlation (poor definition) has a narrow spread.
With reference to fig. 1, the gray level co-occurrence matrix of an image is obtained by the fast solution method, namely: first a 256 × 256 byte storage space is established to store the gray level co-occurrence matrix; when the image file is ready, the gray value of the current pixel, starting at the position of the upper left corner of the image, is taken out; then the position of the dual pixel is calculated and its gray value taken out; 1 is added to the value stored at the position of the gray level co-occurrence matrix corresponding to the gray value of the current pixel and the gray value of the dual pixel; and it is judged whether the position of the dual pixel has reached the end of the image file. If not, the position of the current pixel is shifted right and the calculation continues; if the end of the image file is reached, the operation ends.
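A minimal sketch of this fast solution method for an 8-bit image: scanning row by row with the dual pixel taken l columns to the right is an assumed concrete reading of "shift right", and skipping the last l pixels of each row is an assumed boundary rule:

```python
import numpy as np

def fast_glcm(img, l=1):
    """Accumulate the gray level co-occurrence matrix H_2D of an 8-bit image
    over pixel pairs (m, n) and (m, n + l)."""
    h = np.zeros((256, 256), dtype=np.uint32)   # the 256 x 256 storage space
    rows, cols = img.shape
    for m in range(rows):
        for n in range(cols - l):
            g1 = img[m, n]                      # gray value of the current pixel
            g2 = img[m, n + l]                  # gray value of its dual pixel
            h[g1, g2] += 1                      # add 1 at position (G1, G2)
    return h
```

The double loop can equivalently be vectorized as np.add.at(h, (img[:, :-l].ravel(), img[:, l:].ravel()), 1), with identical counts.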
1.6 With any one gray level fixed, the standard deviation of the spread of the other gray level is calculated from the gray level co-occurrence matrix, and the standard deviations are averaged. That is, H_2D is obtained for each pixel pair distance l (l = 1, 2, ...); then, in the gray level pair [G_1, G_2] of the two-dimensional histogram of the image represented by the gray level co-occurrence matrix, with any gray level G_1 fixed, the standard deviation σ of the spread of the other gray level G_2 is

σ(G_1) = [ Σ_{G_2} p(G_2 | G_1) · (G_2 − μ(G_1))² ]^(1/2)

wherein μ(G_1) = Σ_{G_2} p(G_2 | G_1) · G_2 is the mean of G_2 for the fixed G_1, and p(G_2 | G_1) = H_2D(G_1, G_2) / Σ_{G_2} H_2D(G_1, G_2) is the conditional frequency of G_2 given G_1.
The result (for example the mean of the standard deviations so obtained) is a function that can be used to characterize the correlation between adjacent pixels of the image:
σ=σ(l)
where l represents the distance between pairs of pixels.
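A sketch of this statistic under the reconstruction above: each occupied row G_1 of the matrix yields a conditional distribution of G_2 whose standard deviation is taken; restricting the average to gray levels that actually occur is an assumption:

```python
import numpy as np

def glcm_spread(h):
    """Average, over all occupied gray levels G1, of the standard deviation
    of the G2 spread recorded in the co-occurrence matrix h."""
    g2 = np.arange(256, dtype=np.float64)
    sigmas = []
    for g1 in range(256):
        counts = h[g1].astype(np.float64)
        total = counts.sum()
        if total == 0:
            continue                     # this gray level never occurs
        p = counts / total               # p(G2 | G1)
        mu = (p * g2).sum()              # conditional mean of G2
        sigmas.append(np.sqrt((p * (g2 - mu) ** 2).sum()))
    return float(np.mean(sigmas))
```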
When the device is used for evaluating the fusion effect of the low-light-level image and the ultraviolet image, the average value of the standard deviation of the low-light-level image is 11.297, the average value of the standard deviation of the ultraviolet image is 13.8, and the average value of the standard deviation of the fusion image is 21.84.
1.7 The standard deviation averages of the gray level co-occurrence matrices are input to the evaluation result display part, which calculates the values of the improvement factors and superposes the results on the fused image.
The fusion improvement factor D(l) is defined as:

D(l) = σ_F(l) / σ_S(l)

where σ_F(l) is the correlation function of the fused image and σ_S(l) is the correlation function of a source image. The fusion improvement factors obtained by the above formula are: fusion / video signal 1 = 1.933; fusion / video signal 2 = 1.583.
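The reported factors follow directly from the quoted standard deviation means, as this short check shows:

```python
sigma_fused = 21.84                         # fused image (quoted above)
sources = {"video signal 1": 11.297,        # low-light-level image
           "video signal 2": 13.8}          # ultraviolet image
for name, sigma_src in sources.items():
    print(f"fusion / {name}: D = {sigma_fused / sigma_src:.3f}")
# prints D = 1.933 and D = 1.583, matching the values in the text
```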
The evaluation algorithm for the image fusion effect adopts the pixel spatial correlation method, which uses statistical indexes of the two-dimensional histogram of the gray level co-occurrence matrix to judge the level of quality improvement of the fused image.
1.8 The fused image with the displayed results is adjusted to the standard ITU-R BT.656 format by the evaluation result display part, encoded into a standard video signal by the video encoder and output to the result display; the displayed result is shown in fig. 7.
With reference to fig. 2, the device for implementing the image fusion effect real-time evaluation method based on pixel spatial correlation of the invention comprises a video input end and an evaluation result display. When a fused video is present among the input video signals, the video input end is connected to the video acquisition module in the image fusion effect evaluation board, and the image fusion effect evaluation board completes acquisition of the video images, calculation of the gray level co-occurrence matrices, calculation of the standard deviations, display of the evaluation results and data storage. That is, within the image fusion effect evaluation board, the video acquisition module is connected to the gray level co-occurrence matrix calculation module, the gray level co-occurrence matrix calculation module to the standard deviation calculation module, and the standard deviation calculation module to the evaluation result display module; the video acquisition module, the gray level co-occurrence matrix calculation module, the standard deviation calculation module and the evaluation result display module are each connected to the data storage module; the evaluation result display module is connected to the video output end outside the image fusion effect evaluation board, and the video output end is connected to the evaluation result display, which outputs the fused image together with the improvement factors relative to the source images. The evaluation control keyboard is respectively connected with the video acquisition module, the gray level co-occurrence matrix calculation module and the evaluation result display module in the image fusion effect evaluation board; it controls the operation of the image preprocessing and fusion modules according to the input condition of the video signals, the parameter setting of the gray level co-occurrence matrix calculation, and the timing of the evaluation result display.
When no fused video is present among the input video signals, under the action of the evaluation control keyboard the image fusion effect evaluation board completes video image acquisition, video signal preprocessing, image fusion calculation, gray level co-occurrence matrix calculation, standard deviation calculation, evaluation result display and data storage. The fused video signal of the participating video signals is then obtained by the fusion module in the image fusion effect evaluation board: an image preprocessing module and a fusion module are added to the board, the image preprocessing module is connected to the video acquisition module and to the fusion module respectively, the fusion module is connected to the gray level co-occurrence matrix calculation module, and the image preprocessing module and the fusion module are each connected to the data storage module and the evaluation control keyboard.
In conjunction with fig. 2 and 3, the image fusion effect evaluation board is designed around a digital signal processing chip (for example the TMS320DM642). The video image acquisition module adopts four video decoders (for example the SAA7111) to convert NTSC and PAL video signals into digital color difference signals with output format ITU-R BT.656. The video decoders are configured through the control bus of the digital signal processing chip; the data storage module adopts a 4M × 64-bit SDRAM, and the program memory a 4M × 8-bit FLASH. Video data output in the evaluation result display module uses a video encoder (for example the THS8134), which supports PAL and NTSC encoding and takes BT.656-format digital video as input; the video encoder is likewise configured through the control bus. Image preprocessing, fusion calculation, gray level co-occurrence matrix calculation, standard deviation calculation and evaluation result display data calculation are completed by the CPU of the digital signal processing chip, and data exchange between the CPU and the data storage module is completed by the external memory interface data bus. The evaluation control keyboard exchanges data with the image fusion effect evaluation board through the serial port. The working process of each part is as follows:
1. After the parts of the device are powered on, the digital signal processing chip first initializes, including program boot and address allocation, and configuration of the video decoders, the video encoder and so on over the control bus;
2. The decoded digital image signals enter through the video ports of the digital signal processing chip and are stored in the SDRAM under control of the data bus;
3. The digital signal processing chip reads the input digital signals from the SDRAM frame by frame, calculates the gray level co-occurrence matrix of each frame image according to the input from the evaluation control keyboard, and stores the calculation results in the SDRAM;
4. The digital signal processing chip calculates the standard deviation values of the gray level co-occurrence matrix results of each frame image and integrates the results into the display digital video signal;
5. In response to the display signal of the evaluation control keyboard, the video encoder encodes the evaluation result digital video signal together with the current fusion result signal, ready to output the display result.
The working flow of the image fusion effect real-time evaluation device based on pixel spatial correlation is detailed as follows: the video signals participating in fusion and the video signal of the fusion result are connected through the video input end to the video image acquisition module of the image fusion effect evaluation board, converted into digital image signals and stored frame by frame in the data storage module; the gray level co-occurrence matrix calculation module computes the gray level statistics of the digital image signals in the data storage module according to the pixel pair distance set by the control keyboard and stores the results in a specific area of the data storage module; the standard deviation calculation module reads the gray level co-occurrence matrix results from the data storage module, calculates, for every fixed gray level G_1, the standard deviation of the spread of the other gray level G_2, and stores the results in a specific area of the data storage module; the evaluation result display module determines the data to be displayed according to the control keyboard, converts the data and the fused image into video signals suitable for display, and sends them through the output end to the evaluation result display to output the evaluation results. When no fused video signal is present among the input video signals, under the control of the evaluation control keyboard a fusion algorithm in the device is selected to compute the fused video signal, which replaces the input fused video signal for the subsequent processing.
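Taken together, one evaluation pass per frame reduces to the sketch below, which reuses the fast_glcm and glcm_spread sketches above; the frame inputs and the fixed pair distance l stand in for the board's acquisition module and keyboard settings and are assumptions:

```python
def evaluate_frame(source_frames, fused_frame, l=1):
    """Return the fusion improvement factor D(l) of the fused frame relative
    to each source frame, using the sketches defined above."""
    sigma_f = glcm_spread(fast_glcm(fused_frame, l))
    return [sigma_f / glcm_spread(fast_glcm(src, l)) for src in source_frames]
```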

Claims (5)

1. A real-time evaluation method for image fusion effect based on pixel spatial correlation comprises the following steps:
1.1 the video signals participating in the evaluation of the image fusion effect are connected from the output ends of the video cameras to the video input end, completing input of the video signals with several channels input simultaneously, the input signals being standard video signals of the PAL or NTSC system;
1.2 the video signals passing through the video input end are each converted into digital image signals by a video acquisition part, namely the standard PAL or NTSC video signals are respectively converted into digital color difference signals in the ITU-R BT.656 output format and stored frame by frame;
1.3 the brightness signal in each frame of digital color difference signal is subjected to image preprocessing, namely a multi-frame accumulation average image smoothing method is adopted to improve the effect of the fused image;
1.4 image fusion is carried out on the smoothed multi-channel brightness images, a gray level modulation method being adopted as the fusion method;
1.5 the gray level co-occurrence matrices of the smoothed multi-channel brightness image data and of the fused image are obtained by a fast solution method, the gray level co-occurrence matrix recording the number of occurrences of each gray level pair;
1.6 with any one gray level fixed, the standard deviation of the spread of the other gray level is calculated from the gray level co-occurrence matrix, and the standard deviations are averaged;
1.7 the standard deviation averages of the gray level co-occurrence matrices are input to an evaluation result display part, which calculates the values of the improvement factors and superposes the results on the fused image;
1.8 the fused image with the displayed results is adjusted to the standard ITU-R BT.656 format by the evaluation result display part, encoded into a standard video signal by a video encoder and output to the result display.
2. The method for evaluating the image fusion effect based on the pixel spatial correlation according to claim 1, wherein: the gray level co-occurrence matrix is obtained by a fast solution method, namely first a 256 × 256 byte storage space is established for storing the gray level co-occurrence matrix; when the image file is ready, the gray value of the current pixel, starting at the upper left corner position of the image, is taken out, then the position of the dual pixel is calculated and the gray value of the dual pixel is taken out, and 1 is added to the value stored at the position of the gray level co-occurrence matrix corresponding to the gray value of the current pixel and the gray value of the dual pixel; it is then judged whether the position of the dual pixel has reached the end of the image file; if not, the position of the current pixel is shifted right and the calculation continues; if the end of the image file is reached, the operation ends.
3. An apparatus for implementing the image fusion effect real-time evaluation method based on pixel spatial correlation according to claim 1 or 2, comprising a video input end and an evaluation result display, wherein: the video input end is connected with a video acquisition module in the image fusion effect evaluation board, the video acquisition module is connected with a gray level co-occurrence matrix solving module in the image fusion effect evaluation board, the gray level co-occurrence matrix solving module is connected with a standard deviation solving module, the standard deviation solving module is connected with an evaluation result display module, the video acquisition module, the gray level co-occurrence matrix solving module, the standard deviation solving module and the evaluation result display module are respectively connected with a data storage module, the evaluation result display module is connected with a video output end outside the image fusion effect evaluation board, and the video output end is connected with an evaluation result display; the evaluation control keyboard is respectively connected with the video acquisition module, the gray level co-occurrence matrix solving module and the evaluation result display module in the image fusion effect evaluation board.
4. The device for evaluating the image fusion effect based on the pixel spatial correlation according to claim 3, wherein: an image preprocessing module and a fusion module are added in the image fusion effect evaluation board, the image preprocessing module is respectively connected with a video acquisition module and the fusion module, the fusion module is connected with a gray level co-occurrence matrix solving module, and the image preprocessing module and the fusion module are respectively connected with a data storage module and an evaluation control keyboard.
5. The device for evaluating the image fusion effect based on the pixel spatial correlation according to claim 3, wherein: in the image fusion effect evaluation board, four video decoders are respectively connected to four video ports of a digital signal processing chip and to the control bus of the digital signal processing chip, and the extended data bus of the digital signal processing chip is respectively connected with the serial port controller, the SDRAM and the FLASH; the digital signal processing chip is provided with a CPU which carries out the image preprocessing, fusion calculation, gray level co-occurrence matrix calculation, standard deviation calculation and evaluation result display data calculation; data exchange between the CPU and the data storage module is completed by the external memory interface data bus, and the evaluation control keyboard exchanges data with the image fusion effect evaluation board through the serial port.
CNB2006100408336A 2006-07-28 2006-07-28 Image syncretizing effect real-time estimating method and device based on pixel space relativity Expired - Fee Related CN100571335C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2006100408336A CN100571335C (en) 2006-07-28 2006-07-28 Image syncretizing effect real-time estimating method and device based on pixel space relativity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CNB2006100408336A CN100571335C (en) 2006-07-28 2006-07-28 Image syncretizing effect real-time estimating method and device based on pixel space relativity

Publications (2)

Publication Number Publication Date
CN101115131A true CN101115131A (en) 2008-01-30
CN100571335C CN100571335C (en) 2009-12-16

Family

ID=39023212

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2006100408336A Expired - Fee Related CN100571335C (en) 2006-07-28 2006-07-28 Image syncretizing effect real-time estimating method and device based on pixel space relativity

Country Status (1)

Country Link
CN (1) CN100571335C (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101334893B (en) * 2008-08-01 2011-05-04 天津大学 Fused image quality integrated evaluating method based on fuzzy neural network
CN102169576A (en) * 2011-04-02 2011-08-31 北京理工大学 Quantified evaluation method of image mosaic algorithms
CN102169576B (en) * 2011-04-02 2013-01-16 北京理工大学 Quantified evaluation method of image mosaic algorithms
CN103186894B (en) * 2013-03-22 2015-10-07 南京信息工程大学 A kind of multi-focus image fusing method of self-adaptation piecemeal
CN103186894A (en) * 2013-03-22 2013-07-03 南京信息工程大学 Multi-focus image fusion method for self-adaptive partitioning
CN103905815B (en) * 2014-03-19 2016-01-13 西安电子科技大学 Based on the video fusion method of evaluating performance of Higher-order Singular value decomposition
CN103905815A (en) * 2014-03-19 2014-07-02 西安电子科技大学 Video fusion performance evaluating method based on high-order singular value decomposition
US11582814B2 (en) 2014-06-25 2023-02-14 Pismo Labs Technology Limited Methods and systems for transmitting and receiving data through one or more tunnels for packets satisfying one or more conditions
CN108269277B (en) * 2016-12-30 2022-03-08 清华大学 Method and system for quality evaluation of radiation images
CN108269277A (en) * 2016-12-30 2018-07-10 清华大学 For carrying out the method and system of quality evaluation to radiation image
CN110211085A (en) * 2018-02-28 2019-09-06 清华大学 A kind of Quality Measures for Image Fusion and system
CN110211085B (en) * 2018-02-28 2021-04-27 清华大学 Image fusion quality evaluation method and system
CN110969591A (en) * 2018-09-28 2020-04-07 中国科学院合肥物质科学研究院 Experimental device and method for evaluating background image fusion degree of moving object
CN110969591B (en) * 2018-09-28 2023-04-07 中国科学院合肥物质科学研究院 Experimental device and method for evaluating moving object background image fusion degree
CN109753217A (en) * 2018-12-11 2019-05-14 航天信息股份有限公司 Dynamic keyboard operating method, device, storage medium and electronic equipment
CN109637479A (en) * 2019-01-11 2019-04-16 北京德为智慧科技有限公司 A kind of image processing method, device and display
CN111027589A (en) * 2019-11-07 2020-04-17 成都傅立叶电子科技有限公司 Multi-division target detection algorithm evaluation system and method
CN111027589B (en) * 2019-11-07 2023-04-18 成都傅立叶电子科技有限公司 Multi-division target detection algorithm evaluation system and method
CN111107267A (en) * 2019-12-30 2020-05-05 广州华多网络科技有限公司 Image processing method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN100571335C (en) 2009-12-16

Similar Documents

Publication Publication Date Title
CN101115131A (en) Pixel space relativity based image syncretizing effect real-time estimating method and apparatus
WO2023134791A2 (en) Environmental security engineering monitoring data management method and system
CN111027415B (en) Vehicle detection method based on polarization image
CN101883291A (en) Method for drawing viewpoints by reinforcing interested region
CN102088539B (en) Method and system for evaluating pre-shot picture quality
CN112703532B (en) Image processing method, device, equipment and storage medium
CN112200807B (en) Video quality diagnosis method and system
CN102223545B (en) Rapid multi-view video color correction method
CN113901928A (en) Target detection method based on dynamic super-resolution, and power transmission line component detection method and system
CN113076953A (en) Black car detection method, system, device and storage medium
CN111541886A (en) Vision enhancement system applied to muddy underwater
CN116781881A (en) Real-time three-dimensional vision imaging system based on microscope
CN111583315A (en) Novel visible light image and infrared image registration method and device
Zhao et al. Radiance-based color calibration for image-based modeling with multiple cameras
CN105430397A (en) 3D (three-dimensional) image experience quality prediction method and apparatus
CN114018214A (en) Marker binocular sub-pixel distance measurement method based on hardware acceleration system
Li (Retracted) Infrared image filtering and enhancement processing method based upon image processing technology
CN114022820A (en) Intelligent beacon light quality detection method based on machine vision
CN114549386A (en) Multi-exposure image fusion method based on self-adaptive illumination consistency
Kumar et al. Texture feature extraction to colorize gray images
CN107316040A (en) A kind of color of image spatial transform method of illumination invariant
CN111928944A (en) Laser ray detection method, device and system
CN111862184A (en) Light field camera depth estimation system and method based on polar image color difference
CN101751664B (en) Generating system and generating method for three-dimensional depth information
CN111145219A (en) Efficient video moving target detection method based on Codebook principle

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20091216

Termination date: 20140728

EXPY Termination of patent right or utility model