CN108416768B - Binary-based foreground image similarity evaluation method - Google Patents


Info

Publication number
CN108416768B
CN108416768B (application CN201810171102.8A)
Authority
CN
China
Prior art keywords: similarity, matrix, foreground image, stretching, binary
Prior art date
Legal status
Active
Application number
CN201810171102.8A
Other languages
Chinese (zh)
Other versions
CN108416768A (en)
Inventor
Deng-Ping Fan
Ming-Ming Cheng
Yang Cao
Yu-Huan Wu
Bo Ren
Current Assignee
Nankai University
Original Assignee
Nankai University
Priority date
Filing date
Publication date
Application filed by Nankai University
Priority to CN201810171102.8A
Publication of CN108416768A
Application granted
Publication of CN108416768B
Legal status: Active
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection

Abstract

The invention discloses a binary-based foreground image similarity evaluation method, belonging to the technical field of image processing and comprising the following steps: a. computing an alignment matrix: the foreground image mean is subtracted from the value of each pixel in the foreground image to obtain an alignment matrix; b. computing a similarity matrix: similarity is measured between the predicted foreground image and the manually annotated ground-truth foreground image; the alignment matrices of the predicted and ground-truth foreground images are computed, and the element-wise product of the two matrices is taken as the similarity matrix; c. matrix normalization: the elements of the similarity matrix are normalized one by one so that every value in the matrix lies between -1 and 1; d. stretching the matrix elements: a nonlinear stretching is applied to the normalized similarity matrix values; e. computing the similarity: the stretched similarity matrix is averaged to obtain the final foreground image similarity. The method yields more accurate foreground image similarity evaluation results.

Description

Binary-based foreground image similarity evaluation method
Technical Field
The invention belongs to the technical field of image processing and particularly relates to a binary-based foreground image similarity evaluation method.
Background
In many important application fields such as image segmentation, object detection, object recognition, foreground extraction and salient object detection, evaluating the similarity of foreground images is a very important problem. In general, the predicted foreground map is a binary map produced by a foreground detection model; to measure its similarity, the predicted foreground map is compared with a manually annotated ground-truth foreground map, so as to judge the foreground extraction capability of the foreground map prediction model. Conventional binary foreground image similarity evaluation methods are pixel-based, such as the Jaccard index proposed in "Étude comparative de la distribution florale dans une portion des Alpes et des Jura", Bull Soc Vaudoise Sci Nat, 37:547-. Margolin et al. also proposed the Fbw index in "How to Evaluate Foreground Maps", CVPR, 2014. However, these methods ignore structural similarity in the foreground map. In 2017, Deng-Ping Fan et al. proposed "Structure-measure: A New Way to Evaluate Foreground Maps", ICCV 2017, which focuses on the evaluation of non-binary foreground map similarity and, compared with traditional evaluation methods, greatly improves evaluation reliability. For the binary foreground image similarity evaluation problem, however, the evaluation of binary foreground map similarity is substantially different from the evaluation of non-binary foreground map similarity. In a non-binary foreground map, each element takes a continuous value between 0 and 1 that represents the probability of that element being foreground; therefore, when measuring similarity, the Structure-measure's consideration of luminance and contrast is effective. A binary foreground map, in contrast, takes only the discrete values 0 and 1, which is completely different from the non-binary case, and some of the non-binary properties assumed by the Structure-measure do not hold in the binary case, so the Structure-measure cannot be directly applied to the evaluation of binary foreground image similarity. Analysis shows that both the conventional indices and this latest index either ignore image-level information or treat pixel-level information separately from image-level information. Studies in visual physiology show that the human eye perceives similarity locally and globally at the same time. By neglecting this characteristic, current methods cannot give a reasonable evaluation of foreground map similarity, which can result in a foreground map of high similarity being erroneously given a low score.
Disclosure of Invention
The technical problem to be solved by the invention is that existing similarity evaluation methods cannot take pixel-level and image-level similarity into account at the same time; the invention therefore provides a method that considers both levels simultaneously. The method first computes the difference between an input foreground image and the mean of that foreground image to obtain an alignment matrix. It then computes the element-wise product of the alignment matrices of the predicted foreground image and the ground-truth foreground image to obtain a similarity matrix, normalizes and stretches the similarity matrix, and finally takes the mean of the matrix to obtain the similarity of the predicted foreground image.
The binary-based foreground image similarity evaluation method disclosed by the invention comprises the following steps (a symbolic summary is given immediately after this list):
a. computing the alignment matrix: the foreground image mean is subtracted from the value of each pixel in the foreground image to obtain an alignment matrix; this combines local information with global information;
b. computing the similarity matrix: similarity is measured between the predicted foreground image and the manually annotated ground-truth foreground image; the alignment matrices of the predicted and ground-truth foreground images are computed, and the element-wise product of the two matrices is taken as the similarity matrix;
c. matrix normalization: the elements of the similarity matrix are normalized one by one so that every element value lies between -1 and 1, where -1 represents complete dissimilarity and 1 represents complete similarity, i.e. the highest degree of similarity;
d. stretching the matrix elements: a nonlinear stretching is applied to the normalized similarity matrix values;
e. computing the similarity: the stretched similarity matrix is averaged to obtain the final foreground image similarity.
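Written symbolically (the notation below is ours rather than the patent's: P and G denote the predicted and ground-truth binary maps, mu_P and mu_G their image-level means, and f the stretching function of step d), the five steps amount to something of the form:

```latex
% step a: alignment matrices (pixel value minus image-level mean)
\varphi_P = P - \mu_P, \qquad \varphi_G = G - \mu_G

% steps b--c: element-wise product of the alignment matrices,
% normalized so that every entry x_{ij} lies in [-1, 1]
X = \operatorname{normalize}\left(\varphi_P \circ \varphi_G\right), \qquad x_{ij} \in [-1, 1]

% steps d--e: nonlinear stretch of each entry, then the mean over the m-by-n matrix
S = \frac{1}{m\,n} \sum_{i=1}^{n} \sum_{j=1}^{m} f\left(x_{ij}\right)
```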
The invention has the following beneficial effects: the method obtains the foreground image similarity without complex calculation, and compared with other current evaluation methods its performance is improved by 9%-19% in practical application scenarios, so that the evaluation result of binary foreground image similarity is more reliable.
Drawings
The invention is described in further detail below with reference to the following figures and detailed description:
Fig. 1 is a flow chart of the binary-based foreground image similarity evaluation method.
Fig. 2 is a schematic diagram of the binary-based foreground image similarity evaluation method.
Detailed Description
Referring to Fig. 1, a flow chart of the binary-based foreground image similarity evaluation method is shown; the steps in the flow chart are as follows:
a. and inputting a prediction foreground image and a real foreground image, wherein the prediction foreground image is generally a result detected by a foreground detection model, and the real foreground image is a real foreground image manually marked by a human.
b. Subtract from the predicted foreground image and from the ground-truth foreground image their respective foreground image means to obtain 2 alignment matrices. The mean computed here is an image-level mean, so image-level statistical information is obtained. The alignment matrix obtained by subtracting the mean from the foreground map describes the difference between local pixels and the image level, and can capture local and global information simultaneously.
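A minimal sketch of this step, assuming the two binary maps are stored as NumPy arrays of zeros and ones (the function and variable names below are illustrative, not taken from the patent):

```python
import numpy as np

def alignment_matrix(binary_map: np.ndarray) -> np.ndarray:
    """Subtract the image-level mean from every pixel of a binary foreground map."""
    binary_map = binary_map.astype(np.float64)
    return binary_map - binary_map.mean()

# phi_gt = alignment_matrix(gt)  # alignment matrix of the ground-truth map
# phi_fm = alignment_matrix(fm)  # alignment matrix of the predicted map
```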
c. Multiply the 2 alignment matrices obtained in the previous step element by element to obtain the similarity matrix. The larger an element of the similarity matrix, the higher the similarity between the predicted foreground image and the ground-truth foreground image at the corresponding position, and vice versa.
d. After the similarity matrix is normalized, its elements lie in [-1, 1], where -1 indicates that the corresponding positions are completely dissimilar and 1 indicates that they are completely similar.
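Continuing the sketch above: the patent text does not reproduce the exact normalization formula, so the snippet below uses one plausible per-element normalization of the product of the two alignment matrices, 2ab / (a^2 + b^2), which always lies in [-1, 1]; both this form and the small stabilizing epsilon are assumptions rather than the claimed formula.

```python
def normalized_similarity_matrix(phi_gt: np.ndarray, phi_fm: np.ndarray,
                                 eps: float = 1e-8) -> np.ndarray:
    """Element-wise product of the two alignment matrices, normalized into [-1, 1]."""
    product = phi_gt * phi_fm                                 # step c: similarity matrix
    return 2.0 * product / (phi_gt ** 2 + phi_fm ** 2 + eps)  # step d: per-element normalization
```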
e. When the predicted foreground image is already very close to the ground-truth foreground image, for example with 90% similarity, raising the similarity to 100% requires a further 10% improvement; in another case, when the similarity between the predicted and ground-truth foreground images is 0, a 10% increase raises the similarity to 10%. Both are increases of 10%, but going from 90% to 100% is far more difficult than going from 0% to 10%. Based on this idea, the method then applies a nonlinear stretching to each element of the similarity matrix. Let x_ij denote the element in row i and column j of the normalized similarity matrix X; the stretching applies the following formula to each element x_ij of the similarity matrix:
[stretching formula, reproduced only as an image in the original publication]
where f(x) is the stretching function. In this way, positions that are more similar receive higher weight.
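Since the explicit stretching formula appears only as an image in the original publication, the following is merely a plausible sketch: a quadratic stretch that maps [-1, 1] onto [0, 1] and, consistent with the 90%-to-100% versus 0%-to-10% argument above, rewards already-similar positions disproportionately. The specific form (1 + x)^2 / 4 is an assumption, not a quotation of the patent formula.

```python
def stretch(x: np.ndarray) -> np.ndarray:
    """Nonlinear stretching of the normalized similarity matrix (assumed quadratic form)."""
    return (1.0 + x) ** 2 / 4.0  # maps -1 -> 0, 0 -> 0.25, 1 -> 1; steeper near 1
```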
f. Finally, the stretched similarity matrix f(X) is summed and averaged, i.e. the following formula is calculated:
S = \frac{1}{m \times n} \sum_{i=1}^{n} \sum_{j=1}^{m} f(x_{ij})
the final similarity S (p.s.similarity) is obtained, where m and n are the matrix width and height. A large number of experimental results show that the method can effectively evaluate the similarity of the foreground images.
Referring to Fig. 2, a schematic diagram of the method is shown. It visualizes the result of each stage of the algorithm; the stages in Fig. 2 have the same meaning as in Fig. 1 and mainly help to understand the effect of each processing step in Fig. 1. The symbols are as follows:
(a) GT denotes the input ground-truth foreground map; (b) EM denotes the input predicted foreground map; (c) u_gt denotes the mean of GT; (d) u_em denotes the mean of EM; (e) is the alignment matrix of GT; (f) is the alignment matrix of EM; (g) denotes the nonlinear stretching function f(x) used in the method; (h) phi denotes the result of normalizing and nonlinearly stretching the similarity matrix of step b of Fig. 1.

Claims (4)

1. A binary-based foreground image similarity evaluation method, characterized in that the method comprises the following steps:
a. computing the alignment matrix: the foreground image mean is subtracted from the value of each pixel in the foreground image to obtain an alignment matrix; this combines local information with global information;
b. computing the similarity matrix: similarity is measured between the predicted foreground image and the manually annotated ground-truth foreground image; the alignment matrices of the predicted and ground-truth foreground images are computed, and the element-wise product of the two matrices is taken as the similarity matrix;
c. matrix normalization: the elements of the similarity matrix are normalized one by one so that every element value lies between -1 and 1, where -1 represents complete dissimilarity and 1 represents complete similarity, i.e. the highest degree of similarity;
d. stretching the matrix elements: a nonlinear stretching is applied to the normalized similarity matrix values;
e. computing the similarity: the stretched similarity matrix is summed and averaged to obtain the final foreground image similarity.
2. The binary-based foreground image similarity evaluation method according to claim 1, characterized in that: the mean of the predicted foreground image and the mean of the ground-truth foreground image are calculated respectively, and the corresponding mean is then subtracted from each element of each foreground image to obtain the alignment matrices.
3. The binary-based foreground image similarity evaluation method according to claim 1, characterized in that: the stretching applies the following formula to each element x_ij of the similarity matrix:
[stretching formula, reproduced only as an image in the original publication]
wherein f(x) is the stretching function and x_ij denotes the element in row i and column j of the normalized similarity matrix X.
4. The binary-based foreground image similarity evaluation method according to claim 1, characterized in that: the stretched similarity matrix f(X) is summed and averaged, i.e. the following formula is calculated:
S = \frac{1}{m \times n} \sum_{i=1}^{n} \sum_{j=1}^{m} f(x_{ij})
and obtaining the final similarity S, wherein m and n are the matrix width and height.
CN201810171102.8A 2018-03-01 2018-03-01 Binary-based foreground image similarity evaluation method Active CN108416768B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810171102.8A CN108416768B (en) 2018-03-01 2018-03-01 Binary-based foreground image similarity evaluation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810171102.8A CN108416768B (en) 2018-03-01 2018-03-01 Binary-based foreground image similarity evaluation method

Publications (2)

Publication Number Publication Date
CN108416768A CN108416768A (en) 2018-08-17
CN108416768B true CN108416768B (en) 2021-05-25

Family

ID=63129665

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810171102.8A Active CN108416768B (en) 2018-03-01 2018-03-01 Binary-based foreground image similarity evaluation method

Country Status (1)

Country Link
CN (1) CN108416768B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114528015B (en) * 2022-04-24 2022-07-29 湖南泛联新安信息科技有限公司 Method for analyzing homology of binary executable file, computer device and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101551904A (en) * 2009-05-19 2009-10-07 清华大学 Image synthesis method and apparatus based on mixed gradient field and mixed boundary condition
CN106203430A (en) * 2016-07-07 2016-12-07 北京航空航天大学 A kind of significance object detecting method based on foreground focused degree and background priori

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8452087B2 (en) * 2009-09-30 2013-05-28 Microsoft Corporation Image selection techniques
US8792013B2 (en) * 2012-04-23 2014-07-29 Qualcomm Technologies, Inc. Method for determining the extent of a foreground object in an image

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101551904A (en) * 2009-05-19 2009-10-07 清华大学 Image synthesis method and apparatus based on mixed gradient field and mixed boundary condition
CN106203430A (en) * 2016-07-07 2016-12-07 北京航空航天大学 A kind of significance object detecting method based on foreground focused degree and background priori

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Structure-measure: A New Way to Evaluate Foreground Maps";Deng-Ping Fan.etc;《2017 IEEE International Conference on Computer Vision》;20171225;全文 *
"How to Evaluate Foreground Maps";Ran Margolin.etc;《2014 IEEE Conference on Computer Vision and Pattern Recognition》;20141231;全文 *

Also Published As

Publication number Publication date
CN108416768A (en) 2018-08-17

Similar Documents

Publication Publication Date Title
CN108121991B (en) Deep learning ship target detection method based on edge candidate region extraction
CN104463870A (en) Image salient region detection method
CN112308873B (en) Edge detection method for multi-scale Gabor wavelet PCA fusion image
CN113111878B (en) Infrared weak and small target detection method under complex background
CN106934338B (en) Long-term pedestrian tracking method based on correlation filter
CN110852327A (en) Image processing method, image processing device, electronic equipment and storage medium
CN113822352A (en) Infrared dim target detection method based on multi-feature fusion
CN113780110A (en) Method and device for detecting weak and small targets in image sequence in real time
CN105354547A (en) Pedestrian detection method in combination of texture and color features
CN111027564A (en) Low-illumination imaging license plate recognition method and device based on deep learning integration
Jing et al. Automatic recognition of weave pattern and repeat for yarn-dyed fabric based on KFCM and IDMF
CN108416768B (en) Binary-based foreground image similarity evaluation method
Chen et al. Image segmentation based on mathematical morphological operator
CN116912184B (en) Weak supervision depth restoration image tampering positioning method and system based on tampering area separation and area constraint loss
US20230386023A1 (en) Method for detecting medical images, electronic device, and storage medium
CN113221603A (en) Method and device for detecting shielding of monitoring equipment by foreign matters
CN111104857A (en) Identity recognition method and system based on gait energy diagram
CN108241837B (en) Method and device for detecting remnants
CN110633705A (en) Low-illumination imaging license plate recognition method and device
CN106446832B (en) Video-based pedestrian real-time detection method
CN115731257A (en) Leaf form information extraction method based on image
Kaur et al. Performance Evaluation of various thresholding methods using canny edge detector
Lin et al. Knowledge-based hierarchical region-of-interest detection
CN114581475A (en) Laser stripe segmentation method based on multi-scale saliency features
CN115619801A (en) Monitoring video image occlusion detection method based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant