CN111931811A - Calculation method based on super-pixel image similarity - Google Patents

Calculation method based on super-pixel image similarity

Info

Publication number
CN111931811A
CN111931811A (application CN202010607158.0A)
Authority
CN
China
Prior art keywords
image
similarity
key frame
pixel
images
Prior art date
Legal status
Granted
Application number
CN202010607158.0A
Other languages
Chinese (zh)
Other versions
CN111931811B (en)
Inventor
王卫
李珍珍
王梅云
Current Assignee
Nanjing Jusha Display Technology Co Ltd
Nanjing Jusha Medical Technology Co Ltd
Original Assignee
Nanjing Jusha Display Technology Co Ltd
Nanjing Jusha Medical Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Nanjing Jusha Display Technology Co Ltd, Nanjing Jusha Medical Technology Co Ltd filed Critical Nanjing Jusha Display Technology Co Ltd
Priority to CN202010607158.0A priority Critical patent/CN111931811B/en
Publication of CN111931811A publication Critical patent/CN111931811A/en
Priority to PCT/CN2021/098184 priority patent/WO2022001571A1/en
Application granted granted Critical
Publication of CN111931811B publication Critical patent/CN111931811B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
      • G06: COMPUTING; CALCULATING OR COUNTING
        • G06F: ELECTRIC DIGITAL DATA PROCESSING
          • G06F18/00: Pattern recognition
            • G06F18/20: Analysing
              • G06F18/22: Matching criteria, e.g. proximity measures
              • G06F18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
                • G06F18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
              • G06F18/24: Classification techniques
                • G06F18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
                  • G06F18/2415: Classification techniques based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
        • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V10/00: Arrangements for image or video recognition or understanding
            • G06V10/20: Image preprocessing
              • G06V10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
            • G06V10/40: Extraction of image or video features
              • G06V10/56: Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a calculation method based on super-pixel image similarity. The method comprises the following steps: selecting a key frame image from the image data through a network model; dividing the key frame image into a super-pixel segmentation map of distinct pixel blocks with an image segmentation algorithm; horizontally flipping the key frame image; extracting the segmentation boundary from the super-pixel segmentation map; superposing the segmentation boundary on the horizontally flipped key frame image to obtain a superposed image; and comparing corresponding pixel blocks in the super-pixel segmentation map and the superposed image with a phase consistency algorithm to obtain the similarity of each pair of segmented pixel blocks. The invention automatically screens key frame images from the acquired image video stream and identifies them to realize intelligent computer-aided diagnosis.

Description

Calculation method based on super-pixel image similarity
Technical Field
The invention belongs to the field of image recognition, and particularly relates to a calculation method based on super-pixel image similarity.
Background
In recent years the application of deep learning to image processing has developed rapidly, yet image classification and identification remain important challenges for it. With ever-growing data volumes, the need for reliable computer-aided diagnosis methods keeps increasing. Researchers have applied a great deal of artificial intelligence to computer-aided diagnosis, which extracts effective features from image data of one or several modalities and then classifies and recognizes the extracted feature samples with machine-learning methods. However, current computer-aided diagnostic imaging methods have the following problems: (1) a network model is trained only for a certain or specific body part and is not suitable for other parts; (2) selecting the key frame and the region of interest requires manual intervention and is finished by hand, which is inconvenient.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a calculation method based on super-pixel image similarity that judges the similarity of the pixel blocks obtained by segmenting a selected key frame image, solving the low practicability and precision of the prior art.
In order to solve the technical problems, the technical scheme adopted by the invention is as follows:
a calculation method based on super-pixel image similarity comprises the following steps:
selecting a key frame image from the image data through a network model;
dividing the key frame image into a super-pixel segmentation map of distinct pixel blocks with an image segmentation algorithm;
horizontally flipping the key frame image;
extracting segmentation boundaries from the super-pixel segmentation map;
superposing the segmentation boundary on the horizontally flipped key frame image to obtain a superposed image;
and carrying out similarity comparison on corresponding pixel blocks in the super-pixel segmentation image and the superposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks.
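The steps above can be sketched in a few lines of NumPy. This is a hedged illustration, not the patent's implementation: `labels` stands in for the SLIC label map and `similarity_fn` for the phase-consistency (FSIM) comparison described later; both names and the toy similarity function are assumptions introduced here.

```python
import numpy as np

def compare_symmetric_blocks(key_frame, labels, similarity_fn):
    """Sketch of the claimed pipeline on a 2-D grayscale frame: flip the
    frame horizontally, reuse the superpixel geometry of the original
    segmentation on the flipped frame, and score every block pair."""
    flipped = key_frame[:, ::-1]          # horizontal flip of the key frame
    scores = {}
    for n in np.unique(labels):
        mask = labels == n                # pixels of superpixel block n
        scores[int(n)] = similarity_fn(key_frame[mask], flipped[mask])
    return scores

# Toy demo: two half-image "superpixels", fraction-of-equal-pixels as similarity.
frame = np.arange(16, dtype=float).reshape(4, 4)
blocks = np.zeros((4, 4), dtype=int)
blocks[:, 2:] = 1
result = compare_symmetric_blocks(
    frame, blocks, lambda a, b: float(np.mean(a == b)))
```

A low score for a block pair indicates low left-right symmetry at that block, which is what the later steps threshold on.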
Further, the training method of the network model comprises the following steps:
selecting a key frame image of a corresponding part according to the part to be identified;
making a training set of the key frame images;
making a test set and a verification set that do not overlap the training set;
and training a network model with the highest accuracy for identifying the key frame images on the test set according to the training set and the verification set.
Further, the key frame image is segmented as follows:
converting the key frame image into a feature vector;
constructing a distance metric according to the feature vector;
clustering local image pixels according to the distance measurement standard;
and continuously iterating and optimizing the clusters until the difference from each pixel point to the cluster center is not changed any more, and obtaining the superpixel segmentation maps of different pixel blocks.
Further, the method further comprises calculating the color distance and the space distance of the searched pixel point according to the distance measurement standard.
Further, the calculation formula of the color distance and the spatial distance is as follows:
$$d_c = \sqrt{(l_j - l_i)^2 + (a_j - a_i)^2 + (b_j - b_i)^2}$$

$$d_s = \sqrt{(x_j - x_i)^2 + (y_j - y_i)^2}$$

$$D_i = \sqrt{\left(\frac{d_c}{m}\right)^2 + \left(\frac{d_s}{S}\right)^2}$$

where i and j index key frame image x and key frame image y respectively; l, a and b are the feature components in the Lab color space; x and y are the coordinates in the XY plane; d_c and d_s are the color distance and the spatial distance respectively; D_i is the distance between the pixel point and the seed point; m is a constant; and S is the maximum spatial distance.
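The three formulas above can be transcribed directly. A minimal sketch, assuming 5-D points (l, a, b, x, y); the default values of m and S here are illustrative only (the description later gives m in [1, 40], typically 10):

```python
import numpy as np

def slic_distance(seed, pixel, m=10.0, S=20.0):
    """Combined SLIC-style distance between a seed point i and a pixel
    point j: color distance d_c in Lab space, spatial distance d_s in
    XY, normalised by the compactness constant m and the maximum
    spatial distance S."""
    li, ai, bi, xi, yi = seed
    lj, aj, bj, xj, yj = pixel
    d_c = np.sqrt((lj - li)**2 + (aj - ai)**2 + (bj - bi)**2)  # color distance
    d_s = np.sqrt((xj - xi)**2 + (yj - yi)**2)                 # spatial distance
    return np.sqrt((d_c / m)**2 + (d_s / S)**2)                # D_i
```

Identical points give distance zero; for purely spatial offsets the result is d_s/S, so S controls how strongly spatial proximity counts against color similarity.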
Further, the method for acquiring the similarity includes:
converting the super-pixel segmentation map and the superposed image into YIQ color space images;
calculating PC values of the two color space images;
obtaining the similarity between each point on the image according to the PC value;
and carrying out measurement weighting on the similarity to obtain the similarity of each pair of superpixel blocks.
Further, the similarity is calculated as follows:
$$S_{PC}(x, y) = \frac{2\,Pc(x)\,Pc(y) + T_1}{Pc(x)^2 + Pc(y)^2 + T_1}$$

$$S_G(x, y) = \frac{2\,G(x)\,G(y) + T_2}{G(x)^2 + G(y)^2 + T_2}$$

where Pc represents the phase consistency information of images x and y, G represents the gradient magnitude, S_PC(x, y) is the feature similarity of the two images x and y, S_G(x, y) is their gradient similarity, and T_1 and T_2 are constants.
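The two similarity maps are a direct transcription of these formulas. A sketch assuming precomputed phase-consistency maps `pc_*` and gradient-magnitude maps `g_*` (computing those maps themselves is a separate, more involved step):

```python
import numpy as np

def similarity_maps(pc_x, pc_y, g_x, g_y, t1=0.001, t2=0.001):
    """Pointwise feature (S_PC) and gradient (S_G) similarity maps.
    t1 and t2 keep the denominators from reaching zero; 0.001 is the
    value the description reports."""
    s_pc = (2 * pc_x * pc_y + t1) / (pc_x**2 + pc_y**2 + t1)
    s_g = (2 * g_x * g_y + t2) / (g_x**2 + g_y**2 + t2)
    return s_pc, s_g

# Identical inputs give similarity 1 everywhere, the maximum of both measures.
pc = np.array([0.2, 0.5, 0.9])
g = np.array([1.0, 2.0, 3.0])
s_pc, s_g = similarity_maps(pc, pc, g, g)
```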
Further, the calculation formula of the similarity is as follows:
$$\mathrm{FSIM}_n = \frac{\sum_{\Omega} \left[S_{PC}(x, y)\right]^{\alpha} \left[S_G(x, y)\right]^{\beta} Pc_n(x, y)}{\sum_{\Omega} Pc_n(x, y)}$$

$$Pc_n(x, y) = \max\left[Pc(x), Pc(y)\right]$$

where x and y are the key frame image x and the key frame image y, Ω represents the whole spatial domain, S_PC(x, y) is the feature similarity of the two images, S_G(x, y) is their gradient similarity, α and β are positive integers, Pc represents the phase consistency information of the images, and n indexes each pair of superpixel blocks.
Further, the result output of the network model adopts a Softmax function, calculated as follows:

$$P_i = \frac{e^{z_i}}{\sum_{j=1}^{k} e^{z_j}}$$

where e is the natural constant, j indexes the classification categories, k is the total number of categories to classify, z_i is the i-th component of a k-dimensional vector, and P_i is the probability that the image is predicted to belong to the i-th class.
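The Softmax output above is a few lines of NumPy; subtracting the maximum score first is a standard numerical-stability step and does not change the result:

```python
import numpy as np

def softmax(z):
    """P_i = e^{z_i} / sum_j e^{z_j} over the k class scores z."""
    z = np.asarray(z, dtype=float)
    e = np.exp(z - z.max())   # shift for numerical stability
    return e / e.sum()

p = softmax([1.0, 2.0, 3.0])  # class scores for k = 3 categories
```

The probabilities sum to one and the largest score wins, so the class with the highest P_i (class 0 for key frames, per the description) is the prediction.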
A computing system based on superpixel image similarity, the system comprising:
a screening module: the system is used for selecting a key frame image from image data through a network model;
a segmentation module: used for dividing the key frame image into a super-pixel segmentation map of distinct pixel blocks through an image segmentation algorithm;
a flipping module: used for horizontally flipping the key frame image;
an extraction module: for extracting segmentation boundaries from the superpixel segmentation map;
a superposition module: the segmentation boundary is used for being superposed on the key frame image after the horizontal turning processing to obtain a superposed image;
a comparison module: and the method is used for carrying out similarity comparison on corresponding pixel blocks in the super-pixel segmentation image and the superposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks.
A computing system based on super-pixel image similarity, the system comprising a processor and a storage medium;
the storage medium is used for storing instructions;
the processor is configured to operate in accordance with the instructions to perform the steps of the method described above.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method described above.
Compared with the prior art, the invention has the following beneficial effects:
according to the method, the key frame images are screened from the image data through the network model, and the convenience, the practicability and the precision of the method are improved through processing and similarity identification of the key frame images; the method can train a network model with the highest image accuracy rate capable of identifying the key frames of the images of different parts according to the difference of the key frames of different parts, and solves the problem that the existing method is single and only suitable for auxiliary diagnosis of the images of specific parts.
Drawings
FIG. 1 is a schematic diagram of an image process for assisting in diagnosing symmetric parts of a human body according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating the effect of superpixel segmentation on a brain NMR image in an embodiment of the method of the present invention;
FIG. 3 is the superpixel segmentation boundary extracted from the superpixel-segmented brain NMR image in an embodiment of the method of the present invention.
Detailed Description
The present invention is further illustrated by the following specific examples, which are intended to be illustrative, not limiting and are not intended to limit the scope of the invention.
Firstly, the specific operation method of the invention is introduced:
the specific embodiment of the invention is to provide a calculation method based on the super-pixel image similarity, and the specific embodiment of the invention is to apply the method to the auxiliary diagnosis process, but the method is not limited to the application field provided in the specific embodiment, and the invention can also be equivalently applied to other fields except auxiliary diagnosis.
As shown in fig. 1, a schematic diagram of the pixel-block similarity calculation process in an embodiment of the method applied to auxiliary diagnosis: image data are fed into a network model to obtain a key frame image x; x is horizontally flipped to obtain a key frame image y; both images are segmented; each pair of super-pixel blocks is analysed for similarity; and the lesion area is finally located. The method comprises the following specific steps:
step 1, training a network model which can identify the key frame image with the highest accuracy of the part image according to different diagnostic key frames of doctors at different parts of a human body. And selecting a human body part needing image analysis, and automatically screening out a key frame image from the obtained image data through a network model. The concrete implementation steps are as follows:
step 1.1, selecting key frames of corresponding parts for labeling according to parts needing to be identified, and manufacturing a training set for identifying the key frames;
step 1.2, aiming at the training set in the step 1.1, a test set and a verification set which are not overlapped with the training set in a crossing way are manufactured;
and step 1.3, using the training set and the verification set made in steps 1.1 and 1.2, selecting a network model whose neural-network depth suits the data volume of the data set, and training the model that achieves the highest accuracy on the test set. The network model classifies and identifies the input image data; the key frame image carries label 0, and the image the model predicts as class 0 with the highest probability is the key frame image. The Softmax function adopted for the final output of the network model is calculated as follows:

$$P_i = \frac{e^{z_i}}{\sum_{j=1}^{k} e^{z_j}}$$

where e is the natural constant, j indexes the classification categories, k is the total number of categories to classify, z_i is the i-th component of a k-dimensional vector, and P_i is the probability that the image is predicted to belong to the i-th class.
And 2, label every pixel of the key frame image with the simple linear iterative clustering (SLIC) superpixel segmentation algorithm and divide the labeled pixels into several pixel sets. Pixels with similar characteristics, such as texture, information entropy and brightness, are thereby grouped into irregular blocks. The method handles both gray-level and color images, runs fast, preserves complete contours, and segments the region of interest reasonably well. The concrete implementation steps are as follows:
step 2.1, converting the color key frame image into a Lab color space and a 5-dimensional characteristic vector under XY coordinates, wherein the Lab color space is independent of equipment and consists of three elements of brightness L and color channels a and b, and XY is a plane coordinate and is used for positioning the position;
and 2.2, constructing a new distance measurement standard for the feature vector obtained by conversion in the step 2.1, and further clustering local image pixels. Initializing data, distributing different labels to each pixel point in the neighborhood around the seed point, calculating the label of the pixel point belonging to a clustering center, storing the distance from the pixel point to the pixel center, and calculating the color distance and the space distance of the searched pixel point by a new distance measurement standard, wherein the calculation formula is as follows:
$$d_c = \sqrt{(l_j - l_i)^2 + (a_j - a_i)^2 + (b_j - b_i)^2}$$

$$d_s = \sqrt{(x_j - x_i)^2 + (y_j - y_i)^2}$$

$$D_i = \sqrt{\left(\frac{d_c}{m}\right)^2 + \left(\frac{d_s}{S}\right)^2}$$

where i and j index key frame image x and key frame image y respectively. l, a and b are the feature components in the Lab color space, and x and y are the coordinates in the XY plane. d_c and d_s are the color distance and the spatial distance respectively, which combine into the distance D_i between the pixel point and the seed point. m is a constant in the range [1, 40], generally set to 10, and S is the maximum spatial distance. During this process each pixel point is searched multiple times; the surrounding seed point at the minimum distance from the pixel point is taken as the cluster center of that pixel;
and 2.3, continuously iterating and optimizing until the difference from each pixel point to the clustering center is not changed, and finding out the optimal superpixel segmentation effect after 20 iterations at most through multiple image segmentation experiences.
And 3, horizontally flip the key frame image, extract the current segmentation boundary from the super-pixel segmentation map obtained in step 2, and superpose the extracted boundary on the horizontally flipped key frame image, so that the two images correspond block by block at horizontally mirrored positions of the same segmentation. The concrete implementation steps are as follows:
step 3.1, on the basis of steps 1 and 2, obtain the segmented region of interest and the superpixel segmentation boundary, and horizontally flip the key frame image;
step 3.2, superpose the superpixel segmentation boundary on the horizontally flipped key frame image and segment the flipped image with it, so that the two images contain corresponding superpixel blocks at horizontally mirrored positions within the same segmentation blocks;
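Steps 3.1 and 3.2 amount to a flip plus a boundary overlay. A minimal sketch, assuming a grayscale image and a label map; the neighbour-difference boundary test here is a simple stand-in for whatever boundary extraction the implementation actually uses:

```python
import numpy as np

def segmentation_boundary(labels):
    """Boundary mask of a superpixel label map: a pixel is marked when
    its label differs from its right or lower neighbour."""
    edge = np.zeros(labels.shape, dtype=bool)
    edge[:, :-1] |= labels[:, :-1] != labels[:, 1:]
    edge[:-1, :] |= labels[:-1, :] != labels[1:, :]
    return edge

def overlay_on_flipped(image, labels, value=255.0):
    """Flip the key frame horizontally and burn the extracted boundary
    in, producing the superposed image of step 3.2."""
    out = image[:, ::-1].copy()            # horizontally flipped key frame
    out[segmentation_boundary(labels)] = value
    return out

lab = np.zeros((4, 4), dtype=int)
lab[:, 2:] = 1                             # two side-by-side blocks
superposed = overlay_on_flipped(np.zeros((4, 4)), lab)
```

Because the unflipped boundary is drawn onto the flipped frame, block n of the original segmentation and the same region of the superposed image form one comparison pair.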
and 4, performing similarity comparison on corresponding pixel blocks in the super-pixel segmentation key frame image and the super-pixel segmentation boundary superposed and inverted key frame image by using a phase consistency algorithm to respectively obtain the similarity of each segmented super-pixel block. The concrete implementation steps are as follows:
step 4.1, convert the super-pixel-segmented key frame image and the flipped key frame image with the superposed segmentation boundary into YIQ color space images, where the Y component carries the luminance information and the I and Q components the chrominance information; the YIQ color space separates luminance from chrominance in a color image;
step 4.2, compute the PC values of the two images, where PC measures the phase consistency information of an image; together with the gradient magnitude this yields the similarity between corresponding points of the images. The calculation formulas are as follows:

$$S_{PC}(x, y) = \frac{2\,Pc(x)\,Pc(y) + T_1}{Pc(x)^2 + Pc(y)^2 + T_1}$$

$$S_G(x, y) = \frac{2\,G(x)\,G(y) + T_2}{G(x)^2 + G(y)^2 + T_2}$$

where Pc represents the phase consistency information of images x and y, and G represents the gradient magnitude. S_PC(x, y) is the feature similarity of the two images x and y, and S_G(x, y) is their gradient similarity. T_1 and T_2 are constants that prevent the denominator from becoming zero; both take the value 0.001;
and 4.3, on the basis of the step 4.2, weighting the chroma characteristic similarity measurement by combining the characteristic similarity and the gradient amplitude of the image to obtain the similarity of each point, and further obtaining the similarity between the two images, wherein the calculation formula is as follows:
Figure BDA0002559301020000092
in the formula, x and y are key frame image x and key frame image y, omega represents the whole space domain, SPC(x, y) is the characteristic similarity of the two images x, y, SG(x, y) is the gradient similarity of the two images x, y, alpha, beta are positive integers, mainly used for adjusting the weight between the characteristic similarity and the gradient similarity, Pc represents the phase consistency information of the images x, y, n represents the label of each pair of superpixel blocks, Pcn(x,y)=max[Pc(x),Pc(y)]And is used to weight the overall similarity of the two images. The similarity FSIM between the two images is obtained through calculation, and the closer the FSIM is, the lower the similarity between the two images is.
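The pooled formula of step 4.3 can be sketched for one block pair. A hedged illustration assuming precomputed phase-consistency and gradient maps restricted to one superpixel block; alpha = beta = 1 is a common default here, not a value fixed by the text:

```python
import numpy as np

def fsim_block(pc_x, pc_y, g_x, g_y, alpha=1, beta=1, t1=0.001, t2=0.001):
    """FSIM over one pair of superpixel blocks: the pointwise score
    [S_PC]^alpha * [S_G]^beta is weighted by Pc_n = max(Pc(x), Pc(y))
    and normalised over the block."""
    s_pc = (2 * pc_x * pc_y + t1) / (pc_x**2 + pc_y**2 + t1)
    s_g = (2 * g_x * g_y + t2) / (g_x**2 + g_y**2 + t2)
    pc_n = np.maximum(pc_x, pc_y)          # pointwise weighting term
    return float(((s_pc**alpha) * (s_g**beta) * pc_n).sum() / pc_n.sum())
```

For identical inputs both pointwise similarities are exactly 1, so the weighted average is 1, the maximum score.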
And 5, set a threshold to analyse the similarity of each pair of superpixel blocks. Because lesioned tissue changes in nutrient content, density and so on relative to normal tissue, cancerous and normal tissue appear differently in the image: the lower the similarity of a superpixel block of the key frame image, the more likely that part contains cancerous tissue; otherwise it is normal tissue. The coordinates of the superpixel blocks are obtained according to the set threshold. A large number of experimental tests show that a similarity of 0.15 to 0.48 locates the pathological part closest to the expected effect, so the threshold is set to 0.15 to 0.48: when the similarity of a superpixel block falls within this range, a position that differs from the normal side and is suspected of lesion is located, and the pathological part can be positioned accurately. The concrete implementation steps are as follows:
step 5.1, analyzing the similarity of each pair of superpixel blocks, and sequencing all the superpixel blocks according to the similarity in an ascending order;
and 5.2, analyse the superpixel blocks sorted in step 5.1, acquire the coordinates of the superpixel blocks by setting a threshold, locate the positions that differ from the normal side and are suspected of pathological change, and accurately position the parts where lesions may have occurred.
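Steps 5.1 and 5.2 reduce to a sort and a range filter. A minimal sketch using the 0.15 to 0.48 window the text reports; the block labels and scores below are illustrative:

```python
def suspicious_blocks(similarities, lo=0.15, hi=0.48):
    """Rank superpixel blocks by ascending FSIM and keep the labels
    whose similarity falls inside the [lo, hi] window, flagging them
    as suspected lesion sites."""
    ranked = sorted(similarities.items(), key=lambda kv: kv[1])
    return [label for label, s in ranked if lo <= s <= hi]

# Hypothetical per-block FSIM scores keyed by superpixel label.
flagged = suspicious_blocks({1: 0.91, 2: 0.30, 3: 0.20, 4: 0.50})
```

Block 4 (0.50) sits just above the window and block 1 is clearly symmetric, so only blocks 3 and 2 are flagged, least similar first.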
The following is a specific embodiment of the method for assisting in diagnosing the symmetrical parts of the human body by image recognition: the implementation process of the present invention is described in detail by taking the brain magnetic resonance image as an example for auxiliary diagnosis and analysis.
Because of the high soft-tissue resolution of magnetic resonance imaging, MRI techniques are widely used clinically to assess brain lesions. However, with ever-increasing data volumes and the empirical errors visual identification can introduce, automatic and reliable methods of locating pathological brain regions are increasingly needed.
Firstly, step 1 is executed, and according to the part needing to be identified, the key frame of the corresponding part is selected and labeled, and a training set, a testing set and a verification set for identifying the key frame are manufactured. And selecting a network model with the neural network depth suitable for manufacturing the data volume of the data set, and training the network model with the highest accuracy on the test set for screening out the key frame images.
Secondly, step 2 is executed: each pixel of the key frame image is labeled with the simple linear iterative clustering segmentation algorithm and divided into several pixel sets. This groups pixels with similar characteristics, such as texture, entropy and brightness, into irregular blocks. The effect of the superpixel segmentation is shown in FIG. 2, and the extracted superpixel segmentation boundary is shown in FIG. 3.
Thirdly, step 3 is executed: the key frame image is horizontally flipped, the extracted super-pixel segmentation boundary (shown in FIG. 3) is superposed on the flipped key frame image, and the flipped image is segmented with it so that the two images correspond part by part at horizontally mirrored positions within the same segmentation blocks;
then step 4 is executed: the similarity of corresponding pixel blocks in the super-pixel-segmented key frame image and the flipped key frame image with the superposed segmentation boundary is calculated through the phase consistency algorithm, yielding the similarity of each segmented super-pixel block;
and finally, executing the step 5, sequencing each pair of superpixel blocks according to the similarity, and acquiring the superpixel block coordinates with the similarity within the threshold value through setting the threshold value range to be 0.15-0.48, so that the pathological part of the patient can be accurately positioned.
A computing system based on superpixel image similarity, the system comprising:
a screening module: the system is used for selecting a key frame image from image data through a network model;
a segmentation module: used for dividing the key frame image into a super-pixel segmentation map of distinct pixel blocks through an image segmentation algorithm;
a flipping module: used for horizontally flipping the key frame image;
an extraction module: for extracting segmentation boundaries from the superpixel segmentation map;
a superposition module: the segmentation boundary is used for being superposed on the key frame image after the horizontal turning processing to obtain a superposed image;
a comparison module: and the method is used for carrying out similarity comparison on corresponding pixel blocks in the super-pixel segmentation image and the superposed image through a phase consistency algorithm to respectively obtain the similarity of each pair of segmented pixel blocks.
A computing system based on superpixel image similarity, the system comprising a processor and a storage medium;
the storage medium is used for storing instructions;
the processor is configured to operate in accordance with the instructions to perform the steps of the method described above.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method described above.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (9)

1. A calculation method based on super-pixel image similarity, characterized by comprising the following steps:
selecting a key frame image from the image data through a network model;
dividing the key frame image into a superpixel segmentation map of different pixel blocks through an image segmentation algorithm;
performing horizontal flipping on the key frame image;
extracting segmentation boundaries from the superpixel segmentation map;
superposing the segmentation boundaries on the horizontally flipped key frame image to obtain a superposed image;
and comparing, by a phase consistency algorithm, corresponding pixel blocks in the superpixel segmentation map and the superposed image, to respectively obtain the similarity of each pair of segmented pixel blocks.
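The flipping and superposition steps of the claim can be sketched as follows (a minimal numpy illustration; the function and argument names are not from the patent, and the boundary mask is assumed to be already extracted from the superpixel segmentation map):

```python
import numpy as np

def flip_and_overlay(key_frame, boundary_mask, boundary_value=255):
    """Horizontally flip a key frame image and superpose superpixel
    segmentation boundaries on the flipped result."""
    flipped = key_frame[:, ::-1].copy()       # horizontal flip
    overlaid = flipped.copy()
    overlaid[boundary_mask] = boundary_value  # draw boundary pixels
    return flipped, overlaid
```

The overlaid image can then be compared block-by-block against the segmentation map, as the claim describes.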
2. The method for calculating the similarity of the super-pixel images according to claim 1, wherein the training method of the network model comprises the following steps:
selecting a key frame image of a corresponding part according to the part to be identified;
making a training set of the key frame images;
making a test set and a verification set that do not overlap with the training set;
and training, on the training set and the verification set, the network model that achieves the highest accuracy in identifying key frame images on the test set.
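The non-overlapping split required by the claim can be sketched as follows (the 70/15/15 ratios are illustrative assumptions; the patent only requires that the sets do not overlap):

```python
import random

def split_dataset(images, train_frac=0.7, val_frac=0.15, seed=0):
    """Split key frame images into non-overlapping
    train / verification / test sets."""
    pool = list(images)
    random.Random(seed).shuffle(pool)
    n_train = int(len(pool) * train_frac)
    n_val = int(len(pool) * val_frac)
    return (pool[:n_train],
            pool[n_train:n_train + n_val],
            pool[n_train + n_val:])
```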
3. The method for calculating the similarity of the super-pixel images according to claim 1, wherein the segmentation process of the key frame images is as follows:
converting the key frame image into a feature vector;
constructing a distance metric according to the feature vector;
clustering local image pixels according to the distance metric;
and iteratively optimizing the clusters until the distance from each pixel point to its cluster center no longer changes, obtaining the superpixel segmentation map of different pixel blocks.
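The iterate-until-stable clustering of claim 3 can be sketched as a plain k-means over (l, a, b, x, y) feature vectors. This is an illustrative simplification: a full SLIC implementation would additionally restrict the search to a 2S x 2S window around each seed, and the even-stride seed initialization below is a one-dimensional stand-in for SLIC's grid seeding.

```python
import numpy as np

def cluster_pixels(features, k, iters=20):
    """Cluster pixel feature rows (l, a, b, x, y) until the
    assignments stabilise."""
    # seed the k cluster centers evenly over the feature list
    centers = features[:: max(1, len(features) // k)][:k].copy()
    labels = np.zeros(len(features), dtype=int)
    for _ in range(iters):
        # distance of every pixel to every center, shape (N, k)
        d = np.linalg.norm(features[:, None, :] - centers[None, :, :], axis=2)
        new_labels = d.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                       # assignments no longer change
        labels = new_labels
        for c in range(k):              # move each center to its mean
            members = features[labels == c]
            if len(members):
                centers[c] = members.mean(axis=0)
    return labels
```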
4. The method of claim 3, further comprising calculating the color distance and the spatial distance of the searched pixel points according to a distance metric.
5. The method according to claim 4, wherein the color distance and the spatial distance are calculated as follows:
$d_c = \sqrt{(l_j - l_i)^2 + (a_j - a_i)^2 + (b_j - b_i)^2}$
$d_s = \sqrt{(x_j - x_i)^2 + (y_j - y_i)^2}$
$D_i = \sqrt{\left(\frac{d_c}{m}\right)^2 + \left(\frac{d_s}{S}\right)^2}$
in the formula, i and j represent key frame image x and key frame image y respectively; l, a and b are feature vectors in the Lab color space; x and y are feature vectors in XY coordinates; $d_c$ and $d_s$ denote the color distance and the spatial distance, respectively; $D_i$ is the distance between a pixel point and a seed point; m is a constant; and S is the maximum spatial distance.
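The distance metric of claim 5 can be computed directly from two pixel feature tuples. A minimal sketch, assuming the standard SLIC form of the combined distance (the patent's equation images are not legible); the m and S values are illustrative:

```python
import math

def slic_distances(p_i, p_j, m=10.0, S=20.0):
    """Color distance d_c, spatial distance d_s, and combined
    distance D_i between two (l, a, b, x, y) feature tuples."""
    (l1, a1, b1, x1, y1), (l2, a2, b2, x2, y2) = p_i, p_j
    d_c = math.sqrt((l2 - l1) ** 2 + (a2 - a1) ** 2 + (b2 - b1) ** 2)
    d_s = math.sqrt((x2 - x1) ** 2 + (y2 - y1) ** 2)
    D_i = math.sqrt((d_c / m) ** 2 + (d_s / S) ** 2)
    return d_c, d_s, D_i
```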
6. The method for calculating the similarity based on the super-pixel image according to claim 1, wherein the method for obtaining the similarity comprises:
converting the superpixel segmentation map and the superposed image into YIQ color space images;
calculating the PC values of the two color space images;
obtaining the similarity between corresponding points on the images according to the PC values;
and weighting the similarity measures to obtain the similarity of each pair of superpixel blocks.
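The YIQ conversion named in claim 6 can be sketched per pixel with the standard NTSC matrix (the patent only names the target color space; the coefficients below are the usual ones, not taken from the patent):

```python
def rgb_to_yiq(r, g, b):
    """Convert one RGB pixel (components in [0, 1]) to YIQ."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    i = 0.596 * r - 0.274 * g - 0.322 * b
    q = 0.211 * r - 0.523 * g + 0.312 * b
    return y, i, q
```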
7. The method according to claim 6, wherein the similarity is calculated by the following formula:
$S_{PC}(x,y) = \frac{2\,Pc(x)\,Pc(y) + T_1}{Pc(x)^2 + Pc(y)^2 + T_1}$
$S_G(x,y) = \frac{2\,G(x)\,G(y) + T_2}{G(x)^2 + G(y)^2 + T_2}$
in the formula, Pc represents the phase consistency information of images x and y, G represents the gradient amplitude, $S_{PC}(x, y)$ is the feature similarity of the two images x and y, $S_G(x, y)$ is the gradient similarity of the two images x and y, and $T_1$ and $T_2$ are constants.
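The two point-wise similarity measures of claim 7 can be sketched directly. The constants $T_1 = 0.85$ and $T_2 = 160$ follow common FSIM practice and are assumptions here, not values from the patent:

```python
def s_pc(pc_x, pc_y, t1=0.85):
    """Point-wise phase consistency similarity S_PC."""
    return (2 * pc_x * pc_y + t1) / (pc_x ** 2 + pc_y ** 2 + t1)

def s_g(g_x, g_y, t2=160.0):
    """Point-wise gradient similarity S_G."""
    return (2 * g_x * g_y + t2) / (g_x ** 2 + g_y ** 2 + t2)
```

Both measures equal 1 when the two inputs agree and fall toward 0 as they diverge; the constants keep the ratio stable when both inputs are near zero.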
8. The method according to claim 6, wherein the similarity is calculated by the following formula:
$S_n = \frac{\sum_{(x,y)\in\Omega} [S_{PC}(x,y)]^{\alpha}\,[S_G(x,y)]^{\beta}\,Pc_n(x,y)}{\sum_{(x,y)\in\Omega} Pc_n(x,y)}$
$Pc_n(x,y) = \max[Pc(x), Pc(y)]$
in the formula, x and y are key frame image x and key frame image y, $\Omega$ represents the whole spatial domain, $S_{PC}(x, y)$ is the feature similarity of the two images x and y, $S_G(x, y)$ is the gradient similarity of the two images x and y, $\alpha$ and $\beta$ are positive integers, Pc represents the phase consistency information of images x and y, and n denotes the label of the pair of superpixel blocks being analyzed.
9. The method for calculating the similarity of the super-pixel images according to claim 1, wherein the result output of the network model adopts a Softmax function, calculated as follows:
$P_i = \frac{e^{z_i}}{\sum_{j=1}^{k} e^{z_j}}$
wherein e is the natural constant, j indexes the classification categories, k is the total number of classification categories, $z_i$ is the i-th component of the k-dimensional vector, and $P_i$ is the probability of the image being predicted as the i-th class.
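The Softmax output of claim 9 can be sketched as follows. Subtracting the maximum before exponentiating is a standard numerical-stability step and does not change the result:

```python
import math

def softmax(z):
    """P_i = exp(z_i) / sum_j exp(z_j) over a k-dimensional vector."""
    m = max(z)                          # stability shift
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [v / total for v in exps]
```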
CN202010607158.0A 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity Active CN111931811B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010607158.0A CN111931811B (en) 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity
PCT/CN2021/098184 WO2022001571A1 (en) 2020-06-29 2021-06-03 Computing method based on super-pixel image similarity

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010607158.0A CN111931811B (en) 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity

Publications (2)

Publication Number Publication Date
CN111931811A true CN111931811A (en) 2020-11-13
CN111931811B CN111931811B (en) 2024-03-29

Family

ID=73317721

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010607158.0A Active CN111931811B (en) 2020-06-29 2020-06-29 Calculation method based on super-pixel image similarity

Country Status (2)

Country Link
CN (1) CN111931811B (en)
WO (1) WO2022001571A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112669346A (en) * 2020-12-25 2021-04-16 浙江大华技术股份有限公司 Method and device for determining road surface emergency
WO2022001571A1 (en) * 2020-06-29 2022-01-06 南京巨鲨显示科技有限公司 Computing method based on super-pixel image similarity
CN117173175A (en) * 2023-11-02 2023-12-05 湖南格尔智慧科技有限公司 Image similarity detection method based on super pixels

Families Citing this family (7)

Publication number Priority date Publication date Assignee Title
CN115294131A (en) * 2022-10-08 2022-11-04 南通海发水处理工程有限公司 Sewage treatment quality detection method and system
CN115641327B (en) * 2022-11-09 2023-05-09 浙江天律工程管理有限公司 Building engineering quality supervision and early warning system based on big data
CN115880295B (en) * 2023-02-28 2023-05-12 吉林省安瑞健康科技有限公司 Computer-aided tumor ablation navigation system with accurate positioning function
CN115914649B (en) * 2023-03-01 2023-05-05 广州高通影像技术有限公司 Data transmission method and system for medical video
CN116863469B (en) * 2023-06-27 2024-05-14 首都医科大学附属北京潞河医院 Deep learning-based surgical anatomy part identification labeling method
CN116630311B (en) * 2023-07-21 2023-09-19 聊城市瀚格智能科技有限公司 Pavement damage identification alarm method for highway administration
CN116823811B (en) * 2023-08-25 2023-12-01 汶上县誉诚制衣有限公司 Functional jacket surface quality detection method

Citations (4)

Publication number Priority date Publication date Assignee Title
US20180012365A1 (en) * 2015-03-20 2018-01-11 Ventana Medical Systems, Inc. System and method for image segmentation
CN108600865A (en) * 2018-05-14 2018-09-28 西安理工大学 A kind of video abstraction generating method based on super-pixel segmentation
CN109712143A (en) * 2018-12-27 2019-05-03 北京邮电大学世纪学院 A kind of Fast image segmentation method based on super-pixel multiple features fusion
CN109712153A (en) * 2018-12-25 2019-05-03 杭州世平信息科技有限公司 A kind of remote sensing images city superpixel segmentation method

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
CN111931811B (en) * 2020-06-29 2024-03-29 南京巨鲨显示科技有限公司 Calculation method based on super-pixel image similarity



Also Published As

Publication number Publication date
WO2022001571A1 (en) 2022-01-06
CN111931811B (en) 2024-03-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant