CN117710696A - Method and system for defining picture characteristics and comparing similarity

Info

Publication number
CN117710696A
Authority
CN
China
Prior art keywords
value
array
picture
similarity
gray
Prior art date
Legal status
Pending
Application number
CN202311780493.0A
Other languages
Chinese (zh)
Inventor
谢蜀岷
Current Assignee
Chengdu Yiwo Tech Development Co., Ltd.
Original Assignee
Chengdu Yiwo Tech Development Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Chengdu Yiwo Tech Development Co., Ltd.
Priority to CN202311780493.0A
Publication of CN117710696A
Status: Pending

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a method and a system for defining picture features and performing similarity comparison, belonging to the technical field of picture similarity comparison. The method for defining picture features comprises: preprocessing a target picture and outputting a grayscale image with highlighted contours; constructing an array G with 180 elements for recording the gradient histogram features of the grayscale image; traversing the pixels of the grayscale image, calculating the gradient value of each pixel, and accessing the corresponding element of the array G according to the gradient value; calculating the weight value of the pixel, adding the weight value to the value of the corresponding element, and assigning the result back to that element; and, after the traversal, taking the array G as the feature value of the target picture. By using the gradient histogram as the picture feature for similarity comparison, the invention improves the robustness of picture similarity comparison.

Description

Method and system for defining picture characteristics and comparing similarity
Technical Field
The invention belongs to the technical field of picture similarity comparison, and in particular relates to a method and a system for defining picture features and performing similarity comparison.
Background
Picture similarity comparison is increasingly applied in real life. Whether two pictures are similar is mainly determined by judging whether they have enough similar features, and at present picture features are mainly expressed along the following two dimensions, with corresponding algorithms:
1) Mean/difference hash: after the picture is scaled to a preset resolution, a position-related feature is generated by comparing each pixel with the mean of the whole picture or with the difference from the adjacent position, so the number of features equals the resolution.
2) Color histogram: a feature with little dependence on position, generated by counting, for a designated color channel of the picture (which may be grayscale), how many pixels hit each of the values 0-255, so the number of features is 256.
When these features are used to compare similar pictures, the mean/difference hash performs poorly on pictures shot after the camera has moved, and its accuracy drops noticeably.
The color (gray) histogram, in turn, is severely distorted for the same picture once its brightness is increased or decreased, which leads to judgment errors.
It can be seen that, for picture comparison, the commonly used methods need additional dimensions for feature acquisition; features of other dimensions are required to further improve the accuracy of picture similarity matching.
Disclosure of Invention
In view of this, the present invention provides a method and a system for defining picture features and performing similarity comparison, in which the gradient histogram is used as the picture feature for similarity comparison, so as to improve the robustness of picture similarity comparison.
In order to solve the above technical problems, the technical solution of the present invention is a method for defining a picture feature value, comprising:
preprocessing a target picture, and outputting a grayscale image with highlighted contours;
constructing an array G with 180 elements for recording the gradient histogram features of the grayscale image, the value of each element in the array G being 0 at initialization;
traversing the pixels of the grayscale image, calculating the gradient value of each pixel, and accessing the corresponding element of the array G according to the gradient value; calculating the weight value of the pixel, adding the weight value to the value of the corresponding element, and assigning the result back to that element; and, after the traversal, taking the array G as the feature value of the target picture.
As an improvement, the method for preprocessing the target picture comprises:
scaling the target picture to a threshold size;
performing color space conversion on the scaled picture to obtain a grayscale image;
identifying the contours in the grayscale image and acquiring the position information of the contours;
and highlighting the contours in the grayscale image according to their position information.
As a further improvement, after the position information of a contour is acquired, the length and the amplitude of the contour are judged, and the contour is retained only when the length and the amplitude are respectively greater than a length threshold and an amplitude threshold.
As another further improvement, when traversing the pixels of the grayscale image, only the pixels of the middle portion of the grayscale image are selected.
As an improvement, the method for calculating the gradient value of a pixel comprises:
acquiring the gray value M(x, y) of the pixel;
using the formulas
Dx = M(x-3, y) - M(x+3, y)
Dy = M(x, y-3) - M(x, y+3)
to calculate the gray difference values along the x and y coordinates respectively, where Dx is the gray difference value along the x coordinate and Dy is the gray difference value along the y coordinate;
using the formula
R = ArcTan(Dx / Dy)
to calculate a radian value, where R is the radian value, Dx is the gray difference value along the x coordinate, and Dy is the gray difference value along the y coordinate;
using the formula
A = (R / π) * 180 + 90
to convert the radian value into an angle value, thereby obtaining the gradient value of the pixel, where A is the gradient value and R is the radian value; if Dy = 0, A is directly set to 90.
As an improvement, the method for calculating the weight value of a pixel comprises:
using the formula T = Sqrt(Dx*Dx + Dy*Dy)
to calculate the weight value of the pixel, where T is the weight value, Dx is the gray difference value along the x coordinate, and Dy is the gray difference value along the y coordinate.
The invention also provides a system for defining a picture feature value, comprising:
a preprocessing module, configured to preprocess the target picture and output a grayscale image with highlighted contours;
an array construction module, configured to construct an array G with 180 elements for recording the gradient histogram features of the grayscale image, the value of each element in the array G being 0 at initialization;
an assignment module, configured to traverse the pixels of the grayscale image, calculate the gradient value of each pixel, and access the corresponding element of the array G according to the gradient value; calculate the weight value of the pixel, add the weight value to the value of the corresponding element, and assign the result back to that element; and take the array G as the feature value of the target picture after the traversal.
The invention also provides a picture similarity comparison method, comprising:
acquiring the feature value arrays G[A] and G[B] of two pictures to be compared according to the above method for defining a picture feature value;
setting a similarity coefficient g with an initial value of 0;
comparing the elements of the arrays G[A] and G[B] one by one, where G[A]n is the nth element of array G[A] and G[B]n is the nth element of array G[B]: if G[A]n = G[B]n, assigning the result of g+1 to the similarity coefficient g; otherwise, assigning the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) to the similarity coefficient g;
and assigning g/180 to the similarity coefficient g; if the similarity coefficient is greater than a similarity threshold, the two pictures to be compared are considered similar.
As an improvement, the similarity threshold is 0.81-0.85.
The invention also provides a picture similarity comparison system, comprising:
a feature value acquisition module, configured to acquire the feature value arrays G[A] and G[B] of two pictures to be compared according to the method for defining a picture feature value of any one of claims 1-6;
a similarity coefficient initialization module, configured to set a similarity coefficient g with an initial value of 0;
a comparison module, configured to compare the elements of the arrays G[A] and G[B] one by one, where G[A]n is the nth element of array G[A] and G[B]n is the nth element of array G[B]: if G[A]n = G[B]n, to assign the result of g+1 to the similarity coefficient g; otherwise, to assign the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) to the similarity coefficient g;
and a judging module, configured to assign g/180 to the similarity coefficient g and, if the similarity coefficient is greater than a similarity threshold, to consider the two pictures to be compared as similar pictures.
The invention has the following advantages:
Compared with existing picture similarity comparison methods that use the mean/difference hash or the color histogram as the feature value, the invention offers:
1) Less impact from camera movement: the gradient histogram is calculated from the gradient information of the image rather than from brightness or color, so it is little affected by lens movement and better handles similarity changes caused by a change of shooting angle or by movement of the photographer.
2) Stronger robustness: the gradient histogram better handles the effects of geometric and optical distortions of the image, such as pixel value changes caused by scaling, rotation, affine and projective transformations, and motion differences. At the same time, compared with the color histogram, the gradient histogram is more robust to changes in brightness and contrast.
3) More sensitive to complex textures: the gradient histogram is calculated from the image gradient information, is sensitive to image textures and details, and better handles cases where the image texture is complex or highly abstract.
4) In the prior art, the gradient image is obtained directly from the grayscale image, so the processing precision is poor when the contrast is low or the image source is blurry. The invention preprocesses the grayscale image and enhances the contours through a series of steps to achieve a better output. Therefore, the picture similarity comparison method based on the gradient histogram has stronger robustness, higher reliability and wider applicability, and can better solve the problem of picture similarity comparison in real life.
Drawings
FIG. 1 is a flow chart of a method for defining a picture feature value according to the present invention.
Fig. 2 is a schematic structural diagram of a system for defining a picture feature value according to the present invention.
Fig. 3 is a flowchart of a picture similarity comparison method in the present invention.
Fig. 4 is a schematic structural diagram of a picture similarity comparison system in the present invention.
Detailed Description
In order to make the technical scheme of the present invention better understood by those skilled in the art, the present invention will be further described in detail with reference to the following specific embodiments.
Interpretation of terms
Feature value: the different characteristics that a specific picture exhibits under a specific matching algorithm; in this description the feature value always lies within a fixed range.
Gradient histogram: the overall data formed by taking the gradient as the basic dimension (0-180) and accumulating the weighted gradient value of each pixel into its dimension, called the gradient histogram (or HOG); it serves as a basic parameter of the image in further calculations.
Histogram: a method for classifying image parameters; the basic dimensions are determined according to different algorithms, and the weighted value of each pixel is counted into the corresponding dimension.
Contour extraction: a basic image processing algorithm that retrieves contours from a binary image and outputs them in the form of contour point coordinates.
Gray value: the grayscale-domain representation of an image.
ArcTan: the arctangent function, used in this description to calculate the basic dimension (0-180) of the current pixel.
Sqrt: square root, used in this description to calculate the weight of the gradient of the current pixel.
Example 1
As shown in fig. 1, the present invention provides a method for defining a picture feature value, comprising:
S11: preprocess the target picture and output a grayscale image with highlighted contours.
Specifically, the method for preprocessing the target picture in the invention comprises the following steps:
s111 scales the target image to a threshold size.
In this embodiment, the target picture is scaled to 256×256 pixels. Of course, the size can be adaptively adjusted according to calculation force, demand and the like, and the invention is not limited.
S112: perform color space conversion on the scaled picture to obtain a grayscale image.
A grayscale image is an image that contains only gray values (luminance values) and no color information. In a grayscale image, the gray value of each pixel represents the brightness level of that pixel, typically between 0 (black) and 255 (white). A grayscale image can be obtained by a weighted average of the red, green and blue channels of a color image, or by a luminance conversion of the color image. Because a grayscale image contains only luminance information, it can reduce memory usage and computational complexity in some applications, and it is also a useful input for image processing tasks such as edge detection and image segmentation.
The grayscale conversion method of the invention takes a weighted average of the pixel values of the red, green and blue channels of the color image to obtain the pixel value of the grayscale image. The conversion is typically performed with the following formula:
Gray pixel value = red channel pixel value * 0.299 + green channel pixel value * 0.587 + blue channel pixel value * 0.114
This method is simple to compute and widely applicable, but for color images with certain specific color distributions it may not produce a good grayscale result.
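For illustration only, a minimal Python sketch of this weighted-average conversion follows; it is not the patent's implementation, and the function name to_grayscale and the assumption of RGB channel order are choices made here.

import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average grayscale conversion: 0.299*R + 0.587*G + 0.114*B.
    Assumes an H x W x 3 uint8 array in RGB channel order."""
    r = rgb[:, :, 0].astype(np.float64)
    g = rgb[:, :, 1].astype(np.float64)
    b = rgb[:, :, 2].astype(np.float64)
    gray = 0.299 * r + 0.587 * g + 0.114 * b
    return np.clip(gray, 0, 255).astype(np.uint8)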
S113: identify the contours in the grayscale image and acquire the position information of the contours.
The invention uses the Canny operator for contour recognition. The Canny operator is a commonly used edge detection algorithm; its purpose is to identify edges in the image and locate them accurately. The Canny operator performs edge detection in several steps: first, the image is Gaussian-filtered to reduce noise; then, the gradient of the image is calculated to obtain the edge strength and direction of each pixel; next, non-maximum suppression is used to thin the edges, keeping only pixels with large local gradients as edges; finally, weak edges are filtered out by thresholding, and pixels whose strength exceeds the threshold are kept as the final edges. The Canny operator performs well in terms of detection accuracy and noise suppression and is widely used in image processing and computer vision.
After the image contours are identified, their position information is obtained through a contour query algorithm. The position information of a contour refers to the positions of the pixels it occupies.
S114: judge the length and the amplitude of each contour, and retain the contour only if its length and amplitude are respectively greater than a length threshold and an amplitude threshold.
After the contours are identified, some of them may not meet the requirements of the subsequent processing; for example, a line may be too short or its amplitude too small. Amplitude refers to the span of a line: even a sufficiently long line is merely curled up in a small cluster if it is not stretched out. Therefore, in this step an identified contour is retained only if it exceeds both the length threshold and the amplitude threshold, for example a length greater than 60 pixels and an amplitude greater than 30 pixels. When judging the amplitude, it is only necessary to check whether the difference between the abscissas or the ordinates of the two farthest points of the line exceeds the threshold.
S115: highlight the contours in the grayscale image according to their position information.
In this step, the position information of the image contours in the grayscale image has been obtained; the pixels at those positions are highlighted according to the position information, so that the image contours stand out.
S12: construct an array G with 180 elements for recording the gradient histogram features of the grayscale image; at initialization, the value of each element in the array G is 0.
In this step, the gradient histogram features of the grayscale image are recorded in the array G, which contains 180 elements, denoted G[A]0 through G[A]179; at initialization, the values of G[A]0 through G[A]179 are all 0.
S13: traverse the pixels of the grayscale image, calculate the gradient value of each pixel, and access the corresponding element of the array G according to the gradient value; calculate the weight value of the pixel, add the weight value to the value of the corresponding element, and assign the result back to that element; after the traversal, take the array G as the feature value of the target picture.
When traversing the pixels of the grayscale image, only the pixels of the middle portion of the image are selected, in order to avoid interference from image edge information. For example, for a 256×256 pixel grayscale image, the traversal may run from (30, 30) to (235, 235).
In this embodiment, the method for calculating the gradient value of a pixel comprises:
S131: acquire the gray value M(x, y) of the pixel;
S132: use the formulas
Dx = M(x-3, y) - M(x+3, y)
Dy = M(x, y-3) - M(x, y+3)
to calculate the gray difference values along the x and y coordinates respectively, where Dx is the gray difference value along the x coordinate and Dy is the gray difference value along the y coordinate;
S133: use the formula
R = ArcTan(Dx / Dy)
to calculate a radian value, where R is the radian value, Dx is the gray difference value along the x coordinate, and Dy is the gray difference value along the y coordinate;
S134: use the formula
A = (R / π) * 180 + 90
to convert the radian value into an angle value, thereby obtaining the gradient value of the pixel, where A is the gradient value and R is the radian value; if Dy = 0, A is directly set to 90.
The gradient value is an angle value in [0, 180), which corresponds exactly to the elements G[A]0 through G[A]179 of the array G. If the gradient value of a pixel is 150, the pixel operates on the 150th element of the array, namely G[A]149.
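A minimal sketch of the per-pixel gradient value calculation S131-S134 follows, assuming the grayscale image is held in a NumPy array indexed as M[y, x]; the function name pixel_gradient_angle is illustrative only.

import math
import numpy as np

def pixel_gradient_angle(M: np.ndarray, x: int, y: int) -> float:
    """Gradient value (angle in degrees) of the pixel at (x, y):
    Dx = M(x-3, y) - M(x+3, y), Dy = M(x, y-3) - M(x, y+3),
    R = ArcTan(Dx / Dy), A = (R / pi) * 180 + 90."""
    dx = float(M[y, x - 3]) - float(M[y, x + 3])
    dy = float(M[y - 3, x]) - float(M[y + 3, x])
    if dy == 0:
        return 90.0                       # special case from the text: A = 90
    r = math.atan(dx / dy)                # radian value in (-pi/2, pi/2)
    return (r / math.pi) * 180.0 + 90.0   # angle value in (0, 180)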
In this embodiment, the method for calculating the weight value of a pixel is:
using the formula T = Sqrt(Dx*Dx + Dy*Dy)
to calculate the weight value of the pixel, where T is the weight value, Dx is the gray difference value along the x coordinate, and Dy is the gray difference value along the y coordinate.
In the above example, if the weight value of the pixel is 5, then 5 is added to the value of element G[A]149 and the result is assigned back to G[A]149, i.e. G[A]149 = G[A]149 + 5.
When all pixels of the grayscale image have been traversed, the array G is the feature value of the grayscale image.
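Putting step S13 together, the traversal, weighting and accumulation could look roughly like the sketch below. The margin of 30 pixels approximates the (30, 30)-(235, 235) window mentioned above, and the binning of the angle into an array index follows the G[A]149 example; since the patent does not spell out the rounding convention, that mapping is an assumption here.

import math
import numpy as np

def gradient_histogram(gray: np.ndarray, margin: int = 30) -> np.ndarray:
    """Build the 180-element gradient histogram G of a grayscale image,
    traversing only pixels at least `margin` pixels away from every border."""
    G = np.zeros(180, dtype=np.float64)
    M = gray.astype(np.float64)
    h, w = M.shape
    for y in range(margin, h - margin):
        for x in range(margin, w - margin):
            dx = M[y, x - 3] - M[y, x + 3]      # gray difference along x
            dy = M[y - 3, x] - M[y + 3, x]      # gray difference along y
            if dy == 0:
                angle = 90.0
            else:
                angle = (math.atan(dx / dy) / math.pi) * 180.0 + 90.0
            weight = math.sqrt(dx * dx + dy * dy)   # T = Sqrt(Dx*Dx + Dy*Dy)
            # Binning is an assumption: the text maps a gradient value of 150
            # to element G[A]149, approximated here as int(angle) - 1,
            # clamped to the valid index range [0, 179].
            idx = min(max(int(angle) - 1, 0), 179)
            G[idx] += weight
    return G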
Example 2
As shown in fig. 2, the present invention further provides a system for defining a picture feature value, comprising:
a preprocessing module, configured to preprocess the target picture and output a grayscale image with highlighted contours;
an array construction module, configured to construct an array G with 180 elements for recording the gradient histogram features of the grayscale image, the value of each element in the array G being 0 at initialization;
an assignment module, configured to traverse the pixels of the grayscale image, calculate the gradient value of each pixel, and access the corresponding element of the array G according to the gradient value; calculate the weight value of the pixel, add the weight value to the value of the corresponding element, and assign the result back to that element; and take the array G as the feature value of the target picture after the traversal.
Example 3
As shown in fig. 3, the present invention further provides a picture similarity comparison method, which comprises:
S21: acquire the feature value arrays G[A] and G[B] of the two pictures to be compared according to the above method for defining a picture feature value.
As before, the arrays G[A] and G[B] each contain 180 elements.
S22: set a similarity coefficient g with an initial value of 0.
S23: compare the elements of the arrays G[A] and G[B] one by one, where G[A]n is the nth element of array G[A] and G[B]n is the nth element of array G[B]: if G[A]n = G[B]n, assign the result of g+1 to the similarity coefficient g; otherwise, assign the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) to the similarity coefficient g.
In this step, the elements of the two arrays G[A] and G[B] are compared one by one in order. When the values of the elements at the same position are equal, the result of g+1 is assigned to the similarity coefficient g, i.e. g = g + 1. When they are not equal, the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) is assigned to the similarity coefficient g, i.e. g = g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)).
S24: assign g/180 to the similarity coefficient g; if the similarity coefficient is greater than a similarity threshold, the two pictures to be compared are considered similar.
After the comparison is completed, g/180 is assigned to the similarity coefficient g, i.e. g = g/180. Clearly, the value range of g is [0, 1]: 0 indicates that the two pictures are completely dissimilar, and 1 indicates that they are completely similar. In practice, two pictures are considered similar only when the similarity coefficient g is greater than a similarity threshold. In this embodiment, the similarity threshold is 0.81-0.85.
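A sketch of the comparison in this example follows, assuming the absolute-difference reading of the update formula (consistent with g lying in [0, 1]) and a threshold of 0.83 chosen from the stated 0.81-0.85 range; the function name similarity is illustrative.

import numpy as np

def similarity(GA: np.ndarray, GB: np.ndarray, threshold: float = 0.83) -> tuple:
    """Element-by-element comparison of two 180-element feature arrays.
    Returns (similarity coefficient g, whether the pictures count as similar)."""
    g = 0.0
    for a, b in zip(GA, GB):
        if a == b:
            g += 1.0
        else:
            # a != b implies max(a, b) > 0, so the division is safe;
            # the absolute difference is an assumption (see the lead-in).
            g += 1.0 - abs(a - b) / max(a, b)
    g /= 180.0
    return g, g > threshold

With the gradient_histogram sketch given earlier, a call such as similarity(gradient_histogram(img_a), gradient_histogram(img_b)) would yield the coefficient g and the similarity verdict; again, these names are only illustrative.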
Example 4
As shown in fig. 4, the present invention further provides a picture similarity comparison system, comprising:
a feature value acquisition module, configured to acquire the feature value arrays G[A] and G[B] of two pictures to be compared according to the method for defining a picture feature value of any one of claims 1-6;
a similarity coefficient initialization module, configured to set a similarity coefficient g with an initial value of 0;
a comparison module, configured to compare the elements of the arrays G[A] and G[B] one by one, where G[A]n is the nth element of array G[A] and G[B]n is the nth element of array G[B]: if G[A]n = G[B]n, to assign the result of g+1 to the similarity coefficient g; otherwise, to assign the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) to the similarity coefficient g;
and a judging module, configured to assign g/180 to the similarity coefficient g and, if the similarity coefficient is greater than a similarity threshold, to consider the two pictures to be compared as similar pictures.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that the above-mentioned preferred embodiment should not be construed as limiting the invention, and the scope of the invention should be defined by the appended claims. It will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the spirit and scope of the invention, and such modifications and adaptations are intended to be comprehended within the scope of the invention.

Claims (10)

1. A method for defining a picture feature value, comprising:
preprocessing a target picture, and outputting a grayscale image with highlighted contours;
constructing an array G with 180 elements for recording the gradient histogram features of the grayscale image, the value of each element in the array G being 0 at initialization;
traversing the pixels of the grayscale image, calculating the gradient value of each pixel, and accessing the corresponding element of the array G according to the gradient value; calculating the weight value of the pixel, adding the weight value to the value of the corresponding element, and assigning the result back to that element; and taking the array G as the feature value of the target picture after the traversal.
2. The method for defining a picture feature value according to claim 1, wherein the method for preprocessing the target picture comprises:
scaling the target picture to a threshold size;
performing color space conversion on the scaled picture to obtain a grayscale image;
identifying the contours in the grayscale image and acquiring the position information of the contours;
and highlighting the contours in the grayscale image according to their position information.
3. The method for defining a picture feature value according to claim 2, wherein: after the position information of a contour is acquired, the length and the amplitude of the contour are judged, and the contour is retained only when the length and the amplitude are respectively greater than a length threshold and an amplitude threshold.
4. The method for defining a picture feature value according to claim 1, wherein: when traversing the pixels of the grayscale image, only the pixels of the middle portion of the grayscale image are selected.
5. The method for defining a picture feature value according to claim 1, wherein the method for calculating the gradient value of a pixel comprises:
acquiring the gray value M(x, y) of the pixel;
using the formulas
Dx = M(x-3, y) - M(x+3, y)
Dy = M(x, y-3) - M(x, y+3)
to calculate the gray difference values along the x and y coordinates respectively, where Dx is the gray difference value along the x coordinate and Dy is the gray difference value along the y coordinate;
using the formula
R = ArcTan(Dx / Dy)
to calculate a radian value, where R is the radian value, Dx is the gray difference value along the x coordinate, and Dy is the gray difference value along the y coordinate;
using the formula
A = (R / π) * 180 + 90
to convert the radian value into an angle value, thereby obtaining the gradient value of the pixel, where A is the gradient value and R is the radian value; if Dy = 0, A is directly set to 90.
6. The method for defining a picture feature value according to claim 1, wherein the method for calculating the weight value of a pixel comprises:
using the formula T = Sqrt(Dx*Dx + Dy*Dy)
to calculate the weight value of the pixel, where T is the weight value, Dx is the gray difference value along the x coordinate, and Dy is the gray difference value along the y coordinate.
7. A system for defining a picture feature value, comprising:
a preprocessing module, configured to preprocess the target picture and output a grayscale image with highlighted contours;
an array construction module, configured to construct an array G with 180 elements for recording the gradient histogram features of the grayscale image, the value of each element in the array G being 0 at initialization;
an assignment module, configured to traverse the pixels of the grayscale image, calculate the gradient value of each pixel, and access the corresponding element of the array G according to the gradient value; calculate the weight value of the pixel, add the weight value to the value of the corresponding element, and assign the result back to that element; and take the array G as the feature value of the target picture after the traversal.
8. A picture similarity comparison method, comprising:
acquiring the feature value arrays G[A] and G[B] of two pictures to be compared according to the method for defining a picture feature value of any one of claims 1-6;
setting a similarity coefficient g with an initial value of 0;
comparing the elements of the arrays G[A] and G[B] one by one, where G[A]n is the nth element of array G[A] and G[B]n is the nth element of array G[B]: if G[A]n = G[B]n, assigning the result of g+1 to the similarity coefficient g; otherwise, assigning the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) to the similarity coefficient g;
and assigning g/180 to the similarity coefficient g, and if the similarity coefficient is greater than a similarity threshold, considering the two pictures to be compared as similar pictures.
9. The picture similarity comparison method according to claim 8, wherein the similarity threshold is 0.81-0.85.
10. A picture similarity comparison system, comprising:
a feature value acquisition module, configured to acquire the feature value arrays G[A] and G[B] of two pictures to be compared according to the method for defining a picture feature value of any one of claims 1-6;
a similarity coefficient initialization module, configured to set a similarity coefficient g with an initial value of 0;
a comparison module, configured to compare the elements of the arrays G[A] and G[B] one by one, where G[A]n is the nth element of array G[A] and G[B]n is the nth element of array G[B]: if G[A]n = G[B]n, to assign the result of g+1 to the similarity coefficient g; otherwise, to assign the value of g + (1 - |G[A]n - G[B]n| / MAX(G[A]n, G[B]n)) to the similarity coefficient g;
and a judging module, configured to assign g/180 to the similarity coefficient g and, if the similarity coefficient is greater than a similarity threshold, to consider the two pictures to be compared as similar pictures.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311780493.0A CN117710696A (en) 2023-12-22 2023-12-22 Method and system for defining picture characteristics and comparing similarity


Publications (1)

Publication Number Publication Date
CN117710696A 2024-03-15



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination