CN114202665A - Image similarity determining method and device, equipment and storage medium - Google Patents

Image similarity determining method and device, equipment and storage medium

Info

Publication number
CN114202665A
Authority
CN
China
Prior art keywords
image
pixel
pixels
target
determining
Prior art date
Legal status
Pending
Application number
CN202010893742.7A
Other languages
Chinese (zh)
Inventor
石云柯
Current Assignee
China Mobile Communications Group Co Ltd
China Mobile Suzhou Software Technology Co Ltd
Original Assignee
China Mobile Communications Group Co Ltd
China Mobile Suzhou Software Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by China Mobile Communications Group Co Ltd and China Mobile Suzhou Software Technology Co Ltd
Priority to CN202010893742.7A
Publication of CN114202665A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 Geometric image transformations in the plane of the image
    • G06T 3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 7/90 Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an image similarity determining method, apparatus, device and storage medium. The method includes: acquiring a first image and a second image that have each been subjected to grid division processing; acquiring a first pixel set and a second pixel set, respectively; if the pixel value of a pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship, determining that pixel in the first pixel set as a target pixel; and determining the ratio of the number of target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image. Because only the pixels located at the grid intersection points are sampled from all the pixels of the first image and the second image, and the image similarity is determined from the analysis of those sampled pixels alone, the efficiency and accuracy of determining the image similarity can be improved.

Description

Image similarity determining method and device, equipment and storage medium
Technical Field
The present application relates to computer technologies, and in particular, to, but not limited to, a method, an apparatus, a device, and a storage medium for determining image similarity.
Background
Image recognition refers to the technology of processing, analyzing and understanding images with a computer in order to recognize targets and objects of various kinds. In automatic or semi-automatic testing of a software User Interface (UI), strong support from image recognition technology is required, because a large number of different controls must be recognized and the screen display results must be checked.
At present, the similarity between different images is usually determined by identifying feature points of an image and generating an image fingerprint, for example through a hash method (Hash) or the Scale-invariant feature transform (SIFT) algorithm, and then comparing the image fingerprints of the different images. However, such methods of determining image similarity involve complicated calculation.
Disclosure of Invention
In view of this, embodiments of the present application provide an image similarity determining method, apparatus, device and storage medium, which are used to solve the problem of complicated calculation when determining image similarity.
In a first aspect, an embodiment of the present application provides an image similarity determining method, the method including: acquiring a first image and a second image that have each been subjected to grid division processing, wherein at least one edge of the first image is equal in size to at least one edge of the second image; acquiring a first pixel set and a second pixel set, respectively, the first pixel set comprising pixels at at least one grid intersection position in the first image, and the second pixel set comprising pixels at at least one grid intersection position in the second image; if the pixel value of a pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship, determining that pixel in the first pixel set as a target pixel; and determining the ratio of the number of target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
In some embodiments, the method further comprises: and respectively adjusting the sizes of the first image and the second image so as to enable the size of at least one edge of the first image and the second image to be equal. By adjusting the sizes of the first image and the second image to be equal to each other at least in one edge, when the first image and the second image are two images formed before and after the scaling of the same image, the pixels compared in the first pixel set and the second pixel set are ensured to be the same pixels before and after the scaling, so that the similarity of the first image and the second image is more accurately determined.
In some embodiments, the resizing the first image and the second image, respectively, comprises: adjusting both the length of the first image and the length of the second image to a target length, and/or adjusting both the width of the first image and the width of the second image to a target width; wherein the target length is a smaller length of the first image and the length of the second image, and the target width is a smaller width of the first image and the width of the second image. The first image and the second image are adjusted to the smaller length in the lengths corresponding to the first image and the second image, and/or the first image and the second image are adjusted to the smaller width in the widths corresponding to the first image and the second image, so that the influence on the definition of the images caused by the size of the amplified images can be avoided, the determination of the similarity of the first image and the second image is more accurate, and the diversity of the image size adjustment is increased.
In some embodiments, the method further comprises: the resolutions of the first image and the second image are respectively adjusted so that the resolutions of the first image and the second image are equal. By unifying the resolutions of the two images, the problem of inaccurate determination of the similarity of different images caused by unequal resolutions of the different images can be avoided.
In some embodiments, the adjusting the resolution of the first image and the second image, respectively, comprises: adjusting both the resolution of the first image and the resolution of the second image to a target resolution; wherein the target resolution is a smaller resolution of the first image and the resolution of the second image. The first image and the second image are adjusted to the lower resolution ratio of the resolution ratios corresponding to the first image and the second image, so that the influence on the definition of the image caused by the fact that the resolution ratio of the image is increased can be avoided, and the determination of the similarity of the first image and the second image is more accurate.
In some embodiments, the method further comprises: dividing the first image and the second image into n x m areas respectively so as to perform binning processing on the first image and the second image respectively; wherein n and m are integers greater than 1. By carrying out the binning processing on the first image and the second image respectively, respectively sampling and selecting a first pixel set and a second pixel set which are only located at the position of a binning intersection point from all pixels in the first image and the second image, and analyzing and comparing only corresponding pixels in the first pixel set and the second pixel set without analyzing other pixels in the first image and the second image, the efficiency of determining the similarity between the first image and the second image can be improved.
In some embodiments, the m is equal to the n. If the second image is an image obtained by rotating the first image by ninety degrees counterclockwise/clockwise, n and m are set to be equal, so that the pixels compared in the first pixel set and the second pixel set can be the same pixel before and after rotation, and the determination of the similarity of the first image and the second image is more accurate.
In some embodiments, the pixel values of the pixels include the R value, G value and B value of the pixels, and the pixel value of a pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy the set relationship if: the difference between the R values of the pixel in the first pixel set and the corresponding pixel in the second pixel set is smaller than a set first threshold, the difference between their G values is smaller than a set second threshold, and the difference between their B values is smaller than a set third threshold. By determining separately whether the differences between the R values, G values and B values are smaller than the corresponding preset thresholds, and determining the pixel as a target pixel only when all of them are, the accuracy of determining the similarity of the first image and the second image can be improved.
In a second aspect, an embodiment of the present application provides an image similarity determining apparatus, including: a first acquisition module, configured to acquire a first image and a second image that have each been subjected to grid division processing, wherein at least one edge of the first image is equal in size to at least one edge of the second image; a second acquisition module, configured to acquire a first pixel set and a second pixel set, respectively, the first pixel set comprising pixels at at least one grid intersection position in the first image, and the second pixel set comprising pixels at at least one grid intersection position in the second image; a first determining module, configured to determine a pixel in the first pixel set as a target pixel if the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship; and a second determining module, configured to determine the ratio of the number of target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
In a third aspect, an embodiment of the present application provides a computer device, including a memory and a processor, where the memory stores a computer program that is executable on the processor, and the processor implements, when executing the computer program, the steps in the image similarity determination method according to any one of the embodiments of the present application.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the image similarity determination method according to any one of the embodiments of the present application.
In the embodiment of the application, the first image and the second image are subjected to the binning processing respectively, so that a first pixel set and a second pixel set which are only located at the positions of the intersection points of the binning points are sampled and selected from all pixels in the first image and the second image respectively, and when the pixel values of the pixels in the first pixel set and the corresponding pixels in the second pixel set meet the set relationship, the similarity between the first image and the second image is determined according to the ratio of the number of the pixels in the first pixel set which meet the set relationship to the total number of the pixels in the first pixel set, so that the convenience and the accuracy of determining the similarity between the first image and the second image can be improved.
Drawings
FIG. 1 is a schematic diagram of a first image and a second image according to an embodiment of the present application;
fig. 2 is a schematic flowchart of an image similarity determining method according to an embodiment of the present application;
FIG. 3 is a diagram illustrating a method for resizing a first image and a second image according to an embodiment of the present application;
fig. 4 is a schematic diagram illustrating a binning method for processing an image according to an embodiment of the present disclosure;
FIG. 5 is a diagram illustrating a target pixel determination method according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating a structure of an image similarity determination apparatus according to an embodiment of the present disclosure;
fig. 7 is a hardware entity diagram of a computer device according to an embodiment of the present application.
Detailed Description
The technical solution of the present application is further elaborated below with reference to the drawings and the embodiments.
In this embodiment, an application scenario for determining image similarity is described first. Fig. 1 is a schematic diagram of a first image and a second image in an embodiment of the present application. As shown in fig. 1, during UI automated testing of software under test, when the test result is verified, a screenshot operation may first be performed on the test result interface of the software under test to obtain the actual test result as a first image (which may be image Y), and the expected test result to be compared with image Y is obtained as a second image (which may be image X). Next, image X and image Y may respectively undergo the resizing processing shown in fig. 3 and the grid division processing shown in fig. 4, and a first pixel set and a second pixel set at the grid intersection positions are obtained. Then, if a pixel in the first pixel set and the corresponding pixel in the second pixel set satisfy the set relationship, that pixel in the first pixel set is determined as a target pixel. The similarity of image X and image Y is then determined from the number of target pixels and the total number of pixels in the first pixel set. Finally, whether the expected test result is consistent with the actual test result can be determined according to the similarity.
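As a rough illustration of this verification step, the following Python sketch shows how such a similarity score could gate the comparison of the expected and actual screenshots. The file names, the 0.95 pass threshold and the image_similarity routine (sketched near the end of this description) are illustrative assumptions, not details fixed by the application.

    import numpy as np
    from PIL import Image

    # Hypothetical verification step in a UI automated test.
    expected = np.asarray(Image.open("expected_result.png").convert("RGB"))   # image X
    actual = np.asarray(Image.open("actual_screenshot.png").convert("RGB"))   # image Y

    score = image_similarity(expected, actual, alpha=8, beta=10)
    assert score >= 0.95, f"UI mismatch: similarity {score:.2%} is below the pass threshold"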
The embodiment proposes an image similarity determining method, which is applied to a computer device, and the functions implemented by the method may be implemented by a processor in the computer device calling a program code, which may be stored in a computer storage medium, so that the computer device at least includes the processor and the storage medium.
Fig. 2 is a schematic flow chart of an implementation of the image similarity determining method according to the embodiment of the present application, and as shown in fig. 2, the method includes:
step 202: acquiring a first image and a second image which are subjected to cell division processing respectively;
wherein the first image and the second image may have at least one edge of equal size; the size of an image describes how large the image is, and may be expressed in pixels or in length units (centimeters, millimeters, inches, and the like), and may refer either to the length of the image or to its width.
Step 204: respectively acquiring a first pixel set and a second pixel set; the first set of pixels comprises pixels at least one bin intersection position in the first image, and the second set of pixels comprises pixels at least one bin intersection position in the second image;
referring to fig. 4, the image X has 22 × 15 to 330 pixels in total, the pixels are inseparable elements in the image, each pixel has a clear position and an assigned color value, the position and the color value of the pixel included in the image determine the appearance of the image, only the pixel at the position of the intersection of the squares of the image X can be sampled from 330 pixels to analyze the similarity between different images, the pixel at the position of the intersection of the squares of the image X is represented by a black dot, the other pixels except the position of the intersection of the squares of the image X are represented by gray dots, if the image X is the first image, the set of 64 pixels at the position of the intersection of the 64 squares is the first pixel set, and if the image X is the second image, the set of 64 pixels at the position of the intersection of the 64 squares is the second pixel set.
Step 206: determining a pixel in the first pixel value set as a target pixel if the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship;
the method for determining the correspondence between the pixels in the first pixel set and the pixels in the second pixel set may include the following steps 2061 to 2063: step 2061: determining the coordinates of each grid intersection point position in the first image as first position information of pixels corresponding to the grid intersection point position in the first pixel set; step 2062: determining the coordinates of each cell intersection point position in the second image as second position information of pixels corresponding to the cell intersection point position in the second pixel set; step 2063: and determining the corresponding relation between the pixel with the first position information and the pixel with the second position information according to the corresponding relation between the first position information and the second position information.
Referring to fig. 4, assuming that the first image and the second image are each subjected to grid division processing and are both divided into 7 × 7 regions, the coordinates of the grid intersection positions in the first image may be {(0,0), (1,0), …, (7,0), (0,1), (1,1), …, (0,7), …, (7,7)}, and the position information of a pixel in the first pixel set may be the coordinate of the grid intersection position at which that pixel is located in the first image. Correspondingly, the coordinates of the grid intersection positions in the second image may be {(0,0), (1,0), …, (7,0), (0,1), (1,1), …, (0,7), …, (7,7)}, and the position information of a pixel in the second pixel set may be the coordinate of the grid intersection position at which that pixel is located in the second image. Pixels having the same position information in the first pixel set and the second pixel set may be determined as corresponding pixels. For example, if the coordinate of a grid intersection position in the first image is (1,0), the pixel at that intersection is a first pixel and (1,0) is determined as the first position information of the first pixel; if the coordinate of a grid intersection position in the second image is (1,0), the pixel at that intersection is a second pixel and (1,0) is determined as the second position information of the second pixel. Since the first position information and the second position information are the same, the first pixel having the first position information and the second pixel having the second position information may be determined as corresponding pixels; that is, the pixel whose position information is (1,0) in the first pixel set corresponds to the pixel whose position information is (1,0) in the second pixel set.
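Continuing the sketch above, corresponding pixels can then be paired simply by matching grid coordinates (first_image and second_image are assumed to be the two arrays already divided into 7 × 7 regions; sample_grid is the function from the previous snippet):

    # Pair the pixels that sit at the same grid coordinate in both images.
    first_set = sample_grid(first_image, 7, 7)
    second_set = sample_grid(second_image, 7, 7)
    pairs = {coord: (first_set[coord], second_set[coord]) for coord in first_set}
    # e.g. pairs[(1, 0)] holds the pixel at intersection (1,0) of the first image
    # together with its corresponding pixel in the second image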
Image modes may include a bitmap mode, a grayscale mode, a duotone mode, an RGB (Red, Green, Blue) mode, a CMYK (Cyan, Magenta, Yellow, Black) mode, and so on; the set relationship between the pixel values of corresponding pixels in the first pixel set and the second pixel set differs depending on the image mode. When the image mode is the grayscale mode, the set relationship may be that the grayscale values of corresponding pixels in the first pixel set and the second pixel set satisfy a set relationship. When the image mode is the RGB mode, the set relationship may be that at least two of the R values, G values and B values of corresponding pixels in the first pixel set and the second pixel set satisfy the set relationships, or that the R values, G values and B values of the corresponding pixels all satisfy the set relationships.
Referring to fig. 5, assuming that the image X is a first image and the image Y is a second image, and that the pixel values of the 20 pixels in the upper left corner of the first pixel set and the pixel values of the corresponding 20 pixels in the upper left corner of the second pixel set satisfy the set relationship, the 20 pixels in the upper left corner of the first pixel set may be determined as target pixels.
Step 208: and determining the ratio of the number of the target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
Referring to fig. 5, assuming that the image X is a first image, the total number of pixels in the first pixel set is 64, and the ratio of the number of target pixels to the total number of pixels in the first pixel set is 20/64 × 100% = 31.25%; that is, the similarity between the first image and the second image is 31.25%.
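In code, this final step is a single division; the numbers below are the ones from the fig. 5 example:

    target_count = 20           # pixels of the first pixel set that satisfy the set relationship
    total_count = 64            # total number of pixels in the first pixel set
    similarity = target_count / total_count    # 0.3125, i.e. 31.25 %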
In the embodiment of the application, the first image and the second image are subjected to binning processing respectively, so that a first pixel set and a second pixel set which are only located at the positions of the intersection points of the binning are sampled and selected from all pixels in the first image and the second image respectively, and when the pixel values of the pixels in the first pixel set and the corresponding pixels in the second pixel set meet a set relationship, the similarity between the first image and the second image is determined according to the ratio of the number of the pixels in the first pixel set which meet the set relationship to the total number of the pixels in the first pixel set, so that the convenience and the accuracy of determining the similarity between the first image and the second image can be improved.
An embodiment of the present application further provides an image similarity determining method, where the method may include steps 302 to 314:
step 302: respectively adjusting the sizes of the first image and the second image so as to enable the size of at least one edge of the first image to be equal to that of at least one edge of the second image;
when the unit of the size of the image is pixel, the original size of the first image may be 1920 × 1080, the original size of the second image may be 960 × 540, and the size of the first image may be reduced or the size of the second image may be enlarged so that the size of at least one edge of the first image is equal to that of at least one edge of the second image; the size of the image may be set by some image processing software.
In one embodiment, the first image may be resized to 960 x 1080, leaving the second image unchanged in size, so that the first and second images are equal in size in length.
In another embodiment, the first image may be resized to 1920 x 540, leaving the second image unchanged, so that the widths of the first and second images are equal in size.
In yet another embodiment, the first image may be resized to 960 x 540, leaving the second image unchanged, so that the first and second images are equal in both length and width dimensions.
In step 302 of this embodiment of the application, assuming that the first image and the second image are two images formed before and after the scaling of the same image, the size of at least one side of the first image and the size of at least one side of the second image are adjusted to be equal, so that the pixels compared in the first pixel set and the second pixel set can be ensured to be the same pixels before and after the scaling, and the determination of the similarity between the first image and the second image is more accurate.
Step 304: adjusting the resolutions of the first image and the second image respectively so that the resolutions of the first image and the second image are equal;
the image resolution may be the number of pixels included in a unit area, and the unit is dpi (dots per inch), and the image resolution determines the degree of sharpness of the image; the resolution of the first image may be 687 × 480 and the resolution of the second image may be 640 × 513, and the resolution of the first image may be adjusted lower or the resolution of the second image may be adjusted higher to equalize the resolution of the first image and the second image; the resolution of the image may be set by some image processing software.
In one embodiment, the resolution of the first image may be adjusted to 640 x 513, and the resolution of the second image may be kept unchanged to equalize the resolutions of the first and second images.
In another embodiment, the resolution of the second image may be adjusted to 687 × 480, and the resolution of the first image may be kept unchanged to equalize the resolutions of the first and second images.
In yet another embodiment, the resolution of both the first image and the second image may be adjusted to 687 x 513 to equalize the resolution of the first image and the second image.
In yet another embodiment, the resolution of both the first image and the second image may be adjusted to 640 x 480 so that the resolution of the first image and the second image are equal.
In step 304 of the embodiment of the present application, assuming that the first image and the second image are an interface screenshot of an actual test result and an expected test result in the UI automatic test process, respectively, since the resolution of the interface screenshot is generally low, the resolutions of the interface screenshot of the expected test result and the interface screenshot of the actual test result need to be adjusted to be equal, so that the problem of inaccurate determination of the similarity of different images due to unequal resolutions of different images can be avoided.
Step 306: dividing the first image and the second image into n x m areas respectively so as to perform binning processing on the first image and the second image respectively; wherein n and m are integers greater than 1.
In one embodiment, n +1 laterally equidistant grid lines and m +1 longitudinally equidistant grid lines may be established on the first image to divide the first image into n × m regions; similarly, n +1 transverse equidistant grid lines and m +1 longitudinal equidistant grid lines may also be established on the second image to divide the second image into n × m regions.
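Under the same equal-spacing assumption as the earlier sampling sketch, the grid line positions for this division can be computed directly:

    import numpy as np

    def grid_lines(height, width, n, m):
        # Positions of the n+1 horizontal and m+1 vertical equidistant grid lines
        # that divide a height x width image into n x m regions. Rounding the
        # positions to whole pixels is an assumption of this sketch.
        horizontal = np.rint(np.linspace(0, height - 1, n + 1)).astype(int)
        vertical = np.rint(np.linspace(0, width - 1, m + 1)).astype(int)
        return horizontal, vertical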
In step 306 of this embodiment of the present application, by performing binning processing on the first image and the second image respectively, so as to sample and select, from all pixels in the first image and the second image, the first pixel set and the second pixel set only at the position of the intersection of the binning points respectively, and perform analysis and comparison on only corresponding pixels in the first pixel set and the second pixel set, without performing analysis on other pixels in the first image and the second image, thereby improving the efficiency of determining the similarity between the first image and the second image.
In one embodiment, n may be equal to m, and referring to fig. 4, 8 transverse equidistant grid lines and 8 longitudinal equidistant grid lines may be established on the image X to divide the image X into 7 × 7 regions; if the second image is an image obtained by rotating the first image by ninety degrees counterclockwise/clockwise, n and m are set to be equal, so that the pixels compared in the first pixel set and the second pixel set can be the same pixel before and after rotation, and the determination of the similarity of the first image and the second image is more accurate.
Step 308: acquiring a first image and a second image which are subjected to cell division processing respectively, wherein the size of at least one edge of the first image is equal to that of at least one edge of the second image;
step 310: respectively acquiring a first pixel set and a second pixel set; the first set of pixels comprises pixels at least one bin intersection position in the first image, and the second set of pixels comprises pixels at least one bin intersection position in the second image;
step 312: determining a pixel in the first pixel value set as a target pixel if the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship;
step 314: and determining the ratio of the number of the target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
An embodiment of the present application further provides an image similarity determining method, where the method may include steps 402 to 420:
step 402: adjusting the length of the first image and the length of the second image to a target length, wherein the target length is the smaller of the length of the first image and the length of the second image;
when the unit of the size of the image is a pixel, the original size of the first image may be 1920 × 1080, the original size of the second image may be 960 × 540, the determined target length is 960, which is the smaller of the lengths corresponding to the first image and the second image, the adjusted size of the first image is 960 × 1080, and the adjusted size of the second image is 960 × 540.
In step 402 of the embodiment of the present application, the first image and the second image are adjusted to the smaller length of the lengths corresponding to the first image and the second image, so that the influence on the definition of the image due to the size of the enlarged image can be avoided, and the determination of the similarity between the first image and the second image is more accurate.
Step 404: adjusting the resolution of the first image and the resolution of the second image to a target resolution, wherein the target resolution is the smaller resolution of the first image and the resolution of the second image;
the resolution of the first image may be 687 × 480, the resolution of the second image may be 640 × 513, and the determined target resolution is 640 × 513 which is the smaller of the resolutions corresponding to the first image and the second image, so that the adjusted resolutions of the first image and the second image are 640 × 513.
In step 404 of the embodiment of the present application, the first image and the second image are adjusted to the lower of their two resolutions, so that the loss of image definition caused by increasing the resolution of an image can be avoided, and the determination of the similarity of the first image and the second image is more accurate.
Step 406: dividing the first image and the second image into n x m areas respectively so as to perform binning processing on the first image and the second image respectively; wherein n and m are integers greater than 1.
Wherein m is equal to n.
Step 408: acquiring a first image and a second image which are subjected to cell division processing respectively, wherein the size of at least one edge of the first image is equal to that of at least one edge of the second image;
step 410: respectively acquiring a first pixel set and a second pixel set; the first set of pixels comprises pixels at least one bin intersection position in the first image, and the second set of pixels comprises pixels at least one bin intersection position in the second image;
step 412: determining the coordinates of each grid intersection point position in the first image as first position information of pixels corresponding to the grid intersection point position in the first pixel set;
step 414: determining the coordinates of each cell intersection point position in the second image as second position information of pixels corresponding to the cell intersection point position in the second pixel set;
step 416: determining a corresponding relation between the pixel with the first position information and the pixel with the second position information according to the corresponding relation between the first position information and the second position information;
step 418: determining a pixel in the first set of pixels as a target pixel if a difference between R values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set first threshold, a difference between G values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set second threshold, and a difference between B values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set third threshold;
step 420: and determining the ratio of the number of the target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
An embodiment of the present application further provides an image similarity determining method, where the method may include steps 502 to 514:
step 502: adjusting both the width of the first image and the width of the second image to a target width, wherein the target width is the smaller of the width of the first image and the width of the second image;
when the unit of the size of the image is pixel, the original size of the first image may be 1920 × 1080, the original size of the second image may be 960 × 540, and the determined target width is 540 which is smaller of the widths corresponding to the first image and the second image, the adjusted size of the first image is 1920 × 540, and the adjusted size of the second image is 960 × 540.
In step 502 of the embodiment of the present application, by adjusting the first image and the second image to the smaller width of the widths corresponding to the first image and the second image, it is able to avoid that the size of the enlarged image affects the definition of the image, so that the determination of the similarity between the first image and the second image is more accurate, and the diversity of size adjustment of the first image and the second image is increased.
Step 504: adjusting the resolution of the first image and the resolution of the second image to a target resolution, wherein the target resolution is the smaller resolution of the first image and the resolution of the second image;
step 506: dividing the first image and the second image into n x m areas respectively so as to perform binning processing on the first image and the second image respectively; wherein n and m are integers greater than 1.
Wherein m is equal to n.
Step 508: acquiring a first image and a second image which are subjected to cell division processing respectively, wherein the size of at least one edge of the first image is equal to that of at least one edge of the second image;
step 510: respectively acquiring a first pixel set and a second pixel set; the first set of pixels comprises pixels at least one bin intersection position in the first image, and the second set of pixels comprises pixels at least one bin intersection position in the second image;
step 512: determining a pixel in the first set of pixels as a target pixel if a difference between R values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set first threshold, a difference between G values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set second threshold, and a difference between B values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set third threshold;
the method for determining the correspondence between the pixels in the first pixel set and the pixels in the second pixel set may include the following steps 5121 to 5123: step 5121: determining the coordinates of each grid intersection point position in the first image as first position information of pixels corresponding to the grid intersection point position in the first pixel set; step 5122: determining the coordinates of each cell intersection point position in the second image as second position information of pixels corresponding to the cell intersection point position in the second pixel set; step 5123: and determining the corresponding relation between the pixel with the first position information and the pixel with the second position information according to the corresponding relation between the first position information and the second position information.
Step 514: and determining the ratio of the number of the target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
An embodiment of the present application further provides an image similarity determining method, where the method may include steps 602 to 614:
step 602: adjusting both the length of the first image and the length of the second image to a target length, and adjusting both the width of the first image and the width of the second image to a target width, wherein the target length is the smaller of the length of the first image and the length of the second image, and the target width is the smaller of the width of the first image and the width of the second image;
when the unit of the size of the image is pixel, the original size of the first image may be 1920 × 1080, the original size of the second image may be 960 × 540, the determined target length is 960, which is the smaller of the lengths corresponding to the first image and the second image, and the target width is 540, which is the smaller of the widths corresponding to the first image and the second image, so that the adjusted sizes of the first image and the second image are both 960 × 540.
In step 602 of the embodiment of the present application, by adjusting the first image and the second image to the smaller of the two corresponding lengths and the smaller of the two corresponding widths, it is possible to avoid that the size of the enlarged image affects the sharpness of the image, so that the determination of the similarity between the first image and the second image is more accurate, and the diversity of the size adjustment of the first image and the second image is increased.
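A sketch of this adjustment using the Pillow library (the choice of Pillow and of its default resampling filter are implementation assumptions, not details prescribed by the application):

    from PIL import Image

    def resize_to_common_size(img_a, img_b):
        # Shrink both images to the smaller of the two lengths and the smaller of
        # the two widths, so that e.g. 1920 x 1080 and 960 x 540 both become 960 x 540.
        target = (min(img_a.width, img_b.width), min(img_a.height, img_b.height))
        return img_a.resize(target), img_b.resize(target)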
Step 604: adjusting the resolution of the first image and the resolution of the second image to a target resolution, wherein the target resolution is the smaller resolution of the first image and the resolution of the second image;
step 606: dividing the first image and the second image into n x m areas respectively so as to perform binning processing on the first image and the second image respectively; wherein n and m are integers greater than 1.
Wherein m is equal to n.
Step 608: acquiring a first image and a second image which are subjected to cell division processing respectively, wherein the size of at least one edge of the first image is equal to that of at least one edge of the second image;
step 610: respectively acquiring a first pixel set and a second pixel set; the first set of pixels comprises pixels at least one bin intersection position in the first image, and the second set of pixels comprises pixels at least one bin intersection position in the second image;
step 612: determining a pixel in the first set of pixels as a target pixel if a difference between R values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set first threshold, a difference between G values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set second threshold, and a difference between B values of the pixel in the first set of pixels and a corresponding pixel in the second set of pixels is less than a set third threshold;
the method for determining the correspondence between the pixels in the first pixel set and the pixels in the second pixel set may include the following steps 6121 to 6123: step 6121: determining the coordinates of each grid intersection point position in the first image as first position information of pixels corresponding to the grid intersection point position in the first pixel set; step 6122: determining the coordinates of each cell intersection point position in the second image as second position information of pixels corresponding to the cell intersection point position in the second pixel set; step 6123: and determining the corresponding relation between the pixel with the first position information and the pixel with the second position information according to the corresponding relation between the first position information and the second position information.
Assuming that the image mode is an RGB mode, the RGB value of a first pixel in the first pixel set is (122, 164, 64), the RGB value of the corresponding pixel in the second pixel set (for example, a second pixel) is (115, 160, 60), and the first threshold, the second threshold and the third threshold are all 10, then since the difference 7 between 122 and 115 is less than the first threshold, the difference 4 between 164 and 160 is less than the second threshold, and the difference 4 between 64 and 60 is less than the third threshold, the first pixel is determined as a target pixel.
In step 612 of the embodiment of the present application, it is determined whether the difference values between the R value, the G value, and the B value are smaller than corresponding preset threshold values, and when the difference values are smaller than the corresponding preset threshold values, the pixel is determined as a target pixel, so that the accuracy of determining the similarity between the first image and the second image can be improved.
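The worked example above corresponds to a per-pixel check such as the following sketch, where the three thresholds are parameters and the channel values are assumed to be plain integers (values taken from numpy uint8 arrays should be converted first to avoid wrap-around on subtraction):

    def is_target_pixel(p1, p2, t_r=10, t_g=10, t_b=10):
        # True if corresponding pixels p1 = (R, G, B) and p2 = (R, G, B) differ by
        # less than the set threshold in every color channel.
        return (abs(p1[0] - p2[0]) < t_r and
                abs(p1[1] - p2[1]) < t_g and
                abs(p1[2] - p2[2]) < t_b)

    is_target_pixel((122, 164, 64), (115, 160, 60))   # True: the differences are 7, 4 and 4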
Step 614: and determining the ratio of the number of the target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
In the related art, a hash algorithm is generally used for image recognition: the image is reduced to a set size, removing image detail and keeping only basic information such as structure and brightness, discarding differences caused by different sizes and proportions, and simplifying the colors. The reduced image is then converted to grayscale (some input images are single-channel grayscale images, some are RGB three-channel color images, some are RGBA (Red, Green, Blue, Alpha) four-channel color images; images that are not single-channel are converted to single-channel grayscale values). The average grayscale value of all pixels is calculated, and the grayscale value of each pixel is compared with the average: a value greater than or equal to the average is recorded as 1, and a value less than the average is recorded as 0. The image fingerprint formed by these hash values is calculated, and the similarity between different images is determined by comparing the fingerprints of the different images.
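For comparison, a minimal average-hash sketch along the lines described above, assuming Pillow for the resize and grayscale steps; the 8 × 8 reduced size and the Hamming-distance comparison are common choices in such implementations rather than details of this application:

    import numpy as np
    from PIL import Image

    def average_hash(img, hash_size=8):
        # Reduce the image, convert it to grayscale, and record for each pixel
        # whether it is at or above the mean grayscale value.
        small = np.asarray(img.convert("L").resize((hash_size, hash_size)), dtype=np.float64)
        return (small >= small.mean()).flatten()

    def fingerprint_distance(h1, h2):
        # Hamming distance between two fingerprints; smaller means more similar.
        return int(np.count_nonzero(h1 != h2))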
In the related art, image recognition may also be performed with the Scale-invariant feature transform (SIFT) algorithm. First, scale-space extremum detection is performed: image locations are searched over all scales, and potential interest points that are invariant to scale and rotation are identified by a difference-of-Gaussians function. Second, keypoint localization is performed: at each candidate location, the position and scale are determined by fitting a fine model, and keypoints are selected according to their degree of stability. Then, orientation assignment is performed: one or more orientations are assigned to each keypoint location based on the local gradient directions of the image, and all subsequent operations on the image data are performed relative to the orientation, scale and location of the keypoints, which provides invariance to these transformations. Finally, keypoint description is performed: local gradients of the image are measured at the selected scale in a neighborhood around each keypoint, and these gradients are transformed into a representation that tolerates relatively large local shape deformation and illumination variation.
Whether the hash algorithm or the scale-invariant feature transform algorithm is used, a complex image processing algorithm is employed to identify feature points of an image and generate an image fingerprint. Such image processing places high demands on device performance, involves many calculation steps, occupies considerable resources and takes a long time.
Generally, in order to make a person's face look finer and smoother and to remove spots, flaws or blemishes from the skin, the face may be smoothed in some image processing software using a Gaussian blur command or the like. Gaussian blur, also known as Gaussian smoothing, is a processing effect widely used in image processing software, commonly applied to reduce image noise and the level of detail. Both the hash algorithm and the scale-invariant feature transform algorithm rely on the details of the images; if the similarity of two images before and after Gaussian blur processing needs to be judged, or at least one of the two images has undergone Gaussian blur processing, it may be difficult for these algorithms (the hash algorithm or the scale-invariant feature transform algorithm) to judge the similarity of the two images, and when the image similarity does not need to be judged with high precision, the cost-effectiveness of the hash algorithm and the scale-invariant feature transform algorithm is low.
According to the method and the device, by combining equidistant sampling with a color value difference threshold, and with the sampling coefficient and the color value difference threshold adjusted appropriately, the similarity of image data after Gaussian blur smoothing processing can be judged with a certain precision.
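As a rough illustration of this claim (not a measurement reported in the application), one could compare an image with a Gaussian-blurred copy of itself using the similarity routine sketched after step 712 below; scipy provides the smoothing here, and all names and parameter values are illustrative assumptions:

    import numpy as np
    from PIL import Image
    from scipy.ndimage import gaussian_filter

    original = np.asarray(Image.open("screenshot.png").convert("RGB"), dtype=np.float64)
    blurred = gaussian_filter(original, sigma=(2, 2, 0))   # smooth rows and columns, not channels

    # image_similarity is the routine sketched after step 712 below.
    score = image_similarity(original.astype(np.uint8), blurred.astype(np.uint8),
                             alpha=8, beta=10)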
The embodiment of the application can be applied to image recognition in Artificial Intelligence (AI), and image recognition is performed by judging the similarity between a reference image and an image to be recognized.
The embodiment of the application can also be applied to semi-automatic UI testing, and in particular to three stages of a UI test. First, automatic generation of test scripts: the controls contained in a software interface are automatically located and identified through image recognition so as to generate a test script. Second, during the test process, a screenshot of the software under test is taken and an image recognition algorithm is used to recognize whether the screenshot contains a predefined operable control; if it does, a control instruction is triggered, so that image recognition guides the test process. Third, verification of the test result: a screenshot operation is performed on the interface of the software under test and the screenshot is matched against the expected result using image recognition, so that the test result is obtained automatically; by comparing the actual test result with the expected test result, the ability of a UI designer can also be evaluated, improving production quality to a certain extent.
An image similarity determining method according to an embodiment of the present application may include steps 702 to 712:
step 702: inputting an original image X and an image Y needing to be compared into a system;
step 704: as in fig. 3, scaling the two images so that one edge of each image is equal to the length of the other image (i.e., scaling image X and image Y to have at least one edge of equal size);
step 706: taking the minimum resolution of the two images, and unifying the images into the resolution;
step 708: as shown in fig. 4, the sampling points are respectively divided into (dividing both image X and image Y into (α -1) × (α -1) regions according to the density coefficient α, where α is a user-defined variable, and the sampling step 708 may include steps 7081 to 7085:
step 7081: establishing alpha transverse equidistant grid lines on the image;
step 7082: establishing alpha longitudinal equidistant grid lines on the image;
step 7083: selecting pixels at the intersection points of the grid lines as samples (a first pixel set and a second pixel set), and if the intersection points are located at the pixel corners, selecting the pixels above the left of the intersection points;
step 7084: describing the positions of the pixel points by a rectangular coordinate system;
step 7085: recording a total number of samples M (total number of pixels in the first set of pixels, and total number of pixels in the second set of pixels);
step 710: as shown in fig. 5, the pixel color value similarity determination includes: comparing the RGB color values of the pixel points of the corresponding coordinate points, if the difference between the three values R, G, B is not more than beta (namely the first threshold, the second threshold and the third threshold are beta), judging the pixel points to be similar (target pixel), otherwise, judging the pixel points to be dissimilar, wherein the beta is a sensitivity coefficient defined by a user. The number of all similar points judged to be similar is recorded as N (the number of target pixels);
step 712: and calculating the similarity, wherein the similarity of the image X and the image Y is N/M100%.
According to the method and the device, by combining equidistant sampling with a color value difference threshold, the similarity of image data after Gaussian blur smoothing can be judged with a certain precision, provided the coefficients are tuned appropriately.
In step 708 of the embodiment of the present application, sampling is performed by dividing the image with equal numbers of horizontal and vertical grid lines; the spacings of the horizontal and vertical lines may differ, but the number of grid cells is the same, so the determination can still succeed even if the image has been stretched in length or width.
The method and the device place low requirements on device performance, involve simple calculation and occupy few resources, and offer high cost-effectiveness when the image similarity does not need to be judged with high precision.
Based on the foregoing embodiments, the present application provides an image similarity determining apparatus, where the apparatus includes units and modules included in the units, and may be implemented by a processor in a computer device; of course, the implementation can also be realized through a specific logic circuit; in the implementation process, the processor may be a Central Processing Unit (CPU), a Microprocessor Unit (MPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), or the like.
Fig. 6 is a schematic structural diagram of the image similarity determining apparatus in the embodiment of the present application, and as shown in fig. 6, the apparatus 600 includes a first obtaining module 601, a second obtaining module 602, a first determining module 603, and a second determining module 604, where:
a first obtaining module 601, configured to obtain a first image and a second image that are subjected to binning processing respectively;
a second obtaining module 602, configured to obtain the first pixel set and the second pixel set respectively; the first set of pixels comprises pixels at least one bin intersection position in the first image, and the second set of pixels comprises pixels at least one bin intersection position in the second image;
a first determining module 603, configured to determine a pixel in the first pixel set as a target pixel if the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship;
a second determining module 604, configured to determine a ratio of the number of the target pixels to the total number of pixels in the first pixel set as a similarity between the first image and the second image.
In the embodiment of the application, the first image and the second image are subjected to the binning processing respectively, so that a first pixel set and a second pixel set which are only located at the positions of the intersection points of the binning points are sampled and selected from all pixels in the first image and the second image respectively, and when the pixel values of the pixels in the first pixel set and the corresponding pixels in the second pixel set meet the set relationship, the similarity between the first image and the second image is determined according to the ratio of the number of the pixels in the first pixel set which meet the set relationship to the total number of the pixels in the first pixel set, so that the convenience and the accuracy of determining the similarity between the first image and the second image can be improved.
In some embodiments, the apparatus further comprises: the first adjusting module is configured to adjust sizes of the first image and the second image, respectively, so that sizes of at least one edge of the first image and at least one edge of the second image are equal to each other.
In some embodiments, the first adjusting module includes a first adjusting unit configured to adjust both the length of the first image and the length of the second image to a target length, where the target length is a smaller length of the first image and the length of the second image.
In some embodiments, the first adjusting module includes a second adjusting unit configured to adjust both the width of the first image and the width of the second image to a target width, where the target width is the smaller of the width of the first image and the width of the second image.
In some embodiments, the apparatus further comprises: a second adjusting module, configured to respectively adjust the resolutions of the first image and the second image so that the resolutions of the first image and the second image are equal.
In some embodiments, the second adjusting module comprises a third adjusting unit configured to adjust both the resolution of the first image and the resolution of the second image to a target resolution, where the target resolution is the smaller of the resolution of the first image and the resolution of the second image.
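One possible realization of these adjusting modules, again only a sketch under the assumption that Pillow is used and that both images are resized to the smaller of the two widths and the smaller of the two heights (the name `normalize_pair` is illustrative):

```python
from PIL import Image

def normalize_pair(img_a: Image.Image, img_b: Image.Image):
    """Resize both images to the smaller width and smaller height of the pair,
    so that their sizes (and hence their resolutions) are equal before sampling."""
    target_w = min(img_a.width, img_b.width)
    target_h = min(img_a.height, img_b.height)
    return img_a.resize((target_w, target_h)), img_b.resize((target_w, target_h))
```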
In some embodiments, the apparatus further comprises a dividing module for dividing the first image and the second image into n × m regions, respectively, to perform binning processing on the first image and the second image, respectively; wherein n and m are integers greater than 1.
In some embodiments, m is equal to n.
In some embodiments, the pixel value of a pixel includes the R, G and B values of the pixel, and the first determining module includes a first determining unit configured to determine a pixel in the first pixel set as a target pixel if the difference between the R values of the pixel and the corresponding pixel in the second pixel set is less than a set first threshold, the difference between their G values is less than a set second threshold, and the difference between their B values is less than a set third threshold.
In some embodiments, the apparatus further comprises: a third determining module, configured to determine the coordinates of each bin intersection position in the first image as first position information of the pixel corresponding to that bin intersection position in the first pixel set; a fourth determining module, configured to determine the coordinates of each bin intersection position in the second image as second position information of the pixel corresponding to that bin intersection position in the second pixel set; and a fifth determining module, configured to determine the correspondence between the pixel having the first position information and the pixel having the second position information according to the correspondence between the first position information and the second position information.
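These three determining modules can be pictured as building two coordinate tables keyed by the same grid index, so that each sampled pixel of the first image is paired with the pixel at the same index in the second image. A hedged sketch follows (the dictionary layout and the name `position_correspondence` are assumptions for illustration):

```python
def position_correspondence(size_a, size_b, n, m):
    """Map each interior grid index (i, j) to its pixel coordinates in both images;
    the shared key defines which pixel of the second image corresponds to which
    pixel of the first image."""
    (wa, ha), (wb, hb) = size_a, size_b
    first_positions = {(i, j): (round(i * wa / n), round(j * ha / m))
                       for j in range(1, m) for i in range(1, n)}
    second_positions = {(i, j): (round(i * wb / n), round(j * hb / m))
                        for j in range(1, m) for i in range(1, n)}
    # Pair the first-image coordinates with the second-image coordinates index by index.
    return {first_positions[k]: second_positions[k] for k in first_positions}
```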
The above description of the apparatus embodiments is similar to the description of the method embodiments and has beneficial effects similar to those of the method embodiments. For technical details not disclosed in the apparatus embodiments of the present application, refer to the description of the method embodiments of the present application.
It should be noted that, in the embodiment of the present application, if the image similarity determination method is implemented in the form of a software functional module and is sold or used as a standalone product, it may also be stored in a computer-readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, or the part thereof contributing to the related art, may essentially be embodied in the form of a software product stored in a storage medium, which includes a plurality of instructions for enabling a computer device (which may be a mobile phone, a tablet computer, a desktop computer, a personal digital assistant, a navigator, a digital phone, a video phone, a television, a sensing device, or the like) to execute all or part of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a removable hard disk, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Thus, the embodiments of the present application are not limited to any specific combination of hardware and software.
Correspondingly, an embodiment of the present application provides a computer device (for example, a mobile phone, a tablet computer, a desktop computer, a personal digital assistant, a navigator, a digital phone, a video phone, a television, a sensing device, or the like). Fig. 7 is a schematic diagram of a hardware entity of the computer device according to the embodiment of the present application. As shown in Fig. 7, the hardware entity of the computer device 700 includes a memory 701 and a processor 702, the memory 701 stores a computer program operable on the processor 702, and the processor 702 implements the steps of the image similarity determination method provided in the above embodiments when executing the program.
The Memory 701 is configured to store instructions and applications executable by the processor 702, and may also buffer data (e.g., image data, audio data, voice communication data, and video communication data) to be processed or already processed by the processor 702 and modules in the computer device 700, and may be implemented by a FLASH Memory (FLASH) or a Random Access Memory (RAM).
Correspondingly, the present application provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the steps in the image similarity determination method provided in the above embodiments.
Here, it should be noted that the above description of the storage medium and device embodiments is similar to the description of the method embodiments and has beneficial effects similar to those of the method embodiments. For technical details not disclosed in the storage medium and device embodiments of the present application, refer to the description of the method embodiments of the present application.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present application. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. It should be understood that, in the various embodiments of the present application, the sequence numbers of the above-mentioned processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a/an ..." does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment. In addition, all functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: various media that can store program codes, such as a removable Memory device, a Read Only Memory (ROM), a magnetic disk, or an optical disk. Alternatively, the integrated units described above in the present application may be stored in a computer-readable storage medium if they are implemented in the form of software functional modules and sold or used as independent products. Based on such understanding, the technical solutions of the embodiments of the present application may be essentially implemented or a part contributing to the related art may be embodied in the form of a software product stored in a storage medium, and including a plurality of instructions for enabling a computer device (which may be a mobile phone, a tablet computer, a desktop computer, a personal digital assistant, a navigator, a digital phone, a video phone, a television, a sensing device, etc.) to execute all or part of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a removable storage device, a ROM, a magnetic or optical disk, or other various media that can store program code.
The methods disclosed in the several method embodiments provided in the present application may be combined arbitrarily without conflict to obtain new method embodiments. Features disclosed in several of the product embodiments provided in the present application may be combined in any combination to yield new product embodiments without conflict. The features disclosed in the several method or apparatus embodiments provided in the present application may be combined arbitrarily, without conflict, to arrive at new method embodiments or apparatus embodiments.
The above description is only for the embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

1. An image similarity determination method, characterized in that the method comprises:
acquiring a first image and a second image which are subjected to cell division processing respectively;
respectively acquiring a first pixel set and a second pixel set; the first pixel set comprises pixels located at at least one bin intersection position in the first image, and the second pixel set comprises pixels located at at least one bin intersection position in the second image;
determining a pixel in the first pixel set as a target pixel if the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship;
and determining the ratio of the number of the target pixels to the total number of pixels in the first pixel set as the similarity of the first image and the second image.
2. The method of claim 1, further comprising:
and respectively adjusting the sizes of the first image and the second image so that at least one edge of the first image and the corresponding edge of the second image are equal in size.
3. The method of claim 2, wherein the resizing the first image and the second image, respectively, comprises:
adjusting the length of the first image and the length of the second image to a target length;
and/or,
adjusting both the width of the first image and the width of the second image to a target width;
wherein the target length is the smaller of the length of the first image and the length of the second image, and the target width is the smaller of the width of the first image and the width of the second image.
4. The method according to any one of claims 1 to 3, further comprising:
the resolutions of the first image and the second image are respectively adjusted so that the resolutions of the first image and the second image are equal.
5. The method of claim 4, wherein the adjusting the resolution of the first image and the second image, respectively, comprises:
adjusting both the resolution of the first image and the resolution of the second image to a target resolution;
wherein the target resolution is the smaller of the resolution of the first image and the resolution of the second image.
6. The method according to any one of claims 1 to 3, further comprising:
dividing the first image and the second image into n x m areas respectively so as to perform binning processing on the first image and the second image respectively; wherein n and m are integers greater than 1.
7. The method according to any one of claims 1 to 3, wherein the pixel value of a pixel comprises the R, G and B values of the pixel, and the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfying a set relationship comprises:
the difference between the R values of the pixel in the first pixel set and the corresponding pixel in the second pixel set being smaller than a set first threshold, the difference between their G values being smaller than a set second threshold, and the difference between their B values being smaller than a set third threshold.
8. The method according to any one of claims 1 to 3, further comprising:
determining the coordinates of each grid intersection point position in the first image as first position information of pixels corresponding to the grid intersection point position in the first pixel set;
determining the coordinates of each cell intersection point position in the second image as second position information of pixels corresponding to the cell intersection point position in the second pixel set;
and determining the corresponding relation between the pixel with the first position information and the pixel with the second position information according to the corresponding relation between the first position information and the second position information.
9. An image similarity determination apparatus, comprising:
a first acquisition module, configured to acquire a first image and a second image which are subjected to cell division processing respectively;
a second acquisition module, configured to respectively acquire a first pixel set and a second pixel set; the first pixel set comprises pixels located at at least one bin intersection position in the first image, and the second pixel set comprises pixels located at at least one bin intersection position in the second image;
a first determining module, configured to determine a pixel in the first pixel set as a target pixel if the pixel value of the pixel in the first pixel set and the pixel value of the corresponding pixel in the second pixel set satisfy a set relationship;
a second determining module, configured to determine a ratio of the number of the target pixels to a total number of pixels in the first pixel set as a similarity between the first image and the second image.
10. A computer device comprising a memory and a processor, the memory storing a computer program operable on the processor, wherein the processor implements the steps in the image similarity determination method according to any one of claims 1 to 8 when executing the program.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the image similarity determination method according to any one of claims 1 to 8.
CN202010893742.7A 2020-08-31 2020-08-31 Image similarity determining method and device, equipment and storage medium Pending CN114202665A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010893742.7A CN114202665A (en) 2020-08-31 2020-08-31 Image similarity determining method and device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010893742.7A CN114202665A (en) 2020-08-31 2020-08-31 Image similarity determining method and device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN114202665A true CN114202665A (en) 2022-03-18

Family

ID=80644197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010893742.7A Pending CN114202665A (en) 2020-08-31 2020-08-31 Image similarity determining method and device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114202665A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115361584A (en) * 2022-08-22 2022-11-18 广东电网有限责任公司 Video data processing method and device, electronic equipment and readable storage medium
CN115361584B (en) * 2022-08-22 2023-10-03 广东电网有限责任公司 Video data processing method and device, electronic equipment and readable storage medium

Similar Documents

Publication Publication Date Title
US6839466B2 (en) Detecting overlapping images in an automatic image segmentation device with the presence of severe bleeding
US10769473B2 (en) Image processing apparatus, image processing method, and non-transitory computer-readable storage medium
JP4603512B2 (en) Abnormal region detection apparatus and abnormal region detection method
WO2014160433A2 (en) Systems and methods for classifying objects in digital images captured using mobile devices
EP1081648B1 (en) Method for processing a digital image
JP6393230B2 (en) Object detection method and image search system
US8873839B2 (en) Apparatus of learning recognition dictionary, and method of learning recognition dictionary
JP2011014152A (en) Abnormal region detecting device and abnormal region detection method
CN112164055A (en) Photovoltaic cell color difference classification method based on color segmentation
CN108960247B (en) Image significance detection method and device and electronic equipment
US9378405B2 (en) Determining barcode locations in documents
CN111626145A (en) Simple and effective incomplete form identification and page-crossing splicing method
JP2012150730A (en) Feature extraction device, feature extraction method, feature extraction program and image processing device
CN116883698B (en) Image comparison method and related device
CN112926463A (en) Target detection method and device
CN114202665A (en) Image similarity determining method and device, equipment and storage medium
JP5335554B2 (en) Image processing apparatus and image processing method
US20230316697A1 (en) Association method, association system, and non-transitory computer-readable storage medium
JP2006323779A (en) Image processing method and device
CN110824451A (en) Processing method and device of radar echo map, computer equipment and storage medium
WO2020107196A1 (en) Photographing quality evaluation method and apparatus for photographing apparatus, and terminal device
JP4346620B2 (en) Image processing apparatus, image processing method, and image processing program
JP2018156544A (en) Information processing device and program
KR20230020448A (en) Automated Artifact Detection
CN117952906A (en) PIN skew detection method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination