CN117115477A - Mass image comparison method, device and equipment based on image information network - Google Patents

Mass image comparison method, device and equipment based on image information network

Info

Publication number
CN117115477A
Authority
CN
China
Prior art keywords
image
normalized
feature
alternative
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310693001.8A
Other languages
Chinese (zh)
Inventor
刘世章
王全宁
汪昭辰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Chenyuan Technology Information Co ltd
Original Assignee
Qingdao Chenyuan Technology Information Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Chenyuan Technology Information Co., Ltd.
Priority claimed from CN202310693001.8A
Publication of CN117115477A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/53 Querying
    • G06F16/532 Query formulation, e.g. graphical querying
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58 Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Databases & Information Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Library & Information Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a massive image comparison method, device and equipment based on an image information network, and relates to the field of image processing. The method includes: normalizing an image to be compared to obtain a normalized image; calculating image features of the normalized image; traversing the root nodes of the image information network and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes; calculating the image feature difference rate between the normalized image and an alternative image according to the image features of the normalized image and the image features of the alternative image, wherein the alternative image is the image corresponding to an alternative root node; judging whether the normalized image is similar to the alternative image according to the image feature difference rate; and obtaining a similar image set after the root nodes are traversed. The invention can rapidly complete comparison among massive image resources, find the set of resource images similar to a designated image, and improve the accuracy and efficiency of image searching.

Description

Mass image comparison method, device and equipment based on image information network
Technical Field
The present invention relates to the field of image processing, and in particular, to a method, an apparatus, and a device for comparing a mass image based on an image information network.
Background
Currently, comparison and analysis of massive images mostly adopt image color histogram features or a two-dimensional discrete cosine transform of the images to obtain image fingerprints, and content analysis of the massive images is carried out according to the image fingerprints to judge whether two images are similar.
However, this approach depends heavily on the sample library. As the number of images grows, the existing related art needs to perform model training on a large number of sample images for massive image comparison, which suffers from high training cost, long training time and poor noise resistance, so that the speed and accuracy of image content comparison are not high.
Disclosure of Invention
In view of the above, the invention aims to provide a method, a device and equipment for comparing massive images based on an image information network, which can specifically address the problems of low comparison speed and low accuracy in existing massive image comparison.
Based on the above object, in a first aspect, the present invention provides a method for comparing massive images based on an image information network, where the image information network is a forest structure constructed, on the basis of an image information space, from a multi-level tree set; a multi-level tree includes a root node and child nodes, the difference rate between the images corresponding to any two root nodes is greater than a preset threshold, the difference rate between each child node of a root node and the image corresponding to that root node is less than or equal to the preset threshold, the image information space is the multidimensional vector space in which the image feature vectors are located, and an image feature vector is obtained by extracting a feature matrix from an image under the same coordinate system and then performing calculation on the feature matrix. The method includes: obtaining an image to be compared, and carrying out normalization processing on the image to be compared to obtain a normalized image; calculating image features of the normalized image, wherein the image features include an image feature matrix and a modulus of the image feature matrix; traversing the root nodes of the image information network, and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes; calculating the image feature difference rate of the normalized image and an alternative image according to the image features of the normalized image and the image features of the alternative image, and judging whether the normalized image is similar to the alternative image according to the image feature difference rate, wherein the alternative image is the image corresponding to an alternative root node; and outputting a similar image set, wherein the similar image set includes all alternative images similar to the normalized image.
Optionally, before the image to be compared is acquired, the method includes: acquiring resource images, the resource images comprising a plurality of images derived from at least one database; obtaining a plurality of reserve images after normalizing the resource images, wherein the normalization processing at least includes normalization of resolution, aspect ratio and color space; calculating image features of the reserve images, wherein the image features of a reserve image include the feature matrix of the image and the modulus of the image feature matrix; and constructing the image information network with the reserve images as root nodes or child nodes according to the image features of the reserve images.
Optionally, the image features further include an image feature vector and a modulus of the image feature vector, and calculating the image features of the normalized image includes: extracting features of the normalized image to obtain the image feature matrix of the normalized image; performing a modulus calculation on the image feature matrix to obtain the modulus of the image feature matrix; calculating the image feature vector according to the feature values of the image feature matrix; and performing a modulus calculation on the image feature vector to obtain the modulus of the image feature vector.
Optionally, traversing the root nodes of the image information network and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes includes: calculating the vector difference value of the normalized image and the image corresponding to the root node according to the feature vector of the normalized image and the feature vector of the image corresponding to the root node; calculating the feature vector difference rate of the normalized image and the root node corresponding image according to the modulus of the feature vector of the normalized image, the modulus of the feature vector of the root node corresponding image, and the vector difference value of the two feature vectors; taking, as a first preset condition, that the difference between the feature quantity of the normalized image and the feature quantity of the image corresponding to the root node is smaller than or equal to a first preset threshold; taking, as a second preset condition, that the feature vector difference rate of the normalized image and the image corresponding to the root node is smaller than or equal to a second preset threshold; and when the image corresponding to a root node meets the first preset condition and the second preset condition simultaneously, determining that root node as an alternative root node.
Optionally, calculating an image feature difference rate of the normalized image and the candidate image according to the image feature of the normalized image and the image feature of the candidate image, and judging whether the normalized image is similar to the candidate image or not according to the image feature difference rate, including: obtaining the image characteristic difference rate of the normalized image and the candidate image according to an image characteristic difference rate calculation formula; judging whether the image characteristic difference rate meets a third preset condition, if not, determining that the normalized image is dissimilar to the alternative image; if yes, determining that the normalized image is similar to the alternative image, and adding the alternative image into the similar image set;
Wherein, the third preset condition is that the image feature difference rate dis(p, q) of the normalized image q and the candidate image p does not exceed a third preset threshold dis_max within the allowance of an intrinsic error θ and a calculation error ε.
Optionally, the image feature difference rate calculation formula is:
dis(p, q) = diff(p, q) / min(modULBPM(p), modULBPM(q))
where p represents the candidate image, q represents the normalized image, diff(p, q) represents the difference value between the image feature matrices of the normalized image q and the candidate image p, modULBPM(p) represents the modulus of the feature matrix of the candidate image, modULBPM(q) represents the modulus of the feature matrix of the normalized image, the denominator min(modULBPM(p), modULBPM(q)) cannot be 0, and dis(p, q) = 0 when modULBPM(p) and modULBPM(q) are both 0.
Optionally, the method further comprises: in the case that the normalized image is similar to the alternative image, traversing all child nodes associated with the alternative root node corresponding to the alternative image and calculating the image feature difference rate of the normalized image and the image corresponding to each child node of the alternative root node; and adding the images corresponding to all the child nodes associated with the alternative root node, together with their image feature difference rates with respect to the normalized image, to the similar image set.
Optionally, the method further comprises: counting the total number of images in the similar image set; and, in the case that the total number of images is greater than 1, sorting the images in the similar image set in ascending order of the image difference rate between each image in the similar image set and the normalized image.
In a second aspect, there is provided a mass image contrast device based on an image information network, the device comprising: the image processing module is used for acquiring an image to be compared, and carrying out normalization processing on the image to be compared to obtain a normalized image; the computing module is used for computing the image characteristics of the normalized image, wherein the image characteristics comprise an image characteristic matrix and a module of the image characteristic matrix; the screening module is used for traversing the root nodes of the image information network, and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes; the comparison module is used for calculating the image feature difference rate of the normalized image and the alternative image according to the image features of the normalized image and the image features of the alternative image, and judging whether the normalized image is similar to the alternative image or not according to the image feature difference rate, wherein the alternative image is the image corresponding to the alternative root node; and the result output module is used for outputting a similar image set, wherein the similar image set comprises all candidate images similar to the normalized image.
In a third aspect, there is also provided an electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor runs the computer program to implement the method of the first aspect.
In a fourth aspect, there is also provided a computer readable storage medium having stored thereon a computer program for execution by a processor to perform the method of any of the first aspects.
In general, the present invention has at least the following benefits:
according to the massive image comparison method based on the image information network, normalization processing is carried out on images to be compared to obtain normalized images; calculating image characteristics of the normalized image, traversing root nodes of the image information network, and screening the root nodes through characteristic quantity and characteristic vectors of the normalized image to obtain alternative root nodes; according to the image characteristics of the normalized image and the image characteristics of the candidate image, calculating the image characteristic difference rate of the normalized image and the candidate image, and judging whether the normalized image is similar to the candidate image or not according to the image characteristic difference rate so as to obtain a similar image set. The embodiment of the invention can rapidly complete comparison in mass image resources, find out the resource image set similar to the designated image, and improve the accuracy and efficiency of image searching.
Drawings
In the drawings, the same reference numerals refer to the same or similar parts or elements throughout the several views unless otherwise specified. The figures are not necessarily drawn to scale. It is appreciated that these drawings depict only some embodiments according to the disclosure and are not therefore to be considered limiting of its scope. The exemplary embodiments of the present invention and the descriptions thereof are for explaining the present invention and do not constitute an undue limitation of the present invention. In the drawings:
FIG. 1 is a schematic diagram of an application environment of an alternative image information network-based massive image contrast method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an application environment of another alternative image information network-based massive image contrast method according to an embodiment of the present invention;
fig. 3 is a schematic diagram showing the structure of an image information space according to an embodiment of the present invention;
FIG. 4 illustrates a tree structure creation process according to an embodiment of the present invention;
FIG. 5 is a flow chart showing the steps of a method for comparing mass images based on an image information network according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of the 3×3 and 5×5 neighborhoods used for extracting the low eight-bit and high eight-bit LBP features according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of a mass image comparing device based on an image information network according to an exemplary embodiment of the present invention;
Fig. 8 shows a schematic diagram of an electronic device of an embodiment of the invention in one example.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without making any inventive effort shall fall within the scope of the present invention.
In one aspect of the embodiment of the present invention, a method for comparing massive images based on an image information network is provided. As an optional implementation, the method may be applied, but is not limited, to the application environment shown in fig. 1. The application environment includes a terminal device 102 that interacts with a user, a network 104 and a server 106. Human-computer interaction can be performed between the user 108 and the terminal device 102, and a massive image rapid comparison application program based on an image information network runs in the terminal device 102. The terminal device 102 includes a man-machine interaction screen 1022, a first processor 1024 and a first memory 1026. The man-machine interaction screen 1022 is used for displaying images; the first processor 1024 is configured to acquire images to be compared and execute the massive image rapid comparison method; the first memory 1026 is used to store images.
In addition, the server 106 includes a database 1062 and a processing engine 1064, and the database 1062 is used to store images. The processing engine 1064 is configured to: normalizing the image to be compared to obtain a normalized image; calculating image features of the normalized image; traversing root nodes of the image information network, and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes; calculating the image feature difference rate of the normalized image and the candidate image according to the image features of the normalized image and the image features of the candidate image, and judging whether the normalized image is similar to the candidate image or not according to the image feature difference rate; and outputting the similar image set.
In one or more embodiments, the above-mentioned method for comparing massive images based on an image information network according to the present invention may be applied to the application environment shown in fig. 2. As shown in fig. 2, human-machine interaction may be performed between user 108 and user device 204. The user device 204 includes a second memory 206 and a second processor 208. The user equipment 204 in this embodiment may, but is not limited to, construct an image information network with reference to performing the operations performed by the terminal equipment 102.
Optionally, the terminal device 102 and the user device 204 include, but are not limited to, a mobile phone, a tablet computer, a notebook computer, a PC, a vehicle-mounted electronic device, a wearable device, and the like, and the network 104 may include, but is not limited to, a wireless network or a wired network. Wherein the wireless network comprises: WIFI and other networks that enable wireless communications. The wired network may include, but is not limited to: wide area network, metropolitan area network, local area network. The server 106 may include, but is not limited to, any hardware device that may perform calculations. The server may be a single server, a server cluster composed of a plurality of servers, or a cloud server. The above is merely an example, and is not limited in any way in the present embodiment.
In the related art, comparison and analysis of massive images mostly adopt image color histogram features or a two-dimensional discrete cosine transform of the images to obtain image fingerprints, and content analysis of the massive images is carried out according to the image fingerprints to judge whether two images are similar. However, this approach depends heavily on the sample library; as the number of images grows, the existing related art needs to perform model training on a large number of sample images for massive image comparison, which suffers from high training cost, long training time and poor noise resistance, so that the speed and accuracy of image content comparison are not high.
In order to solve the above technical problems, as an optional implementation manner, the embodiment of the invention provides a method, a device and equipment for comparing mass images based on an image information network.
In this embodiment, the image information network is a forest structure constructed based on a multi-level tree set based on an image information space, the multi-level tree includes root nodes and child nodes, a difference rate between images corresponding to any two root nodes is greater than a preset threshold, a difference rate between child nodes of each root node and images corresponding to the root nodes is less than or equal to the preset threshold, the image information space is a multi-dimensional vector space in which image feature vectors are located, and the image feature vectors are obtained by extracting feature matrices from images under the same coordinate system and then calculating the feature matrices.
The image information space and the image information network of the present embodiment are explained below.
In order to ensure that all image contents are in the same-dimensional information space, i.e., under the same coordinate system, this embodiment uses the multidimensional vector space in which the image feature vectors are located as the image information space. In the image information space, each image has its own coordinates, through which the distance between images can be calculated: identical images have the same coordinates, the distances between similar images are small, and the distances between different images are large. By calculating image distances, the image information space can be divided into a plurality of areas, where the image content at the center of each area represents the main content of the whole area. The relationships between the circular areas in the image information space include three kinds: separation, tangency and intersection, where separation means that no common area exists between the areas, tangency means that there is only one common point (the tangent point) between the areas, and intersection means that there is a common area between the areas.
This results in an image information space as shown in fig. 3, where the four points A, B, C and D are the center positions of the respective circular areas, the radius of each circle represents the maximum distance from its center in the image information space, and the image contents of A, B, C and D represent the main content of each circular area. As shown in fig. 3, C1 and C2 are images similar in content to image C, B1 and B2 are images similar in content to image B, and D1, D2 and D3 are images similar in content to image D; the distance between each of C1, C2, B1, B2, D1, D2 and D3 and the center of the area in which it is located is not larger than the radius of that area.
Based on the image information space shown in fig. 3, the whole image information space can be zoned by selecting a center point and a designated radius to divide the area, and a tree structure can be established according to the zoning characteristic to record the relationship among the areas, namely the multi-level tree set.
Fig. 4 shows a tree structure creation process. According to the relationships between the regions in the image information space, the tree structure can be divided into two levels: at the first level, the root node corresponds to the center of a space region; at the second level, the child nodes correspond to the non-center points in that space region. If a space region is further subdivided into multiple sub-regions, the tree structure will also generate corresponding multi-level child nodes, and the number of levels of the tree structure corresponds to the number of levels of the space regions in the information space; in this embodiment, a 2-level tree structure is described as an example.
As shown in fig. 4, a plurality of multi-level tree structures can be obtained from the image information space. A multi-level tree includes a root node and child nodes, and the forest structure constructed from the multi-level tree set formed by the plurality of multi-level tree structures is the image information network. Each child node in the image information network belongs to at least one root node, and a root node may have no child nodes.
Based on the image information space and the image information network, the images of the sub-nodes in the image information network in the embodiment are similar to the root nodes of the sub-nodes, and the images corresponding to each root node are dissimilar, so that when the image information network is applied to mass image comparison, the image comparison can be rapidly performed according to the similarity relationship between the images in the image information network.
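For illustration only, the following Python sketch shows one possible in-memory representation of such a forest and the insertion rule implied by the preset difference-rate threshold described above; the class, field and function names (ImageNode, difference_rate, threshold) are assumptions made for the sketch and are not taken from the embodiment itself.

```python
# Minimal sketch of the image information network as a forest of two-level trees.
# The node fields and the insert() policy follow the description above; the
# difference_rate callable and all names are illustrative assumptions.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class ImageNode:
    image_id: str
    features: dict                      # e.g. feature matrix, its modulus, feature vector, ...
    children: List["ImageNode"] = field(default_factory=list)

class ImageInformationNetwork:
    def __init__(self, difference_rate: Callable[[dict, dict], float], threshold: float):
        self.roots: List[ImageNode] = []
        self.difference_rate = difference_rate   # dis(p, q) between two feature sets
        self.threshold = threshold               # preset difference-rate threshold

    def insert(self, image_id: str, features: dict) -> None:
        """Attach the image under the first sufficiently similar root, else start a new tree."""
        for root in self.roots:
            if self.difference_rate(root.features, features) <= self.threshold:
                root.children.append(ImageNode(image_id, features))
                return
        self.roots.append(ImageNode(image_id, features))
```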
Fig. 5 shows a flow chart of steps of a method for comparing massive images based on an image information network according to an embodiment of the invention. As shown in fig. 5, the mass image comparison method based on the image information network includes the following steps S501 to S505:
s501, acquiring an image to be compared, and carrying out normalization processing on the image to be processed to obtain a normalized image.
In this embodiment, the image to be compared may be an image derived from one or more resource libraries, may be an image specified by a user, may be an image derived from the internet, or may be one or more images in a video clip.
It can be appreciated that the present embodiment is a massive image comparison method based on an image information network, and therefore, before acquiring an image to be compared, the image information network needs to be created according to a resource image, specifically, before acquiring the image to be compared, the present embodiment includes: and obtaining a resource image, carrying out normalization processing on the resource image to obtain a plurality of reserve images, calculating image characteristics of the reserve images, and constructing an image information network by taking the reserve images as root nodes or child nodes according to the image characteristics of the reserve images.
In this embodiment, the resource image includes a plurality of images derived from at least one database, for example, the resource image is a plurality of images from the database a, or the resource image is a plurality of images from the database a and a plurality of images from the database B, so as to further improve the application range of the image information network.
In this embodiment, the normalization processing at least includes normalization of resolution, aspect ratio and color space, so that each image to be compared has the same image dimensions. This facilitates analyzing the contents of a large number of images under the same coordinate system and analyzing the similarity between different image contents according to the pixels of the images, thereby accelerating the analysis and comparison of image contents.
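As an illustration of this normalization step, the sketch below unifies aspect ratio (by padding), resolution and color space using OpenCV; the target size of 256×256 and the zero-padding strategy are assumptions, since the embodiment does not specify concrete values.

```python
# Illustrative normalization: unify aspect ratio (by padding to a square), resolution
# and color space (YUV) so all images live in the same coordinate system.
import cv2
import numpy as np

def normalize_image(bgr: np.ndarray, size: int = 256) -> np.ndarray:
    h, w = bgr.shape[:2]
    side = max(h, w)
    top = (side - h) // 2
    left = (side - w) // 2
    squared = cv2.copyMakeBorder(bgr, top, side - h - top, left, side - w - left,
                                 cv2.BORDER_CONSTANT, value=(0, 0, 0))
    resized = cv2.resize(squared, (size, size), interpolation=cv2.INTER_AREA)
    return cv2.cvtColor(resized, cv2.COLOR_BGR2YUV)   # Y, U, V components
```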
A plurality of reserve images are obtained after the resource images are normalized. As can be seen from the construction process of the image information network, the network is built according to the similarity between images, and that similarity is calculated on the basis of the feature matrix of each image and the modulus of that feature matrix; therefore, the image features of a reserve image include the feature matrix of the image and the modulus of the feature matrix, and calculating the image features of the reserve images provides the basic data for constructing the image information network and reduces the amount of calculation in subsequent image comparison. The specific construction process of the image information network has been described above and is not repeated here.
It can be understood that the images used to construct the image information network are normalized images; therefore, in order to improve comparison efficiency, in this embodiment, after the image to be compared is obtained, normalization processing is performed on the image to be compared to obtain a normalized image, and the normalized image is then compared with the images in the image information network.
S502, calculating the image characteristics of the normalized image.
In this embodiment, calculating the image features of the normalized image includes: and carrying out feature extraction on the normalized image to obtain an image feature matrix of the normalized image, carrying out modular value calculation on the image feature matrix to obtain a module of the image feature matrix, and carrying out feature comparison on the image features and images in an image information network to judge the similarity of the images.
In this embodiment, the image features include, but are not limited to, image features composed of UniformLBP features, where LBP (Local Binary Pattern) is a theoretically simple and computationally lightweight local feature descriptor, and UniformLBP refers to the uniform pattern (equivalent pattern) of LBP. The UniformLBP feature is highly sensitive to image texture changes, so this embodiment adopts the UniformLBP feature of the image as the image feature, which better reflects the content features of the image.
In an alternative example, the image features may also be other image features, such as SIFT features (Scale-Invariant Feature Transform), HOG features (Histogram of Oriented Gradients), Haar features, etc., which are not listed here one by one.
Taking the image features as UniformLBP features as an example, in this embodiment the image features include an image feature matrix and the modulus of the image feature matrix. Calculating the image feature matrix of the normalized image may include combining the low eight-bit feature data and the high eight-bit feature data of the normalized image to obtain sixteen-bit feature data, obtaining an LBP16 feature matrix according to the sixteen-bit feature data, and obtaining the image feature matrix from the LBP16 feature matrix.
FIG. 6 shows the two neighborhoods used for feature extraction. Specifically, as shown in FIG. 6, for each pixel of the normalized image, a 3×3 neighborhood feature is extracted as the low eight-bit feature (distance 1, 8 feature points), and a 5×5 neighborhood feature is extracted as the high eight-bit feature (distance 2, 8 feature points). The low eight-bit and high eight-bit features are calculated according to equation 1 and combined to obtain the sixteen-bit feature data (LBP16) of the pixel point. LBP16 features are calculated for all pixels in the Y, U and V components to obtain the LBP16 feature matrices of the three YUV components, where Y represents luminance and U and V represent chrominance.
In this embodiment, equation 1 is:
LBP(c) = Σ_{i=0..7} s(i) × 2^i, with s(i) = 1 if pixel(i) ≥ pixel(c) and s(i) = 0 otherwise (equation 1)
where c is the center pixel, i indexes the feature points in the neighborhood, and pixel(·) is the pixel value.
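A minimal Python sketch of this per-pixel sixteen-bit LBP computation is given below; the sampling order of the eight neighbors and the bit packing (high eight bits taken from the distance-2 ring) are assumptions made for illustration.

```python
# Sketch of the sixteen-bit LBP feature of equation 1 for one component plane:
# the low eight bits come from the 8 neighbours at distance 1 (3x3 neighbourhood),
# the high eight bits from the 8 neighbours at distance 2 (5x5 neighbourhood).
import numpy as np

OFFSETS_R1 = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
OFFSETS_R2 = [(dm * 2, dn * 2) for dm, dn in OFFSETS_R1]

def lbp16_plane(plane: np.ndarray) -> np.ndarray:
    h, w = plane.shape
    out = np.zeros((h, w), dtype=np.uint16)
    for m in range(2, h - 2):                 # skip a 2-pixel border so both rings fit
        for n in range(2, w - 2):
            c = plane[m, n]
            low = high = 0
            for bit, ((dm1, dn1), (dm2, dn2)) in enumerate(zip(OFFSETS_R1, OFFSETS_R2)):
                if plane[m + dm1, n + dn1] >= c:      # s(i) = 1 when pixel(i) >= pixel(c)
                    low |= 1 << bit
                if plane[m + dm2, n + dn2] >= c:
                    high |= 1 << bit
            out[m, n] = (high << 8) | low
    return out
```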
It can be understood that, when obtaining the feature matrix of the image according to the feature matrices of the pixels, local deformation and rotation of the image need to be accommodated. Therefore, after obtaining the LBP16 feature matrices of the three YUV components, this embodiment performs a rotation calculation on the basis of the LBP16 features of the three YUV components to obtain the UniformLBP16 features of the image, which are taken as the image feature matrix.
In this embodiment, the modulus of the image feature matrix is obtained by performing a modulus calculation on the image feature matrix, and specifically, the modulus calculation formula of the image feature matrix is as follows:
modULBPM_i = |UniLBPM_i| = Σ_{v∈bins} Σ_{m<w_i} Σ_{n<h_i} UniLBP_v(m, n) (equation 2)
where i is the YUV component, w_i and h_i are respectively the width and the height under that component, (m, n) are the pixel coordinates, m and n are non-negative integers, UniLBP_v(m, n) is the feature value of the pixel at coordinates (m, n) in dimension v, bin is the feature dimension, and bin ∈ [0, 15].
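The modulus of equation 2 is simply a sum over all dimensions and pixels of a component's feature matrix; the sketch below assumes the matrix is stored as a (bins, height, width) array, which is a storage choice not specified in the embodiment.

```python
# Sketch of equation 2: the modulus of a component's feature matrix is the plain
# sum of its feature values over every dimension v and every pixel (m, n).
import numpy as np

def feature_matrix_modulus(uni_lbp_matrix: np.ndarray) -> float:
    # uni_lbp_matrix[v, m, n] = UniLBP_v(m, n) for one YUV component
    return float(uni_lbp_matrix.sum())

def image_feature_modulus(per_component: dict) -> dict:
    # per_component maps 'Y'/'U'/'V' to that component's feature matrix
    return {comp: feature_matrix_modulus(mat) for comp, mat in per_component.items()}
```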
In this embodiment, the image features further include an image feature vector and a modulus of the image feature vector, and calculating the image features of the normalized image further comprises: calculating the image feature vector according to the feature values of the image feature matrix; and performing a modulus calculation on the image feature vector to obtain the modulus of the image feature vector.
In this embodiment, the image information space refers to a multidimensional vector space in which image feature vectors are located, and the image feature vectors are calculated by extracting feature matrices from images under the same coordinate system.
For example, the feature matrix calculation is performed on an image f according to equation 1 to obtain the feature matrix ULBPM(f) of the image f, where each feature value is a 16-bit integer LBP16 feature. Since Uniform Pattern LBP features have many different binary forms (an LBP operator containing P sample points in a region of radius R yields 2^P pattern types), in order to improve the statistics, the number of pattern types of the LBP operator is reduced by adopting the equivalent (uniform) pattern: when the cyclic binary number corresponding to an LBP code jumps from 0 to 1 or from 1 to 0 at most twice, that binary code is called an equivalent pattern class. For example, 00000000 (0 jumps), 00000111 (only one jump from 0 to 1) and 10001111 (first from 1 to 0 and then from 0 to 1, two jumps) are all equivalent pattern classes. In this way, the variety of binary patterns is greatly reduced without losing any information. For example, for 8 sampling points within a 3×3 neighborhood, the binary patterns are reduced from the original 256 to 58 uniform patterns; that is, the values are classified into 59 classes, with each of the 58 uniform patterns forming its own class and all other values falling into the 59th class, so that the histogram changes from 256 dimensions to 59 dimensions. Therefore, after the Uniform Pattern conversion is performed on ULBPM(f), the dimension of each eight-bit part is reduced from 256 to 59, so that the Uniform LBP16 feature has 59 × 59 = 3481 dimensions, and the value corresponding to its k-th dimension is VLBP16(k).
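The uniform-pattern reduction described above can be illustrated with the following sketch, which builds the 256-to-59 lookup table by counting circular bit transitions and maps a sixteen-bit code to one of the 59 × 59 = 3481 classes; the label ordering (uniform codes first, a shared non-uniform class last) is an assumption.

```python
# Sketch of the uniform-pattern reduction: an 8-bit LBP code is "uniform" when its
# circular bit string has at most two 0/1 transitions, giving 58 uniform codes plus
# one shared class, i.e. 59 classes per 8-bit half and 59 * 59 = 3481 dimensions
# for the combined 16-bit feature.
def transitions(code: int, bits: int = 8) -> int:
    return sum(((code >> i) & 1) != ((code >> ((i + 1) % bits)) & 1) for i in range(bits))

def build_uniform_table() -> list:
    table, next_label = [0] * 256, 0
    for code in range(256):
        if transitions(code) <= 2:
            table[code] = next_label
            next_label += 1                  # 58 uniform codes -> labels 0..57
        else:
            table[code] = 58                 # all non-uniform codes share class 58
    return table

UNIFORM8 = build_uniform_table()

def uniform_lbp16_class(lbp16: int) -> int:
    low, high = lbp16 & 0xFF, lbp16 >> 8
    return UNIFORM8[high] * 59 + UNIFORM8[low]   # one of the 3481 dimensions
```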
In the present embodiment, the image feature vector FV is expressed as:
FV = (v_1, v_2, ..., v_3481) (equation 3)
wherein FV has 3481 dimensions, v_k represents the component value of the k-th dimension of the vector FV, and the calculation formula of v_k is:
v_k = Σ_{m<w} Σ_{n<h} val(m, n, k) (equation 4)
where w and h are the width and height of the image under the component, ULBPM(m, n) is the feature value of the image feature matrix ULBPM at point (m, n), VLBP16(k) is the value corresponding to the k-th dimension of the vector, and val(m, n, k) is used to determine whether the feature value at (m, n) equals VLBP16(k), taking 1 if it does and 0 otherwise.
In this embodiment, the modulus calculation formula of the image feature vector is:
modFV = |FV| = √(Σ_{k=1..3481} v_k²)
wherein modFV is the modulus of the image feature vector FV, and v_k represents the component value of the k-th dimension of the vector FV.
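Under the reading of equation 4 as a per-class pixel count and of the vector modulus as a Euclidean norm (both reconstructions, as noted above), the feature vector and its modulus can be sketched as follows; the array shapes and function names are assumptions.

```python
# Sketch of equations 3-5: the 3481-dimensional feature vector counts, for each
# uniform-pattern class k, how many pixels of the UniformLBP16 matrix fall into
# that class; the vector modulus is taken as the Euclidean norm of those counts.
import numpy as np

def feature_vector(uniform_class_matrix: np.ndarray, dims: int = 3481) -> np.ndarray:
    # uniform_class_matrix[m, n] holds the class index (0..3480) of pixel (m, n)
    return np.bincount(uniform_class_matrix.ravel(), minlength=dims).astype(np.float64)

def feature_vector_modulus(fv: np.ndarray) -> float:
    return float(np.sqrt(np.sum(fv * fv)))     # modFV = sqrt(sum_k v_k^2)
```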
From the above, the image feature matrix, the modulus of the image feature matrix, the image feature vector and the modulus of the image feature vector of an image can be calculated, and whether two images are similar can be judged according to the image features of the two different images.
S503, traversing root nodes of the image information network, and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes.
It can be understood that the feature quantity of an image can reflect the content of the image; when the feature quantities of two images differ greatly, the two images are different. Therefore, in order to increase the comparison speed, this embodiment first finds, in the image information network, the images that may be similar to the image to be compared according to the feature quantity of the image.
The feature quantity of an image can be represented by the modulus of its image feature matrix: the larger the modulus of the image feature matrix, the more features the image has; the smaller the modulus, the fewer features the image has.
Specifically, traversing the root nodes of the image information network and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain alternative root nodes includes the following steps: calculating the vector difference value of the normalized image and the image corresponding to the root node according to the image feature vector of the normalized image and the image feature vector of the image corresponding to the root node; calculating the feature vector difference rate of the normalized image and the root node corresponding image according to the modulus of the image feature vector of the normalized image, the modulus of the image feature vector of the root node corresponding image, and the vector difference value of the normalized image and the root node corresponding image; taking, as a first preset condition, that the difference between the feature quantity of the normalized image and the feature quantity of the image corresponding to the root node is smaller than or equal to a first preset threshold; taking, as a second preset condition, that the feature vector difference rate of the normalized image and the image corresponding to the root node is smaller than or equal to a second preset threshold; and when the image corresponding to a root node meets the first preset condition and the second preset condition simultaneously, determining that root node as an alternative root node.
In this embodiment, a vector difference value calculation formula of the normalized image and the image corresponding to the root node is:
wherein DiffFV(p, q) is the vector difference value of the normalized image and the image corresponding to the root node, v_k(p) represents the component value of the k-th dimension of the image feature vector FV(p) of the root node corresponding image, and v_k(q) represents the component value of the k-th dimension of the image feature vector FV(q) of the normalized image.
The feature vector difference rate calculation formula of the normalized image and the root node corresponding image is as follows:
DisFV(p, q) = DiffFV(p, q) / min(modFV(p), modFV(q))
wherein DisFV(p, q) is the feature vector difference rate of the normalized image and the image corresponding to the root node, min(modFV(p), modFV(q)) is the minimum of the modulus modFV(p) of the image feature vector of the root node corresponding image and the modulus modFV(q) of the image feature vector of the normalized image, the denominator min(modFV(p), modFV(q)) cannot be 0, and DisFV(p, q) = 0 if modFV(p) and modFV(q) are both 0.
The first preset condition is as follows:
|fnumULBPM(p) − fnumULBPM(q)| ≤ M_diff
wherein q represents the normalized image, fnumULBPM(q) is the feature quantity of the normalized image, p represents the root node corresponding image, fnumULBPM(p) is the feature quantity of the root node corresponding image, and M_diff is the first preset threshold.
DisFV_max is the second preset threshold, and the second preset condition is:
DisFV(p, q) ≤ DisFV_max
From the above it can be understood that, when the difference between the modulus of the feature matrix of the root node corresponding image and the modulus of the feature matrix of the normalized image is within the first preset threshold, and the feature vector difference rate of the normalized image and the root node corresponding image is less than or equal to the second preset threshold, the root node corresponding image is likely to be similar to the normalized image, and that node can be used as an alternative root node; whether the image corresponding to the alternative root node is similar to the normalized image is then further judged according to other image features. In this way, images dissimilar to the normalized image can be screened out rapidly, which greatly improves the image comparison speed.
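A sketch of this screening step is given below; the DiffFV and DisFV computations follow the formulas as reconstructed above (a sum of absolute component differences divided by the smaller vector modulus), and the dictionary keys fnum, fv and mod_fv are illustrative names, not the embodiment's own.

```python
# Sketch of root-node screening with the first and second preset conditions.
import numpy as np

def diff_fv(fv_p: np.ndarray, fv_q: np.ndarray) -> float:
    # vector difference value: sum of per-dimension absolute differences (an assumption)
    return float(np.abs(fv_p - fv_q).sum())

def dis_fv(fv_p: np.ndarray, fv_q: np.ndarray, mod_p: float, mod_q: float) -> float:
    # feature vector difference rate: DiffFV divided by the smaller vector modulus
    denom = min(mod_p, mod_q)
    if denom == 0:
        return 0.0 if mod_p == mod_q else float("inf")   # edge-case handling is an assumption
    return diff_fv(fv_p, fv_q) / denom

def is_candidate_root(root_feat: dict, query_feat: dict, m_diff: float, dis_fv_max: float) -> bool:
    # first preset condition: feature quantities differ by no more than M_diff
    cond1 = abs(root_feat["fnum"] - query_feat["fnum"]) <= m_diff
    # second preset condition: feature vector difference rate within DisFV_max
    cond2 = dis_fv(root_feat["fv"], query_feat["fv"],
                   root_feat["mod_fv"], query_feat["mod_fv"]) <= dis_fv_max
    return cond1 and cond2
```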
S504, calculating the difference rate of the image features of the normalized image and the candidate image according to the image features of the normalized image and the image features of the candidate image, and judging whether the normalized image is similar to the candidate image or not according to the difference rate of the image features.
In this embodiment, the candidate image is an image corresponding to the candidate root node.
In this embodiment, the image feature difference rate of the normalized image and the candidate image may be obtained according to the image feature difference rate calculation formula, and it is determined whether the image feature difference rate satisfies a third preset condition, if not, it is determined that the normalized image and the candidate image are dissimilar, if so, it is determined that the normalized image and the candidate image are similar, and the candidate image is added to the similar image set.
It can be understood that the image feature difference rate can reflect the difference between two images: when the difference rate is too large, the two images are dissimilar; otherwise, they are similar. This embodiment judges whether the normalized image is similar to the candidate image by the magnitude of the image feature difference rate of the normalized image and the candidate image within the error allowable range.
The image characteristic difference rate calculation formula is as follows:
dis(p, q) = diff(p, q) / min(modULBPM(p), modULBPM(q)) (formula 9)
where p represents the candidate image, q represents the normalized image, diff(p, q) represents the difference value between the image feature matrices of the normalized image q and the candidate image p, modULBPM(p) represents the modulus of the feature matrix of the candidate image, modULBPM(q) represents the modulus of the feature matrix of the normalized image, the denominator min(modULBPM(p), modULBPM(q)) cannot be 0, and dis(p, q) = 0 when modULBPM(p) and modULBPM(q) are both 0.
The calculation formula of the difference value diff (p, q) of the image feature matrix of the normalized image q and the candidate image p is as follows:
diff(p, q) = Σ_{i∈{Y,U,V}} Σ_{v∈bins} Σ_{m<w_i} Σ_{n<h_i} diffLBP_v(m, n) (formula 10)
wherein diff(p, q) is the feature difference value of the normalized image q and the candidate image p under the YUV components, diffLBP_v(m, n) is the feature difference value of the pixels at coordinate point (m, n) in the normalized image and the candidate image under the YUV component, i is the YUV component, w_i and h_i are respectively the width and the height under that component, (m, n) are the pixel coordinates, m and n are non-negative integers, UniLBP_v(m, n) is the feature value of the pixel at coordinates (m, n) in dimension v, and bin ∈ [0, 15].
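As a sketch of formula 10 under the reconstruction above, the difference value can be computed as the sum of absolute per-pixel, per-dimension differences over the three YUV components; the (bins, h, w) array layout and the use of absolute differences are assumptions.

```python
# Sketch of the feature-matrix difference value of formula 10 (as reconstructed).
import numpy as np

def diff_feature_matrices(p_components: dict, q_components: dict) -> float:
    # p_components / q_components map 'Y'/'U'/'V' to arrays of shape (bins, h_i, w_i)
    total = 0.0
    for comp in ("Y", "U", "V"):
        mat_p = np.asarray(p_components[comp], dtype=np.float64)
        mat_q = np.asarray(q_components[comp], dtype=np.float64)
        total += float(np.abs(mat_p - mat_q).sum())   # sum of diffLBP_v(m, n)
    return total
```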
Based on the above formula 9 and formula 10, the image feature difference ratio of the normalized image and the candidate image can be obtained, so as to determine whether the image feature difference ratio meets a third preset condition.
The third preset condition is that the image feature difference rate dis(p, q) of the normalized image q and the candidate image p does not exceed the third preset threshold dis_max within the allowance of the intrinsic error θ and the calculation error ε.
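The similarity decision can be sketched as follows, interpreting the third preset condition as allowing dis(p, q) to exceed dis_max by at most the intrinsic error θ plus the calculation error ε; this interpretation and the edge-case handling when a modulus is zero are assumptions.

```python
# Sketch of formula 9 and the third preset condition.
def image_difference_rate(diff_pq: float, mod_p: float, mod_q: float) -> float:
    # formula 9: diff(p, q) divided by the smaller feature-matrix modulus
    denom = min(mod_p, mod_q)
    if denom == 0:
        return 0.0 if mod_p == mod_q else float("inf")   # edge-case handling is an assumption
    return diff_pq / denom

def satisfies_third_condition(dis_pq: float, dis_max: float,
                              theta: float = 0.0, eps: float = 0.0) -> bool:
    # interpreted as: dis(p, q) may exceed dis_max only by theta plus eps
    return dis_pq - theta - eps <= dis_max
```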
In this way, the root node corresponding image that meets the third preset condition can be taken as an image similar to the image to be compared, so that when a user wants to find all images similar to a designated image, a similar image set can be obtained quickly.
S505, outputting a similar image set.
In this embodiment, the similar image set includes all candidate images similar to the normalized image, and it is understood that the candidate images are images in the image information network, and the image information network is created according to the resource images, so that the images in the similar image set are all resource images similar to the image to be compared in the image information network.
It may be understood that, in the above image information network, the child nodes are similar to their own root nodes. In this embodiment, if the normalized image is similar to the alternative image, all child nodes associated with the alternative root node corresponding to the alternative image are traversed, the image feature difference rate between the normalized image and the image corresponding to each child node is calculated, and the images corresponding to all child nodes associated with the alternative root node, together with their image feature difference rates with respect to the normalized image, are added to the similar image set, so that the images in the similar image set can be sorted according to the image difference rate.
For example, the child nodes of the root node T include a child node S, a child node X, and a child node Y, and when the root node T corresponding image is similar to the normalized image, the child node S corresponding image, the child node X corresponding image, and the child node Y corresponding image are all considered to be similar to the normalized image and added to the similar image set.
In this embodiment, after the similar image set is obtained, the method further includes counting the total number of images in the similar image set and, when the total number of images is greater than 1, sorting the images in the similar image set in ascending order of the image difference rate between each image and the normalized image. In this way, the first image in the similar image set is the image most similar to the image to be compared, which facilitates subsequent processing.
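Putting the previous sketches together, collecting and ordering the similar image set might look like the following; the network layout matches the earlier ImageNode sketch, and the callables passed in are assumptions standing in for the screening and formula-9 steps.

```python
# Sketch of assembling and ordering the similar-image set: a matching candidate root
# brings in its children with their own difference rates, and the result is sorted
# in ascending order of difference rate so the most similar image comes first.
def collect_similar(network, query_features, is_candidate, difference_rate,
                    dis_max, theta=0.0, eps=0.0):
    """is_candidate and difference_rate are callables implementing the screening and
    formula-9 steps sketched earlier; passing them in keeps this sketch generic."""
    similar = []                                              # (image_id, difference rate)
    for root in network.roots:
        if not is_candidate(root.features, query_features):
            continue
        d = difference_rate(root.features, query_features)
        if d - theta - eps <= dis_max:                        # third preset condition
            similar.append((root.image_id, d))
            for child in root.children:                       # children of a similar root join the set
                similar.append((child.image_id,
                                difference_rate(child.features, query_features)))
    similar.sort(key=lambda item: item[1])                    # ascending: most similar first
    return similar
```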
The above method for comparing mass images based on the image information network provided by the embodiment obtains a normalized image by normalizing the images to be compared; calculating image characteristics of the normalized image, traversing root nodes of the image information network, and screening the root nodes through characteristic quantity and characteristic vectors of the normalized image to obtain alternative root nodes; according to the image characteristics of the normalized image and the image characteristics of the candidate image, calculating the image characteristic difference rate of the normalized image and the candidate image, and judging whether the normalized image is similar to the candidate image or not according to the image characteristic difference rate so as to obtain a similar image set. The embodiment can rapidly complete comparison in mass image resources, find out the resource image set similar to the designated image, and improve the accuracy and efficiency of image searching.
It should be noted that, for simplicity of description, the foregoing method embodiments are all described as a series of acts, but it should be understood by those skilled in the art that the present invention is not limited by the order of acts described, as some steps may be performed in other orders or concurrently in accordance with the present invention. Further, those skilled in the art will also appreciate that the embodiments described in the specification are all preferred embodiments, and that the acts and modules referred to are not necessarily required for the present invention.
The following is an embodiment of a mass image comparison device based on an image information network, which can be used for executing the embodiment of the method of the invention. For details not disclosed in the embodiment of the mass image comparison device based on the image information network, please refer to the embodiment of the method of the present invention.
Fig. 7 is a schematic structural diagram of a mass image comparing device based on an image information network according to an exemplary embodiment of the present invention. The mass image comparison device based on the image information network can be realized into all or part of the terminal through software, hardware or a combination of the software and the hardware. The mass image contrast device 700 based on the image information network comprises:
the image processing module 701 is configured to obtain an image to be compared, and normalize the image to be compared to obtain a normalized image.
A calculation module 702, configured to calculate image features of the normalized image, where the image features include an image feature matrix and a modulus of the image feature matrix.
And a screening module 703, configured to traverse the root node of the image information network, and screen the root node through the feature quantity and the feature vector of the normalized image to obtain an alternative root node.
And a comparison module 704, configured to calculate an image feature difference ratio between the normalized image and the candidate image according to the image feature of the normalized image and the image feature of the candidate image, and determine whether the normalized image is similar to the candidate image according to the image feature difference ratio, where the candidate image is an image corresponding to the candidate root node.
And the result output module 705 is configured to output a similar image set, where the similar image set includes all candidate images similar to the normalized image.
The image information network is a forest structure constructed based on a multi-level tree set based on an image information space, the multi-level tree comprises root nodes and child nodes, the difference rate between images corresponding to any two root nodes is larger than a preset threshold value, the difference rate between the child node of each root node and the image corresponding to the root node is smaller than or equal to the preset threshold value, the image information space is a multi-dimensional vector space in which image feature vectors are located, and the image feature vectors are obtained by calculating after feature matrices are extracted from the images under the same coordinate system.
It should be noted that, when the massive image comparison device based on the image information network provided in the above embodiment executes the massive image comparison method based on the image information network, the division into the above functional modules is used only for illustration; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the functions described above. In addition, the massive image comparison device based on the image information network and the massive image comparison method based on the image information network provided in the foregoing embodiments belong to the same concept, and their detailed implementation processes are embodied in the method embodiments and are not repeated here.
The embodiment of the invention also provides electronic equipment corresponding to the massive image comparison method based on the image information network provided by the previous embodiment, so as to execute the massive image comparison method based on the image information network.
Fig. 8 shows a schematic diagram of an electronic device according to an embodiment of the invention. As shown in fig. 8, the electronic device 800 includes: a third memory 801 and a third processor 802, the third memory 801 having stored therein a computer program executable on said third processor 802, the third processor 802 executing the method provided by any of the previous embodiments of the invention when said computer program is executed.
Alternatively, in this embodiment, the electronic device may be located in at least one network device of a plurality of network devices of the computer network.
Alternatively, in the present embodiment, the above processor may be configured to execute the steps of the above massive image comparison method based on the image information network by means of a computer program.
Alternatively, it will be understood by those skilled in the art that the structure shown in fig. 8 is only schematic, and the electronic device may also be a terminal device such as a smart phone (e.g. an Android phone, an iOS phone, etc.), a tablet computer, a palm computer, a mobile internet device (Mobile Internet Devices, MID), a PAD, etc. Fig. 8 does not limit the structure of the electronic device described above. For example, the electronic device may also include more or fewer components (e.g., network interfaces, etc.) than shown in fig. 8, or have a different configuration than shown in fig. 8.
The third memory 801 may be used to store software programs and modules, such as program instructions/modules corresponding to the method and apparatus for comparing a massive image based on an image information network in the embodiment of the present invention, and the third processor 802 executes various functional applications and data processing by running the software programs and modules stored in the third memory 801, thereby implementing the method for comparing a massive image based on an image information network. The third memory 801 may include high speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid state memory. In some examples, the third memory 801 may further include memory remotely located with respect to the third processor 802, which may be connected to the terminal through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof. Wherein the third memory 801 may be, but is not limited to, used for storing an image information network in particular. As an example, the third memory 801 may include, but is not limited to, an image processing module, a calculating module, a filtering module, a comparing module, and a result outputting module in the mass image comparing device based on the image information network. In addition, other module units in the massive image comparing device based on the image information network may be included, but are not limited to, and are not described in detail in this example.
Optionally, the electronic device comprises transmission means 803, the transmission means 803 being adapted to receive or transmit data via a network. Specific examples of the network described above may include wired networks and wireless networks. In one example, the transmission means 803 includes a network adapter (Network Interface Controller, NIC) that can be connected to other network devices and routers via a network cable to communicate with the internet or a local area network. In one example, the transmission device 803 is a Radio Frequency (RF) module, which is used to communicate with the internet wirelessly.
In addition, the electronic device further includes: the display 804 is configured to display the analysis result of rapid comparison of the massive images based on the image information network; and a connection bus 805 for connecting the respective module parts in the above-described electronic apparatus.
The present embodiments further provide a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the above mass image comparison method based on the image information network, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Optionally, in this embodiment, the above computer-readable storage medium may be configured to store a computer program for executing the steps of the mass image comparison method based on the image information network.
Optionally, in this embodiment, it will be understood by those skilled in the art that all or part of the steps of the methods in the above embodiments may be performed by a program instructing a terminal device; the program may be stored in a computer-readable storage medium, and the storage medium may include a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, and the like.
The foregoing embodiment numbers of the present invention are merely for the purpose of description, and do not represent the advantages or disadvantages of the embodiments.
The integrated units in the above embodiments may be stored in the above computer-readable storage medium if implemented in the form of software functional units and sold or used as independent products. Based on such understanding, the technical solution of the present invention, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, which includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to perform all or part of the steps of the methods of the various embodiments of the present invention.
In the foregoing embodiments of the present invention, the description of each embodiment has its own emphasis; for parts not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
In the several embodiments provided by the present invention, it should be understood that the disclosed client may be implemented in other manners. The apparatus embodiments described above are merely exemplary; the described division of units is merely a logical functional division, and there may be other manners of division in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the coupling or direct coupling or communication connection shown or discussed may be realized through some interfaces, units, or modules, and may be electrical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The foregoing is merely a preferred embodiment of the present invention, and it should be noted that modifications and adaptations may be made by those skilled in the art without departing from the principles of the present invention; such modifications and adaptations are also intended to fall within the scope of the present invention.

Claims (10)

1. A mass image comparison method based on an image information network, characterized in that the image information network is a forest structure constructed from a set of multi-level trees in an image information space, each multi-level tree comprising root nodes and child nodes, the difference rate between the images corresponding to any two root nodes being larger than a preset threshold, and the difference rate between each child node of a root node and the image corresponding to that root node being smaller than or equal to the preset threshold; the image information space is a multi-dimensional vector space in which image feature vectors are located, and the image feature vectors are computed from feature matrices extracted from the images in the same coordinate system; the method comprises the following steps:
obtaining an image to be compared, and carrying out normalization processing on the image to be compared to obtain a normalized image;
calculating image features of the normalized image, wherein the image features comprise an image feature matrix and a modulus of the image feature matrix;
traversing root nodes of the image information network, and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain candidate root nodes;
calculating the image feature difference rate of the normalized image and the candidate image according to the image features of the normalized image and the image features of the candidate image, and judging whether the normalized image is similar to the candidate image according to the image feature difference rate, wherein the candidate image is the image corresponding to a candidate root node;
and outputting a similar image set, wherein the similar image set comprises all candidate images similar to the normalized image.
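To make the flow of claim 1 concrete, the following is a minimal, illustrative Python sketch that is not part of the claims and does not reproduce the patent's actual feature extraction: the feature matrix is stood in for by the normalized grayscale image itself, its modulus by the Frobenius norm, and the root-node screening step is collapsed into a direct comparison against each root image. All function names and the threshold value are hypothetical.

```python
# Illustrative sketch only, not part of the claims. The "feature matrix" is the
# normalized grayscale image itself, its modulus is the Frobenius norm, and the
# difference rate is a normalized matrix distance; the patent's real feature
# extraction is not reproduced here.
import numpy as np

def normalize_image(img, size=(64, 64)):
    """Crude normalization: resample a 2-D grayscale array to a fixed resolution in [0, 1]."""
    img = np.asarray(img, dtype=np.float64)
    rows = np.linspace(0, img.shape[0] - 1, size[0]).astype(int)
    cols = np.linspace(0, img.shape[1] - 1, size[1]).astype(int)
    return img[np.ix_(rows, cols)] / 255.0

def compute_features(norm_img):
    """Image features: a feature matrix (here the normalized image) and its modulus."""
    return {"matrix": norm_img, "modulus": float(np.linalg.norm(norm_img))}

def difference_rate(f_q, f_p):
    """Normalized matrix distance; defined as 0 when both moduli are 0 (cf. claim 6)."""
    denom = f_q["modulus"] + f_p["modulus"]
    if denom == 0.0:
        return 0.0
    return float(np.linalg.norm(f_q["matrix"] - f_p["matrix"])) / denom

def compare(raw_image, root_images, threshold=0.1):
    """Compare one image against the root images and return the similar image set."""
    f_q = compute_features(normalize_image(raw_image))
    similar = []
    for name, root in root_images.items():
        f_p = compute_features(normalize_image(root))
        rate = difference_rate(f_q, f_p)
        if rate <= threshold:
            similar.append((name, rate))
    return sorted(similar, key=lambda item: item[1])
```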
2. The method according to claim 1, characterized in that, before obtaining the image to be compared, the method further comprises:
acquiring a resource image, the resource image comprising a plurality of images derived from at least one database;
after normalizing the resource images, obtaining a plurality of reserve images, wherein the normalization processing at least comprises normalization of resolution, aspect ratio, and color space;
calculating image features of the reserve images, wherein the image features of a reserve image comprise an image feature matrix and a modulus of the image feature matrix;
and constructing the image information network by taking the reserve image as a root node or a child node according to the image characteristics of the reserve image.
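As an illustration of how the forest of claim 2 could be assembled, the sketch below attaches each normalized reserve image either as a child of the first root within the preset difference-rate threshold or as a new root; the `Node` layout and the `diff_rate` callable are assumptions, not taken from the patent.

```python
# Assumed sketch of building the forest of claim 2: an image becomes a child of
# the first existing root whose difference rate is within the preset threshold,
# otherwise it becomes a new root.
from dataclasses import dataclass, field
from typing import Any, Callable, List

@dataclass
class Node:
    image_id: str
    features: Any
    children: List["Node"] = field(default_factory=list)

def build_network(reserve, diff_rate: Callable[[Any, Any], float], threshold: float) -> List[Node]:
    """reserve: iterable of (image_id, features) pairs for the normalized reserve images."""
    roots: List[Node] = []
    for image_id, feats in reserve:
        parent = next((r for r in roots if diff_rate(feats, r.features) <= threshold), None)
        if parent is None:
            roots.append(Node(image_id, feats))            # any two roots stay more than threshold apart
        else:
            parent.children.append(Node(image_id, feats))  # within threshold of this root: attach as child
    return roots
```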
3. The method of claim 1, wherein the image features further comprise: an image feature vector and a modulus of the image feature vector,
calculating image features of the normalized image, comprising:
extracting features of the normalized image to obtain an image feature matrix of the normalized image;
performing modulus calculation on the image feature matrix to obtain the modulus of the image feature matrix;
calculating the image feature vector according to the eigenvalues of the image feature matrix;
and performing modulus calculation on the image feature vector to obtain the modulus of the image feature vector.
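The four computations of claim 3 can be illustrated as follows, assuming the feature matrix is square so that eigenvalues are defined; how the feature matrix itself is extracted from the normalized image is not reproduced here, and the magnitude-sorted ordering of the eigenvalue-based feature vector is an assumption.

```python
# Sketch of the four computations of claim 3, under the assumptions stated above.
import numpy as np

def image_features(feature_matrix):
    m = np.asarray(feature_matrix, dtype=np.float64)
    matrix_modulus = float(np.linalg.norm(m))               # modulus of the image feature matrix
    eigvals = np.linalg.eigvals(m)                          # eigenvalues of the feature matrix
    feature_vector = np.sort(np.abs(eigvals))[::-1]         # feature vector built from the eigenvalues
    vector_modulus = float(np.linalg.norm(feature_vector))  # modulus of the image feature vector
    return {"matrix": m, "matrix_modulus": matrix_modulus,
            "vector": feature_vector, "vector_modulus": vector_modulus}
```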
4. The method according to claim 3, wherein traversing the root nodes of the image information network and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain candidate root nodes comprises:
calculating a vector difference value of the normalized image and the image corresponding to the root node according to the image feature vector of the normalized image and the image feature vector of the image corresponding to the root node;
calculating the feature vector difference rate of the normalized image and the image corresponding to the root node according to the modulus of the image feature vector of the normalized image, the modulus of the image feature vector of the image corresponding to the root node, and the vector difference value of the normalized image and the image corresponding to the root node;
taking, as a first preset condition, that the difference value between the feature quantity of the normalized image and the feature quantity of the image corresponding to the root node is smaller than or equal to a first preset threshold;
taking, as a second preset condition, that the feature vector difference rate of the normalized image and the image corresponding to the root node is smaller than or equal to a second preset threshold;
and when the image corresponding to the root node satisfies both the first preset condition and the second preset condition, determining the root node as a candidate root node.
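A sketch of the two screening conditions of claim 4 is given below. The feature-count test follows the claim directly; for the feature-vector difference rate, the claim does not spell out the normalization, so the sum of the two vector moduli used as the denominator here is an assumption, and the two vectors are assumed to have equal length.

```python
# Sketch of the two screening conditions of claim 4, under the assumptions above.
import numpy as np

def is_candidate_root(q, p, count_threshold, vector_rate_threshold):
    """q, p: dicts with 'count', 'vector' and 'vector_modulus' for the normalized and root images."""
    # First preset condition: feature-quantity difference within the first preset threshold.
    if abs(q["count"] - p["count"]) > count_threshold:
        return False
    # Second preset condition: feature-vector difference rate within the second preset threshold.
    denom = q["vector_modulus"] + p["vector_modulus"]
    if denom == 0.0:
        return True  # both feature vectors are zero: nothing to distinguish
    diff = float(np.linalg.norm(np.asarray(q["vector"]) - np.asarray(p["vector"])))
    return diff / denom <= vector_rate_threshold
```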
5. The method of claim 4, wherein calculating the image feature difference rate of the normalized image and the candidate image according to the image features of the normalized image and the image features of the candidate image, and judging whether the normalized image is similar to the candidate image according to the image feature difference rate, comprises:
obtaining the image feature difference rate of the normalized image and the candidate image according to an image feature difference rate calculation formula;
judging whether the image feature difference rate satisfies a third preset condition; if not, determining that the normalized image is dissimilar to the candidate image; if yes, determining that the normalized image is similar to the candidate image, and adding the candidate image to the similar image set;
wherein, the third preset condition is:
dis(p, q) + θ + ε ≤ dis_max
wherein dis(p, q) represents the image feature difference rate of the normalized image q and the candidate image p, θ is an intrinsic error, ε is a calculation error, and dis_max is the third preset threshold.
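The decision of claim 5, including the third preset condition with an intrinsic error and a calculation error term, could be sketched as follows; the concrete values of the two error terms are implementation-dependent and assumed here.

```python
# Sketch of the similarity decision of claim 5 under the third preset condition;
# theta (intrinsic error) and eps (calculation error) are assumed constants.
def is_similar(dis_pq, dis_max, theta=0.0, eps=1e-9):
    """Third preset condition: dis(p, q) + theta + eps <= dis_max."""
    return dis_pq + theta + eps <= dis_max

def collect_similar(normalized_feat, candidates, diff_rate, dis_max):
    """Add every candidate image that satisfies the third preset condition to the similar image set."""
    similar = []
    for image_id, cand_feat in candidates:
        d = diff_rate(normalized_feat, cand_feat)
        if is_similar(d, dis_max):
            similar.append((image_id, d))
    return similar
```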
6. The method of claim 5, wherein the image feature difference rate calculation formula is:
where p represents the candidate image, q represents the normalized image, diff(p, q) represents the difference value between the image feature matrices of the normalized image q and the candidate image p, modULBPM(p) represents the modulus of the feature matrix of the candidate image, modULBPM(q) represents the modulus of the feature matrix of the normalized image, modULBPM(p) and modULBPM(q) cannot be 0 when used as denominators, and dis(p, q) = 0 when both modULBPM(p) and modULBPM(q) are 0.
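The printed formula of claim 6 is not reproduced in this text, so the exact combination of diff(p, q), modULBPM(p), and modULBPM(q) in the sketch below is an assumption; only the zero-modulus rule is taken directly from the claim.

```python
# The exact formula is an assumption; only the rule that dis(p, q) = 0 when both
# moduli are 0 is taken from claim 6.
def dis(diff_pq, mod_p, mod_q):
    if mod_p == 0.0 and mod_q == 0.0:
        return 0.0                       # rule stated in claim 6
    return diff_pq / (mod_p + mod_q)     # assumed normalization, not verbatim from the patent
```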
7. The method of claim 5, wherein the method further comprises:
in the case that the normalized image is similar to the candidate image, traversing all child nodes associated with the candidate root node corresponding to the candidate image, and calculating the image feature difference rate between the normalized image and the image corresponding to each child node of the candidate root node;
and adding all the images corresponding to the child nodes associated with the candidate root node, together with the image feature difference rates between those images and the normalized image, to the similar image set.
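Claim 7 can be illustrated with the following sketch, assuming the node layout from the earlier network-building sketch: once a candidate root is judged similar, every image attached to its child nodes is also compared and added, together with its difference rate, to the similar image set.

```python
# Sketch of claim 7, reusing the assumed Node layout from the network-building sketch.
def add_children_of_candidate(candidate_root, normalized_feat, diff_rate, similar_set):
    for child in candidate_root.children:
        rate = diff_rate(normalized_feat, child.features)
        similar_set.append((child.image_id, rate))
    return similar_set
```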
8. The method of claim 7, wherein the method further comprises:
counting the total number of images in the similar image set;
and in the case that the total number of images is larger than 1, sorting the images in the similar image set in ascending order of the image difference rate between each image in the similar image set and the normalized image.
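A sketch of the ordering step of claim 8, reading "forward ordered" as ascending order of the difference rate (an interpretation, since the claim does not define the ordering direction further):

```python
# Sketch of claim 8: sort the similar image set in ascending order of difference
# rate when it holds more than one image.
def sort_similar_set(similar_set):
    if len(similar_set) > 1:
        similar_set.sort(key=lambda item: item[1])  # item = (image_id, difference_rate)
    return similar_set
```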
9. A mass image comparison device based on an image information network, the device comprising:
the image processing module is used for acquiring an image to be compared, and carrying out normalization processing on the image to be compared to obtain a normalized image;
the computing module is used for computing the image features of the normalized image, wherein the image features comprise an image feature matrix and a modulus of the image feature matrix;
the screening module is used for traversing the root nodes of the image information network, and screening the root nodes through the feature quantity and the feature vector of the normalized image to obtain candidate root nodes;
the comparison module is used for calculating the image feature difference rate of the normalized image and the candidate image according to the image features of the normalized image and the image features of the candidate image, and judging whether the normalized image is similar to the candidate image according to the image feature difference rate, wherein the candidate image is the image corresponding to a candidate root node;
the result output module is used for outputting a similar image set, wherein the similar image set comprises all candidate images similar to the normalized image;
wherein the image information network is a forest structure constructed from a set of multi-level trees in an image information space, each multi-level tree comprising root nodes and child nodes, the difference rate between the images corresponding to any two root nodes being larger than a preset threshold, and the difference rate between each child node of a root node and the image corresponding to that root node being smaller than or equal to the preset threshold; the image information space is a multi-dimensional vector space in which image feature vectors are located, and the image feature vectors are computed from feature matrices extracted from the images in the same coordinate system.
10. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor runs the computer program to implement the method of any one of claims 1-8.
CN202310693001.8A 2023-06-09 2023-06-09 Mass image comparison method, device and equipment based on image information network Pending CN117115477A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310693001.8A CN117115477A (en) 2023-06-09 2023-06-09 Mass image comparison method, device and equipment based on image information network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310693001.8A CN117115477A (en) 2023-06-09 2023-06-09 Mass image comparison method, device and equipment based on image information network

Publications (1)

Publication Number Publication Date
CN117115477A true CN117115477A (en) 2023-11-24

Family

ID=88795467

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310693001.8A Pending CN117115477A (en) 2023-06-09 2023-06-09 Mass image comparison method, device and equipment based on image information network

Country Status (1)

Country Link
CN (1) CN117115477A (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116188821A (en) * 2023-04-25 2023-05-30 青岛尘元科技信息有限公司 Copyright detection method, system, electronic device and storage medium
CN116188805A (en) * 2023-04-26 2023-05-30 青岛尘元科技信息有限公司 Image content analysis method and device for massive images and image information network

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116188821A (en) * 2023-04-25 2023-05-30 青岛尘元科技信息有限公司 Copyright detection method, system, electronic device and storage medium
CN116188805A (en) * 2023-04-26 2023-05-30 青岛尘元科技信息有限公司 Image content analysis method and device for massive images and image information network

Similar Documents

Publication Publication Date Title
CN116188805B (en) Image content analysis method and device for massive images and image information network
CN111753041B (en) Data aggregation rendering method, device and system, electronic equipment and storage medium
CN116188821B (en) Copyright detection method, system, electronic device and storage medium
CN116188822B (en) Image similarity judging method, device, electronic equipment and storage medium
CN110399487B (en) Text classification method and device, electronic equipment and storage medium
Celebi et al. An effective real-time color quantization method based on divisive hierarchical clustering
CN113762280A (en) Image category identification method, device and medium
CN111583274A (en) Image segmentation method and device, computer-readable storage medium and electronic equipment
KR20220051162A (en) Visual positioning methods, training methods for related models, and related devices and devices
CN109598250B (en) Feature extraction method, device, electronic equipment and computer readable medium
CN113657087B (en) Information matching method and device
WO2015001416A1 (en) Multi-dimensional data clustering
US6853374B2 (en) Image space display method and apparatus
CN113674425B (en) Point cloud sampling method, device, equipment and computer readable storage medium
CN108363740B (en) IP address analysis method and device, storage medium and terminal
CN113436223B (en) Point cloud data segmentation method and device, computer equipment and storage medium
CN108388869B (en) Handwritten data classification method and system based on multiple manifold
CN117115477A (en) Mass image comparison method, device and equipment based on image information network
CN116386048A (en) Seal removing method, device, equipment and storage medium
CN109213515A (en) Normalizing method and device and an electronic equipment are buried under multi-platform
CN116721113A (en) Three-dimensional point cloud plane segmentation method and system
CN111931794B (en) Sketch-based image matching method
CN117152650B (en) Video content analysis method and video event information network for massive videos
CN109949076B (en) Method for establishing hypersphere mapping model, information recommendation method and device
CN116152530B (en) Image difference determining method and device, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination