CN110378425B - Intelligent image comparison method and system - Google Patents

Intelligent image comparison method and system

Info

Publication number
CN110378425B
CN110378425B (application CN201910665589.XA)
Authority
CN
China
Prior art keywords
image
comparison
target image
texture feature
feature vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910665589.XA
Other languages
Chinese (zh)
Other versions
CN110378425A (en)
Inventor
苑贵全
李慧
骞一凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan luosiyashi Technology Co.,Ltd.
Original Assignee
Wuhan Luosiyashi Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Luosiyashi Technology Co ltd filed Critical Wuhan Luosiyashi Technology Co ltd
Priority to CN201910665589.XA priority Critical patent/CN110378425B/en
Publication of CN110378425A publication Critical patent/CN110378425A/en
Application granted granted Critical
Publication of CN110378425B publication Critical patent/CN110378425B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
    • G06F16/5862Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/22Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Library & Information Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Image Analysis (AREA)

Abstract

The application discloses an intelligent image comparison method and system. The method specifically comprises the following steps: obtaining a target image; calculating a texture feature vector of the target image; comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in a comparison database to obtain a comparison rate; forming a standby image set from the images in the comparison database whose comparison rate is higher than a specified threshold; comparing the spatial relationship feature vectors of each image in the standby image set with those of the target image to obtain a relative rate; and combining the relative rate and the comparison rate to obtain the image comparison rate. The method and system can accurately compare the similarity between the image requiring authentication and each image stored in the database, improving comparison accuracy.

Description

Intelligent image comparison method and system
Technical Field
The present application relates to the field of image processing, and in particular, to an intelligent image comparison method and system.
Background
In the prior art, a plurality of images are usually stored in a database in advance, and an image to be authenticated is compared with the images in the database to complete the comparison. In such schemes, the feature quantities of the image to be authenticated are simply compared with those of the database images, and authentication passes if the comparison results are consistent. Such a rough comparison easily produces erroneous results; for example, an image that is only partially similar may still pass authentication. A more accurate intelligent image comparison method is therefore needed, one that accurately compares the image to be authenticated with the images stored in the database and reduces the possibility of comparison errors.
Disclosure of Invention
The invention aims to provide an intelligent image comparison method and system that can accurately compare the similarity between an image requiring authentication and each image stored in a database, improving comparison accuracy.
In order to achieve the above object, the present application provides an intelligent image comparison method, which specifically includes the following steps: obtaining a target image; calculating a texture feature vector of the target image; comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in a comparison database to obtain a comparison rate; forming a standby image set from the images in the comparison database whose comparison rate is higher than a specified threshold; comparing the spatial relationship feature vectors of each image in the standby image set with those of the target image to obtain a relative rate; and combining the relative rate and the comparison rate to obtain the image comparison rate.
the above, wherein before calculating the texture feature vector of the target image, analyzing the target image is further included.
As described above, the luminance information and the frequency characteristic information of the target object, and the shape, the position, and the size information of each part of the target object are analyzed.
As above, the calculating of the texture feature vector of the target image specifically includes the following steps: determining a trial region; setting the trial region to high resolution; and calculating the texture feature vector in the trial region, wherein the texture feature vector is expressed as:

$$f_1 = \sum_{i=1}^{L} \sum_{j=1}^{L} p_d^2(i, j)$$

wherein L represents the gray level of the image; i, j represent the gray values of pixels; d represents the spatial positional relationship between two pixels; and $p_d^2(i, j)$ represents the square of the probability of going from pixel gray i to pixel gray j under the spatial positional relationship d.
As above, wherein the texture feature vector is expressed as:

$$f_2 = -\sum_{i=1}^{L} \sum_{j=1}^{L} p_d(i, j)\,\lg p_d(i, j)$$

wherein L represents the gray level of the image; i, j represent the gray values of pixels; d represents the spatial positional relationship between two pixels; $p_d(i, j)$ represents the probability of going from pixel gray i to pixel gray j under the spatial positional relationship d; and lg denotes the common logarithm.
As above, wherein the plurality of images are registered before the comparison of the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database is performed.
The registration process includes collecting registration information for the plurality of images, wherein the registration information includes a person's name or an object's code by which the image can be identified or called up.
As above, wherein the comparison rate is expressed as:

[Formula rendered as an image in the original; the comparison rate is computed from $f_a$ and $f_a'$]

wherein a = 1, 2, 3, 4; $f_a$ represents a texture feature vector of the target image, and $f_a'$ represents the corresponding texture feature vector of an image in the database.
An intelligent image comparison system comprises an acquisition unit, a vector calculation unit, a comparison unit and a merging unit; an acquisition unit configured to acquire a target image; the vector calculation unit is used for calculating a texture feature vector of the target image; the comparison unit is used for comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database, determining the comparison rate and finally forming a standby image set according to the comparison rate; and the merging unit is used for acquiring the relative rate and merging the relative rate and the comparison rate to form an image comparison rate.
As above, the comparison unit specifically includes the following sub-modules: a registration module, a judgment module, and an aggregation module; the registration module is used for collecting registration information of a plurality of images in advance and registering it in a database; the judgment module is used for comparing the degree of similarity between the plurality of images in the database and the target image and judging whether the degree of similarity is higher than a threshold; and the aggregation module is used for forming a standby image set from the images in the database whose degree of similarity is higher than the threshold.
The application has the following beneficial effects:
(1) the intelligent image comparison method and the intelligent image comparison system can accurately compare the similarity between the image to be authenticated and each image stored in the database, and improve the comparison accuracy.
(2) The intelligent image comparison method and system do not merely compare the similarity between the target image and the database images directly; the final comparison result is obtained through layer-by-layer calculation, making the comparison or authentication result more accurate.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flowchart of an intelligent image comparison method provided according to an embodiment of the present application;
FIG. 2 is an internal structural diagram of an intelligent image comparison system provided according to an embodiment of the present application;
FIG. 3 is a block diagram of internal sub-modules of the intelligent image comparison system according to an embodiment of the present application;
FIG. 4 is a diagram of further internal sub-modules of the intelligent image comparison system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The application relates to an intelligent image comparison method and system. According to the method and the device, the similarity between the image needing authentication and each image stored in the database can be accurately compared, and the comparison accuracy is improved.
Fig. 1 is a flowchart illustrating the intelligent image comparison method of the present application.
Step S110: a target image is obtained.
Specifically, the target image is an input image to be compared or authenticated; it may depict an object or a person, or a part of an object or a person.
Step S120: and calculating the texture feature vector of the target image.
Specifically, before the texture feature vector of the target image is calculated, the method further includes analyzing the target image, for example analyzing the luminance information and frequency characteristic information of the target object, and the digitized shape, position, and size of each of its parts; finally, whether the target image is complete is judged, and the texture feature vector is calculated.
Further, if the luminance information, the frequency characteristic information, and the digitized shape, position, and size information of each part of the target object all fall within normal values or ranges, the texture feature vector can be further calculated for the target image. The analysis method can refer to the prior art.
Specifically, the texture feature vector represents feature data of the target object. It can take the form of an energy feature, information entropy, contrast, or correlation. Each of these forms constitutes a texture feature vector, and one or more texture feature vectors may be computed for the target image.
The calculating of the texture feature vector of the target image specifically includes the following steps:
step D1: the test area is determined.
Wherein a reception window for receiving the target image is divided before the trial zone is determined.
Specifically, the receiving window is divided into a plurality of small squares, wherein the target image and the surrounding squares thereof are selected as an experimental area.
Step D2: the test area was set to high resolution.
In particular, texture sign extraction in the data of the high-resolution image can make the subsequent calculation result more accurate.
Step D3: texture feature vectors are calculated in the trial regions.
Specifically, if the energy feature is calculated as the texture feature vector, it can be expressed as (formula one):

$$f_1 = \sum_{i=1}^{L} \sum_{j=1}^{L} p_d^2(i, j)$$

wherein $f_1$ represents the texture feature vector; L represents the gray level of the image; i, j represent the gray values of pixels; d represents the spatial positional relationship between two pixels; and $p_d(i, j)$ represents the probability of going from pixel gray i to pixel gray j under the spatial positional relationship d.
If the information entropy feature is calculated as the texture feature vector, it can be expressed as (formula two):

$$f_2 = -\sum_{i=1}^{L} \sum_{j=1}^{L} p_d(i, j)\,\lg p_d(i, j)$$

wherein $f_2$ represents the texture feature vector; lg denotes the common logarithm; and the remaining symbols are as in formula one.
If the contrast feature is calculated as the texture feature vector, it can be expressed as (formula three):

$$f_3 = \sum_{n=0}^{L-1} n^2 \sum_{|i-j|=n} p_d(i, j)$$

wherein $f_3$ represents the texture feature vector; n indexes the gray-level difference $|i-j|$ between pixel pairs; and the remaining symbols are as in formula one.
If the correlation is calculated as the texture feature vector, it can be expressed as (formula four):

$$f_4 = \frac{\sum_{i=1}^{L} \sum_{j=1}^{L} i \, j \, p_d(i, j) - \mu_x \mu_y}{\sigma_x \sigma_y}$$

wherein $f_4$ represents the texture feature vector, and

$$\mu_x = \sum_{i=1}^{L} i \sum_{j=1}^{L} p_d(i, j), \qquad \mu_y = \sum_{j=1}^{L} j \sum_{i=1}^{L} p_d(i, j),$$

$$\sigma_x^2 = \sum_{i=1}^{L} (i-\mu_x)^2 \sum_{j=1}^{L} p_d(i, j), \qquad \sigma_y^2 = \sum_{j=1}^{L} (j-\mu_y)^2 \sum_{i=1}^{L} p_d(i, j).$$
specifically, one or more of the formulas one, two, three, and four may be used as texture feature vectors, which may respectively represent the uniformity, complexity, sharpness, and linear relationship of the target image.
Step S130: comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database to obtain a comparison rate;
before comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database, the method also comprises the steps of collecting registration information of the plurality of images, registering in the database and storing. Wherein the target image is compared to the images in the database for authentication.
Specifically, the registration information may be attached with a name of a person or a code number of an object so as to be recognized or called. In addition, the registration information further includes at least one registration image. The retrieval of the registered image can be performed based on the name of the person or the code number of the object.
Further, a registration image comprises the image used for authentication together with its associated information, specifically a structure containing the identification information, the captured image, and its texture feature vector.
Specifically, before comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database, the selection category of the texture feature vector of the target image in step S120 needs to be determined. The class of the texture feature vector of the image in the database should be consistent with the selected class of the texture feature vector of the target image.
For the convenience of distinction, the texture feature vector of the target image is defined as a "target texture feature vector", and the texture feature vectors of the plurality of images in the database are defined as "original texture feature vectors".
Illustratively, if the energy feature $f_1$ and the information entropy $f_2$ are selected in step S120 as the target texture feature vector, the corresponding energy feature $f_1'$ and information entropy $f_2'$ of the plurality of images in the database must be used as the original texture feature vectors. Preferably, the original feature vectors can be computed with the methods of formulas one to four in steps D1-D3.
Specifically, the comparison rate describes how similar each of the plurality of images in the database is to the target image. It can be expressed as:

[Formula rendered as an image in the original, computed from $f_a$ and $f_a'$]

wherein a = 1, 2, 3, 4. The higher the comparison rate, the more similar the database image is to the target image.
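Because the comparison-rate formula survives only as an image, its exact expression is not recoverable from the text; the following Python stand-in is an assumption that maps a normalized L1 distance between $f_a$ and $f_a'$ to (0, 1], so that a higher value means greater similarity, consistent with the description.

```python
import numpy as np

def comparison_rate(f_target, f_db):
    """Illustrative comparison rate between the target texture feature
    vector (f_a, a = 1..4) and a database vector f_a'.

    The patented formula is not reproduced here (it exists only as an
    image); this assumed stand-in maps a normalized L1 distance to (0, 1].
    """
    f_t = np.asarray(f_target, dtype=np.float64)
    f_d = np.asarray(f_db, dtype=np.float64)
    dist = np.sum(np.abs(f_t - f_d) / (np.abs(f_t) + np.abs(f_d) + 1e-12))
    return 1.0 / (1.0 + dist)

# Step S140 as a usage example (names and threshold are illustrative):
# keep the database images whose comparison rate exceeds the threshold.
# standby = {k: (img, v) for k, (img, v) in database.items()
#            if comparison_rate(f_t, v) > 0.8}
```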
Step S140: images in the comparison database having a comparison rate above a specified threshold form a set of backup images.
Preferably, the specified threshold mentioned in this embodiment is a fixed value or range set according to actual conditions; it is set manually and can be modified.
Illustratively, if the comparison rates of image A and image B in the database are above the specified threshold, meaning they are more similar to the target image, then image A and image B form the standby image set for further comparison with the target image.
Step S150: and comparing the space relation characteristic vectors of each image in the standby image set and the target image to obtain the relative rate.
Specifically, the relative rate measures how close an image in the standby image set is to the target image relative to the other images in the set. It is obtained by comparing spatial relationship features.
The method for obtaining the relative rate specifically comprises the following steps:
step P1: in the spare image set, a common region of the spare image and the target image is determined.
Illustratively, taking image A as an example, image A and the target image are likewise divided into small squares, and the common regions are determined by division and comparison.
Illustratively, image A and the target image are each divided into 16 × 16 squares, and 2 × 2 groups of squares are selected from image A and from the target image and compared one by one, to determine whether approximately matching regions exist among the 2 × 2 regions.
If so, the approximate region is taken as a common region of both the standby image and the target image, and step P2 is performed; otherwise, the 2 × 2 window is replaced with a window of another size and the comparison continues.
Preferably, the determination of the common region is made with reference to prior art methods of comparison between images.
Specifically, the common regions in the standby image and in the target image are in one-to-one correspondence. For example, if the 2 × 2 region at the upper left of the standby image coincides with the 2 × 2 region at the upper left of the target image, the two regions are common regions and correspond to each other.
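The block matching described above can be sketched as follows. The mean-absolute-difference test, the tolerance value, and the assumption that both images are equal-sized grayscale arrays are illustrative, since the patent defers the actual region comparison to prior-art methods; each matched region is reduced to an intensity-weighted centroid so that the later spatial relation has coordinates to work with.

```python
import numpy as np

def region_centroid(block, y0, x0):
    """Intensity-weighted centroid of a block, in full-image coordinates."""
    ys, xs = np.mgrid[0:block.shape[0], 0:block.shape[1]]
    w = block.astype(np.float64) + 1e-12
    return (y0 + (ys * w).sum() / w.sum(), x0 + (xs * w).sum() / w.sum())

def common_regions(standby, target, grid=16, win=2, tol=10.0):
    """Corresponding 'common region' centres for win x win groups of grid
    cells whose pixels approximately agree at the same position.

    Returns a list of (target_centre, standby_centre) pairs. The
    mean-absolute-difference test and `tol` are assumptions; both inputs
    are assumed to be equal-sized 2-D grayscale arrays.
    """
    h, w = target.shape
    ch, cw = h // grid, w // grid
    pairs = []
    for gy in range(0, grid - win + 1, win):
        for gx in range(0, grid - win + 1, win):
            ys, xs = gy * ch, gx * cw
            ye, xe = ys + win * ch, xs + win * cw
            a = standby[ys:ye, xs:xe].astype(np.float64)
            b = target[ys:ye, xs:xe].astype(np.float64)
            if np.mean(np.abs(a - b)) < tol:  # approximate agreement
                pairs.append((region_centroid(b, ys, xs),
                              region_centroid(a, ys, xs)))
    return pairs
```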
Step P2: and calculating the spatial relation of the target image common region.
Illustratively, if image A and the target image have 5 common regions, any two of these common regions are selected for the calculation of the spatial relationship.
The spatial relationship is determined by distance; specifically, the common distance $D_{CD}$ can be expressed as:

[Formula rendered as an image in the original]

wherein $\theta_{CD}$ is the included angle of the directed line segment formed by any two common regions C, D.
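The $D_{CD}$ formula itself is only available as an image. Assuming each common region is represented by its centre, a Euclidean distance together with the directed-segment angle $\theta_{CD}$ can be computed as in this sketch; both the Euclidean form and the atan2 angle are assumptions.

```python
import math

def common_distance(c, d):
    """Distance and directed-segment angle between two common regions C, D,
    each given as a (y, x) centre. The Euclidean form and the atan2 angle
    are assumptions; the original formula exists only as an image.
    """
    dy, dx = d[0] - c[0], d[1] - c[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)  # (D_CD, theta_CD)
```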
Step P3: the spatial relationship of the common regions in the alternative images is calculated.
Specifically, the common regions in the standby image used for calculating the spatial relationship must correspond to the regions in the target image; they can be denoted C' and D'. The spatial relationship in the standby image can be calculated with reference to the distance formula in step P2, giving the common distance $D_{C'D'}$ (the formula is rendered as an image in the original).
If the distance between the common regions in the standby image and the distance between the corresponding common regions in the target image are the same, or differ by no more than a specified threshold, the two images are considered to match and the spatial relationship is the same.
Step P4: the relative rate is calculated.
Illustratively, if image A has 5 common regions similar to regions of the target image, 2 pairs of common regions corresponding to the target image are selected from image A for comparison, and it is checked whether they match (note that, since the common regions are only approximate, the similarity between common regions, that is, whether they match, needs to be compared further).
For example, the 2 pairs of common regions C', D' and C'', D'' are selected. If only the common distance of C' and D' in image A is the same as the corresponding common distance in the target image, the number of matching common regions is 2. The other standby images determine their matching common regions in the same way as image A.
Specifically, the relative rate S may be expressed as S = N/N', where N denotes the number of matching common regions and N' denotes the number of approximate common regions shared by the target image and the standby image.
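Building on the two sketches above, the relative rate of step P4 can be computed by comparing common distances pairwise and counting matched regions; the distance tolerance is an assumed parameter.

```python
from itertools import combinations

def relative_rate(pairs, tol=2.0):
    """Relative rate S = N / N' for one standby image.

    `pairs` is the (target_centre, standby_centre) list returned by
    common_regions(); a pair of regions matches when its common distance
    agrees between the two images within `tol` (an assumed threshold).
    N counts matched regions, N' the shared approximate regions.
    """
    n_prime = len(pairs)
    matched = set()
    for i, j in combinations(range(n_prime), 2):
        dt, _ = common_distance(pairs[i][0], pairs[j][0])  # target distance
        ds, _ = common_distance(pairs[i][1], pairs[j][1])  # standby distance
        if abs(dt - ds) <= tol:
            matched.update((i, j))
    return len(matched) / n_prime if n_prime else 0.0
```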
Step S160: and combining the relative rate and the comparison rate to obtain the image comparison rate.
Specifically, the relative rate and the comparison rate are combined by a formula (rendered as an image in the original) to give the image comparison rate.
If the value ranges of the relative rate and the comparison rate differ greatly, the difference between them can be adjusted through weights to make the result more accurate.
The image comparison rate is the final comparison result. Preferably, the higher the image comparison rate, the more similar the target image is to that image in the standby image set.
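The merge formula is likewise an image in the original; given the remark about weights, a weighted sum is one plausible reading, sketched here with assumed equal weights.

```python
def image_comparison_rate(cmp_rate, rel_rate, w_cmp=0.5, w_rel=0.5):
    """Assumed merge of comparison rate and relative rate: a weighted sum,
    with the weights available to compensate for differing value ranges
    as the description suggests. The patented formula is an image and is
    not reproduced here.
    """
    return w_cmp * cmp_rate + w_rel * rel_rate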
The present application further provides an intelligent image comparison system, as shown in fig. 2, the intelligent image comparison system includes an obtaining unit 201, a vector calculating unit 202, a comparing unit 203, and a merging unit 204.
Wherein the acquisition unit 201 is used for acquiring a target image.
The vector calculation unit 202 is connected to the acquisition unit 201, and is configured to calculate a texture feature vector of the target image.
The comparing unit 203 is connected to the vector calculating unit 202, and configured to compare the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database, determine a comparison rate, and finally form a standby image set according to the comparison rate.
The merging unit 204 is connected to the comparing unit 203, and is configured to obtain the relative rate and merge the relative rate and the comparison rate to form the image comparison rate.
Further, as shown in fig. 3, the comparing unit 203 includes a registration module 301, a judging module 302, and an aggregation module 303.
Wherein the registration module 301 is configured to collect registration information of a plurality of images in advance and register the registration information in the database.
The judging module 302 is connected to the registering module 301, and is configured to compare the similarity degrees of the plurality of images in the database with the target image, and judge whether the similarity degree is higher than a threshold.
The aggregation module 303 is connected to the judging module 302, and is configured to form a standby image set from the images in the database whose degree of similarity is higher than the threshold.
Still further, as shown in fig. 4, the merging unit 204 specifically includes the following sub-modules: a common region determining module 401, a spatial relationship calculating module 402, a relative ratio calculating module 403, and an image comparison ratio calculating module 404.
The common region determining module 401 is configured to determine a common region between the standby image and the target image.
The spatial relationship calculation module 402 is connected to the common region determination module 401, and is configured to calculate a spatial relationship between the target image common region and the spare image common region.
The relative ratio calculating module 403 is connected to the spatial relationship calculating module 402 for calculating the relative ratio.
The image comparison rate calculating module 404 is connected to the relative rate calculating module 403, and is configured to calculate an image comparison rate according to the comparison rate and the relative rate.
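Tying the units together, a minimal pipeline class might look as follows, reusing the sketch functions above. The class and parameter names are hypothetical, and the database layout (a registered name or code mapped to an image and its texture feature vector) is an assumption based on the registration description; this is a sketch, not the patented implementation.

```python
class ImageComparisonSystem:
    """Minimal pipeline mirroring units 201-204, under the assumptions of
    the snippets above.

    `database` maps a registered name or code to a tuple of
    (grayscale image, texture feature vector).
    """

    def __init__(self, database, cmp_threshold=0.8):
        self.database = database
        self.cmp_threshold = cmp_threshold  # assumed standby-set threshold

    def compare(self, target):
        """Return {name_or_code: image comparison rate} for the target."""
        f_t = texture_features(glcm(target))      # vector calculation unit 202
        results = {}
        for key, (img, f_db) in self.database.items():
            rate = comparison_rate(f_t, f_db)     # comparison unit 203
            if rate <= self.cmp_threshold:        # standby-set filtering
                continue
            pairs = common_regions(img, target)   # common region module 401
            rel = relative_rate(pairs)            # modules 402-403
            results[key] = image_comparison_rate(rate, rel)  # module 404
        return results
```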
The application has the following beneficial effects:
(1) the intelligent image comparison method and the intelligent image comparison system can accurately compare the similarity between the image to be authenticated and each image stored in the database, and improve the comparison accuracy.
(2) The intelligent image comparison method and system do not merely compare the similarity between the target image and the database images directly; the final comparison result is obtained through layer-by-layer calculation, making the comparison or authentication result more accurate.
Although the present application has been described with reference to examples, which are intended to be illustrative only rather than limiting, changes, additions, and/or deletions may be made to the embodiments without departing from the scope of the application.
The above description is only for the specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present application, and shall be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An intelligent image comparison method is characterized by comprising the following steps:
obtaining a target image;
calculating a texture feature vector of the target image;
comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database to obtain a comparison rate;
forming a standby image set by the images in the comparison database with the comparison rate higher than a specified threshold value;
comparing the space relation characteristic vectors of each image and the target image in the standby image set to obtain a relative rate;
combining the relative rate and the comparison rate to obtain an image comparison rate;
the step of comparing the spatial relationship feature vectors of each image in the standby image set with the target image to obtain the relative rate comprises the following substeps:
determining the common regions of the standby image and the target image in the standby image set;
calculating the spatial relationship of the common regions of the target image;
calculating the spatial relationship of the common regions in the standby image;
calculating a relative rate;
wherein the spatial relationship is determined by distance; specifically, the common distance $D_{CD}$ can be expressed as:

[Formula rendered as an image in the original]

wherein $\theta_{CD}$ is the included angle of the directed line segment formed by any two common regions C, D;
wherein the relative rate S is expressed as S = N/N', where N denotes the number of matching common regions, and N' denotes the number of approximate common regions in the target image and the standby image.
2. The method of claim 1, wherein the analyzing the target image is further performed before calculating the texture feature vector of the target image.
3. The method of claim 2, wherein the analysis is performed based on the brightness information and the frequency characteristic information of the target object and the shape, position, and size information of each part of the target object.
4. The method of claim 1, wherein calculating the texture feature vector of the target image comprises the steps of:
determining a trial region;
setting the trial region to high resolution;
calculating a texture feature vector in the trial region;
wherein the texture feature vector is expressed as:

$$f_1 = \sum_{i=1}^{L} \sum_{j=1}^{L} p_d^2(i, j)$$

wherein L represents the gray level of the image; i, j represent the gray values of pixels; d represents the spatial positional relationship between two pixels; and $p_d^2(i, j)$ represents the square of the probability of going from pixel gray i to pixel gray j under the spatial positional relationship d.
5. The intelligent image comparison method of claim 1, wherein the texture feature vector is expressed as:

$$f_2 = -\sum_{i=1}^{L} \sum_{j=1}^{L} p_d(i, j)\,\lg p_d(i, j)$$

wherein L represents the gray level of the image; i, j represent the gray values of pixels; d represents the spatial positional relationship between two pixels; $p_d(i, j)$ represents the probability of going from pixel gray i to pixel gray j under the spatial positional relationship d; and lg denotes the common logarithm.
6. The method of claim 1, wherein the plurality of images are registered prior to comparing the texture feature vector of the target image to the texture feature vectors of the plurality of images in the comparison database.
7. The method of claim 6, wherein the registration process comprises collecting registration information of the plurality of images, the registration information comprising names of persons or codes of objects, and identifying or calling the names of persons or codes of objects in the registration information.
8. The intelligent image comparison method of claim 1, wherein the comparison rate is expressed as:

[Formula rendered as an image in the original, computed from $f_a$ and $f_a'$]

wherein a = 1, 2, 3, 4; $f_a$ represents a texture feature vector of the target image, and $f_a'$ represents the corresponding texture feature vector of an image in the database.
9. An intelligent image comparison system is characterized by comprising an acquisition unit, a vector calculation unit, a comparison unit and a merging unit;
an acquisition unit configured to acquire a target image;
the vector calculation unit is used for calculating a texture feature vector of the target image;
the comparison unit is used for comparing the texture feature vector of the target image with the texture feature vectors of the plurality of images in the comparison database, determining the comparison rate and finally forming a standby image set according to the comparison rate;
the merging unit is used for acquiring the relative rate and merging the relative rate and the comparison rate to form an image comparison rate;
the step of comparing the spatial relationship feature vectors of each image in the standby image set with the target image to obtain the relative rate comprises the following substeps:
determining the common regions of the standby image and the target image in the standby image set;
calculating the spatial relationship of the common regions of the target image;
calculating the spatial relationship of the common regions in the standby image;
calculating a relative rate;
wherein the spatial relationship is determined by distance; specifically, the common distance $D_{CD}$ can be expressed as:

[Formula rendered as an image in the original]

wherein $\theta_{CD}$ is the included angle of the directed line segment formed by any two common regions C, D;
wherein the relative rate S is expressed as S = N/N', where N denotes the number of matching common regions, and N' denotes the number of approximate common regions in the target image and the standby image.
10. The intelligent image comparison system of claim 9, wherein the comparison unit comprises the following sub-modules: the device comprises a registration module, a judgment module and an aggregation module;
the registration module is used for collecting registration information of a plurality of images in advance and registering the registration information in a database;
the judging module is used for comparing the similarity degree of the plurality of images in the database with the target image and judging whether the similarity degree is higher than a threshold value;
and the collection module is used for forming a standby image collection by the images in the database with the similarity degree higher than the threshold value if the similarity degree is higher than the threshold value.
CN201910665589.XA 2019-07-23 2019-07-23 Intelligent image comparison method and system Active CN110378425B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910665589.XA CN110378425B (en) 2019-07-23 2019-07-23 Intelligent image comparison method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910665589.XA CN110378425B (en) 2019-07-23 2019-07-23 Intelligent image comparison method and system

Publications (2)

Publication Number Publication Date
CN110378425A CN110378425A (en) 2019-10-25
CN110378425B true CN110378425B (en) 2021-10-22

Family

ID=68255051

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910665589.XA Active CN110378425B (en) 2019-07-23 2019-07-23 Intelligent image comparison method and system

Country Status (1)

Country Link
CN (1) CN110378425B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111368932B (en) * 2020-03-16 2021-05-28 赢技科技发展(杭州)有限公司 Image comparison method and system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101075868A (en) * 2006-05-19 2007-11-21 华为技术有限公司 Long-distance identity-certifying system, terminal, servo and method
CN102866871A (en) * 2012-08-03 2013-01-09 甲壳虫(上海)网络科技有限公司 Method for dynamically displaying image
CN104572971A (en) * 2014-12-31 2015-04-29 安科智慧城市技术(中国)有限公司 Image retrieval method and device
US9767348B2 (en) * 2014-11-07 2017-09-19 Noblis, Inc. Vector-based face recognition algorithm and image search system
CN109543535A (en) * 2018-10-23 2019-03-29 华南理工大学 Three-dimensional refers to vena characteristic extracting method and its matching process

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106067172B (en) * 2016-05-27 2018-10-26 哈尔滨工程大学 A method of slightly matching matches combination to the underwater topography image based on suitability analysis with essence
CN106991419A (en) * 2017-03-13 2017-07-28 特维轮网络科技(杭州)有限公司 Method for anti-counterfeit based on tire inner wall random grain

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101075868A (en) * 2006-05-19 2007-11-21 华为技术有限公司 Long-distance identity-certifying system, terminal, servo and method
CN102866871A (en) * 2012-08-03 2013-01-09 甲壳虫(上海)网络科技有限公司 Method for dynamically displaying image
US9767348B2 (en) * 2014-11-07 2017-09-19 Noblis, Inc. Vector-based face recognition algorithm and image search system
CN104572971A (en) * 2014-12-31 2015-04-29 安科智慧城市技术(中国)有限公司 Image retrieval method and device
CN109543535A (en) * 2018-10-23 2019-03-29 华南理工大学 Three-dimensional refers to vena characteristic extracting method and its matching process

Also Published As

Publication number Publication date
CN110378425A (en) 2019-10-25

Similar Documents

Publication Publication Date Title
US11403839B2 (en) Commodity detection terminal, commodity detection method, system, computer device, and computer readable medium
CN102662949B (en) Method and system for retrieving specified object based on multi-feature fusion
CN106529559A (en) Pointer-type circular multi-dashboard real-time reading identification method
CN109784270B (en) Processing method for improving face picture recognition integrity
CN105139011B (en) A kind of vehicle identification method and device based on mark object image
CN106817677A (en) A kind of indoor objects information identifying method, apparatus and system based on multisensor
CN103136525A (en) High-precision positioning method for special-shaped extended target by utilizing generalized Hough transformation
CN110197185B (en) Method and system for monitoring space under bridge based on scale invariant feature transform algorithm
EP2410467A1 (en) System and method for identifying image locations showing the same person in different images
CN108205645B (en) Reference image quality evaluation method of heterogeneous image matching system
CN108710841A (en) A kind of face living body detection device and method based on MEMs infrared sensor arrays
CN114925348B (en) Security verification method and system based on fingerprint identification
CN111563896A (en) Image processing method for catenary anomaly detection
CN103093243A (en) High resolution panchromatic remote sensing image cloud discriminating method
CN110378425B (en) Intelligent image comparison method and system
CN116596428B (en) Rural logistics intelligent distribution system based on unmanned aerial vehicle
US20030044067A1 (en) Apparatus and methods for pattern recognition based on transform aggregation
CN114550074B (en) Image recognition method and system based on computer vision
CN116246308A (en) Multi-target tracking early warning method and device based on visual recognition and terminal equipment
CN116188826A (en) Template matching method and device under complex illumination condition
CN116230253A (en) Pharmacy medicine checking method and device based on image recognition and storage medium
CN114373203A (en) Picture archiving method and device, terminal equipment and computer readable storage medium
CN114677428A (en) Power transmission line icing thickness detection method based on unmanned aerial vehicle image processing
CN114694042A (en) Disguised person target detection method based on improved Scaled-YOLOv4
JPH0991432A (en) Method for extracting doubtful person

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210923

Address after: 430000 C1-10, room 02, floor 5, building 2, international enterprise center, No. 1, Guanshan Second Road, East Lake New Technology Development Zone, Wuhan, Hubei Province (Wuhan area of free trade zone) (one site with multiple photos)

Applicant after: Wuhan luosiyashi Technology Co.,Ltd.

Address before: 101300 Beijing Shunyi District Airport Street, No. 1 Anhua Street, 1st Building, 1st Floor, No. 2159

Applicant before: BEIJING LONGPU INTELLIGENT TECHNOLOGY Co.,Ltd.

GR01 Patent grant