CN114581853A - Image matching method, system, device and medium - Google Patents

Image matching method, system, device and medium

Info

Publication number
CN114581853A
CN114581853A (application number CN202210240581.0A)
Authority
CN
China
Prior art keywords: image, target image, target, images, determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210240581.0A
Other languages
Chinese (zh)
Inventor
胡舒瀚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Unisinsight Technology Co Ltd
Original Assignee
Chongqing Unisinsight Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Unisinsight Technology Co Ltd filed Critical Chongqing Unisinsight Technology Co Ltd
Priority to CN202210240581.0A
Publication of CN114581853A

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides an image matching method, system, device and medium. The method acquires initial images collected by an image acquisition device through a first mode and a second mode to obtain a plurality of first images and a plurality of second images. One of the first images is taken as a target image, image feature vectors are extracted from the target image and from each second image, and the image feature cosine similarity between the target image and each second image is determined. An image set is then determined according to the image feature cosine similarity and a preset image feature cosine similarity threshold. Image information of the target image and of each second image in the image set is acquired and compared to determine the matching relationship between the target image and each second image in the image set. The method thus provides an automated image matching scheme and improves the accuracy and efficiency of image matching.

Description

Image matching method, system, device and medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to an image matching method, system, device, and medium.
Background
Road traffic cameras are currently the main equipment for traffic violation detection. A traffic camera automatically detects a target vehicle, captures the violating vehicle to obtain evidence, and returns the evidence photo to a command center for verification. In practical application scenarios, a large number of cameras are usually connected to one or more platforms and intersection hosts to process violation events centrally.
Because picture transmission involves interaction between the camera and the platform, when the picture volume is large the snapshot pictures on the device may be inconsistent with the transmitted pictures, which hinders violation auditing; the larger the picture volume, the higher the probability of such inconsistency. In the related art, matching between device snapshot pictures and transmitted pictures mainly relies on manual comparison and verification, which is inefficient when the number of pictures is large and is prone to misjudgment, so a convenient and fast image matching method is urgently needed.
Disclosure of Invention
In view of the above drawbacks of the prior art, the present invention provides an image matching method, system, device and medium to solve the technical problems of low efficiency and poor accuracy caused by manually matching device snapshot pictures with transmitted pictures.
The invention provides an image matching method, which comprises the following steps:
acquiring a plurality of initial images acquired by image acquisition equipment through a first mode and a second mode respectively to obtain a plurality of first images and a plurality of second images;
determining a first image as a target image, respectively extracting image feature vectors of the target image and each second image, determining image feature cosine similarity between the target image and each second image, and determining an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold, wherein the image set comprises the target image and at least one second image;
and respectively acquiring image information of the target image in the image set and of each second image in the image set, and comparing the image information to determine the matching relationship between the target image in the image set and each second image in the image set, wherein the image information comprises at least one of resolution, image feature vector and color space value.
Optionally, before the image feature vector extraction is performed on the target image and each second image, the method further includes:
respectively acquiring a first number of first images and a second number of second images;
and determining the quantity stability according to the first quantity and the second quantity, respectively extracting image feature vectors of the target image and each second image if the quantity stability is less than the preset stability, and prompting to investigate the network environment if the quantity stability is greater than the preset stability.
Optionally, the method further comprises at least one of:
if the image information comprises the resolution, the image information comparison mode comprises the steps of respectively obtaining the resolutions of the target image and each second image, determining the resolution similarity between the target image and each second image, and determining the matching relation between the target image and each second image in the image set according to the resolution similarity and a preset resolution similarity threshold;
if the image information comprises color space values, the image information comparison mode comprises the steps of respectively obtaining RGB color values of the target image and each second image, determining the color space values of the target image and each second image, and determining the matching relation between the target image and each second image in the image set according to the color space values, wherein the color space values comprise brightness, saturation and chroma;
if the image information comprises image feature vectors, the image information comparison mode comprises the steps of determining the feature vector error rate of the target image and each second image according to the image feature vectors of the target image and each second image, and determining the matching relation between the target image and each second image in the image set according to the feature vector error rate and a preset feature vector error rate threshold value.
Optionally, if the resolution of the target image is H px × W px and the resolution of the second image is h px × w px, the resolution similarity is determined as follows:
[formula shown as an image in the original publication]
where RESsimi is the resolution similarity;
the manner in which the error rate of the feature vector is determined includes,
[formula shown as an image in the original publication]
wherein Tx is a feature vector error rate, Tp is an image feature vector of the second image, and Te is an image feature vector of the target image;
the manner in which the color space value is determined includes,
I = aR + bG + cB,
[formula for S, shown as an image in the original publication]
[formula for H, shown as an image in the original publication]
where I is the brightness, a, b and c are preset coefficients, R, G and B are respectively the R, G and B values of the target image or the second image, S is the saturation, Max is the maximum of the R, G and B values, Min is the minimum of the R, G and B values, and H is the chroma, where R, G, B, S, I ∈ [0,1] and H ∈ [0,360].
Optionally, if the image information includes a color space value, the determining a matching relationship between the target image and each second image in the image set includes:
determining a color space error rate between the target image and each of the second images based on the color space values of the target image and each of the second images;
determining a matching relationship between the target image and each second image in the image set according to the color space error rate between the target image and each second image;
the manner in which the color space error rate is determined includes,
[formula shown as an image in the original publication]
where HSIx is the color space error rate, Hp is the chroma of the second image, He is the chroma of the target image, Sp is the saturation of the second image, Se is the saturation of the target image, Ip is the luminance of the second image, and Ie is the luminance of the target image.
Optionally, the image information includes a resolution, an image feature vector, and a color space value, and the comparing the image information to determine a matching relationship between the target image and each second image in the image set includes:
respectively obtaining the resolutions of the target image and each second image, determining the resolution similarity between the target image and each second image, and determining coarse matching second images from the second images according to the resolution similarity and a preset resolution similarity threshold, wherein the resolution similarity between a coarse matching second image and the target image is greater than the preset resolution similarity threshold;
respectively acquiring the RGB color values of the target image and each coarse matching second image, determining the color space values of the target image and each coarse matching second image, and determining the color space error rate between the target image and each coarse matching second image;
determining the feature vector error rate of the target image and each coarse matching second image according to the image feature vectors of the target image and each coarse matching second image;
and if the color space error rate is less than a preset color space error rate threshold and the feature vector error rate is less than a preset feature vector error rate threshold, determining the coarse matching second image as a matching image of the target image.
Optionally, each first image is sequentially determined as a target image, and a matching relationship between each target image and each second image is determined, where the method further includes at least one of:
if a second image matched with the target image exists, storing the target image and the second image into a first storage space;
if a second image matched with the target image exists, adding the identification information of the second image into the file name of the target image;
if a plurality of second images matched with the target image exist, storing the target image and one second image into a first storage space, and storing other second images matched with the target image into a second storage space;
if the second image matched with the target image does not exist, storing the target image into a third storage space;
and if a second image does not match any of the first images, storing the second image into a fourth storage space.
Optionally, the method further comprises at least one of:
acquiring the number of images of a target image in a first storage space as a correct matching number;
acquiring the number of images of a second image in a second storage space as the number of repeated matches;
acquiring the number of images of the target image in the third storage space as the number of unmatched images;
the number of images of the second image in the fourth storage space is acquired as the number of error matches.
The present invention also provides an image matching system, the system comprising:
the acquisition module is used for respectively acquiring a plurality of initial images acquired by the image acquisition equipment in a first mode and a second mode to obtain a plurality of first images and a plurality of second images;
the image set determining module is used for determining a first image as a target image, respectively extracting image feature vectors of the target image and each second image, determining image feature cosine similarity between the target image and each second image, and determining an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold, wherein the image set comprises the target image and at least one second image;
and the matching module is used for respectively acquiring the second images in the image set and the image information of the target image in the image set, and comparing the image information to determine the matching relationship between the target image in the image set and the second images in the image set, wherein the image information comprises at least one of resolution, image characteristic vector and color space value.
The invention also provides an electronic device, which comprises a processor, a memory and a communication bus;
the communication bus is used for connecting the processor and the memory;
the processor is configured to execute the computer program stored in the memory to implement the method according to any one of the embodiments described above.
The present invention also provides a computer-readable storage medium, having stored thereon a computer program,
the computer program is for causing a computer to perform a method as in any one of the embodiments described above.
The invention has the following beneficial effects: the invention provides an image matching method, system, device and medium. The method acquires initial images collected by an image acquisition device through a first mode and a second mode to obtain a plurality of first images and a plurality of second images; takes one first image as the target image; extracts image feature vectors from the target image and from each second image; determines the image feature cosine similarity between the target image and each second image; determines an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold; and acquires and compares image information of the target image in the image set and of each second image in the image set to determine the matching relationship between them. An automated image matching scheme is thus provided, and the accuracy and efficiency of image matching are improved.
Drawings
FIG. 1 is a flow chart of an image matching method provided in an embodiment of the invention;
fig. 2 is a schematic flowchart of a process of determining an image set according to cosine similarity of image features and a preset threshold of cosine similarity of image features provided in an embodiment of the present invention;
FIG. 3 is a flowchart illustrating a specific image information comparison process according to an embodiment of the present invention;
FIG. 4 is another schematic flow chart of specific image information comparison provided in an embodiment of the present invention;
FIG. 5 is a flowchart illustrating an exemplary image matching method according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of an exemplary image matching system;
fig. 7 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below with reference to specific embodiments, and other advantages and effects of the present invention will be easily understood by those skilled in the art from the disclosure of the present specification. The invention is capable of other and different embodiments and of being practiced or of being carried out in various ways, and its several details are capable of modification in various respects, all without departing from the spirit and scope of the present invention. It is to be noted that the features in the following embodiments and examples may be combined with each other without conflict.
It should be noted that the drawings provided in the following embodiments only illustrate the basic idea of the invention in a schematic way; they show only the components related to the invention rather than the actual number, shape and size of components in implementation, and in actual implementation the type, quantity, proportion and layout of the components may be changed freely and may be more complex.
In the following description, numerous details are set forth to provide a more thorough explanation of embodiments of the present invention, however, it will be apparent to one skilled in the art that embodiments of the present invention may be practiced without these specific details, and in other embodiments, well-known structures and devices are shown in block diagram form, rather than in detail, in order to avoid obscuring embodiments of the present invention.
Referring to fig. 1, the present embodiment provides an image matching method, including:
step S101: a plurality of initial images collected by the image collecting device are respectively obtained through a first mode and a second mode, and a plurality of first images and a plurality of second images are obtained.
The image acquisition device is a device such as a camera and can be used in scenarios such as violation snapshot. The first mode and the second mode are the same or different image acquisition modes, for example: acquiring the initial images from the camera through a platform and a host, acquiring the initial images from the camera through a transmission mode such as FTP, acquiring the initial images from the camera through a terminal such as a PC, acquiring the initial images directly from the camera, or acquiring the initial images from the camera through a mail server.
The initial image acquired in the first mode is used as a first image, and the initial image acquired in the second mode is used as a second image.
For example, taking motor vehicle violation snapshots as an example, motor vehicle violation image material is obtained through a camera in a legitimate manner, and violation images are captured according to preset camera violation snapshot parameters to obtain a plurality of initial images. The camera is connected to a platform and a host that can receive violation snapshot pictures, and smooth data transmission is ensured; after the environment test is completed, the platform and host picture database data are initialized. Camera FTP transmission is configured as the first mode, and the camera violation snapshot pictures are transmitted to and stored in folder I. Acquisition through the platform and the host is used as the second mode: after the test starts, the pictures in the platform and host picture storage database are pulled into folder P. The pictures in folder I are used as the first images, and the pictures in folder P are used as the second images.
It should be understood that the first mode and the second mode may be other modes known to those skilled in the art, and are not limited herein.
In one embodiment, before step S102, the method further comprises:
respectively acquiring a first number of first images and a second number of second images;
and determining the quantity stability according to the first quantity and the second quantity, if the quantity stability is smaller than the preset stability, respectively extracting the image feature vectors of the target image and each second image, and if the quantity stability is larger than the preset stability, prompting to investigate the network environment.
Because the first mode and the second mode are merely two transmission modes for the same initial images, ideally the number of first images and the number of second images should be identical and the first and second images should correspond one to one. However, factors such as data loss during transmission may make the images acquired in the two modes inconsistent. At this point, the first number of first images and the second number of second images can be compared preliminarily. If the difference is large, the network environment is poor; the network environment can be checked first, and the first images and second images can then be acquired again for image matching.
The quantity stability characterizes the degree of difference between the first number and the second number, and may be determined in either of the following ways:
quantity stability = |first number − second number|;
quantity stability = |first number − second number| / min(first number, second number).
Correspondingly, different preset stability values can be set for the different ways of determining the quantity stability.
Optionally, if the first mode is FTP transmission and the second mode is host and platform transmission, the first number is denoted w and the second number is denoted e. If w = e, it can be preliminarily judged that image transmission is stable and image matching can be performed; if w > e, it can be preliminarily judged that images were lost in transmission; if w < e, it can be preliminarily judged that some images were captured more than once. If w > e and the difference is large (the quantity stability exceeds the preset stability), the test is stopped and the network environment is checked. Optionally, the preliminary comparison result (one of the above three cases) may be displayed, recorded in a log and printed.
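For illustration only, the preliminary count check can be sketched in Python as below; the folder layout, file extensions and the stability threshold are assumptions made for the sketch rather than part of the original disclosure.

```python
# Illustrative sketch of the preliminary count check performed before step S102.
# Folder layout, file extensions and the stability threshold are assumptions.
import os

IMAGE_EXTS = (".jpg", ".jpeg", ".png", ".bmp")

def count_images(folder: str) -> int:
    """Count picture files in a folder."""
    return sum(1 for f in os.listdir(folder) if f.lower().endswith(IMAGE_EXTS))

def quantity_stability(w: int, e: int) -> float:
    """Second variant from the text: |w - e| / min(w, e)."""
    return abs(w - e) / max(min(w, e), 1)

def preliminary_check(folder_i: str, folder_p: str, preset_stability: float = 0.2) -> None:
    w = count_images(folder_i)   # first images, e.g. FTP snapshots in folder I
    e = count_images(folder_p)   # second images, e.g. platform/host pictures in folder P
    if w == e:
        print("w == e: picture transmission looks stable")
    elif w > e:
        print("w > e: pictures may have been lost in transmission")
    else:
        print("w < e: some pictures may have been captured more than once")
    if quantity_stability(w, e) > preset_stability:
        raise RuntimeError("count difference too large - check the network environment")
```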
Step S102: determining a first image as a target image, respectively extracting image feature vectors of the target image and each second image, determining image feature cosine similarity between the target image and each second image, and determining an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold. Wherein the set of images includes the target image and at least one second image.
The target image may be any first image, and the specific determination manner of the target image may be set by a person skilled in the art as required, for example, the first image with the earliest acquisition time may be used as the target image.
The method of extracting the image feature vector from the target image and each second image and the method of determining the cosine similarity of the image feature between the target image and each second image may be implemented by methods known to those skilled in the art, and are not limited herein.
The preset image feature cosine similarity threshold may be set by a person skilled in the art as needed, for example 0.85. The larger the image feature cosine similarity (the closer to 1), the higher the degree of similarity between the target image and the second image. The image feature cosine similarity Y ∈ [0,1].
A second image whose image feature cosine similarity with the target image is greater than the preset image feature cosine similarity threshold is grouped with the target image into an image set. In other words, when there is at least one second image whose image feature cosine similarity is greater than the preset threshold, these second images and the target image form an image set. If the image feature cosine similarity of every second image is smaller than the preset threshold, the target image has no image set and this step is stopped, i.e., no second image matches the target image.
Optionally, the image feature cosine similarity between the target image and the second images may be determined by traversing the second images one by one. If the image feature cosine similarity between the target image and a second image is greater than the preset threshold, a second image matching the target image has been found; at this point the traversal may either stop or continue until the cosine similarity between the target image and every second image has been determined. All second images whose image feature cosine similarity exceeds the preset threshold form, together with the target image, the image set corresponding to the target image. If the image feature cosine similarity between the target image and a second image is smaller than the preset threshold, that second image is skipped and the next second image is examined. If the image feature cosine similarity of every second image is smaller than the preset threshold, the step is stopped and it can be determined that no second image matches the target image, i.e., the matching relationship between the target image and every second image is "not matched". At this point a new target image may be selected and a new round of image matching continues.
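As an illustrative sketch (not the disclosed reference implementation), step S102 can be expressed as follows; the feature extractor is left abstract because the embodiment does not fix a particular one, and the 0.85 threshold is the example value given above.

```python
# Illustrative sketch of step S102: build the image set for one target image.
# The feature extractor is deliberately left abstract; any model producing a
# fixed-length vector per image can be used.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity of two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

def build_image_set(target_feat: np.ndarray,
                    second_feats: dict,
                    threshold: float = 0.85) -> list:
    """Return the names of second images whose cosine similarity exceeds the threshold."""
    image_set = []
    for name, feat in second_feats.items():
        if cosine_similarity(target_feat, feat) > threshold:
            image_set.append(name)   # candidate match, examined further in step S103
        # otherwise skip this second image and move on to the next one
    return image_set                 # an empty list means no second image matches
```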
Referring to fig. 2, fig. 2 is a schematic diagram of an exemplary process of determining the image set according to the image feature cosine similarity and the preset image feature cosine similarity threshold. As shown in fig. 2, taking direct acquisition of the initial images from an image acquisition device such as a camera as the first mode and acquisition through a platform and a host as the second mode, the image feature cosine similarity (the cosine feature in the figure) is compared with the preset threshold (0.85 as an example). If the similarity is greater than 0.85, step S103 is performed; otherwise the second image is skipped, and the cosine similarity between the next second image and the target image is compared with the preset threshold.
Step S103: respectively acquiring image information of the target image in the image set and of each second image in the image set, and comparing the image information to determine the matching relationship between the target image in the image set and each second image in the image set, wherein the image information comprises at least one of resolution, image feature vector and color space value.
In one embodiment, if the image information includes a resolution, the image information comparison includes obtaining resolutions of the target image and each second image, respectively, determining a resolution similarity between the target image and each second image, and determining a matching relationship between the target image and each second image in the image set according to the resolution similarity and a preset resolution similarity threshold.
If the resolution similarity between a second image and the target image is greater than the preset resolution similarity threshold, the matching relationship between the second image and the target image may be determined as matched, or coarsely matched (with matching accuracy subsequently improved by combining other parameters); otherwise, the matching relationship between the second image and the target image may be determined as not matched.
Optionally, if the resolution of the target image is H px × W px and the resolution of the second image is h px × w px, the resolution similarity is determined by formula (1):
[formula (1), shown as an image in the original publication]
where RESsimi is the resolution similarity.
The resolutions of the target image and the second image may be obtained in any manner known to those skilled in the art. The preset resolution similarity threshold may be set as needed, for example 0.95. If the resolution similarity is greater than or equal to 0.95, the target image and the second image are substantially similar and can be judged matched, or coarsely matched for the next round of matching. If the resolution similarity is less than 0.95, the target image and the second image differ in resolution, and the target image continues to be compared with the other second images. The resolution similarity RESsimi ∈ [0,1]; the closer it is to 1, the more similar the compared pictures.
Referring to fig. 3, fig. 3 is a schematic diagram of a specific image information comparison flow. As shown in fig. 3, the resolution H px × W px of the camera picture and the resolution h px × w px of a second image from the host and platform are read, the resolution similarity (the similarity in the figure) is determined according to formula (1), and whether it exceeds the preset resolution similarity threshold (0.85 in the figure as an example) is judged. If so, the next step is performed, which may be determining that the second image matches the target image, or continuing to determine the color space values, and so on. If the resolution similarity does not exceed the threshold, the second image is skipped and the image information of the next second image is compared.
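For illustration, the resolution screen can be sketched as below. The exact formula (1) appears only as an image in the original publication, so the area-ratio similarity used here is an assumption that merely shares the stated range RESsimi ∈ [0,1]; it is a stand-in, not the patented formula.

```python
# Illustrative sketch of the resolution screen. The patented formula (1) is only
# reproduced as an image, so the area-ratio similarity below is an assumption that
# shares the stated range [0, 1]; 1 means identical pixel area.
def resolution_similarity(target_hw, second_hw):
    h_t, w_t = target_hw   # H, W of the target image in pixels
    h_s, w_s = second_hw   # h, w of the second image in pixels
    area_t, area_s = h_t * w_t, h_s * w_s
    return min(area_t, area_s) / max(area_t, area_s)

def passes_resolution_screen(target_hw, second_hw, threshold: float = 0.95) -> bool:
    """Keep the second image as a coarse match only if the similarity clears the threshold."""
    return resolution_similarity(target_hw, second_hw) >= threshold
```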
In one embodiment, if the image information includes a color space value, the image information comparison includes obtaining RGB color values of the target image and each of the second images, determining the color space value of the target image and each of the second images, and determining a matching relationship between the target image and each of the second images in the image set, where the color space value includes luminance, saturation, and chromaticity.
Optionally, if the image information includes a color space value, the determining a matching relationship between the target image and each second image in the image set includes:
determining a color space error rate between the target image and each of the second images based on the color space values of the target image and each of the second images;
determining a matching relationship between the target image and each second image in the image set according to the color space error rate between the target image and each second image;
the manner in which the color space error rate is determined includes,
[formula (2), shown as an image in the original publication]
where HSIx is the color space error rate, Hp is the chroma of the second image, He is the chroma of the target image, Sp is the saturation of the second image, Se is the saturation of the target image, Ip is the luminance of the second image, and Ie is the luminance of the target image.
If the color space error rate between a second image and the target image is less than the preset color space error rate threshold, the matching relationship between the second image and the target image can be determined as matched, or coarsely matched (with matching accuracy subsequently improved by combining other parameters). Otherwise, the matching relationship between the second image and the target image can be determined as not matched.
The RGB color values of the target image and of each second image may be obtained in any manner known to those skilled in the art, which is not limited here. In the RGB color values, R, G, B ∈ (0, 255).
Optionally, the color space values are determined as follows:
I = aR + bG + cB    (3);
[formula (4) for S, shown as an image in the original publication]
[formula (5) for H, shown as an image in the original publication]
where I is the brightness, a, b and c are preset coefficients, R, G and B are respectively the R, G and B values of the target image or the second image, S is the saturation, Max is the maximum of the R, G and B values, Min is the minimum of the R, G and B values, and H is the chroma, where R, G, B, S, I ∈ [0,1] and H ∈ [0,360].
Optionally, Max and Min are determined as:
Max = max(R, G, B)    (6);
Min = min(R, G, B)    (7).
Optionally, in formula (3), a = 0.229, b = 0.587 and c = 0.11.
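For illustration, the conversion can be sketched as below. Formulas (4) and (5) appear only as images in the original publication, so the saturation and chroma here follow the textbook RGB-to-HSI conversion as an assumed stand-in; only the brightness follows formula (3) with the coefficients given above.

```python
# Illustrative RGB -> HSI conversion. Brightness follows formula (3) with the
# coefficients above; S and H use the textbook HSI conversion as an assumed
# stand-in for formulas (4) and (5), which appear only as images in the original.
import math

def rgb_to_hsi(r: int, g: int, b: int):
    """Return (H, S, I) with H in [0, 360] and S, I in [0, 1], from 8-bit RGB values."""
    r, g, b = r / 255.0, g / 255.0, b / 255.0                      # normalise to [0, 1]
    i = 0.229 * r + 0.587 * g + 0.11 * b                           # formula (3)
    mn = min(r, g, b)                                              # Min, formula (7)
    s = 0.0 if (r + g + b) == 0 else 1.0 - 3.0 * mn / (r + g + b)  # assumed saturation
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    h = theta if b <= g else 360.0 - theta                         # assumed chroma
    return h, s, i
```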
In one embodiment, if the image information includes image feature vectors, the image information is compared by determining a feature vector error rate of the target image and each second image according to the image feature vectors of the target image and each second image, and determining a matching relationship between the target image and each second image in the image set according to the feature vector error rate and a preset feature vector error rate threshold.
If the feature vector error rate between a second image and the target image is less than the preset feature vector error rate threshold, the matching relationship between the second image and the target image can be determined as matched, or coarsely matched (with matching accuracy subsequently improved by combining other parameters). Otherwise, the matching relationship can be determined as not matched.
Optionally, the determining of the error rate of the feature vector comprises,
[formula (8), shown as an image in the original publication]
where Tx is the feature vector error rate, Tp is the image feature vector of the second image, and Te is the image feature vector of the target image.
Referring to fig. 4, fig. 4 is a schematic flow chart of another specific image information comparison. As shown in fig. 4, the image feature vector and the color space values of the camera picture (the target image) are obtained: the feature vector is Te and the HSI values are He, Se and Ie. Likewise, the feature vector of the platform/host picture (the second image) is Tp and its HSI values are Hp, Sp and Ip. The feature vector error rate and the HSI error rate (color space error rate) are determined according to formula (8) and formula (2). If the feature vector error rate Tx is less than the preset feature vector error rate threshold Tn% and the color space error rate HSIx is less than the preset color space error rate threshold HSIn% (Tn% and HSIn% are positive numbers, and the accuracy of the image information comparison is controlled by adjusting them), the next step is performed, i.e., the second image is determined to match the target image; otherwise the second image is skipped and the next second image is compared.
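For illustration, the decision of fig. 4 can be sketched as below. Formulas (8) and (2) appear only as images in the original publication, so the error rates are assumed here to be mean relative deviations, and the thresholds Tn% and HSIn% are placeholder values.

```python
# Illustrative sketch of the fig. 4 decision. Formulas (8) and (2) appear only as
# images in the original, so mean relative deviation is used here as an assumption.
import numpy as np

def feature_vector_error_rate(tp: np.ndarray, te: np.ndarray) -> float:
    """Assumed Tx: mean relative deviation of the second image's vector from the target's."""
    return float(np.mean(np.abs(tp - te) / (np.abs(te) + 1e-12)))

def color_space_error_rate(hsi_p, hsi_e) -> float:
    """Assumed HSIx: mean relative deviation over the (H, S, I) components."""
    return float(np.mean([abs(p - e) / (abs(e) + 1e-12) for p, e in zip(hsi_p, hsi_e)]))

def is_match(tp, te, hsi_p, hsi_e, tn: float = 0.05, hsin: float = 0.05) -> bool:
    """Both error rates must fall below their thresholds (Tn%, HSIn%) for a match."""
    return (feature_vector_error_rate(tp, te) < tn
            and color_space_error_rate(hsi_p, hsi_e) < hsin)
```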
In one embodiment, the image information includes resolution, image feature vectors and color space values, and the comparing of the image information to determine the matching relationship between the target image and each second image in the image set includes:
respectively obtaining the resolutions of the target image and each second image, determining the resolution similarity between the target image and each second image, and determining coarse matching second images from the second images according to the resolution similarity and a preset resolution similarity threshold, wherein the resolution similarity between a coarse matching second image and the target image is greater than the preset resolution similarity threshold;
respectively acquiring the RGB color values of the target image and each coarse matching second image, determining the color space values of the target image and each coarse matching second image, and determining the color space error rate between the target image and each coarse matching second image;
determining the feature vector error rate of the target image and each coarse matching second image according to the image feature vectors of the target image and each coarse matching second image;
and if the color space error rate is less than the preset color space error rate threshold and the feature vector error rate is less than the preset feature vector error rate threshold, determining the coarse matching second image as a matching image of the target image. Alternatively, there may be one or more matching images.
The order of determining the color space error rate and the feature vector error rate is not limited herein.
If, for every coarse matching second image, the color space error rate is not less than the preset color space error rate threshold and/or the feature vector error rate is not less than the preset feature vector error rate threshold, then no coarse matching second image matches the target image, and the matching relationship between the target image and each second image is not matched.
In one embodiment, each first image is sequentially determined as a target image, and a matching relationship between each target image and each second image is determined, the method further comprising at least one of:
if a second image matched with the target image exists, storing the target image and the second image into a first storage space;
if a second image matched with the target image exists, adding the identification information of the second image into the file name of the target image;
if a plurality of second images matched with the target image exist, storing the target image and one second image into a first storage space, and storing other second images matched with the target image into a second storage space;
if the second image matched with the target image does not exist, storing the target image into a third storage space;
and if a second image does not match any of the first images, storing the second image into a fourth storage space.
For example, if there is a second image matching the target image, i.e., the target image has a matching image, the target image and the matching image are the same picture. The matching image (the second image matching the target image) may then be moved to the "correct matching" folder (the first storage space), and the target image may be copied to the "correct matching" folder and renamed as "matching image name" + "original image name" (that is, the identification information, here the image name, of the second image is added to the file name of the target image). For example, if the file name of the target image is a and the file name of the matching image is A, the target image may be renamed "Aa".
If the target image has at least two matching images, one of them (e.g., the first matching image determined) may be stored in the first storage space together with the target image, and the other matching images may be stored in the second storage space. For example, since a correct picture has already been matched, the target image is given a "matched" label and continues to be compared with the second images that have not yet been compared. If the target image also matches one of the remaining second images, that second image is moved to the "repeated matching" folder, and the copy-and-rename operation on the target image is completed (if the file name of that second image is m, the new file name of the target image becomes "Aam"). After the target image has been compared with all second images, it is skipped if it has no "matched" label and deleted if it has one. After all first images have been used as the target image and compared with the second images, any first image without a "matched" label is moved to the "unmatched" folder (the third storage space); any second image that matches none of the first images is moved to the "wrong matching" folder (the fourth storage space).
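For illustration, the storage logic can be sketched as below; the folder names mirror those in the text, and the `matches` mapping stands for whatever the comparison above produced (an assumed interface, not part of the original disclosure).

```python
# Illustrative sketch of the folder-based classification of matching results.
# `matches` maps each first-image file name to the list of second-image file names
# it matched; this interface is an assumption made for the sketch.
import os
import shutil

RESULT_FOLDERS = ("correct_matching", "repeated_matching", "unmatched", "wrong_matching")

def classify(folder_i: str, folder_p: str, matches: dict, out_dir: str) -> None:
    for name in RESULT_FOLDERS:
        os.makedirs(os.path.join(out_dir, name), exist_ok=True)

    for first, seconds in matches.items():
        if not seconds:  # no matching second image -> unmatched
            shutil.move(os.path.join(folder_i, first),
                        os.path.join(out_dir, "unmatched", first))
            continue
        # first match: move it to "correct matching" and copy the target image there,
        # renamed as "matching image name" + "original image name"
        primary = seconds[0]
        shutil.move(os.path.join(folder_p, primary),
                    os.path.join(out_dir, "correct_matching", primary))
        shutil.copy(os.path.join(folder_i, first),
                    os.path.join(out_dir, "correct_matching", primary + first))
        # any further matches go to "repeated matching"
        for extra in seconds[1:]:
            shutil.move(os.path.join(folder_p, extra),
                        os.path.join(out_dir, "repeated_matching", extra))
        os.remove(os.path.join(folder_i, first))  # labelled matched -> deleted from folder I

    # second images that matched no first image at all are wrong matches
    for leftover in os.listdir(folder_p):
        shutil.move(os.path.join(folder_p, leftover),
                    os.path.join(out_dir, "wrong_matching", leftover))
```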
In one embodiment, the method further comprises at least one of:
acquiring the number of images of a target image in a first storage space as a correct matching number;
acquiring the number of images of a second image in a second storage space as the number of repeated matches;
acquiring the number of images of the target image in the third storage space as the number of unmatched images;
the number of images of the second image in the fourth storage space is acquired as the number of error matches.
Optionally, if the ratio of the number of correct matches to the number of first images acquired in the first mode exceeds a first preset ratio, the image acquisition quality is high; otherwise an image acquisition abnormality is prompted. If the ratio of the number of repeated matches to the number of second images acquired in the second mode exceeds a second preset ratio, the image acquisition quality of the second mode is low; otherwise normal image acquisition is prompted. If the ratio of the number of unmatched images to the number of first images acquired in the first mode exceeds a third preset ratio, the image acquisition quality of the second mode is low; otherwise normal image acquisition is prompted. If the ratio of the number of wrong matches to the number of second images acquired in the second mode exceeds a fourth preset ratio, the image acquisition quality of the first mode is low; otherwise normal image acquisition is prompted. The first, second, third and fourth preset ratios may be the same or different, and may be set by those skilled in the art as needed. When the image acquisition quality is low, the corresponding image acquisition mode is abnormal, and maintenance personnel can be further prompted to inspect it.
That is, the image acquisition quality of the first mode and of the second mode is evaluated by counting the numbers of images in the first, second, third and fourth storage spaces. For example, if the number of images in the third storage space exceeds a preset threshold, or the number of images in the fourth storage space exceeds a preset threshold, the image acquisition quality is low, indicating that the initial images acquired in the first mode or the second mode are of low quality and the error rate is high.
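For illustration, the statistics step can be sketched as below; the ratio thresholds and the folder names are assumptions made for the sketch.

```python
# Illustrative sketch of the statistics step; ratio thresholds are placeholders.
import os

def folder_count(path: str) -> int:
    return len(os.listdir(path))

def report(out_dir: str, n_first: int, n_second: int,
           unmatched_ratio: float = 0.05, wrong_ratio: float = 0.05) -> dict:
    counts = {name: folder_count(os.path.join(out_dir, name))
              for name in ("correct_matching", "repeated_matching",
                           "unmatched", "wrong_matching")}
    if n_first and counts["unmatched"] / n_first > unmatched_ratio:
        print("second-mode acquisition quality looks low: many unmatched first images")
    if n_second and counts["wrong_matching"] / n_second > wrong_ratio:
        print("first-mode acquisition quality looks low: many wrongly matched second images")
    return counts
```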
Taking the image-moving and storage scheme of the above embodiment as an example, the numbers of images in the "correct matching", "wrong matching", "repeated matching" and "unmatched" folders are finally counted to obtain the result of this image comparison test. Attention is focused on the wrongly matched, repeatedly matched and unmatched pictures; disputed pictures can be checked manually.
In one embodiment, the first images obtained in the first mode may be stored in a first preset storage space and the second images obtained in the second mode in a second preset storage space. After a first image has been used as the target image and image matching has been completed, the second image matching it is stored, and the first image is given a "matched" label, moved to a designated storage space (e.g., the first storage space described above) and renamed. If no second image matches the first image, the first image remains in the first preset storage space.
The image matching method provided in this embodiment acquires the initial images collected by an image acquisition device through different first and second modes to obtain a plurality of first images and a plurality of second images. One first image is taken as the target image, image feature vectors are extracted from the target image and from each second image, the image feature cosine similarity between the target image and each second image is determined, and an image set is determined according to the cosine similarity and a preset threshold. Image information of the target image and of each second image in the image set is then acquired and compared to determine the matching relationship between the target image and each second image in the image set.
Optionally, in the method, second images matched with a target image, second images matched with no target image, target images matched with no second image, and extra second images matched with an already-matched target image are stored separately, which facilitates statistics on the acquisition quality and consistency of the two modes and problem backtracking.
Optionally, the images are analysed with the progressive evaluation criteria of image feature cosine similarity, resolution similarity, feature vector error rate and color space error rate, analysing the image similarity step by step and further guaranteeing the accuracy of image matching.
The image matching method is exemplarily described below through a specific embodiment. Referring to fig. 5, fig. 5 is a flowchart of a specific image matching method. First, a segment of motor vehicle violation video material is prepared, ensuring that the number of violating vehicles is large and the video quality is clear; the camera's violation snapshot parameters are configured, the camera is connected to a platform or host that can accept violation snapshot pictures, and network connectivity between the camera and the platform/host is ensured. The platform and host picture database data are initialized.
Secondly, the camera picture processing parameters are configured, including but not limited to composition mode, definition and overlay information. Folders I and P are created on the test PC, and camera FTP transmission is configured so that the camera violation snapshot pictures (first images) are stored in folder I; folder P is used to store the pictures (second images) from the platform or host picture storage database.
The test video is played, the camera generates violation pictures (initial images) and transmits them to folder I. After the test video finishes, folder I contains w pictures (first images). The platform or host image database is connected automatically and its pictures are pulled into folder P, giving e pictures (second images). w and e are then compared for a preliminary judgment: if w = e, picture transmission can be preliminarily judged stable; if w > e, it is judged that pictures were lost in transmission; if w < e, it can be preliminarily judged that some pictures were captured more than once. The preliminary comparison result is displayed, recorded in a log and printed. If w > e and the difference is large, the test is stopped and the network environment is checked; in the other two cases the test proceeds to the next step.
The "match correct", "duplicate match", "mismatch" and "mismatch" folders are created. The multidimensional contrast matching adopts the sequence of 'feature vector' - > 'resolution' - > 'HSI color space' for comparison. And sequentially taking the picture In (first image) In the folder I and the picture Pn (second image) In the folder P for traversing comparison. Firstly, comparing the image characteristic values to calculate cosine values (image characteristic cosine similarity) of In and Pn characteristic vectors, wherein the result is (0,1) decimal, the value is close to 1, and the image similarity is higher. And if the cosine value is larger than 0.85, indicating that the In is matched with the same picture, and performing the next matching. And if other conditions exist, skipping the current Pn picture, and comparing the next picture until the comparison with all the pictures In P is finished, if none of the Pn pictures has the same picture as the In picture, skipping the current In picture, and selecting the next picture In I to perform the comparison until the comparison with all the pictures In I is finished.
Then, the similarity of the In and Pn resolution information is calculated; the result lies in (0,1), and the closer it is to 1, the more similar the compared pictures. With the In picture resolution H px × W px and the Pn picture resolution h px × w px, the resolution similarity is calculated by formula (1):
[formula (1), shown as an image in the original publication]
If the result (resolution similarity) is greater than or equal to 0.95, In and Pn are basically similar, and the next matching step is performed. If the result is less than 0.95, In and Pn differ in resolution, and In continues to be matched against the other Pn in P.
Finally, the feature vectors and the HSI color space are used to calculate the picture error rates for comparison. The picture feature vector is extracted using cosine calculation and stored as T, and the picture RGB color values are extracted; the RGB values may be extracted in any manner known to those skilled in the art, with R, G, B ∈ (0, 255).
The maximum and minimum of R, G and B are calculated at the same time:
Max = max(R, G, B);
Min = min(R, G, B);
The values of I, S and H are then obtained using the RGB-to-HSI space conversion formulas, where I denotes brightness, S saturation and H chroma:
I = 0.229R + 0.587G + 0.11B;
[formula for S, shown as an image in the original publication]
[formula for H, shown as an image in the original publication]
where R, G, B, S, I ∈ [0,1] and H ∈ [0,360].
The feature vectors and HSI values (color space values) of In and Pn are calculated respectively: the feature vector of the In picture is Te and its HSI values are He, Se and Ie; the feature vector of the Pn picture is Tp and its HSI values are Hp, Sp and Ip. The feature vector error rate Tx and the HSI error rate HSIx are calculated as follows:
feature vector error rate:
[formula (8), shown as an image in the original publication]
HSI error rate (color space error rate):
[formula (2), shown as an image in the original publication]
if Tx < Tn% and HSIx < HSIn% (Tn is a predetermined eigenvector error rate threshold, HSIn is a predetermined color space error rate threshold, both are positive numbers, and the accuracy is controlled by adjusting the numerical values of the two), indicating that In and Pn are the same picture, moving Pn to the "correct matching" folder, and copying In to the "correct matching" folder, and renaming In the form of "Pn picture name" + "original picture name". And because the correct picture is matched, matching and marking In, and continuously comparing the In with the Pn which is not compared In the P. If the In is successfully compared In the comparison of the rest Pn, the Pn is moved to the 'repeated matching' folder, and the In copying and naming operation is completed. After In is compared with all Pn, if In has no matched label, skipping; and if the In has the matched label, deleting the In.
After all pictures in folder I have been compared with all pictures in folder P, any pictures remaining in folder I are moved to the "unmatched" folder, and any pictures remaining in folder P are moved to the "wrong matching" folder. Finally, the numbers of pictures in the "correct matching", "wrong matching", "repeated matching" and "unmatched" folders are counted to obtain the result of this picture comparison test. Attention is focused on the wrongly matched, repeatedly matched and unmatched pictures; disputed pictures can be checked manually.
As another example, a 30-minute motor vehicle violation video with clear footage and more than 100 violating vehicles is prepared. The camera violation snapshot parameters are configured, and the camera is connected to a platform and a host that can accept violation snapshot pictures, ensuring smooth data transmission. After the environment test is completed, the platform and host picture database data are initialized. Camera FTP transmission is configured, and the camera violation snapshot pictures are transmitted to and stored in folder I. After the test starts, the pictures in the platform and host picture storage database are pulled into folder P.
The test video is played, and the camera continuously captures violation pictures (first images) and transmits them to folder I. After the test video finishes, the platform and host image databases are connected automatically and their pictures (second images) are pulled into folder P. At this point folder I contains a pictures and folder P contains b pictures. A preliminary judgment is made automatically: if a = b, picture transmission is preliminarily shown to be stable; if a > b, pictures were lost; if a < b, some pictures were captured more than once. The preliminary analysis result is displayed in a log, the process pauses, and a tester manually decides whether the test analysis continues. If a > b and the difference is large, the test is stopped and the network environment is checked; otherwise the test continues.
The "correct matching", "repeated matching", "unmatched" and "wrong matching" folders are created automatically. A picture In in I and a picture Pn in P are taken in turn for traversal comparison.
First, the feature vectors are compared via the cosine value of the In and Pn feature vectors: if the cosine value is greater than 0.85, the next matching step is performed; if it is less than or equal to 0.85, In is compared with the remaining pictures in P until all pictures in I have been compared. After the cosine comparison succeeds, the In and Pn resolutions are compared; if the result is greater than or equal to 0.95, the next matching step is performed, otherwise In continues to be matched against other pictures in P. Finally, the HSI error rate is compared: if Tx < Tn% and HSIx < HSIn%, Pn is moved to the "correct matching" folder, In is copied to the "correct matching" folder and renamed as "Pn file name" + "original file name", In is given a "matched" label, and In continues to be compared with the other pictures in P; if another Pn also compares successfully, that Pn is moved to the "repeated matching" folder.
After all the pictures in folder I have been compared with all the pictures in folder P, any pictures still remaining in I are moved to the "unmatched" folder; any pictures still remaining in P are moved to the "mismatch" folder, and the copy-and-rename operation for In is completed. After a given In has been compared with all Pn, that In is skipped if it carries no matched mark, and is deleted if it carries the matched mark.
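A possible cleanup step matching this description, again only as an illustrative sketch with an assumed folder layout, is:

```python
import shutil
from pathlib import Path

def finalize(folder_i: Path, folder_p: Path, out: Path, matched_names: set) -> None:
    """Post-traversal cleanup following the handling described above.

    `matched_names` holds the file names of In pictures marked as matched; the
    output sub-folders are assumed to exist already (see the earlier sketch).
    """
    for pic in list(folder_i.iterdir()):
        if pic.name in matched_names:
            pic.unlink()                                     # In was already copied and renamed
        else:
            shutil.move(str(pic), str(out / "unmatched"))    # leftover In: no match was found
    for pic in list(folder_p.iterdir()):
        shutil.move(str(pic), str(out / "mismatch"))         # leftover Pn: matched no In
```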
With this image matching method, the test pictures to be compared can be processed automatically, which greatly improves the efficiency and speed of comparing large numbers of pictures relative to manual testing and thus raises the overall testing efficiency. The pictures are analyzed with a multi-level evaluation standard of cosine similarity, resolution and HSI color distribution, so that picture similarity is assessed progressively and the accuracy of the analysis is guaranteed; the pictures are also stored by category, which facilitates test statistics and problem backtracking.
Referring to fig. 6, the present embodiment further provides an image matching system 600, including:
the acquiring module 601 is configured to acquire a plurality of initial images acquired by an image acquisition device through a first mode and a second mode, so as to obtain a plurality of first images and a plurality of second images;
the image set determining module 602 is configured to determine a first image as a target image, perform image feature vector extraction on the target image and each second image, determine image feature cosine similarity between the target image and each second image, and determine an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold, where the image set includes the target image and at least one second image;
the matching module 603 is configured to obtain image information of each second image in the image set and the target image in the image set, and compare the image information to determine a matching relationship between the target image in the image set and each second image in the image set, where the image information includes at least one of resolution, image feature vector, and color space value.
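Purely as an illustration of how the three modules could be organized in code (the class, method and device-interface names are assumptions, not part of the disclosure), a minimal Python sketch follows:

```python
class ImageMatchingSystem:
    """Minimal sketch of the three modules; names are illustrative only."""

    def __init__(self, cos_threshold: float = 0.85):
        self.cos_threshold = cos_threshold  # preset image feature cosine similarity threshold

    def acquire(self, device):
        # Acquisition module: initial images captured in the first and second modes.
        return device.capture_mode_one(), device.capture_mode_two()

    def build_image_set(self, target, second_images, similarity):
        # Image-set determination module: keep every second image whose cosine
        # similarity with the target exceeds the preset threshold.
        return [target] + [img for img in second_images
                           if similarity(target, img) > self.cos_threshold]

    def match(self, image_set, compare_info):
        # Matching module: compare image information (resolution, feature vector,
        # color space value) to decide which second images match the target.
        target, *seconds = image_set
        return [img for img in seconds if compare_info(target, img)]
```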
In this embodiment, the system essentially comprises a plurality of modules configured to execute the method of the above embodiments; for the specific functions and technical effects, reference may be made to the method embodiments above, which are not repeated here.
Referring to fig. 7, an embodiment of the present invention further provides an electronic device 1000, which includes a processor 1001, a memory 1002, and a communication bus 1003;
the communication bus 1003 is used to connect the processor 1001 and the memory 1002;
the processor 1001 is configured to execute the computer program stored in the memory 1002 to implement the method according to one or more of the embodiments above.
Embodiments of the present invention also provide a computer-readable storage medium, having a computer program stored thereon,
the computer program is used to cause a computer to perform the method as in any one of the embodiments above.
Embodiments of the present application also provide a non-transitory readable storage medium, where one or more modules (programs) are stored in the storage medium, and when the one or more modules are applied to a device, the device may be caused to execute the instructions included in an embodiment of the present application.
It should be noted that the computer readable medium of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing embodiments are merely illustrative of the principles and utilities of the present invention and are not intended to limit it. Any person skilled in the art may modify or change the above embodiments without departing from the spirit and scope of the present invention. Accordingly, all equivalent modifications or changes made by those of ordinary skill in the art without departing from the spirit and technical ideas disclosed herein shall be covered by the claims of the present invention.

Claims (11)

1. An image matching method, characterized in that the method comprises:
acquiring a plurality of initial images acquired by image acquisition equipment through a first mode and a second mode respectively to obtain a plurality of first images and a plurality of second images;
determining a first image as a target image, respectively extracting image feature vectors of the target image and each second image, determining image feature cosine similarity between the target image and each second image, and determining an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold, wherein the image set comprises the target image and at least one second image;
and respectively acquiring image information of each second image in the image set and the target image in the image set, and comparing the image information to determine the matching relationship between the target image in the image set and each second image in the image set, wherein the image information comprises at least one of resolution, image feature vector and color space value.
2. The method of claim 1, wherein prior to performing image feature vector extraction on the target image and each second image, respectively, the method further comprises:
respectively acquiring a first number of first images and a second number of second images;
and determining the quantity stability according to the first quantity and the second quantity, respectively extracting image feature vectors of the target image and each second image if the quantity stability is less than the preset stability, and prompting to investigate the network environment if the quantity stability is greater than the preset stability.
3. The method of claim 1, wherein the method further comprises at least one of:
if the image information comprises the resolution, the image information comparison mode comprises the steps of respectively obtaining the resolutions of the target image and each second image, determining the resolution similarity between the target image and each second image, and determining the matching relation between the target image and each second image in the image set according to the resolution similarity and a preset resolution similarity threshold;
if the image information comprises color space values, the image information comparison mode comprises the steps of respectively obtaining RGB color values of the target image and each second image, determining the color space values of the target image and each second image, and determining the matching relation between the target image and each second image in the image set, wherein the color space values comprise brightness, saturation and chroma;
if the image information comprises image feature vectors, the image information comparison mode comprises the steps of determining the feature vector error rate of the target image and each second image according to the image feature vectors of the target image and each second image, and determining the matching relation between the target image and each second image in the image set according to the feature vector error rate and a preset feature vector error rate threshold value.
4. The method of claim 3,
if the resolution of the target image is hpx × wpx and the resolution of the second image is hpx × wppx, the determining of the resolution similarity includes,
[Formula image FDA0003541417780000021: definition of the resolution similarity RESsimi]
wherein RESsimi is the resolution similarity;
the manner in which the error rate of the feature vector is determined includes,
[Formula image FDA0003541417780000022: definition of the feature vector error rate Tx in terms of Tp and Te]
wherein Tx is a feature vector error rate, Tp is an image feature vector of the second image, and Te is an image feature vector of the target image;
the manner in which the color space value is determined includes,
I=aR+bG+cB,
[Formula image FDA0003541417780000023: definition of the saturation S]
[Formula image FDA0003541417780000024: definition of the chroma H]
wherein I is the brightness, a, b and c are preset coefficients, R is the R value of the target image or the second image, G is the G value of the target image or the second image, B is the B value of the target image or the second image, S is the saturation, Max is the maximum of the R, G and B values, Min is the minimum of the R, G and B values, and H is the chroma, wherein R, G, B, S, I ∈ [0,1] and H ∈ [0,360].
5. The method of claim 3, wherein if the image information includes color space values, determining a matching relationship between the target image and each of the second images in the set of images comprises:
determining a color space error rate between the target image and each of the second images based on the color space values of the target image and each of the second images;
determining a matching relationship between the target image and each second image in the image set according to the color space error rate between the target image and each second image;
the manner in which the color space error rate is determined includes,
[Formula image FDA0003541417780000025: definition of the color space error rate HSIx]
where HSIx is the color space error rate, Hp is the chroma of the second image, He is the chroma of the target image, Sp is the saturation of the second image, Se is the saturation of the target image, Ip is the luminance of the second image, and Ie is the luminance of the target image.
6. The method of claim 1, wherein the image information comprises a resolution, an image feature vector, and a color space value, and the comparing the image information to determine a matching relationship between the target image and each second image in the image set comprises:
respectively obtaining the resolutions of a target image and each second image, determining the resolution similarity between the target image and each second image, and determining a rough matching second image from each second image according to the resolution similarity and a preset resolution similarity threshold, wherein the resolution similarity between the rough matching second image and the target image is greater than the preset resolution similarity threshold;
respectively acquiring RGB color values of the target image and each rough matching second image, determining color space values of the target image and each second image, and determining a color space error rate between the target image and each second image;
determining the error rate of the feature vectors of the target image and each rough matching second image according to the image feature vectors of the target image and each rough matching second image;
and if the color space error rate is less than a preset color space error rate threshold and the feature vector error rate is less than a preset feature vector error rate threshold, determining the rough matching second image as a matching image of the target image.
7. The method of any one of claims 1-6, wherein each first image is sequentially determined as a target image and a matching relationship between each target image and each second image is determined, the method further comprising at least one of:
if a second image matched with the target image exists, storing the target image and the second image into a first storage space;
if a second image matched with the target image exists, adding the identification information of the second image into the file name of the target image;
if a plurality of second images matched with the target image exist, storing the target image and one second image into a first storage space, and storing other second images matched with the target image into a second storage space;
if the second image matched with the target image does not exist, storing the target image into a third storage space;
and if a second image does not match any of the first images, storing the second image into a fourth storage space.
8. The method of claim 7, further comprising at least one of:
acquiring the number of images of a target image in a first storage space as the correct matching number;
acquiring the number of images of a second image in a second storage space as the number of repeated matches;
acquiring the number of images of the target image in the third storage space as the number of unmatched images;
the number of images of the second image in the fourth storage space is acquired as the number of error matches.
9. An image matching system, the system comprising:
the acquisition module is used for respectively acquiring a plurality of initial images acquired by the image acquisition equipment in a first mode and a second mode to obtain a plurality of first images and a plurality of second images;
the image set determining module is used for determining a first image as a target image, respectively extracting image feature vectors of the target image and each second image, determining image feature cosine similarity between the target image and each second image, and determining an image set according to the image feature cosine similarity and a preset image feature cosine similarity threshold, wherein the image set comprises the target image and at least one second image;
and the matching module is used for respectively acquiring image information of each second image in the image set and of the target image in the image set, and comparing the image information to determine the matching relationship between the target image in the image set and each second image in the image set, wherein the image information comprises at least one of resolution, image feature vector and color space value.
10. An electronic device comprising a processor, a memory, and a communication bus;
the communication bus is used for connecting the processor and the memory;
the processor is configured to execute a computer program stored in the memory to implement the method of any one of claims 1-8.
11. A computer-readable storage medium, having stored thereon a computer program,
the computer program is for causing a computer to perform the method of any one of claims 1-8.
CN202210240581.0A 2022-03-10 2022-03-10 Image matching method, system, device and medium Pending CN114581853A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210240581.0A CN114581853A (en) 2022-03-10 2022-03-10 Image matching method, system, device and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210240581.0A CN114581853A (en) 2022-03-10 2022-03-10 Image matching method, system, device and medium

Publications (1)

Publication Number Publication Date
CN114581853A true CN114581853A (en) 2022-06-03

Family

ID=81775607

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210240581.0A Pending CN114581853A (en) 2022-03-10 2022-03-10 Image matching method, system, device and medium

Country Status (1)

Country Link
CN (1) CN114581853A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116386091A (en) * 2022-11-18 2023-07-04 荣耀终端有限公司 Fingerprint identification method and device
CN116386091B (en) * 2022-11-18 2024-04-02 荣耀终端有限公司 Fingerprint identification method and device

Similar Documents

Publication Publication Date Title
US7676085B2 (en) Method and apparatus for representing a group of images
US20190251742A1 (en) Method for the automatic material classification and texture simulation for 3d models
CN109753928B (en) Method and device for identifying illegal buildings
KR101854554B1 (en) Method, device and storage medium for calculating building height
US9558401B2 (en) Scanbox
CN109508404B (en) Maintenance teaching video management method and device, terminal equipment and storage medium
CN110245573A (en) A kind of register method, apparatus and terminal device based on recognition of face
CN114581853A (en) Image matching method, system, device and medium
EP2423850A2 (en) Object recognition system and method
WO2024001309A1 (en) Method and apparatus for generating and producing template for infrared thermal image analysis report
CN114519689A (en) Image tampering detection method, device, equipment and computer readable storage medium
AU2018101640A4 (en) A system and method for image processing
Sankar et al. Feature based classification of computer graphics and real images
CN112836759B (en) Machine-selected picture evaluation method and device, storage medium and electronic equipment
CN114460862A (en) Camera equipment ring test simulation method applied to ADAS
CN112991388A (en) Line segment feature tracking method based on optical flow tracking prediction and convex geometric distance
CN112399096B (en) Video processing method, device and computer readable storage medium
CN116912429B (en) Three-dimensional reconstruction method and system for high-definition video IG (inter-group) material
CN113345037A (en) Automatic testing method, system, medium and terminal for motor vehicle algorithm indexes
CN117014596A (en) Evaluation method, evaluation device, apparatus, storage medium, and program product
WO2024018449A1 (en) A system and method for generating a colored image of a scene
CN114691441A (en) Algorithm testing method and device
CN113112487A (en) Method, device, equipment and storage medium for identifying thermal defects
CN115330711A (en) Image video content management method and system based on data processing
CN116245962A (en) Data extraction system and method for wireless transmission to blockchain server

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination