CN114782716A - Image matching method and device

Image matching method and device

Info

Publication number
CN114782716A
CN114782716A
Authority
CN
China
Prior art keywords
image
matched
images
importance
areas
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210374133.XA
Other languages
Chinese (zh)
Inventor
郝婧雯
Current Assignee
Industrial and Commercial Bank of China Ltd ICBC
Original Assignee
Industrial and Commercial Bank of China Ltd ICBC
Priority date
Filing date
Publication date
Application filed by Industrial and Commercial Bank of China Ltd ICBC filed Critical Industrial and Commercial Bank of China Ltd ICBC
Priority to CN202210374133.XA
Publication of CN114782716A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The embodiments of this application provide an image matching method and device in the field of computer vision. The method comprises the following steps: determining image feature points of two images to be matched, clustering the feature-point coordinates, and dividing each image to be matched into a plurality of image regions according to the clustering result; ranking the image regions of each image to be matched by importance, comparing the features of region pairs of the two images in order of importance, and determining the similarity of each pair of regions; and determining whether the two images to be matched match according to the region similarities. The method and device can effectively improve image retrieval precision and matching efficiency.

Description

Image matching method and device
Technical Field
This application relates to the field of computer vision, is also applicable in the financial field, and in particular relates to an image matching method and device.
Background
With the popularization of smartphones and digital products, users have begun taking pictures and sharing them, often with geographic tags, on social networking sites, causing the volume of image data on the Internet to grow explosively. At the same time, the demand for image retrieval keeps increasing.
Image retrieval means inputting key information about an image in order to retrieve related images from the Internet. The key information may be a label attached to an image or the image itself. For a user, the most convenient option is to upload an image directly and query for images related to it: for example, a user who wants to learn about a building can simply photograph it, and the desired information is then obtained by extracting image features and comparing them against a large Internet database. However, as users' requirements on the accuracy and speed of image retrieval keep rising, some conventional image retrieval methods can no longer meet them.
Disclosure of Invention
To address the problems in the prior art, this application provides an image matching method and device that can effectively improve image retrieval precision and matching efficiency.
In order to solve at least one of the above problems, the present application provides the following technical solutions:
in a first aspect, the present application provides an image matching method, including:
determining image feature points of two images to be matched, carrying out coordinate clustering according to the image feature points, and dividing the images to be matched into a plurality of image areas according to a coordinate clustering result;
the method comprises the steps of sequencing the importance of a plurality of image areas of each image to be matched, sequentially comparing the characteristics of two image areas of the two images to be matched according to the importance sequence, and determining the similarity of the two image areas;
and determining whether the two images to be matched are matched or not according to the similarity of the two image areas.
Further, the determining the image feature points of the two images to be matched includes:
searching extreme points of the image to be matched on different scale spaces according to a preset scale invariant feature transformation algorithm to obtain image feature points of the image to be matched.
Further, the performing coordinate clustering according to the image feature points, and dividing the image to be matched into a plurality of image areas according to a coordinate clustering result includes:
carrying out coordinate clustering on the coordinates of the image feature points according to a preset mean shift clustering algorithm;
and determining a plurality of image areas according to the coordinate clustering result.
Further, the ranking of importance of the plurality of image areas of each image to be matched includes:
and sorting the importance of each image area according to the number of the image feature points contained in each divided image area.
Further, the ranking of importance of the plurality of image regions of each image to be matched further includes:
and sequencing the importance of each image area according to the position distance between each divided image area and the image center of the image to be matched.
Further, the sequentially comparing the features of the two image regions of the two images to be matched according to the order of the importance degree to determine the similarity of the two image regions includes:
sequentially determining whether the two image areas of the two images to be matched have the same visual attribute according to the importance sequence;
and determining the similarity of the two image areas according to the number of the same visual attributes of the two image areas and the space descriptors of the two image areas.
In a second aspect, the present application provides an image matching apparatus, comprising:
the image area dividing module is used for determining image feature points of two images to be matched, carrying out coordinate clustering according to the image feature points and dividing the images to be matched into a plurality of image areas according to a coordinate clustering result;
the salient region determining module is used for sequencing the importance of a plurality of image regions of each image to be matched, sequentially comparing the characteristics of two image regions of the two images to be matched according to the importance sequence and determining the similarity of the two image regions;
and the image matching module is used for determining whether the two images to be matched are matched according to the similarity of the two image areas.
Further, the image region division module includes:
and the characteristic point determining unit is used for searching extreme points of the image to be matched on different scale spaces according to a preset scale invariant characteristic conversion algorithm to obtain image characteristic points of the image to be matched.
Further, the image area dividing module further includes:
the coordinate clustering unit is used for carrying out coordinate clustering on the coordinates of the image feature points according to a preset mean shift clustering algorithm;
and the image dividing unit is used for determining a plurality of image areas according to the coordinate clustering result.
Further, the salient region determination module includes:
and the quantity sequencing unit is used for sequencing the importance of each image area according to the quantity of the image characteristic points contained in each divided image area.
Further, the salient region determination module further comprises:
and the distance sorting unit is used for sorting the importance of each image area according to the position distance between each divided image area and the image center of the image to be matched.
Further, the salient region determining module further comprises:
the same attribute judging unit is used for sequentially determining whether the two image areas of the two images to be matched have the same visual attribute according to the importance degree sequence;
and the similarity calculation unit is used for determining the similarity of the two image areas according to the quantity of the same visual attributes of the two image areas and the space descriptors of the two image areas.
In a third aspect, the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the image matching method when executing the program.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image matching method described above.
In a fifth aspect, the present application provides a computer program product comprising computer programs/instructions which, when executed by a processor, implement the steps of the image matching method.
According to the above technical solution, the image matching method and device divide the whole image into a salient part and a non-salient part by region division and compare the image features of the salient regions, which greatly reduces the number of image features and improves image retrieval precision and matching efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Fig. 1 is a flowchart illustrating an image matching method according to an embodiment of the present application;
FIG. 2 is a second flowchart illustrating an image matching method according to an embodiment of the present application;
FIG. 3 is a third flowchart illustrating an image matching method according to an embodiment of the present application;
fig. 4 is one of the configuration diagrams of an image matching apparatus in the embodiment of the present application;
FIG. 5 is a second block diagram of an image matching apparatus according to an embodiment of the present application;
FIG. 6 is a third block diagram of an image matching apparatus according to an embodiment of the present application;
FIG. 7 is a fourth block diagram of an image matching apparatus according to an embodiment of the present application;
FIG. 8 is a fifth configuration diagram of an image matching apparatus according to an embodiment of the present application;
FIG. 9 is a sixth configuration diagram of an image matching apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of an electronic device in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In the technical solution of this application, the acquisition, storage, use and processing of data comply with the relevant provisions of national laws and regulations.
In view of the low precision and speed of image retrieval in the prior art, this application provides an image matching method and device that divide the whole image into a "salient" part and a "non-salient" part by region division and compare the image features of the salient regions, greatly reducing the number of image features and improving image retrieval precision and matching efficiency.
In order to effectively improve the image retrieval accuracy and the matching efficiency, the present application provides an embodiment of an image matching method, and with reference to fig. 1, the image matching method specifically includes the following contents:
step S101: determining image feature points of two images to be matched, carrying out coordinate clustering according to the image feature points, and dividing the images to be matched into a plurality of image areas according to a coordinate clustering result.
It will be appreciated that, since image retrieval/matching is a similarity-discrimination problem rather than a recognition problem, accurate segmentation of the image is not required in this application domain, in contrast to the image segmentation performed for object recognition.
Optionally, in order to determine representative image feature points from the images to be matched, a Scale-Invariant Feature Transform (SIFT) algorithm may be used to determine extreme points of the two images to be matched in different scale spaces, and these extreme points serve as local image features, i.e., image feature points.
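As an illustrative sketch only (the patent publishes no code, and full SIFT involves further steps such as keypoint refinement and orientation assignment), scale-space extrema of a difference-of-Gaussians pyramid can be located with NumPy and SciPy roughly as follows; the blur levels and response threshold are arbitrary demonstration values:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def dog_extrema(image, sigmas=(1.0, 1.6, 2.6, 4.0), threshold=0.03):
    """Find difference-of-Gaussians extrema across scales (simplified SIFT step)."""
    image = image.astype(np.float64)
    blurred = [gaussian_filter(image, s) for s in sigmas]
    # adjacent blur levels subtracted -> one DoG layer per pair, shape (S-1, H, W)
    dog = np.stack([b2 - b1 for b1, b2 in zip(blurred, blurred[1:])])
    # a pixel is a candidate keypoint if it is an extremum of its 3x3x3
    # scale-space neighbourhood and its response magnitude exceeds the threshold
    maxed = maximum_filter(dog, size=3)
    mined = minimum_filter(dog, size=3)
    is_ext = ((dog == maxed) | (dog == mined)) & (np.abs(dog) > threshold)
    scale_idx, ys, xs = np.nonzero(is_ext)
    return np.column_stack([xs, ys, scale_idx])  # one (x, y, scale) row per keypoint

# tiny synthetic image with one bright blob
img = np.zeros((32, 32))
img[14:18, 14:18] = 1.0
kps = dog_extrema(img)
```

A production system would normally use a complete SIFT implementation (for example OpenCV's) rather than this reduced detector.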
Optionally, the regions obtained by this division are not required to match the contour boundaries of semantic entities in the original image exactly. An image-content representation based on salient regions therefore avoids the difficulty of accurate image segmentation: image regions are instead mined from the distribution of the image feature points, so a mean-shift clustering method can be applied to the coordinates of the image feature points to obtain a plurality of image regions.
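The mean-shift step can be sketched in plain NumPy as follows. This is a minimal flat-kernel variant for 2-D keypoint coordinates; the bandwidth and merge tolerance are chosen arbitrarily for illustration and are not the patent's parameterization:

```python
import numpy as np

def mean_shift(points, bandwidth=2.0, iters=30, merge_tol=0.5):
    """Cluster 2-D keypoint coordinates by repeatedly shifting each point to
    the mean of the original points within `bandwidth` of its current position."""
    points = np.asarray(points, dtype=np.float64)
    shifted = points.copy()
    for _ in range(iters):
        for i, p in enumerate(shifted):
            d = np.linalg.norm(points - p, axis=1)
            shifted[i] = points[d <= bandwidth].mean(axis=0)
    # merge modes that converged to (almost) the same location
    labels = -np.ones(len(points), dtype=int)
    modes = []
    for i, p in enumerate(shifted):
        for k, m in enumerate(modes):
            if np.linalg.norm(p - m) < merge_tol:
                labels[i] = k
                break
        else:
            modes.append(p)
            labels[i] = len(modes) - 1
    return labels, np.array(modes)

# two well-separated groups of keypoint coordinates -> two image regions
pts = np.array([[0, 0], [1, 0], [0, 1], [10, 10], [11, 10], [10, 11]], float)
labels, modes = mean_shift(pts, bandwidth=3.0)
```

Each resulting label then identifies one image region; libraries such as scikit-learn provide an optimized `MeanShift` estimator for real use.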
Step S102: and sequencing the importance of a plurality of image areas of each image to be matched, and sequentially comparing the characteristics of two image areas of the two images to be matched according to the importance sequence to determine the similarity of the two image areas.
Optionally, some regions are important regions expressing the subject of the image, while others are secondary regions that supplement and enrich the image content; their importance differs. The present application therefore distinguishes these differences in importance when describing image content, i.e., ranks the plurality of image regions of each image to be matched by importance.
Optionally, after ranking the importance of the image regions of the two images to be matched, feature comparison may be performed on pairs of regions in order of importance; for example, the most important region of one image to be matched is compared with the most important region of the other, and the similarity of the two regions is determined from the result of the feature comparison.
Optionally, the similarity between the two image regions may be calculated based on the same visual property of the two image regions and the spatial descriptors of the two image regions.
Therefore, the visual attributes of the image feature points are compiled into an index table. If two image regions of the two images to be matched (for example, a region of the query image and a region of an image in the image set) contain the same visual attribute, the two regions form a matched RP (region pair). The similarity of the two image regions can then be calculated by the following formula:
[Formula image in the original publication: matching score MS computed over the m matched region pairs and their spatial descriptors SD and SR]
where m is the number of RPs matched in an image region, SD and SR are the spatial descriptors of the four visual attributes in a pair of RPs, and a higher MS score indicates that the two image regions are more similar.
Step S103: and determining whether the two images to be matched are matched or not according to the similarity of the two image areas.
Optionally, from the region similarities computed above, the similarity of the two images to be matched can be determined, for example by a similarity expression:
[Formula image in the original publication: overall similarity S as a weighted combination of the region similarities MS]
where S is the similarity of the two images to be matched, MS is the similarity of a pair of matched image regions, and weight is the weight assigned to that pair; two images to be matched whose similarity exceeds the threshold are regarded as successfully matched.
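Again, the expression is only available as an image in the original; the sketch below shows one plausible reading in which S is a normalized weighted combination of the region-pair similarities MS, with the particular weights and threshold being illustrative assumptions rather than the patent's values:

```python
def image_similarity(region_scores, weights):
    """Illustrative S: weighted average of the per-pair MS scores.

    region_scores: MS values for the compared region pairs, in importance order.
    weights: the weight assigned to each pair (here, its importance rank value).
    """
    total_w = sum(weights) or 1.0
    return sum(w * ms for w, ms in zip(weights, region_scores)) / total_w

# regions are compared in importance order; more important pairs weigh more
scores = [0.9, 0.6, 0.2]
weights = [3, 2, 1]
S = image_similarity(scores, weights)
matched = S >= 0.5  # the threshold is an arbitrary example value
```

This keeps S on the same scale as MS, so a single fixed threshold can decide the match.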
As can be seen from the above description, the image matching method provided in this embodiment divides the whole image into a "salient" part and a "non-salient" part by region division and compares the image features of the salient regions, thereby greatly reducing the number of image features and helping improve image retrieval precision and matching efficiency.
In order to accurately extract the image feature points of the image to be matched, in an embodiment of the image matching method of the present application, the step S101 may further specifically include the following steps:
searching extreme points of the image to be matched in different scale spaces according to a preset scale-invariant feature transform algorithm to obtain the image feature points of the image to be matched.
Optionally, in order to determine representative image feature points from the images to be matched, the extreme points of the two images to be matched on different scale spaces may be determined by using a Scale Invariant Feature Transform (SIFT) algorithm, and the extreme points are used as local image features, that is, image feature points.
In order to accurately divide the image to be matched into a plurality of image areas, in an embodiment of the image matching method of the present application, referring to fig. 2, the step S101 may further specifically include the following steps:
step S201: and carrying out coordinate clustering on the coordinates of the image feature points according to a preset mean shift clustering algorithm.
Step S202: and determining a plurality of image areas according to the coordinate clustering result.
Optionally, the regions obtained by this division are not required to match the contour boundaries of semantic entities in the original image exactly. An image-content representation based on salient regions therefore avoids the difficulty of accurate image segmentation: image regions are instead mined from the distribution of the image feature points, so a mean-shift clustering method can be applied to the coordinates of the image feature points to obtain a plurality of image regions.
In order to accurately determine a representative area in the plurality of image areas, in an embodiment of the image matching method of the present application, the step S102 may further include the following steps:
and sorting the importance of each image area according to the number of the image characteristic points contained in each divided image area.
Optionally, the number of image feature points included in each image region is an important factor affecting the region weight, so that the importance ranking of each image region can be performed according to the number of image feature points included in each divided image region.
In order to accurately determine a representative area in the plurality of image areas, in an embodiment of the image matching method of the present application, the step S102 may further specifically include the following steps:
and sequencing the importance of each image area according to the position distance between each divided image area and the image center of the image to be matched.
Optionally, the closer an image region lies to the centre of the image to be matched, the greater its importance factor: for most images, especially those containing a salient object, the subject is generally located at the image centre, and human eyes are more sensitive to the central part of an image. The image regions can therefore be ranked by importance according to the distance between each divided image region and the centre of the image to be matched.
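Combining the two ranking criteria described in these embodiments (the number of feature points in a region, and the region's distance to the image centre), an importance ranking might be sketched as follows. The particular combination `count / (1 + distance)` is an illustrative choice, since the text states the two criteria but not how they are combined:

```python
import numpy as np

def rank_regions(regions, image_shape):
    """Order image regions by importance: more keypoints and a position
    closer to the image centre both raise a region's rank."""
    h, w = image_shape[:2]
    centre = np.array([w / 2.0, h / 2.0])

    def importance(region):  # region: iterable of (x, y) keypoint coordinates
        pts = np.asarray(region, dtype=float)
        dist = np.linalg.norm(pts.mean(axis=0) - centre)
        return len(pts) / (1.0 + dist)  # assumed combination of the two criteria

    return sorted(regions, key=importance, reverse=True)

regions = [
    [(50, 50), (52, 48), (49, 51), (51, 50)],  # many points, near the centre
    [(5, 5), (6, 4)],                          # few points, far corner
]
ranked = rank_regions(regions, image_shape=(100, 100))
```

The most important region then heads the list, so region pairs can be compared in this order as step S102 describes.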
In order to accurately calculate the similarity between the two image regions, in an embodiment of the image matching method of the present application, referring to fig. 3, the step S102 may further include the following steps:
step S301: and sequentially determining whether the two image areas of the two images to be matched have the same visual attribute according to the importance sequence.
Step S302: and determining the similarity of the two image areas according to the number of the same visual attributes of the two image areas and the space descriptors of the two image areas.
Optionally, after ranking the importance of the image regions of the two images to be matched, feature comparison may be performed on pairs of regions in order of importance; for example, the most important region of one image to be matched is compared with the most important region of the other, and the similarity of the two regions is determined from the result of the feature comparison.
Optionally, the similarity between the two image regions may be calculated based on the same visual property of the two image regions and the spatial descriptors of the two image regions.
Optionally, if two image regions of the two images to be matched (for example, a region of the query image and a region of an image in the image set) contain the same visual attribute, the two regions form a matched RP (region pair). The similarity of the two image regions can be calculated by the following formula:
[Formula image in the original publication: matching score MS computed over the m matched region pairs and their spatial descriptors SD and SR]
where m is the number of RPs matched in an image region, SD and SR are the spatial descriptors of the four visual attributes in a pair of RPs, and a higher MS score indicates that the two image regions are more similar.
In order to effectively improve the image retrieval accuracy and the matching efficiency, the present application provides an embodiment of an image matching apparatus for implementing all or part of the contents of the image matching method, and with reference to fig. 4, the image matching apparatus specifically includes the following contents:
the image area dividing module 10 is configured to determine image feature points of two images to be matched, perform coordinate clustering according to the image feature points, and divide the images to be matched into a plurality of image areas according to a coordinate clustering result.
The salient region determining module 20 is configured to perform importance ranking on a plurality of image regions of each image to be matched, sequentially perform feature comparison on two image regions of the two images to be matched according to the importance ranking, and determine similarity between the two image regions.
And the image matching module 30 is configured to determine whether the two images to be matched are matched according to the similarity of the two image regions.
As can be seen from the foregoing description, the image matching apparatus provided in this embodiment divides the whole image into a "salient" part and a "non-salient" part by region division and compares the image features of the salient regions, thereby greatly reducing the number of image features and helping improve image retrieval precision and matching efficiency.
In order to accurately extract the image feature points of the image to be matched, in an embodiment of the image matching apparatus of the present application, referring to fig. 5, the image region dividing module 10 includes:
and the feature point determining unit 11 is configured to search extreme points of the image to be matched in different scale spaces according to a preset scale invariant feature transformation algorithm, so as to obtain image feature points of the image to be matched.
In order to accurately divide the image to be matched into a plurality of image areas, in an embodiment of the image matching apparatus of the present application, referring to fig. 6, the image area dividing module 10 further includes:
and the coordinate clustering unit 12 is used for carrying out coordinate clustering on the coordinates of the image feature points according to a preset mean shift clustering algorithm.
And the image dividing unit 13 is used for determining a plurality of image areas according to the coordinate clustering result.
In order to accurately determine a representative region in a plurality of image regions, in an embodiment of the image matching apparatus of the present application, referring to fig. 7, the significant region determining module 20 includes:
the number sorting unit 21 is configured to sort the importance of each image region according to the number of image feature points included in each divided image region.
In order to accurately determine a representative region in the plurality of image regions, in an embodiment of the image matching apparatus of the present application, referring to fig. 8, the significant region determining module 20 further includes:
and the distance sorting unit 22 is configured to sort the importance of each image area according to the position distance between each divided image area and the image center of the image to be matched.
In order to accurately calculate the similarity between the two image regions, in an embodiment of the image matching apparatus of the present application, referring to fig. 9, the salient region determining module 20 further includes:
and the same attribute judging unit 23 is configured to sequentially determine whether two image regions of the two images to be matched have the same visual attribute according to the importance sequence.
A similarity calculation unit 24, configured to determine a similarity between the two image regions according to the number of the same visual attributes of the two image regions and the spatial descriptors of the two image regions.
In terms of hardware, in order to effectively improve the image retrieval accuracy and the matching efficiency, the present application provides an embodiment of an electronic device for implementing all or part of the contents in the image matching method, where the electronic device specifically includes the following contents:
a processor, a memory, a communication interface, and a bus. The processor, the memory and the communication interface communicate with one another through the bus; the communication interface is used for transmitting information between the image matching apparatus and related devices such as a core service system, a user terminal and a related database. The electronic device may be a desktop computer, a tablet computer, a mobile terminal, and the like, but this embodiment is not limited thereto. In this embodiment, the electronic device may be implemented with reference to the embodiments of the image matching method and the image matching apparatus above, the contents of which are incorporated herein and are not repeated.
It is understood that the user terminal may include a smartphone, a tablet electronic device, a network set-top box, a portable computer, a desktop computer, a personal digital assistant (PDA), an in-vehicle device, a smart wearable device, and the like. The smart wearable device may include smart glasses, a smart watch, a smart band, and the like.
In practical applications, part of the image matching method may be performed on the electronic device side as described above, or all operations may be performed in the client device. The selection may be specifically performed according to the processing capability of the client device, the limitation of the user usage scenario, and the like. This is not a limitation of the present application. The client device may further include a processor if all operations are performed in the client device.
The client device may have a communication module (i.e., a communication unit), and may be communicatively connected to a remote server to implement data transmission with the server. The server may include a server on the task scheduling center side, and in other implementation scenarios, the server may also include a server on an intermediate platform, for example, a server on a third-party server platform that is communicatively linked to the task scheduling center server. The server may include a single computer device, or may include a server cluster formed by a plurality of servers, or a server structure of a distributed apparatus.
Fig. 10 is a schematic block diagram of a system configuration of an electronic device 9600 according to an embodiment of the present application. As shown in fig. 10, the electronic device 9600 may include a central processor 9100 and a memory 9140, the memory 9140 being coupled to the central processor 9100. It is noted that fig. 10 is exemplary; other types of structures may also be used, in addition to or in place of those shown, to implement telecommunications or other functions.
In one embodiment, the image matching method functions may be integrated into the central processor 9100. The central processor 9100 may be configured to control as follows:
step S101: determining image feature points of two images to be matched, performing coordinate clustering according to the image feature points, and dividing the images to be matched into a plurality of image regions according to the coordinate clustering result.
Step S102: ranking the importance of a plurality of image regions of each image to be matched, sequentially comparing features of two image regions of the two images to be matched according to the importance ranking, and determining the similarity of the two image regions.
Step S103: determining whether the two images to be matched match, according to the similarity of the two image regions.
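By way of illustration only, the three steps above might be sketched in Python roughly as follows. This is not the patented implementation: the clustering here is a plain k-means stand-in (the disclosure elsewhere names mean shift), the importance measure is simply the feature-point count, and the similarity measure, the 0.5 threshold, and all function names are hypothetical.

```python
import numpy as np

def divide_into_regions(points, n_regions=2, iters=20, seed=0):
    """Toy stand-in for step S101: cluster feature-point coordinates
    (here with plain k-means) and treat each cluster as one image region."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), n_regions, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers) ** 2).sum(-1), axis=1)
        for k in range(n_regions):
            if np.any(labels == k):
                centers[k] = points[labels == k].mean(axis=0)
    return labels

def rank_regions_by_importance(points, labels):
    """Step S102, first half: rank region ids so that the region holding
    the most feature points -- the 'salient' one -- comes first."""
    region_ids = np.unique(labels)
    counts = [(labels == r).sum() for r in region_ids]
    return [int(r) for _, r in sorted(zip(counts, region_ids), reverse=True)]

def region_similarity(attrs_a, attrs_b):
    """Step S102, second half, simplified: Jaccard overlap of the
    'visual attributes' (descriptor sets) of two regions."""
    union = attrs_a | attrs_b
    return len(attrs_a & attrs_b) / len(union) if union else 0.0

def images_match(similarity, threshold=0.5):
    """Step S103: declare a match when the salient-region similarity
    reaches a (hypothetical) threshold."""
    return similarity >= threshold
```

A caller would run `divide_into_regions` on each image's keypoint coordinates, compare the top-ranked regions first, and stop early once a match or mismatch is decided.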
As can be seen from the above description, the electronic device provided in the embodiment of the present application divides the image into regions, so that the whole image is split into a "salient" part and a "non-salient" part, and only the image features of the salient regions are compared. This greatly reduces the number of image features to be compared and helps to improve both image retrieval accuracy and image retrieval matching efficiency.
In another embodiment, the image matching apparatus may be configured separately from the central processor 9100; for example, it may be configured as a chip connected to the central processor 9100, with the image matching method function realized under the control of the central processor.
As shown in fig. 10, the electronic device 9600 may further include: a communication module 9110, an input unit 9120, an audio processor 9130, a display 9160, and a power supply 9170. It is noted that the electronic device 9600 does not necessarily include all of the components shown in fig. 10; in addition, it may further include components not shown in fig. 10, for which reference may be made to the prior art.
As shown in fig. 10, the central processor 9100, sometimes referred to as a controller or operational control, may include a microprocessor or other processor device and/or logic device. The central processor 9100 receives input and controls the operation of the various components of the electronic device 9600.
The memory 9140 may be, for example, one or more of a buffer, a flash memory, a hard drive, removable media, a volatile memory, a non-volatile memory, or another suitable device. It may store the relevant information mentioned above, as well as programs for processing such information, and the central processor 9100 may execute the programs stored in the memory 9140 to realize information storage, processing, and the like.
The input unit 9120 provides input to the central processor 9100. The input unit 9120 is, for example, a key or a touch input device. The power supply 9170 is used to provide power to the electronic device 9600. The display 9160 is used for displaying display objects such as images and characters. The display may be, for example, an LCD display, but is not limited thereto.
The memory 9140 may be a solid-state memory, e.g., a read-only memory (ROM), a random access memory (RAM), a SIM card, or the like. It may also be a memory that retains information even when powered off, and that can be selectively erased and provided with more data; an example of such a memory is sometimes referred to as an EPROM or the like. The memory 9140 may also be some other type of device. The memory 9140 includes a buffer memory 9141 (sometimes referred to as a buffer), and may include an application/function storage portion 9142 for storing application programs and function programs, or for executing the operation flow of the electronic device 9600 through the central processor 9100.
The memory 9140 may also include a data storage portion 9143 for storing data, such as contacts, digital data, pictures, sounds, and/or any other data used by the electronic device. A driver storage portion 9144 of the memory 9140 may include various drivers of the electronic device for the communication function and/or for performing other functions of the electronic device (e.g., a messaging application, a contact book application, etc.).
The communication module 9110 is a transmitter/receiver 9110 that transmits and receives signals via an antenna 9111. The communication module (transmitter/receiver) 9110 is coupled to the central processor 9100 to provide input signals and receive output signals, which may be the same as in the case of a conventional mobile communication terminal.
Based on different communication technologies, a plurality of communication modules 9110, such as a cellular network module, a bluetooth module, and/or a wireless local area network module, may be provided in the same electronic device. The communication module (transmitter/receiver) 9110 is also coupled to a speaker 9131 and a microphone 9132 via an audio processor 9130 to provide audio output via the speaker 9131 and receive audio input from the microphone 9132 to implement general telecommunications functions. The audio processor 9130 may include any suitable buffers, decoders, amplifiers and so forth. In addition, the audio processor 9130 is also coupled to the central processor 9100, thereby enabling recording locally through the microphone 9132 and enabling locally stored sounds to be played through the speaker 9131.
An embodiment of the present application further provides a computer-readable storage medium capable of implementing all of the steps of the image matching method in the above embodiments in which the execution subject is a server or a client. The computer-readable storage medium stores a computer program which, when executed by a processor, implements all of the steps of that method; for example, the processor implements the following steps when executing the computer program:
step S101: determining image feature points of two images to be matched, performing coordinate clustering according to the image feature points, and dividing the images to be matched into a plurality of image regions according to the coordinate clustering result.
Step S102: ranking the importance of a plurality of image regions of each image to be matched, sequentially comparing features of two image regions of the two images to be matched according to the importance ranking, and determining the similarity of the two image regions.
Step S103: determining whether the two images to be matched match, according to the similarity of the two image regions.
As can be seen from the above description, the computer-readable storage medium provided in the embodiment of the present application divides the image into regions, so that the whole image is split into a "salient" part and a "non-salient" part, and only the image features of the salient regions are compared. This greatly reduces the number of image features to be compared and helps to improve both image retrieval accuracy and image retrieval matching efficiency.
Embodiments of the present application further provide a computer program product capable of implementing all of the steps of the image matching method in the above embodiments in which the execution subject is a server or a client. When executed by a processor, the computer program/instructions implement the steps of the image matching method; for example, the computer program/instructions implement the following steps:
step S101: determining image feature points of two images to be matched, performing coordinate clustering according to the image feature points, and dividing the images to be matched into a plurality of image regions according to the coordinate clustering result.
Step S102: ranking the importance of a plurality of image regions of each image to be matched, sequentially comparing features of two image regions of the two images to be matched according to the importance ranking, and determining the similarity of the two image regions.
Step S103: determining whether the two images to be matched match, according to the similarity of the two image regions.
As can be seen from the foregoing description, the computer program product provided in the embodiments of the present application divides the image into regions, so that the whole image is split into a "salient" part and a "non-salient" part, and only the image features of the salient regions are compared. This greatly reduces the number of image features to be compared and helps to improve both image retrieval accuracy and image retrieval matching efficiency.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (devices), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The principles and implementations of the present invention have been described herein with reference to specific embodiments; the description of the above embodiments is only intended to help understand the method and core idea of the present invention. Meanwhile, a person skilled in the art may, according to the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the contents of this specification should not be construed as limiting the present invention.

Claims (10)

1. An image matching method, characterized in that the method comprises:
determining image feature points of two images to be matched, performing coordinate clustering according to the image feature points, and dividing the images to be matched into a plurality of image regions according to a coordinate clustering result;
ranking the importance of the plurality of image regions of each image to be matched, sequentially comparing features of two image regions of the two images to be matched according to the importance ranking, and determining the similarity of the two image regions;
and determining whether the two images to be matched match, according to the similarity of the two image regions.
2. The image matching method according to claim 1, wherein the determining the image feature points of the two images to be matched comprises:
searching extreme points of the image to be matched on different scale spaces according to a preset scale invariant feature conversion algorithm to obtain image feature points of the image to be matched.
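For illustration, the scale-space extremum search named in claim 2 (the core of the scale-invariant feature transform, SIFT) might be sketched with plain NumPy as below. This is a heavily reduced sketch, not the claimed algorithm: real SIFT works over multiple octaves and adds contrast and edge rejection, orientation assignment, and descriptors, and the sigma ladder and threshold here are made-up values.

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable Gaussian filter built with plain NumPy (no OpenCV/SciPy).
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    k /= k.sum()
    rows = np.apply_along_axis(np.convolve, 1, img, k, mode='same')
    return np.apply_along_axis(np.convolve, 0, rows, k, mode='same')

def dog_extrema(img, sigmas=(1.0, 2.0, 4.0, 8.0), thresh=0.01):
    """Reduced sketch of SIFT's search on different scale spaces: build a
    difference-of-Gaussians (DoG) stack and keep pixels whose response
    exceeds all 26 neighbours in space and scale."""
    blurred = [gaussian_blur(img, s) for s in sigmas]
    dog = np.stack([b2 - b1 for b1, b2 in zip(blurred, blurred[1:])])
    keypoints = []
    d, h, w = dog.shape
    for s in range(1, d - 1):          # middle scale layers only
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                patch = dog[s-1:s+2, y-1:y+2, x-1:x+2]
                v = dog[s, y, x]
                if abs(v) > thresh and (v == patch.max() or v == patch.min()):
                    keypoints.append((y, x))
    return keypoints
```

A blob-like structure in the image produces a DoG extremum at its centre at the scale matching the blob's size, which is exactly the behaviour the test below checks.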
3. The image matching method according to claim 1, wherein the performing coordinate clustering according to the image feature points and dividing the image to be matched into a plurality of image regions according to a coordinate clustering result comprises:
carrying out coordinate clustering on the coordinates of the image feature points according to a preset mean shift clustering algorithm;
and determining a plurality of image areas according to the coordinate clustering result.
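A minimal, plain-NumPy sketch of the mean-shift step of claim 3 might look as follows; the bandwidth, iteration count, and merge tolerance are hypothetical parameters, and a production system would more likely use a library implementation.

```python
import numpy as np

def mean_shift(points, bandwidth=2.0, iters=30, merge_tol=0.5):
    """Sketch of claim 3's step: each feature-point coordinate climbs to the
    mode of a Gaussian-weighted local mean; points ending on the same mode
    form one image region."""
    shifted = points.astype(float).copy()
    for _ in range(iters):
        for i, p in enumerate(shifted):
            w = np.exp(-((points - p) ** 2).sum(axis=1) / (2 * bandwidth ** 2))
            shifted[i] = (w[:, None] * points).sum(axis=0) / w.sum()
    # Merge converged points that landed within merge_tol of each other.
    labels = -np.ones(len(points), dtype=int)
    modes = []
    for i, p in enumerate(shifted):
        for j, m in enumerate(modes):
            if np.linalg.norm(p - m) < merge_tol:
                labels[i] = j
                break
        else:
            modes.append(p)
            labels[i] = len(modes) - 1
    return labels, np.array(modes)
```

Unlike k-means, mean shift does not require the number of regions in advance, which fits the claim's open-ended "plurality of image areas".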
4. The image matching method according to claim 1, wherein the ranking of the importance of the plurality of image regions of each image to be matched comprises:
and ranking the importance of each image region according to the number of image feature points contained in each divided image region.
5. The image matching method according to claim 1, wherein the ranking of importance of the plurality of image regions of each image to be matched further comprises:
and ranking the importance of each image region according to the positional distance between each divided image region and the image center of the image to be matched.
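The two importance rankings of claims 4 and 5 might be sketched as below; the region representation (a dict with a `points` array) is a hypothetical convenience, not part of the disclosure.

```python
import numpy as np

def rank_by_feature_count(regions):
    """Claim 4: more feature points in a region -> higher importance."""
    return sorted(regions, key=lambda r: len(r['points']), reverse=True)

def rank_by_center_distance(regions, image_center):
    """Claim 5: the closer a region's centroid lies to the image centre,
    the higher its importance."""
    def dist(region):
        centroid = np.mean(region['points'], axis=0)
        return np.linalg.norm(centroid - np.asarray(image_center, dtype=float))
    return sorted(regions, key=dist)
```

The two criteria can disagree (a dense region in a corner versus a sparse one at the centre); the disclosure leaves open how they would be combined.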
6. The image matching method according to claim 1, wherein the sequentially comparing the features of the two image regions of the two images to be matched according to the order of importance to determine the similarity of the two image regions comprises:
sequentially determining, according to the importance ranking, whether two image regions of the two images to be matched have the same visual attributes;
and determining the similarity of the two image regions according to the number of identical visual attributes of the two image regions and the spatial descriptors of the two image regions.
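Claim 6's combination of shared visual attributes and spatial descriptors might be sketched as below; the 50/50 blend of attribute overlap and descriptor cosine similarity is an assumed weighting, not specified by the disclosure.

```python
import numpy as np

def region_similarity(attrs_a, attrs_b, desc_a, desc_b, alpha=0.5):
    """Claim 6, sketched: blend (a) the fraction of visual attributes the
    two regions share with (b) the cosine similarity of their spatial
    descriptors. alpha=0.5 is a made-up weighting."""
    union = attrs_a | attrs_b
    shared = len(attrs_a & attrs_b) / len(union) if union else 0.0
    a = np.asarray(desc_a, dtype=float)
    b = np.asarray(desc_b, dtype=float)
    cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return alpha * shared + (1 - alpha) * cos
```

Identical regions score 1.0 and fully disjoint regions score 0.0, so a single threshold on this value can drive the final match decision of the method.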
7. An image matching apparatus, characterized by comprising:
the image area dividing module is used for determining image feature points of two images to be matched, carrying out coordinate clustering according to the image feature points and dividing the images to be matched into a plurality of image areas according to a coordinate clustering result;
the salient region determining module is used for ranking the importance of a plurality of image regions of each image to be matched, sequentially comparing features of two image regions of the two images to be matched according to the importance ranking, and determining the similarity of the two image regions;
and the image matching module is used for determining whether the two images to be matched are matched according to the similarity of the two image areas.
8. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the steps of the image matching method according to any of claims 1 to 6 are implemented when the program is executed by the processor.
9. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image matching method of any one of claims 1 to 6.
10. A computer program product comprising computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the steps of the image matching method of any of claims 1 to 6.
CN202210374133.XA 2022-04-11 2022-04-11 Image matching method and device Pending CN114782716A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210374133.XA CN114782716A (en) 2022-04-11 2022-04-11 Image matching method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210374133.XA CN114782716A (en) 2022-04-11 2022-04-11 Image matching method and device

Publications (1)

Publication Number Publication Date
CN114782716A true CN114782716A (en) 2022-07-22

Family

ID=82428228

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210374133.XA Pending CN114782716A (en) 2022-04-11 2022-04-11 Image matching method and device

Country Status (1)

Country Link
CN (1) CN114782716A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115437601A (en) * 2022-11-02 2022-12-06 荣耀终端有限公司 Image sorting method, electronic device, program product, and medium
CN115437601B (en) * 2022-11-02 2024-04-19 荣耀终端有限公司 Image ordering method, electronic device, program product and medium
CN117708357A (en) * 2023-06-16 2024-03-15 荣耀终端有限公司 Image retrieval method and electronic equipment

Similar Documents

Publication Publication Date Title
CN114782716A (en) Image matching method and device
CN113722438A (en) Sentence vector generation method and device based on sentence vector model and computer equipment
CN110232131A (en) Intention material searching method and device based on intention label
CN113342948A (en) Intelligent question and answer method and device
CN111104572A (en) Feature selection method and device for model training and electronic equipment
CN110059172B (en) Method and device for recommending answers based on natural language understanding
CN110717405A (en) Face feature point positioning method, device, medium and electronic equipment
CN111563207B (en) Search result sorting method and device, storage medium and computer equipment
CN113138677A (en) Method and device for determining candidate words of input method, electronic equipment and storage medium
CN111274476B (en) House source matching method, device, equipment and storage medium based on face recognition
CN111444321A (en) Question answering method, device, electronic equipment and storage medium
CN113313066A (en) Image recognition method, image recognition device, storage medium and terminal
CN112200623A (en) Product recommendation method, device, equipment and storage medium
CN111966894A (en) Information query method and device, storage medium and electronic equipment
CN114398883B (en) Presentation generation method and device, computer readable storage medium and server
CN116310994A (en) Video clip extraction method and device, electronic equipment and medium
CN111127481A (en) Image identification method and device based on TOF image communication area
CN105677926A (en) Local search result display method and device and electronic equipment
CN115798458A (en) Classified language identification method and device
CN111292171B (en) Financial product pushing method and device
CN110399615B (en) Transaction risk monitoring method and device
CN105991400B (en) Group searching method and device
CN111382365A (en) Method and apparatus for outputting information
CN116522911B (en) Entity alignment method and device
CN114139031B (en) Data classification method, device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination