CN108921181B - Local image feature extraction method, device and system and readable storage medium - Google Patents


Info

Publication number
CN108921181B
CN108921181B (application CN201810870557.9A)
Authority
CN
China
Prior art keywords
convolution
local image
group
neural network
local
Prior art date
Legal status
Active
Application number
CN201810870557.9A
Other languages
Chinese (zh)
Other versions
CN108921181A (en)
Inventor
林璟怡
李东
章云
曾宪贤
王晓东
Current Assignee
Guangdong University of Technology
Original Assignee
Guangdong University of Technology
Priority date
Filing date
Publication date
Application filed by Guangdong University of Technology
Priority: CN201810870557.9A
Publication of CN108921181A
Application granted
Publication of CN108921181B
Legal status: Active

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks

Abstract

The application discloses a local image feature extraction method comprising the following steps: correcting the convolution parameters in a convolutional neural network to obtain corrected convolution parameters; acquiring a local image; and extracting feature information of the local image with the convolutional neural network according to the corrected convolution parameters to obtain a feature vector. The method for correcting the convolution parameters comprises: inputting acquired local image information of different categories into the convolutional neural network to obtain a feature vector for each piece of local image information; calculating the feature center point corresponding to each category of local image information; calculating, from the feature center points, the similarity of local images both between categories and within the same category; and correcting the convolution parameters according to these similarities. The extraction method has strong local feature extraction capability and extracts detail features well. The application also discloses a local image feature extraction apparatus, a system and a computer-readable storage medium having the same beneficial effects.

Description

Local image feature extraction method, device and system and readable storage medium
Technical Field
The present application relates to the field of image processing, and in particular, to a method, an apparatus, a system, and a computer-readable storage medium for extracting local image features.
Background
Local image features are local expressions of image features and reflect local characteristics of an image. Compared with global image features such as line, texture and structural features, local image features are abundant within an image, have little correlation with one another, and remain detectable and matchable even when some features disappear under occlusion. They are therefore widely applied in face recognition, three-dimensional reconstruction, target recognition and tracking, film and television production, panoramic image stitching and other fields. Extracting local image features is usually the first step of many problems in computer vision and digital image processing, such as image classification, image retrieval and wide-baseline matching, and the quality of the extracted features directly affects the final performance of the task. Local image feature description is a basic research problem of computer vision and plays an important role in finding corresponding points between images and in describing object features. The local feature extraction method therefore has important research value.
However, images often vary in scale, translation, rotation, illumination, viewpoint and blur. In practical application scenes in particular, images inevitably suffer heavy noise interference, complex backgrounds and large changes in target pose, and under such unfavorable conditions the extraction of local features is often unsatisfactory.
Therefore, how to improve the local feature extraction capability and improve the detail feature extraction effect is a technical problem to be solved by those skilled in the art.
Disclosure of Invention
An object of the present application is to provide a local image feature extraction method with strong local feature extraction capability and a good detail-feature extraction effect; another object of the present application is to provide a local image feature extraction apparatus, system and computer-readable storage medium having the above advantages.
The application provides a local image feature extraction method, which comprises the following steps:
correcting the convolution parameters in the convolution neural network to obtain corrected convolution parameters;
acquiring a local image;
extracting feature information of the local image according to the corrected convolution parameters based on the convolution neural network to obtain feature vectors;
the method for correcting the convolution parameters comprises the following steps:
inputting acquired local image information of different categories into a convolutional neural network to obtain a feature vector corresponding to each piece of local image information;
calculating characteristic center points corresponding to the local image information of each category;
calculating the similarity of local images among all categories and in the same category according to the characteristic central points;
and correcting the convolution parameters according to the similarity.
Optionally, calculating the similarity of local images between categories and within the same category according to the feature center points includes:
calculating Euclidean distances between each feature vector and the corresponding feature center point in the same category;
counting the fluctuation condition of the Euclidean distance corresponding to each feature vector, and taking the fluctuation condition as the similarity of local images in corresponding categories;
and calculating Euclidean distances among the feature center points among different classes to serve as the similarity of the local images among the classes.
Optionally, counting the fluctuation of the Euclidean distance corresponding to each feature vector includes:
and calculating the variance between the Euclidean distances corresponding to each feature vector.
Optionally, the modifying the convolution parameter according to the similarity includes:
constructing a loss function of the convolutional neural network according to the similarity;
and correcting the convolution parameters of the convolution neural network according to the loss function.
Optionally, the modifying the convolution parameter of the convolutional neural network according to the loss function includes:
and correcting the convolution parameters of the convolutional neural network based on a random gradient descent method according to the loss function.
Optionally, the acquiring the local image comprises:
receiving an overall image;
receiving local information for feature extraction;
and cutting the image according to the local information to obtain a local image.
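The receive-and-cut procedure above can be sketched as follows (a minimal illustration assuming the whole image is a NumPy array and the local information is a bounding box; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def crop_local_image(whole_image, box):
    """Cut a local region out of the whole image.

    whole_image: H x W (x C) array; box: (top, left, height, width).
    """
    top, left, h, w = box
    if (top < 0 or left < 0 or
            top + h > whole_image.shape[0] or left + w > whole_image.shape[1]):
        raise ValueError("local region lies outside the whole image")
    return whole_image[top:top + h, left:left + w]

# Example: from a 64x64 "whole" image, keep a 32x32 local patch.
whole = np.arange(64 * 64).reshape(64, 64)
patch = crop_local_image(whole, (16, 16, 32, 32))
```

The check against the image bounds stands in for validating the received local information before cutting.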
The application also discloses a local image feature extraction apparatus, comprising:
the correction unit is used for correcting the convolution parameters in the convolution neural network to obtain corrected convolution parameters;
an acquisition unit configured to acquire a local image;
the characteristic extraction unit is used for extracting the characteristic information of the local image according to the corrected convolution parameters on the basis of the convolution neural network to obtain a characteristic vector;
wherein the correction unit includes:
the image input subunit is used for inputting the acquired local image information including different types into the convolutional neural network to obtain a feature vector corresponding to each local image information;
the central point calculation subunit is used for calculating characteristic central points corresponding to the local image information of each category;
the similarity calculation subunit is used for calculating the similarity of local images between categories and within the same category according to the feature center points;
and the parameter correction subunit is used for correcting the convolution parameters according to the similarity.
Optionally, the similarity calculation subunit includes:
the first calculating subunit is used for calculating Euclidean distances between each feature vector in the same category and the corresponding feature center point;
the statistic subunit is used for counting the fluctuation condition of the Euclidean distance corresponding to each feature vector as the similarity of the local images in the corresponding category;
and the second calculating subunit is used for calculating Euclidean distances among feature center points among different categories as the similarity of local images among the categories.
The application also discloses a local image feature extraction system, comprising:
a memory for storing a computer program;
and a processor for implementing the steps of the local image feature extraction method when executing the computer program.
A readable storage medium having stored thereon a program which, when executed by a processor, implements the steps of the local image feature extraction method.
To solve the above technical problems, the application provides a local image feature extraction method that extracts the feature information of a local image through a convolutional neural network. Because a convolutional neural network can discover and characterize the complex structural features inside a local image, feature extraction performance can be greatly improved. In addition, after local image information of different categories is input into the convolutional neural network to obtain feature vectors, the similarity of image features within the same category and between different categories is measured by computing feature center points of the corresponding feature vectors. This similarity evaluates how well the learned network distinguishes images of the same category and images of different categories, transfers the way features are extracted from different images to extracting and distinguishing features of local images within the same category, and allows the convolution parameters to be corrected continuously until they accurately describe the detail-distinguishing features. Only similarities need to be computed during correction, so the amount of calculation is small and the correction process is fast; a convolutional neural network with corrected convolution parameters can accurately mine the features of the corresponding categories, giving a high image recognition rate and strong robustness.
The application also discloses a local image feature extraction device, a system and a computer readable storage medium, which have the beneficial effects and are not repeated herein.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, it is obvious that the drawings in the following description are only embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the provided drawings without creative efforts.
Fig. 1 is a flowchart of a local image feature extraction method provided in an embodiment of the present application;
fig. 2 is a flowchart of a method for correcting a convolution parameter according to an embodiment of the present application;
fig. 3 is a flowchart of a similarity calculation method according to an embodiment of the present application;
fig. 4 is a block diagram of a local image feature extraction apparatus according to an embodiment of the present application;
fig. 5 is a block diagram of a modification unit provided in the embodiment of the present application;
fig. 6 is a block diagram of a similarity calculator subunit structure provided in the embodiment of the present application;
fig. 7 is a block diagram of a parameter modification subunit according to an embodiment of the present application;
fig. 8 is a block diagram of a structure of an obtaining unit according to an embodiment of the present application.
Detailed Description
The core of the application is to provide a local image feature extraction method with strong local feature extraction capability and a good detail-feature extraction effect; another core of the present application is to provide a local image feature extraction apparatus, system and readable storage medium having the above beneficial effects.
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Fig. 1 is a flowchart of a local image feature extraction method provided in this embodiment; the method can comprise the following steps:
step s 100: and correcting the convolution parameters in the convolution neural network to obtain the corrected convolution parameters.
Fig. 2 is a flowchart of a method for correcting a convolution parameter provided in this embodiment, where the method for correcting a convolution parameter specifically includes the following steps:
step s 110: inputting the obtained local image information including different types into a convolutional neural network to obtain a feature vector corresponding to each local image information;
k classes are selected arbitrarily, and n local training images are randomly drawn from each class and input into the convolutional neural network. The image size is not limited here; taking 32 × 32-pixel local training images as an example, k groups of n 128-dimensional feature vectors each (k × n vectors in total) are obtained and used as descriptors of the images.
Step s 120: calculating characteristic center points corresponding to the local image information of each category;
the center point of each group is found out from the k groups of feature vectors obtained in step s110, and the feature center point refers to the vector mean of the feature vectors. For example, if the feature vectors in category 1 are 1,2,3, 4, and 5, respectively, the center point is 3. The feature center point is not limited to only the vectors included in the feature vector.
Step s 130: calculating the similarity of local images among all categories and in the same category according to the characteristic central points;
The similarity between each feature vector and the center feature vector within a group is evaluated to judge how similar local images of the same category are; the similarity between the center feature vectors of two groups is evaluated to judge how similar local images of different categories are.
Describing image features through imaging angle, gray scale, spatial position and the like can reflect only part of an image's characteristics. The local image feature extraction method provided by this embodiment starts from the overall features of the image and uses the similarities to improve the distinguishability of image features, thereby uncovering tiny differences between images.
The method of calculating similarity is not limited. For example, the within-group similarity can be the sum of the differences between each feature vector and the center point, and the between-class similarity can be the difference between the center points of different groups; similarity can also be evaluated by calculating Euclidean distances. Preferably, to evaluate the vectors accurately while keeping the algorithm as simple as possible, similarity is evaluated by calculating Euclidean distances. Specifically, fig. 3 is a flowchart of the similarity calculation method provided in this embodiment; calculating the similarity of local images between categories and within the same category according to the feature center points may include:
step s 131: calculating Euclidean distances between each feature vector and the corresponding feature center point in the same category;
step s 132: counting the fluctuation condition of the Euclidean distance corresponding to each feature vector, and taking the fluctuation condition as the similarity of local images in corresponding categories;
step s 133: and calculating Euclidean distances among the feature center points among different classes to serve as the similarity of the local images among the classes.
The specific algorithm for measuring the fluctuation in step s132 may follow the prior art and is not limited here; for example, the fluctuation may be determined by calculating a variance, a standard deviation and the like. Preferably, to keep the calculation simple, it is measured by calculating the variance of the Euclidean distances corresponding to the feature vectors.
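Steps s131–s133 with the variance as the fluctuation measure can be sketched as one small function (illustrative only; the (k, n, dim) array layout and names are assumptions, not from the patent):

```python
import numpy as np

def similarity_stats(features):
    """Intra-class distance variance and inter-class center distances.

    features: (k, n, dim) array of descriptors, one group per class.
    """
    centers = features.mean(axis=1)                                  # (k, dim)
    # s131: Euclidean distance of each vector to its own class center.
    intra = np.linalg.norm(features - centers[:, None, :], axis=2)   # (k, n)
    # s132: fluctuation of those distances, here measured by the variance.
    intra_var = intra.var(axis=1)                                    # (k,)
    # s133: Euclidean distances between the centers of different classes.
    diff = centers[:, None, :] - centers[None, :, :]
    inter = np.linalg.norm(diff, axis=2)                             # (k, k)
    return intra_var, inter

feats = np.random.default_rng(1).normal(size=(3, 5, 128))
intra_var, inter = similarity_stats(feats)
```

`inter` is symmetric with a zero diagonal, so only the off-diagonal entries carry between-class similarity information.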
Step s 140: and correcting the convolution parameters according to the similarity.
The similarity information obtained in step s130 is used to evaluate the performance of the current convolutional neural network: the lower the similarity between different categories, the better the network reflects the characteristics unique to each category, and the lower the similarity within the same category, the better it reflects the detail differences between individual images. The convolution parameters in the convolutional neural network are corrected accordingly, yielding the corrected target convolutional neural network.
To keep the correction process quantitative and accurate, the correction preferably involves constructing a loss function; the smaller the loss function, the better the network fits. Specifically, correcting the convolution parameters according to the similarity includes:
constructing a loss function of the convolutional neural network according to the similarity;
and correcting the convolution parameters of the convolution neural network according to the loss function.
The correction according to the loss function may use a general iterative correction method from the prior art, which is not limited here. Because general iterative correction is time-consuming, to improve parameter-correction efficiency the convolution parameters of the convolutional neural network are preferably corrected according to the loss function by stochastic gradient descent. SGD (stochastic gradient descent) can optimize problems for which no exact mathematical model can be established: it iteratively approaches the true values, continually reducing the model's output error by approximately optimizing on a single sample at a time, so it converges quickly, requires little computation per step, and can greatly improve overall efficiency.
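The single-sample update that the paragraph above describes can be illustrated on a toy problem (fitting the mean of a data set by minimizing a per-sample squared loss; the toy loss and the learning rate are illustrative assumptions, not the patent's loss function):

```python
import numpy as np

rng = np.random.default_rng(2)
samples = rng.normal(loc=3.0, scale=0.5, size=200)

w = 0.0                        # parameter to fit (converges to the sample mean)
lr = 0.05                      # learning rate
for epoch in range(20):
    # Each step uses the gradient from one sample, not the whole data set.
    for x in rng.permutation(samples):
        grad = 2.0 * (w - x)   # d/dw of the single-sample loss (w - x)^2
        w -= lr * grad         # SGD update
```

Each update is cheap and noisy; averaged over many steps the parameter settles near the minimizer, which is the behavior the text relies on for fast, low-cost correction.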
Step s 200: a local image is acquired.
A local image is an image containing only part of a whole object; for example, the nose region of a whole face image is a local image. The method of acquiring the local image is not limited: the local image may be input directly, or a partial image may be retained as required after the whole image is received. The resolution of the input image should be as high as possible so as not to impair the feature extraction effect.
Optionally, the acquiring the local image may specifically include:
receiving an overall image;
receiving local information for feature extraction;
and cutting the image according to the local information to obtain a local image.
Step s 300: and extracting the characteristic information of the local image according to the corrected convolution parameters based on the convolution neural network to obtain a characteristic vector.
The local image is input into the convolutional neural network, which automatically extracts features according to the corrected convolution parameters and outputs the feature vector corresponding to the local image.
It should be noted that the convolution-parameter correction of step s100 need not be completed every time local image features are extracted. Once the correction of the convolution parameters reaches a preset accuracy, step s100 may be skipped, and feature extraction only requires executing steps s200 and s300 with the corrected convolution parameters.
Based on the above, the local image feature extraction provided by the application extracts the feature information of a local image through a convolutional neural network, which can discover and characterize the complex structural features inside the local image and thus greatly improve feature extraction performance. After local image information of different categories is input into the convolutional neural network to obtain feature vectors, the similarity of image features within the same category and between different categories is measured by computing feature center points of the corresponding feature vectors. This similarity evaluates how well the network distinguishes images of the same and of different categories, and the convolution parameters are corrected continuously until they accurately describe the detail-distinguishing features. Only similarities need to be computed during correction, so the calculation is small and the correction is fast, and the corrected network accurately mines the features of the corresponding categories, with a high image recognition rate and strong robustness.
For ease of understanding, the whole convolution-parameter correction process is described here taking a convolutional neural network with seven convolutional layers as an example; other configurations may refer to the description of this embodiment.
The method specifically comprises the following steps:
step s 1.1: randomly selecting k classes from the training local image set, and randomly selecting n images from each class;
step s 1.2: input the images obtained in step s1.1 into the Layer1 convolutional layer of the convolutional neural network. Layer1 contains 32 convolution kernels of 3 × 3 with a stride of 1 pixel; the input images are convolved and batch-normalized (batch normalization), and with ReLU as the activation function a 32 × 32 × 32 Layer1 feature image is output;
step s 1.3: input the feature image obtained in step s1.2 into the Layer2 convolutional layer, which contains 32 convolution kernels of 3 × 3 with a stride of 1 pixel; after convolution and batch normalization, with ReLU as the activation function, a 32 × 32 × 32 Layer2 feature image is output;
step s 1.4: input the feature image obtained in step s1.3 into the Layer3 convolutional layer, which contains 64 convolution kernels of 3 × 3 with a stride of 2 pixels; after convolution and batch normalization, with ReLU as the activation function, a 64 × 16 × 16 Layer3 feature image is output;
step s 1.5: input the feature image obtained in step s1.4 into the Layer4 convolutional layer, which contains 64 convolution kernels of 3 × 3 with a stride of 1 pixel; after convolution and batch normalization, with ReLU as the activation function, a 64 × 16 × 16 Layer4 feature image is output;
step s 1.6: input the feature image obtained in step s1.5 into the Layer5 convolutional layer, which contains 128 convolution kernels of 3 × 3 with a stride of 2 pixels; after convolution and batch normalization, with ReLU as the activation function, a 128 × 8 × 8 Layer5 feature image is output;
step s 1.7: input the feature image obtained in step s1.6 into the Layer6 convolutional layer, which contains 128 convolution kernels of 3 × 3 with a stride of 1 pixel; after convolution and batch normalization, with ReLU as the activation function, a 128 × 8 × 8 Layer6 feature image is output, with the dropout rate set to 0.25;
step s 1.8: input the feature image obtained in step s1.7 into the Layer7 convolutional layer, which contains 128 convolution kernels of 8 × 8; after convolution and batch normalization, a 128-dimensional feature representation vector is output;
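The spatial sizes implied by steps s1.2–s1.8 can be checked with the standard convolution output-size formula (a sketch assuming the 3 × 3 kernels use padding 1 and the final 8 × 8 kernel uses no padding; these padding choices are inferred, not stated in the text):

```python
def conv_out(size, kernel, stride, padding):
    """Output spatial size of a convolution: floor((size + 2p - k) / s) + 1."""
    return (size + 2 * padding - kernel) // stride + 1

# (out_channels, kernel, stride, padding) for Layer1..Layer7.
layers = [(32, 3, 1, 1), (32, 3, 1, 1), (64, 3, 2, 1), (64, 3, 1, 1),
          (128, 3, 2, 1), (128, 3, 1, 1), (128, 8, 1, 0)]

size, shapes = 32, []
for channels, kernel, stride, padding in layers:
    size = conv_out(size, kernel, stride, padding)
    shapes.append((channels, size, size))
```

Under these assumptions the two stride-2 layers halve the map, 32 → 16 → 8, and the final 8 × 8 convolution collapses the 8 × 8 map to 1 × 1, i.e. the 128-dimensional descriptor of step s1.8.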
step s 2.1: let k classifications from step s1.8 be A1,A2,A3,...,AkWherein each classification group i comprises
Figure BDA0001752028810000091
The n features are equal to represent the vector.
Step s 2.2: the center point of each classification group in step s2.1 is calculated
Figure BDA0001752028810000092
The calculation formula is as follows
Figure BDA0001752028810000093
Step s 2.3: calculate all points in each group
Figure BDA0001752028810000094
And group center point
Figure BDA0001752028810000095
Euclidean distance of
Figure BDA0001752028810000096
The calculation formula is as follows
Figure BDA0001752028810000097
Step s 2.4: calculating the center point in each group
Figure BDA00017520288100000916
With the center points in the remaining k-1 groups
Figure BDA0001752028810000098
Euclidean distance of DijThe calculation formula is as follows
Figure BDA0001752028810000099
Step s 2.5: center point for any one group i
Figure BDA00017520288100000910
From the set of data obtained in step s2.4, find DijThe smallest group j, i.e.
Figure BDA00017520288100000911
And setting the central point feature expression vector of the j groups as
Figure BDA00017520288100000912
Let their Euclidean distance be
Figure BDA00017520288100000913
Step s 3: obtained according to step s2.3
Figure BDA00017520288100000914
D obtained in step s2.4ijAnd obtained in step s2.5
Figure BDA00017520288100000915
Constructing a loss function of
Figure BDA0001752028810000101
Step s 4: optimize the parameters of the convolutional neural network with stochastic gradient descent according to the loss function obtained in step s3, and repeat the above steps until the loss function of step s3 no longer decreases or has stabilized, obtaining a convolutional neural network containing the corrected convolution parameters.
Correcting the parameters through the above steps makes the correction process efficient, and feature extraction with a neural network containing the convolution parameters obtained through this correction gives a good extraction effect.
Referring to fig. 4, fig. 4 is a block diagram of a local image feature extraction device according to an embodiment of the present application; the apparatus may include:
a correction unit 100, configured to correct a convolution parameter in a convolutional neural network to obtain a corrected convolution parameter;
an acquisition unit 200 for acquiring a local image;
a feature extraction unit 300, configured to extract feature information of the local image according to the convolution parameter corrected by the correction unit 100 based on the convolutional neural network, so as to obtain a feature vector;
the structural block diagram of the correction unit 100 is shown in fig. 5, and mainly includes:
the image input subunit 110 is configured to input the acquired local image information including different categories to a convolutional neural network, so as to obtain a feature vector corresponding to each local image information;
a central point calculating subunit 120, configured to calculate a feature central point corresponding to each category of local image information;
the similarity calculation subunit 130 is used for calculating the similarity of local images between categories and within the same category according to the feature center points;
and a parameter modification subunit 140, configured to modify the convolution parameter according to the similarity.
It should be noted that, for the working process of each unit in the local image feature extraction apparatus of this embodiment, reference may be made to the embodiment corresponding to fig. 1, which is not repeated here.
Optionally, as shown in fig. 6, the structural block diagram of the similarity operator unit 130 may specifically include:
a first calculating subunit 131, configured to calculate the Euclidean distance between each feature vector in a category and the corresponding feature center point;
a statistics subunit 132, configured to measure the fluctuation of the Euclidean distances corresponding to the feature vectors as the similarity of the local images within the corresponding category;
and a second calculating subunit 133, configured to calculate the Euclidean distances between the feature center points of different categories as the similarity of local images between categories.
The statistics subunit 132 may be specifically configured to calculate the variance of the Euclidean distances corresponding to the feature vectors.
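The work of these subunits can be sketched as follows. This is a minimal NumPy sketch in which the function name `class_similarities` and the list-of-arrays input layout are assumptions:

```python
import numpy as np

def class_similarities(features_by_class):
    """features_by_class: list of (n_i, d) arrays, one array per class.
    Returns the within-class similarity (variance of distances to the
    class center) and the between-class similarity (distances between
    class centers)."""
    centers = [f.mean(axis=0) for f in features_by_class]
    # within-class: variance of the Euclidean distances to the class center
    within = []
    for f, c in zip(features_by_class, centers):
        dists = np.linalg.norm(f - c, axis=1)
        within.append(dists.var())
    # between-class: Euclidean distance between every pair of centers
    k = len(centers)
    between = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            between[i, j] = np.linalg.norm(centers[i] - centers[j])
    return np.array(within), between

# Two toy classes of identical points: zero within-class variance,
# center distance ||(0,0)-(3,4)|| = 5.
two_classes = [np.array([[0.0, 0.0], [0.0, 0.0]]),
               np.array([[3.0, 4.0], [3.0, 4.0]])]
within, between = class_similarities(two_classes)
```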
Optionally, as shown in fig. 7, a structural block diagram of the parameter correcting subunit 140 mainly includes:
a loss function constructing subunit 141, configured to construct a loss function of the convolutional neural network according to the similarity;
and a loss function correcting subunit 142, configured to correct the convolution parameter of the convolutional neural network according to the loss function.
The loss function correcting subunit 142 may be specifically configured to correct the convolution parameters of the convolutional neural network by stochastic gradient descent according to the loss function.
Optionally, a block diagram of the structure of the obtaining unit 200 is shown in fig. 8, and may specifically include:
a receiving subunit 210 configured to receive the whole image;
an extraction subunit 220, configured to receive local information for performing feature extraction;
and a cropping subunit 230, configured to crop the image according to the local information, so as to obtain a local image.
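The receive-and-crop flow of the obtaining unit 200 can be sketched as follows. The `(top, left, height, width)` box layout for the local information is an assumption, since the patent does not fix a concrete format:

```python
import numpy as np

def crop_local_image(image, box):
    """Crop a local region from the whole image.
    box = (top, left, height, width) -- an assumed layout for the
    'local information' received by the extraction subunit."""
    top, left, h, w = box
    return image[top:top + h, left:left + w]

# Toy whole image: 6x6 grid of values 0..35.
whole = np.arange(36).reshape(6, 6)
patch = crop_local_image(whole, (1, 2, 3, 2))  # 3x2 patch at row 1, col 2
```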
The local image feature extraction system and readable storage medium provided in the embodiments of the present application are described below; the system and storage medium described below and the local image feature extraction method described above may be referred to correspondingly.
The application also discloses a local image feature extraction system, which mainly comprises: a memory and a processor.
Wherein the memory is configured to store a computer program;
and the processor is configured to implement the steps of the local image feature extraction method when executing the computer program.
The present application also discloses a computer-readable storage medium having a program stored thereon, which when executed by a processor, performs the steps of the local image feature extraction method.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses, devices, storage media and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus, system, storage medium, and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit is merely a logical division, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in whole or in part in the form of a software product, which is stored in a storage medium and includes instructions for causing a mobile terminal (which may be a mobile phone, a tablet computer, or the like) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiments are described in a progressive manner in the specification, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The device disclosed by the embodiment corresponds to the method disclosed by the embodiment, so that the description is simple, and the relevant points can be referred to the method part for description.
Those skilled in the art will further appreciate that the various illustrative units and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or a combination of the two. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends on the particular application and the design constraints imposed on the implementation. Skilled artisans may implement the described functionality in different ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in Random Access Memory (RAM), memory, Read Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The method, the apparatus, the system, and the computer-readable storage medium for extracting local image features provided in the present application are described in detail above. The principles and embodiments of the present application are explained herein using specific examples, which are provided only to help understand the method and the core idea of the present application. It should be noted that, for those skilled in the art, it is possible to make several improvements and modifications to the present application without departing from the principle of the present application, and such improvements and modifications also fall within the scope of the claims of the present application.

Claims (5)

1. A local image feature extraction method is characterized by comprising the following steps:
correcting convolution parameters in a convolutional neural network to obtain corrected convolution parameters;
acquiring a local image;
and extracting feature information of the local image according to the corrected convolution parameters based on the convolutional neural network to obtain feature vectors;
the method for correcting the convolution parameters comprises the following steps:
step one: the k classes output by the convolutional neural network are denoted $A_1, A_2, A_3, \ldots, A_k$, where each classification group $i$ contains $n$ feature vectors $f_1^i, f_2^i, \ldots, f_n^i$;
step two: calculate the center point $c^i$ of each classification group by

$$c^i=\frac{1}{n}\sum_{j=1}^{n}f_j^i,\qquad i=1,2,3,\ldots,k,$$

where $i$ is the number of the classification group and $j$ is the number of the feature vector within each classification group;
step three: calculate the Euclidean distance $d_j^i$ between every point $f_j^i$ in each group and the group center point $c^i$ by

$$d_j^i=\left\|f_j^i-c^i\right\|_2,\qquad i=1,2,\ldots,k,\; j=1,2,\ldots,n;$$
step four: calculate the Euclidean distance $D_{ij}$ between the center point $c^i$ of each group and the center points $c^j$ of the remaining $k-1$ groups by

$$D_{ij}=\left\|c^i-c^j\right\|_2,\qquad i\neq j,\; i=1,2,\ldots,k,\; j=1,2,3,\ldots,k;$$
step five: for the center point $c^i$ of any group $i$, find from the distances obtained in step four the group $j$ with the smallest $D_{ij}$, namely

$$j^{*}=\mathop{\arg\min}_{j\neq i}D_{ij},\qquad i=1,2,3,\ldots,k,$$

denote the feature vector of the center point of this group $j^{*}$ by $c^{j^{*}}$, and let their Euclidean distance be $D_i^{\min}=\left\|c^i-c^{j^{*}}\right\|_2$;
Step six: obtained according to step three
Figure FDA00035101292200000115
D obtained in step fourijAnd obtained in step five
Figure FDA00035101292200000116
The constructive loss function is:
Figure FDA00035101292200000117
step seven: optimize the parameters of the convolutional neural network by stochastic gradient descent according to the loss function obtained in step six, and repeat the above steps until the loss function in step six no longer decreases or has stabilized, thereby obtaining the convolutional neural network containing the corrected convolution parameters.
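By way of illustration, steps one to six above can be sketched in NumPy as follows. This is a minimal sketch assuming a triplet-center-style hinge loss with a margin constant (the exact loss expression appears in the patent only as a figure), and the function name, margin value and test data are illustrative assumptions rather than part of the claim:

```python
import numpy as np

def correction_loss(groups, margin=1.0):
    """groups: list of k arrays, each (n, d) -- the feature vectors of one class.
    Implements steps two to six: class centers, within-class distances d_j^i,
    between-center distances D_ij, nearest foreign center distance D_i_min,
    and the hinge-style loss sum_i sum_j max(d_j^i + margin - D_i_min, 0).
    The margin value is an assumption; the patent text leaves it implicit."""
    k = len(groups)
    centers = np.stack([g.mean(axis=0) for g in groups])          # step two
    loss = 0.0
    for i in range(k):
        d = np.linalg.norm(groups[i] - centers[i], axis=1)        # step three
        D = np.linalg.norm(centers - centers[i], axis=1)          # step four
        D_min = np.min(np.delete(D, i))                           # step five
        loss += np.maximum(d + margin - D_min, 0.0).sum()         # step six
    return loss

# Two tight, well-separated clusters give zero loss; two coincident
# clusters give one margin unit of loss per sample.
separated = [np.zeros((2, 2)), np.array([[10.0, 0.0], [10.0, 0.0]])]
overlapping = [np.zeros((2, 2)), np.zeros((2, 2))]
loss_sep = correction_loss(separated)
loss_same = correction_loss(overlapping)
```

Step seven would then minimize this loss over the convolution parameters by stochastic gradient descent.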
2. The local image feature extraction method according to claim 1, wherein the acquiring of the local image includes:
receiving an overall image;
receiving local information for feature extraction;
and cropping the image according to the local information to obtain the local image.
3. A partial image feature extraction device characterized by comprising:
a correction unit, configured to correct convolution parameters in a convolutional neural network to obtain corrected convolution parameters;
an acquisition unit, configured to acquire a local image;
a feature extraction unit, configured to extract feature information of the local image according to the corrected convolution parameters based on the convolutional neural network to obtain a feature vector;
the correction unit is specifically configured to correct the convolution parameter by the following steps:
step one: the k classes output by the convolutional neural network are denoted $A_1, A_2, A_3, \ldots, A_k$, where each classification group $i$ contains $n$ feature vectors $f_1^i, f_2^i, \ldots, f_n^i$;
step two: calculate the center point $c^i$ of each classification group by

$$c^i=\frac{1}{n}\sum_{j=1}^{n}f_j^i,\qquad i=1,2,3,\ldots,k,$$

where $i$ is the number of the classification group and $j$ is the number of the feature vector within each classification group;
step three: calculate the Euclidean distance $d_j^i$ between every point $f_j^i$ in each group and the group center point $c^i$ by

$$d_j^i=\left\|f_j^i-c^i\right\|_2,\qquad i=1,2,\ldots,k,\; j=1,2,\ldots,n;$$
step four: calculate the Euclidean distance $D_{ij}$ between the center point $c^i$ of each group and the center points $c^j$ of the remaining $k-1$ groups by

$$D_{ij}=\left\|c^i-c^j\right\|_2,\qquad i\neq j,\; i=1,2,\ldots,k,\; j=1,2,3,\ldots,k;$$
step five: for the center point $c^i$ of any group $i$, find from the distances obtained in step four the group $j$ with the smallest $D_{ij}$, namely

$$j^{*}=\mathop{\arg\min}_{j\neq i}D_{ij},\qquad i=1,2,3,\ldots,k,$$

denote the feature vector of the center point of this group $j^{*}$ by $c^{j^{*}}$, and let their Euclidean distance be $D_i^{\min}=\left\|c^i-c^{j^{*}}\right\|_2$;
Step six: obtained according to step three
Figure FDA0003510129220000034
D obtained in step fourijAnd obtained in step five
Figure FDA0003510129220000035
The constructive loss function is:
Figure FDA0003510129220000036
step seven: optimize the parameters of the convolutional neural network by stochastic gradient descent according to the loss function obtained in step six, and repeat the above steps until the loss function in step six no longer decreases or has stabilized, thereby obtaining the convolutional neural network containing the corrected convolution parameters.
4. A partial image feature extraction system, comprising:
a memory for storing a computer program;
a processor for implementing the steps of the local image feature extraction method according to any one of claims 1 to 2 when executing the computer program.
5. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a program which, when being executed by a processor, realizes the steps of the local image feature extraction method according to any one of claims 1 to 2.
CN201810870557.9A 2018-08-02 2018-08-02 Local image feature extraction method, device and system and readable storage medium Active CN108921181B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810870557.9A CN108921181B (en) 2018-08-02 2018-08-02 Local image feature extraction method, device and system and readable storage medium

Publications (2)

Publication Number Publication Date
CN108921181A CN108921181A (en) 2018-11-30
CN108921181B true CN108921181B (en) 2022-05-10

Family

ID=64394361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810870557.9A Active CN108921181B (en) 2018-08-02 2018-08-02 Local image feature extraction method, device and system and readable storage medium

Country Status (1)

Country Link
CN (1) CN108921181B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109567600B (en) * 2018-12-05 2020-12-01 江西书源科技有限公司 Automatic accessory identification method for household water purifier
WO2020133510A1 (en) * 2018-12-29 2020-07-02 深圳迈瑞生物医疗电子股份有限公司 Ultrasonic imaging method and device
CN109829350A (en) * 2019-02-20 2019-05-31 北京意锐新创科技有限公司 The two dimensional code method of payment and device of compatible for wired and wireless two-channel access public network
CN109977717A (en) * 2019-03-18 2019-07-05 北京意锐新创科技有限公司 Bar code method of payment and device based on light and shade code
CN109978090A (en) * 2019-03-18 2019-07-05 北京意锐新创科技有限公司 The method of payment and device combined based on two dimensional code and private mark
CN113505740B (en) * 2021-07-27 2023-10-10 北京工商大学 Face recognition method based on transfer learning and convolutional neural network

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101369316A (en) * 2008-07-09 2009-02-18 东华大学 Image characteristics extraction method based on global and local structure amalgamation
CN101853398A (en) * 2010-05-11 2010-10-06 浙江大学 Chinese paper cutting identification method based on space constraint characteristic selection and combination thereof

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
CN106503743B (en) * 2016-10-31 2020-04-17 天津大学 Self-adaptive clustering method for local feature points of images with large number and high dimension
CN107403166B (en) * 2017-08-02 2021-01-26 广东工业大学 Method and device for extracting pore characteristics of face image


Non-Patent Citations (2)

Title
Triplet-Center Loss for Multi-View 3D Object Retrieval; Xinwei He et al; 《arXiv.org》; 20180316; see pages 1-11 *
Working hard to know your neighbor's margins: Local descriptor learning loss; Anastasiya Mishchuk et al; 《arXiv.org》; 20170530; see pages 1-10 *

Also Published As

Publication number Publication date
CN108921181A (en) 2018-11-30

Similar Documents

Publication Publication Date Title
CN108921181B (en) Local image feature extraction method, device and system and readable storage medium
CN108875522B (en) Face clustering method, device and system and storage medium
Manap et al. Non-distortion-specific no-reference image quality assessment: A survey
CN109815770B (en) Two-dimensional code detection method, device and system
US9727775B2 (en) Method and system of curved object recognition using image matching for image processing
JP5594852B2 (en) Histogram method and system for object recognition
CN110826519A (en) Face occlusion detection method and device, computer equipment and storage medium
CA3066029A1 (en) Image feature acquisition
WO2020024744A1 (en) Image feature point detecting method, terminal device, and storage medium
CN105551022B (en) A kind of image error matching inspection method based on shape Interactive matrix
CN109117854B (en) Key point matching method and device, electronic equipment and storage medium
WO2021258699A1 (en) Image identification method and apparatus, and electronic device and computer-readable medium
CN111831844A (en) Image retrieval method, image retrieval device, image retrieval apparatus, and medium
Sil et al. Convolutional neural networks for noise classification and denoising of images
CN110532413A (en) Information retrieval method, device based on picture match, computer equipment
CN112464803A (en) Image comparison method and device
CN110516731B (en) Visual odometer feature point detection method and system based on deep learning
CN104268550B (en) Feature extracting method and device
CN112149601A (en) Occlusion-compatible face attribute identification method and device and electronic equipment
CN111062927A (en) Method, system and equipment for detecting image quality of unmanned aerial vehicle
CN113128518B (en) Sift mismatch detection method based on twin convolution network and feature mixing
CN114022531A (en) Image processing method, electronic device, and storage medium
CN112017221B (en) Multi-modal image registration method, device and equipment based on scale space
CN113435479A (en) Feature point matching method and system based on regional feature expression constraint
CN111339884A (en) Image recognition method and related equipment and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant