CN109978072A - A character comparison method and comparison system based on deep learning - Google Patents

A character comparison method and comparison system based on deep learning

Info

Publication number
CN109978072A
CN109978072A CN201910266072.3A
Authority
CN
China
Prior art keywords
character block
character
feature vector
distance
deep learning
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910266072.3A
Other languages
Chinese (zh)
Inventor
胥志伟
石志君
张瑜
王胜科
王亚平
吕昕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong Weiran Intelligent Technology Co.,Ltd.
Original Assignee
Qingdao Accompanying Star Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Accompanying Star Intelligent Technology Co Ltd
Priority to CN201910266072.3A
Publication of CN109978072A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G06F 18/22 Matching criteria, e.g. proximity measures
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/46 Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V 10/462 Salient features, e.g. scale invariant feature transforms [SIFT]
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V 10/757 Matching configurations of points or features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Image Analysis (AREA)
  • Character Discrimination (AREA)

Abstract

The present invention relates to the field of deep learning and proposes a character comparison method based on deep learning, comprising the following steps: obtaining the feature vectors of a drawing character block and a physical character block; calculating the distance value between the drawing character-block feature vector and the physical character-block feature vector; and obtaining a character-block similarity result according to that distance value. Through neural-network feature extraction and similarity measurement, the present invention can automatically compare drawing characters with physically marked characters and check for defects such as missed engraving and mis-engraving, which greatly reduces manual comparison and the waste of human resources while improving the precision and efficiency of the comparison. The invention also provides a character comparison system based on deep learning.

Description

A character comparison method and comparison system based on deep learning
Technical field
The present invention relates to the field of deep-learning technology, and in particular to a character comparison method and comparison system based on deep learning.
Background technique
In manufacturing it is often necessary to mark characters on the surface of a workpiece according to a drawing. During mechanized production in particular, accidental events such as power loss, jamming, or breakage of the marking tool can cause broken characters, missed characters, and marking mistakes during imprinting. The drawing must therefore be compared with the physically marked characters at the quality-inspection stage to detect these defects.
At present, however, the comparison and inspection work of the quality-inspection stage is usually undertaken manually, which not only consumes a large amount of human resources but is also extremely inefficient.
With the deepening study of machine-learning techniques, how to use deep-learning technology to replace manual inspection with machine-based visual character comparison has become an urgent problem to be solved.
Summary of the invention
In view of the above deficiencies of the prior art, the purpose of the present invention is to provide a character comparison method and comparison system based on deep learning: the acquired drawing character blocks and physical character-image blocks are input into the system, a similarity measurement is produced by deep learning, a match is judged against a threshold, and defects such as missed engraving and mis-engraving are thereby detected.
The technical scheme of the present invention is realized as follows:
A character comparison method based on deep learning, comprising the following steps: obtaining the feature vectors of a drawing character block and a physical character block; calculating the distance value between the drawing character-block feature vector and the physical character-block feature vector; obtaining a character-block similarity result according to that distance value.
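The three claimed steps can be sketched end to end as follows. This is a minimal illustration, not the patent's implementation: `extract` and `distance` stand in for the feature-vector extraction model and the fitted distance described later, and the threshold default is an arbitrary placeholder.

```python
def compare_character_blocks(extract, drawing_block, physical_block,
                             distance, threshold=0.5):
    """End-to-end sketch of the three claimed steps (names are illustrative)."""
    v_drawing = extract(drawing_block)     # step 1: feature vectors of both blocks
    v_physical = extract(physical_block)
    d = distance(v_drawing, v_physical)    # step 2: distance value between them
    return d < threshold                   # step 3: similarity result

# toy usage with an identity "extractor" and absolute difference as distance
print(compare_character_blocks(lambda x: x, 1.0, 1.1,
                               lambda a, b: abs(a - b)))  # → True
```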
Optionally, the step of obtaining the feature vectors of the drawing character block and the physical character block comprises: calling a pre-trained feature-vector extraction model to process the input drawing character-block image or physical character-block image.
Optionally, the feature-vector extraction model operates as follows: after the physical character-block image or drawing character-block image is input into the model, it passes in turn through down-sampling convolutional layers of sizes 64×64, 32×32, 16×16, 8×8 and 1×1 to obtain a multidimensional feature matrix, and an up-sampling process then follows, in which five up-sampling convolutional layers of sizes 8×8, 16×16, 32×32, 64×64 and 1×1 alternate with four merge layers. A merge layer performs a multidimensional element-wise summation of the feature matrix output by the previous convolutional layer and the output of the down-sampling convolutional layer of the same size, then, while keeping the size of the third (channel) dimension of the feature matrix, expands the first and second dimensions to twice their original size before the result is input into the next up-sampling convolutional layer. A 64×64 feature matrix is finally obtained and then flattened into the feature vector.
Optionally, the step of calculating the distance value between the drawing character-block feature vector and the physical character-block feature vector comprises: calculating the Euclidean distance and the cosine distance between the drawing character-block feature vector and the physical character-block feature vector, and fitting the Euclidean distance and the cosine distance according to a weight ratio to obtain the final distance value.
Optionally, the character comparison method based on deep learning further includes a preprocessing step, which comprises: preprocessing the drawing character block and the physical character block to be compared, scaling the character blocks to the unified standard pixel size of the network input, and binarizing special character blocks; the feature vectors are extracted after preprocessing.
The invention also provides a character comparison system based on deep learning, which calls a pre-trained twin (Siamese) network model to perform the similarity measurement. The twin network model includes a feature-vector extraction module and a similarity measurement module: the feature-vector extraction module obtains the drawing character-block feature vector and the physical character-block feature vector, and the similarity measurement module calculates the distance value between them. The system further includes a threshold judgment module, which obtains the character-block similarity result according to the distance value between the drawing character-block feature vector and the physical character-block feature vector.
Optionally, the feature-vector extraction module is a dual-input module that takes a picture pair consisting of a physical character-block image and a drawing character-block image; during training, the twin network model must simultaneously satisfy the feature-extraction requirements of both the physical character-block images and the drawing character-block images.
Optionally, the similarity measurement module includes a Euclidean-distance calculation unit, a cosine-distance calculation unit and a combination unit. The Euclidean-distance calculation unit calculates the Euclidean distance between the drawing character-block feature vector and the physical character-block feature vector by the Euclidean distance formula; the cosine-distance calculation unit calculates the cosine distance between them by the cosine distance formula; and the combination unit fits the Euclidean distance and the cosine distance into the final distance value according to a weight ratio by the combination formula.
Optionally, the training process of the twin network model is as follows. First, an initial twin network model is built. Then the parameters of the model are trained on a training set containing matching, fault-free character pairs and mismatched character pairs with engraving mistakes, together with matching and non-matching labels. After every set number of training steps, the judgment of each character pair on a verification set is checked for correctness and the total accuracy is calculated. When the accuracy reaches a set threshold, training of the twin network model stops; if the accuracy still fails to reach the preset requirement after the set total number of training iterations, the number of layers of the initial twin network model is adjusted and training is restarted.
Optionally, the character comparison system based on deep learning further includes a preprocessing module, which first preprocesses the character blocks to be compared: they are scaled to the unified standard pixel size of the network input, and special character blocks are binarized. The preprocessed character blocks are then input into the feature-vector extraction module.
Optionally, the feature-vector extraction module operates as follows: after the physical character-block image or drawing character-block image is input into the module, it passes in turn through down-sampling convolutional layers of sizes 64×64, 32×32, 16×16, 8×8 and 1×1 to obtain a multidimensional feature matrix, and an up-sampling process then follows, in which five up-sampling convolutional layers of sizes 8×8, 16×16, 32×32, 64×64 and 1×1 alternate with four merge layers. A merge layer performs a multidimensional element-wise summation of the feature matrix output by the previous convolutional layer and the output of the down-sampling convolutional layer of the same size, then, while keeping the size of the third dimension of the feature matrix, expands the first and second dimensions to twice their original size before the result is input into the next up-sampling convolutional layer. A 64×64 feature matrix is finally obtained and then flattened into the feature vector.
The beneficial effects of the present invention are:
(1) drawing characters and physically marked characters can be compared automatically, checking for defects such as missed engraving and mis-engraving;
(2) manual comparison can be greatly reduced, reducing the waste of human resources while improving the precision and efficiency of the comparison.
Brief description of the drawings
Fig. 1 is a flow diagram of an optional implementation of the character comparison method based on deep learning of the present invention;
Fig. 2 is a structural diagram of an optional implementation of the feature-vector extraction model;
Fig. 3 is a structural diagram of an optional implementation of the merge layer;
Fig. 4 is a flow diagram of another optional implementation of the character comparison method based on deep learning of the present invention;
Fig. 5 is a structural block diagram of an optional implementation of the character comparison system based on deep learning of the present invention;
Fig. 6 is a structural diagram of an optional implementation of the feature-vector extraction module.
Specific embodiment
The preferred embodiments of the present invention are described in detail below with reference to the accompanying drawings, so that the advantages and features of the invention can be more easily understood by those skilled in the art and the protection scope of the present invention can be defined more clearly.
Fig. 1 shows an alternative embodiment of the character comparison method based on deep learning.
In this alternative embodiment, the character comparison method based on deep learning comprises the following steps:
Step 11: obtain the feature vectors of the drawing character block and the physical character block.
Step 12: calculate the distance value between the drawing character-block feature vector and the physical character-block feature vector.
Step 13: obtain the character-block similarity result according to the distance value between the drawing character-block feature vector and the physical character-block feature vector. If the distance value is less than the threshold, the drawing character block and the physical character block are considered to match: any marking deviation is within the acceptable range, and the pair is judged similar. If the distance value is greater than or equal to the threshold, the drawing character block and the physical character block are considered not to match: a marking error has occurred, and the pair is judged dissimilar, for example when broken engraving or extra engraving occurs during the imprinting process.
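The threshold judgment of step 13 can be sketched as below. The function name and the string labels are illustrative, not from the patent; the patent only specifies the comparison of the distance value against a threshold.

```python
def judge_similarity(distance: float, threshold: float) -> str:
    """Return the comparison verdict for one character-block pair (a sketch of
    the step-13 threshold judgment; labels are illustrative)."""
    if distance < threshold:
        # Feature vectors are close: the drawing and physical character match,
        # and any engraving deviation is within the acceptable range.
        return "similar"
    # Distance at or above the threshold: mismatch, e.g. broken or extra
    # engraving occurred during imprinting.
    return "dissimilar"

print(judge_similarity(0.2, 0.5))  # → similar
print(judge_similarity(0.9, 0.5))  # → dissimilar
```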
Optionally, step 11 of obtaining the feature vectors of the drawing character block and the physical character block comprises: calling a pre-trained feature-vector extraction model to process the input drawing character-block image or physical character-block image.
Fig. 2 shows an alternative embodiment of the feature-vector extraction model.
In this alternative embodiment, after the physical character-block image or drawing character-block image is input into the feature-vector extraction model, it passes in turn through down-sampling convolutional layers of sizes 64×64, 32×32, 16×16, 8×8 and 1×1 to obtain a multidimensional feature matrix, and an up-sampling process then follows. The up-sampling process consists of five up-sampling convolutional layers (8×8, 16×16, 32×32, 64×64, 1×1) alternating with four merge layers. The structure of the merge layer is shown in Fig. 3: it performs a multidimensional element-wise summation of the feature matrix output by the previous convolutional layer and the output of the down-sampling convolutional layer of the same size, then, while keeping the size of the third dimension of the feature matrix, expands the first and second dimensions (length and width) to twice their original size before the result is input into the next up-sampling convolutional layer. The feature-vector extraction model finally obtains a 64×64 feature matrix, which is then flattened into the feature vector.
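The merge layer described above can be sketched in NumPy. The element-wise summation of same-size feature matrices and the doubling of the first two (length and width) dimensions follow the text; the use of nearest-neighbour repetition for the doubling is an assumption, since the patent does not specify the up-sampling interpolation.

```python
import numpy as np

def merge_layer(up_feat: np.ndarray, skip_feat: np.ndarray) -> np.ndarray:
    """Sketch of the merge layer (assumed semantics).

    up_feat:   (H, W, C) feature matrix from the previous up-sampling layer.
    skip_feat: (H, W, C) output of the same-size down-sampling layer.
    Returns a (2H, 2W, C) matrix: element-wise sum, then the first and second
    dimensions doubled while the third (channel) dimension is kept.
    """
    summed = up_feat + skip_feat  # multidimensional element-wise summation
    # double length and width by nearest-neighbour repetition (assumed choice)
    return np.repeat(np.repeat(summed, 2, axis=0), 2, axis=1)

x = np.ones((8, 8, 16))
y = merge_layer(x, x)
print(y.shape)  # → (16, 16, 16)
```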
Fig. 3 shows an alternative embodiment of the merge layer.
In this alternative embodiment, the merge layer comprises an element-summation unit 31 and an up-sampling unit 32.
Optionally, step 12 of calculating the distance value between the drawing character-block feature vector and the physical character-block feature vector comprises: calculating the Euclidean distance and the cosine distance between the drawing character-block feature vector and the physical character-block feature vector, and fitting the Euclidean distance and the cosine distance according to a weight ratio to obtain the final distance value.
Wherein, the Euclidean distance formula is:

d_E(X_A, X_B) = sqrt( Σ_{i=1}^{n} (x_Ai - x_Bi)² )   (1)

and the cosine distance formula is:

d_cos(X_A, X_B) = 1 - cos θ = 1 - ( Σ_{i=1}^{n} x_Ai x_Bi ) / ( sqrt(Σ_{i=1}^{n} x_Ai²) · sqrt(Σ_{i=1}^{n} x_Bi²) )   (2)

In the formulas, X_A is the drawing character-block feature vector, X_B is the physical character-block feature vector, θ is the angle between X_A and X_B, x_Ai is an element of X_A, x_Bi is an element of X_B, and n is the number of elements in each vector.
Then the Euclidean distance and the cosine distance are fitted according to the combination formula (3) to obtain the final distance value:

D = α d_E + β d_cos   (3)

Wherein α is the weight of the Euclidean distance, β is the weight of the cosine distance, and the specific values of α and β are chosen according to the actual situation.
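The weighted combination of Euclidean and cosine distance can be sketched as follows. Since the original formula images are not reproduced in the text, the convention d_cos = 1 - cos θ is an assumption, as are the default weights α = β = 0.5.

```python
import numpy as np

def combined_distance(x_a, x_b, alpha=0.5, beta=0.5):
    """Euclidean + cosine distance fitted by weights, per formulas (1)-(3).

    The cosine-distance convention (1 - cos θ) and the default weights are
    assumptions; the patent leaves α and β to be chosen per situation.
    """
    x_a, x_b = np.asarray(x_a, float), np.asarray(x_b, float)
    d_euc = np.sqrt(np.sum((x_a - x_b) ** 2))                       # formula (1)
    cos_theta = x_a @ x_b / (np.linalg.norm(x_a) * np.linalg.norm(x_b))
    d_cos = 1.0 - cos_theta                                         # formula (2)
    return alpha * d_euc + beta * d_cos                             # formula (3)

print(combined_distance([1, 0], [1, 0]))  # identical vectors → 0.0
```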
Fig. 4 shows another alternative embodiment of the character comparison method based on deep learning.
In this alternative embodiment, the character comparison method based on deep learning further includes preprocessing step 10, which comprises: preprocessing the drawing character block and the physical character block to be compared, scaling the character blocks to the unified standard pixel size of the network input, and binarizing special character blocks (for example, physical character blocks with excessive background noise); feature-vector extraction is carried out after preprocessing.
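The preprocessing step can be sketched as below. The 64-pixel standard size and the fixed binarization threshold are illustrative assumptions (the patent does not give concrete values, and in practice an adaptive threshold such as Otsu's method might be preferred for noisy physical blocks).

```python
import numpy as np

def preprocess(block: np.ndarray, size: int = 64, binarize: bool = False,
               thresh: int = 128) -> np.ndarray:
    """Scale a grayscale character block to the network's standard input size
    and optionally binarize it (sketch; size and threshold are assumptions)."""
    h, w = block.shape
    rows = np.arange(size) * h // size   # nearest-neighbour row index map
    cols = np.arange(size) * w // size   # nearest-neighbour column index map
    scaled = block[np.ix_(rows, cols)]
    if binarize:
        # suppress background noise by mapping each pixel to pure black/white
        scaled = np.where(scaled >= thresh, 255, 0).astype(block.dtype)
    return scaled

img = np.full((100, 80), 200, dtype=np.uint8)
out = preprocess(img, binarize=True)
print(out.shape)  # → (64, 64)
```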
Fig. 5 shows an alternative embodiment of the character comparison system based on deep learning.
In this alternative embodiment, the character comparison system based on deep learning calls a pre-trained twin network model to perform the similarity measurement. The twin network model includes feature-vector extraction module 1 and similarity measurement module 2: feature-vector extraction module 1 obtains the drawing character-block feature vector and the physical character-block feature vector, and similarity measurement module 2 calculates the distance value between them. Feature-vector extraction module 1 is a dual-input module that takes a picture pair consisting of a physical character-block image and a drawing character-block image; throughout training, the twin network model must satisfy the requirement that both kinds of images reach an excellent feature-extraction effect.
The character comparison system based on deep learning further includes threshold judgment module 3, which obtains the character-block similarity result according to the distance value between the drawing character-block feature vector and the physical character-block feature vector. If the distance value is less than the threshold, the drawing character and the physical character are considered to match: any marking deviation is within the acceptable range, and the pair is judged similar. If the distance value is greater than or equal to the threshold, the drawing character and the physical character are considered not to match: a marking error has occurred, and the pair is judged dissimilar.
Fig. 6 shows an alternative embodiment of the feature-vector extraction module.
In this alternative embodiment, after the physical character-block image or drawing character-block image is input into the feature-vector extraction module, it passes in turn through down-sampling convolutional layers of sizes 64×64, 32×32, 16×16, 8×8 and 1×1 to obtain a multidimensional feature matrix, and an up-sampling process then follows. The up-sampling process consists of five up-sampling convolutional layers (8×8, 16×16, 32×32, 64×64, 1×1) alternating with four merge layers. The structure of the merge layer is shown in Fig. 3: it performs a multidimensional element-wise summation of the feature matrix output by the previous convolutional layer and the output of the down-sampling convolutional layer of the same size, then, while keeping the size of the third dimension of the feature matrix, expands the first and second dimensions (length and width) to twice their original size before the result is input into the next up-sampling convolutional layer. The feature-vector extraction module finally obtains a 64×64 feature matrix, which is then flattened into the feature vector.
Optionally, similarity measurement module 2 includes a Euclidean-distance calculation unit, a cosine-distance calculation unit and a combination unit. The Euclidean-distance calculation unit calculates the Euclidean distance between the drawing character-block feature vector and the physical character-block feature vector by the Euclidean distance formula; the cosine-distance calculation unit calculates the cosine distance between them by the cosine distance formula; and the combination unit fits the Euclidean distance and the cosine distance into the final distance value according to a weight ratio by the combination formula.
Wherein, the Euclidean distance formula is:

d_E(X_A, X_B) = sqrt( Σ_{i=1}^{n} (x_Ai - x_Bi)² )   (1)

and the cosine distance formula is:

d_cos(X_A, X_B) = 1 - cos θ = 1 - ( Σ_{i=1}^{n} x_Ai x_Bi ) / ( sqrt(Σ_{i=1}^{n} x_Ai²) · sqrt(Σ_{i=1}^{n} x_Bi²) )   (2)

In the formulas, X_A is the drawing character-block feature vector, X_B is the physical character-block feature vector, θ is the angle between X_A and X_B, x_Ai is an element of X_A, x_Bi is an element of X_B, and n is the number of elements in each vector.
The combination formula is:

D = α d_E + β d_cos   (3)

Wherein α is the weight of the Euclidean distance, β is the weight of the cosine distance, and the specific values of α and β are chosen according to the actual situation.
Optionally, the training process of the twin network model is as follows. First, an initial twin network model is built, including the arrangement of the convolutional layers, pooling layers and fully connected layers, and the initialization of the parameters of each layer. Then the network parameters are trained on a training set containing matching, fault-free character pairs and mismatched character pairs with engraving mistakes, together with matching and non-matching labels. After every set number of training steps, the judgment of each character pair on a verification set is checked for correctness and the total accuracy is calculated. When the accuracy reaches the set threshold, training of the model can stop; if the accuracy still fails to reach the preset requirement after the set total number of training iterations, the number of layers of the initial network is adjusted and training is restarted. Optionally, the verification set is broadly similar to the training set but contains relatively fewer samples.
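The training schedule above (periodic verification-set evaluation, early stop on accuracy, retrain signal otherwise) can be sketched framework-agnostically. Here `model`, `train_step` and all numeric defaults are assumed interfaces and placeholder values, not the patent's actual hyperparameters: `model` maps a pair to a distance, and a pair counts as correctly judged when "distance below threshold" agrees with its match label.

```python
def validation_accuracy(model, pairs, labels, threshold):
    """Fraction of verification-set pairs judged correctly: a pair is
    predicted matching when its distance is below the threshold."""
    correct = sum(
        int((model(a, b) < threshold) == bool(label))
        for (a, b), label in zip(pairs, labels)
    )
    return correct / len(labels)

def train(model, train_step, pairs, labels, *, eval_every=100,
          target_acc=0.95, max_steps=10_000, dist_threshold=0.5):
    """Sketch of the training schedule (all defaults are illustrative).
    Returns (final accuracy, whether the target accuracy was reached)."""
    acc = 0.0
    for step in range(1, max_steps + 1):
        train_step(step)                       # one optimisation step
        if step % eval_every == 0:             # periodic verification check
            acc = validation_accuracy(model, pairs, labels, dist_threshold)
            if acc >= target_acc:              # target reached: stop training
                return acc, True
    # target not reached within max_steps: the caller should adjust the
    # network depth and retrain
    return acc, False

# toy usage: a fixed "model" whose distance already separates the classes
toy_model = lambda a, b: abs(a - b)
pairs = [(0.0, 0.1), (0.0, 2.0)]
labels = [1, 0]  # 1 = matching pair, 0 = mismatched pair
print(train(toy_model, lambda s: None, pairs, labels, eval_every=1))  # → (1.0, True)
```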
In another alternative embodiment, the character comparison system based on deep learning further includes a preprocessing module, which first preprocesses the character blocks to be compared: they are scaled to the unified standard pixel size of the network input, and special character blocks (for example, physical character blocks with excessive background noise) are binarized. The preprocessed character blocks are then input into feature-vector extraction module 1.
In conclusion the method and system of the embodiment of the present disclosure is extracted by neural network characteristics and similarity measurement, Automatically drawing character and character in kind can be compared, search whether water clock occur, wrong situations such as carving, can greatly reduced Artificial comparison, reduces the waste of human resources, while improving the precision and efficiency of comparison.
The above is only an embodiment of the present invention and is not intended to limit the scope of the invention. Any equivalent structure or equivalent process transformation made using the contents of the present specification and accompanying drawings, applied directly or indirectly in other related technical fields, is likewise included within the protection scope of the present invention.

Claims (10)

1. A character comparison method based on deep learning, characterized by comprising the following steps:
obtaining the feature vectors of a drawing character block and a physical character block;
calculating the distance value between the drawing character-block feature vector and the physical character-block feature vector;
obtaining a character-block similarity result according to the distance value between the drawing character-block feature vector and the physical character-block feature vector.
2. The character comparison method based on deep learning according to claim 1, characterized in that the step of obtaining the feature vectors of the drawing character block and the physical character block comprises: calling a pre-trained feature-vector extraction model to process the input drawing character-block image or physical character-block image.
3. The character comparison method based on deep learning according to claim 1, characterized in that the step of calculating the distance value between the drawing character-block feature vector and the physical character-block feature vector comprises: calculating the Euclidean distance and the cosine distance between the drawing character-block feature vector and the physical character-block feature vector, and fitting the Euclidean distance and the cosine distance according to a weight ratio to obtain the final distance value.
4. The character comparison method based on deep learning according to claim 1, characterized by further comprising a preprocessing step, which comprises: preprocessing the drawing character block and the physical character block to be compared, scaling the character blocks to the unified standard pixel size of the network input, and binarizing special character blocks; the feature vectors are extracted after preprocessing.
5. A character comparison system based on deep learning, characterized in that a pre-trained twin network model is called to perform a similarity measurement, the twin network model including a feature-vector extraction module and a similarity measurement module;
the feature-vector extraction module is used to obtain a drawing character-block feature vector and a physical character-block feature vector;
the similarity measurement module is used to calculate the distance value between the drawing character-block feature vector and the physical character-block feature vector;
the system further includes a threshold judgment module, which is used to obtain a character-block similarity result according to the distance value between the drawing character-block feature vector and the physical character-block feature vector.
6. The character comparison system based on deep learning according to claim 5, characterized in that the feature-vector extraction module is a dual-input module that takes a picture pair consisting of a physical character-block image and a drawing character-block image, and during training the twin network model must simultaneously satisfy the feature-extraction requirements of the physical character-block images and the drawing character-block images.
7. The character comparison system based on deep learning according to claim 5, characterized in that the similarity measurement module includes a Euclidean-distance calculation unit, a cosine-distance calculation unit and a combination unit;
the Euclidean-distance calculation unit calculates the Euclidean distance between the drawing character-block feature vector and the physical character-block feature vector by the Euclidean distance formula;
the cosine-distance calculation unit calculates the cosine distance between the drawing character-block feature vector and the physical character-block feature vector by the cosine distance formula;
the combination unit fits the Euclidean distance and the cosine distance into the final distance value according to a weight ratio by the combination formula.
8. The character comparison system based on deep learning according to claim 5, characterized in that the training process of the twin network model comprises: first, building an initial twin network model; then training the parameters of the twin network model with a training set that contains matching, fault-free character pairs and mismatched character pairs with engraving mistakes, together with matching and non-matching labels; after every set number of training steps, checking on a verification set whether the judgment of each character pair is correct and calculating the total accuracy; when the accuracy reaches a set threshold, stopping the training of the twin network model; and if the accuracy still fails to reach the preset requirement after the set total number of training iterations, adjusting the number of layers of the initial twin network model and retraining.
9. The character comparison system based on deep learning according to claim 5, characterized by further comprising a preprocessing module, which first preprocesses the character blocks to be compared: they are scaled to the unified standard pixel size of the network input, and special character blocks are binarized; the preprocessed character blocks are input into the feature-vector extraction module.
10. The character comparison system based on deep learning as claimed in claim 5, wherein the feature vector extraction module operates as follows: after a physical character block image or a drawing character block image is input to the feature vector extraction module, it passes successively through down-sampling convolutional layers of 64×64, 32×32, 16×16, 8×8 and 1×1 to obtain a multidimensional feature matrix; an up-sampling process then follows, in which the feature matrix passes through five up-sampling convolutional layers of 8×8, 16×16, 32×32, 64×64 and 1×1, alternating with four merge layers; each merge layer performs an element-wise sum of the feature matrix output by the previous convolutional layer and the result of the down-sampling convolutional layer of the same size, then doubles the first and second dimensions while keeping the size of the third dimension of the feature matrix unchanged, and inputs the result to the next up-sampling convolutional layer; a 64×64 feature matrix is finally obtained and is then flattened into a feature vector.
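The merge-layer step of claim 10 — element-wise sum with the same-size down-sampling output, then doubling the first two dimensions while keeping the third — can be sketched in NumPy. The H×W×C layout and nearest-neighbour up-scaling are assumptions; the claim does not fix either.

```python
import numpy as np

def merge_layer(up_features, skip_features):
    """Claim-10 merge layer (NumPy sketch, assuming H x W x C layout).

    Element-wise sum of the up-sampling path's feature matrix with the
    same-size feature matrix from the down-sampling path, then double
    dimensions 1 and 2 (here by nearest-neighbour repetition) while
    keeping the third dimension unchanged."""
    assert up_features.shape == skip_features.shape
    summed = up_features + skip_features           # element-wise sum
    # double the first and second dimensions, keep the channel dimension
    doubled = summed.repeat(2, axis=0).repeat(2, axis=1)
    return doubled
```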
CN201910266072.3A 2019-04-03 2019-04-03 A kind of character comparison method and Compare System based on deep learning Pending CN109978072A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910266072.3A CN109978072A (en) 2019-04-03 2019-04-03 A kind of character comparison method and Compare System based on deep learning


Publications (1)

Publication Number Publication Date
CN109978072A 2019-07-05

Family

ID=67082640

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910266072.3A Pending CN109978072A (en) 2019-04-03 2019-04-03 A kind of character comparison method and Compare System based on deep learning

Country Status (1)

Country Link
CN (1) CN109978072A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180165548A1 (en) * 2015-07-30 2018-06-14 Beijing Sensetime Technology Development Co., Ltd Systems and methods for object tracking
CN108228915A (en) * 2018-03-29 2018-06-29 华南理工大学 A kind of video retrieval method based on deep learning
CN108364463A (en) * 2018-01-30 2018-08-03 重庆交通大学 A kind of prediction technique and system of the magnitude of traffic flow
CN108918536A (en) * 2018-07-13 2018-11-30 广东工业大学 Tire-mold face character defect inspection method, device, equipment and storage medium
CN108960245A (en) * 2018-07-13 2018-12-07 广东工业大学 The detection of tire-mold character and recognition methods, device, equipment and storage medium
CN109299462A (en) * 2018-09-20 2019-02-01 武汉理工大学 Short text similarity calculating method based on multidimensional convolution feature

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111144438A (en) * 2019-11-26 2020-05-12 苏州方正璞华信息技术有限公司 Method and device for detecting commodities in advertisement bill
CN111091144A (en) * 2019-11-27 2020-05-01 云南电网有限责任公司电力科学研究院 Image feature point matching method and device based on depth pseudo-twin network
CN111091144B (en) * 2019-11-27 2023-06-27 云南电网有限责任公司电力科学研究院 Image feature point matching method and device based on depth pseudo-twin network
CN113204974A (en) * 2021-05-14 2021-08-03 清华大学 Method, device and equipment for generating confrontation text and storage medium
CN113591698A (en) * 2021-07-30 2021-11-02 国网上海市电力公司 High-voltage cable accessory drawing comparison method and system based on image recognition

Similar Documents

Publication Publication Date Title
CN109978072A (en) A kind of character comparison method and Compare System based on deep learning
CN106600577B (en) A kind of method for cell count based on depth deconvolution neural network
CN105427298B (en) Remote sensing image registration method based on anisotropic gradient metric space
CN106295653B (en) Water quality image classification method
CN109993164A (en) A kind of natural scene character recognition method based on RCRNN neural network
CN106156684B (en) A kind of two-dimensional code identification method and device
CN115063573A (en) Multi-scale target detection method based on attention mechanism
CN107506765B (en) License plate inclination correction method based on neural network
CN109035251A (en) One kind being based on the decoded image outline detection method of Analysis On Multi-scale Features
CN110991359A (en) Satellite image target detection method based on multi-scale depth convolution neural network
CN114331869B (en) Dam face crack semantic segmentation method
CN103345760B (en) A kind of automatic generation method of medical image object shapes template mark point
CN102184404B (en) Method and device for acquiring palm region in palm image
CN109766838A (en) A kind of gait cycle detecting method based on convolutional neural networks
CN114549555A (en) Human ear image planning and division method based on semantic division network
CN114782770A (en) License plate detection and recognition method and system based on deep learning
CN109815957A (en) A kind of character recognition method based on color image under complex background
CN109271868B (en) Dense connection convolution network hypersphere embedding-based target re-identification method
CN110675421A (en) Depth image collaborative segmentation method based on few labeling frames
CN115797808A (en) Unmanned aerial vehicle inspection defect image identification method, system, device and medium
CN111339932A (en) Palm print image preprocessing method and system
CN110516615A (en) Human and vehicle shunting control method based on convolutional neural networks
CN106504211A (en) Based on the low-light-level imaging method for improving SURF characteristic matchings
CN116935175A (en) Parking space detection method based on multi-scale detection head feature fusion
CN107273793A (en) A kind of feature extracting method for recognition of face

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210112

Address after: 266000 Party-Masses Service Center, Malianzhuang Town, Laixi City, Qingdao City, Shandong Province

Applicant after: Shandong Weiran Intelligent Technology Co.,Ltd.

Address before: 266000 706-1, block B, Suning Plaza, 28 Jingkou Road, Licang District, Qingdao City, Shandong Province

Applicant before: QINGDAO BANXING INTELLIGENT TECHNOLOGY Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20190705