CN107230230A - Instrument image localization method based on a composite filter - Google Patents

Instrument image localization method based on a composite filter

Info

Publication number
CN107230230A
CN107230230A (application CN201710446051.0A)
Authority
CN
China
Prior art keywords
instrumentation
composite filter
localization method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201710446051.0A
Other languages
Chinese (zh)
Other versions
CN107230230B (en
Inventor
葛成伟
王�锋
程敏
赵伟
许春山
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yijiahe Technology Co Ltd
Original Assignee
Yijiahe Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yijiahe Technology Co Ltd filed Critical Yijiahe Technology Co Ltd
Priority to CN201710446051.0A priority Critical patent/CN107230230B/en
Publication of CN107230230A publication Critical patent/CN107230230A/en
Application granted granted Critical
Publication of CN107230230B publication Critical patent/CN107230230B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/75 - Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06T 5/70
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/80 - Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Abstract

The present invention provides an instrument image localization method based on a composite filter. A composite filter is obtained through advance training, yielding the training model parameters of the instrument image; these parameters are then used to obtain the coordinate point of the instrument in a panoramic image. The method can accurately determine the position of the instrument in the panoramic image for different instruments, changing environments, and different background conditions.

Description

Instrument image localization method based on a composite filter
Technical field
The invention belongs to the field of computer image processing and relates to an instrument image localization method, in particular to an instrument image localization method based on a composite filter.
Background technology
During instrument recognition, an electric power inspection robot must first locate the device region in a panoramic image, and then zoom the camera according to the located region to obtain a clear instrument image suitable for recognition. If the device cannot be located in the panoramic image, all subsequent operations fail; designing an accurate and efficient method for localizing instruments in panoramic images therefore has important practical significance.
Instrument image localization methods mainly include template matching, feature-point matching, and the like. Template matching computes the similarity between the image to be matched and a calibrated instrument template image using a similarity evaluation function; the higher the similarity, the more likely the region is the instrument region. Under normal conditions template matching can locate the instrument region accurately, but when the illumination changes or the background is complex it often produces multiple peaks and cannot locate the target accurately. Feature-point matching first extracts all feature points in the instrument image and the panoramic image, then performs pairwise matching and match verification to obtain the matched target location; it still works well under large illumination changes and complex backgrounds, but it fails when the instrument itself has few or no feature points, and its computational cost is usually large.
The content of the invention
To solve the problems of the prior art, the present invention provides an efficient and accurate instrument image localization method that can accurately determine the position of the instrument in a panoramic image for different instruments, changing environments, and different background conditions.
The instrument image localization method based on a composite filter provided by the present invention comprises the following steps:
Instrument model training:
Define a two-dimensional Gaussian response function g(x, y), where p(x0, y0) is the centre coordinate of the instrument in the panoramic image f(x, y), σ is the Gaussian response radius, 0 ≤ x ≤ M−1, 0 ≤ y ≤ N−1, and M, N are the width and height of the image, respectively. The discrete Fourier transform of g(x, y) can be expressed as
G(u, v) = Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} g(x, y) · exp(−j2π(xu/M + yv/N)),
where exp is the exponential function with the natural constant e as its base, j is the imaginary unit, x = 0, 1, …, M−1, y = 0, 1, …, N−1; u = 0, 1, …, M−1, v = 0, 1, …, N−1; u, v are non-negative integers. Define a convolution kernel h(x, y) such that
f(x, y) ⊗ h(x, y) = g(x, y)
holds, where ⊗ denotes two-dimensional convolution. By the two-dimensional convolution theorem, g(x, y) = F⁻¹(F(u, v) · H(u, v)), where F, H are the two-dimensional Fourier transforms of f and h respectively, F⁻¹ is the two-dimensional inverse Fourier transform operation, and "·" is the element-wise (dot-product) multiplication of two-dimensional matrices;
replacing the convolution operation with a correlation operation gives G(u, v) = F(u, v) · H*(u, v), where * denotes the complex conjugate of a matrix; the composite filter is thus defined as
H*(u, v) = G(u, v) · F*(u, v) / (F(u, v) · F*(u, v)),
and H*(u, v) is the training model parameter of the instrument image;
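For illustration only, the single-image training step described above can be sketched in Python (NumPy) as follows. The function names and the σ²-normalised form of the two-dimensional Gaussian are assumptions of this sketch (the text only states that the response peaks at 1 at the calibrated centre), not part of the claimed method.

    import numpy as np

    def gaussian_response(M, N, x0, y0, sigma=2.0):
        # Desired 2-D Gaussian response g(x, y); image arrays are indexed [x, y]
        # with shape (M, N). The sigma**2 normalisation is an assumption.
        x, y = np.meshgrid(np.arange(M), np.arange(N), indexing="ij")
        return np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / sigma ** 2)

    def train_single_filter(f, x0, y0, sigma=2.0):
        # Composite filter H*(u, v) = G(u, v)·F*(u, v) / (F(u, v)·F*(u, v))
        # trained from one panoramic image f with calibrated centre (x0, y0).
        F = np.fft.fft2(f)
        G = np.fft.fft2(gaussian_response(*f.shape, x0, y0, sigma))
        return (G * np.conj(F)) / (F * np.conj(F))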
Instrument localization:
For a panoramic image fe to be localized, the corresponding two-dimensional Gaussian response is expressed as Ge(u, v) = Fe(u, v) · H*(u, v); transforming Ge(u, v) from the frequency domain to the spatial domain gives ge(x, y) = F⁻¹(Ge(u, v)), and the coordinate point corresponding to the maximum of ge(x, y) is the centre of the instrument in the panoramic image fe.
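A matching localization sketch, under the same assumptions as above, applies the trained filter in the frequency domain and takes the peak of the spatial-domain response:

    def locate_instrument(f_e, H_conj):
        # G_e(u, v) = F_e(u, v) · H*(u, v); the peak of g_e = IFFT(G_e)
        # gives the instrument centre in the panoramic image f_e.
        G_e = np.fft.fft2(f_e) * H_conj
        g_e = np.real(np.fft.ifft2(G_e))
        return np.unravel_index(np.argmax(g_e), g_e.shape)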
To obtain more accurate model parameters, in the instrument model training step the training model parameters obtained from different images are usually averaged. Specifically, let the training picture sequence be fi (1 ≤ i ≤ L), where L is the number of pictures, i.e. the total number of samples; picture fi has instrument centre coordinate pi(x0, y0) (1 ≤ i ≤ L) and corresponding two-dimensional Gaussian response gi (1 ≤ i ≤ L). The average composite filter is then defined as
H*_μ(u, v) = (1/L) Σ_{i=1}^{L} Hi*(u, v) = (1/L) Σ_{i=1}^{L} Gi(u, v) · Fi*(u, v) / (Fi(u, v) · Fi*(u, v)).
In the instrument localization step, the two-dimensional Gaussian response of the panoramic image fe is obtained using the average composite filter, i.e. Ge(u, v) = Fe(u, v) · H*_μ(u, v).
To increase the robustness and stability of the average composite filter, a regularization coefficient ε is usually added, i.e. the average composite filter is expressed as
H*_μ(u, v) = (1/L) Σ_{i=1}^{L} Gi(u, v) · Fi*(u, v) / (Fi(u, v) · Fi*(u, v) + ε).
The regularization coefficient prevents the denominator from being 0, which would make the numerical computation unstable.
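The averaging and regularization described above can be sketched as follows, again an illustrative Python/NumPy sketch that reuses the hypothetical gaussian_response helper; the lists of training frames and calibrated centres are assumed inputs:

    def train_average_filter(frames, centres, sigma=2.0, eps=0.1):
        # Average composite filter with regularization:
        # H*_mu(u, v) = (1/L) · sum_i G_i·F_i* / (F_i·F_i* + eps)
        acc = 0.0
        for f, (x0, y0) in zip(frames, centres):
            F = np.fft.fft2(f)
            G = np.fft.fft2(gaussian_response(*f.shape, x0, y0, sigma))
            acc = acc + (G * np.conj(F)) / (F * np.conj(F) + eps)
        return acc / len(frames)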
To reduce image boundary effects in the Fourier transform, before instrument model training the panoramic image f(x, y) must be multiplied by a cosine window, i.e. f(x, y) ← f(x, y) · w(x, y), where w(x, y) is a two-dimensional cosine window function.
To reduce the interference of factors such as illumination, the original panoramic image must be preprocessed, i.e. fe(x, y) ← log(fe(x, y) + 1), fe(x, y) ← fe(x, y) − μ_fe, where μ_fe is the mean of the image. In the instrument model training step, the panoramic image f(x, y) likewise needs to be preprocessed, i.e. f(x, y) ← log(f(x, y) + 1), f(x, y) ← f(x, y) − μf, where μf is the mean of the image.
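A preprocessing sketch for the training images, under the same assumptions, combines the log transform, mean subtraction and cosine windowing; the sin(πx/(M−1))·sin(πy/(N−1)) window form below is an assumption, since the text does not reproduce the window formula:

    def preprocess_training_image(f):
        # Log transform, zero-mean normalisation and cosine window w(x, y).
        f = np.log(f.astype(np.float64) + 1.0)
        f = f - f.mean()
        M, N = f.shape
        wx = np.sin(np.pi * np.arange(M) / (M - 1))
        wy = np.sin(np.pi * np.arange(N) / (N - 1))
        return f * np.outer(wx, wy)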
Instrument model training requires a large number of panoramic images. In the present invention, the camera carried by the inspection robot captures the panoramic images and the instrument images are calibrated. Specifically, panoramic images containing the instrument are captured by the camera at different time periods and the centre coordinate of the instrument in each panoramic image is obtained, where the coordinate of the camera's optical centre in the panoramic image coincides with the image centre coordinate, and the centre coordinate of the instrument in the panoramic image coincides with the image centre coordinate.
The invention has the following advantages: (1) the instrument image localization method based on the average composite filter can locate the instrument position in the image more accurately and efficiently; (2) it can overcome interference from environmental factors such as illumination changes and complex backgrounds, achieving stable and accurate instrument localization; (3) it can greatly improve the instrument recognition rate of the inspection robot.
Brief description of the drawings
Fig. 1 is a schematic diagram of instrument calibration in the panoramic image;
Fig. 2 shows the localization result in the panoramic image;
Fig. 3 shows the localization response map for the panoramic image.
Embodiment
The preferred embodiment of the invention is described below with reference to the accompanying drawings:
Taking a pointer-type instrument in a substation as an example, the instrument image localization method based on the average composite filter is carried out according to the following steps:
1) Instrument image calibration: the camera optical centre is calibrated in advance to ensure that its coordinate in the image coincides with the image centre coordinate. During an inspection task the robot adjusts the pan-tilt parameters so that the instrument centre coordinate moves to the image centre, and then zooms the camera so that the target is clear and its details are visible. Throughout the zoom process only the centre coordinate of the instrument image needs to be obtained; therefore, instrument image calibration only requires calibrating the centre coordinate of the instrument image, without dimension information of the instrument.
Instrument image calibration consists of collecting the panoramic images captured by the inspection robot at different time periods and obtaining the centre coordinate of the device in each image, as shown in Fig. 1.
2) Instrument model training: given the panoramic image f(x, y) (0 ≤ x ≤ M−1, 0 ≤ y ≤ N−1), where M, N are the width and height of the image respectively, the centre coordinate of the instrument in the panoramic image is p(x0, y0), i.e. the centre coordinate calibrated in step 1). Define the two-dimensional Gaussian response function g(x, y) (0 ≤ x ≤ M−1, 0 ≤ y ≤ N−1).
σ denotes the Gaussian response radius; the larger its value, the larger the response radius and the higher the tolerance to noise. The value of σ is chosen empirically, and those skilled in the art can obtain a suitable value through experiments; in this embodiment σ = 2 is chosen. The two-dimensional Gaussian response function g(x, y) reaches its maximum value 1 at the centre coordinate p(x0, y0) and its value approaches 0 away from p(x0, y0). The discrete Fourier transform of g(x, y) can be expressed as
G(u, v) = Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} g(x, y) · exp(−j2π(xu/M + yv/N)),
where exp is the exponential function with the natural constant e as its base, j is the imaginary unit, x = 0, 1, …, M−1, y = 0, 1, …, N−1; u = 0, 1, …, M−1, v = 0, 1, …, N−1; u, v are non-negative integers. Define a convolution kernel h(x, y) (0 ≤ x ≤ M−1, 0 ≤ y ≤ N−1) such that f(x, y) ⊗ h(x, y) = g(x, y) holds, where ⊗ is the two-dimensional convolution operator. The aim is that, after the panoramic image f(x, y) is operated on by the convolution kernel h(x, y), a two-dimensional Gaussian response is output that reaches its peak at the instrument centre coordinate point and is close to 0 away from the centre point. According to the two-dimensional convolution theorem, g(x, y) = F⁻¹(F(u, v) · H(u, v)) holds, where F, H are the two-dimensional Fourier transforms of f and h respectively, F⁻¹ is the two-dimensional inverse Fourier transform operation, and "·" is the element-wise (dot-product) multiplication of two-dimensional matrices.
Replacing the convolution operation with a correlation operation, G(u, v) = F(u, v) · H*(u, v) holds, where G is the two-dimensional Fourier transform of g and * is the complex conjugate operation on a matrix; accordingly, the composite filter may be defined as
H*(u, v) = G(u, v) · F*(u, v) / (F(u, v) · F*(u, v)).
H*(u, v) is the training model parameter of the instrument image.
In this embodiment, to reduce image boundary effects in the Fourier transform, before instrument model training the panoramic image must be multiplied by a cosine window, i.e.
f(x, y) ← f(x, y) · w(x, y),
where w(x, y) is a two-dimensional cosine window function.
In this embodiment, to reduce the interference of factors such as illumination, the panoramic image f(x, y) must be preprocessed, i.e.
f(x, y) ← log(f(x, y) + 1),
f(x, y) ← f(x, y) − μf,
where μf is the mean of the image.
In practice, to obtain more accurate model parameters, the model parameters trained from different images are usually averaged. Specifically, let the training picture sequence be fi (1 ≤ i ≤ L), where L is the number of pictures; in this embodiment L = 8 is chosen, i.e. the total number of samples is 8. Picture fi has instrument centre coordinate pi(x0, y0) (1 ≤ i ≤ L) and corresponding two-dimensional Gaussian response gi (1 ≤ i ≤ L). The average composite filter is then defined as
H*_μ(u, v) = (1/L) Σ_{i=1}^{L} Hi*(u, v) = (1/L) Σ_{i=1}^{L} Gi(u, v) · Fi*(u, v) / (Fi(u, v) · Fi*(u, v)).
To increase the robustness and stability of the average composite filter, a regularization coefficient ε is usually added, i.e.
H*_μ(u, v) = (1/L) Σ_{i=1}^{L} Gi(u, v) · Fi*(u, v) / (Fi(u, v) · Fi*(u, v) + ε).
The regularization coefficient prevents the denominator from being 0, which would make the numerical computation unstable; those skilled in the art can choose a suitable value through experiments as needed, and in this embodiment ε = 0.1 is chosen.
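For illustration, with the parameters chosen in this embodiment (σ = 2, L = 8, ε = 0.1) the training stage could be driven as follows; load_frame and calibrated_centre are hypothetical placeholders for the robot's captured panoramas and calibration records, not part of the disclosure:

    frames = [preprocess_training_image(load_frame(i)) for i in range(8)]
    centres = [calibrated_centre(i) for i in range(8)]
    H_mu = train_average_filter(frames, centres, sigma=2.0, eps=0.1)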
3) Instrument localization: after the training model parameter H*_μ(u, v) of the instrument image has been obtained, for a panoramic image fe to be localized, the corresponding two-dimensional response can be expressed as Ge(u, v) = Fe(u, v) · H*_μ(u, v).
In this embodiment, to reduce the interference of factors such as illumination, the original panoramic image fe must be preprocessed, i.e.
fe(x, y) ← log(fe(x, y) + 1),
fe(x, y) ← fe(x, y) − μ_fe,
where μ_fe is the mean of the image.
Ge(u, v) is then transformed to the spatial domain ge(x, y), i.e. ge(x, y) = F⁻¹(Ge(u, v)).
The coordinate point corresponding to the maximum of ge(x, y) is the centre of the instrument in the panoramic image fe; the localization result in the panoramic image is shown in Fig. 2 and the corresponding response map in Fig. 3. Once the instrument has been localized, the reading on the instrument can be taken.
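A corresponding localization sketch for a new panorama, applying only the log transform and mean subtraction to fe as in this step (load_frame remains a hypothetical placeholder):

    f_e = np.log(load_frame("new").astype(np.float64) + 1.0)
    f_e = f_e - f_e.mean()
    x_c, y_c = locate_instrument(f_e, H_mu)
    print("instrument centre in panorama f_e:", (x_c, y_c))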

Claims (7)

1. An instrument image localization method based on a composite filter, characterized in that it comprises the following steps:
instrument model training:
defining a two-dimensional Gaussian response function g(x, y), where p(x0, y0) is the centre coordinate of the instrument in the panoramic image f(x, y), σ is the Gaussian response radius, 0 ≤ x ≤ M−1, 0 ≤ y ≤ N−1, and M, N are the width and height of the image, respectively; the discrete Fourier transform of g(x, y) is expressed as
G(u, v) = Σ_{x=0}^{M−1} Σ_{y=0}^{N−1} g(x, y) · exp(−j2π(xu/M + yv/N)),
where exp is the exponential function with the natural constant e as its base, j is the imaginary unit, x = 0, 1, …, M−1, y = 0, 1, …, N−1; u = 0, 1, …, M−1, v = 0, 1, …, N−1; u, v are non-negative integers; defining a convolution kernel h(x, y) such that
f(x, y) ⊗ h(x, y) = g(x, y)
holds, then g(x, y) = F⁻¹(F(u, v) · H(u, v)), where F, H are the two-dimensional Fourier transforms of f and h respectively, F⁻¹ is the two-dimensional inverse Fourier transform operation, and "·" is the element-wise (dot-product) multiplication of two-dimensional matrices;
replacing the convolution operation with a correlation operation, G(u, v) = F(u, v) · H*(u, v) holds, where * is the complex conjugate operation on a matrix; the composite filter is thus defined as H*(u, v) = G(u, v) · F*(u, v) / (F(u, v) · F*(u, v)), and H*(u, v) is the training model parameter of the instrument image;
instrument localization:
for a panoramic image fe to be localized, the corresponding two-dimensional Gaussian response is expressed as Ge(u, v) = Fe(u, v) · H*(u, v); Ge(u, v) is transformed from the frequency domain to the spatial domain ge(x, y) = F⁻¹(Ge(u, v)), and the coordinate point corresponding to the maximum of ge(x, y) is the centre of the instrument in the panoramic image fe.
2. The instrument image localization method based on a composite filter as claimed in claim 1, characterized in that in the instrument model training step, the training model parameters obtained from training on different panoramic images are averaged; specifically, let the training picture sequence be fi (1 ≤ i ≤ L), where L is the number of pictures, i.e. the total number of samples, picture fi has instrument centre coordinate pi(x0, y0) (1 ≤ i ≤ L) and corresponding two-dimensional Gaussian response gi (1 ≤ i ≤ L), and the average composite filter is defined as
H*_μ(u, v) = (1/L) Σ_{i=1}^{L} Hi*(u, v) = (1/L) Σ_{i=1}^{L} Gi(u, v) · Fi*(u, v) / (Fi(u, v) · Fi*(u, v));
and in the instrument localization step, the two-dimensional Gaussian response of the panoramic image fe is obtained using the average composite filter, i.e. Ge(u, v) = Fe(u, v) · H*_μ(u, v).
3. The instrument image localization method based on a composite filter as claimed in claim 2, characterized in that the average composite filter is changed to H*_μ(u, v) = (1/L) Σ_{i=1}^{L} Gi(u, v) · Fi*(u, v) / (Fi(u, v) · Fi*(u, v) + ε), where ε is a regularization coefficient.
4. The instrument image localization method based on a composite filter as claimed in claim 1, characterized in that before instrument model training, the panoramic image f(x, y) is first multiplied by a cosine window, i.e. f(x, y) ← f(x, y) · w(x, y), where w(x, y) is a two-dimensional cosine window function.
5. The instrument image localization method based on a composite filter as claimed in claim 1, characterized in that the panoramic image fe is preprocessed, i.e. fe(x, y) ← log(fe(x, y) + 1), fe(x, y) ← fe(x, y) − μ_fe, where μ_fe is the mean of the image.
6. The instrument image localization method based on a composite filter as claimed in claim 1, characterized in that in the instrument model training step, the panoramic image f(x, y) is preprocessed, i.e. f(x, y) ← log(f(x, y) + 1), f(x, y) ← f(x, y) − μf, where μf is the mean of the image.
7. The instrument image localization method based on a composite filter as claimed in any one of claims 1 to 6, characterized in that it further comprises an instrument image calibration step; specifically, panoramic images containing the instrument are captured by the camera at different time periods and the centre coordinate of the instrument in each panoramic image is obtained, where the coordinate of the camera's optical centre in the panoramic image coincides with the image centre coordinate, and the centre coordinate of the instrument in the panoramic image coincides with the image centre coordinate.
CN201710446051.0A 2017-06-14 2017-06-14 Instrument image positioning method based on synthesis filter Active CN107230230B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710446051.0A CN107230230B (en) 2017-06-14 2017-06-14 Instrument image positioning method based on synthesis filter

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710446051.0A CN107230230B (en) 2017-06-14 2017-06-14 Instrument image positioning method based on synthesis filter

Publications (2)

Publication Number Publication Date
CN107230230A true CN107230230A (en) 2017-10-03
CN107230230B CN107230230B (en) 2020-03-24

Family

ID=59934940

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710446051.0A Active CN107230230B (en) 2017-06-14 2017-06-14 Instrument image positioning method based on synthesis filter

Country Status (1)

Country Link
CN (1) CN107230230B (en)


Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120051423A1 (en) * 2010-09-01 2012-03-01 Electronics And Telecommunications Research Institute Video processing method and apparatus based on multiple texture images using video excitation signals
CN105741266A (en) * 2016-01-22 2016-07-06 北京航空航天大学 Pathological image cell nucleus quick location method
CN105678344A (en) * 2016-02-29 2016-06-15 浙江群力电气有限公司 Intelligent classification method for power instrument equipment
CN105825215A (en) * 2016-03-15 2016-08-03 云南大学 Instrument positioning method based on local neighbor embedded kernel function and carrier of method

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
JOHN S. et al.: "TRANSFORMS, FILTERS AND EDGE DETECTORS IN IMAGE PROCESSING", International Journal of Pure and Applied Mathematics *
XIAOPING L. et al.: "Comparative study on wavelet transform and traditional image processing", The 3rd International Conference on Information Sciences and Interaction Sciences *
吕瑞峰 et al.: "Design and implementation of an instrument localization algorithm for a substation intelligent inspection system" (变电站智能巡检系统仪表定位算法设计与实现), 《电子制作》 *
李辉: "Research on instrument digit recognition based on machine vision" (基于机器视觉的仪表数字识别研究), 《中国优秀硕士学位论文全文数据库 信息科技辑》 (China Masters' Theses Full-text Database, Information Science and Technology) *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108921177A (en) * 2018-06-22 2018-11-30 重庆邮电大学 The instrument localization method of Intelligent Mobile Robot

Also Published As

Publication number Publication date
CN107230230B (en) 2020-03-24


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant