CN109242823B - Reference image selection method and device for positioning calculation and automatic driving system - Google Patents


Info

Publication number
CN109242823B
CN109242823B (application CN201810823922.0A)
Authority
CN
China
Prior art keywords
training
score
index
training window
orthogonality
Prior art date
Legal status
Active
Application number
CN201810823922.0A
Other languages
Chinese (zh)
Other versions
CN109242823A (en)
Inventor
孟然
李飞
姜安
崔峰
朱海涛
柴华
Current Assignee
Beijing Smarter Eye Technology Co Ltd
Original Assignee
Beijing Smarter Eye Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Smarter Eye Technology Co Ltd
Priority to CN201810823922.0A
Publication of CN109242823A
Application granted
Publication of CN109242823B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/0002 - Inspection of images, e.g. flaw detection
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10004 - Still image; Photographic image
    • G06T2207/20 - Special algorithmic details
    • G06T2207/20081 - Training; Learning
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30168 - Image quality inspection

Abstract

The invention provides a reference image selection method and device for positioning calculation, and an automatic driving system, used to provide the basic input of image positioning calculation. The reference image selection method comprises the following steps: dividing the training image into a plurality of training windows according to a preset rule; calculating an evaluation index of each training window and obtaining a calculation result; and performing a weighted comprehensive analysis on the calculation result, and determining a training window as the reference image according to the analysis result and a preset threshold value. By adopting the technical scheme of the invention, a reasonable reference image is automatically selected through the evaluation indexes of the reference image, the limitations of manual selection are avoided, and the reliability and precision of the positioning result are improved.

Description

Reference image selection method and device for positioning calculation and automatic driving system
Technical Field
The invention relates to the field of image processing, in particular to a reference image selection method and device for positioning calculation and an automatic driving system.
Background
In the fields of machine vision and image processing, image positioning calculation (surface positioning, edge positioning, geometric positioning, and the like) is generally used for image detection, measurement, and calibration. The result of the positioning calculation serves as the basic input of subsequent algorithms and directly affects the final result of the overall algorithm. In image positioning calculation, a reasonable choice of reference image (also called a positioning kernel) directly determines the reliability, precision, and efficiency of the positioning result. In many machine-vision inspection scenarios, the reference image must be selected manually, which has several limitations: strong subjectivity, high complexity, and heavy workload.
Therefore, the prior art suffers from strong subjectivity, high complexity, heavy workload, and similar problems caused by manual selection of the reference image.
Disclosure of Invention
The invention provides a reference image selection method and device for positioning calculation, and an automatic driving system, aiming to solve the problems of strong subjectivity, high complexity, and heavy workload caused by manual selection of the reference image in the prior art.
In order to achieve the above object, according to a first aspect of the present invention, a reference image selection method for positioning calculation is provided, and the following specific scheme is adopted:
a reference image selection method for positioning calculation includes: dividing a training image into a plurality of training windows according to a preset rule; calculating an evaluation index of each training window and obtaining a calculation result; and performing a weighted comprehensive analysis on the calculation result, and determining a training window as the reference image according to the analysis result and a preset threshold value.
Further, the evaluation index includes, but is not limited to: redundancy index, balance index, and uniqueness index.
Further, the method for calculating the redundancy index includes: measuring the training window and calculating the normalized correlation coefficient of the training window on an x axis and the normalized correlation coefficient of the training window on a y axis; and carrying out symmetry fraction synthesis on the normalization correlation coefficient of the x axis and the normalization correlation coefficient of the y axis to obtain the symmetry fraction, wherein the symmetry fraction represents the redundancy index.
Further, the method for calculating the balance index comprises the following steps: calculating the edge intensity and the edge angle of each pixel point in the training window; acquiring the orthogonality fraction of the training window in the x-axis direction and the orthogonality fraction of the training window in the y-axis direction according to the edge strength and the edge angle; and performing orthogonality score synthesis on the orthogonality score in the x-axis direction and the orthogonality score in the y-axis direction to obtain an orthogonality score, wherein the orthogonality score represents the balance index.
Further, the calculation method of the uniqueness index is as follows: traversing the training window; searching the training images by using the training windows as preset reference images, and setting the number of the searched training images to be 2 to obtain a first training window with the highest score and a second training window with the second highest score; and subtracting the score of the second training window from the score of the first training window to obtain a uniqueness score, wherein the uniqueness score is used for representing the uniqueness index.
According to a second aspect of the present invention, a reference image selection apparatus for positioning calculation is provided, and the following technical solutions are adopted:
a reference image selection apparatus for positioning calculation comprising: the dividing module is used for dividing the training image into any plurality of training windows according to a preset rule; the calculation module is used for calculating the evaluation index of each training window and obtaining a calculation result; and the determining module is used for carrying out weight comprehensive analysis on the calculation result and determining a training window as a reference image according to the analysis result and a preset threshold value.
Further, the evaluation index includes, but is not limited to: redundancy index, balance index, and uniqueness index.
Further, the calculation module is further configured to: measuring the training window and calculating the normalized correlation coefficient of the training window on an x axis and the normalized correlation coefficient of the training window on a y axis; and carrying out symmetry fraction synthesis on the normalization correlation coefficient of the x axis and the normalization correlation coefficient of the y axis to obtain the symmetry fraction, wherein the symmetry fraction represents the redundancy index.
Further, the calculation module is further configured to: calculating the edge intensity and the edge angle of each pixel point in the training window; acquiring the orthogonality fraction of the training window in the x-axis direction and the orthogonality fraction of the training window in the y-axis direction according to the edge strength and the edge angle; and performing orthogonality score synthesis on the orthogonality score in the x-axis direction and the orthogonality score in the y-axis direction to obtain an orthogonality score, wherein the orthogonality score represents the balance index.
Further, the calculation module is further configured to: traversing the training window; searching the training images by using the training windows as preset reference images, and setting the number of the searched training images to be 2 to obtain a first training window with the highest score and a second training window with the next highest score; and subtracting the score of the second training window from the score of the first training window to obtain a uniqueness score, wherein the uniqueness score represents the uniqueness index.
According to a third aspect of the present invention, there is provided an automatic driving system, and the following technical solutions are adopted:
an automatic driving system comprises the reference image selection device.
For the evaluation of automatic reference image selection, the invention proposes three indexes: a redundancy index, a balance index, and a uniqueness index. For the redundancy index: to ensure positioning robustness, a certain amount of information redundancy in the reference image is necessary so that the feature of interest can still be reliably positioned when it is partially occluded or contaminated by noise; the symmetry score is used as the measurement. For the balance index: to ensure positioning accuracy, the reference image must have strong edge information in both the horizontal and vertical directions, and the edge information in the two directions should be balanced; the orthogonality score is used as the measurement. For the uniqueness index: when the real-time image is searched, if the reference image is clearly distinguishable from the other features in the real-time image, it can still be identified even when the real-time image is seriously degraded; the uniqueness score is used as the measurement. By synthesizing these three indexes according to certain weights, a high-quality reference image can be automatically selected from the training image and used for subsequent positioning calculation in real-time images.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flowchart of a reference image selection method according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a training window according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of fast calculation by an iterative method according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a low orthogonality fractional mode according to an embodiment of the present invention;
FIG. 5 is a diagram illustrating a high orthogonality score mode according to an embodiment of the present invention;
FIG. 6 is a statistical example graph of an edge intensity histogram according to an embodiment of the present invention;
FIG. 7 is a block diagram of uniqueness score calculation in accordance with an embodiment of the present invention;
fig. 8 is a schematic structural diagram of a reference image selection apparatus according to an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely with reference to the accompanying drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without any inventive step, are within the scope of the present invention.
First, the nouns appearing in the examples are explained:
training images: and the base image or the evaluation image is used for automatically selecting the reference image.
Real-time image: an image of the actual product or scene being detected, measured. The purpose of the reference image selection is to perform positioning calculations in the real-time image.
Training a window: in a certain region evaluated in the training image, if the evaluation score is high, the region is selected as a reference image for positioning calculation in the real-time image.
Reference image: the positioning kernel is a small area image on the training image and is used for performing positioning calculation on the real-time image.
Specifically, referring to fig. 1, a reference image selection method for positioning calculation includes:
S101: dividing the training image into a plurality of training windows according to a preset rule;
S103: calculating the evaluation index of each training window and obtaining a calculation result;
S105: performing a weighted comprehensive analysis on the calculation results, and determining a training window as the reference image according to the analysis result and a preset threshold value.
In the above technical solution of this embodiment, in step S101 the training image is first divided into a plurality of training windows. Fig. 2 shows one such training window of size m × n, where (u, v) are the coordinates of the window center point in the training image. For convenience of later calculation, the width m and the height n can be taken as the nearest odd numbers. In step S103, the evaluation indexes are calculated for the training windows obtained in step S101. Three indexes are proposed for evaluating automatic reference image selection; by synthesizing the following three indexes according to certain weights, a high-quality reference image can be automatically selected from the training image and used for positioning calculation in real-time images.
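As a rough illustration of step S101, the sketch below divides a grayscale training image into training windows with odd width and height. The stride parameter and the centre-based indexing are assumptions for illustration, since the patent does not spell out its preset rule.

```python
import numpy as np

def divide_into_windows(image, m, n, stride):
    """Slide an m x n training window over a grayscale training image
    and return (u, v, window) tuples, where (u, v) is the centre of the
    window in image coordinates. As the text suggests, width m and
    height n are forced to the nearest odd numbers so each window has a
    well-defined centre pixel."""
    m += (m + 1) % 2          # round an even width up to odd
    n += (n + 1) % 2          # round an even height up to odd
    half_m, half_n = m // 2, n // 2
    h, w = image.shape
    windows = []
    for v in range(half_n, h - half_n, stride):
        for u in range(half_m, w - half_m, stride):
            win = image[v - half_n: v + half_n + 1,
                        u - half_m: u + half_m + 1]
            windows.append((u, v, win))
    return windows
```

With a 10 × 10 image, m = 4 (rounded to 5), n = 3, and stride 2, this yields twelve 3 × 5 windows.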
1. Index of redundancy
In order to ensure positioning robustness, a certain amount of information redundancy in the reference image is necessary so that the feature of interest can still be reliably positioned when it is partially occluded or contaminated by noise; this is measured by the symmetry score.
2. Index of balance
In order to ensure positioning accuracy, the reference image must have strong edge information in both the horizontal and vertical directions, and the edge information in the two directions should be balanced; this is measured by the orthogonality score.
3. Uniqueness index
When the real-time image is searched, if the reference image is clearly distinguishable from the other features in the real-time image, it can still be identified even when the real-time image is seriously degraded; this is measured by the uniqueness score.
In step S105, a weighted comprehensive analysis is performed on the calculation results, and a training window is determined as the reference image according to the analysis result and a preset threshold value. Specifically, the calculation in step S103 yields a score for each index; in this step the scores are combined according to weights that can be configured flexibly for the specific use scenario, for example 1:1:1 or 1:2:3, but not limited thereto. The combined total score is then compared with the preset threshold, and a training window whose score is greater than or equal to the threshold is determined as a reference image. Consequently, more than one reference image may be selected. In this way, the evaluation indexes are comprehensively analyzed according to the weight requirements to determine a reasonable reference image.
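The weighted synthesis and thresholding of step S105 can be sketched as follows. The 1:1:1 default weights, the 0.5 threshold, and the function name are illustrative placeholders, not values taken from the patent.

```python
def select_reference_windows(window_scores, weights=(1, 1, 1), threshold=0.5):
    """window_scores: list of (window_id, (redundancy, balance, uniqueness))
    with each score assumed normalized to [0, 1]. A window is kept as a
    reference image whenever its weighted mean reaches the threshold, so
    the result may contain several windows, as the text notes."""
    total = sum(weights)
    selected = []
    for wid, scores in window_scores:
        combined = sum(w * s for w, s in zip(weights, scores)) / total
        if combined >= threshold:
            selected.append(wid)
    return selected
```

For example, with weights 1:2:3 a window scoring (0.9, 0.3, 0.3) combines to 0.4 and is rejected at a 0.5 threshold.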
In the embodiment, three evaluation indexes, namely a redundancy index, a balance index and a uniqueness index, are provided, and a high-quality reference image can be automatically selected from the training images by calculating and comprehensively analyzing the three indexes.
Preferably, the evaluation index includes, but is not limited to: a redundancy index, a balance index, and a uniqueness index.
Preferably, the method for calculating the redundancy index includes: measuring the training window and calculating the normalized correlation coefficient of the training window on an x axis and the normalized correlation coefficient of the training window on a y axis; and carrying out symmetry fraction synthesis on the normalization correlation coefficient of the x axis and the normalization correlation coefficient of the y axis to obtain the symmetry fraction, wherein the symmetry fraction represents the redundancy index.
Specifically, the redundancy index is characterized by the symmetry score, which measures the symmetry of the training window (reference image) about the x-axis and about the y-axis; the symmetry is measured by a normalized correlation coefficient.
The training window is shown in fig. 2; its size is m × n, and (u, v) are the coordinates of the center point of the window in the training image.
For symmetry about the y-axis, the left and right symmetric columns are numbered u, u-1, ..., u-(m-1)/2 (left) and u, u+1, ..., u+(m-1)/2 (right); similarly, for symmetry about the x-axis, the upper and lower symmetric rows are numbered v, v-1, ..., v-(n-1)/2 (upper) and v, v+1, ..., v+(n-1)/2 (lower).
The normalized correlation coefficient is used to measure the degree of symmetry about the coordinate axis. For a training window of size m × n at (u, v), the normalized correlation coefficient about the x-axis is calculated as:

r_x(u, v) = Σ_{i,j} [I(u+i, v−j) − Ī_up][I(u+i, v+j) − Ī_down] / √( Σ_{i,j} [I(u+i, v−j) − Ī_up]² · Σ_{i,j} [I(u+i, v+j) − Ī_down]² )

where Ī_up and Ī_down are the gray-level means of the upper half and the lower half of the window image, respectively; i takes values in [−(m−1)/2, (m−1)/2] and j takes values in [0, (n−1)/2].
The correlation coefficient is converted to a normalized score of:
n_x(u, v) = [max(r_x(u, v), 0)]²
Similarly, for a training window of size m × n at (u, v), the normalized correlation coefficient and its normalized score about the y-axis are:

r_y(u, v) = Σ_{i,j} [I(u−i, v+j) − Ī_left][I(u+i, v+j) − Ī_right] / √( Σ_{i,j} [I(u−i, v+j) − Ī_left]² · Σ_{i,j} [I(u+i, v+j) − Ī_right]² )

n_y(u, v) = [max(r_y(u, v), 0)]²

where Ī_left and Ī_right are the gray-level means of the left half and the right half of the window image, respectively; i takes values in [0, (m−1)/2] and j takes values in [−(n−1)/2, (n−1)/2].
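A minimal sketch of the symmetry score about the x-axis: the upper half of the window is correlated with the lower half mirrored about the centre row, then clipped and squared. Excluding the centre row from both halves is one reasonable reading of the text, not a detail stated by the patent; the direct (non-accelerated) computation is shown.

```python
import numpy as np

def symmetry_score_x(window):
    """n_x = [max(r_x, 0)]^2, where r_x is the normalized correlation
    between the upper half of the window and the lower half mirrored
    about the centre row. Assumes an odd window height."""
    rows = window.shape[0]
    upper = window[: rows // 2, :].astype(float)
    lower = window[rows // 2 + 1:, :][::-1, :].astype(float)  # mirrored
    a = upper - upper.mean()
    b = lower - lower.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:
        return 0.0        # flat halves carry no structure to correlate
    r = (a * b).sum() / denom
    return max(r, 0.0) ** 2
```

A window whose rows mirror exactly about the centre row scores 1.0; any asymmetry lowers the score.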
As a preferred implementation manner, this embodiment further provides a symmetry score fast calculation method, which is specifically as follows:
the fast calculation principle is illustrated using a normalized correlation coefficient of the training window about the x-axis, and the principle of a normalized phase coefficient about the y-axis is similar. The normalized correlation coefficient about the x-axis can be rewritten as:
Figure BDA0001742009160000071
wherein N is the number of pixels in the upper (or lower) half of the training window; the correlation coefficient calculation comprises two sum terms sigmaijI(u+i,v-j),∑ijI (u + I, v + j) corresponding to the sum of the pixels of the upper half window and the lower half window, respectively; two sum of squares terms ΣijI2(u+i,v-j),∑ijI2(u + i, v + j) corresponding to the sum of the squares of the pixels of the upper half window and the lower half window, respectively; a correlation term sigmaijI(u+i,v-j)I(u+i,v+ j。
In order to improve the calculation efficiency, a dynamic programming technology can be used for calculating a sum term and a square sum term in the correlation coefficient, and for the correlation term, an iterative method can be used for improving the calculation efficiency. As shown in fig. 3, the correlation terms may be calculated and accumulated column by column in the upper half window and the lower half window respectively for the training area (the area to be evaluated on the training image), and then the correlation terms may be calculated iteratively in the row direction.
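The dynamic-programming idea for the sum and sum-of-squares terms can be sketched with prefix-sum (integral) tables: after an O(HW) precomputation over the training area, the sum over any half window is obtained in O(1). The two-function API below is an illustrative sketch, not the patent's implementation.

```python
import numpy as np

def build_tables(image):
    """Prefix-sum tables for pixel sums and sums of squares, the two
    kinds of terms needed by the rewritten correlation coefficient."""
    I = image.astype(np.float64)
    return I.cumsum(0).cumsum(1), (I * I).cumsum(0).cumsum(1)

def rect_sum(T, r0, c0, r1, c1):
    """Sum over the inclusive rectangle [r0..r1] x [c0..c1] in O(1)."""
    s = T[r1, c1]
    if r0 > 0:
        s -= T[r0 - 1, c1]
    if c0 > 0:
        s -= T[r1, c0 - 1]
    if r0 > 0 and c0 > 0:
        s += T[r0 - 1, c0 - 1]
    return s
```

The correlation term has no such closed form over rectangles, which is why the text computes it column by column and iterates it along the row direction instead.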
Finally, the symmetry scores are synthesized from the calculation results. After the normalized scores about the x-axis and about the y-axis are computed, the two scores can be combined in two ways. The first is the algebraic average: the larger the sum of the two scores, the larger the combined result. The second is the geometric average: the combined result is large only when the sum of the two scores is large and the two scores are close to each other. The geometric average thus emphasizes not only the sum of the scores but also the balance between them.

The algebraic mean of the normalized scores for the training window at (u, v) is:

s_a(u, v) = [n_x(u, v) + n_y(u, v)] / 2

The geometric mean of the normalized scores is:

s_g(u, v) = √( n_x(u, v) · n_y(u, v) )
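A small sketch contrasting the two averaging methods; as the text notes, the geometric average penalizes imbalance between the two axis scores, while the algebraic average only looks at their sum.

```python
import math

def combine(nx, ny, method="geometric"):
    """Algebraic vs. geometric averaging of the two axis scores.
    An extreme pair such as (1, 0) averages to 0.5 algebraically
    but to 0 geometrically, rewarding balanced windows."""
    if method == "algebraic":
        return (nx + ny) / 2.0
    return math.sqrt(nx * ny)
```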
preferably, the method for calculating the balance index is as follows: calculating the edge intensity and the edge angle of each pixel point in the training window; acquiring the orthogonality fraction of the training window in the x-axis direction and the orthogonality fraction of the training window in the y-axis direction according to the edge strength and the edge angle; and performing orthogonality score synthesis on the orthogonality score in the x-axis direction and the orthogonality score in the y-axis direction to obtain an orthogonality score, wherein the orthogonality score represents the balance index.
Specifically, the orthogonality score measures the information strength of the horizontal and vertical edges of the reference image and the balance between them, as shown in figs. 4 and 5. The orthogonality score measures the distribution of the edge strength in two orthogonal directions (the x-axis and y-axis directions); the edge strength (i.e., edge amplitude) in the orthogonal directions can be accumulated to measure this orthogonality.
First, the edge strength and the edge angle of each pixel point in the training window are calculated. An edge strength histogram is then computed, in which the abscissa is the edge angle and the ordinate is the cumulative sum p(i, u, v) of the edge strengths at the given edge angle, where i is the edge angle value with range [0, 359] and (u, v) are the coordinates of the center point of the training window, as shown in fig. 6.
The edge strength in the x-axis direction corresponds to the cumulative edge strength at edge angles of 0 degrees and 180 degrees. The orthogonality score along the x-axis may therefore be defined as:

o_x(u, v) = p(0, u, v) + p(180, u, v)

Similarly, the orthogonality score along the y-axis may be defined as:

o_y(u, v) = p(90, u, v) + p(270, u, v)
according to the calculation result, performing orthogonality score synthesis, wherein the orthogonality score synthesis also comprises geometric average and algebraic average, and the orthogonality score synthesis comprises the following steps:
Figure BDA0001742009160000084
Figure BDA0001742009160000085
finally, the orthogonality fraction is normalized:
Figure BDA0001742009160000091
where m and n are the length and width of the training window, respectively.
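A hedged sketch of the two directional orthogonality scores: it builds the edge-strength histogram over edge angles and sums the bins named in the text. The central-difference gradient (`np.gradient`) stands in for whichever edge operator the patent actually uses, and taking the 90- and 270-degree bins for the y-axis is inferred by symmetry with the stated x-axis case.

```python
import numpy as np

def orthogonality_scores(window):
    """Accumulate edge strength into a 360-bin histogram indexed by
    edge angle, then read off the x-direction score (bins 0 and 180)
    and the y-direction score (bins 90 and 270)."""
    gy, gx = np.gradient(window.astype(np.float64))
    strength = np.hypot(gx, gy)                       # edge amplitude
    angle = (np.degrees(np.arctan2(gy, gx)) + 360) % 360
    hist = np.zeros(360)
    np.add.at(hist, angle.astype(int), strength)      # p(i, u, v)
    ox = hist[0] + hist[180]
    oy = hist[90] + hist[270]
    return ox, oy
```

A pure horizontal intensity ramp (a vertical edge pattern) puts all of its edge strength into the x-direction score and none into the y-direction score.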
Preferably, the calculation method of the uniqueness index is as follows: traversing the training windows; searching the training image using each training window as a candidate reference image, with the number of returned matches set to 2, to obtain a first training window with the highest score and a second training window with the second-highest score; and subtracting the score of the second training window from the score of the first training window to obtain the uniqueness score, which represents the uniqueness index.
The uniqueness score is intended to measure the degree of difference between the reference image and the other features in the real-time image, and is described by the difference between matching scores.
The uniqueness score calculation process is shown in fig. 7, which includes:
step 70: and traversing the training window.
Step 72: the training window (candidate reference image) is used as a reference image, the training images are searched, the number of the searched images is set to be 2, and two scores with the highest score and the second highest score are obtained.
Step 74: and calculating the uniqueness score, namely subtracting the second highest score from the highest score to be used as the uniqueness score metric.
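Steps 70 to 74 can be sketched with a brute-force normalized-correlation search over the training image. The exhaustive scan and the NCC scoring function are stand-ins under stated assumptions; the patent does not specify its search procedure.

```python
import numpy as np

def match_scores(image, template):
    """Normalized cross-correlation of the template at every valid
    position in the image (the 'search' of step 72)."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    scores = []
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            patch = image[r:r + th, c:c + tw]
            p = patch - patch.mean()
            d = np.sqrt((p * p).sum()) * tnorm
            scores.append((p * t).sum() / d if d > 0 else 0.0)
    return scores

def uniqueness_score(image, template):
    """Highest match minus second-highest (step 74): large when the
    candidate window is clearly distinguishable from every other
    location in the training image."""
    top = sorted(match_scores(image, template), reverse=True)
    return top[0] - top[1]
```

A window cut from a distinctive region of the image matches itself with score 1.0 and everything else markedly lower, giving a clearly positive uniqueness score.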
Fig. 8 is a schematic structural diagram of a reference image selection apparatus according to an embodiment of the invention.
Referring to fig. 8, a reference image selection apparatus for positioning calculation includes: the dividing module 81, configured to divide the training image into a plurality of training windows according to a preset rule; the calculating module 83, configured to calculate the evaluation index of each training window and obtain a calculation result; and the determining module 85, configured to perform a weighted comprehensive analysis on the calculation result and determine a training window as the reference image according to the analysis result and a preset threshold value.
Preferably, the evaluation index includes, but is not limited to: a redundancy index, a balance index, and a uniqueness index.
Preferably, the calculation module 83 is further configured to: measuring the training window and calculating the normalized correlation coefficient of the training window on the x axis and the normalized correlation coefficient of the training window on the y axis; and carrying out symmetry fraction synthesis on the normalization correlation coefficient of the x axis and the normalization correlation coefficient of the y axis to obtain the symmetry fraction, wherein the symmetry fraction represents the redundancy index.
Preferably, the calculation module 83 is further configured to: calculating the edge intensity and the edge angle of each pixel point in the training window; acquiring the orthogonality fraction of the training window in the x-axis direction and the orthogonality fraction of the training window in the y-axis direction according to the edge strength and the edge angle; and integrating the orthogonality fraction in the x-axis direction and the orthogonality fraction in the y-axis direction to obtain an orthogonality fraction, wherein the orthogonality fraction represents the balance index.
Preferably, the calculation module 83 is further configured to: traversing the training window; searching the training images by using the training windows as preset reference images, and setting the number of the searched training images to be 2 to obtain a first training window with the highest score and a second training window with the second highest score; and subtracting the score of the second training window from the score of the first training window to obtain a uniqueness score, wherein the uniqueness score represents the uniqueness index.
The automatic driving system provided by the invention comprises the reference image selection device.
For the evaluation of automatic reference image selection, the invention proposes three indexes: a redundancy index, a balance index, and a uniqueness index. For the redundancy index: to ensure positioning robustness, a certain amount of information redundancy in the reference image is necessary so that the feature of interest can still be reliably positioned when it is partially occluded or contaminated by noise; the symmetry score is used as the measurement. For the balance index: to ensure positioning accuracy, the reference image must have strong edge information in both the horizontal and vertical directions, and the edge information in the two directions should be balanced; the orthogonality score is used as the measurement. For the uniqueness index: when the real-time image is searched, if the reference image is clearly distinguishable from the other features in the real-time image, it can still be identified even when the real-time image is seriously degraded; the uniqueness score is used as the measurement. By synthesizing these three indexes according to certain weights, a high-quality reference image can be automatically selected from the training image and used for subsequent positioning calculation in real-time images.
The above description is only for the specific embodiment of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (5)

1. A reference image selection method for positioning calculation, comprising:
dividing a training image into a plurality of training windows according to a preset rule;
calculating evaluation indexes of each training window to obtain calculation results; and
performing a weighted comprehensive analysis on the calculation results, and determining a training window as the reference image according to the analysis result and a preset threshold; wherein the evaluation indexes include but are not limited to:
a redundancy index, a balance index, and a uniqueness index;
wherein the redundancy index is calculated by:
mirroring the training window and calculating a normalized correlation coefficient of the training window about the x axis and a normalized correlation coefficient of the training window about the y axis; and
combining the normalized correlation coefficient about the x axis and the normalized correlation coefficient about the y axis into a symmetry score, wherein the symmetry score represents the redundancy index;
wherein the balance index is calculated by:
calculating an edge strength and an edge angle of each pixel in the training window;
obtaining an orthogonality score of the training window in the x-axis direction and an orthogonality score of the training window in the y-axis direction from the edge strengths and edge angles; and
combining the orthogonality score in the x-axis direction and the orthogonality score in the y-axis direction into an overall orthogonality score, wherein the orthogonality score represents the balance index.
2. The reference image selection method according to claim 1, wherein the uniqueness index is calculated by:
traversing the training windows;
searching the training image using each training window as a preset reference image, with the number of returned matches set to 2, to obtain a first training window with the highest score and a second training window with the second-highest score; and
subtracting the score of the second training window from the score of the first training window to obtain a uniqueness score, wherein the uniqueness score represents the uniqueness index.
3. A reference image selection apparatus for positioning calculation, comprising:
a dividing module, configured to divide a training image into a plurality of training windows according to a preset rule;
a calculation module, configured to calculate evaluation indexes of each training window and obtain calculation results; and
a determining module, configured to perform a weighted comprehensive analysis on the calculation results and determine a training window as the reference image according to the analysis result and a preset threshold;
wherein the evaluation indexes include but are not limited to:
a redundancy index, a balance index, and a uniqueness index;
wherein the calculation module is further configured to:
mirror the training window and calculate a normalized correlation coefficient of the training window about the x axis and a normalized correlation coefficient of the training window about the y axis; and
combine the normalized correlation coefficient about the x axis and the normalized correlation coefficient about the y axis into a symmetry score, wherein the symmetry score represents the redundancy index;
wherein the calculation module is further configured to:
calculate an edge strength and an edge angle of each pixel in the training window;
obtain an orthogonality score of the training window in the x-axis direction and an orthogonality score of the training window in the y-axis direction from the edge strengths and edge angles; and
combine the orthogonality score in the x-axis direction and the orthogonality score in the y-axis direction into an overall orthogonality score, wherein the orthogonality score represents the balance index.
4. The reference image selection apparatus according to claim 3, wherein the calculation module is further configured to:
traverse the training windows;
search the training image using each training window as a preset reference image, with the number of returned matches set to 2, to obtain a first training window with the highest score and a second training window with the second-highest score; and
subtract the score of the second training window from the score of the first training window to obtain a uniqueness score, wherein the uniqueness score represents the uniqueness index.
5. An automatic driving system, characterized by comprising the reference image selection apparatus according to any one of claims 3 to 4.
CN201810823922.0A 2018-07-25 2018-07-25 Reference image selection method and device for positioning calculation and automatic driving system Active CN109242823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810823922.0A CN109242823B (en) 2018-07-25 2018-07-25 Reference image selection method and device for positioning calculation and automatic driving system


Publications (2)

Publication Number Publication Date
CN109242823A CN109242823A (en) 2019-01-18
CN109242823B true CN109242823B (en) 2022-03-18

Family

ID=65072378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810823922.0A Active CN109242823B (en) 2018-07-25 2018-07-25 Reference image selection method and device for positioning calculation and automatic driving system

Country Status (1)

Country Link
CN (1) CN109242823B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110632094B (en) * 2019-07-24 2022-04-19 北京中科慧眼科技有限公司 Pattern quality detection method, device and system based on point-by-point comparison analysis

Citations (2)

Publication number Priority date Publication date Assignee Title
EP0924656B2 (en) * 1997-12-22 2012-05-30 Northrop Grumman Corporation Personal identification FOB
CN106327462A (en) * 2015-06-16 2017-01-11 征图新视(江苏)科技有限公司 Printed image positioning core extraction method and extraction device

Family Cites Families (1)

Publication number Priority date Publication date Assignee Title
US20020085761A1 (en) * 2000-12-30 2002-07-04 Gary Cao Enhanced uniqueness for pattern recognition


Non-Patent Citations (1)

Title
Wang Yuheng, "Research on Automatic Recognition Algorithms for Positioning Cores in Surface Quality Inspection," China Master's Theses Full-text Database, Information Science and Technology Series, No. 6, 2017-06-15, abstract and pages 1-42. *


Similar Documents

Publication Publication Date Title
CN104574421B (en) Large-breadth small-overlapping-area high-precision multispectral image registration method and device
Liu et al. An improved online dimensional measurement method of large hot cylindrical forging
CN104697476B (en) Roughness light cuts the automatic testing method and device of contour curve
CN106920245B (en) Boundary detection method and device
CN108986126A (en) The center of circle detection method of RANSAC algorithm is detected and improved based on Gauss curve fitting sub-pixel edge
CN103292701A (en) Machine-vision-based online dimensional measurement method of precise instrument
CN110263795B (en) Target detection method based on implicit shape model and graph matching
Powell et al. Comparing curved-surface range image segmenters
CN113570605A (en) Defect detection method and system based on liquid crystal display panel
CN108550166B (en) Spatial target image matching method
CN112825190B (en) Precision evaluation method, system, electronic equipment and storage medium
CN115330791A (en) Part burr detection method
CN107341824B (en) Comprehensive evaluation index generation method for image registration
TWI765442B (en) Method for defect level determination and computer readable storage medium thereof
Flesia et al. Sub-pixel straight lines detection for measuring through machine vision
CN106529548A (en) Sub-pixel level multi-scale Harris corner point detection algorithm
CN116563279A (en) Measuring switch detection method based on computer vision
CN110766657B (en) Laser interference image quality evaluation method
CN109242823B (en) Reference image selection method and device for positioning calculation and automatic driving system
CN111582270A (en) Identification tracking method based on high-precision bridge region visual target feature points
CN108764343B (en) Method for positioning tracking target frame in tracking algorithm
CN114419140A (en) Positioning algorithm for light spot center of track laser measuring device
CN116385440B (en) Visual detection method for arc-shaped blade
Kruglov The algorithm of the roundwood volume measurement via photogrammetry
CN112233186A (en) Equipment air tightness detection camera self-calibration method based on image perception

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant