CN117011376A - Industrial part comprehensive positioning method and system based on edge contour and feature moment - Google Patents

Industrial part comprehensive positioning method and system based on edge contour and feature moment Download PDF

Info

Publication number
CN117011376A
CN117011376A (application CN202310826067.XA)
Authority
CN
China
Prior art keywords
contour
difference value
calculating
template
angle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310826067.XA
Other languages
Chinese (zh)
Inventor
齐文博
郑道勤
王建力
周逸飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Zhongke Rowing Ship Information Technology Co ltd
Original Assignee
Chongqing Zhongke Rowing Ship Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Zhongke Rowing Ship Information Technology Co ltd filed Critical Chongqing Zhongke Rowing Ship Information Technology Co ltd
Priority to CN202310826067.XA priority Critical patent/CN117011376A/en
Publication of CN117011376A publication Critical patent/CN117011376A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The application provides an industrial part comprehensive positioning method based on edge contours and feature moments. Fixed-value preprocessing of the image information reduces the running time of the subsequent algorithm and improves accuracy. Edge extraction is performed on the image information, which can accurately depict the scattered, non-overlapping placement of objects to be detected that is common in industrial environments and return results within seconds. The contours are screened to remove redundant meaningless points and repeated points with too small a spacing, which avoids subsequent interference and improves the speed and accuracy of the matching algorithm.

Description

Industrial part comprehensive positioning method and system based on edge contour and feature moment
Technical Field
The application relates to the technical field of industrial part positioning, in particular to an industrial part comprehensive positioning method and system based on edge contours and feature moments.
Background
A vision algorithm is an algorithm based on computer vision technology whose main purpose is to solve the various problems involved in image or video processing. Vision algorithms are a very important field with a very wide range of applications, such as image segmentation, object tracking, object recognition and face recognition. At present, the matching and positioning requirements of vision algorithms in industrial environments mainly come from production-line manufacturing and inspection: conditions such as shooting distance, illumination and the model to be positioned are basically kept consistent, most objects to be detected are normal parts while a small number have defects or stains, the detection requirement is mainly positioning, and results are generally required in real time.
However, existing image matching algorithms focus more on matching stability and accuracy under abnormal conditions, maintaining matching availability under deformation, scaling, illumination change, template image change and the like, conditions that basically do not occur in industrial environments. As a result they are time-consuming and do not meet actual production requirements.
Disclosure of Invention
Aiming at the defects in the prior art, the application provides an industrial part comprehensive positioning method and system based on edge contours and feature moments, so as to solve the technical problem that prior-art image matching algorithms focus on matching stability and accuracy under abnormal conditions such as deformation, scaling, illumination change and template image change that basically do not occur in industrial environments, and are therefore time-consuming and fail to meet actual production requirements.
The application provides an industrial part comprehensive positioning method based on edge contours and feature moments, which comprises the following steps:
S1, acquiring image information of an object to be detected through a camera device, and performing fixed-value preprocessing on the image information;
S2, performing edge extraction on the image information, and extracting the contour of the object to be detected from the image information;
S3, screening the extracted contours, and retaining the valid contours to be detected and the template contour;
S4, calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected;
S5, calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected; calculating a second angle difference value from the central moment matrix, and calculating the absolute error difference value between the first angle difference value and the second angle difference value; comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result;
and S6, outputting the optimal contour to be detected and the angle direction result as the final matching result.
Optionally, the screening of the extracted contours and retaining of the valid contours to be detected and the template contour includes:
first performing a first screening based on the L1 distance using the Teh-Chin chain approximation algorithm, then performing a second screening using the L2 distance, finally calculating the perimeter and the area of the template contour and of each contour to be detected, deriving an error value P from the perimeter and the area, removing contours whose shapes differ from the template contour according to the error value P, and retaining the valid contours to be detected and the template contour;
screening out repeated contours, and retaining the valid contours to be detected and the template contour;
wherein the L1 distance is the Manhattan distance, expressed as:
L1 = |x1 - x2| + |y1 - y2|
and the L2 distance is the Euclidean distance, expressed as:
L2 = √((x1 - x2)² + (y1 - y2)²)
the error value P is calculated as follows:
Optionally, the calculating of the central moment matrix of the image information includes:
calculating a third-order central moment matrix of the image information using the moments function, wherein ν_ji represents the normalized central moments in the third-order central moment matrix:
ν_ji = μ_ji / m_00^((i+j)/2+1)
m_ji represents the spatial moments in the third-order central moment matrix:
m_ji = Σ_x Σ_y (I(x, y) · x^j · y^i)
and μ_ji represents the central moments in the third-order central moment matrix:
μ_ji = Σ_x Σ_y (I(x, y) · (x - x̄)^j · (y - ȳ)^i), where x̄ = m_10/m_00 and ȳ = m_01/m_00 are the centroid coordinates.
Optionally, the calculating of the feature H moments of the contours to be detected and the template contour based on the central moment matrix includes:
the feature H moments are calculated as follows:
h[0] = η20 + η02
h[1] = (η20 - η02)² + 4η11²
h[2] = (η30 - 3η12)² + (3η21 - η03)²
h[3] = (η30 + η12)² + (η21 + η03)²
h[4] = (η30 - 3η12)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] + (3η21 - η03)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²]
h[5] = (η20 - η02)[(η30 + η12)² - (η21 + η03)²] + 4η11(η30 + η12)(η21 + η03)
h[6] = (3η21 - η03)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²].
Optionally, the calculating of the I1 distance between the feature H moments of all the contours to be detected and the template contour includes:
the I1 distance is calculated as follows:
I1(A, B) = Σ_{i=0..6} |1/H_i^A - 1/H_i^B|
where H_i^A and H_i^B are the log-transformed feature H moments of a contour A to be detected and the template contour B.
Optionally, the calculating of the first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected includes:
the edge point set of the template contour and the edge point set of the optimal contour to be detected are each reduced to a one-dimensional vector by a PCA algorithm, the PCA angles of the one-dimensional vectors are obtained, and the first angle difference value PA is calculated.
Optionally, the calculating of the second angle difference value from the central moment matrix and the calculating of the absolute error difference value between the first angle difference value and the second angle difference value include:
matrix angles are obtained from the central moment matrix, the second angle difference value PB is calculated from the matrix angles, and the absolute error difference value PC is calculated from the first angle difference value PA and the second angle difference value PB.
Optionally, the recalculating of the actual rotation angle as the angle direction result includes:
rotating the edge point set of the template contour by the first angle difference value PA and by the second angle difference value PB respectively, and calculating the error distances between the two rotated point sets and the edge point set of the optimal contour to be detected, wherein the error distances are calculated as follows:
the rotated point set with the smaller error distance is selected as the rotation result, and its rotation angle is the angle direction result.
The application also provides an industrial part comprehensive positioning system based on edge contours and feature moments, which comprises:
the preprocessing module is used for acquiring image information of an object to be detected through a camera device and performing fixed-value preprocessing on the image information;
the edge extraction module is used for performing edge extraction on the image information and extracting the contour of the object to be detected from the image information;
the contour screening module is used for screening the extracted contours and retaining the valid contours to be detected and the template contour;
the contour matching module is used for calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected;
the angle direction restoration module is used for calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected, calculating a second angle difference value from the central moment matrix, calculating the absolute error difference value between the first angle difference value and the second angle difference value, and comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result;
and the output module is used for outputting the optimal contour to be detected and the angle direction result as the final matching result.
Compared with the prior art, the application has the following beneficial effects:
1. Fixed-value preprocessing of the image information reduces the running time of the subsequent algorithm and improves accuracy.
2. Edge extraction of the image information can accurately depict the scattered, non-overlapping placement of objects to be detected that is common in industrial environments.
3. The contours are screened to remove redundant meaningless points and repeated points with too small a spacing, avoiding subsequent interference.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the application and together with the description, serve to explain the principles of the application.
In order to more clearly illustrate the embodiments of the application or the technical solutions of the prior art, the drawings which are used in the description of the embodiments or the prior art will be briefly described, and it will be obvious to a person skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic flow chart of the present application;
FIG. 2 is a schematic diagram of the image information before preprocessing;
FIG. 3 is a schematic diagram of the image information after constant value preprocessing in the present application;
FIG. 4 is a schematic diagram of edge extraction in the present application, taking a point with 0 on its left and 1 on its right as the starting point;
FIG. 5 is a schematic diagram of edge extraction counter-clockwise search and assignment in the present application;
FIG. 6 is a schematic diagram of assigning values according to an edge hierarchy before the edge extraction search is completed in the present application;
FIG. 7 is a schematic diagram of assigning values according to an edge hierarchy after the edge extraction search is completed in the present application;
FIG. 8 is a schematic view of the L1 distance and L2 distance in the present application;
FIG. 9 is a schematic representation of a characteristic H-moment logarithmic transformation under common variations of the present application;
FIG. 10 is a schematic diagram of the PCA dimension reduction concept of the present application;
FIG. 11 is a schematic view of PCA angle determination and rotation to 0 in the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the technical solutions of the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application; it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the application without inventive effort are intended to be within the scope of the application. Functional units with the same reference numerals in the embodiments of the present application have the same or similar structures and functions.
Referring to fig. 1, the application provides an industrial part comprehensive positioning method based on edge contours and feature moments, which comprises the following steps:
S1, acquiring image information of an object to be detected through a camera device, and performing fixed-value preprocessing on the image information;
S2, performing edge extraction on the image information, and extracting the contour of the object to be detected from the image information;
S3, screening the extracted contours, and retaining the valid contours to be detected and the template contour;
S4, calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected;
S5, calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected; calculating a second angle difference value from the central moment matrix, and calculating the absolute error difference value between the first angle difference value and the second angle difference value; comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result;
and S6, outputting the optimal contour to be detected and the angle direction result as the final matching result.
In this embodiment, in S1, image information of the object to be detected is acquired through a camera device, and fixed-value preprocessing is performed on the image information.
Referring to fig. 2 and 3, because shooting in an industrial environment is stable, the method provided by the application additionally performs fixed-value preprocessing on the image compared with a general algorithm. The flow includes common operations such as grayscale conversion, dilation, erosion, image pyramid down-sampling and contrast adjustment, which reduces the running time of the subsequent algorithm and improves accuracy.
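A minimal sketch of such a fixed-value preprocessing chain, assuming an OpenCV/Python implementation; the kernel size, pyramid level and contrast gain/bias below are illustrative placeholders, not values specified by the application:

```python
import cv2

def fixed_value_preprocess(img_bgr):
    """Fixed-value preprocessing sketch: grayscale conversion, dilation,
    erosion, one image-pyramid level and a constant contrast adjustment.
    All numeric parameters are assumed constants, not values from the text."""
    gray = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2GRAY)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    dilated = cv2.dilate(gray, kernel)
    eroded = cv2.erode(dilated, kernel)
    reduced = cv2.pyrDown(eroded)                                 # image pyramid, one level
    adjusted = cv2.convertScaleAbs(reduced, alpha=1.2, beta=-10)  # fixed contrast gain/bias
    return adjusted
```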
S2, performing edge extraction on the image information, and extracting the contour of the object to be detected from the image information.
Referring to fig. 4-7, the whole image is traversed from left to right and from top to bottom starting from the top left corner of the picture. The first point found during the traversal that has a 0 on its left and a 1 on its right is set as the outer-edge starting point of the object (a left 1 and right 0 marks an inner edge); from this point, the outer edge formed by all adjacent points is traced in counterclockwise order using a boundary search algorithm. The essence of the boundary search is to check the continuity of the neighbors: starting from the starting point, it iteratively checks whether any of the 8 neighbors of the current point is 1. Since the boundary is tracked in the counterclockwise direction, the query of whether the surrounding neighbors contain a 1 is also made counterclockwise. The neighbors are searched in this recurring way, the previous center point becomes the new starting point, and each newly found neighbor becomes the new center point. This process loops until the search returns to the boundary starting point or no new neighbor exists. The set of points formed by the complete start-center-neighbor sequence determines a continuous boundary. In this way, the rough contours and positions of the objects in the template picture and the picture to be detected are extracted, depicting the distribution of scattered, non-overlapping parts that is common in industrial environments.
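The border-following procedure described above corresponds closely to the contour tracing performed by OpenCV's findContours; a minimal sketch under that assumption:

```python
import cv2

def extract_contours(binary_img):
    """Contour extraction sketch.  cv2.findContours traces outer and inner
    borders in the same start-point / counterclockwise-neighbor fashion
    described above; RETR_CCOMP keeps the outer/inner edge hierarchy and
    CHAIN_APPROX_NONE keeps every boundary point for later screening."""
    # OpenCV 4.x return signature (OpenCV 3.x also returns the modified image)
    contours, hierarchy = cv2.findContours(
        binary_img, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_NONE)
    return contours, hierarchy
```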
S3, screening the extracted contours, and retaining the valid contours to be detected and the template contour.
Referring to fig. 8, the extracted contours often suffer from partial overlap, incompleteness, abnormal point sets and similar problems caused by real-world interference. Therefore, the points are first evaluated and screened on the L1 distance by the Teh-Chin chain approximation algorithm, removing redundant meaningless points while keeping the more characteristic points of the set; a second screening is then performed with the L2 distance, traversing the points in the contour, computing their relative L2 distances, and removing repeated points and meaningless points whose spacing is too small. After the points within a contour have been screened, the distances between its contour points and the edge points of the other outer contours are calculated and repeated contours are screened out. Finally, the perimeter and the area of the template contour and of each contour to be detected are calculated, an error value P is derived from these two values with the template contour as reference, and contours whose shapes clearly differ from the template contour (such as halation points, stains, unclosed linear edges and inner edges that are not valid features) are removed according to P. The contours that remain after screening are the valid contours to be detected (a sketch of this screening step is given after the formulas below).
Wherein the L1 distance is the Manhattan distance, expressed as:
L1 = |x1 - x2| + |y1 - y2|
and the L2 distance is the Euclidean distance, expressed as:
L2 = √((x1 - x2)² + (y1 - y2)²)
The error value P is calculated as follows:
GS and GC are weight values preset according to the shape of the template.
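A minimal sketch of this screening step, assuming OpenCV/Python. The Teh-Chin L1 pass is normally obtained by extracting contours with cv2.CHAIN_APPROX_TC89_L1; the sketch below covers the L2 de-duplication and the perimeter/area error P, whose exact form and all thresholds are assumptions for illustration only:

```python
import cv2
import numpy as np

def screen_contours(contours, tpl_contour, gs=1.0, gc=1.0,
                    min_point_gap=2.0, p_max=0.3):
    """Contour screening sketch.  Drops near-duplicate points using the L2
    distance, then removes contours whose GS/GC-weighted perimeter/area
    error P against the template is too large.  P's form, min_point_gap
    and p_max are assumed values, not taken from the application."""
    tpl_len = cv2.arcLength(tpl_contour, True)
    tpl_area = cv2.contourArea(tpl_contour)
    kept = []
    for c in contours:
        pts = c.reshape(-1, 2).astype(np.float32)
        filtered = [pts[0]]
        for p in pts[1:]:
            if np.linalg.norm(p - filtered[-1]) >= min_point_gap:  # L2 de-duplication
                filtered.append(p)
        if len(filtered) < 3:
            continue
        c2 = np.asarray(filtered, dtype=np.float32).reshape(-1, 1, 2)
        length = cv2.arcLength(c2, True)
        area = cv2.contourArea(c2)
        # assumed error measure: weighted relative perimeter and area deviation
        p_err = (gs * abs(length - tpl_len) / max(tpl_len, 1e-6)
                 + gc * abs(area - tpl_area) / max(tpl_area, 1e-6))
        if p_err < p_max:
            kept.append(c2)
    return kept
```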
S4, calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected.
Referring to fig. 9, the image is regarded as a numerical matrix, and the existing moments function is used to obtain its third-order central moment matrix (24 values). For a numerical matrix these moments describe numerical features such as symmetry, mean and gradient change, whereas for an image the central moment matrix describes various geometric features, such as size, centroid, moment of inertia, gray scale and direction.
Here ν_ji represents the normalized central moments in the third-order central moment matrix:
ν_ji = μ_ji / m_00^((i+j)/2+1)
m_ji represents the spatial moments in the third-order central moment matrix:
m_ji = Σ_x Σ_y (I(x, y) · x^j · y^i)
and μ_ji represents the central moments in the third-order central moment matrix:
μ_ji = Σ_x Σ_y (I(x, y) · (x - x̄)^j · (y - ȳ)^i), where x̄ = m_10/m_00 and ȳ = m_01/m_00 are the centroid coordinates.
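Assuming the moments function refers to OpenCV's cv2.moments, a minimal sketch of obtaining the 24 moment values for a contour:

```python
import cv2

def central_moment_matrix(contour):
    """Moment sketch: cv2.moments returns the spatial moments (m00..m03),
    central moments (mu20..mu03) and normalized central moments
    (nu20..nu03) up to third order, 24 values in total."""
    m = cv2.moments(contour)
    # examples: m['m00'] is the area, m['mu11'] a central moment,
    # m['nu21'] a normalized central moment
    return m
```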
Because the numerical ranges of the individual dimensions differ too much to be compared directly, the values are first brought into the same range by a logarithmic transformation:
H_i = -sign(h_i) · log|h_i|
The feature h moments of the template contour and of all contours to be detected are then calculated. The feature h moments are based on the central moment matrix; the parameters relevant to object matching are extracted, weighted and combined into 7 dimensions, calculated as follows:
h[0] = η20 + η02
h[1] = (η20 - η02)² + 4η11²
h[2] = (η30 - 3η12)² + (3η21 - η03)²
h[3] = (η30 + η12)² + (η21 + η03)²
h[4] = (η30 - 3η12)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] + (3η21 - η03)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²]
h[5] = (η20 - η02)[(η30 + η12)² - (η21 + η03)²] + 4η11(η30 + η12)(η21 + η03)
h[6] = (3η21 - η03)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²].
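A minimal sketch of computing the 7-dimensional feature h moments and applying the logarithmic transformation, assuming OpenCV's HuMoments and base-10 logarithms (the base of the logarithm is not stated in the text):

```python
import cv2
import numpy as np

def log_hu_moments(contour):
    """Feature-moment sketch: Hu moments h[0]..h[6] of a contour followed
    by the transformation H_i = -sign(h_i) * log|h_i| to bring all seven
    dimensions into a comparable numeric range."""
    hu = cv2.HuMoments(cv2.moments(contour)).flatten()
    hu = np.where(hu == 0, 1e-30, hu)              # guard against log(0)
    return -np.sign(hu) * np.log10(np.abs(hu))
```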
The feature h moments of a contour A to be detected are compared with the feature h moments of the template contour B by calculating the I1 distance between the h moments of the two edges; a smaller I1 distance indicates a closer shape and thus a higher correlation between the contour to be detected and the template contour.
The I1 distance is calculated as follows:
I1(A, B) = Σ_{i=0..6} |1/H_i^A - 1/H_i^B|
where H_i^A and H_i^B are the log-transformed feature h moments of contours A and B.
The above steps are repeated to complete the comparison for all contours to be detected; the contour with the highest correlation (smallest I1 distance) is the matching result, and its center point is the matching center point. The matching compares the feature h moments of selected edge features rather than the whole image or the full edge point set, and the prior numerical processing of the image shortens the time consumed by the matching stage.
S5, calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected; calculating a second angle difference value from the central moment matrix, and calculating the absolute error difference value between the first angle difference value and the second angle difference value; comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result.
Because the central moment matrix mainly describes numerical characteristics, part of the angle information of the actual object is lost. After the matching center point has been obtained by the above method, the direction angle therefore needs to be restored through the PCA angle.
Referring to fig. 10, the edge point set of the template contour and the edge point set of the optimal contour to be detected are each reduced to a one-dimensional vector by a PCA algorithm, the PCA angles of the one-dimensional vectors are obtained, and the first angle difference value PA is calculated; matrix angles are obtained from the central moment matrix, the second angle difference value PB is calculated from the matrix angles, and the absolute error difference value PC is calculated from the first angle difference value PA and the second angle difference value PB.
Referring to fig. 11, the absolute error difference value is compared with a preset accuracy judgment value PP (set according to the angle accuracy and speed requirements of the actual environment). If the absolute error difference value PC is smaller than the preset precision value PP, the angle characteristics of the object are considered obvious and the angle loss of the matrix operation is within an acceptable range, so the average of the first angle difference value PA and the second angle difference value PB is output as the angle direction result; otherwise, the angle characteristics of the object are considered less obvious, the angle loss of the matrix operation exceeds the requirement, and the actual rotation angle needs to be recalculated.
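A minimal sketch of this angle estimation and decision step; here the PCA angle is taken from the first principal component of the edge points, the matrix angle from the second-order central moments, and the threshold pp is an assumed value:

```python
import cv2
import numpy as np

def pca_angle(points):
    """PCA angle sketch: direction (degrees) of the first principal
    component of a 2-D contour point set."""
    pts = points.reshape(-1, 2).astype(np.float64)
    pts -= pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(pts, rowvar=False))
    major = eigvecs[:, np.argmax(eigvals)]
    return np.degrees(np.arctan2(major[1], major[0]))

def moment_angle(contour):
    """Matrix-angle sketch: orientation from second-order central moments."""
    m = cv2.moments(contour)
    return np.degrees(0.5 * np.arctan2(2 * m['mu11'], m['mu20'] - m['mu02']))

def angle_direction(tpl_contour, best_contour, pp=2.0):
    """Decision sketch: PA from PCA angles, PB from matrix angles,
    PC = |PA - PB|; average PA and PB when PC is below the assumed
    precision threshold pp (degrees), otherwise fall back to the
    rotation check shown further below."""
    pa = pca_angle(best_contour) - pca_angle(tpl_contour)
    pb = moment_angle(best_contour) - moment_angle(tpl_contour)
    pc = abs(pa - pb)
    if pc < pp:
        return 0.5 * (pa + pb)
    return None
```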
The edge point set of the template contour is rotated by the first angle difference value PA and by the second angle difference value PB respectively, and the error distances between the two rotated point sets and the edge point set of the optimal contour to be detected are calculated, wherein the error distances are calculated as follows:
and selecting a rotating point set with smaller error distance as a rotating result, wherein the rotating angle (PA or PB) of the rotating point set is an angle direction result.
And S6, outputting the optimal contour to be detected and the angle direction result as final matching results.
According to the application, fixed-value preprocessing of the image information reduces the running time of the subsequent algorithm and improves accuracy; edge extraction of the image information can accurately depict the scattered, non-overlapping placement of objects to be detected that is common in industrial environments and return results within seconds; and the contours are screened to remove redundant meaningless points and repeated points with too small a spacing, avoiding subsequent interference and improving the speed and accuracy of the matching algorithm.
The application also provides an industrial part comprehensive positioning system based on edge contours and feature moments, which comprises:
the preprocessing module is used for acquiring image information of an object to be detected through a camera device and performing fixed-value preprocessing on the image information;
the edge extraction module is used for performing edge extraction on the image information and extracting the contour of the object to be detected from the image information;
the contour screening module is used for screening the extracted contours and retaining the valid contours to be detected and the template contour;
the contour matching module is used for calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected;
the angle direction restoration module is used for calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected, calculating a second angle difference value from the central moment matrix, calculating the absolute error difference value between the first angle difference value and the second angle difference value, and comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result;
and the output module is used for outputting the optimal contour to be detected and the angle direction result as the final matching result.
It should be noted that in this document, relational terms such as "first" and "second" and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The foregoing is only a specific embodiment of the application to enable those skilled in the art to understand or practice the application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (10)

1. An industrial part comprehensive positioning method based on edge contours and feature moments, characterized by comprising the following steps:
S1, acquiring image information of an object to be detected through a camera device, and performing fixed-value preprocessing on the image information;
S2, performing edge extraction on the image information, and extracting the contour of the object to be detected from the image information;
S3, screening the extracted contours, and retaining the valid contours to be detected and the template contour;
S4, calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected;
S5, calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected; calculating a second angle difference value from the central moment matrix, and calculating the absolute error difference value between the first angle difference value and the second angle difference value; comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result;
and S6, outputting the optimal contour to be detected and the angle direction result as the final matching result.
2. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 1, wherein the screening of the extracted contours and retaining of the valid contours to be detected and the template contour comprises:
first performing a first screening based on the L1 distance using the Teh-Chin chain approximation algorithm, then performing a second screening using the L2 distance, finally calculating the perimeter and the area of the template contour and of each contour to be detected, deriving an error value P from the perimeter and the area, removing contours whose shapes differ from the template contour according to the error value P, and retaining the valid contours to be detected and the template contour;
screening out repeated contours, and retaining the valid contours to be detected and the template contour;
wherein the L1 distance is the Manhattan distance, expressed as:
L1 = |x1 - x2| + |y1 - y2|
and the L2 distance is the Euclidean distance, expressed as:
L2 = √((x1 - x2)² + (y1 - y2)²)
the error value P is calculated as follows:
3. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 2, wherein the calculating of the central moment matrix of the image information comprises:
calculating a third-order central moment matrix of the image information using the moments function, wherein ν_ji represents the normalized central moments in the third-order central moment matrix:
ν_ji = μ_ji / m_00^((i+j)/2+1)
m_ji represents the spatial moments in the third-order central moment matrix:
m_ji = Σ_x Σ_y (I(x, y) · x^j · y^i)
and μ_ji represents the central moments in the third-order central moment matrix:
μ_ji = Σ_x Σ_y (I(x, y) · (x - x̄)^j · (y - ȳ)^i), where x̄ = m_10/m_00 and ȳ = m_01/m_00 are the centroid coordinates.
4. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 3, wherein the calculating of the feature H moments of the contours to be detected and the template contour based on the central moment matrix comprises:
the feature H moments are calculated as follows:
h[0] = η20 + η02
h[1] = (η20 - η02)² + 4η11²
h[2] = (η30 - 3η12)² + (3η21 - η03)²
h[3] = (η30 + η12)² + (η21 + η03)²
h[4] = (η30 - 3η12)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] + (3η21 - η03)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²]
h[5] = (η20 - η02)[(η30 + η12)² - (η21 + η03)²] + 4η11(η30 + η12)(η21 + η03)
h[6] = (3η21 - η03)(η30 + η12)[(η30 + η12)² - 3(η21 + η03)²] - (η30 - 3η12)(η21 + η03)[3(η30 + η12)² - (η21 + η03)²].
5. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 4, wherein the applying of a logarithmic transformation to the feature H moments comprises:
the logarithmic transformation is calculated as follows:
H_i = -sign(h_i) · log|h_i|.
6. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 5, wherein the calculating of the I1 distance between the feature H moments of all the contours to be detected and the template contour comprises:
the I1 distance is calculated as follows:
I1(A, B) = Σ_{i=0..6} |1/H_i^A - 1/H_i^B|
where H_i^A and H_i^B are the log-transformed feature H moments of a contour A to be detected and the template contour B.
7. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 6, wherein the calculating of the first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected comprises:
the edge point set of the template contour and the edge point set of the optimal contour to be detected are each reduced to a one-dimensional vector by a PCA algorithm, the PCA angles of the one-dimensional vectors are obtained, and the first angle difference value PA is calculated.
8. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 7, wherein the calculating of the second angle difference value from the central moment matrix and the calculating of the absolute error difference value between the first angle difference value and the second angle difference value comprise:
matrix angles are obtained from the central moment matrix, the second angle difference value PB is calculated from the matrix angles, and the absolute error difference value PC is calculated from the first angle difference value PA and the second angle difference value PB.
9. The industrial part comprehensive positioning method based on edge contours and feature moments according to claim 8, wherein the recalculating of the actual rotation angle as the angle direction result comprises:
rotating the edge point set of the template contour by the first angle difference value PA and by the second angle difference value PB respectively, and calculating the error distances between the two rotated point sets and the edge point set of the optimal contour to be detected, wherein the error distances are calculated as follows:
the rotated point set with the smaller error distance is selected as the rotation result, and its rotation angle is the angle direction result.
10. An industrial part comprehensive positioning system based on edge contours and feature moments, characterized by comprising:
the preprocessing module is used for acquiring image information of an object to be detected through a camera device and performing fixed-value preprocessing on the image information;
the edge extraction module is used for performing edge extraction on the image information and extracting the contour of the object to be detected from the image information;
the contour screening module is used for screening the extracted contours and retaining the valid contours to be detected and the template contour;
the contour matching module is used for calculating a central moment matrix of the image information, calculating the feature H moments of the contours to be detected and of the template contour based on the central moment matrix, applying a logarithmic transformation to the feature H moments, then calculating the I1 distance between the feature H moments of each contour to be detected and those of the template contour, and selecting the contour to be detected with the smallest I1 distance as the optimal contour to be detected;
the angle direction restoration module is used for calculating a first angle difference value between the edge point set of the template contour and the edge point set of the optimal contour to be detected, calculating a second angle difference value from the central moment matrix, calculating the absolute error difference value between the first angle difference value and the second angle difference value, and comparing the absolute error difference value with a preset precision value: if the absolute error difference value is smaller than the preset precision value, outputting the average of the first angle difference value and the second angle difference value as the angle direction result, otherwise recalculating the actual rotation angle as the angle direction result;
and the output module is used for outputting the optimal contour to be detected and the angle direction result as the final matching result.
CN202310826067.XA 2023-07-07 2023-07-07 Industrial part comprehensive positioning method and system based on edge contour and feature moment Pending CN117011376A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310826067.XA CN117011376A (en) 2023-07-07 2023-07-07 Industrial part comprehensive positioning method and system based on edge contour and feature moment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310826067.XA CN117011376A (en) 2023-07-07 2023-07-07 Industrial part comprehensive positioning method and system based on edge contour and feature moment

Publications (1)

Publication Number Publication Date
CN117011376A true CN117011376A (en) 2023-11-07

Family

ID=88562848

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310826067.XA Pending CN117011376A (en) 2023-07-07 2023-07-07 Industrial part comprehensive positioning method and system based on edge contour and feature moment

Country Status (1)

Country Link
CN (1) CN117011376A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117576088A (en) * 2024-01-15 2024-02-20 平方和(北京)科技有限公司 Intelligent liquid impurity filtering visual detection method and device
CN117576088B (en) * 2024-01-15 2024-04-05 平方和(北京)科技有限公司 Intelligent liquid impurity filtering visual detection method and device


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination