CN117522784A - Gear part image detection method and system based on tooth distance segmentation compensation - Google Patents
Gear part image detection method and system based on tooth distance segmentation compensation
- Publication number
- CN117522784A CN117522784A CN202311373451.5A CN202311373451A CN117522784A CN 117522784 A CN117522784 A CN 117522784A CN 202311373451 A CN202311373451 A CN 202311373451A CN 117522784 A CN117522784 A CN 117522784A
- Authority
- CN
- China
- Prior art keywords
- tooth
- far
- detected
- tooth surface
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
- 238000001514 detection method Methods 0.000 title claims abstract description 57
- 230000011218 segmentation Effects 0.000 title claims abstract description 27
- 238000012937 correction Methods 0.000 claims abstract description 41
- 238000005259 measurement Methods 0.000 claims abstract description 40
- 238000000034 method Methods 0.000 claims abstract description 31
- 238000013528 artificial neural network Methods 0.000 claims abstract description 18
- 238000005070 sampling Methods 0.000 claims description 18
- 238000004364 calculation method Methods 0.000 claims description 16
- 238000001914 filtration Methods 0.000 claims description 5
- 238000003709 image segmentation Methods 0.000 claims description 3
- 230000000007 visual effect Effects 0.000 description 6
- 238000012545 processing Methods 0.000 description 4
- 238000003384 imaging method Methods 0.000 description 3
- 238000000691 measurement method Methods 0.000 description 3
- 238000003672 processing method Methods 0.000 description 2
- 230000009286 beneficial effect Effects 0.000 description 1
- 230000007547 defect Effects 0.000 description 1
- 238000010586 diagram Methods 0.000 description 1
- 238000003708 edge detection Methods 0.000 description 1
- 230000000694 effects Effects 0.000 description 1
- 238000000605 extraction Methods 0.000 description 1
- 238000003702 image correction Methods 0.000 description 1
- 238000004519 manufacturing process Methods 0.000 description 1
- 238000013507 mapping Methods 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/80—Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10004—Still image; Photographic image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20081—Training; Learning
Abstract
The invention relates to a gear part image detection method and system based on tooth distance segmentation compensation. A preset segmentation neural network segments and identifies the region to be detected, dividing the projection image into a far region, a middle region and a near region, and the thickness values obtained by image matching are corrected with the calibration coefficient of each region to yield the measured values used to inspect the gear part. By introducing a neural network to distinguish far teeth from near teeth and assign corresponding calibration coefficients, tooth surfaces at different distances can be inspected from a single captured image of the gear part, effectively improving the detection accuracy of gear part images. The method is particularly suited to solving the problem of inaccurate measurement caused by the size of the inspected gear part exceeding the effective range of the lens and the depth of field being insufficient.
Description
Technical Field
The invention relates to the technical fields of mechanical part measurement, visual image measurement and image correction, and in particular to a gear part image detection method and system based on tooth distance segmentation compensation.
Background
Machine vision measurement for inspecting the dimensions of mechanical parts offers low cost, high precision, high efficiency and convenient operation. In particular, compared with traditional contact methods such as micrometers, vernier calipers and feeler gauges, it greatly increases detection speed and is better suited to the high-frequency inspection demands of large-scale automated production.
Visual dimension measurement of mechanical parts generally comprises the steps of image acquisition, image processing, feature extraction, dimension calculation and result output. Current visual dimension measurement is mainly based on the projection method: applying the edge-detection principle, the object is back-lit in front of the camera so that it projects a sharp black-and-white silhouette, the vision system detects the edges between the bright and dark parts of the image, and a series of processing steps then yields a high-precision measurement of the part's dimensions. Consequently, when this approach is used for visual dimension measurement, it places high demands on the size of the measured part, the parallelism of the backlight, the depth of field over the detected area, and so on.
Because measuring mechanical parts often involves detecting multiple measurement points on different object planes in three dimensions, a telecentric lens is generally used to acquire the inspection images, ensuring that the magnification does not change, overcoming parallax and eliminating image distortion.
In the specific scenario of measuring the thickness of a gear part, the inspection must determine the thickness at the highest point of each tooth surface, a typical task of multiple measurement points on different object planes. In actual detection, when the diameter of the inspected gear part is large (for example, a gear 50 mm in diameter), the depth range of the inspected teeth exceeds the effective depth-of-field range of the telecentric lens, so front tooth surfaces closer to the lens appear noticeably thicker in the acquired image while rear tooth surfaces farther from the lens appear thinner; accurately measuring the thickness of all tooth surfaces at once from a single static photograph of the gear is therefore impossible.
The existing solution abandons single-image detection: each acquired image measures only the thickness of the front tooth surfaces close to the lens, and a rotating station presents each tooth surface of the gear in turn. This keeps the distance between every inspected tooth surface and the lens consistent, avoiding depth-of-field error, but greatly reduces detection efficiency.
On the other hand, even a high-performance telecentric lens retains a degree of distortion (typically within 0.1%). Although this is far more accurate than a conventional industrial lens, it can still produce pixel-level errors in the acquired images, so part inspections demanding very high accuracy still require auxiliary corrections such as calibration patterns.
In view of the above problems, the prior art offers algorithm-based improvements. For example, patent application CN101477685A, entitled "sub-pixel level image detection method with depth of field part processing quality", discloses a method of part edge positioning and layered calibration to improve detection accuracy when multiple detection points have different depths of field. However, that approach can only compensate for parts with a fixed shape and a fixed detection angle. For parts whose detection point positions are not fixed (for example, rotation of the gear during tooth surface thickness detection causes large differences between projection images), or whose edges have a complex structure (for example, a chamfered gear edge forming a curve or an oblique line), the exact edge position cannot be determined and layered-calibration visual measurement compensation cannot be realised.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides a gear part image detection method and system based on tooth distance segmentation compensation. By introducing a neural network to distinguish far teeth from near teeth and assign corresponding calibration coefficients, tooth surfaces at different distances can be inspected from a single captured image of the gear part, effectively improving the detection accuracy of gear part images. The method is particularly suited to solving the problem of inaccurate measurement caused by the size of the inspected gear part exceeding the effective range of the lens and the depth of field being insufficient.
In order to achieve the above object, the present invention adopts the technical scheme that:
the gear piece image detection method based on tooth distance segmentation compensation is characterized by comprising the following steps of:
s1, acquiring a projection image of a gear piece to be detected, and respectively extracting a first reference area, a second reference area and the area to be detected from the projection image according to a preset judging area;
s2, performing segmentation recognition operation on the region to be detected by using a preset segmentation neural network to obtain a far region, a middle region and a near region which are far to near relative to the shooting lens;
S3, respectively performing a tooth surface recognition operation on the far region, the middle region and the near region using a preset recognition neural network, generating a far tooth set, a middle tooth set and a near tooth set, each comprising a plurality of tooth surfaces to be detected;
S4, establishing a datum line using the first reference region and the second reference region;
S5, matching the tooth surfaces to be detected in the far tooth set, the middle tooth set and the near tooth set against the datum line respectively to obtain a far image thickness set, a middle image thickness set and a near image thickness set;
S6, obtaining a far calibration coefficient k_f, a middle calibration coefficient k_m and a near calibration coefficient k_n corresponding to the far region, the middle region and the near region respectively;
S7, correcting the far calibration coefficient k_f and the near calibration coefficient k_n according to the abscissa values of the tooth surfaces to be detected in the far tooth set and the near tooth set relative to the datum line, generating a far correction coefficient k_ft set and a near correction coefficient k_nt set with one coefficient per tooth surface to be detected;
S8, calculating a far tooth surface thickness measurement set by matching the far image thickness set with the far correction coefficient k_ft set, a middle tooth surface thickness measurement set by matching the middle image thickness set with the middle calibration coefficient k_m, and a near tooth surface thickness measurement set by matching the near image thickness set with the near correction coefficient k_nt set;
S9, matching the far, middle and near tooth surface thickness measurement sets against the standard value of the gear part to be detected to judge whether the gear part is qualified.
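As an illustration of the final qualification step S9, the following Python sketch compares the corrected thickness measurements against the design standard. The tolerance parameter and the pass/fail rule are assumptions, since the text does not specify acceptance limits.

```python
def is_gear_qualified(far, middle, near, standard, tolerance=0.05):
    """Step S9 (sketch): the gear passes when every corrected tooth-surface
    thickness in the far, middle and near measurement sets lies within
    `tolerance` of the standard value. The tolerance is a hypothetical
    parameter, not taken from the text."""
    return all(abs(t - standard) <= tolerance
               for group in (far, middle, near) for t in group)

# Example: standard thickness 2.00 mm, all teeth within +/-0.05 mm.
print(is_gear_qualified([2.01, 1.98], [2.00], [2.02], standard=2.00))  # True
```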
Further, the step S3 further includes:
judging whether the total number of tooth surfaces to be detected contained in the far tooth set, the middle tooth set and the near tooth set is equal to the number of tooth surfaces designed for the gear piece to be detected;
when the totals are equal, accepting the generated far tooth set, middle tooth set and near tooth set;
when the totals are unequal, rejecting the generated far tooth set, middle tooth set and near tooth set and executing step S2 again.
Further, the step S7 includes performing the following sub-steps:
S71, calculating the difference between the far calibration coefficient k_f and the near calibration coefficient k_n to generate a coefficient difference k_d;
S72, calculating the mean of the abscissa values, relative to the datum line, of the centre points of the two tooth surfaces to be detected that are farthest apart to generate a reference midpoint coordinate x_middle, and calculating the distance between those two abscissa values to generate a reference distance dis_x;
S73, according to the abscissa value x_tooth of the centre point of each tooth surface to be detected in the far tooth set and the near tooth set relative to the datum line, calculating a correction value a for each tooth surface using formula (1);
S74, calculating the far correction coefficient k_ft and the near correction coefficient k_nt of each tooth surface to be detected from the correction value a using formula (2);
Further, the projection image of the gear part to be detected is focused on the central axis region of the gear's extent.
Further, extracting the first reference area, the second reference area and the area to be detected from the projection image according to the preset determination area includes:
respectively selecting, at the preset reference surface positions on both sides of the gear part to be detected, regions 100 pixels long and 30 pixels high to generate the first reference region and the second reference region;
and selecting a region 50 pixels high at the tooth surface position of the gear part, long enough to cover all tooth surfaces to be detected, to generate the region to be detected.
Further, the step S1 further includes:
and respectively executing filtering operation on the first reference area, the second reference area and the area to be detected.
Further, performing the tooth surface recognition operation includes:
judging whether the edge length at the tooth surface position is at least 50 pixels, and discarding the tooth surface position when it is less than 50 pixels;
when the edge length is at least 50 pixels, further judging whether the slope of the fitted tooth surface line is less than 0.02, and discarding the tooth surface position when it is not;
when the slope of the fitted tooth surface line is less than 0.02, keeping the tooth surface position as a tooth surface to be detected.
Further, the step S4 includes:
s41, respectively calculating gray histograms of plane edges of a first reference area and a second reference area, and sampling the edge contour according to preset sampling precision to obtain a first sampling point set and a second sampling point set;
s42, obtaining a first intersection point set and a second intersection point set according to the first derivative extremum positions of the first sampling point set and the second sampling point set as intersection points;
s43, fitting and generating a first plane edge straight line and a second plane edge straight line by using the first intersection point set and the second intersection point set respectively;
S44, calculating the datum line from the first plane edge line and the second plane edge line using formula (3);
wherein (x_1, y_1) is the first reference point of the datum line, (x_2, y_2) is the second reference point of the datum line, (x_L1, y_L1) and (x_L2, y_L2) are the first and second end points of the first plane edge line, and (x_R1, y_R1) and (x_R2, y_R2) are the first and second end points of the second plane edge line.
The invention also relates to a gear piece image detection system based on tooth distance segmentation compensation, which is characterized by comprising:
the image acquisition module is used for acquiring a projection image of the gear part to be detected and extracting a first reference region, a second reference region and a region to be detected from the projection image according to preset determination regions;
the image segmentation module is used for carrying out segmentation recognition operation on the region to be detected by using a preset segmentation neural network to obtain a far region, a middle region and a near region which are far to near relative to the shooting lens;
the tooth surface recognition module is used for performing a tooth surface recognition operation on the far region, the middle region and the near region respectively using a preset recognition neural network, generating a far tooth set, a middle tooth set and a near tooth set, each comprising a plurality of tooth surfaces to be detected;
the datum line module is used for establishing a datum line from the first reference region and the second reference region;
the thickness recognition module is used for matching the tooth surfaces to be detected in the far tooth set, the middle tooth set and the near tooth set against the datum line respectively to obtain a far image thickness set, a middle image thickness set and a near image thickness set;
the coefficient correction module is used for correcting the far calibration coefficient k_f and the near calibration coefficient k_n according to the abscissa values of the tooth surfaces to be detected in the far tooth set and the near tooth set relative to the datum line, generating a far correction coefficient k_ft set and a near correction coefficient k_nt set corresponding to each tooth surface to be detected;
the thickness calculation module is used for calculating a far tooth surface thickness measurement set by matching the far image thickness set with the far correction coefficient k_ft set, a middle tooth surface thickness measurement set by matching the middle image thickness set with the middle calibration coefficient k_m, and a near tooth surface thickness measurement set by matching the near image thickness set with the near correction coefficient k_nt set;
and the image detection module is used for judging whether the gear part to be detected is qualified by matching the far, middle and near tooth surface thickness measurement sets against the standard value of the gear part.
The beneficial effects of the invention are as follows:
by adopting the gear piece image detection method and system based on tooth distance segmentation compensation, provided by the invention, the neural network is introduced to distinguish the far teeth and the near teeth to obtain the corresponding calibration coefficients, so that the targeted detection of tooth surfaces with different distances under the condition of single image acquisition of the gear piece is realized. Compared with the existing processing method, the method can effectively improve the detection repetition precision under the single image measurement scene, so that all tooth heights of the gear are photographed at one time, and meanwhile, the detection precision and the detection efficiency of tooth thickness detection are greatly improved. The method is particularly suitable for solving the technical problem of inaccurate measurement caused by insufficient depth of field because the size of the measured gear piece exceeds the effective range of the lens.
Drawings
Fig. 1 is a schematic flow chart of a gear piece image detection method based on tooth distance segmentation compensation.
Fig. 2 is a schematic diagram of a gear member image detection system based on tooth distance segmentation compensation according to the present invention.
Fig. 3 is a schematic view of a projected image of a gear member to be inspected according to the present invention.
Detailed Description
For a clearer understanding of the present invention, reference will be made to the following detailed description taken in conjunction with the accompanying drawings and examples.
The first aspect of the present invention relates to a gear member image detection method based on tooth distance segmentation compensation, which is shown in fig. 1, and includes:
s1, acquiring a projection image of a gear piece to be detected, and respectively extracting a first reference area, a second reference area and the area to be detected from the projection image according to a preset judging area.
Preferably, the projection image of the gear part to be detected is focused on the central axis region of the gear's extent, so that the teeth at the left and right ends of the image (the middle region in distance from the photographing lens) lie near the focal position. In this focus state, because of lens distortion, teeth on the near side of the lens (the near region) appear larger in the image, and their calibration coefficient grows the closer they are to the image centre; teeth on the far side of the lens (the far region) appear smaller, and their calibration coefficient shrinks the closer they are to the image centre. In addition, so that the tooth surface features of the far region can be captured, the lens is set at a certain angle relative to the gear part, i.e. there is a small perspective angle between the lens and the gear. With this overall arrangement, the resulting projection image is shown in fig. 3, where the first and second reference regions correspond to the left and right end regions of the projection image (corresponding to the detection base and kept roughly horizontal), and the region to be detected is selected as the top region of the projection image, covering the entire tooth surface area of the gear part.
Preferably, extracting the first reference area, the second reference area, and the area to be detected in the projection image according to the preset determination area includes:
respectively selecting, at the preset reference surface positions on both sides of the gear part to be detected, regions 100 pixels long and 30 pixels high to generate the first reference region and the second reference region; and selecting a region 50 pixels high at the tooth surface position of the gear part, long enough to cover all tooth surfaces to be detected, to generate the region to be detected.
Preferably, the method further comprises performing a filtering operation on the first reference region, the second reference region and the region to be detected respectively. For example, a filter size of 7×3 is selected; the first and second reference regions are each filtered twice, and the region to be detected is filtered once.
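The filtering step above can be sketched as a plain box filter of the stated 7×3 size. The choice of a mean kernel is an assumption, since the text gives only the kernel dimensions and the number of passes.

```python
def mean_filter(img, kw=7, kh=3):
    """Box-filter `img` (a list of rows) with a kw x kh window, clamping
    the window at the borders. A plain mean kernel is assumed; the text
    states only the 7x3 size and how many passes each region receives."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - kh // 2), min(h, y + kh // 2 + 1))
            xs = range(max(0, x - kw // 2), min(w, x + kw // 2 + 1))
            vals = [img[j][i] for j in ys for i in xs]
            out[y][x] = sum(vals) / len(vals)
    return out

flat = [[5.0] * 10 for _ in range(4)]
smoothed = mean_filter(flat)  # a constant image is unchanged
```

Applying the filter twice to the reference regions, as described, simply means calling `mean_filter` on its own output.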
S2, performing segmentation recognition operation on the region to be detected by using a preset segmentation neural network to obtain a far region, a middle region and a near region which are far to near relative to the shooting lens.
Preferably, the segmentation neural network can be obtained by labelling a data set of regions to be detected of gear parts into the three classes far, middle and near, and training the network on it.
S3, performing tooth surface recognition operation on the far-end region, the middle-end region and the near-end region by using a preset recognition neural network, and generating a far-end tooth set, a middle-end tooth set and a near-end tooth set which respectively comprise a plurality of tooth surfaces to be detected.
Preferably, performing the tooth surface recognition operation includes: judging whether the edge length at the tooth surface position is at least 50 pixels, and discarding the tooth surface position when it is less than 50 pixels; when the edge length is at least 50 pixels, further judging whether the slope of the fitted tooth surface line is less than 0.02, and discarding the tooth surface position when it is not; when the slope is less than 0.02, keeping the tooth surface position as a tooth surface to be detected.
Preferably, line detection samples the edge every 3 pixels, computes the sub-pixel edge position, and fits a straight line.
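The "sample every 3 pixels, then fit a straight line" convention can be sketched with an ordinary least-squares fit; the sub-pixel edge localisation itself is assumed to happen upstream and is not shown.

```python
def fit_edge_line(edge_points, step=3):
    """Fit y = m*x + b by least squares to every `step`-th edge point,
    mirroring the described 'sample every 3 pixels, then fit' scheme.
    `edge_points` are (x, y) pairs from sub-pixel edge extraction,
    which is assumed to be done upstream."""
    pts = edge_points[::step]
    n = len(pts)
    sx = sum(x for x, _ in pts)
    sy = sum(y for _, y in pts)
    sxx = sum(x * x for x, _ in pts)
    sxy = sum(x * y for x, y in pts)
    m = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - m * sx) / n
    return m, b

# Points lying on y = 0.01*x + 2 are recovered (slope ~0.01, intercept ~2.0),
# and a slope below the 0.02 threshold would keep this tooth surface.
pts = [(x, 0.01 * x + 2.0) for x in range(0, 60)]
m, b = fit_edge_line(pts)
```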
Preferably, after the tooth surface recognition operation is performed, it is judged whether the total number of tooth surfaces to be detected contained in the far tooth set, the middle tooth set and the near tooth set equals the designed number of tooth surfaces of the gear part to be detected; when equal, the generated far, middle and near tooth sets are accepted; when unequal, they are rejected and step S2 is executed again.
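The recognition criteria (edge length of at least 50 pixels, fitted slope below 0.02) together with the tooth-count check can be sketched as a single filter. The dictionary layout of the candidates is a hypothetical representation, and testing the slope magnitude is an assumption.

```python
def accept_tooth_surfaces(candidates, designed_count,
                          min_len=50, max_slope=0.02):
    """Keep candidate tooth-surface positions whose edge length is at
    least `min_len` pixels and whose fitted slope magnitude is below
    `max_slope` (the stated criteria), then verify that the total
    equals the gear's designed tooth count. The candidate dictionaries
    are a hypothetical data layout."""
    kept = [c for c in candidates
            if c["length"] >= min_len and abs(c["slope"]) < max_slope]
    return kept, len(kept) == designed_count

cands = [{"length": 60, "slope": 0.01},   # kept
         {"length": 40, "slope": 0.00},   # too short
         {"length": 80, "slope": 0.05}]   # too steep
kept, ok = accept_tooth_surfaces(cands, designed_count=1)
```

When `ok` is false, the method repeats from step S2 with a fresh segmentation.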
S4, establishing a datum line by using the first datum region and the second datum region.
Specifically, this comprises the following sub-steps: S41, calculating the grey-level histograms of the plane edges of the first reference region and the second reference region respectively, and sampling the edge contour at the preset sampling precision to obtain a first sampling point set and a second sampling point set; S42, taking the positions of the first-derivative extrema of the first and second sampling point sets as intersection points to obtain a first intersection point set and a second intersection point set; S43, fitting a first plane edge line and a second plane edge line from the first and second intersection point sets respectively; S44, calculating the datum line from the first plane edge line and the second plane edge line using formula (3);
wherein (x_1, y_1) is the first reference point of the datum line, (x_2, y_2) is the second reference point of the datum line, (x_L1, y_L1) and (x_L2, y_L2) are the first and second end points of the first plane edge line, and (x_R1, y_R1) and (x_R2, y_R2) are the first and second end points of the second plane edge line.
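Formula (3) itself is not reproduced in this text, so the construction below is only an assumed stand-in: it takes the datum line through the midpoints of the two fitted plane-edge segments, using the same endpoint naming as the description.

```python
def datum_line(left_seg, right_seg):
    """Construct a datum line from the two fitted plane-edge segments.
    The patent's formula (3) is not reproduced in the text, so as an
    assumed stand-in the datum line passes through the midpoints of the
    first (left) and second (right) edge segments. Each segment is a
    pair of (x, y) end points, as in the description's notation."""
    (xl1, yl1), (xl2, yl2) = left_seg
    (xr1, yr1), (xr2, yr2) = right_seg
    p1 = ((xl1 + xl2) / 2, (yl1 + yl2) / 2)   # (x_1, y_1)
    p2 = ((xr1 + xr2) / 2, (yr1 + yr2) / 2)   # (x_2, y_2)
    return p1, p2

base = datum_line(((0.0, 10.0), (100.0, 10.0)),
                  ((900.0, 12.0), (1000.0, 12.0)))
```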
S5, matching the tooth surfaces to be detected in the far tooth set, the middle tooth set and the near tooth set against the datum line respectively to obtain a far image thickness set, a middle image thickness set and a near image thickness set.
S6, obtaining the far calibration coefficient k_f, the middle calibration coefficient k_m and the near calibration coefficient k_n corresponding to the far region, the middle region and the near region respectively.
S7, correcting the far calibration coefficient k_f and the near calibration coefficient k_n according to the abscissa values of the tooth surfaces to be detected in the far tooth set and the near tooth set relative to the datum line, generating a far correction coefficient k_ft set and a near correction coefficient k_nt set corresponding to each tooth surface to be detected.
Since the projection image is focused on the central axis region of the gear's extent, the far calibration coefficient k_f should be greater than the near calibration coefficient k_n, while the middle calibration coefficient k_m is approximately the arithmetic mean of k_f and k_n.
Specifically, this comprises the following sub-steps: S71, calculating the difference between the far calibration coefficient k_f and the near calibration coefficient k_n to generate a coefficient difference k_d; S72, calculating the mean of the abscissa values, relative to the datum line, of the center points of the two tooth surfaces to be detected that are farthest from each other, to generate a reference midpoint coordinate value x_middle, and calculating the distance between those two abscissa values to generate a reference distance dis_x; S73, according to the abscissa value x_tooth, relative to the datum line, of the center point of each tooth surface to be detected in the far tooth set and the near tooth set, calculating the correction value a corresponding to each tooth surface to be detected using Equation 1;
S74, using the correction value a and Equation 2, correspondingly calculating the far correction coefficient k_ft and the near correction coefficient k_nt for each tooth surface to be detected;
Meanwhile, for the tooth surfaces to be detected in the middle tooth set, the middle calibration coefficient k_m needs no additional correction.
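Equations 1 and 2 are likewise not reproduced in the text above. One plausible reading, consistent with sub-steps S71-S74 and with the stated relation that k_m is approximately the mean of k_f and k_n, is a linear interpolation across the gear width: a = (x_tooth - x_middle) / dis_x and k_t = (k_f + k_n)/2 + a * k_d. This form is an assumption for illustration, not the patented formula:

```python
def correction_coefficient(x_tooth, x_middle, dis_x, k_f, k_n):
    """Per-tooth coefficient via linear interpolation (assumed forms of
    Equations 1 and 2). At the farthest tooth (a = +0.5) this returns k_f,
    at the nearest (a = -0.5) it returns k_n, and at the midpoint it
    returns their mean, matching the stated behavior of k_m."""
    k_d = k_f - k_n                       # S71: coefficient difference
    a = (x_tooth - x_middle) / dis_x      # S73: normalized offset (assumed Eq. 1)
    return (k_f + k_n) / 2.0 + a * k_d    # S74: corrected coefficient (assumed Eq. 2)
```

Under this reading, a tooth at the reference midpoint needs no correction beyond the mean coefficient, which is why the middle tooth set can use k_m directly.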
S8, calculating a far tooth surface thickness measurement set by matching the far image thickness set with the far correction coefficient k_ft set, calculating a middle tooth surface thickness measurement set by matching the middle image thickness set with the middle calibration coefficient k_m, and calculating a near tooth surface thickness measurement set by matching the near image thickness set with the near correction coefficient k_nt set.
S9, matching the far tooth surface thickness measurement set, the middle tooth surface thickness measurement set and the near tooth surface thickness measurement set against the standard value of the gear piece to be detected, and judging whether the gear piece to be detected is qualified.
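Steps S8 and S9 amount to scaling each per-tooth image thickness (in pixels) by its coefficient and comparing the result against the standard value. A minimal sketch, in which the tolerance parameter is illustrative (the patent does not state one):

```python
def measure_and_judge(pixel_thicknesses, coeffs, standard_mm, tol_mm):
    """S8/S9 sketch: convert each per-tooth image thickness to a physical
    measurement via its calibration/correction coefficient (S8), then pass
    the gear only if every tooth is within tol_mm of the standard (S9)."""
    measured = [p * k for p, k in zip(pixel_thicknesses, coeffs)]
    qualified = all(abs(m - standard_mm) <= tol_mm for m in measured)
    return measured, qualified
```

In use, the far, middle and near sets would each be passed through with their own coefficient sets, and the gear is accepted only if all three groups pass.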
By applying this method to gear piece detection, all measurements can be completed with a single photograph: with a camera trigger-and-exposure time of about 2 ms and an algorithm running time of about 30 ms, the entire detection process for one gear piece is completed in about 32 ms for the current gear product. By contrast, a method that images each tooth individually adds a gear-rotation step that turns every tooth to a fixed position. For example, for a gear product with 18 teeth in total, the detection time is about (gear rotation time 1000 ms + photographing time 2 ms + detection time 10 ms) × 18 = 18216 ms, so the above method is hundreds of times faster than the prior-art approach of individually focusing and imaging each tooth. Meanwhile, compared with a single uncorrected image, each tooth's position relative to the lens is individually calibrated and corrected, making the measurement more accurate: for the specific gear piece application concerned, the detection repeatability can be improved from 0.1 mm to 0.04 mm. Compared with prior-art calibration of parts at different depths of field, the method solves the technical problem that the calibration positions of the parts are not fixed, and is particularly suitable for parts such as gear pieces that have multiple non-fixed detection positions.
In another aspect, the present invention further relates to a gear piece image detection system based on tooth distance segmentation compensation, the structure of which is shown in fig. 2, comprising:
the image acquisition module is used for respectively extracting a first reference area, a second reference area and an area to be detected from the projection image according to a preset judging area;
the image segmentation module is used for performing a segmentation recognition operation on the region to be detected by using a preset segmentation neural network to obtain a far region, a middle region and a near region ordered from far to near relative to the shooting lens;
the tooth surface recognition module is used for respectively performing a tooth surface recognition operation on the far region, the middle region and the near region by using a preset recognition neural network to generate a far tooth set, a middle tooth set and a near tooth set each comprising a plurality of tooth surfaces to be detected;
the datum line module is used for establishing a datum line by using the first datum region and the second datum region;
the thickness recognition module is used for matching the tooth surfaces to be detected in the far tooth set, the middle tooth set and the near tooth set against the datum line, respectively, to obtain a far image thickness set, a middle image thickness set and a near image thickness set;
the coefficient correction module is used for correcting the far calibration coefficient k_f and the near calibration coefficient k_n according to the abscissa values, relative to the datum line, of the tooth surfaces to be detected in the far tooth set and the near tooth set, generating a far correction coefficient k_ft set and a near correction coefficient k_nt set corresponding to each tooth surface to be detected;
the thickness calculation module is used for calculating a far tooth surface thickness measurement set by matching the far image thickness set with the far correction coefficient k_ft set, calculating a middle tooth surface thickness measurement set by matching the middle image thickness set with the middle calibration coefficient k_m, and calculating a near tooth surface thickness measurement set by matching the near image thickness set with the near correction coefficient k_nt set;
and the image detection module is used for judging whether the gear piece to be detected is qualified by matching the far tooth surface thickness measurement set, the middle tooth surface thickness measurement set and the near tooth surface thickness measurement set against the standard value of the gear piece to be detected.
By using this system, the above-described processing method can be executed and the corresponding technical effects achieved.
The foregoing is only a preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions easily contemplated by those skilled in the art within the scope of the present invention should be included in the scope of the present invention. Therefore, the protection scope of the present invention should be subject to the protection scope of the claims.
Claims (9)
1. The gear piece image detection method based on tooth distance segmentation compensation is characterized by comprising the following steps of:
s1, acquiring a projection image of a gear piece to be detected, and respectively extracting a first reference area, a second reference area and the area to be detected from the projection image according to a preset judging area;
s2, performing a segmentation recognition operation on the region to be detected by using a preset segmentation neural network to obtain a far region, a middle region and a near region ordered from far to near relative to the shooting lens;
s3, respectively performing a tooth surface recognition operation on the far region, the middle region and the near region by using a preset recognition neural network, generating a far tooth set, a middle tooth set and a near tooth set each comprising a plurality of tooth surfaces to be detected;
s4, establishing a datum line by using the first datum region and the second datum region;
s5, matching the tooth surfaces to be detected in the far tooth set, the middle tooth set and the near tooth set against the datum line, respectively, to obtain a far image thickness set, a middle image thickness set and a near image thickness set;
s6, obtaining the far calibration coefficient k_f, middle calibration coefficient k_m and near calibration coefficient k_n corresponding to the far region, the middle region and the near region, respectively;
s7, correcting the far calibration coefficient k_f and the near calibration coefficient k_n according to the abscissa values, relative to the datum line, of the tooth surfaces to be detected in the far tooth set and the near tooth set, generating a far correction coefficient k_ft set and a near correction coefficient k_nt set corresponding to each tooth surface to be detected;
s8, calculating a far tooth surface thickness measurement set by matching the far image thickness set with the far correction coefficient k_ft set, calculating a middle tooth surface thickness measurement set by matching the middle image thickness set with the middle calibration coefficient k_m, and calculating a near tooth surface thickness measurement set by matching the near image thickness set with the near correction coefficient k_nt set;
and s9, matching the far tooth surface thickness measurement set, the middle tooth surface thickness measurement set and the near tooth surface thickness measurement set against the standard value of the gear piece to be detected, and judging whether the gear piece to be detected is qualified.
2. The method of claim 1, wherein the step S3 further comprises:
judging whether the total number of tooth surfaces to be detected contained in the far tooth set, the middle tooth set and the near tooth set is equal to the designed number of tooth surfaces of the gear piece to be detected;
when judged to be equal, accepting the generated far tooth set, middle tooth set and near tooth set;
and when judged to be unequal, rejecting the generated far tooth set, middle tooth set and near tooth set, and repeatedly executing step S2.
3. The method according to claim 1, wherein said step S7 comprises performing the sub-steps of:
s71, calculating the difference between the far calibration coefficient k_f and the near calibration coefficient k_n to generate a coefficient difference k_d;
s72, calculating the mean of the abscissa values, relative to the datum line, of the center points of the two tooth surfaces to be detected that are farthest from each other, to generate a reference midpoint coordinate value x_middle, and calculating the distance between those two abscissa values to generate a reference distance dis_x;
s73, according to the abscissa value x_tooth, relative to the datum line, of the center point of each tooth surface to be detected in the far tooth set and the near tooth set, calculating the correction value a corresponding to each tooth surface to be detected using Equation 1;
s74, using the correction value a and Equation 2, correspondingly calculating the far correction coefficient k_ft and the near correction coefficient k_nt for each tooth surface to be detected.
4. The method of claim 1, wherein the projection image of the gear piece to be detected is focused on the central-axis plane of the gear piece to be detected.
5. The method of claim 1, wherein extracting the first reference region, the second reference region, and the region to be detected in the projection image according to the preset determination region includes:
respectively selecting areas 100 pixels long and 30 pixels high at the preset reference surface positions on both sides of the gear piece to be detected to generate the first reference area and the second reference area;
and selecting an area 50 pixels high at the tooth surface position of the gear piece to be detected, long enough to cover all the tooth surface areas to be detected, to generate the area to be detected.
6. The method of claim 5, wherein the step S1 further comprises:
and respectively executing filtering operation on the first reference area, the second reference area and the area to be detected.
7. The method of claim 1, wherein the performing a tooth surface recognition operation comprises:
judging whether the edge length of the tooth surface position is not less than 50 pixels, and discarding the tooth surface position when the edge length of the tooth surface position is less than 50 pixels;
when judging that the edge length of the tooth surface position is not less than 50 pixels, further judging whether the slope of the tooth surface position straight line is less than 0.02, and when judging that the slope of the tooth surface position straight line is not less than 0.02, discarding the tooth surface position;
when judging that the slope of the tooth surface position straight line is smaller than 0.02, the tooth surface position is reserved as the tooth surface to be detected.
8. The method according to claim 1, wherein the step S4 includes:
s41, respectively calculating gray histograms of plane edges of a first reference area and a second reference area, and sampling the edge contour according to preset sampling precision to obtain a first sampling point set and a second sampling point set;
s42, taking the first-derivative extremum positions of the first sampling point set and the second sampling point set as intersection points to obtain a first intersection point set and a second intersection point set;
s43, fitting and generating a first plane edge straight line and a second plane edge straight line by using the first intersection point set and the second intersection point set respectively;
s44, calculating the datum line from the first plane-edge straight line and the second plane-edge straight line using Equation 3;
wherein x_1, y_1 are the first reference point of the datum line, x_2, y_2 are the second reference point of the datum line, x_L1, y_L1 are the first end point of the first plane-edge straight line, x_L2, y_L2 are the second end point of the first plane-edge straight line, x_R1, y_R1 are the first end point of the second plane-edge straight line, and x_R2, y_R2 are the second end point of the second plane-edge straight line.
9. A gear piece image detection system based on tooth distance segmentation compensation, comprising:
the image acquisition module is used for respectively extracting a first reference area, a second reference area and an area to be detected from the projection image according to a preset judging area;
the image segmentation module is used for performing a segmentation recognition operation on the region to be detected by using a preset segmentation neural network to obtain a far region, a middle region and a near region ordered from far to near relative to the shooting lens;
the tooth surface recognition module is used for respectively performing a tooth surface recognition operation on the far region, the middle region and the near region by using a preset recognition neural network to generate a far tooth set, a middle tooth set and a near tooth set each comprising a plurality of tooth surfaces to be detected;
the datum line module is used for establishing a datum line by using the first datum region and the second datum region;
the thickness recognition module is used for matching the tooth surfaces to be detected in the far tooth set, the middle tooth set and the near tooth set against the datum line, respectively, to obtain a far image thickness set, a middle image thickness set and a near image thickness set;
the coefficient correction module is used for correcting the far calibration coefficient k_f and the near calibration coefficient k_n according to the abscissa values, relative to the datum line, of the tooth surfaces to be detected in the far tooth set and the near tooth set, generating a far correction coefficient k_ft set and a near correction coefficient k_nt set corresponding to each tooth surface to be detected;
the thickness calculation module is used for calculating a far tooth surface thickness measurement set by matching the far image thickness set with the far correction coefficient k_ft set, calculating a middle tooth surface thickness measurement set by matching the middle image thickness set with the middle calibration coefficient k_m, and calculating a near tooth surface thickness measurement set by matching the near image thickness set with the near correction coefficient k_nt set;
and the image detection module is used for judging whether the gear piece to be detected is qualified by matching the far tooth surface thickness measurement set, the middle tooth surface thickness measurement set and the near tooth surface thickness measurement set against the standard value of the gear piece to be detected.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311373451.5A CN117522784B (en) | 2023-10-23 | Gear part image detection method and system based on tooth distance segmentation compensation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN117522784A true CN117522784A (en) | 2024-02-06 |
CN117522784B CN117522784B (en) | 2024-07-16 |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103292701A (en) * | 2013-06-24 | 2013-09-11 | 哈尔滨工业大学 | Machine-vision-based online dimensional measurement method of precise instrument |
CN109829897A (en) * | 2019-01-17 | 2019-05-31 | 北京理工大学 | A kind of gear burr detection method and gear high-precision vision measuring system |
WO2021000524A1 (en) * | 2019-07-03 | 2021-01-07 | 研祥智能科技股份有限公司 | Hole protection cap detection method and apparatus, computer device and storage medium |
JP2021135300A (en) * | 2020-02-27 | 2021-09-13 | 由田新技股▲ふん▼有限公司 | Substrate measuring system and substrate measuring method |
CN114383505A (en) * | 2022-01-06 | 2022-04-22 | 江苏大学 | Automatic detection device for dimension of short shaft type part |
CN114758125A (en) * | 2022-03-31 | 2022-07-15 | 江苏庆慈机械制造有限公司 | Gear surface defect detection method and system based on deep learning |
CN115127479A (en) * | 2022-09-02 | 2022-09-30 | 西安西动智能科技有限公司 | Machine vision-based rubber roller thread online detection and correction method |
CN116758063A (en) * | 2023-08-11 | 2023-09-15 | 南京航空航天大学 | Workpiece size detection method based on image semantic segmentation |
Non-Patent Citations (2)
Title |
---|
SUN ZHAO et al.: "Analysis, Control and Compensation of Perspective Projection Error in Machine Vision Measurement", Computer Engineering and Applications, vol. 54, no. 2, 13 March 2017 (2017-03-13), pages 266 - 270 *
SHI ZHAOYAO et al.: "Research Progress of Gear Vision Measurement Instruments and Technology", Laser & Optoelectronics Progress, vol. 59, no. 14, 18 July 2022 (2022-07-18), pages 74 - 86 *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4440341B2 (en) | Calibration method, calibration apparatus, and calibration system including the apparatus | |
CN111080582B (en) | Method for detecting defects of inner and outer surfaces of workpiece | |
CN109489566B (en) | Lithium battery diaphragm material slitting width detection method, detection system and device | |
CN107358628B (en) | Linear array image processing method based on target | |
CN107392849B (en) | Target identification and positioning method based on image subdivision | |
CN111383194B (en) | Polar coordinate-based camera distortion image correction method | |
CN108716890A (en) | A kind of high-precision size detecting method based on machine vision | |
CN106996748A (en) | Wheel diameter measuring method based on binocular vision | |
CN113689401A (en) | Method and device for detecting diameter of crystal bar of czochralski silicon single crystal furnace | |
CN113450418A (en) | Improved method, device and system for underwater calibration based on complex distortion model | |
CN113222955A (en) | Gear size parameter automatic measurement method based on machine vision | |
CN113610929B (en) | Combined calibration method of camera and multi-line laser | |
CN113607058B (en) | Straight blade size detection method and system based on machine vision | |
CN112634375B (en) | Plane calibration and three-dimensional reconstruction method in AI intelligent detection | |
CN111815580B (en) | Image edge recognition method and small module gear module detection method | |
CN117522784B (en) | Gear part image detection method and system based on tooth distance segmentation compensation | |
CN113310426A (en) | Thread parameter measuring method and system based on three-dimensional profile | |
CN116880353A (en) | Machine tool setting method based on two-point gap | |
CN116596987A (en) | Workpiece three-dimensional size high-precision measurement method based on binocular vision | |
CN105423942B (en) | The modification method of biprism defect error in BSL 3D DIC systems | |
CN105092603B (en) | The online vision inspection apparatus and method of bowl-type workpiece inner wall | |
CN117522784A (en) | Gear part image detection method and system based on tooth distance segmentation compensation | |
CN111256612A (en) | Machine vision-based method for measuring straight tooth involute small-modulus gear | |
TW201520669A (en) | Bevel-axial auto-focus microscopic system and method thereof | |
CN116309861A (en) | Binocular ranging method with parallax map three-layer filling |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant |