CN116237266A - Flange size measuring method and device - Google Patents

Flange size measuring method and device

Info

Publication number
CN116237266A
CN116237266A (application CN202310230177.XA)
Authority
CN
China
Prior art keywords
workpiece
template
information
image
circle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310230177.XA
Other languages
Chinese (zh)
Inventor
王雷 (Wang Lei)
乔贺 (Qiao He)
金玉 (Jin Yu)
林琦智 (Lin Qizhi)
林咏 (Lin Yong)
林伟民 (Lin Weimin)
梁嘉颖 (Liang Jiaying)
李建宏 (Li Jianhong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Shuanghong Technology Co ltd
Original Assignee
Zhejiang Shuanghong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Shuanghong Technology Co., Ltd.
Priority to CN202310230177.XA
Publication of CN116237266A
Legal status: Pending


Classifications

    • B07C 5/02: Measures preceding sorting, e.g. arranging articles in a stream, orientating
    • B07C 5/04: Sorting according to size
    • B07C 5/10: Sorting according to size measured by light-responsive means
    • B07C 5/34: Sorting according to other particular properties
    • B07C 5/36: Sorting apparatus characterised by the means used for distribution
    • B07C 5/361: Processing or control devices therefor, e.g. escort memory
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0004: Industrial image inspection
    • G06T 7/60: Analysis of geometric attributes
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/74: Feature-based methods involving reference images or patches
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30164: Workpiece; machine component
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a flange size measurement method comprising the following steps: S1, a telecentric light source below the object stage illuminates the back surface of the workpiece, and a camera above the stage acquires a projection image of the workpiece; S2, an analysis processing unit calls the corresponding template to match the workpiece and corrects the position of the current workpiece's projection image according to the template's position information; S3, after position correction, the analysis processing unit screens the edge contour point set of each circle out of all inner contour point sets of the workpiece. The invention replaces manual labor with a machine: an automated, non-contact optical inspection system measures the flange dimensions and rejects out-of-tolerance parts. Optical image inspection replaces the human eye, eliminating eyesight requirements, guaranteeing consistency, and avoiding the personal injury and reduced efficiency caused by long hours of manual labor.

Description

Flange size measuring method and device
Technical Field
The invention belongs to the field of optical measurement systems, and relates to a flange size measurement method and equipment thereof.
Background
A flange used in automobile parts is manufactured through processes such as blank sintering, compression molding, and turning. Dimensional deviations may arise during machining, preventing the flange from sealing properly; oil leakage may then occur after installation, and safety hazards may even arise in subsequent vehicle use. Dimensional inspection is therefore required.
At present, automobile flange manufacturers commonly measure the outer-contour diameter, hole spacing, and length and width of the flange at final inspection using vernier calipers, go/no-go gauges, and fixtures, rejecting unqualified products to guarantee product quality.
This manual inspection has the following defects: (1) high eyesight requirements; (2) high labor intensity and significant eye strain; (3) high subjectivity, so manual judgment cannot guarantee flange quality; (4) low efficiency, since continuous working time must be limited, which reduces production throughput; (5) rising labor costs that put great pressure on the enterprise.
Disclosure of Invention
The invention provides a flange size measuring method and equipment to overcome the defects of the prior art.
To achieve the above purpose, the invention adopts the following technical scheme: a flange dimension measurement method comprising the steps of:
S1, illuminating the back surface of the workpiece with a telecentric light source from below the object stage, and acquiring a projection image of the workpiece with a camera above the stage;
S2, the analysis processing unit calls the corresponding template to match the workpiece and corrects the position of the current workpiece's projection image according to the template's position information;
S3, after position correction, the analysis processing unit screens the edge contour point set of each circle out of all inner contour point sets of the workpiece;
S3.1, performing circle fitting with each circle's edge contour point set to obtain circle-center information;
S3.2, obtaining the center-to-center distances between the circles from the circle-center information;
S4, after position correction, the analysis processing unit screens and extracts the edge contour point set of each long side of the workpiece from the workpiece's contour point sets;
S4.1, fitting the long sides with the obtained edge contour point sets to obtain several straight lines;
S4.2, fitting a polygon from the straight lines and obtaining the polygon's diagonals;
S4.3, searching along each diagonal's angle and position for the two points where it intersects the workpiece's edge point set, the distance between these two points giving the workpiece's length or width;
S5, the analysis processing unit outputs all measured size information.
Further, the template matching in step S2 includes the steps of:
S2.1, binarizing the gray value of each pixel in the gray image under test;
S2.2, obtaining a gray-gradient image with the Sobel edge detection algorithm;
S2.3, running contour finding on the gray-gradient image to obtain the workpiece's outer and inner contour point sets;
S2.4, obtaining the workpiece's ROI region information, i.e., the minimum-circumscribed-rectangle information of the workpiece's outer contour.
Further, the method also comprises comparing the workpiece's minimum-circumscribed-rectangle information with that of the template to judge the degree of match between template and workpiece.
Further, in step S2.1, the gray-value binarization is: pixels with gray value greater than 100 are set to 1, and all others to 0.
Further, in step S2.4, the minimum circumscribed rectangle information includes: the center point and the rotation angle of the smallest bounding rectangle.
Further, in step S2, the position correction comprises: establishing a position-offset reference from the center point and angle of the matching template, then applying a coordinate rotation offset to the ROI region so that the ROI region follows changes in image angle and pixel position.
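As an illustrative sketch of such a coordinate rotation offset (function and parameter names are hypothetical, not from the patent; poses are assumed to be given as a center point plus an angle in degrees):

```python
import numpy as np

def correct_roi(points, tmpl_center, tmpl_angle, cur_center, cur_angle):
    """Rotate and translate ROI points defined relative to the template pose
    so that they follow the current workpiece pose (the position-offset
    reference described in step S2)."""
    dtheta = np.deg2rad(cur_angle - tmpl_angle)
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, -s], [s, c]])  # 2D rotation by the pose difference
    pts = np.asarray(points, dtype=float) - np.asarray(tmpl_center, dtype=float)
    return pts @ R.T + np.asarray(cur_center, dtype=float)
```

For example, a 90-degree rotation about an unchanged center maps the ROI point (1, 0) to (0, 1), and a pure translation shifts every ROI point by the same offset.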
Further, in step S3.1, the circle fitting combines the RANSAC algorithm with the principle of least squares.
Further, in step S4.1, the long-side fitting uses the same RANSAC-plus-least-squares approach as the circle fitting, here applied to straight lines rather than circles.
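One common way to combine RANSAC with a least-squares circle fit is sketched below, using the Kåsa algebraic fit; this is an illustrative reconstruction under stated assumptions, not the patent's actual implementation:

```python
import numpy as np

def fit_circle_lsq(pts):
    """Kåsa algebraic least-squares circle fit: a circle
    (x-cx)^2 + (y-cy)^2 = r^2 is linear in (cx, cy, r^2 - cx^2 - cy^2)."""
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(pts))])
    b = x**2 + y**2
    cx, cy, c = np.linalg.lstsq(A, b, rcond=None)[0]
    return cx, cy, np.sqrt(c + cx**2 + cy**2)

def ransac_circle(pts, iters=200, tol=1.0, rng=None):
    """RANSAC loop: sample 3 points, fit a candidate circle, count inliers
    within tol of the candidate, then refit on the best inlier set."""
    rng = np.random.default_rng(rng)
    best_mask, best_count = None, 0
    for _ in range(iters):
        sample = pts[rng.choice(len(pts), size=3, replace=False)]
        cx, cy, r = fit_circle_lsq(sample)
        if not np.isfinite(r):
            continue  # degenerate (e.g. collinear) sample
        dist = np.abs(np.hypot(pts[:, 0] - cx, pts[:, 1] - cy) - r)
        mask = dist < tol
        if mask.sum() > best_count:
            best_count, best_mask = int(mask.sum()), mask
    if best_mask is None:
        best_mask = np.ones(len(pts), dtype=bool)
    return fit_circle_lsq(pts[best_mask])
```

The RANSAC stage makes the fit robust to stray contour points (burrs, dust), while the final least-squares refit over the inliers recovers sub-pixel center and radius estimates.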
A flange size measurement device using the above flange size measurement method comprises an object stage, a telecentric light source, a camera, and an analysis processing unit. The object stage holds the workpiece; the telecentric light source and the camera are arranged on opposite sides of the workpiece, and when the telecentric light source is on, the camera acquires a gray image of the workpiece. The analysis processing unit stores template information, receives the gray image, and calls the corresponding template for matching and position correction.
In summary, the invention has the following advantages:
1) The invention replaces manual labor with a machine: an automated, non-contact optical inspection system measures the flange dimensions and rejects out-of-tolerance parts. Optical image inspection replaces the human eye, eliminating eyesight requirements, guaranteeing consistency, and avoiding the personal injury and reduced efficiency caused by long hours of manual labor.
2) The invention obtains an image containing the workpiece's inner and outer contour edges by projection and determines the detection region by template matching, which improves detection precision. The holes and edges of the workpiece are then fitted precisely from edge contour point sets to obtain the workpiece's geometric features and thus its dimensional information. Measurement accuracy is high and consistency is strong, greatly improving both quality and efficiency over manual inspection.
Drawings
Fig. 1 is a schematic view of a flange dimension measuring apparatus of the present invention.
FIG. 2 is a schematic diagram of a defect detection system.
FIG. 3 is a flow chart of a defect detection method.
Fig. 4 is a schematic diagram of a template matching process.
Fig. 5 is a schematic diagram of a dimension measurement flow.
Fig. 6 is a schematic diagram of the dimensional composition of the flange of the present invention.
Reference numerals: 1. detection device; 11. telecentric light source; 12. coaxial light source; 13. horizontal light source; 14. camera; 15. object stage; 16. housing; 17. telecentric lens; 2. gantry manipulator; 3. conveyor belt; 4. vibrating screen tray; 5. photoelectric sensor.
Detailed Description
The following describes embodiments of the present invention with reference to specific examples; other advantages and effects of the invention will be readily apparent to those skilled in the art from this disclosure. The invention may also be practiced or applied in other, different embodiments, and the details of this description may be modified or varied in various ways without departing from the spirit and scope of the invention. It should be noted that the following embodiments and their features may be combined with each other provided there is no conflict.
It should be noted that the illustrations provided with the following embodiments merely explain the basic concept of the invention schematically: the drawings show only components related to the invention rather than the actual number, shape, and size of components in implementation, where the form, quantity, and proportion of each component may vary arbitrarily and the component layout may be more complicated.
All directional indications (up, down, left, right, front, rear, lateral, longitudinal, and the like) in the embodiments are used only to explain the relative position and movement of components in a particular pose; if that pose changes, the directional indication changes accordingly.
As shown in fig. 1, a detection system comprises a vibrating screen tray 4, a conveyor belt 3, a gantry manipulator 2, a photoelectric sensor 5, and a detection device 1. It detects surface defects of the workpiece to be inspected (specifically, a flange) and analyzes, counts, and sorts qualified and unqualified products.
The detection system in this embodiment adopts a machine-replaces-labor approach. The vibrating screen tray 4 is arranged at one end of the conveyor belt 3 and the detection device 1 at the other end; an analysis processing unit inside the detection device 1 is in communication with the conveyor belt 3, the gantry manipulator 2, and the photoelectric sensor 5. When a batch of flanges is to be inspected and sorted, the flanges are spread in the vibrating screen tray 4, which vibrates to drop them onto the conveyor belt 3 one by one, and the belt conveys them toward the detection device 1. The photoelectric sensor 5 is mounted on the conveyor belt 3 near the detection device 1, within the working range of the gantry manipulator 2, with its detection range covering the area above the belt. When a flange moves into this range, the photoelectric sensor 5 pauses the conveyor belt 3 and simultaneously triggers the gantry manipulator 2 to grasp the flange and place it in the detection device 1. After detection is complete, the analysis processing unit sends a classification signal and a grasp signal to the gantry manipulator 2, which places qualified products back on the conveyor belt 3 and unqualified products in the reject bin according to the classification signal.
The gantry manipulator 2 has a controller that governs its motion and handles communication with the analysis processing unit, the photoelectric sensor 5, and other components.
The detection device 1 comprises a housing 16 in which a camera 14, a coaxial light source 12, a horizontal light source 13, and a telecentric light source 11 are fixed in order from top to bottom. The camera 14 points straight down and is arranged coaxially with the coaxial light source 12 and the telecentric light source 11. An object stage 15 sits vertically between the coaxial light source 12 and the telecentric light source 11: the coaxial light source 12 illuminates the stage from above, the telecentric light source 11 from below, and the horizontal light source 13 obliquely from the side. The stage 15 holds the flange, so the coaxial, telecentric, and horizontal light sources respectively illuminate its front, back, and side.
Further, the stage 15 should be made of a light-transmitting (transparent) material, so that at least the light of the telecentric light source 11 can reach the camera 14.
The horizontal light source 13 is formed from four strip light sources combined into a square frame; when the flange is placed on the stage 15, it should sit inside this frame so that the horizontal light source 13 can provide obliquely incident side lighting.
In the detection system of this embodiment, the camera 14, telecentric light source 11, coaxial light source 12, and horizontal light source 13 are all in communication with, and controlled by, the analysis processing unit. To ensure a sharp workpiece image under each light source, the telecentric, coaxial, and horizontal light sources are switched on one at a time, as follows:
When the telecentric light source 11 is on (first illumination condition), the camera 14 captures, from above the workpiece, the image produced by the telecentric back-lighting (the projection image). In this image only the flange region is black and all other regions are white, so the flange edges stand out clearly.
When the coaxial light source 12 is on (second illumination condition), the camera 14 captures an image of the flange's upper surface from above (the first surface image): vertical light from the coaxial light source above the workpiece strikes the upper surface, highlighting defects such as pits, missing material, tool chatter, and light-dark (yin-yang) surfaces.
When the horizontal light source 13 is on (third illumination condition), the camera 14 captures another image of the flange's upper surface from above (the second surface image): side light from the horizontal light source strikes the upper surface obliquely, highlighting defects such as scratches.
From these three kinds of images the flange can be checked for defects, with the analysis processing unit in the detection device 1 performing the analysis on the images acquired by the camera 14. Detection specifically includes through-hole detection, front/back detection, size measurement, and detection of pits, missing material, tool chatter, light-dark surfaces, and scratches.
The flange in this embodiment is specifically one used to fasten an oil cover, manufactured through processes such as blank sintering, press forming, and turning. The following may occur during processing: pits and missing material during blank production; plane scratches from each process step; light-dark surfaces caused by insufficient machining allowance when reworking an unqualified machined plane; tool chatter caused by cutter damage during machining; and dimensional deviation during machining. Qualified and unqualified parts must be distinguished through the defect detection above.
In addition, the flange has distinct front and back sides due to its manufacturing process, which must be told apart. Flanges also divide into through-hole and non-through-hole types: a through-hole flange has a round hole with a smooth inner wall, while a non-through-hole flange has a round hole whose inner wall carries threads, flanges, or similar features (so the hole's size involves both an inner and an outer circle); these must also be distinguished.
To distinguish through-hole from non-through-hole flanges, a template matching process is run on the projection image under the first illumination condition, and the flange type is judged from the matched template. After template matching, the analysis processing unit generates a classification signal according to the flange type so that, after the subsequent detection flow, the gantry manipulator 2 can sort and count the flanges.
A defect detection method executed by the defect detection system comprises the following steps:
S1, turn on the telecentric light source 11; the camera 14 collects a projection image of the workpiece, the template matching process runs on it, and the workpiece is judged to be through-hole or non-through-hole;
S2, turn on the coaxial light source 12; the camera 14 collects a first surface image of the workpiece, and statistical analysis of its gray values determines whether the front or the back of the workpiece is facing up;
S3, turn on the coaxial light source 12; the camera 14 collects a first surface image, and the defect detection flow checks it for four defects: pits, missing material, tool chatter, and light-dark surfaces. Then turn on the horizontal light source 13; the camera 14 collects a second surface image, and the defect detection flow checks it for scratch defects;
S4, turn the workpiece over;
S5, turn on the coaxial light source 12; the camera 14 collects a first surface image, and the defect detection flow checks it for pits, missing material, tool chatter, and light-dark surfaces. Then turn on the horizontal light source 13; the camera 14 collects a second surface image, and the defect detection flow checks it for scratch defects;
S6, turn on the telecentric light source 11; the camera 14 collects a projection image of the workpiece, and the size detection flow judges whether the workpiece's dimensions are compliant;
S7, turn on the coaxial light source 12; the camera 14 collects a first surface image, from which the outer-circle diameter of the non-through hole is measured.
Step S6 may be performed before or after any of steps S2, S3, S4, S5.
In continuous processing after steps S1 and S2: if the workpiece is judged to be a non-through-hole workpiece currently lying front side up, step S7 is performed after step S4 (once the back faces up); if it is a non-through-hole workpiece lying back side up, step S7 is performed between step S2 and step S4.
The template matching process in step S1 includes the following steps:
S1.1, binarize each pixel of the gray image under test: pixels with gray value greater than 100 are set to 1, and all others to 0;
S1.2, obtain a gray-gradient image with the Sobel edge detection algorithm;
S1.3, run contour finding on the gray-gradient image to obtain the workpiece's outer and inner contour point sets;
S1.4, obtain the minimum circumscribed rectangle of the workpiece's outer contour, including its center point, rotation angle, length, and width, and obtain the minimum enclosing circle area of the largest inner contour;
S1.5, traverse each prepared template model, comparing the template's minimum-circumscribed-rectangle area and aspect ratio with the workpiece's. If either the area ratio or the aspect ratio falls outside the range 98% to 102%, the match fails and traversal continues with the next template. If both ratios lie within 98% to 102%, further compare the minimum enclosing circle area of the largest inner contour of the template and the workpiece: if this ratio lies within 98% to 102% the match succeeds; otherwise it fails and traversal continues with the next template;
S1.6, if a template matches, traversal ends and the template matching information is output; if every template has been traversed without a match, the interface reports that template matching failed.
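The traversal of steps S1.5 and S1.6 can be sketched as follows (the dictionary keys and function names are illustrative; the 98%-102% windows come from the text above):

```python
def ratio_in_window(a, b, lo=0.98, hi=1.02):
    """True when the ratio a/b lies inside the 98%-102% acceptance window."""
    return lo <= a / b <= hi

def match_template(work, templates):
    """Traverse the template models: first compare bounding-rect area and
    aspect ratio, then the minimum enclosing circle area of the largest
    inner contour. Returns the matched template's name, or None."""
    for name, t in templates.items():
        if not (ratio_in_window(work["rect_area"], t["rect_area"])
                and ratio_in_window(work["aspect"], t["aspect"])):
            continue  # coarse check failed, try the next template
        if ratio_in_window(work["inner_circle_area"], t["inner_circle_area"]):
            return name  # match succeeded, stop traversing
    return None  # traversed every template without a match
```

The coarse rectangle check cheaply rejects most templates, and the inner-circle check then separates geometries (such as through-hole versus non-through-hole flanges) whose outer envelopes are identical.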
The contour-finding algorithm is implemented by calling the contour-finding function of the open-source library OpenCV. The principle, for a binary image with a black background and a white target, is as follows: if a white point is found whose 8-neighborhood (or 4-neighborhood) is also entirely white, the point is an interior point of the target and is set to black, visually hollowing out the interior; otherwise the point stays white and is a boundary point (contour point) of the target. Traversing the whole image in this way yields each contour (contour point set), stored as point vectors, together with the image's topology information: for each contour, the index numbers of its next contour, previous contour, parent contour, and embedded (child) contour.
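The interior-point rule described above (a white pixel whose 8-neighborhood is entirely white is "hollowed out") can be reproduced directly in a few lines. OpenCV's `findContours` does this far more efficiently and also recovers the contour topology, so this is only a didactic sketch:

```python
import numpy as np

def boundary_points(binary):
    """binary: 2D array of 0 (black background) / 1 (white target).
    A white pixel whose 8-neighborhood is entirely white is an interior
    point and is cleared; white pixels on the object edge remain set."""
    img = np.pad(binary, 1)  # zero border so edge pixels see black neighbors
    out = np.zeros_like(binary)
    H, W = binary.shape
    for y in range(H):
        for x in range(W):
            # img[y:y+3, x:x+3] is the 3x3 window centered on (y, x)
            if binary[y, x] and img[y:y+3, x:x+3].sum() < 9:
                out[y, x] = 1
    return out
```

Applied to a solid white block, only its one-pixel border survives, exactly the "hollowed-out" effect the text describes.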
Step S1.1 turns the detection-area image into a black-and-white image after gray-level binarization, where gray value 0 represents black and gray value 255 represents white; Sobel edge detection is then applied to the binarized detection-area image.
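As an illustrative sketch (the helper name is hypothetical, not from the patent), the thresholding rule of step S1.1 in NumPy:

```python
import numpy as np

def binarize(gray, thresh=100):
    """Gray values above the threshold become 1 (white), the rest 0 (black),
    per the rule stated in step S1.1."""
    return (np.asarray(gray) > thresh).astype(np.uint8)
```

Note the comparison is strictly greater-than, so a pixel at exactly gray value 100 maps to 0.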
The Sobel edge detection process is as follows:
S1.2.1, let matrix A denote the pixel-value matrix of the detection-area image;
S1.2.2, take the derivative in the horizontal direction (x) and in the vertical direction (y) respectively;
Horizontal direction: convolve A with a 3x3 kernel, as in equation (1):

    G_x = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * A    (1)

In equation (1), G_x is the gray-gradient value in the horizontal direction.
Vertical direction: convolve A with a 3x3 kernel, as in equation (2):

    G_y = [[-1, -2, -1], [0, 0, 0], [+1, +2, +1]] * A    (2)

In equation (2), G_y is the gray-gradient value in the vertical direction.
The horizontal and vertical gray-gradient values of each pixel are combined by equation (3) to obtain that pixel's gray-gradient value:

    G = sqrt(G_x^2 + G_y^2)    (3)

In equation (3), G is the gray-gradient value of the current pixel.
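A direct NumPy rendering of equations (1) through (3), written as a naive sliding-window loop for clarity (real implementations would use `cv2.Sobel` or a vectorized convolution):

```python
import numpy as np

KX = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])   # horizontal kernel, eq. (1)
KY = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]])   # vertical kernel, eq. (2)

def sobel_magnitude(img):
    """Gradient magnitude per eq. (3) over every full 3x3 window of img."""
    img = np.asarray(img, dtype=float)
    H, W = img.shape
    g = np.zeros((H - 2, W - 2))
    for y in range(H - 2):
        for x in range(W - 2):
            win = img[y:y+3, x:x+3]
            gx = (win * KX).sum()          # G_x for this window
            gy = (win * KY).sum()          # G_y for this window
            g[y, x] = np.hypot(gx, gy)     # G = sqrt(G_x^2 + G_y^2)
    return g
```

On a vertical step edge from 0 to 255, the response is |G_x| = (1 + 2 + 1) * 255 = 1020 at the edge and 0 in flat regions, which is why thresholding this magnitude isolates the workpiece's contour pixels.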
In step S2, front/back detection is needed because, for a non-through-hole workpiece, the outer-circle diameter of the non-through hole must be measured from the back side; the front and back of the flange must therefore be distinguished.
In step S6, referring to fig. 5 and 6, the size detection process includes the steps of:
s6.1, acquiring ROI region information of a workpiece;
s6.1.1, the information of the ROI area of the workpiece is the minimum circumscribed rectangular information of the outline point set of the workpiece;
s6.2, carrying out position correction on the current workpiece image according to the position information of the template;
s6.2.1, the position correction is a tool for assisting in positioning, correcting the movement offset of a target and assisting in accurate positioning; establishing a reference of position offset according to the center point of the matching template and the angle of the matching template in the template matching result, and then realizing coordinate rotation offset of the ROI area, namely enabling the ROI area to keep up with the change of the image angle and the pixel;
s6.3, after the position correction, the detection frame areas of the left, middle and right circles are correspondingly corrected, and the edge contour point sets of the left, middle and right circles are screened out from all inner contour point sets of the workpiece according to each corrected detection frame area;
s6.4, performing circle fitting on the left circle by using the obtained left circle contour point set to obtain a circle 1; performing circle fitting on the intermediate circle by using the obtained intermediate circle contour point set to obtain a circle 2; performing circle fitting on the right circle by using the obtained right circle contour point set to obtain a circle 3;
s6.5, obtaining the center distance from the circle 1 to the circle 2 according to the center information; obtaining the center distance from circle 2 to circle 3 according to the center information;
s6.6, after the position correction, the detection frame areas of the four long sides of the workpiece are correspondingly corrected, and the contour point sets of the four long sides are screened and extracted from the workpiece's contour point sets according to the corrected detection frame areas;
s6.7, fitting the four long sides of the workpiece by using the obtained contour point sets of the four long sides of the workpiece to obtain straight lines 1,2,3 and 4;
s6.8, fitting a quadrangle through straight lines 1,2,3 and 4;
s6.9, acquiring a long diagonal line 1 and a short diagonal line 2 of the quadrangle;
s6.10, searching edges of the workpiece according to the angle and position information of the diagonal line 1;
s6.10.1, edge finding means solving for the points where the straight-line equation of diagonal 1 intersects the outer contour point set of the workpiece; the intersection on the left is taken as the left edge point and the intersection on the right as the right edge point;
s6.11, obtaining the length of the workpiece according to the distance between the left edge point and the right edge point;
s6.12, searching edges of the workpiece according to the angle and position information of the diagonal line 2;
s6.12.1, edge finding means solving for the points where the straight-line equation of diagonal 2 intersects the workpiece edge point set; the upper intersection is the upper edge point and the lower intersection the lower edge point;
s6.13, obtaining the width of the workpiece according to the distance between the upper edge and the lower edge;
s6.14, outputting all the measured size information.
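Steps S6.8 to S6.13 reduce to intersecting the four fitted side lines and measuring the diagonal lengths. A minimal sketch, assuming each fitted side is expressed in the general form a·x + b·y + c = 0 (the function and parameter names are illustrative, not from the patent):

```python
import math

def intersect(l1, l2):
    """Intersection of two lines, each given as (a, b, c) with a*x + b*y + c = 0."""
    a1, b1, c1 = l1
    a2, b2, c2 = l2
    d = a1 * b2 - a2 * b1
    if abs(d) < 1e-12:
        return None  # parallel lines: no single intersection
    # Cramer's rule for the 2x2 linear system.
    return ((b1 * c2 - b2 * c1) / d, (a2 * c1 - a1 * c2) / d)

def quad_diagonals(top, right, bottom, left):
    """Corners of the quadrilateral bounded by four fitted lines, plus the
    long and short diagonal lengths (steps S6.8-S6.9, sketched)."""
    p1 = intersect(top, left)      # top-left corner
    p2 = intersect(top, right)     # top-right corner
    p3 = intersect(bottom, right)  # bottom-right corner
    p4 = intersect(bottom, left)   # bottom-left corner
    d1 = math.dist(p1, p3)
    d2 = math.dist(p2, p4)
    return (p1, p2, p3, p4), max(d1, d2), min(d1, d2)
```

For an axis-aligned 4 × 3 rectangle both diagonals come out as 5; on a real flange image the longer diagonal then drives the length measurement (S6.10-S6.11) and the shorter one the width measurement (S6.12-S6.13).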
The circle fitting applied in step S6.4 and the straight-line fitting applied in step S6.7 are both implemented by combining the Ransac (random sample consensus) algorithm principle with the least-squares principle.
Specifically, in step S6.4, the circle fitting method is as follows:
s6.4.1, suppose the contour point set extracted from the corresponding detection region, {(x_j, y_j), j = 1, …, n}, has n pixel points, where point (x_j, y_j) represents the j-th pixel point in the contour point set, and assume the circle equation to be solved is (x − A)² + (y − B)² = R², where A represents the horizontal coordinate of the circle center, B the vertical coordinate of the circle center, and R the circle radius;
s6.4.2, randomly extract three points from the n pixel points and substitute them into the circle equation (x − A)² + (y − B)² = R² to obtain A, B and R;
s6.4.3, calculate the distance from each of the other points to the circle; points whose distance is smaller than a set threshold (2 pixels in this embodiment) are taken as inner points, and the number of inner points is counted;
s6.4.4, repeat steps S6.4.2 to S6.4.3 M times, and take the inner point set with the most points;
s6.4.5, perform a least-squares fit on the inner point set with the most points, {(x_i, y_i), i = 1, …, N}:

C = N·Σx_i² − (Σx_i)²
D = N·Σ(x_i·y_i) − Σx_i·Σy_i
E = N·Σx_i³ + N·Σ(x_i·y_i²) − (Σx_i² + Σy_i²)·Σx_i
G = N·Σy_i² − (Σy_i)²
H = N·Σ(x_i²·y_i) + N·Σy_i³ − (Σx_i² + Σy_i²)·Σy_i
a = (H·D − E·G) / (C·G − D²)
b = (H·C − E·D) / (D² − G·C)
c = −[Σ(x_i² + y_i²) + a·Σx_i + b·Σy_i] / N
A = −a/2,  B = −b/2,  R = (1/2)·√(a² + b² − 4c)

wherein C, D, E, G, H, a, b and c are intermediate parameters in the derivation of A, B and R, i.e. names given to the sub-expressions obtained after the calculation formulas of A, B and R are reasonably decomposed, and all sums run over the inner point set.

This yields the circle equation (x − A)² + (y − B)² = R².

Wherein point (x_i, y_i) represents the i-th pixel point in the inner point set, and N represents the number of pixel points in the inner point set;
the value of M can be estimated by

M = log(1 − z) / log(1 − p³)

wherein p represents the probability that a point is an inner point, p³ the probability that all 3 sampled points are inner points, 1 − p³ the probability that at least one of the 3 points is an outlier (sampling failure), and z = 1 − (1 − p³)^M the probability that at least one of the M samples succeeds;
in this embodiment, assuming that p=0.8 and z=0.99, the value of M may be 7.
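The circle-fitting procedure of steps S6.4.1 to S6.4.5 can be sketched as follows. This is a hypothetical illustration, not the patent's code: the three-point circle helper is a standard circumcircle construction, and the sample count M is computed from the text's formula (p = 0.8, z = 0.99 gives M = 7).

```python
import math
import random

def circle_from_3pts(p1, p2, p3):
    """Circle through three points (step S6.4.2); returns (A, B, R) or None."""
    ax, ay = p1; bx, by = p2; cx, cy = p3
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if abs(d) < 1e-12:
        return None  # degenerate (collinear) sample
    ux = ((ax*ax + ay*ay)*(by - cy) + (bx*bx + by*by)*(cy - ay)
          + (cx*cx + cy*cy)*(ay - by)) / d
    uy = ((ax*ax + ay*ay)*(cx - bx) + (bx*bx + by*by)*(ax - cx)
          + (cx*cx + cy*cy)*(bx - ax)) / d
    return ux, uy, math.hypot(ax - ux, ay - uy)

def fit_circle_lsq(pts):
    """Least-squares circle via x^2 + y^2 + a*x + b*y + c = 0 (step S6.4.5)."""
    n = float(len(pts))
    sx = sum(x for x, _ in pts);  sy = sum(y for _, y in pts)
    sxx = sum(x*x for x, _ in pts);  syy = sum(y*y for _, y in pts)
    sxy = sum(x*y for x, y in pts)
    sxxx = sum(x**3 for x, _ in pts);  syyy = sum(y**3 for _, y in pts)
    sxyy = sum(x*y*y for x, y in pts);  sxxy = sum(x*x*y for x, y in pts)
    C = n*sxx - sx*sx
    D = n*sxy - sx*sy
    E = n*sxxx + n*sxyy - (sxx + syy)*sx
    G = n*syy - sy*sy
    H = n*sxxy + n*syyy - (sxx + syy)*sy
    a = (H*D - E*G) / (C*G - D*D)
    b = (H*C - E*D) / (D*D - G*C)
    c = -(sxx + syy + a*sx + b*sy) / n
    return -a/2.0, -b/2.0, 0.5*math.sqrt(a*a + b*b - 4.0*c)

def ransac_circle(pts, thresh=2.0, p=0.8, z=0.99):
    """Ransac loop (steps S6.4.2-S6.4.4) followed by least squares."""
    M = math.ceil(math.log(1.0 - z) / math.log(1.0 - p**3))
    best = []
    for _ in range(M):
        circ = circle_from_3pts(*random.sample(pts, 3))
        if circ is None:
            continue
        A, B, R = circ
        inliers = [q for q in pts
                   if abs(math.hypot(q[0] - A, q[1] - B) - R) < thresh]
        if len(inliers) > len(best):
            best = inliers
    return fit_circle_lsq(best)
```

On clean contour points with a single outlier, the Ransac stage isolates the on-circle inner points and the final least-squares pass recovers the center and radius.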
Specifically, in step S6.7, the straight line fitting method is as follows:
s6.7.1, suppose the contour point set extracted from the corresponding detection region, {(x_j, y_j), j = 1, …, n}, has n pixel points, where point (x_j, y_j) represents the j-th pixel point in the contour point set, and assume the straight-line equation to be solved is y = a·x + b, where a represents the slope of the sought line and b the intercept;
s6.7.2, randomly extract two points from the n pixel points and substitute them into the line equation y = a·x + b to obtain a and b;
s6.7.3, calculate the distance from each of the other points to the line; points whose distance is smaller than a set threshold (2 pixels in this embodiment) are taken as inner points, and the number of inner points is counted;
s6.7.4, repeat steps S6.7.2 to S6.7.3 M times, and take the inner point set with the most points;
s6.7.5, perform a least-squares fit on the inner point set with the most points, {(x_i, y_i), i = 1, …, N}:

a = [N·Σ(x_i·y_i) − Σx_i·Σy_i] / [N·Σx_i² − (Σx_i)²]
b = (Σy_i − a·Σx_i) / N

obtaining the straight-line equation y = a·x + b;

wherein point (x_i, y_i) represents the i-th pixel point in the inner point set, and N represents the number of pixel points in the inner point set;
the value of M can be estimated by

M = log(1 − z) / log(1 − p²)

wherein p represents the probability that a point is an inner point, p² the probability that both sampled points are inner points, 1 − p² the probability that at least one of the 2 points is an outlier (sampling failure), and z = 1 − (1 − p²)^M the probability that at least one of the M samples succeeds;
in this embodiment, assuming that p=0.8 and z=0.99, the value of M may be 5.
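The line-fitting counterpart of steps S6.7.1 to S6.7.5 can be sketched the same way. Again a hypothetical illustration, not the patent's code; vertical samples are skipped because y = a·x + b cannot represent them.

```python
import math
import random

def fit_line_lsq(pts):
    """Least-squares fit of y = a*x + b over the inner point set (step S6.7.5)."""
    n = float(len(pts))
    sx = sum(x for x, _ in pts);  sy = sum(y for _, y in pts)
    sxx = sum(x*x for x, _ in pts);  sxy = sum(x*y for x, y in pts)
    a = (n*sxy - sx*sy) / (n*sxx - sx*sx)
    b = (sy - a*sx) / n
    return a, b

def ransac_line(pts, thresh=2.0, p=0.8, z=0.99):
    """Ransac loop (steps S6.7.2-S6.7.4) followed by least squares."""
    M = math.ceil(math.log(1.0 - z) / math.log(1.0 - p**2))  # 5 for p=0.8, z=0.99
    best = []
    for _ in range(M):
        (x1, y1), (x2, y2) = random.sample(pts, 2)
        if x1 == x2:
            continue  # vertical sample: slope undefined for y = a*x + b
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        norm = math.hypot(a, 1.0)  # point-line distance: |a*x - y + b| / sqrt(a^2 + 1)
        inliers = [q for q in pts if abs(a * q[0] - q[1] + b) / norm < thresh]
        if len(inliers) > len(best):
            best = inliers
    return fit_line_lsq(best)
```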
In summary, for the circle-fitting and line-fitting processes above: if a circle or straight line is fitted with the least-squares method alone, small burrs, bumps or dents on the workpiece edge produce outliers in the extracted contour, which makes the fit inaccurate. This embodiment therefore first applies the Ransac idea: from the probability p that a randomly drawn contour point is an inner point (non-outlier) and the required probability z that at least one of M random samples from the detection object (contour point set) succeeds, the number of samples M is computed and the inner point set with the most points is collected; least-squares fitting is then run on that set. This effectively removes outliers, improves fitting accuracy and hence improves measurement accuracy.
In steps S3/S5, for pit and missing-material detection under the second irradiation condition, less light at pits and missing material is reflected back to the photosensitive surface of the camera 14, so those areas appear darker than normal areas, i.e. the gray values of their pixel points are smaller than those of normal areas. The gray image is therefore binarized against a set value (preferably 80 in this embodiment): pixel points with gray value greater than the set value are set to 1 and the others to 0. The connected domains of 0-valued pixel points are extracted, and if the area of an extracted connected domain is larger than a set value (preferably 1.5 mm² in this embodiment), a pit or missing material is judged to be present.
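The binarize-then-measure-connected-domains logic can be sketched as follows. This is illustrative only: the function and parameter names are invented, the connected-component search is a plain flood fill, and the 1.5 mm² area threshold must be converted to a pixel count using the camera's mm-per-pixel calibration, which the text does not give.

```python
import numpy as np
from collections import deque

def dark_regions(gray, gray_thresh=80, min_area_px=100):
    """Report pit / missing-material candidates: 4-connected dark regions.

    Pixels with gray value <= gray_thresh are 'dark' (binarized to 0 in the
    text); dark regions larger than min_area_px pixels are returned.
    min_area_px stands in for the 1.5 mm^2 threshold (calibration assumed).
    """
    dark = np.asarray(gray) <= gray_thresh
    h, w = dark.shape
    seen = np.zeros_like(dark, dtype=bool)
    regions = []
    for r in range(h):
        for c in range(w):
            if dark[r, c] and not seen[r, c]:
                area, q = 0, deque([(r, c)])
                seen[r, c] = True
                while q:  # breadth-first flood fill of one connected domain
                    y, x = q.popleft()
                    area += 1
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w and dark[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                if area > min_area_px:
                    regions.append(area)
    return regions
```

A large dark patch is reported as a defect candidate while a small speck below the area threshold is ignored.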
In steps S3/S5, for vibration-knife detection and sunny-and-shady-surface detection under the second irradiation condition, over-machining removes the inherent shallow ring-shaped halo lines from the surface in vibration-knife and sunny-and-shady areas, so these areas appear brighter than normal areas, i.e. the gray values of their pixel points are larger than those of normal areas. Mean filtering is first applied to the gray image, and the maximum inter-class variance method is then used to calculate the optimal gray-thresholding threshold k*. It is next judged whether k* lies within a set range (preferably 100 to 180 in this embodiment): if not, no vibration-knife or sunny-and-shady area exists; if so, the defect may be present, and the obtained k* is used to measure the area occupied by the vibration knife and the sunny-and-shady surface. If that area is larger than a set value (preferably 4 mm² in this embodiment), the defect is judged present; if the area is too small, the defect is judged absent.
The mean filtering traverses each pixel point in the gray image and processes each as follows: the pixel point is called the target pixel point; the target pixel point and the eight pixel points surrounding it form the filtering template, and the gray value of the target pixel point is replaced by the mean gray value of all pixel points in the template.
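The 3×3 mean filter described above can be sketched as follows; edge pixels are handled here by border replication, an assumption the text does not specify.

```python
import numpy as np

def mean_filter_3x3(gray):
    """Replace each pixel by the mean of itself and its eight neighbours.

    Borders are edge-replicated (assumed; the text leaves this unspecified).
    """
    g = np.asarray(gray, dtype=float)
    p = np.pad(g, 1, mode="edge")
    out = np.zeros_like(g)
    h, w = g.shape
    for r in range(h):
        for c in range(w):
            out[r, c] = p[r:r + 3, c:c + 3].mean()  # 3x3 filtering template
    return out
```

A single bright pixel is spread evenly over its 3×3 neighbourhood, which suppresses the isolated speckle noise before the Otsu threshold is computed.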
The maximum inter-class variance method obtains the optimal threshold k* for gray-level thresholding as follows:
Suppose the gray image has 256 gray levels [1, 2, …, 256], and let n_i be the number of pixel points with gray level i; then the total number of pixels is

N = n_1 + n_2 + … + n_256.

The normalized gray-level histogram is used and regarded as a probability distribution of this image:

p_i = n_i / N,  p_i ≥ 0,  Σ_{i=1}^{256} p_i = 1          (4)

wherein p_i in equation (4) represents the probability of gray level i in this image.
Now suppose these pixels are divided into two classes by a threshold at gray level k: C_0 and C_1, where C_0 denotes the pixel points with gray levels [1, …, k] and C_1 the pixel points with gray levels [k+1, …, 256]. The occurrence probability and the average gray level of each class are then given by:

ω_0 = Pr(C_0) = Σ_{i=1}^{k} p_i = ω(k)          (5)

wherein ω_0 in equation (5) represents the occurrence probability of C_0, and ω(k) the cumulative occurrence probability of gray levels 1 to k;

ω_1 = Pr(C_1) = Σ_{i=k+1}^{256} p_i = 1 − ω(k)          (6)

wherein ω_1 in equation (6) represents the occurrence probability of C_1;

μ_0 = Σ_{i=1}^{k} i·p_i / ω_0 = μ(k) / ω(k)          (7)

wherein μ_0 in equation (7) represents the average gray level of C_0, and μ(k) the average gray level of gray levels 1 to k;

μ_1 = Σ_{i=k+1}^{256} i·p_i / ω_1 = [μ_T − μ(k)] / [1 − ω(k)]          (8)

wherein μ_1 in equation (8) represents the average gray level of C_1, and μ_T the average gray level of the entire image.
Here

ω(k) = Σ_{i=1}^{k} p_i          (9)

μ(k) = Σ_{i=1}^{k} i·p_i          (10)

are, respectively, the cumulative occurrence probability and the average gray level (first-order cumulative moment) of gray levels 1 to k, and

μ_T = μ(256) = Σ_{i=1}^{256} i·p_i          (11)

is the average gray level of the whole image.
For any selected k, there are:
ω_0·μ_0 + ω_1·μ_1 = μ_T,  ω_0 + ω_1 = 1.          (12)
η in the following equation (13) is selected as the criterion for evaluating the "goodness" (separability) of k as a threshold:

η = σ_B² / σ_T²          (13)

wherein σ_B² represents the between-class variance and σ_T² the total gray-level variance:

σ_B² = ω_0·(μ_0 − μ_T)² + ω_1·(μ_1 − μ_T)²          (14)

σ_T² = Σ_{i=1}^{256} (i − μ_T)²·p_i          (15)

Also, from equation (12), equation (14) can be rewritten as σ_B² = ω_0·ω_1·(μ_1 − μ_0)². Equations (14) and (15) are the between-class variance and the total gray-level variance, respectively.
Based on equations (9) and (10), different values of k are tried in a sequential search to find the optimal threshold k* that maximizes η, or equivalently maximizes σ_B², using:

σ_B²(k) = [μ_T·ω(k) − μ(k)]² / {ω(k)·[1 − ω(k)]}          (16)

and the optimal threshold k* is

k* = arg max over 1 ≤ k < 256 of σ_B²(k).          (17)
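The maximum inter-class variance (Otsu) search can be sketched with cumulative sums. A minimal illustration, using 0-255 gray-level indexing rather than the text's 1-256 (the selected split between the two pixel classes is the same).

```python
import numpy as np

def otsu_threshold(gray):
    """Return k* maximizing the between-class variance sigma_B^2(k)."""
    g = np.asarray(gray).ravel()
    hist = np.bincount(g, minlength=256).astype(float)
    p = hist / hist.sum()                        # p_i, eq. (4)
    omega = np.cumsum(p)                         # omega(k), eq. (9)
    mu = np.cumsum(np.arange(256) * p)           # mu(k), eq. (10)
    mu_T = mu[-1]                                # mu_T, eq. (11)
    denom = omega * (1.0 - omega)
    denom[denom == 0] = np.nan                   # one class empty: undefined
    sigma_b2 = (mu_T * omega - mu) ** 2 / denom  # sigma_B^2(k), eq. (16)
    return int(np.nanargmax(sigma_b2))           # k*, eq. (17)
```

For a cleanly bimodal image (half the pixels at 50, half at 200) the between-class variance plateaus between the modes and the first maximizer is returned.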
Vibration-knife and sunny-and-shady-surface defects arise during workpiece machining, and the front surface of the workpiece is the machined surface; in this embodiment, these defects are therefore detected only on the front surface. In the defect-detection flow, pit and missing-material detection is performed first, and vibration-knife and sunny-and-shady-surface detection is performed after workpieces with pits or missing material have been removed, which ensures the accuracy of the vibration-knife and sunny-and-shady-surface detection.
In steps S3/S5, for scratch detection under the third irradiation condition, more light at scratches is reflected back to the photosensitive surface of the camera 14, while less light from other normal areas is. However, because scratch defects of different shapes and depths may exist at the same time and their distinction from normal areas is not large, scratch defects are detected with a deep-learning method. When labeling defect targets in the gray images, a morphological closing operation is first applied to each gray image so that intermittent scratches are connected and appear complete. After the model is trained, detection likewise first applies the morphological closing operation to the gray image under test and then invokes the model. Specifically, the closed gray image is traversed according to the trained model size (for example, a matrix of 5 × 5 pixels); if the confidence between a region of the gray image and the model reaches 90%, a scratch is judged present, otherwise not.
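The morphological closing (dilation followed by erosion) used to join intermittent scratch segments can be sketched in pure NumPy. The 3×3 square structuring element is an assumption; the patent does not specify the element.

```python
import numpy as np

def close_binary(mask, k=3):
    """Morphological closing (dilate then erode) with a k x k square element."""
    m = np.asarray(mask, dtype=bool)
    r = k // 2

    def dilate(a):
        p = np.pad(a, r, mode="constant", constant_values=False)
        out = np.zeros_like(a)
        for y in range(a.shape[0]):
            for x in range(a.shape[1]):
                out[y, x] = p[y:y + k, x:x + k].any()  # any foreground in window
        return out

    def erode(a):
        # Pad with True so the border is not eroded away.
        p = np.pad(a, r, mode="constant", constant_values=True)
        out = np.zeros_like(a)
        for y in range(a.shape[0]):
            for x in range(a.shape[1]):
                out[y, x] = p[y:y + k, x:x + k].all()  # window fully foreground
        return out

    return erode(dilate(m))
```

A one-pixel gap in a scratch segment is bridged by the closing, which is exactly the "connect intermittent scratches" behaviour the labeling step relies on.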
The training process of the deep learning model comprises the following steps:
s3/5.1, collecting a large number of scratch sample images;
s3/5.2, marking the areas where scratches are located in all sample images;
s3/5.3, inputting the marked data and images to a backbone network;
s3/5.4, normalizing the sample images in the training set within the network and scaling them to an integer multiple of 32;
s3/5.5, setting the width and height of an initial candidate frame in the boundary regression module;
s3/5.6, starting iterative training of a prediction model;
s3/5.7, generating a prediction model after training is completed, and deriving;
s3/5.8, reasoning the actual detection image by using a prediction model;
s3/5.9, judging whether the detection accuracy exceeds 95%;
s3/5.10, if the accuracy is below 95%, go to step S3/5.11; if above 95%, go to step S3/5.14;
s3/5.11, re-labeling the missed scratches;
s3/5.12, removing labels where normal areas were erroneously detected as scratches;
s3/5.13, feeding the re-labeled images back into step S3/5.3;
s3/5.14 training is completed.
In addition, in the detection device 1 of this embodiment, the three light sources and the camera 14 are arranged in the housing 16 so as to obtain a good irradiation effect and avoid interference from external ambient light and the like, which would affect detection accuracy. With the housing 16 in place, however, movement of the gantry robot 2 to the stage 15 inside the housing 16 is somewhat restricted and the flexibility of the gantry robot 2 is limited. This embodiment therefore further includes a telescopic mechanism mounted on the stage 15 and communicatively connected to the analysis processing unit, used to drive the stage 15 out of the housing 16 so that the gantry robot can conveniently move and turn over the flange on the stage 15; the telescopic mechanism in this embodiment is preferably an electric linear slide rail.
It should be noted that, in order for the camera 14 to obtain better image information, a telecentric lens 17 is further provided at the shooting end of the camera 14. When the workpiece is placed on the stage 15, machine-precision limitations and the like prevent it from being exactly coaxial with the shooting direction of the camera 14, i.e. the placement position is not consistent. This affects all of the above detection results to some extent, especially dimension measurement: when the workpiece is placed askew, its side surface is photographed as well and is easily included in the upper-surface region of the workpiece, causing dimension-measurement errors. The telecentric lens 17 is therefore added at the shooting end of the camera 14, and its high resolution, ultra-wide depth of field, ultra-low distortion and unique parallel-light design reduce the influence of a skewed workpiece on the image acquired by the camera 14.
It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.

Claims (9)

1. A method for measuring the dimensions of a flange, comprising the steps of:
s1, irradiating the lower part of an objective table towards the back surface of a workpiece by adopting a telecentric light source, and acquiring a projection image of the workpiece above the objective table by adopting a camera;
s2, the analysis processing unit calls a corresponding template to be matched with the workpiece, and position correction is carried out on a projection image of the current workpiece according to the position information of the template;
s3, after the position correction, the analysis processing unit screens out an edge contour point set of each circle from all inner contour point sets of the workpiece;
s3.1, performing circle fitting on the circle by using the obtained edge contour point set of the circle to obtain circle center information;
s3.2, obtaining center distance information among the circles according to the center information;
s4, after the position correction, the analysis processing unit screens and extracts edge contour point sets of each long side of the workpiece from the contour point sets of the workpiece;
s4.1, fitting the long sides by using the obtained edge contour point set of the long sides to obtain a plurality of straight lines;
s4.2, fitting the polygon through a plurality of straight lines to obtain a plurality of diagonal lines of the polygon;
s4.3, carrying out edge searching on the workpiece according to the angle and position information of the diagonal line to obtain two points where the diagonal line intersects with the edge point set of the workpiece, and obtaining the length or width of the workpiece according to the distance between the two points;
s5, the analysis processing unit outputs all the measured size information.
2. The flange size measuring method according to claim 1, wherein the template matching in step S2 comprises the steps of:
s2.1, carrying out gray level binarization on each pixel point in the gray level image to be detected;
s2.2, adopting a Sobel edge detection algorithm to obtain a gray gradient image;
s2.3, carrying out contour finding processing on the gray gradient image to obtain information of an outer contour point set and an inner contour point set of the workpiece;
s2.4, acquiring the information of the ROI area of the workpiece, namely the information of the minimum circumscribed rectangle of the outer contour of the workpiece.
3. The flange size measuring method according to claim 2, further comprising comparing the minimum circumscribed rectangle information of the workpiece with that of the template to determine the degree of matching between the template and the workpiece.
4. The flange size measuring method according to claim 2, wherein in step S2.1 the gray-value binarization comprises: pixel gray values greater than 100 are set to 1, and the others to 0.
5. The flange size measuring method according to claim 2, wherein in step S2.4, the minimum circumscribed rectangle information includes the center point and the rotation angle of the minimum circumscribed rectangle.
6. The flange size measuring method according to claim 1, wherein in step S2, the position correction includes: establishing a positional-offset reference from the center point and the angle of the matched template, and applying a coordinate rotation offset to the ROI area so that the ROI area follows changes in image angle and pixel position.
7. The flange size measuring method according to claim 1, wherein in step S3.1, the circle fitting is implemented by combining the Ransac algorithm principle with the least-squares principle.
8. The flange size measuring method according to claim 1, wherein in step S4.1, the long-side fitting is performed by a straight-line fitting method implemented by combining the Ransac algorithm principle with the least-squares principle.
9. A flange size measuring device employing the flange size measuring method according to any one of claims 1-8, characterized by comprising a stage, a telecentric light source, a camera, and an analysis processing unit; the stage is used for placing a workpiece, the telecentric light source and the camera are arranged opposite each other on either side of the workpiece, and when the telecentric light source is switched on, the camera obtains a gray image of the workpiece; the analysis processing unit stores template information, receives the gray image, and calls the corresponding template to perform matching and position correction.
CN202310230177.XA 2023-03-10 2023-03-10 Flange size measuring method and device Pending CN116237266A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310230177.XA CN116237266A (en) 2023-03-10 2023-03-10 Flange size measuring method and device


Publications (1)

Publication Number Publication Date
CN116237266A true CN116237266A (en) 2023-06-09




Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117422714A (en) * 2023-12-18 2024-01-19 大陆汽车电子(济南)有限公司 Assembly inspection method, apparatus, and storage medium
CN117422714B (en) * 2023-12-18 2024-03-29 大陆汽车电子(济南)有限公司 Assembly inspection method, apparatus, and storage medium

Similar Documents

Publication Publication Date Title
CN109141232B (en) Online detection method for disc castings based on machine vision
CN110935644B (en) Bearing needle roller size detection system and method based on machine vision
CN110389127B (en) System and method for identifying metal ceramic parts and detecting surface defects
JP3581314B2 (en) Automatic lens inspection system
JP4254347B2 (en) Method and apparatus for detecting foreign matter in liquid in container
CN108896574B (en) Bottled liquor impurity detection method and system based on machine vision
Liu et al. Automatic detection technology of surface defects on plastic products based on machine vision
CN110146516B (en) Fruit grading device based on orthogonal binocular machine vision
CN105403147A (en) Embedded bottle pre-form detection system and detection method
CN106780473A (en) A kind of magnet ring defect multi-vision visual detection method and system
CN210071686U (en) Fruit grading plant based on orthogonal binocular machine vision
CN110800294A (en) Method, equipment and system for detecting camera module and machine-readable storage medium
CN112001917A (en) Machine vision-based geometric tolerance detection method for circular perforated part
EP1677098A1 (en) Surface defect inspecting method and device
CN116237266A (en) Flange size measuring method and device
CN111474179A (en) Lens surface cleanliness detection device and method
CN111239142A (en) Paste appearance defect detection device and method
CN108872085A (en) The rotten heart lossless detection method of navel orange based on the detection of blue wave band transmission image
CN111487192A (en) Machine vision surface defect detection device and method based on artificial intelligence
CN116140225A (en) Defect detection system and defect detection method
CN112750113B (en) Glass bottle defect detection method and device based on deep learning and linear detection
Chiou et al. Flaw detection of cylindrical surfaces in PU-packing by using machine vision technique
CN113888539B (en) Defect classification method, device, equipment and storage medium
JP2005241304A (en) Appearance inspection method
CN114820622A (en) Interlayer foreign matter detection method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination