CN115955614B - Image acquisition device and defect detection system

Info

Publication number: CN115955614B
Application number: CN202211609606.6A
Authority: CN (China)
Other versions: CN115955614A
Other languages: Chinese (zh)
Inventor: 樊逸杰
Applicant/Assignee: Qingdao Chuangxin Qizhi Technology Group Co., Ltd.
Legal status: Active (granted)

Abstract

The application provides an image acquisition device and a defect detection system. In the image acquisition device, the object stage is arranged on the sliding table and is configured to slide on the sliding table along a first direction; the linear array camera is arranged at a first preset position above the sliding table and the object stage and is configured to acquire images of the object stage and of the object to be detected on the object stage. The object stage comprises a first edge arranged along the first direction and a second edge arranged along a second direction, the second direction being orthogonal to the first direction in the plane of the object stage surface; the first edge is provided with positioning holes at a first pitch, and the second edge is provided with positioning holes at a second pitch. Because the image of the moving object to be detected is acquired by the linear array camera, and the positioning holes formed in the object stage serve as reference points for adjusting the image acquired by the linear array camera, the accuracy of the image is improved while the efficiency of acquiring the image of the object to be detected is improved.

Description

Image acquisition device and defect detection system
Technical Field
The present application relates to the field of quality detection, and more particularly, to an image acquisition apparatus and a defect detection system.
Background
At present, the image acquisition device in PCB through hole deviation detection equipment is usually an area array camera. When acquiring an image of a PCB, an area array camera requires the PCB to stay still in the equipment while time is reserved for the camera to move and take several groups of photographs, so acquiring PCB images with an area array camera is relatively inefficient. A linear array camera, by contrast, captures only one line of the image at a time and assembles the successive lines into a complete image; for the same pixel precision its cost is lower, and it supports image acquisition while the detected object is in motion. However, a linear array camera is very easily affected by uneven movement speed of the object, which locally stretches or compresses the image, so that the spacing between through holes in the resulting image is inconsistent with the actual spacing and the accuracy of the acquired image cannot be guaranteed.
Disclosure of Invention
In view of the foregoing, an object of the embodiments of the present application is to provide an image acquisition device and a defect detection system that can improve the accuracy of the image while improving the image acquisition efficiency.
In a first aspect, an embodiment of the present application provides an image acquisition apparatus, including: an object stage, a sliding table and a linear array camera; the object stage is arranged on the sliding table and is configured to slide on the sliding table along a first direction; the linear array camera is arranged at a first preset position above the sliding table and the object stage and is configured to acquire images of the object stage and of the object to be detected on the object stage; the object stage includes a first edge disposed along the first direction and a second edge disposed along a second direction, wherein the second direction is a direction orthogonal to the first direction in the plane in which the surface of the object stage lies; the first edge is provided with positioning holes at a first pitch, and the second edge is provided with positioning holes at a second pitch.
In the implementation process, the linear array camera is arranged to acquire images of the object stage and of the object to be detected on the object stage, so that images can be acquired while the object stage is sliding; the object stage does not need to stop on the sliding table for the camera to acquire images, which improves the image acquisition efficiency. In addition, because positioning holes are formed in the object stage, the positions of the positioning holes in the image can be used as a reference for correcting the image, restoring the image that would have been obtained had the object stage slid at a constant speed; this reduces the image deviation caused by an uneven sliding speed of the object stage and further improves the accuracy of defect detection performed with the image.
In one embodiment, there are a plurality of linear array cameras; a first center dividing line of the object stage in the second direction is parallel to a second center dividing line of the sliding table in the second direction; the first center dividing line divides the object stage along the second direction into an object stage first side and an object stage second side, and the second center dividing line divides the sliding table along the second direction into a sliding table first side and a sliding table second side; the linear array cameras are respectively arranged on the first side of the sliding table and the second side of the sliding table and are configured to respectively acquire a first side image and a second side image of the object stage and of the object to be detected on the object stage; the linear array cameras on the first side of the sliding table and the second side of the sliding table are arranged in a staggered manner in the second direction.
In the implementation process, arranging a plurality of linear array cameras reduces the requirements on the structure and capability of each individual linear array camera. In addition, staggering the plurality of linear array cameras in the second direction prevents the gap that would arise if the housings of adjacent cameras touched, a gap in which the object stage and the object to be detected on the object stage could not be imaged; a more complete image of the object stage and of the object to be detected on it can therefore be acquired, improving the integrity of the acquired image.
In one embodiment, the object carrying table is provided with an object carrying groove; the inner frame of the object carrying groove is attached to the outer frame of the object to be detected, and the inner frame is configured to fix the position of the object to be detected on the object stage.
In the implementation process, an object carrying groove is formed whose inner frame fits against the outer frame of the object to be detected, so that the object to be detected is placed in the object carrying groove and its position on the object stage is constrained by the groove. This keeps the object to be detected at a fixed position on the object stage, reduces the influence of position changes of the object to be detected on the acquired image, and improves the accuracy of the acquired image of the object to be detected on the object stage.
In one embodiment, the image acquisition apparatus further comprises: a base and a sensor; the sliding table is arranged on the base; the sensor is arranged at a second preset position of the base and is configured to detect position information of the objective table; the second preset position indicates the acquisition range of the linear array camera in the first direction; the position information is used for triggering the linear array camera to start or end to acquire the objective table and the image of the object to be measured on the objective table.
In the implementation process, a sensor is arranged to acquire the position information of the object stage and thereby control when the linear array camera starts or stops acquiring images, so that the starting and stopping of the linear array camera follow the position of the object stage: the camera starts acquiring when an image can actually be captured and stops after the object stage has left. This reduces the acquisition of irrelevant images, ensures that the linear array camera accurately captures the object to be detected, and reduces wear on the linear array camera.
In one embodiment, the image acquisition apparatus further comprises: a linear array light source; the linear array light source is configured to illuminate the object table and the object to be measured on the object table when the linear array camera acquires the images of the object table and the object to be measured on the object table.
In the implementation process, arranging the linear array light source illuminates the object table and the object to be detected on the object table, so that the linear array camera can acquire clear images of the object table and of the object to be detected on it, improving the clarity of those images.
In a second aspect, embodiments of the present application further provide a defect detection system, including: a defect detecting device and the image acquiring device according to any one of the first aspects; the defect detection device is connected with the image acquisition device; the defect detection device is used for acquiring the target image sent by the image acquisition device; the defect detection device is further used for calibrating the target image according to the positioning holes in the target image so as to generate a calibrated target image; and matching the calibrated target image with the standard target image, and judging whether the object to be detected corresponding to the target image has a hole deviation defect or not.
In the implementation process, by providing the defect detection device and calibrating the image acquired by the image acquisition device with the positioning holes, the image of the object table and of the object to be detected on the object table that would have been acquired had the object table slid at a uniform speed is restored, the influence of external factors on the detection result is reduced, and the accuracy of defect detection is improved.
In one embodiment, in the process of generating the calibrated target image, the defect detection device is specifically configured to: determining a plurality of target positioning holes in the target image, the target positioning holes comprising the positioning holes of the first edge and the positioning holes of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image after the whole image is corrected so as to generate the calibrated target image.
In the implementation process, the coordinate system where the target image is located is mapped according to the actual coordinates of the target positioning holes in the target image, so that the whole target image is corrected. Since the target positioning hole is selected as the positioning hole in the second direction that is not affected by the stage sliding speed, the actual coordinates of the target positioning hole can be regarded as the coordinates of the target positioning hole in the standard image, and thus the target image coordinate system mapped by taking the actual coordinates of the target positioning hole as the reference coordinates is the standard coordinate system of the target image. The whole target image is corrected through the coordinate system, so that the influence of defects such as whole deviation and distortion of the target image on a detection result is avoided, and the accuracy of defect detection is improved.
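A hedged sketch of this correction step (the notation below is illustrative and not taken from the application): let $(u_i, v_i)$ denote the detected centers of the target positioning holes in the raw target image and $(x_i, y_i)$ their known coordinates on the object stage, taken as the standard coordinates. The whole-image correction can then be viewed as fitting a transform $T$ (affine for three holes, projective for four) and resampling the image through it:

$$T = \arg\min_{T'} \sum_i \bigl\| T'(u_i, v_i) - (x_i, y_i) \bigr\|^2, \qquad I_{\text{corrected}}(x, y) = I\bigl(T^{-1}(x, y)\bigr).$$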
In one embodiment, in the process of generating the calibrated target image, the defect detection device is specifically configured to: determine the center points of the positioning holes of the target image in the first direction after the whole image has been corrected, and draw through each center point a straight line parallel to the second direction; determine the partial image between each pair of adjacent positioning holes from the straight lines of those two positioning holes; and stretch or shrink each partial image according to the actual positions of the positioning holes until all partial images in the corrected target image have been calibrated, so as to generate the calibrated target image.
In the implementation process, based on the fact that the spacing between positioning holes is constant, the relation between the spacing of the positioning holes in the target image and the actual spacing determines whether the image between two adjacent positioning holes has been stretched or compressed by variations in the sliding speed of the object stage. The compressed or stretched partial image is correspondingly stretched or compressed according to the difference between the spacing in the target image and the actual spacing, so that each partial image of the target image is calibrated, the influence of the object stage speed on the detection result is avoided, and the accuracy of defect detection is improved.
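A hedged formulation of this local step (notation is illustrative): if $d_k$ is the measured spacing between positioning holes $k$ and $k+1$ in the target image and $p_1$ is the known first pitch, the partial image between them is rescaled along the first direction by the factor

$$s_k = \frac{p_1}{d_k},$$

so that $s_k > 1$ stretches a compressed partial image ($d_k < p_1$) and $s_k < 1$ shrinks a stretched one.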
In one embodiment, in the process of determining whether the to-be-detected object corresponding to the target image has a hole deviation defect, the defect detection device is specifically configured to: and respectively matching the coordinate positions of the through holes of the to-be-detected object on the calibrated image with the coordinate positions of the through holes of the to-be-detected object in the standard image, and sequentially judging whether the through holes of the to-be-detected object have hole deviation defects according to the matching result.
In the implementation process, the coordinate positions of the through holes in the calibrated target image are matched one by one with the coordinate positions of the through holes in the standard image, so that it can be determined whether each through hole has a hole deviation defect and, in turn, whether the object to be detected has a hole deviation defect. Every through hole in the object to be detected can thus be accurately checked for hole deviation defects, ensuring the quality of the object to be detected.
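A minimal sketch of this matching step, assuming the through-hole centers have already been extracted from the calibrated image and paired with the corresponding centers in the standard image; the function and parameter names are illustrative and not from the application:

```python
import numpy as np

def check_hole_offsets(calibrated_xy, standard_xy, tolerance_px):
    """Flag through holes whose center in the calibrated target image deviates
    from the corresponding center in the standard image by more than tolerance_px.

    calibrated_xy, standard_xy -- N x 2 arrays of matched hole centers (pixels)
    Returns a boolean array, True where a hole deviation defect is detected.
    """
    offsets = np.linalg.norm(np.asarray(calibrated_xy, dtype=float)
                             - np.asarray(standard_xy, dtype=float), axis=1)
    return offsets > tolerance_px
```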
In one embodiment, the first center dividing line of the object stage in the second direction is parallel to the second center dividing line of the sliding table in the second direction; the first center dividing line divides the object stage into an object stage first side and an object stage second side along the second direction, the second center dividing line divides the sliding table into a sliding table first side and a sliding table second side along the second direction, and the target image comprises a first side target image and a second side target image; in the process of matching the calibrated target image and the standard target image, the defect detection device is further used for: and splicing the first side target image and the second side target image along the second direction to form a complete objective table and an image of an object to be detected on the objective table.
In the implementation process, splicing the first side target image and the second side target image in the second direction restores the complete image of the object stage and of the object to be detected on it more accurately, so that the coordinate positions of the through holes in the image better match their actual coordinate positions, improving detection accuracy.
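A minimal sketch of the splicing step, assuming both side images have already been calibrated to the same scale and assuming, for illustration only, that the second direction maps to the image column axis:

```python
import numpy as np

def stitch_sides(first_side_img, second_side_img):
    """Join the first-side and second-side target images along the second
    direction (here: the column axis) into one complete image of the object
    stage and the object to be detected on it.
    """
    if first_side_img.shape[0] != second_side_img.shape[0]:
        raise ValueError("side images must have the same height before splicing")
    return np.concatenate([first_side_img, second_side_img], axis=1)
```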
In a third aspect, an embodiment of the present application further provides a defect detection method, including: acquiring a target image processed by the image acquisition device; calibrating the target image according to the positioning holes in the target image to generate a calibrated target image; and matching the calibrated target image with the standard target image, and judging whether the object to be detected corresponding to the target image has a hole deviation defect or not.
In one embodiment, calibrating the target image according to the locating hole in the target image to generate a calibrated target image includes: determining a plurality of target positioning holes in the target image, the target positioning holes comprising the positioning holes of the first edge and the positioning holes of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image after the whole image is corrected so as to generate the calibrated target image.
In one embodiment, calibrating a local image in the target image with the target image whose whole image has been corrected, to generate the calibrated target image, includes: determining the center points of the positioning holes of the target image in the first direction after the whole image has been corrected, and drawing through each center point a straight line parallel to the second direction; determining the partial image between each pair of adjacent positioning holes from the straight lines of those two positioning holes; and stretching or shrinking each partial image according to the actual positions of the positioning holes until all partial images in the corrected target image have been calibrated, so as to generate the calibrated target image.
In one embodiment, matching the calibrated target image with the standard target image, and determining whether the object to be detected corresponding to the target image has a hole deviation defect includes: and respectively matching the coordinate positions of the through holes of the to-be-detected object on the calibrated image with the coordinate positions of the through holes of the to-be-detected object in the standard image, and sequentially judging whether the through holes of the to-be-detected object have hole deviation defects according to the matching result.
In one embodiment, a first center split line of the stage in the second direction is parallel to a second center split line of the slide table in the second direction; the first center dividing line divides the object stage into an object stage first side and an object stage second side along the second direction, the second center dividing line divides the sliding table into a sliding table first side and a sliding table second side along the second direction, and the target image comprises a first side target image and a second side target image; matching the calibrated target image with the standard target image, and judging whether the object to be detected corresponding to the target image has a hole deviation defect or not, wherein the method comprises the following steps: and splicing the first side target image and the second side target image along the second direction to form a complete objective table and an image of an object to be detected on the objective table.
In a fourth aspect, an embodiment of the present application further provides a defect detection apparatus, including: a processor and a memory storing machine-readable instructions executable by the processor, which, when executed by the processor, perform the steps of the defect detection method of the third aspect or any of its possible implementations.
In a fifth aspect, the embodiments of the present application further provide a computer readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the defect detection method of the third aspect or any of its possible implementations.
In order to make the above objects, features and advantages of the present application more comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are needed in the embodiments will be briefly described below, it being understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered limiting the scope, and that other related drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a three-dimensional view of an image acquisition device including a sliding table and a line camera according to an embodiment of the present application;
fig. 2 is a three-dimensional view of an image acquisition device including two sliding tables and two line cameras provided in an embodiment of the present application;
FIG. 3 is a top view of a stage on which an object to be measured is placed and the object to be measured on the stage according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of a defect detection system according to an embodiment of the present disclosure;
FIG. 5 is a schematic block diagram of a defect detection apparatus according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of determining three target positioning holes in a target image provided in an embodiment of the present application;
FIG. 7 is a schematic diagram of determining four target positioning holes in a target image provided in an embodiment of the present application;
FIG. 8 is a schematic view of a partial image of a target image for determining adjacent positioning holes according to an embodiment of the present disclosure;
FIG. 9 is a flowchart of a defect detection method according to an embodiment of the present disclosure;
fig. 10 is a schematic functional block diagram of a defect detecting device according to an embodiment of the present application.
Reference numerals: the device comprises an image acquisition device-10, an object stage-100, an object carrying groove-110, a sliding table-200, a linear array camera-300, a base-400, a sensor-500, a defect detection device-20, a memory-210, a processor-220, a peripheral interface-230, an acquisition module-221, a calibration module-222 and a judgment module-223.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application.
It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures. Meanwhile, in the description of the present application, the terms "first", "second", and the like are used only to distinguish the description, and are not to be construed as indicating or implying relative importance.
Printed circuit boards (hereinafter referred to as PCBs) are an important component of electronic devices. As demand for electronic devices keeps growing, the requirements on the precision of the PCB manufacturing process continue to rise, and through hole deviation defects on PCBs are a key factor affecting the precision of subsequent processes. A through hole is a hole drilled in the PCB for mounting a plug-in component or connecting routing between layers. In PCB production and processing, the drilled position deviating from the designed position of the through hole is one of the common defects, and the more refined the subsequent processing, the lower the tolerance for through hole deviation defects.
At present, a PCB production line is generally equipped with PCB through hole deviation detection equipment. The detection principle of the existing equipment is as follows: the PCB is placed on an object stage, transported into a detection box and held stationary; inside the detection box, a mechanical structure that can move in the X and Y directions above the plane of the PCB drives an area array camera to quickly photograph every part of the PCB, taking several groups of photographs according to the required detection precision; the photographs are then passed to an industrial computer, which calculates the position information of the through holes in the photographs and checks whether any through hole has a hole deviation defect.
The existing PCB through hole deviation detection equipment offers fine detection but takes a long time and is expensive. It is suitable for inspecting high-precision products but cannot keep up with the fast pace of a production line making PCBs of ordinary specifications, so a single production line would need several sets of detection equipment to match the production speed, which consumes more cost and occupies more space.
With the gradual development of PCBs, general-specification production lines with higher yield requirements but lower precision requirements are becoming more common. If the existing PCB through hole deviation detection equipment were used to detect through hole deviation defects on such lines, production efficiency would be seriously affected; to avoid excessive waste of production cost, some production lines therefore still rely on manual spot inspection, which makes the detection random and the detection results lag behind production.
In view of this, the inventors propose an image acquisition device that acquires the image of the object to be detected with a linear array camera and, by providing equally spaced positioning holes at the edges of the object stage and using them as a reference, calibrates the image acquired by the linear array camera. This overcomes the drawback that previously prevented linear array cameras from being applied to defect detection, and improves detection accuracy while improving defect detection efficiency.
For the convenience of understanding the present embodiment, a detailed description will be first made of an image acquisition apparatus that performs the disclosure of the embodiments of the present application.
As shown in fig. 1, 2, and 3, the image acquisition apparatus 10 includes: stage 100, slide table 200, and line camera 300.
Wherein the stage 100 is disposed on the slide table 200 and configured to slide on the slide table 200 in a first direction; the line camera 300 is disposed at a first preset position of the upper ends of the slide table 200 and the stage 100, and is configured to acquire images of the stage 100 and the object to be measured on the stage 100.
The stage 100 herein includes a first edge disposed along a first direction and a second edge disposed along a second direction; the second direction is a direction orthogonal to the first direction on a plane on which the surface of the stage 100 is located; the first edge is provided with positioning holes at a first pitch, and the second edge is provided with positioning holes at a second pitch (as shown in fig. 3).
The stage 100 may be square, circular, trapezoidal, irregular polygonal, etc. The positioning holes of the stage 100 may be adaptively adjusted according to the actual shape of the stage 100, but it should be noted that the positioning holes are required to be provided in both the first direction and the second direction.
The first pitch and the second pitch may be equal or unequal, and their specific values may be adjusted according to the actual situation; this application does not specifically limit them.
The sliding table 200 may be a single sliding rail (as shown in fig. 1), may also be two sliding rails in cooperation (as shown in fig. 2), and may also be multiple sliding rails in cooperation, and the specific setting mode of the sliding table 200 may be adjusted according to actual situations, which is not specifically limited in this application.
The surface of the stage 100 that contacts the slide table 200 is provided with sliders that mate with the slide rails. There may be one, two or more sliders; their arrangement corresponds to the arrangement of the slide rails on the slide table 200, and their number and positions may be adjusted according to the actual situation, which is not specifically limited in this application.
It will be appreciated that there may be one or more line cameras 300. If there is only one line camera 300, a larger line camera 300 is required, so that the whole image of the stage 100 and of the object to be measured on the stage 100 can be acquired in each image acquisition.
The first preset position should be directly above the sliding path of the stage 100 on the slide table 200, to ensure that the line camera 300 can image the stage 100 and the object to be measured on the stage 100 vertically as the stage 100 slides under the line camera 300, reducing distortion or deformation of the images of the stage 100 and the object to be measured caused by an angle difference between the line camera 300 and the object to be measured.
In the implementation process, the linear array camera is arranged to acquire images of the object stage and of the object to be detected on the object stage, so that images can be acquired while the object stage is sliding; the object stage does not need to stop on the sliding table for the camera to acquire images, which improves the image acquisition efficiency. In addition, because positioning holes are formed in the object stage, the positions of the positioning holes in the image can be used as a reference for correcting the image, restoring the image that would have been obtained had the object stage slid at a constant speed; this reduces the image deviation caused by an uneven sliding speed of the object stage and further improves the accuracy of defect detection performed with the image.
In one possible implementation, the number of line cameras 300 is a plurality.
Here, the first center dividing line of the stage 100 in the second direction is parallel to the second center dividing line of the slide table 200 in the second direction; the first center dividing line divides the stage 100 into a first side of the stage 100 and a second side of the stage 100 in the second direction, and the second center dividing line divides the slide table 200 into a first side of the slide table 200 and a second side of the slide table 200 in the second direction.
The plurality of line cameras 300 are respectively arranged on the first side of the sliding table 200 and the second side of the sliding table 200 and are configured to respectively acquire the first side image of the object to be detected on the object stage 100 and the second side image of the object to be detected on the object stage 100; the plurality of line cameras 300 on the first side of the sliding table 200 and the second side of the sliding table 200 are arranged in a staggered manner in the second direction.
In actual image acquisition, due to the oversized stage 100, one line camera 300 cannot acquire all images of the stage 100 and the object under test on the stage 100 every time an image is acquired. At this time, a plurality of line cameras 300 may be disposed on the first side of the sliding table 200 and the second side of the sliding table 200, respectively, to disperse the image range collected by each line camera 300, so that all the images of the object to be measured on the object table 100 and the object table 100 can be collected.
In the actual image acquisition process, because of the shape and size of the line cameras 300, if two line cameras 300 were arranged side by side along the first side and the second side of the slide table 200, there could be a gap where the two cameras meet, and the stage 100 and the object to be measured on the stage 100 within that gap could not be imaged. The line cameras 300 on the first side and the second side of the slide table 200 are therefore arranged in a staggered manner, which prevents the two line cameras 300 from touching and producing a gap that would make the acquired images of the stage 100 and the object to be measured on the stage 100 incomplete.
In the implementation process, arranging a plurality of linear array cameras reduces the requirements on the structure and capability of each individual linear array camera. In addition, staggering the plurality of linear array cameras in the second direction prevents the gap that would arise if the housings of adjacent cameras touched, a gap in which the object stage and the object to be detected on the object stage could not be imaged; a more complete image of the object stage and of the object to be detected on it can therefore be acquired, improving the integrity of the acquired image.
In one possible implementation, the stage 100 is provided with a carrying groove 110.
The inner frame of the carrying groove 110 is attached to the outer frame of the object to be measured, and is configured to fix the position of the object to be measured on the stage 100.
The structure of the carrying groove 110 is matched with the structure of the object to be measured, and the object to be measured is placed inside the carrying groove 110 so as to be placed at a fixed position of the stage 100.
It will be appreciated that other limiting structures may be provided on the stage 100 to limit the subject to a specific location. For example, the limiting structure may also be a protrusion, a boss, or the like.
In the implementation process, an object carrying groove is formed whose inner frame fits against the outer frame of the object to be detected, so that the object to be detected is placed in the object carrying groove and its position on the object stage is constrained by the groove. This keeps the object to be detected at a fixed position on the object stage, reduces the influence of position changes of the object to be detected on the acquired image, and improves the accuracy of the acquired image of the object to be detected on the object stage.
In one possible implementation, the image acquisition apparatus 10 further includes: base 400 and sensor 500.
Wherein the sliding table 200 is disposed on the base 400; the sensor 500 is disposed at a second preset position of the base 400 and configured to detect positional information of the stage 100;
the second preset position here indicates the acquisition range of the line camera 300 in the first direction; the positional information is used to trigger the line camera 300 to start or end acquiring the image of the stage 100 and the object to be measured on the stage 100.
The sensor 500 may be a position sensor, a pressure sensor, a light sensor, or the like; the type of the sensor 500 may be selected according to the actual situation and is not specifically limited in this application. The sensor 500 is disposed on the sliding path of the stage 100, and its position is determined by the image acquisition position of the line camera 300.
For example, if there is one line camera 300, the second preset position may be set at the position where the first edge of the stage 100 near the line camera 300 is located at the moment the line camera 300 can just acquire the image near that first edge.
If there are two line cameras 300, their arrangement in the first direction is as follows: the first line camera 300 is disposed near the sliding start position of the stage 100, and the second line camera 300 is disposed near the sliding end position of the stage 100. The second preset position may be set at the position where the first edge of the stage 100 near the first line camera 300 is located just when the first line camera 300 can acquire an image near that first edge.
In some embodiments, the sensor 500 may also be disposed on the sliding table 200, and the disposition position of the sensor 500 on the sliding table 200 is similar to the disposition position of the sensor 500 on the base 400, which is not described herein.
As can be appreciated, when the first edge of the stage 100 near the line camera 300 reaches the position where the sensor 500 is disposed, the sensor 500 acquires the arrival information of the stage 100 and sends it to the controller of the line camera 300 to control the line camera 300 to start acquiring images of the stage 100 and the object to be measured on the stage 100. When the first edge of the stage 100 away from the line camera 300 leaves the position where the sensor 500 is disposed, the sensor 500 acquires the departure information of the stage 100 and sends it to the controller of the line camera 300 to control the line camera 300 to stop acquiring images of the stage 100 and the object to be measured on the stage 100.
In some embodiments, the image acquisition device 10 includes a plurality of line cameras 300; the plurality of line cameras 300 may share one sensor 500, or a plurality of sensors 500 may be provided, with each line camera 300 corresponding to one sensor 500.
Illustratively, the image acquisition device 10 has two line cameras 300, and two sensors 500 are also provided, each sensor 500 being connected to one of the line cameras 300. When the first edge of the stage 100 near the third line camera 300 reaches the position where the third sensor 500 is disposed, the third sensor 500 connected to the third line camera 300 acquires the arrival information of the stage 100 and sends it to the controller of the third line camera 300 to control the third line camera 300 to start acquiring images of the stage 100 and the object to be measured on the stage 100. When the first edge of the stage 100 away from the third line camera 300 leaves the position where the third sensor 500 is disposed, the third sensor 500 acquires the departure information of the stage 100 and sends it to the controller of the third line camera 300 to control the third line camera 300 to stop acquiring images of the stage 100 and the object to be measured on the stage 100.
Similarly, when the first edge of the stage 100 near the fourth line camera 300 reaches the position where the fourth sensor 500 is disposed, the fourth sensor 500 connected to the fourth line camera 300 acquires the arrival information of the stage 100 and sends it to the controller of the fourth line camera 300 to control the fourth line camera 300 to start acquiring images of the stage 100 and the object to be measured on the stage 100. When the first edge of the stage 100 away from the fourth line camera 300 leaves the position where the fourth sensor 500 is disposed, the fourth sensor 500 acquires the departure information of the stage 100 and sends it to the controller of the fourth line camera 300 to control the fourth line camera 300 to stop acquiring images of the stage 100 and the object to be measured on the stage 100.
Alternatively, when the image capturing apparatus 10 includes a plurality of line cameras 300, the plurality of line cameras 300 may be simultaneously activated or deactivated, or may be activated or deactivated at intervals. The on or off mode of the plurality of line cameras 300 may be set according to actual situations, and the present application is not particularly limited.
In the implementation process, a sensor is arranged to acquire the position information of the object stage and thereby control when the linear array camera starts or stops acquiring images, so that the starting and stopping of the linear array camera follow the position of the object stage: the camera starts acquiring when an image can actually be captured and stops after the object stage has left. This reduces the acquisition of irrelevant images, ensures that the linear array camera accurately captures the object to be detected, and reduces wear on the linear array camera.
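A minimal control-flow sketch of this triggering behaviour; sensor.stage_present(), camera.start_acquisition() and camera.stop_acquisition() are hypothetical interfaces used only for illustration, not a real driver API:

```python
import time

def acquisition_loop(sensor, camera, poll_interval_s=0.001):
    """Start line-scan acquisition when the sensor reports the stage entering
    the acquisition range, and stop it when the sensor reports the stage leaving."""
    acquiring = False
    while True:
        present = sensor.stage_present()
        if present and not acquiring:
            camera.start_acquisition()   # stage reached the second preset position
            acquiring = True
        elif not present and acquiring:
            camera.stop_acquisition()    # stage left the acquisition range
            acquiring = False
        time.sleep(poll_interval_s)
```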
In one possible implementation, the image acquisition apparatus 10 further includes: a linear array light source.
The line-scan light source is configured to illuminate the stage 100 and the object to be measured on the stage 100 when the line-scan camera 300 acquires images of the stage 100 and the object to be measured on the stage 100.
The linear array light source can be arranged in a matched manner with the linear array camera 300, or one linear array light source can correspond to a plurality of linear array cameras 300. The setting of the line-array light source can be adaptively adjusted according to the setting relationship between the line-array light source and the line-array camera 300.
For example, if each linear array light source is arranged to match one linear array camera 300, the linear array light source may be disposed at the position of its corresponding linear array camera 300.
If one line light source is provided corresponding to a plurality of line cameras 300, the line light source may be provided at a position of an intermediate line camera 300 among the plurality of line cameras 300.
It will be appreciated that the above is merely exemplary, and the linear array light source may not be provided with the linear array camera 300, and a separate linear array light source holder may be provided for placing the linear array light source, or the linear array light source may be provided at a position capable of irradiating the entire sliding path of the stage 100, or the like.
In the implementation process, arranging the linear array light source illuminates the object table and the object to be detected on the object table, so that the linear array camera can acquire clear images of the object table and of the object to be detected on it, improving the clarity of those images.
FIG. 4 is a schematic diagram of a defect detection system according to an embodiment of the present application. The defect detection system includes: the defect detecting device 20 and the image acquiring device 10 described above.
Wherein the defect detecting device 20 is connected with the image acquiring device 10; the defect detecting device 20 is used for acquiring the target image sent by the image acquiring device 10; the defect detection device 20 is further configured to calibrate the target image according to the positioning hole in the target image, so as to generate a calibrated target image; and matching the calibrated target image with the standard target image, and judging whether the object to be detected corresponding to the target image has a hole deviation defect or not.
The defect detection device 20 herein is communicatively coupled to one or more image acquisition devices 10 via a network for data communication or interaction. The defect detection device 20 may be a web server, database server, personal computer (personal computer, PC), tablet computer, smart phone, personal digital assistant (personal digital assistant, PDA), etc.
As shown in fig. 5, a schematic block diagram of the defect detection apparatus 20 is shown. Defect detection device 20 may include a memory 210, a processor 220, and a peripheral interface 230. It will be appreciated by those of ordinary skill in the art that the configuration shown in fig. 5 is merely illustrative and is not intended to limit the configuration of defect detection device 20. For example, the defect detection apparatus 20 may also include more or fewer components than shown in fig. 5, or have a different configuration than shown in fig. 5.
The above-mentioned memory 210, processor 220 and peripheral interface 230 are electrically connected directly or indirectly to each other to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The processor 220 is configured to execute executable modules stored in the memory.
The Memory 210 may be, but is not limited to, a random access Memory (Random Access Memory, RAM), a Read Only Memory (ROM), a programmable Read Only Memory (Programmable Read-Only Memory, PROM), an erasable Read Only Memory (Erasable Programmable Read-Only Memory, EPROM), an electrically erasable Read Only Memory (Electric Erasable Programmable Read-Only Memory, EEPROM), etc. The memory 210 is configured to store a program, and the processor 220 executes the program after receiving an execution instruction, and the method executed by the defect detection device 20 defined by the process disclosed in any embodiment of the present application may be applied to the processor 220 or implemented by the processor 220.
The processor 220 may be an integrated circuit chip having signal processing capabilities. The processor 220 may be a general-purpose processor, including a central processing unit (Central Processing Unit, CPU), a network processor (Network Processor, NP), etc.; but also digital signal processors (digital signal processor, DSP for short), application specific integrated circuits (Application Specific Integrated Circuit, ASIC for short), field Programmable Gate Arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components. The disclosed methods, steps, and logic blocks in the embodiments of the present application may be implemented or performed. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The above-described peripheral interface 230 couples various input/output devices to the processor 220 and the memory 210. In some embodiments, the peripheral interface 230, the processor 220, and the memory controller 112 may be implemented in a single chip. In other examples, they may be implemented by separate chips.
The target image may be an image of the stage 100 and the object to be measured on the stage 100 acquired by the image acquisition device 10, or such an image after processing by the image acquisition device 10. Since the image obtained by the line camera 300 is typically a plurality of partial images of the stage 100 and the object to be measured on the stage 100, the line camera 300 can stitch the plurality of partial images into a whole image, and the image acquisition device 10 sends the stitched image to the defect detection device 20.
The standard image is an image of the object to be tested without defect, and may be stored in the memory 210, and when the processor 220 needs to match the calibrated target image with the standard image, the standard image is retrieved from the memory 210.
In the implementation process, by providing the defect detection device and calibrating the image acquired by the image acquisition device with the positioning holes, the image of the object table and of the object to be detected on the object table that would have been acquired had the object table slid at a uniform speed is restored, the influence of external factors on the detection result is reduced, and the accuracy of defect detection is improved.
In one possible implementation, in the process of generating the calibrated target image, the defect detection device 20 is specifically configured to: determining a plurality of target positioning holes in the target image, and correcting the whole image of the target image according to the actual coordinate relation of the plurality of target positioning holes; and calibrating the local image in the target image through the target image after the whole image is corrected so as to generate a calibrated target image.
The target positioning hole comprises a positioning hole of a second edge in the target image. The target positioning hole may be determined when the defect detecting device 20 detects a defect for the first time, or may be determined each time the defect detecting device 20 detects a defect. The target positioning holes may be the same or different each time defect detection is performed, and the application is not particularly limited.
Generally, the target image includes two first edges and two second edges, so that at least three target positioning holes are usually required to be selected to calibrate the whole target image, wherein two target positioning holes are used for determining the coordinates of the image in the first direction, and two target positioning holes are used for determining the coordinates of the target image in the second direction.
Illustratively, as shown in fig. 6, three target positioning holes are selected in the target image, namely target positioning hole A, target positioning hole B and target positioning hole C. The line connecting target positioning hole A and target positioning hole B lies in the first direction, and the line connecting target positioning hole A and target positioning hole C lies in the second direction. Because the positioning holes in the second direction are not shifted in the target image by the sliding speed of the stage 100 while the stage 100 slides, the actual coordinates of target positioning holes A, B and C can be taken as their coordinates in the standard image; the target image can then be mapped into the actual coordinate system according to the actual coordinates of target positioning holes A, B and C, and the whole target image is thereby corrected.
In some cases, if there is a certain angle difference between the line camera 300 and the stage 100, the acquired target image may be somewhat deformed. In that case, selecting 4 target positioning holes to correct the target image yields a more accurate and stable correction. As shown in fig. 7, four target positioning holes are selected in the target image, namely target positioning hole E, target positioning hole F, target positioning hole G and target positioning hole H. The line connecting E and F and the line connecting G and H lie in the first direction, the line connecting E and H and the line connecting F and G lie in the second direction, and E, F, G and H are the four vertices of a quadrilateral. Because the positioning holes in the second direction are not shifted in the image by the sliding speed of the stage 100 while the stage 100 slides, the actual coordinates of E, F, G and H can be taken as their coordinates in the standard image; the target image can then be mapped into the actual coordinate system according to the actual coordinates of E, F, G and H, and the whole target image is thereby corrected.
It will be appreciated that the above selection of the target positioning hole is merely exemplary, and the selection of the target positioning hole may be adjusted according to practical situations, which is not particularly limited in this application.
In some embodiments, to reduce the amount of computation and simplify the establishment of the actual coordinate system of the target image, the target positioning holes may select the positioning holes of the plurality of second edges.
In the implementation process, the coordinate system where the target image is located is mapped according to the actual coordinates of the target positioning holes in the target image, so that the whole target image is corrected. Since the target positioning hole is selected as the positioning hole in the second direction that is not affected by the stage sliding speed, the actual coordinates of the target positioning hole can be regarded as the coordinates of the target positioning hole in the standard image, and thus the target image coordinate system mapped by taking the actual coordinates of the target positioning hole as the reference coordinates is the standard coordinate system of the target image. The whole target image is corrected through the coordinate system, so that the influence of defects such as whole deviation and distortion of the target image on a detection result is avoided, and the accuracy of defect detection is improved.
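A minimal sketch of such a whole-image correction using OpenCV, assuming the centers of four target positioning holes (E, F, G, H in fig. 7) have already been detected in the raw image and their standard coordinates are known; the function and variable names are illustrative:

```python
import cv2
import numpy as np

def correct_whole_image(target_img, holes_px, holes_std):
    """Warp the raw target image so that the four target positioning holes
    land on their standard coordinates.

    holes_px  -- 4x2 array of detected hole centers in the raw image (pixels)
    holes_std -- 4x2 array of the same holes' coordinates in the standard image
    """
    src = np.asarray(holes_px, dtype=np.float32)
    dst = np.asarray(holes_std, dtype=np.float32)
    # Four point pairs define a perspective (projective) transform; with only
    # three target holes, cv2.getAffineTransform / cv2.warpAffine would be used.
    matrix = cv2.getPerspectiveTransform(src, dst)
    height, width = target_img.shape[:2]
    return cv2.warpPerspective(target_img, matrix, (width, height))
```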
In one possible implementation, in the process of generating the calibrated target image, the defect detection device 20 is specifically configured to: determine the center points of the positioning holes of the target image in the first direction after the whole image has been corrected, and draw through each center point a straight line parallel to the second direction; determine the partial image between each pair of adjacent positioning holes from the straight lines of those two positioning holes; and stretch or shrink each partial image according to the actual positions of the positioning holes until all partial images in the corrected target image have been calibrated, so as to generate the calibrated target image.
The center point of a positioning hole is the center of that positioning hole. Since the positioning holes in the second direction are not affected by the sliding speed of the stage 100, a positioning hole lying in both the first direction and the second direction (e.g., the target point C in fig. 6) may serve as the reference positioning hole for the positioning holes in the first direction, so that the partial images between adjacent positioning holes can be determined respectively.
In actual operation, straight lines parallel to the second direction are drawn through the center points of the positioning holes in the first direction, and the image between the straight lines of two adjacent positioning holes is a partial image. Since the first interval between two adjacent positioning holes on the stage 100 is fixed, the distance between two adjacent straight lines in the target image should also be the first interval. Whether the stage 100 slid at a constant speed during the time period in which the line camera 300 collected the adjacent positioning holes can therefore be determined by judging whether the distance between the two adjacent straight lines in the target image equals the first interval. If the distance between the two adjacent straight lines in the target image equals the first interval, it is determined that the stage 100 slid at a constant speed during that time period, and the partial image between the adjacent positioning holes does not need to be processed. If the distance between the two adjacent straight lines in the target image does not equal the first interval, it is determined that the stage 100 did not slide at a constant speed during that time period, and the partial image between the adjacent positioning holes needs to be processed according to the distance between the adjacent positioning holes.
For example, as shown in fig. 8, if the distance between the positioning hole S1 and the positioning hole S2 is smaller than the first interval, it is determined that the stage 100 did not slide at a constant speed during the time period in which the line camera 300 collected the positioning hole S1 and the positioning hole S2. The partial image between the positioning hole S1 and the positioning hole S2 is then stretched according to the difference between the distance between the positioning hole S1 and the positioning hole S2 and the first interval.
After the partial image between the positioning hole S1 and the positioning hole S2 is stretched, the distance between the positioning hole S2 and the positioning hole S3 is further compared with the first interval. If the distance between the positioning hole S2 and the positioning hole S3 is greater than the first interval, it is determined that the stage 100 did not slide at a constant speed during the time period in which the line camera 300 collected the positioning hole S2 and the positioning hole S3. The partial image between the positioning hole S2 and the positioning hole S3 is then shrunk according to the difference between the distance between the positioning hole S2 and the positioning hole S3 and the first interval.
After the partial image between the positioning hole S2 and the positioning hole S3 is shrunk, the distance between the positioning hole S3 and the positioning hole S4 is further compared with the first interval. If the distance between the positioning hole S3 and the positioning hole S4 is equal to the first interval, it is determined that the stage 100 slid at a constant speed during the time period in which the line camera 300 collected the positioning hole S3 and the positioning hole S4. The partial image between the positioning hole S3 and the positioning hole S4 therefore does not need to be processed, and the relationship between the distance between the positioning hole S4 and the positioning hole S5 and the first interval is determined next, until all the partial images in the target image after the whole image is corrected are calibrated.
In the above implementation process, based on the characteristic that the interval between the positioning holes is constant, the relationship between the distance between positioning holes in the target image and the actual interval is used to determine whether the image between two adjacent positioning holes has been stretched or compressed under the influence of the sliding speed of the stage. The compressed or stretched partial image is then correspondingly stretched or compressed according to the difference between the distance in the target image and the actual interval. In this way each partial image of the target image is calibrated, the influence of the stage speed on the detection result is avoided, and the accuracy of defect detection is improved.
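The strip-by-strip stretching and shrinking described above can be sketched as follows. This Python/OpenCV fragment is a hypothetical illustration rather than the implementation of the present application; it assumes the row coordinates of the straight lines drawn through the positioning-hole centers have already been detected, and the variable names and the pixel value of the first interval are placeholders.

import cv2
import numpy as np

def calibrate_partial_images(corrected_img, hole_line_rows, first_interval_px):
    """Stretch or shrink each partial image between adjacent hole lines.

    hole_line_rows: detected row coordinates (along the first direction) of
                    the straight lines through the hole centers, in order.
    first_interval_px: the known first interval, expressed in pixels.
    """
    width = corrected_img.shape[1]
    strips = []
    for r0, r1 in zip(hole_line_rows[:-1], hole_line_rows[1:]):
        strip = corrected_img[r0:r1, :]
        if (r1 - r0) != first_interval_px:
            # The stage did not slide at a constant speed while this strip
            # was acquired: resize along the first direction only, so the
            # strip spans exactly one first interval.
            strip = cv2.resize(strip, (width, first_interval_px),
                               interpolation=cv2.INTER_LINEAR)
        strips.append(strip)
    # Reassemble the calibrated strips into the calibrated target image
    # (regions outside the first and last hole lines are omitted here).
    return np.vstack(strips)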
In one possible implementation manner, in the process of determining whether the object to be detected corresponding to the target image has a hole deviation defect, the defect detecting device 20 is specifically configured to: respectively matching the coordinate positions of the through holes of the object to be detected in the calibrated target image with the coordinate positions of the through holes of the object to be detected in the standard image, and sequentially judging, according to the matching results, whether the through holes of the object to be detected have hole deviation defects.
It will be appreciated that after the whole target image and the partial images in the target image have all been calibrated, the target image is restored to what it would be if the stage 100 had slid at a uniform speed. Therefore, the coordinate positions of all the through holes of the object to be detected in the calibrated target image can be obtained and matched with the coordinate positions of the corresponding through holes in the standard image. If the matching is consistent, the through hole has no hole deviation defect. If the matching is inconsistent, the through hole has a hole deviation defect.
Usually, the object to be detected is provided with a plurality of through holes. All the through holes in the object to be detected can be matched at the same time, or can be matched in sequence according to the arrangement of the through holes in the target image.
If all the through holes in the object to be detected are matched in sequence according to the arrangement of the through holes in the target image, matching can be stopped as soon as some of the through holes are determined to have hole deviation defects, and the object to be detected containing those through holes is directly judged to have a hole deviation defect. The partial through holes may be one through hole or a plurality of through holes specified according to the quality requirement, and the number and the setting rule of the partial through holes can be adjusted according to the actual situation, which is not particularly limited in this application.
In the above implementation process, the coordinate positions of the through holes in the calibrated target image are respectively matched with the coordinate positions of the through holes in the standard image, so that it can be determined whether each through hole has a hole deviation defect and, further, whether the object to be detected has a hole deviation defect. Whether the through holes in the object to be detected have hole deviation defects can thus be checked accurately, and the quality of the object to be detected is ensured.
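A minimal sketch of this coordinate matching step is given below; the pixel tolerance, the data layout and the function name are assumptions made for illustration and are not specified in this application.

import numpy as np

def has_hole_deviation(calibrated_holes, standard_holes, tol_px=3.0):
    """Return True if any through hole deviates from its standard position.

    calibrated_holes / standard_holes: (x, y) centers of the through holes
    in the calibrated target image and in the standard image, in the same
    order. tol_px is a hypothetical matching tolerance in pixels.
    """
    calibrated = np.asarray(calibrated_holes, dtype=float)
    standard = np.asarray(standard_holes, dtype=float)
    # Match each through hole in turn and stop at the first hole whose
    # position deviates from the standard image beyond the tolerance.
    for (cx, cy), (sx, sy) in zip(calibrated, standard):
        if np.hypot(cx - sx, cy - sy) > tol_px:
            return True   # hole deviation defect found
    return False          # every through hole matches the standard image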
In one possible implementation, in the process of matching the calibrated target image with the standard image, the defect detection device 20 is further configured to: stitching the first side target image and the second side target image along the second direction to form a complete image of the stage 100 and the object to be detected on the stage 100.
In some embodiments, if there are a plurality of line cameras 300, and the line cameras 300 are respectively disposed on the first side of the sliding table 200 and the second side of the sliding table 200 and are configured to respectively acquire a first side image and a second side image of the stage 100 and the object to be detected on the stage 100, the target image acquired by the defect detecting device 20 includes a first side target image and a second side target image.
Therefore, in order that the calibrated target image better restores the image of the stage 100 and the object to be detected on the stage 100 as it would appear under uniform sliding, the first side target image and the second side target image may first be stitched along the second direction to form a complete image of the stage 100 and the object to be detected on the stage 100 under uniform sliding. The coordinate positions of the through holes in this complete image are then matched with the coordinate positions of the through holes in the standard image to determine whether the object to be detected has a hole deviation defect.
In the above implementation process, the first side target image and the second side target image are stitched along the second direction, so that the complete image of the stage and the object to be detected on the stage is restored more accurately, the coordinate positions of the through holes in the image better fit their actual coordinate positions, and the detection accuracy is improved.
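The stitching of the two side images can be illustrated as follows, under the assumption that both side images have already been calibrated to the same scale and simply abut along the second direction; the axis convention and the names are hypothetical.

import numpy as np

def stitch_side_images(first_side_img, second_side_img):
    """Concatenate the two side images along the second direction.

    Assumes image rows correspond to the first direction (the sliding
    direction) and columns to the second direction, and that both side
    images have the same number of rows after calibration.
    """
    if first_side_img.shape[0] != second_side_img.shape[0]:
        raise ValueError("side images must be calibrated to the same height")
    # Joining along axis 1 places the two halves side by side in the
    # second direction, yielding the complete image of the stage and the
    # object to be detected.
    return np.concatenate([first_side_img, second_side_img], axis=1)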
The defect detection apparatus 20 in the present embodiment may be used to perform each step in each method provided in the embodiments of the present application. The implementation of the defect detection method is described in detail below by means of several embodiments.
Referring to fig. 9, a flowchart of a defect detection method according to an embodiment of the present application is shown. The specific flow shown in fig. 9 will be described in detail.
Step S201, acquiring the target image processed by the image acquisition device.
Step S202, calibrating the target image according to the positioning hole in the target image, so as to generate a calibrated target image.
Step S203, matching the calibrated target image with the standard target image, and determining whether the object to be detected corresponding to the target image has a hole deviation defect.
In one possible implementation, step S202 includes: determining a plurality of target positioning holes in the target image, the target positioning holes comprising the positioning holes of the first edge and the positioning holes of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image after the whole image is corrected so as to generate the calibrated target image.
In one possible implementation, calibrating a local image in the target image through the target image after the whole image is corrected to generate the calibrated target image includes: determining the center points of the positioning holes of the target image in the first direction after the whole image is corrected, and drawing straight lines parallel to the second direction through the center points respectively; determining the partial images between two adjacent positioning holes respectively through the straight lines of the two adjacent positioning holes; and stretching or shrinking the partial images respectively according to the actual positions of the positioning holes until the partial images in the target image after the whole image is corrected are calibrated, so as to generate the calibrated target image.
In one possible implementation, step S203 includes: respectively matching the coordinate positions of the through holes of the object to be detected in the calibrated target image with the coordinate positions of the through holes of the object to be detected in the standard image, and sequentially judging, according to the matching results, whether the through holes of the object to be detected have hole deviation defects.
In one possible implementation, step S203 further includes: splicing the first side target image and the second side target image along the second direction to form a complete image of the objective table and the object to be detected on the objective table.
Based on the same inventive concept, the embodiments of the present application further provide a defect detection device corresponding to the defect detection method. Since the principle by which the device in the embodiments of the present application solves the problem is similar to that of the foregoing defect detection method embodiments, the implementation of the device may refer to the description in the foregoing method embodiments, and repeated descriptions are omitted.
Fig. 10 is a schematic functional block diagram of a defect detecting device according to an embodiment of the present application. The respective modules in the defect detecting apparatus in the present embodiment are used to perform the respective steps in the above-described method embodiments. The defect detection device comprises an acquisition module 221, a calibration module 222 and a judgment module 223;
wherein,
the acquiring module 221 is configured to acquire the target image processed by the image acquiring device.
The calibration module 222 is configured to calibrate the target image according to the positioning hole in the target image, so as to generate a calibrated target image.
The judging module 223 is configured to match the calibrated target image with a standard target image, and judge whether the object to be detected corresponding to the target image has a hole deviation defect.
In a possible implementation, the calibration module 222 is further configured to: determining a plurality of target positioning holes in the target image, the target positioning holes comprising the positioning holes of the first edge and the positioning holes of the second edge in the target image; correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes; and calibrating a local image in the target image through the target image after the whole image is corrected so as to generate the calibrated target image.
In a possible implementation, the calibration module 222 is specifically configured to: determining the center points of the positioning holes of the target image in the motion direction of the object stage after the whole image is corrected, and drawing straight lines perpendicular to the motion direction through the center points respectively; determining the partial images between two adjacent positioning holes respectively through the straight lines of the two adjacent positioning holes; and stretching or shrinking the partial images respectively according to the actual positions of the positioning holes until the partial images in the target image after the whole image is corrected are calibrated, so as to generate the calibrated target image.
In a possible implementation manner, the judging module 223 is further configured to: respectively matching the coordinate positions of the through holes of the object to be detected in the calibrated target image with the coordinate positions of the through holes of the object to be detected in the standard image, and sequentially judging, according to the matching results, whether the through holes of the object to be detected have hole deviation defects.
In a possible implementation manner, the judging module 223 is further configured to: splicing the first side target image and the second side target image along the second direction to form a complete image of the objective table and the object to be detected on the objective table.
Furthermore, the present application provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor performs the steps of the defect detection method described in the above method embodiments.
The computer program product of the defect detection method provided in the embodiments of the present application includes a computer readable storage medium storing program code, where the program code includes instructions for executing the steps of the defect detection method described in the above method embodiments, and the detailed description thereof will be omitted herein.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other manners as well. The apparatus embodiments described above are merely illustrative, for example, flow diagrams and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, the functional modules in the embodiments of the present application may be integrated together to form a single part, or each module may exist alone, or two or more modules may be integrated to form a single part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present application may be embodied essentially or in a part contributing to the prior art or in a part of the technical solution, in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other media capable of storing program code.

It is noted that relational terms such as first and second, and the like are used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article or apparatus that comprises the element.
The foregoing description is only of the preferred embodiments of the present application and is not intended to limit the same, but rather, various modifications and variations may be made by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principles of the present application should be included in the protection scope of the present application. It should be noted that: like reference numerals and letters denote like items in the following figures, and thus once an item is defined in one figure, no further definition or explanation thereof is necessary in the following figures.
The foregoing is merely specific embodiments of the present application, but the scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes and substitutions are intended to be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. An image acquisition apparatus, comprising: objective table, slip table and linear array camera;
the objective table is arranged on the sliding table and is configured to slide on the sliding table along a first direction;
The linear array camera is arranged at a first preset position at the upper ends of the sliding table and the objective table, and is configured to acquire images of the objective table and an object to be detected on the objective table;
the stage includes a first edge disposed along the first direction and a second edge disposed along a second direction; wherein the second direction is a direction orthogonal to the first direction on a plane on which a surface of the stage is located;
the first edge is provided with positioning holes with a first interval, and the second edge is provided with positioning holes with a second interval;
the image acquisition device is configured to acquire a target image, wherein the target image is an image of the objective table and the object to be detected on the objective table; the target image comprises a plurality of positioning holes, and the positioning holes are used for calibrating the target image to generate a calibrated target image; the calibrated target image is used for matching with a standard image and judging whether an object to be detected corresponding to the target image has a hole deviation defect or not; the standard image is an image when the object to be detected has no defect.
2. The apparatus of claim 1, wherein the line camera is a plurality of;
A first center dividing line of the objective table in the second direction is parallel to a second center dividing line of the sliding table in the second direction;
the first center dividing line divides the objective table into an objective table first side and an objective table second side along the second direction, and the second center dividing line divides the sliding table into a sliding table first side and a sliding table second side along the second direction;
the linear array cameras are respectively arranged on the first side of the sliding table and the second side of the sliding table, and are configured to respectively acquire a first side image of the objective table and the object to be detected on the objective table and a second side image of the objective table and the object to be detected on the objective table;
the linear array cameras on the first side of the sliding table and the second side of the sliding table are arranged in a staggered mode in the second direction.
3. The device according to claim 1 or 2, wherein the objective table is provided with an object carrying groove;
the inner frame of the object carrying groove is attached to the outer frame of the object to be detected, and the inner frame is configured to fix the position of the object to be detected on the objective table.
4. The apparatus of claim 1, wherein the image acquisition apparatus further comprises: a base and a sensor;
The sliding table is arranged on the base;
the sensor is arranged at a second preset position of the base and is configured to detect position information of the objective table;
the second preset position indicates the acquisition range of the linear array camera in the first direction;
the position information is used for triggering the linear array camera to start or end acquiring the images of the objective table and the object to be detected on the objective table.
5. The apparatus of claim 4, wherein the image acquisition apparatus further comprises: a linear array light source;
the linear array light source is configured to illuminate the objective table and the object to be detected on the objective table when the linear array camera acquires the images of the objective table and the object to be detected on the objective table.
6. A defect detection system, comprising: defect detection apparatus and image acquisition apparatus according to any one of claims 1 to 5;
the defect detection device is connected with the image acquisition device;
the defect detection device is used for acquiring the target image sent by the image acquisition device;
the defect detection device is further used for calibrating the target image according to the positioning holes in the target image so as to generate a calibrated target image; and matching the calibrated target image with the standard image, and judging whether the object to be detected corresponding to the target image has a hole deviation defect or not.
7. The system according to claim 6, wherein in generating the calibrated target image, the defect detection device is specifically configured to:
determining a plurality of target positioning holes in the target image, the target positioning holes comprising the positioning holes of the second edge in the target image;
correcting the whole image of the target image according to the actual coordinate relation of the target positioning holes;
and calibrating a local image in the target image through the target image after the whole image is corrected so as to generate the calibrated target image.
8. The system of claim 7, wherein in generating the calibrated target image, the defect detection device is specifically configured to:
determining the center points of the positioning holes of the target image in the first direction after the whole image is corrected, and drawing straight lines parallel to the second direction through the center points respectively;
respectively determining partial images between two adjacent positioning holes through the straight lines between the two adjacent positioning holes;
and respectively stretching or shrinking the partial images according to the actual positions of the positioning holes until the partial images in the target images after the whole image is corrected are calibrated, so as to generate calibrated target images.
9. The system according to claim 7, wherein in determining whether the object to be detected corresponding to the target image has a hole deviation defect, the defect detecting device is specifically configured to:
respectively matching the coordinate positions of the through holes of the object to be detected in the calibrated target image with the coordinate positions of the through holes of the object to be detected in the standard image, and sequentially judging, according to the matching results, whether the through holes of the object to be detected have hole deviation defects.
10. The system of claim 6, wherein a first center dividing line of the objective table in the second direction is parallel to a second center dividing line of the sliding table in the second direction; the first center dividing line divides the objective table into an objective table first side and an objective table second side along the second direction, the second center dividing line divides the sliding table into a sliding table first side and a sliding table second side along the second direction, and the target image comprises a first side target image and a second side target image;
in the process of matching the calibrated target image with the standard image, the defect detection device is further used for: splicing the first side target image and the second side target image along the second direction to form a complete image of the objective table and the object to be detected on the objective table.
CN202211609606.6A 2022-12-14 2022-12-14 Image acquisition device and defect detection system Active CN115955614B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211609606.6A CN115955614B (en) 2022-12-14 2022-12-14 Image acquisition device and defect detection system

Publications (2)

Publication Number Publication Date
CN115955614A CN115955614A (en) 2023-04-11
CN115955614B true CN115955614B (en) 2024-01-26

Family

ID=87288779

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211609606.6A Active CN115955614B (en) 2022-12-14 2022-12-14 Image acquisition device and defect detection system

Country Status (1)

Country Link
CN (1) CN115955614B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105510348A (en) * 2015-12-31 2016-04-20 南京协辰电子科技有限公司 Flaw detection method and device of printed circuit board and detection equipment
CN105699399A (en) * 2016-03-11 2016-06-22 河北工业大学 Equipment and method for detecting quality of SMT (surface-mount technology) stencil
CN109900723A (en) * 2019-04-26 2019-06-18 李配灯 Glass surface defects detection method and device
CN112816501A (en) * 2021-01-05 2021-05-18 中钞印制技术研究院有限公司 Bill quality detection device, evaluation device and bill quality detection method
CN113467203A (en) * 2021-06-10 2021-10-01 东莞市多普光电设备有限公司 Method for aligning platform by using camera, aligning device and direct imaging photoetching equipment
CN113532316A (en) * 2021-07-05 2021-10-22 深圳市先地图像科技有限公司 Device and method capable of simultaneously detecting shape and position deviations of multiple PCBs
CN215868286U (en) * 2021-09-16 2022-02-18 华南理工大学 Machine vision teaching experiment platform of linear array scanning type
CN216159821U (en) * 2021-01-19 2022-04-01 深圳市全洲自动化设备有限公司 Detection apparatus for realize that adjacent region shoots synchronous
CN114354629A (en) * 2022-01-07 2022-04-15 苏州维嘉科技股份有限公司 Detection equipment
CN114688998A (en) * 2020-12-31 2022-07-01 深圳中科飞测科技股份有限公司 Method and device for adjusting flatness of slide glass table
CN114720376A (en) * 2022-03-07 2022-07-08 武汉海微科技有限公司 Image acquisition device and method for detecting screen defects
CN115334227A (en) * 2022-10-18 2022-11-11 菲特(天津)检测技术有限公司 Gear image acquisition device and method, gear image acquisition method and electronic equipment
CN115420746A (en) * 2022-09-01 2022-12-02 深圳市源川科技有限公司 Quality detection method, quality detection device and quality detection equipment for printed parts

Also Published As

Publication number Publication date
CN115955614A (en) 2023-04-11

Similar Documents

Publication Publication Date Title
TWI440847B (en) Inspection method
US20140210993A1 (en) Automatic programming of solder paste inspection system
US20120327214A1 (en) System and method for image calibration
TW502111B (en) Inspection method for foreign matters inside through hole
KR20170065499A (en) Electrical test system with vision-guided alignment
TW571081B (en) Method and apparatus for examining foreign matters in through holes
JP2008185514A (en) Substrate visual inspection apparatus
KR20120054689A (en) Inspection method
KR101132779B1 (en) Inspection method
JP2012108130A (en) Board inspection method
US9743527B2 (en) Stencil programming and inspection using solder paste inspection system
KR101545186B1 (en) method of correction of defect location using predetermined wafer image targets
CN115955614B (en) Image acquisition device and defect detection system
CN104034259A (en) Method for correcting image measurement instrument
CN103297799A (en) Testing an optical characteristic of a camera component
CN109425327A (en) Inspection system and the inspection modification method of image
CN110823897A (en) Sample plate
CN114581431A (en) Flexible circuit board reinforcing sheet position detection method and image acquisition system
CN207832425U (en) Lens detecting device
CN111077155A (en) Display panel inspection system and display panel inspection method
KR101657949B1 (en) Inspection method
JP4757595B2 (en) Mounting component inspection equipment
JP4261535B2 (en) Alignment method and evaluation method in mask inspection apparatus
JPH0160766B2 (en)
KR20150076544A (en) Method of reviewing defect of substrate

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant