CN109539978B - Image detection system, image detection device, and image detection method - Google Patents


Info

Publication number
CN109539978B
Authority
CN
China
Prior art keywords
image
light sources
piece
module
detected
Legal status
Active
Application number
CN201710867556.4A
Other languages
Chinese (zh)
Other versions
CN109539978A (en)
Inventor
朱志浩
周扬
陈阁
Current Assignee
Shenji Shanghai Intelligent System R&d Design Co ltd
Original Assignee
Shenji Shanghai Intelligent System R&d Design Co ltd
Application filed by Shenji Shanghai Intelligent System R&d Design Co ltd
Priority to CN201710867556.4A
Publication of CN109539978A
Application granted
Publication of CN109539978B


Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 - Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Abstract

The invention provides an image detection system, an image detection device and an image detection method suitable for detecting a piece to be detected. The image detection system comprises: an illumination module adapted to illuminate the piece to be detected, the illumination module comprising a plurality of light sources whose emitted light is obliquely incident on the piece to be detected from different directions; an image acquisition module arranged above the piece to be detected and adapted to acquire images of the piece to be detected; an acquisition control module connected with the illumination module and the image acquisition module, adapted to control the plurality of light sources to be turned on and off respectively and to control the image acquisition module to acquire an image of the piece to be detected whenever a light source is turned on, so as to obtain a plurality of images of the piece to be detected under the illumination of different light sources; and an image processing module connected with the image acquisition module, adapted to synthesize the plurality of images into a composite image and to extract edge features from the composite image so as to detect the piece to be detected. The invention ensures that the piece to be detected has complete edge features in the composite image, thereby achieving higher detection accuracy.

Description

Image detection system, image detection device, and image detection method
Technical Field
The present invention relates to the field of computer vision measurement, and in particular, to an image detection system, an image detection apparatus, and an image detection method.
Background
With the continuous development of technology, geometric measurement has progressed from simple planar and regular geometries, characterized by dimensions and positions, to complex two-dimensional and three-dimensional bodies, and the range of dimensions involved has also changed greatly. Traditional contact measurement can no longer meet these requirements, so various non-contact measurement methods have emerged. As an efficient and advanced non-contact measurement means, computer vision measurement has become more and more widely used in various fields.
Computer vision measurement is a measurement technology that is based on computer vision and integrates electronic technology, computer technology, close-range photogrammetry and image processing; it calculates the geometric information of a piece to be measured in three-dimensional space from image information in order to perform measurement.
However, the quality of images obtained by computer vision measurement is currently in need of improvement.
Disclosure of Invention
The invention provides an image detection system, an image detection device and an image detection method, which can improve the quality of images obtained by computer vision measurement, thereby improving the detection precision of the computer vision measurement.
In order to solve the above problem, the present invention provides an image detection system, which is suitable for detecting a workpiece to be detected, the image detection system comprising: the illumination module is suitable for illuminating the piece to be tested and comprises a plurality of light sources, and light emitted by the light sources is obliquely incident to the piece to be tested from different directions; the image acquisition module is arranged above the piece to be detected and is suitable for acquiring images of the piece to be detected; the acquisition control module is connected with the illumination module and the image acquisition module, is suitable for controlling the plurality of light sources to be respectively turned on and off, and controls the image acquisition module to acquire images of the piece to be detected when the light sources are turned on so as to obtain a plurality of images of the piece to be detected under the illumination of different light sources; and the image processing module is connected with the image acquisition module, is suitable for synthesizing the plurality of images to obtain a synthesized image, and is also suitable for extracting edge characteristics of the synthesized image to realize the detection of the piece to be detected.
Correspondingly, the invention also provides an image detection device, comprising: the image detection system of the present invention; and the bracket is suitable for fixing the lighting module and the image acquisition module.
In addition, the invention also provides an image detection method, which is suitable for detecting the piece to be detected, and the image detection method comprises the following steps: arranging an illumination module to illuminate the piece to be detected, wherein the illumination module comprises a plurality of light sources, and light emitted by the light sources is obliquely incident to the piece to be detected from different directions; controlling the plurality of light sources to be respectively turned on and off, and acquiring images of the piece to be detected when the light sources illuminate the piece to be detected, so as to obtain a plurality of images of the piece to be detected under illumination of different light sources; and synthesizing the plurality of images to obtain a synthesized image, and extracting edge features of the synthesized image to realize the detection of the piece to be detected.
Compared with the prior art, the technical scheme of the invention has the following advantages:
The invention provides an image detection system suitable for detecting a piece to be detected. By having a plurality of light sources obliquely incident on the piece to be detected from different directions, acquiring an image under each light source, and synthesizing the acquired images, clear edge features of the piece to be detected can be presented under the illumination of the different light sources, so that the piece to be detected has more complete edge features in the composite image. This improves the quality of the composite image obtained by computer vision measurement, and higher detection accuracy is therefore obtained when the composite image is used for edge feature extraction and detection.
Drawings
FIG. 1 is a schematic diagram of an image detection method;
FIG. 2 is a schematic illustration of an image obtained using the detection method described in FIG. 1;
FIG. 3 is a functional block diagram of an embodiment of an image detection system of the present invention;
FIG. 4 is a schematic diagram of the image inspection system of FIG. 3;
FIG. 5 is a schematic diagram of an embodiment of an illumination module in the image inspection system of FIG. 3;
FIGS. 6 to 9 are images of the object to be tested obtained by the illumination module shown in FIG. 5 under different light sources;
FIG. 10 is a composite image formed from the images shown in FIGS. 6-9;
FIG. 11 is a schematic diagram of another embodiment of an illumination module in the image inspection system of FIG. 3;
FIGS. 12 and 13 are images of a test object obtained by the illumination module of FIG. 11 under different illumination conditions;
FIG. 14 is a composite image formed from the images shown in FIGS. 12 and 13;
FIG. 15 is a flowchart illustrating an image detection method according to an embodiment of the present invention.
Detailed Description
As is known in the art, the quality of images obtained by computer vision measurement needs to be improved.
The reason why the image quality needs to be improved is analyzed below in connection with an image detection method. Referring to fig. 1 and fig. 2 in combination, fig. 1 is a schematic diagram of an image detection method, and fig. 2 schematically shows an image obtained by the detection method of fig. 1. The image detection method comprises the following steps: providing a piece to be tested 20; providing a light source 10; illuminating the piece to be tested 20 with the light source 10 located above it, and taking one shot of the piece to be tested 20 after the light source 10 illuminates it, thereby obtaining an image 30 (shown in fig. 2).
This illumination manner can capture edge features of the object to be tested that are distinguished by color reasonably well, but when the edge features of the object to be tested 20 are concave-convex edge features without color distinction, it is difficult for this illumination manner to render the concave-convex edge features clearly in the image 30 (as shown in fig. 2, the dotted lines indicate areas whose edge features cannot be presented in the image 30). Specifically, the image 30 formed under this illumination presents only part of the edge information of the object 20, so the edge features of the object 20 in the image 30 are incomplete and the quality of the obtained image is poor, which hinders the extraction of image edge features in computer vision measurement and is not conducive to improving detection accuracy.
In addition, for a piece to be measured with a highlight surface, highlight reflection easily occurs on its surface under illumination. Even if the edge features of the piece to be measured are distinguished by color, the highlight reflection makes it difficult for the obtained image to present all of the edge features, which is also unfavorable to the extraction of image edge features.
In order to solve this technical problem, the invention provides an image detection system suitable for detecting a piece to be detected. A plurality of light sources are obliquely incident on the piece to be detected from different directions, an image is acquired under each light source, and the acquired images are synthesized, so that clear edge features of the piece to be detected are presented under the illumination of the different light sources and the piece to be detected has more complete edge features in the composite image. The quality of the composite image obtained by computer vision measurement is thereby improved, and higher detection accuracy is obtained when the composite image is used for edge feature extraction and detection.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
With reference to fig. 3 and fig. 4, a functional block diagram and a structural schematic diagram of an embodiment of the image detection system of the present invention are respectively shown.
In this embodiment, the image inspection system is suitable for performing image inspection on the object 120 to be inspected (as shown in fig. 4).
Specifically, the image detection system includes: an illumination module 100 (shown in fig. 3) adapted to illuminate the object 120 to be tested, wherein the illumination module 100 includes a plurality of light sources 110 (shown in fig. 4), and light emitted by the light sources 110 is obliquely incident on the object 120 to be tested from different directions; an image acquisition module 200 (shown in fig. 3) disposed above the to-be-detected piece 120 and adapted to acquire an image of the to-be-detected piece 120; an acquisition control module 300 (shown in fig. 3), connected to the illumination module 100 and the image acquisition module 200, and adapted to control the light sources 110 to be turned on and off respectively, and control the image acquisition module 200 to perform image acquisition on the to-be-measured object 120 when the light sources 110 are turned on, so as to obtain a plurality of images of the to-be-measured object 120 under illumination of different light sources 110; an image processing module 400 (as shown in fig. 3) connected to the image capturing module 200, and adapted to synthesize the plurality of images to obtain a composite image, and further adapted to perform edge feature extraction on the composite image to detect the dut 120.
By having the light sources 110 obliquely incident on the piece to be detected 120 from different directions, acquiring an image under each light source and synthesizing the acquired images, clear edge features of the piece to be detected 120 are presented under the illumination of the different light sources 110, so that the piece to be detected 120 has complete edge features in the composite image. This improves the quality of the composite image obtained by computer vision measurement, and higher detection accuracy is obtained when the composite image is used for edge feature extraction and detection.
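Purely as an illustration of how the four modules relate to one another (the patent defines hardware modules, not a software interface), the block diagram of fig. 3 can be sketched as follows; every class, method and parameter name here is a hypothetical placeholder rather than anything specified by this disclosure.

    from typing import List, Protocol

    import numpy as np


    class LightSource(Protocol):
        """One of the light sources 110 (e.g. an LED bar light) mounted obliquely around the part."""
        def turn_on(self) -> None: ...
        def turn_off(self) -> None: ...


    class Camera(Protocol):
        """Image acquisition module 200: an industrial camera with a trigger input."""
        def trigger(self) -> np.ndarray: ...   # returns one grayscale frame


    class AcquisitionController:
        """Acquisition control module 300: switches the sources and fires the camera."""
        def __init__(self, sources: List[LightSource], camera: Camera) -> None:
            self.sources = sources
            self.camera = camera


    class ImageProcessor:
        """Image processing module 400: composes the frames and extracts edge features.
        It is also wired to the controller, mirroring the trigger instructions of fig. 3."""
        def __init__(self, controller: AcquisitionController) -> None:
            self.controller = controller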
The image detection system provided in the present embodiment will be described in detail below with reference to the accompanying drawings.
With continuing reference to fig. 3 and 4 in conjunction with fig. 5, fig. 5 is a schematic diagram of an embodiment of an illumination module in the image detection system shown in fig. 3. For convenience of illustration, fig. 5 also shows a schematic structural diagram of the to-be-tested object.
In this embodiment, the device under test 120 has a concave-convex edge feature. Specifically, the shape of the device under test 120 is an a-shaped structural member with a certain protrusion height.
When the illumination module 100 illuminates the object 120 to be tested, the light sources 110 are obliquely incident on the object 120 to be tested from different directions, and concave-convex edge features of the object 120 to be tested appear as shadows in different directions under the illumination of the different light sources 110; in this embodiment, after obtaining a plurality of images of the object 120 under illumination by different light sources 110, the parts with obvious edge features in the plurality of images can be synthesized, so that the object 120 has more complete edge features in the synthesized image.
It should be noted that the incident angle of the light emitted from the light source 110 on the surface of the dut 120 should be neither too small nor too large. If the incident angle is too small, the shadows cast by the concave-convex edge features under the illumination of the light source 110 tend to be inconspicuous, which is unfavorable to the presentation of the edge features in the resulting composite image; if the incident angle is too large, it may be difficult to fully illuminate the edge features, particularly the raised portions, which likewise hinders the presentation of the edge features of the dut 120 in the resulting composite image. For this reason, in this embodiment, the incident angle of the light emitted from the light source 110 on the surface of the dut 120 is 30° to 60°.
It should also be noted that when the relief of the concave-convex edge features is shallow, the incident angle can be increased to improve the visibility of the shadows, for example to the range of 45° to 60°.
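The 30° to 60° range can be turned into a mounting position with simple trigonometry. The sketch below assumes the incident angle is measured from the surface normal of the piece to be tested, which is consistent with the shadow behaviour described above; the mounting height and the helper name are illustrative assumptions, not values from the patent.

    import math


    def lateral_offset(mount_height_mm: float, incident_angle_deg: float) -> float:
        """Horizontal distance between the measured point and the centre of the light
        source needed to reach the given incident angle (measured from the surface
        normal).  The mounting height is an illustrative parameter, not a value
        taken from the patent."""
        if not 30.0 <= incident_angle_deg <= 60.0:
            raise ValueError("this embodiment keeps the incident angle between 30 and 60 degrees")
        return mount_height_mm * math.tan(math.radians(incident_angle_deg))


    # Example: a bar light mounted 200 mm above the part and aimed for 45 degrees of
    # incidence (the steeper end of the range, useful for shallow relief) sits about
    # 200 mm to the side of the measured point.
    print(round(lateral_offset(200.0, 45.0), 1))   # -> 200.0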
In this embodiment, the light source 110 is an LED light source. The LED light source has the characteristics of high stability, high response speed, high brightness and the like, so that stable brightness can be quickly achieved after the LED light source is started, and the detection efficiency is favorably improved; in the process of illuminating the to-be-measured part 120 by using different light sources 110 and acquiring the image of the to-be-measured part 120, the brightness consistency under illumination by different light sources 110 can be improved, so that the quality of the obtained composite image and the good presentation of the edge characteristics in the composite image are improved.
The light source 110 may be one or more of a bar light source, an arc light source, and a point light source. In this embodiment, the light source 110 is a bar light source. The size of the light emitting surface of the bar light source is large, so when the bar light source is used for illuminating the to-be-tested part 120, the edge feature close to one side of the light source 110 can be completely illuminated.
In this embodiment, the light source 110 includes a light emitting surface (not labeled), and a maximum length of the light emitting surface close to one side of the device under test 120 is greater than a maximum length of the device under test 120 in a direction parallel to the light emitting surface; therefore, when the light sources 110 are obliquely incident on the device under test 120 from different directions, the edge features on the side close to the light sources 110 can be ensured to be completely illuminated, so that the edge features can be respectively presented under the illumination of different light sources 110, and the probability of the occurrence of the problem of missing edge feature parts in the composite image is reduced.
In this embodiment, the to-be-measured object 120 has concave-convex edge characteristics, and correspondingly, the number of the light sources 110 in the lighting module 100 is greater than or equal to four, and the light sources 110 surround the to-be-measured object 120 to prevent the problem that part of the edge characteristics are not illuminated, so that in a composite image, the to-be-measured object 120 has more complete edge characteristics, and the problem of edge characteristic missing is avoided.
Specifically, the light sources 110 surround the to-be-detected piece 120, so that when the light sources 110 are obliquely incident to the to-be-detected piece 120 from different directions, the uniformity of the illumination effect of the to-be-detected piece 120 can be improved, and the composite image is favorably formed, wherein the to-be-detected piece 120 has complete edge characteristics.
As shown in fig. 5, in the present embodiment, the illumination module 100 includes four strip light sources, and the four strip light sources are arranged in a rectangular shape around the device under test 120. Specifically, the four light sources 110 include a first light source 111, a second light source 112, a third light source 113, and a fourth light source 114, which are sequentially arranged.
With continuing reference to fig. 3 and 4, the image capturing module 200 is configured to capture an image of the dut 120 to obtain a plurality of images.
In this embodiment, the image capturing module 200 is an industrial camera having a triggering photographing function, so that the photographing can be triggered after the light source 110 is turned on.
The industrial camera has high image stability, high transmission capability, high anti-interference capability and the like, so that the shooting quality is favorably improved, and the display effect of the edge characteristics of the piece to be measured 120 under the illumination of different light sources 110 is favorably improved.
Specifically, the image capturing module 200 is a camera and lens assembly with a trigger capturing function, for example, the image capturing module 200 may be a CCD (Charge Coupled Device) camera or a CMOS (Complementary Metal Oxide Semiconductor) camera.
With continuing reference to fig. 3 and 4, the acquisition control module 300 is adapted to control the light sources 110 to be turned on and off respectively, and control the image acquisition module 200 to acquire an image of the dut 120 when the light sources 110 are turned on.
The acquisition control module 300 is configured to control the lighting module 100 to be turned on and off, and is further configured to trigger the image acquisition module 200 to perform image acquisition after a light source 110 is turned on and before that light source 110 is turned off, so that it can be ensured that only one light source 110 is on and the remaining light sources 110 are off each time an image of the piece to be detected 120 is acquired.
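The resulting on/trigger/off sequence can be summarised in a short control sketch. The turn_on, turn_off and trigger hooks are the hypothetical interfaces from the earlier skeleton; in the embodiment they would be realised by the programmable logic controller, the relays and the camera's hardware trigger described below.

    from typing import List

    import numpy as np


    def acquire_under_each_source(sources, camera) -> List[np.ndarray]:
        """Switch the sources on one at a time and capture exactly one frame per
        source, so that every image is lit by a single source while the rest stay off."""
        frames: List[np.ndarray] = []
        for src in sources:
            src.turn_on()                      # only this source is lit
            frames.append(camera.trigger())    # fire the camera before switching it off
            src.turn_off()
        return frames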
With combined reference to fig. 6 to 9, images of the dut obtained by the acquisition control module controlling the illumination module and the image acquisition module shown in fig. 5 are shown.
Specifically, the acquisition control module 300 turns on the first light source 111 (as shown in fig. 5), turns off the second light source 112 (as shown in fig. 5), the third light source 113 (as shown in fig. 5), and the fourth light source 114 (as shown in fig. 5), illuminates the a-shaped object to be measured 120 with the first light source 111 (as shown in fig. 5), and controls the image acquisition module 200 to acquire an image of the object to be measured 120 when the first light source 111 illuminates the object to be measured 120, so as to obtain a first image 610 (as shown in fig. 6), where a top area of the a-shape in the first image 610 is a bright area, and a bottom of the a-shape is a shadow area. It should be noted that the bright area is not shaded at the edge of the a-shaped structure because it faces the direction of light, so the edge features are not sufficiently obvious, and the shaded area is shaded due to the raised portion of the a-shaped structure, so the edge features are relatively clear.
After the first image 610 is obtained, the acquisition control module 300 turns on the second light source 112, turns off the first light source 111, the third light source 113, and the fourth light source 114, illuminates the a-shaped object to be measured 120 with the second light source 112, and controls the image acquisition module 200 to acquire an image of the object to be measured 120 when the second light source 112 illuminates the object to be measured 120, so as to obtain a second image 620 (as shown in fig. 7), where the second image 620 corresponds to an O area (as shown in fig. 5) as a bright area, and corresponds to an O' area (as shown in fig. 5) as a shadow area.
Similarly, the bright area is opposite to the illumination direction, the edge of the A-shaped structural member has no shadow, so the edge characteristic is not obvious enough, and the shadow area generates a shadow due to the raised part of the A-shaped structural member, so the edge characteristic is relatively clear.
After the second image 620 is obtained, the acquisition control module 300 turns on the third light source 113, turns off the first light source 111, the second light source 112, and the fourth light source 114, illuminates the to-be-detected piece 120 with the third light source 113, and controls the image acquisition module 200 to acquire an image of the to-be-detected piece 120 when the to-be-detected piece 120 is illuminated by the third light source 113, so as to obtain a third image 630 (as shown in fig. 8), wherein the top area of the a-shape in the third image 630 is a shadow area, and the bottom of the a-shape is a bright area. Similarly, the bright area is opposite to the illumination direction, the edge of the A-shaped structural member has no shadow, so the edge characteristic is not obvious enough, and the shadow area generates a shadow due to the raised part of the A-shaped structural member, so the edge characteristic is relatively clear.
After the third image 630 is obtained, the acquisition control module 300 turns on the fourth light source 114, turns off the first light source 111, the second light source 112, and the third light source 113, illuminates the to-be-detected piece 120 with the fourth light source 114, and controls the image acquisition module 200 to acquire an image of the to-be-detected piece 120 when the to-be-detected piece 120 is illuminated by the fourth light source 114, so as to obtain a fourth image 640 (as shown in fig. 9), where a shadow area corresponds to an O area (as shown in fig. 5) and a bright area corresponds to an O' area (as shown in fig. 5) in the fourth image 640. Similarly, the bright area is opposite to the illumination direction, the edge of the A-shaped structural member has no shadow, so the edge characteristic is not obvious enough, and the shadow area generates a shadow due to the raised part of the A-shaped structural member, so the edge characteristic is relatively clear.
In this embodiment, the acquisition control module 300 is a Programmable Logic Controller (PLC) together with relays. The programmable logic controller is a digital electronic control system designed specifically for industrial environments; it offers high reliability and strong anti-interference capability, provides rich control functions, and allows programs, control methods and controlled objects to be modified conveniently. The relay is an automatic switching element with an isolation function; it is widely used in remote control, telemetry, communication, automatic control, mechatronic and power electronic equipment, and is one of the most important control elements.
With continued reference to fig. 3, the image processing module 400 is connected to the image capturing module 200, and is adapted to synthesize the plurality of images to obtain a composite image, and further adapted to perform edge feature extraction on the composite image to realize the detection of the dut 120 (shown in fig. 4).
Referring to fig. 10 in combination, a composite image formed by the images shown in fig. 6 to 9 is shown, the image processing module 400 converts the information such as pixel distribution and brightness in the plurality of images into a digitized signal, processes the digitized signal to obtain a composite image 650 (shown in fig. 10), and performs edge detection on the composite image 650 to extract edge features in the composite image 650.
In this embodiment, the image processing module 400 combines the regions with relatively obvious edge features (shaded regions) in fig. 6 to 9 to form the composite image 650.
Specifically, the image processing module 400 is adapted to obtain the gray value of each pixel in each image and to take the minimum gray value of the same pixel across the plurality of images as the gray value of the corresponding pixel in the composite image, thereby synthesizing the plurality of images. Because pixels with smaller gray values correspond to pixels lying in shadow, taking the per-pixel minimum gray value over the multiple images retains the shadows from all of them, so that the object to be tested 120 in the composite image 650 carries all of the shadows; the retained shadows together with the raised portions form light-dark edge features, and the resulting composite image 650 can therefore be used for edge feature extraction and image detection.
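Assuming all frames are grayscale images of identical size, the per-pixel minimum rule described above reduces to a few lines of NumPy; the function name is our own shorthand, not terminology from the patent.

    import numpy as np


    def compose_min(frames) -> np.ndarray:
        """Per-pixel minimum over the frames captured under the different sources:
        the darkest value at each pixel survives, so every shadow cast by any of
        the obliquely incident sources is retained in the composite image."""
        stack = np.stack([np.asarray(f, dtype=np.uint8) for f in frames], axis=0)
        return stack.min(axis=0)


    # composite = compose_min([first_image, second_image, third_image, fourth_image])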
In this embodiment, the image processing module 400 is an Industrial Personal Computer (IPC), i.e., an Industrial control Computer. The industrial personal computer has important computer attributes and characteristics, such as a computer CPU, a hard disk, a memory, peripherals and interfaces, an operating system, a control network and protocol, computing power and a friendly Human-Machine Interface (HMI). In other embodiments, the image processing module may also be a computer processing system.
It should be noted that the image processing module 400 is further connected to the acquisition control module 300, and is adapted to send an instruction to the acquisition control module 300 to trigger the acquisition control module 300 to control the illumination module 100 and the image acquisition module 200.
Specifically, the image processing module 400 controls the illumination module 100 and the image capturing module 200 by sending image capturing logic instructions to the capturing control module 300.
It should be further noted that this embodiment has been described taking the case where the workpiece 120 has concave-convex edge features as an example. In other embodiments, the object to be measured may have no obvious concave-convex edges; however, because the surface of the object to be measured is highly smooth, or because its surface material has a high reflectivity to light, highlight reflection easily occurs on its surface under illumination, producing a highlight surface.
Correspondingly, compared with a scheme in which a single light source above the piece to be tested illuminates it, using a plurality of light sources obliquely incident on the piece to be tested from different directions avoids the problem that highlight reflection makes it difficult to present the edge features in the image, so that the piece to be tested also has relatively complete edge features in the composite image.
Referring to fig. 11 in combination, a schematic structural diagram of another embodiment of the illumination module in the image detection system shown in fig. 3 is shown. It should be noted that, for convenience of illustration, fig. 11 also shows a schematic structural diagram of the dut, and the surface of the dut has an a-shaped pattern.
The parts of this embodiment that are the same as the parts of the foregoing embodiment will not be described again. The present embodiment differs from the previous embodiments in that: the illumination module 800 includes at least two light sources 810, and the two light sources 810 respectively illuminate the device under test 700 to form highlight areas at different positions of the highlight surface.
In a highlight area, the captured image tends to have indistinct edge features. Because the two light sources 810 illuminate the piece to be tested 700 from different directions, the highlight areas they produce fall at different positions of the highlight surface, so the influence of the highlight areas in the multiple images can be eliminated when the images are synthesized, the distinct edge features of the non-highlight areas are retained, and the piece to be tested 700 has complete edge features in the composite image.
Specifically, the highlight surface is an arc surface, and the lighting module 800 correspondingly includes two bar light sources which are parallel to each other and are located above the outer sides of the two top ends of the arc surface, respectively. It should be noted that, as shown in fig. 11, the arc surface includes a first rib 701 and a second rib 702 that are parallel to each other, and an arc line (not labeled) located between the first rib 701 and the second rib 702; the arc surface has a first top end M between the arc top LL' and the first rib 701, and a second top end M' between the arc top LL' and the second rib 702. The outer sides of the two top ends of the arc surface refer to the side of the first top end M close to the first rib 701 and the side of the second top end M' close to the second rib 702.
Since the two strip light sources can obliquely enter the object 700 from different directions, different highlight areas are generated in the obtained images, and the formed highlight areas are located at different positions of the highlight surface.
In this embodiment, the light source 810 includes a first bar light source 801 (near the top of the A-shape) and a second bar light source 802 (near the bottom of the A-shape).
With combined reference to fig. 12 and 13, images of the dut obtained by the acquisition control module controlling the illumination module and the image acquisition module shown in fig. 11 are shown.
Specifically, the acquisition control module turns on the first bar light source 801 (as shown in fig. 11), turns off the second bar light source 802 (as shown in fig. 11), illuminates the A-shaped object 700 to be tested with the first bar light source 801, and acquires an image of the object 700 to be tested while the first bar light source 801 illuminates it, so as to obtain a first image 710 (as shown in fig. 12). Since the first bar light source 801 is located at the top of the A-shaped object 700 to be tested, a highlight region (shown as region A in fig. 12) is formed at the top of the A-shape in the first image 710, and the other regions in the first image 710 are non-highlight regions. It should be noted that, for the highlight region, the image capturing module lies in the specular reflection direction and the brightness of the highlight region is relatively high, so the image capturing module is prone to overexposure there and the captured image is almost entirely white in that region; as a result, the edge features of the A-shaped pattern in the highlight region of the first image 710 are not obvious enough. For the non-highlight regions, the image capturing module does not lie in the specular reflection direction, relatively little light is reflected from them into the image capturing module, and overexposure does not occur, so the edge features of the A-shaped pattern in the non-highlight regions of the first image 710 are relatively clear.
After the first image 710 is obtained, the acquisition control module turns on the second bar light source 802, turns off the first bar light source 801, illuminates the A-shaped object 700 to be tested with the second bar light source 802, and acquires an image of the object 700 to be tested while the second bar light source 802 illuminates it, so as to obtain a second image 720 (as shown in fig. 13). Since the second bar light source 802 is located at the bottom of the A-shaped object 700 to be tested, a highlight region (shown as region B in fig. 13) is formed at the bottom of the A-shape in the second image 720, and the other regions in the second image 720 are non-highlight regions. Similarly, the highlight region is prone to overexposure and appears almost entirely white, so the edge features of the A-shaped pattern in the highlight region of the second image 720 are not obvious enough, whereas the non-highlight regions are not overexposed and their edge features are relatively clear.
Referring collectively to fig. 14, a composite image formed from the images shown in fig. 12 and 13 is shown. In this embodiment, the image processing module combines the non-highlight region with relatively obvious edge features in fig. 12 and 13 to obtain a composite image 730 with relatively complete edge features.
The image processing module is adapted to obtain the gray value of each pixel in each image and to take the minimum gray value of the same pixel across the plurality of images as the gray value of the corresponding pixel in the composite image, thereby synthesizing the plurality of images. Because pixels with smaller gray values correspond to pixels in the non-highlight areas, taking the per-pixel minimum gray value over the multiple images retains the distinct edge features when the images are synthesized; the portions of the multiple images with distinct edge features are thus combined, so that the edge features of the piece to be tested 700 in the composite image 730 are complete and the resulting composite image 730 is suitable for edge feature extraction and image detection.
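The minimum rule can only suppress a highlight at a pixel if at least one of the two exposures is not saturated there. As an illustrative sanity check, which is our addition and not a step described by the patent, one can measure how much of the image is near-saturated in both exposures before composing them:

    import numpy as np


    def overlapping_highlights(img_a: np.ndarray, img_b: np.ndarray, sat: int = 250) -> float:
        """Fraction of pixels that are near-saturated in BOTH exposures; such pixels
        would remain washed out even after the per-pixel minimum.  The threshold of
        250 is an illustrative choice for 8-bit images, not a value from the patent."""
        both = (np.asarray(img_a) >= sat) & (np.asarray(img_b) >= sat)
        return float(both.mean())


    # With the two bar lights placed as in fig. 11, the highlight bands fall on
    # different parts of the arc surface, so this fraction should be close to zero.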
It should be noted that, when the dut 700 has a high-light surface, the number of the light sources 810 is not limited to two. In other embodiments, the lighting module includes four strip light sources, the four strip light sources are arranged in a rectangular shape around the to-be-detected piece, and two suitable strip light sources arranged oppositely are selected from the four strip light sources according to the specific shape and the placing direction of the to-be-detected piece.
Correspondingly, the invention also provides an image detection device.
With continued reference to fig. 4, a schematic structural diagram of an embodiment of the image detection apparatus of the present invention is shown.
The image detection apparatus includes: the image detection system of the present invention; a bracket 350 adapted to secure the illumination module 100 (shown in fig. 3) and the image capture module 200.
In this embodiment, the image inspection system is suitable for inspecting a device under test 120 (as shown in fig. 4).
For a detailed description of the image detection system, please refer to the corresponding description in the foregoing embodiments, which is not repeated herein.
In this embodiment, the bracket 350 is used to fix the illumination module 100 and the image capturing module 200, so as to improve the stability of the illumination module 100 and the image capturing module 200, and facilitate the improvement of the convenience of image detection on the to-be-detected object 120.
Specifically, the bracket 350 includes: a base 450; the first fixing bars 150 are located on the base 450 and adapted to fix the light sources 110, respectively, the first fixing bars 150 are located above the to-be-tested object 120, and the first fixing bars 150 are connected end to form a ring shape; and a second fixing rod 250 adapted to fix the image capturing module 200, wherein the second fixing rod 250 is protrudingly disposed on any one of the first fixing rods 150.
In this embodiment, the lighting module 100 includes four bar-shaped light sources, the bracket 350 includes four bar-shaped first fixing bars 150, and the four bar-shaped first fixing bars 150 are connected end to form a rectangle; the second fixing bar 250 includes a supporting bar 251 located on any one of the bar-shaped first fixing bars 150, and a cantilever 252 connected to the supporting bar 251, the cantilever 252 being disposed toward the rectangle.
Correspondingly, when the image detection device is used for carrying out image detection on the piece to be detected 120, the piece to be detected 120 can be borne by a manipulator, the piece to be detected 120 is arranged below the illumination module 100 and the image acquisition module 200, and the image detection of the piece to be detected 120 is realized through the image detection system.
In addition, the invention also provides an image detection method which is suitable for detecting the piece to be detected.
Referring to fig. 15, a flow chart of an embodiment of the image detection method of the present invention is shown. The image detection method of this embodiment includes the following basic steps (a compact sketch of how the steps chain together follows the list):
step S1: arranging an illumination module to illuminate the piece to be tested, wherein the illumination module comprises a plurality of light sources, and light emitted by the plurality of light sources is obliquely incident to the piece to be tested from different directions;
step S2: controlling the plurality of light sources to be respectively turned on and off, and acquiring images of the to-be-detected piece when the light sources illuminate the to-be-detected piece to obtain a plurality of images of the to-be-detected piece under illumination of different light sources;
step S3: and synthesizing the plurality of images to obtain a synthesized image, and extracting edge features of the synthesized image to realize the detection of the piece to be detected.
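Steps S1 to S3 chain together into a short pipeline. The sketch below reuses the hypothetical helpers acquire_under_each_source and compose_min introduced earlier; the hardware set-up of step S1 is taken as already done (sources mounted obliquely around the piece), and the edge operator of step S3 is left as a caller-supplied function because the patent does not prescribe a specific one.

    import numpy as np


    def detect(sources, camera, extract_edges) -> np.ndarray:
        """S2: capture one image per obliquely incident source; S3: compose the
        images and hand the composite to an edge detector chosen by the caller."""
        frames = acquire_under_each_source(sources, camera)   # step S2
        composite = compose_min(frames)                       # step S3, per-pixel minimum
        return extract_edges(composite)                       # step S3, edge features for inspection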
Specific embodiments of the present invention will be further described with reference to the accompanying drawings.
With continuing reference to fig. 5 to 10, schematic structural diagrams corresponding to each step of the embodiment of the image detection method shown in fig. 15 are shown.
Referring to fig. 15 in combination with fig. 5, fig. 5 shows a schematic structural diagram corresponding to step S1 in fig. 15, step S1 is executed, an illumination module 100 is arranged to illuminate the device under test 120, the illumination module 100 includes a plurality of light sources 110, and light emitted by the plurality of light sources 110 is obliquely incident on the device under test 120 from different directions.
In this embodiment, the device under test 120 has a concave-convex edge feature. Specifically, the shape of the device under test 120 is an a-shaped structural member with a certain protrusion height.
Correspondingly, when the illumination module 100 illuminates the object to be tested 120, the light sources 110 are obliquely incident to the object to be tested 120 from different directions, and the concave-convex edge features of the object to be tested 120 are shaded in different directions under the illumination of the light sources 110; in this embodiment, after obtaining a plurality of images of the object 120 under illumination by different light sources 110, the parts with obvious edge features in the plurality of images can be synthesized, so that the object 120 has more complete edge features in the synthesized image.
In this embodiment, the step of providing the lighting module 100 includes: the light source 110 is installed such that the incident angle of the light emitted from the light source 110 on the surface of the dut 120 is 30 ° to 60 °.
The incident angle of the light emitted from the light source 110 on the surface of the device under test 120 should be neither too small nor too large. If the incident angle is less than 30°, the shadows cast by the concave-convex edge features under the illumination of the light source 110 are not obvious, which is unfavorable to the presentation of the edge features of the object to be tested 120 in the resulting composite image; if the incident angle is greater than 60°, it is difficult to fully illuminate the edge features, particularly the raised portions, which likewise hinders the presentation of the edge features of the object to be tested 120 in the resulting composite image.
It should be noted that when the relief of the concave-convex edge features is shallow, the incident angle can be increased to improve the visibility of the shadows, for example to the range of 45° to 60°.
In this embodiment, the step of providing the lighting module 100 includes: providing at least four light sources 110; the light sources 110 are installed such that the light sources 110 are evenly distributed around the dut 120.
By making the number of light sources 110 in the lighting module 100 greater than or equal to four, and distributing the plurality of light sources 110 evenly around the piece to be tested 120, the problem that part of the edge features are not illuminated is prevented and the uniformity of the illumination of the piece to be tested 120 is improved, so that the piece to be tested 120 has relatively complete edge features in the composite image and the problem of missing edge features is avoided.
In this embodiment, the step of providing the lighting module 100 includes: providing four strip-shaped light sources; and installing the four strip-shaped light sources to enable the four strip-shaped light sources to surround the piece to be detected 120 and be arranged in a rectangular shape.
Specifically, as shown in fig. 5, the four light sources 110 include a first light source 111, a second light source 112, a third light source 113, and a fourth light source 114, which are sequentially arranged.
For a detailed description of the illumination module 100, please refer to the corresponding description in the image detection system, which is not repeated herein.
With continuing reference to fig. 15 and with combined reference to fig. 6 to 9, a schematic structural diagram corresponding to step S2 in fig. 15 is shown, and step S2 is executed to control the light sources 110 (shown in fig. 5) to be turned on and off respectively, and perform image acquisition on the to-be-tested object 120 when the light sources 110 illuminate the to-be-tested object 120 (shown in fig. 5), so as to obtain a plurality of images of the to-be-tested object 120 illuminated by different light sources 110.
In this embodiment, in order to enable the edge features of the to-be-detected piece 120 to be respectively presented under the illumination of different light sources 110, when the to-be-detected piece 120 is subjected to image acquisition once, only one light source 110 is in an on state, and the remaining light sources 110 are in an off state.
In this embodiment, an industrial camera having a function of triggering photographing is used to acquire an image of the to-be-detected part 120.
In this embodiment, the first light source 111 (shown in fig. 5), the second light source 112 (shown in fig. 5), the third light source 113 (shown in fig. 5), and the fourth light source 114 (shown in fig. 5) are sequentially adopted to illuminate the device under test 120 (shown in fig. 5) for example.
Specifically, the first light source 111 is turned on, the second light source 112, the third light source 113 and the fourth light source 114 are turned off, the a-shaped object to be measured 120 is illuminated by the first light source 111, and when the first light source 111 illuminates the object to be measured 120, the object to be measured 120 is subjected to image acquisition, so as to obtain a first image 610 (as shown in fig. 6), wherein the top area of the a-shape in the first image 610 is a bright area, and the bottom of the a-shape is a shadow area. It should be noted that the bright area is not shaded at the edge of the a-shaped structure because it faces the direction of light, so the edge features are not sufficiently obvious, and the shaded area is shaded due to the raised portion of the a-shaped structure, so the edge features are relatively clear.
After the first image 610 is obtained, the second light source 112 is turned on, the first light source 111, the third light source 113 and the fourth light source 114 are turned off, the a-shaped object to be measured 120 is illuminated by the second light source 112, and image acquisition is performed on the object to be measured 120 when the second light source 112 illuminates the object to be measured 120, so as to obtain a second image 620 (as shown in fig. 7), wherein a bright area corresponds to an O area (as shown in fig. 5) in the second image 620, and a shadow area corresponds to an O' area (as shown in fig. 5). Similarly, the bright area is opposite to the illumination direction, the edge of the A-shaped structural member has no shadow, so the edge characteristic is not obvious enough, and the shadow area generates a shadow due to the raised part of the A-shaped structural member, so the edge characteristic is relatively clear.
After the second image 620 is obtained, the third light source 113 is turned on, the first light source 111, the second light source 112 and the fourth light source 114 are turned off, the a-shaped object to be measured 120 is illuminated by the third light source 113, and image acquisition is performed on the object to be measured 120 when the object to be measured 120 is illuminated by the third light source 113, so as to obtain a third image 630 (as shown in fig. 8), wherein the top area of the a-shape in the third image 630 is a shadow area, and the bottom of the a-shape is a bright area. Similarly, the bright area is opposite to the illumination direction, the edge of the A-shaped structural member has no shadow, so the edge characteristic is not obvious enough, and the shadow area generates a shadow due to the raised part of the A-shaped structural member, so the edge characteristic is relatively clear.
After the third image 630 is obtained, the fourth light source 114 is turned on, the first light source 111, the second light source 112 and the third light source 113 are turned off, the a-shaped object to be tested 120 is illuminated by the fourth light source 114, and image acquisition is performed on the object to be tested 120 when the fourth light source 114 illuminates the object to be tested 120, so as to obtain a fourth image 640 (as shown in fig. 9), wherein a shadow area corresponding to an O area (as shown in fig. 5) and a bright area corresponding to an O' area (as shown in fig. 5) in the fourth image 640. Similarly, the bright area is opposite to the illumination direction, the edge of the A-shaped structural member has no shadow, so the edge characteristic is not obvious enough, and the shadow area generates a shadow due to the raised part of the A-shaped structural member, so the edge characteristic is relatively clear.
With continuing reference to fig. 15 and with combined reference to fig. 10, fig. 10 shows a schematic structural diagram corresponding to step S3 in fig. 15, and step S3 is executed to synthesize the multiple images to obtain a synthesized image 650, and perform edge feature extraction on the synthesized image 650 to implement detection on the object 120 to be tested (as shown in fig. 5).
In this embodiment, the areas with relatively obvious edge features (shaded areas) in fig. 6 to 9 are combined to form the combined image 650.
Specifically, the gray value of each pixel in each image is obtained, and the minimum gray value of the same pixel across the plurality of images is taken as the gray value of the corresponding pixel in the composite image, thereby synthesizing the plurality of images. Because pixels with smaller gray values correspond to pixels lying in shadow, taking the per-pixel minimum gray value over the multiple images retains the shadows from all of them, so that the piece to be tested 120 in the composite image 650 carries all of the shadows; the retained shadows together with the raised portions of the A-shaped structural member form light-dark edge features, and the resulting composite image 650 can therefore be used for edge feature extraction and image detection.
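The patent leaves the choice of edge operator open. As one hedged possibility, a simple gradient-magnitude threshold (or an off-the-shelf detector such as cv2.Canny) could be applied to the composite image 650; the threshold value below is purely illustrative.

    import numpy as np


    def edge_map(composite: np.ndarray, thresh: float = 30.0) -> np.ndarray:
        """Tiny stand-in for edge feature extraction: central-difference gradients
        followed by a magnitude threshold.  A real system might instead use a
        sub-pixel edge extractor or cv2.Canny; the threshold is illustrative."""
        img = composite.astype(np.float32)
        gy, gx = np.gradient(img)                     # gradients along rows and columns
        magnitude = np.hypot(gx, gy)
        return (magnitude > thresh).astype(np.uint8)  # 1 marks a strong light/dark transition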
It should be noted that this embodiment has been described taking the case where the workpiece 120 has concave-convex edges as an example. However, the edge features of the object to be tested are not limited to concave-convex features.
Referring to fig. 11 to 14, schematic structural diagrams corresponding to steps of another embodiment of the image detection method shown in fig. 15 are shown.
The parts of this embodiment that are the same as those of the foregoing embodiment will not be described again. The present embodiment differs from the previous embodiment in that the object 700 to be tested (as shown in fig. 11) has no obvious concave-convex edges; however, because the surface of the object 700 is highly smooth, or because its surface material has a high reflectivity to light, the surface of the object 700 is prone to highlight reflection under illumination, and thus a highlight surface is produced.
Correspondingly, compared with a scheme in which a single light source above the piece to be tested illuminates it, using a plurality of light sources obliquely incident on the piece to be tested 700 from different directions avoids the problem that highlight reflection makes it difficult to present the edge features in the image, so that the piece to be tested 700 also has relatively complete edge features in the composite image.
As shown in fig. 11, in the present embodiment, the surface of the dut 700 has an a-shaped pattern.
In this embodiment, the step of providing the lighting module 800 includes: providing at least two light sources 810; and mounting the light sources 810 so that, when the two light sources 810 respectively illuminate the device under test 700, highlight regions are formed at different positions of the highlight surface.
In a highlight area, the captured image tends to have indistinct edge features. Because the two light sources 810 illuminate the piece to be tested 700 from different directions, the highlight areas they produce fall at different positions of the highlight surface, so the influence of the highlight areas in the multiple images can be eliminated when the images are synthesized, the distinct edge features of the non-highlight areas are retained, and the piece to be tested 700 has complete edge features in the composite image.
Specifically, the highlight surface is an arc surface; accordingly, the step of providing the lighting module 800 includes providing two bar light sources which are parallel to each other and are respectively located above the outer sides of the two top ends of the arc surface. It should be noted that, as shown in fig. 11, the arc surface includes a first rib 701 and a second rib 702 that are parallel to each other, and an arc line (not labeled) located between the first rib 701 and the second rib 702; the arc surface has a first top end M between the arc top LL' and the first rib 701, and a second top end M' between the arc top LL' and the second rib 702. The outer sides of the two top ends of the arc surface refer to the side of the first top end M close to the first rib 701 and the side of the second top end M' close to the second rib 702.
Since the light of the two strip light sources is obliquely incident on the piece to be detected 700 from different directions, different highlight regions are produced in the captured images, and these highlight regions are located at different positions of the highlight surface.
In this embodiment, the light sources 810 include a first bar light source 801 (near the top of the A shape) and a second bar light source 802 (near the bottom of the A shape).
In this embodiment, the first bar light source 801 and the second bar light source 802 are used in turn to illuminate the piece to be detected 700, by way of example.
Specifically, the first bar light source 801 is turned on and the second bar light source 802 is turned off, so that the A-shaped piece to be detected 700 is illuminated by the first bar light source 801 alone; while the first bar light source 801 illuminates the piece 700, an image of the piece 700 is captured to obtain a first image 710 (as shown in fig. 12). Since the first strip light source 801 is located near the top of the A-shaped piece 700, a highlight region (region A in fig. 12) is formed at the top of the A shape in the first image 710, and the other regions of the first image 710 are non-highlight regions. It should be noted that, in the highlight region, the image acquisition module lies in the specular reflection direction and the brightness is high, so the module is easily overexposed and the captured region appears almost entirely white; as a result, the edge features of the A-shaped pattern in the highlight region of the first image 710 are not obvious enough. In the non-highlight regions, the image acquisition module does not lie in the specular reflection direction, relatively little light is reflected into it, overexposure does not occur, and the edge features of the A-shaped pattern in the non-highlight regions of the first image 710 are therefore relatively clear.
After the first image 710 is obtained, the second bar light source 802 is turned on and the first bar light source 801 is turned off, so that the A-shaped piece to be detected 700 is illuminated by the second bar light source 802 alone; while the second bar light source 802 illuminates the piece 700, an image of the piece 700 is captured to obtain a second image 720 (as shown in fig. 13). Since the second strip light source 802 is located near the bottom of the A-shaped piece 700, a highlight region (region B in fig. 13) is formed at the bottom of the A shape in the second image 720, and the other regions of the second image 720 are non-highlight regions. Similarly, in the highlight region the image acquisition module lies in the specular reflection direction and the brightness is high, so the module is easily overexposed, the captured region appears almost entirely white, and the edge features of the A-shaped pattern in the highlight region of the second image 720 are not obvious enough; in the non-highlight regions the module does not lie in the specular reflection direction, little light is reflected into it, overexposure does not occur, and the edge features of the A-shaped pattern in the non-highlight regions of the second image 720 are relatively clear.
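The on/off sequencing just described can be summarized in the following sketch (a hedged illustration only; controller and camera, together with their methods turn_on, turn_off, and capture, are hypothetical placeholders for the acquisition control module and the image acquisition module, not an interface disclosed in the patent):

    def acquire_under_each_light(controller, camera, light_ids):
        """Turn each light source on in turn, capture one image while it is lit,
        then turn it off again; return the list of captured images."""
        images = []
        for light_id in light_ids:
            controller.turn_on(light_id)     # only this light source is lit
            images.append(camera.capture())  # triggered capture under this light
            controller.turn_off(light_id)
        return images

    # For the two strip light sources of this embodiment:
    # first_image, second_image = acquire_under_each_light(
    #     controller, camera, ["bar_801", "bar_802"])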
Referring also to fig. 14, which shows the composite image formed from the images of fig. 12 and 13: after the first image 710 and the second image 720 are obtained, the non-highlight regions with relatively obvious edge features in the two images are combined to obtain a composite image 730 with complete edge features, and edge feature extraction is performed on the composite image 730 to detect the piece to be detected 700 (shown in fig. 11).
The image processing module is adapted to obtain the gray scale of each pixel point in each image and to take the minimum gray scale value of the same pixel point across the plurality of images as the gray scale of the corresponding pixel point in the composite image, thereby realizing the synthesis of the plurality of images. Because pixel points with a smaller gray scale correspond to pixel points in the non-highlight regions, taking the minimum gray scale value of the same pixel point across the plurality of images retains the obvious edge features during synthesis; the parts of the plurality of images with obvious edge features are thus combined, the edge features of the piece to be detected 700 in the composite image 730 are complete, and the composite image 730 is well suited to edge feature extraction and image detection.
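To make the last step concrete, one possible way to extract edge features from the composite image is a standard edge detector such as Canny (an illustrative choice only; the patent does not prescribe a particular edge-extraction algorithm, and the threshold values below are assumptions):

    import cv2
    import numpy as np

    def detect_edges(composite, low_threshold=50, high_threshold=150):
        """Extract edge features from the minimum-gray composite image.
        `composite` is assumed to be a 2-D grayscale array; the result is a
        binary edge map that downstream inspection logic can evaluate."""
        composite = composite.astype(np.uint8)
        return cv2.Canny(composite, low_threshold, high_threshold)

    # edges = detect_edges(composite)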
It should be noted that, when the piece to be detected 700 has a highlight surface, the number of light sources 810 is not limited to two. In other embodiments, four strip light sources may be provided in the step of setting the lighting module, arranged in a rectangle around the piece to be detected, and two suitable, oppositely arranged strip light sources are selected from the four according to the specific shape and placing direction of the piece to be detected, as sketched below.
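One simple heuristic for that selection (purely an assumption; the patent does not specify a rule) is to pick the opposite pair whose illumination direction is roughly perpendicular to the long axis of the piece, so that the highlight regions of the two captured images fall on clearly different parts of the surface:

    def pick_opposite_pair(part_axis_deg):
        """Choose which opposite strip-light pair of the rectangle to use,
        given the angle of the piece's long axis in the image plane
        (degrees, 0 = horizontal). Heuristic only."""
        angle = part_axis_deg % 180
        # Long axis close to horizontal -> illuminate from top and bottom;
        # long axis close to vertical   -> illuminate from left and right.
        return ("top", "bottom") if angle < 45 or angle > 135 else ("left", "right")

    # pick_opposite_pair(10)  -> ('top', 'bottom')
    # pick_opposite_pair(80)  -> ('left', 'right')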
Although the present invention has been disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (18)

1. An image inspection system adapted to inspect a workpiece, the image inspection system comprising:
the illumination module is suitable for illuminating the piece to be measured and comprises a plurality of light sources, light emitted by the light sources obliquely enters the piece to be measured from different directions respectively, and the incident angle of the light emitted by the light sources on the surface of the piece to be measured is 30-60 degrees;
the image acquisition module is arranged above the piece to be detected and is suitable for acquiring images of the piece to be detected;
the acquisition control module is connected with the illumination module and the image acquisition module, is suitable for controlling the plurality of light sources to be respectively turned on and off, and controls the image acquisition module to acquire images of the piece to be detected when the light sources are turned on so as to obtain a plurality of images of the piece to be detected under the illumination of different light sources;
the image processing module is connected with the image acquisition module, is suitable for synthesizing the plurality of images to obtain a synthesized image, and is also suitable for extracting edge characteristics of the synthesized image to realize the detection of the piece to be detected;
the image processing module is adapted to synthesize the plurality of images, including: and obtaining the gray level of each pixel point in each image, and taking the minimum gray level of the same pixel point in a plurality of images as the gray level of the corresponding pixel point in the synthesized image to realize the synthesis of the plurality of images.
2. The image inspection system of claim 1, wherein the light source includes a light emitting surface, and a maximum length of the light emitting surface on a side close to the device under test is greater than a maximum length of the device under test in a direction parallel to the light emitting surface.
3. The image inspection system of claim 1, wherein the light source is one or more of a bar light source, an arc light source, and a point light source.
4. The image inspection system of claim 1, wherein the image capture module is an industrial camera having a triggered photograph function.
5. The image detection system of claim 1, wherein the image processing module is an industrial personal computer or a computer processing system.
6. The image inspection system of claim 1, wherein the image processing module is further coupled to the acquisition control module and adapted to send instructions to the acquisition control module to trigger the acquisition control module to control the illumination module and the image acquisition module.
7. The image inspection system of claim 1, wherein the acquisition control module is a programmable logic controller and a relay.
8. The image inspection system of claim 1, wherein the piece under test has a concave-convex edge feature;
the number of the light sources in the lighting module is more than or equal to four, and the light sources are arranged around the piece to be tested.
9. The image inspection system of claim 8, wherein the plurality of light sources are evenly distributed around the piece to be detected.
10. The image inspection system of claim 8, wherein the illumination module includes four strip light sources, and the four strip light sources are arranged in a rectangle around the piece to be detected.
11. The image inspection system of claim 1, wherein the piece to be detected has a highlight surface;
the lighting module comprises at least two light sources, and when the light sources respectively illuminate the piece to be detected, highlight regions are formed at different positions on the highlight surface.
12. The image inspection system of claim 11, wherein the highlight surface is an arc surface;
the lighting module comprises two strip-shaped light sources which are parallel to each other and are respectively positioned above the outer sides of the two top ends of the arc surface.
13. An image detection apparatus, characterized by comprising:
an image inspection system according to any one of claims 1 to 12;
and the bracket is suitable for fixing the lighting module and the image acquisition module.
14. An image detection method is suitable for detecting a piece to be detected, and is characterized by comprising the following steps:
arranging an illumination module to illuminate the piece to be tested, wherein the illumination module comprises a plurality of light sources, and light emitted by the light sources is obliquely incident on the piece to be tested from different directions;
controlling the plurality of light sources to be respectively turned on and off, and acquiring images of the to-be-detected piece when the light sources illuminate the to-be-detected piece to obtain a plurality of images of the to-be-detected piece under illumination of different light sources;
synthesizing the plurality of images to obtain a synthesized image, and extracting edge features of the synthesized image to detect the piece to be detected;
the step of providing the lighting module comprises: installing the light source, so that the incident angle of the light emitted by the light source on the surface of the piece to be detected is 30-60 degrees;
the step of synthesizing the plurality of images comprises: and obtaining the gray scale of each pixel point in each image, and taking the minimum gray scale value of the same pixel point position in the plurality of images as the gray scale value of the corresponding pixel point position in the synthesized image to realize the synthesis of the edge characteristics of the plurality of images.
15. The image inspection method of claim 14, wherein the test object has a concave-convex edge feature;
the step of providing the lighting module comprises: providing at least four light sources; and installing the light sources to ensure that the light sources are uniformly distributed around the piece to be detected.
16. The image sensing method of claim 15, wherein the step of providing an illumination module comprises: providing four strip-shaped light sources; and installing the four strip-shaped light sources to enable the four strip-shaped light sources to surround the piece to be detected and be arranged in a rectangular shape.
17. The image inspection method of claim 14, wherein the piece to be detected has a highlight surface;
the step of providing the lighting module comprises: providing at least two light sources; and mounting the light sources so that highlight regions are formed at different positions on the highlight surface when the light sources respectively illuminate the piece to be detected.
18. The image inspection method of claim 17, wherein the highlight surface is an arc surface;
the step of providing the lighting module comprises: providing two strip-shaped light sources; and installing the strip-shaped light sources so that the two strip-shaped light sources are parallel to each other and are respectively positioned above the outer sides of the two top ends of the arc surface.