CN111566438A - Image acquisition method and system


Info

Publication number: CN111566438A
Application number: CN201880068586.1A
Authority: CN (China)
Prior art keywords: target, image, sub, dimensional, LED
Legal status: Granted, Active (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Other versions: CN111566438B (en)
Inventors: 王星泽, 何良雨, 舒远
Current assignee: Heren Technology Shenzhen Co., Ltd.
Original assignee: Heren Technology Shenzhen Co., Ltd.
Application filed by Heren Technology Shenzhen Co., Ltd.; application granted; published as CN111566438A and CN111566438B

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object

Abstract

An image acquisition method, the method comprising: a projection module projects initial grating stripes onto a measured surface (201) of a target object, and an imaging module receives the initial modulation grating stripes (202) formed after the initial grating stripes are reflected by the measured surface; the imaging module determines a reference three-dimensional image (203) of the measured surface from the initial modulation grating stripes; if the image quality of the reference three-dimensional image is lower than a preset image quality threshold, target grating stripes (204) are determined from the reference three-dimensional image; the micro LED array is controlled region-adaptively to finely adjust the grating stripes, and the target grating stripes are projected onto the measured surface (205); the target modulation grating stripes (206) formed after the target grating stripes are reflected by the measured surface are received; and a target three-dimensional image (207) of the measured surface is determined from the target modulation grating stripes. The method brightens dark areas in the detection scene and attenuates saturated areas, and can improve the accuracy of acquiring an image of the measured surface of the target object.

Description

Image acquisition method and system

Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image acquisition method and system.
Background
In modern industrial manufacturing, automated optical inspection (AOI) systems are often used to inspect appearance defects at high speed. Inspection is generally done in 2D (because it is easy to implement), i.e. with a light source plus a camera system. In recent years, however, with the rapid development of industrial technology, the demand for 3D inspection has been increasing. For example, in the electronics industry, the components of printed circuit boards (PCBs) keep shrinking while their complexity keeps growing, and for high-end PCB manufacturing (for example, in the smartphone and automotive industries), 2D inspection alone can hardly find defective or incorrectly mounted components; in other words, 3D inspection has become an industry trend.
The grating projection technique is widely applied to 3D topography measurement thanks to its simple measurement setup, high precision and high resolution. However, because the surface of the inspected object is complex, specular reflection and diffuse reflection may occur in different areas, and some areas may even receive insufficient light, so the grating stripes projected onto the measured surface cannot accurately reflect the real surface topography. For example, a highly reflective specular area may lose detail information of the object surface because the light intensity exceeds the sensing range of the camera and saturates the image gray levels, resulting in low accuracy when acquiring an image of the measured surface.
Disclosure of Invention
The application provides an image acquisition method and system, which can improve the accuracy of acquiring an image of a measured surface of a measured object.
A first aspect of an embodiment of the present application provides an image acquisition method, where the method includes:
projecting initial grating stripes to a measured surface of a target object;
receiving initial modulation grating stripes of the initial grating stripes after the initial grating stripes are reflected by the measured surface;
determining a reference three-dimensional image of the measured surface according to the initial modulation grating stripes;
acquiring the image quality of the reference three-dimensional image, and determining a target grating stripe according to the reference three-dimensional image if the image quality of the reference three-dimensional image is lower than a preset image quality threshold;
projecting the target grating stripe to the measured surface;
receiving target modulation grating stripes of the target grating stripes after the target grating stripes are reflected by the measured surface;
and determining a target three-dimensional image of the measured surface according to the target modulation grating stripes.
With reference to the first aspect of the embodiment of the present application, in a first possible implementation manner of the first aspect, the acquiring image quality of the reference three-dimensional image includes:
splitting the reference three-dimensional image according to the area of the measured surface to obtain a plurality of sub-reference three-dimensional images;
obtaining an image quality of each sub-reference three-dimensional image of the plurality of sub-reference three-dimensional images;
and taking the minimum value in the image qualities of the plurality of sub-reference three-dimensional images as the image quality of the reference three-dimensional image.
With reference to the first possible implementation manner of the first aspect of the embodiment of the present application, in a second possible implementation manner of the first aspect, the determining a target grating stripe according to the reference three-dimensional image includes:
obtaining the light intensity of the area corresponding to the sub-reference three-dimensional images in the measured surface according to the sub-reference three-dimensional images;
determining target light intensity of the regions corresponding to the multiple sub-reference three-dimensional images in the detected surface according to the light intensity of the regions corresponding to the multiple sub-reference three-dimensional images in the detected surface and a preset genetic algorithm;
determining the target luminous intensity of each LED module in the LED microarray generating the target light intensity according to a preset light intensity calculation formula;
adjusting the luminous intensity of each LED module in the LED microarray to the target luminous intensity of each LED module to obtain the target grating stripe.
With reference to the second possible implementation manner of the first aspect of the embodiment of the present application, in a third possible implementation manner of the first aspect, the LED micro array is a square array composed of N × M LED modules, and the preset light intensity calculation formula includes:
when N and M are both odd, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically:

$$E(x,y,z) = I_{LED}\, z^{m} \sum_{i=-(N-1)/2}^{(N-1)/2} \; \sum_{j=-(M-1)/2}^{(M-1)/2} \left[ (x - id)^{2} + (y - jd)^{2} + z^{2} \right]^{-(m+2)/2}$$

when N and M are both even, the light intensity at the point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically:

$$E(x,y,z) = I_{LED}\, z^{m} \sum_{i=1}^{N} \sum_{j=1}^{M} \left[ \left(x - \tfrac{(2i-N-1)d}{2}\right)^{2} + \left(y - \tfrac{(2j-M-1)d}{2}\right)^{2} + z^{2} \right]^{-(m+2)/2}$$

wherein $I_{LED} = S_{LED} L_{LED}$ is the luminous intensity of the LED micro array, $S_{LED}$ is the light-emitting area of the LED micro array, $L_{LED}$ is the radiance of the LED micro array, d is the distance between two adjacent LED modules, x, y and z are real coordinates, i and j index the LED modules, and m can be calculated by the following formula:

$$m = \frac{-\ln 2}{\ln\left(\cos\theta_{1/2}\right)}$$

wherein $\theta_{1/2}$ is the half-intensity viewing angle, i.e. the viewing angle at which the luminous intensity falls to half of its value at 0°.
With reference to the first possible implementation manner of the first aspect to the third possible implementation manner of the first aspect in the embodiment of the present application, in a fourth possible implementation manner of the first aspect, the method further includes:
if the image quality of the target three-dimensional image is lower than the preset image quality threshold, splitting the target three-dimensional image according to the region of the detected surface to obtain a plurality of sub-target three-dimensional images, wherein the sub-target three-dimensional images correspond to the sub-reference three-dimensional images one by one;
extracting at least one first reference image from the plurality of sub-target three-dimensional images, wherein a first reference image is a sub-target three-dimensional image whose image quality is lower than the preset image quality threshold, and extracting at least one second reference image from the plurality of sub-reference three-dimensional images, wherein a second reference image is the image corresponding to a first reference image among the plurality of sub-reference three-dimensional images;
fusing the at least one first reference image and the at least one second reference image to obtain at least one fused sub-target three-dimensional image;
and combining the at least one fused sub-target three-dimensional image with a third reference image to obtain a real three-dimensional image of the measured surface, wherein the third reference image is an image except the first reference image in the plurality of sub-target three-dimensional images.
A second aspect of embodiments of the present application provides an image acquisition system comprising a projection module, a brightness adjustment module, and an imaging module, wherein,
the projection module is used for projecting the initial grating stripes to a measured surface of a target object and projecting the target grating stripes to the measured surface;
the imaging module is used for receiving initial modulation grating stripes after the initial grating stripes are reflected by the measured surface, determining a reference three-dimensional image of the measured surface according to the initial modulation grating stripes, receiving target modulation grating stripes after the target grating stripes are reflected by the measured surface, and determining a target three-dimensional image of the measured surface according to the target modulation grating stripes;
the brightness adjusting module is used for obtaining the image quality of the reference three-dimensional image, and if the image quality of the reference three-dimensional image is lower than a preset image quality threshold, determining a target grating stripe according to the reference three-dimensional image.
A third aspect of embodiments of the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps as described in the first aspect of embodiments of the present application.
The embodiment of the application has at least the following beneficial effects:
according to the embodiment of the application, the initial grating stripe is projected to the measured surface of the target object, the initial modulation grating stripe after the initial grating stripe is reflected by the measured surface is received, the reference three-dimensional image of the measured surface is determined according to the initial modulation grating stripe, the image quality of the reference three-dimensional image is obtained, if the image quality of the reference three-dimensional image is lower than the preset image quality threshold value, the target grating stripe is determined according to the reference three-dimensional image, the target three-dimensional image of the measured surface is determined according to the target grating stripe, compared with the prior art, when the measured surface with the complex detected object is detected, the accuracy of obtaining the image of the measured surface is lower, the image quality of the reference three-dimensional image of the measured surface is judged, and when the image quality is lower than the preset image quality threshold value, and obtaining a target grating stripe according to the reference three-dimensional image, and obtaining a target three-dimensional image of the measured surface through the grating stripe, so that the accuracy of obtaining the three-dimensional image of the measured surface can be improved to a certain extent.
Drawings
In order to illustrate the embodiments of the present application or the technical solutions in the prior art more clearly, the drawings used in describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those skilled in the art, other drawings can be derived from them without creative effort.
Fig. 1 is a schematic structural diagram of an image acquisition system according to an embodiment of the present application;
fig. 2A is a schematic flowchart of an image acquisition method according to an embodiment of the present application;
FIG. 2B is a schematic diagram of a projection module according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart diagram illustrating another image acquisition method according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an image acquisition system according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In order to better understand the image acquisition method provided in the embodiments of the present application, the image acquisition system that uses it is briefly described first. As shown in fig. 1, the image acquisition system includes a projection module 101, an imaging module 102 and a region-adaptive brightness adjustment module 103. The projection module 101 includes a micro LED array, which may also be referred to as an LED micro array: a two-dimensional, periodic, high-density array of micro-sized LED chips integrated on the same substrate. It has the advantages of small pixel size, a flexible and compact structure, high resolution and a controllable period, and the luminance of each LED chip in the micro LED array can be controlled independently, region by region. The projection module 101 projects initial grating stripes onto the measured surface of a target object 104, and the measured surface reflects them to form initial modulation grating stripes. The imaging module 102 receives the initial modulation grating stripes, determines a reference three-dimensional image of the measured surface of the target object 104 from them, and computes the image quality of that reference three-dimensional image. When the image quality of the reference three-dimensional image is lower than a preset image quality threshold, the region-adaptive brightness adjustment module 103 determines target grating stripes from the reference three-dimensional image; the target grating stripes can be formed by the projection module 101 adjusting the brightness of the micro LED array. The projection module 101 projects the target grating stripes onto the measured surface of the target object 104, the imaging module 102 receives the target modulation grating stripes formed by reflection from the measured surface, and the imaging module 102 obtains the target three-dimensional image of the measured surface from them. In the existing scheme, when the measured surface of the inspected object is complex, the accuracy of acquiring its image is low. Here, by judging the image quality of the reference three-dimensional image, obtaining target grating stripes from it when that quality is below the preset threshold, and obtaining the target three-dimensional image through those grating stripes, the accuracy of acquiring the three-dimensional image of the measured surface can be improved to a certain extent.
Referring to fig. 2A, fig. 2A is a schematic flowchart of an image acquisition method according to an embodiment of the present disclosure. As shown in fig. 2A, the image acquisition method is applied to an image acquisition system that includes a projection module, an imaging module and a region-adaptive brightness adjustment module, and the method includes steps 201 to 207, as follows:
201. the projection module projects the initial grating stripes onto the measured surface of the target object.
Optionally, when the image acquisition system projects sinusoidal grating stripes through projection modules, a plurality of projection modules may project them simultaneously. For example, referring to fig. 2B, four projection modules 21 are provided; they lie on the same circle 22 in a common plane, that plane is parallel to the detection plane 23, the central angle between two adjacent projection modules is 90 degrees, and the sinusoidal grating stripes are projected onto the measured surface of the target object. Arranging the projection modules in this way largely eliminates the shadowing problem of a single light source and improves the accuracy of three-dimensional image acquisition.
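The four-projector layout just described can be computed directly; the radius and height below are arbitrary illustrative values, not dimensions from the patent:

```python
import math

def projector_positions(radius, height, count=4):
    """Place `count` projectors evenly on a circle of the given radius,
    in a plane parallel to the detection plane at the given height;
    for count=4 the central angle between neighbours is 90 degrees."""
    step = 2 * math.pi / count
    return [(radius * math.cos(k * step),
             radius * math.sin(k * step),
             height) for k in range(count)]

positions = projector_positions(radius=100.0, height=150.0)
```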
Alternatively, the initial grating stripes may be sinusoidal grating stripes.
202. And the imaging module receives the initial modulation grating stripe after the initial grating stripe is reflected by the measured surface.
Alternatively, as shown in fig. 2B, the imaging module 24 may be placed at the center of the circle on which the projection modules lie, or on a straight line 25 that passes through that center and is perpendicular to the circle's plane, either above or below the circle; "above" here means the side on which the distance to the plane of the target object is greater than the distance from any point on the circle to that plane.
Optionally, another arrangement swaps the imaging module and the projection module: the imaging module takes the positions of the projection modules, and the projection module is placed at the imaging module's position. The projection module is then mounted vertically, directly above the detection platform, so that the whole 3D field of view is well illuminated and unnecessary shadows are avoided, while the cameras sit at different angles around the periphery and detect the reflected grating stripes in real time.
Alternatively, a modulation grating stripe may be understood as a grating stripe carrying information about the shape of the measured surface of the target object.
203. And the imaging module determines a reference three-dimensional image of the measured surface according to the initial modulation grating stripes.
Optionally, the reference three-dimensional image of the measured surface of the target object may be determined from the initial modulation grating stripes by a phase-shift analysis method, which can be understood as follows: phase-shift analysis is performed on the initial modulation grating stripes to obtain surface information of the target object, and the reference three-dimensional image is drawn from that surface information.
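The patent does not spell out the phase-shift analysis itself; the sketch below uses the classical four-step phase-shifting formula (an assumed, common choice in fringe projection, not necessarily the patent's exact method) to recover the wrapped phase from four fringe images captured with 90° phase offsets:

```python
import numpy as np

def wrapped_phase(i0, i1, i2, i3):
    """Four-step phase shifting: i0..i3 are fringe images captured with
    phase offsets of 0, 90, 180 and 270 degrees. The wrapped phase in
    (-pi, pi] encodes surface height after unwrapping and calibration."""
    return np.arctan2(i3 - i1, i0 - i2)

# Synthetic check: fringes built from a known phase map
phi = np.linspace(-1.0, 1.0, 5)
frames = [np.cos(phi + k * np.pi / 2) for k in range(4)]
recovered = wrapped_phase(*frames)
```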
204. And the regional self-adaptive brightness adjustment module acquires the image quality of the reference three-dimensional image, and determines a target grating stripe according to the reference three-dimensional image if the image quality of the reference three-dimensional image is lower than a preset image quality threshold.
Optionally, the detected surface of the target object includes a plurality of regions, and one possible method for obtaining the image quality of the reference three-dimensional image includes steps a1-A3, as follows:
a1, splitting the reference three-dimensional image according to the area of the measured surface to obtain a plurality of sub-reference three-dimensional images;
the plurality of regions included in the measured surface may be a plurality of regions obtained by equally dividing the measured surface, for example, the measured surface is equally divided into 4 regions in an area manner to obtain 4 regions of the measured surface, or the regions may be divided according to the profile of the measured surface, the profile of the measured surface is divided according to the convex portion and the concave portion of the profile, each convex portion is used as a separate region, each concave portion is used as a separate region, or the regions may be divided according to the stripe of the measured surface, and if the measured surface has a plurality of stripes, the region between each two stripes is used as a separate region to obtain a plurality of regions.
Optionally, after the reference three-dimensional image is divided according to the region of the measured surface, a plurality of sub-reference three-dimensional images can be obtained.
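As one hypothetical realization of the equal-area division described above, the reference image can be cut into an evenly spaced grid of sub-images:

```python
import numpy as np

def split_into_regions(image, rows, cols):
    """Split a 2-D image into rows x cols equal sub-images.
    Assumes the image height/width are divisible by rows/cols."""
    h, w = image.shape
    rh, cw = h // rows, w // cols
    return [image[r * rh:(r + 1) * rh, c * cw:(c + 1) * cw]
            for r in range(rows) for c in range(cols)]

img = np.arange(16).reshape(4, 4)
subs = split_into_regions(img, 2, 2)  # four 2x2 sub-images
```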
A2, acquiring the image quality of each sub-reference three-dimensional image in the plurality of sub-reference three-dimensional images;
the image quality may include a gray value of the image, and when the image quality is the gray value, the gray value may reflect a brightness of the image, so the gray value may be used as a specific data of the image quality. The image quality may also include the saturation of the image, and since the image becomes unclear when the saturation of the image is high, the saturation may be used as a specific data of the image quality. Of course, the image quality may also include both gray value and saturation.
Optionally, when the image quality is the gray value of the image, the preset image threshold is a preset gray threshold, and the image quality of the multiple sub-reference three-dimensional images may be obtained as follows: acquire the gray value of each pixel of each sub-reference three-dimensional image; extract the target gray values of each sub-reference three-dimensional image, i.e. the gray values lower than the preset gray threshold; and take the average of the target gray values as the image quality of that sub-reference three-dimensional image.
Optionally, when the image quality is the saturation of the image, the preset image threshold is a preset saturation threshold, and the image quality of the multiple sub-reference three-dimensional images may be obtained as follows: acquire the color saturation of each pixel of each sub-reference three-dimensional image; extract the target saturations of each sub-reference three-dimensional image, i.e. the saturations higher than the preset saturation threshold; and take the average of the target saturations as the image quality of that sub-reference three-dimensional image.
The preset saturation threshold and the preset gray value threshold may be set according to empirical values or historical data.
A3, taking the minimum value in the image quality of the plurality of sub-reference three-dimensional images as the image quality of the reference three-dimensional image.
Optionally, when the image quality is the gray value, the minimum of the gray-value qualities of the sub-reference three-dimensional images is used as the image quality of the reference three-dimensional image; when the image quality is the saturation, the highest saturation among the sub-reference three-dimensional images is used as the image quality of the reference three-dimensional image.
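Steps A1-A3 for the gray-value case can be sketched as follows; the threshold of 128 and the fallback score for regions with no below-threshold pixel are illustrative assumptions, not values from the patent:

```python
import numpy as np

def sub_image_quality(gray, threshold=128):
    """Step A2 (gray-value variant): the quality of one sub-reference
    image is the mean of its gray values that fall below the preset
    gray threshold; a region with no below-threshold pixel is scored
    at the threshold itself (an assumed fallback)."""
    below = gray[gray < threshold]
    return float(below.mean()) if below.size else float(threshold)

def reference_image_quality(sub_images, threshold=128):
    """Step A3: the reference image takes the worst (minimum)
    sub-image quality."""
    return min(sub_image_quality(s, threshold) for s in sub_images)

quality = reference_image_quality(
    [np.array([[100, 200], [50, 250]]), np.array([[90, 240]])])
```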
Optionally, a possible method for determining a target grating stripe according to a reference three-dimensional image includes steps B1-B4, as follows:
b1, obtaining the light intensity of the regions of the measured surface corresponding to the multiple sub-reference three-dimensional images according to the multiple sub-reference three-dimensional images;
the light intensity of the area corresponding to the sub-reference three-dimensional images in the measured surface can be understood as the light intensity on the measured surface after the initial grating stripes are irradiated on the measured surface, and the light intensity can be understood as the illumination intensity. The light intensity of the detected surface obtained by referring to the three-dimensional image can be obtained according to the gray value or the saturation of the reference three-dimensional image. The method for obtaining the light intensity of the region of the measured surface through the gray value can be as follows: and determining the light intensity of the area of the measured surface according to the mapping relation between the preset gray value and the light intensity. The mapping relationship between the gray value and the light intensity may be: since the gray value is a reflection of the light intensity, the light intensity can be obtained from the gray value by means of a histogram during image processing.
B2, determining the target light intensity of the regions corresponding to the multiple sub-reference three-dimensional images in the detected surface according to the light intensity of the regions corresponding to the multiple sub-reference three-dimensional images in the detected surface and a preset genetic algorithm;
Alternatively, the preset genetic algorithm may be a parallel genetic algorithm encoded by gene blocks, or the like. The target light intensity can also be obtained through a target light intensity acquisition model, obtained by applying machine learning to sample data. The machine learning model is a supervised learning model, such as a weight model in an artificial neural network. One way to establish the target light intensity acquisition model is as follows: first extract features from the samples to obtain a feature set; then input the feature set into a training model, which learns according to an algorithm such as gradient descent, Newton's method or the conjugate gradient method; and finally obtain the target light intensity acquisition model. By learning from a large number of samples, the target light intensity acquisition model can be determined fairly accurately, which improves the accuracy of target light intensity acquisition.
Alternatively, the target light intensity of the regions of the measured surface corresponding to the plurality of sub-reference three-dimensional images may be the light intensity of the region illuminated by each LED chip in the micro LED array that forms the initial grating stripes.
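The patent leaves the genetic algorithm's operators unspecified; the following toy sketch (population size, fitness function, mutation scheme, and the mid-range intensity target are all illustrative assumptions) evolves one intensity scale factor per region so that each region's measured light intensity approaches a mid-range value, brightening dim regions and attenuating saturated ones:

```python
import random

def evolve_intensities(measured, ideal=0.5, pop=20, gens=50, seed=0):
    """Toy GA: each chromosome holds one scale factor per region;
    fitness is the (negated) squared error between scaled intensity
    and the ideal mid-range level. Returns the best scale vector."""
    rng = random.Random(seed)
    n = len(measured)
    population = [[rng.uniform(0.1, 2.0) for _ in range(n)]
                  for _ in range(pop)]

    def fitness(chrom):
        return -sum((m * s - ideal) ** 2 for m, s in zip(measured, chrom))

    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop // 2]          # elitist selection
        children = []
        for _ in range(pop - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n) if n > 1 else 0
            child = a[:cut] + b[cut:]            # one-point crossover
            idx = rng.randrange(n)               # mutate one gene
            child[idx] = min(2.0, max(0.1, child[idx] + rng.gauss(0, 0.1)))
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

# dim, saturated, and well-exposed example regions
best = evolve_intensities([0.2, 0.9, 0.5])
```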
B3, determining the target luminous intensity of each LED module in the LED microarray generating the target light intensity according to a preset light intensity calculation formula;
optionally, the LED micro array is a square array composed of N × M LED modules, and the preset light intensity calculation formula may be:
when N and M are odd numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
E(x, y, z) = z^m · I_LED · Σ_{i=-(N-1)/2}^{(N-1)/2} Σ_{j=-(M-1)/2}^{(M-1)/2} [(x - i·d)² + (y - j·d)² + z²]^(-(m+2)/2)
when N and M are even numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
E(x, y, z) = z^m · I_LED · Σ_{i=1-N/2}^{N/2} Σ_{j=1-M/2}^{M/2} [(x - (2i-1)·d/2)² + (y - (2j-1)·d/2)² + z²]^(-(m+2)/2)
wherein I_LED = S_LED·L_LED is the luminous intensity of the LED micro-array, S_LED is the light-emitting area of the LED micro-array, L_LED is the radiance of the LED micro-array, d is the distance between two adjacent LED modules, x, y and z are real numbers, i and j are the row and column summation indices, and m can be calculated by the following formula:
m = -ln 2 / ln(cos θ_(1/2))
wherein θ_(1/2) is the view angle at which the luminous intensity falls to half of its value at 0°.
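The light-intensity formula above can be evaluated numerically. The sketch below follows the standard centered LED-array irradiance model, placing module offsets at integer multiples of d when N and M are odd and at odd multiples of d/2 when they are even; the exact summation form is a reconstruction from the symbols defined here (I_LED, d, m, θ_(1/2)) and should be treated as an assumption:

```python
import math

def lambertian_order(theta_half_deg):
    # m = -ln 2 / ln(cos(theta_1/2)); theta_1/2 is the view angle at which
    # the luminous intensity falls to half of its value at 0 degrees.
    return -math.log(2.0) / math.log(math.cos(math.radians(theta_half_deg)))

def array_intensity(x, y, z, N, M, d, I_led, theta_half_deg):
    # E(x, y, z) = z^m * I_LED * sum over all modules of
    #   [(x - x_i)^2 + (y - y_j)^2 + z^2]^(-(m + 2) / 2)
    m = lambertian_order(theta_half_deg)

    def offsets(n):
        # Center the n modules on the optical axis with pitch d.
        if n % 2 == 1:
            return [(i - (n - 1) / 2) * d for i in range(n)]
        return [((2 * i - n + 1) / 2) * d for i in range(n)]

    total = 0.0
    for xi in offsets(N):
        for yj in offsets(M):
            total += ((x - xi) ** 2 + (y - yj) ** 2 + z ** 2) ** (-(m + 2) / 2)
    return (z ** m) * I_led * total
```

For θ_(1/2) = 60° the order m equals 1 (a Lambertian emitter), and a single on-axis module reduces to the inverse-square law, which gives a quick sanity check of the implementation.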
And B4, adjusting the luminous intensity of each LED module in the LED microarray to the target luminous intensity of each LED module to obtain the target grating stripe.
205. And the projection module projects the target grating stripe to the measured surface.
Optionally, the projection module projects the target grating stripe onto the measured surface of the target object, which is specifically executed with reference to step 201.
206. And the imaging module receives the target modulation grating stripes after the target grating stripes are reflected by the detected surface.
Optionally, the receiving, by the imaging module, the target modulation grating stripe after the target grating stripe is reflected by the measured surface of the target object may be specifically executed with reference to step 202.
207. And the imaging module determines a target three-dimensional image of the measured surface according to the target modulation grating stripes.
Optionally, the imaging module determines a target three-dimensional image of the measured surface of the target object according to the target modulation grating stripes, and the determining may be performed with reference to the relevant portion in step 203.
Optionally, after the target three-dimensional image is obtained, step 204 may be performed again to analyze the image quality of the target three-dimensional image, repeating until the image quality of the obtained target three-dimensional image is higher than the preset image quality threshold.
In one possible example, the image obtaining method may further include the following steps, specifically including steps C1-C4, specifically as follows:
C1, if the image quality of the target three-dimensional image is lower than the preset image quality threshold, splitting the target three-dimensional image according to the regions of the measured surface to obtain a plurality of sub-target three-dimensional images, wherein the sub-target three-dimensional images correspond one-to-one to the sub-reference three-dimensional images;
C2, extracting at least one first reference image from the plurality of sub-target three-dimensional images, wherein the first reference image is a sub-target three-dimensional image whose image quality is lower than the preset image quality threshold, and extracting at least one second reference image from the plurality of sub-reference three-dimensional images, wherein the second reference image is the image corresponding to the first reference image among the plurality of sub-reference three-dimensional images;
The preset image quality threshold may be determined according to a specific type of image quality, such as a gray value or a saturation, and is set with reference to the setting manner in step A2.
C3, fusing the at least one first reference image and the at least one second reference image to obtain at least one fused sub-target three-dimensional image;
When the at least one first reference image and the at least one second reference image are fused, the gray values of corresponding points in the two images are superposed to obtain the gray value of the fused sub-target three-dimensional image at each pixel, and the image is then re-synthesized to obtain the fused sub-target three-dimensional image.
And C4, combining the at least one fused sub-target three-dimensional image with a third reference image to obtain a real three-dimensional image of the measured surface, wherein the third reference image is the set of images other than the first reference image among the plurality of sub-target three-dimensional images.
When the sub-target three-dimensional images are combined, they may be combined by reversing the steps used when splitting, thereby obtaining a real three-dimensional image of the measured surface of the target object.
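Steps C1-C4 can be sketched end to end as below, with images treated as 2-D lists of gray values. The mean-gray quality metric and the clipping of superposed gray values to the 8-bit range are assumptions added for the example; the split, the superposition of corresponding points, and the reverse-order recombination follow the description above:

```python
def split(image, rows, cols):
    # C1: split the image into rows*cols sub-images (assumes even divisibility).
    h, w = len(image), len(image[0])
    th, tw = h // rows, w // cols
    return [[[row[c0:c0 + tw] for row in image[r0:r0 + th]]
             for c0 in range(0, w, tw)]
            for r0 in range(0, h, th)]

def quality(tile):
    # Assumed per-tile quality metric: mean gray value.
    vals = [v for row in tile for v in row]
    return sum(vals) / len(vals)

def fuse(target_tile, reference_tile):
    # C3: superpose gray values of corresponding points, clipped to 8 bits.
    return [[min(255, a + b) for a, b in zip(ta, tb)]
            for ta, tb in zip(target_tile, reference_tile)]

def combine(tiles):
    # C4: reverse of split -- stitch the tiles back into one image.
    out = []
    for tile_row in tiles:
        for r in range(len(tile_row[0])):
            out.append([v for tile in tile_row for v in tile[r]])
    return out

def repair(target, reference, rows, cols, threshold):
    # C2: tiles below the threshold are the "first reference images"; their
    # counterparts in the reference image are the "second reference images".
    t_tiles = split(target, rows, cols)
    r_tiles = split(reference, rows, cols)
    for i in range(rows):
        for j in range(cols):
            if quality(t_tiles[i][j]) < threshold:
                t_tiles[i][j] = fuse(t_tiles[i][j], r_tiles[i][j])
    return combine(t_tiles)
```

Because `combine` simply reverses `split`, the untouched tiles (the "third reference image") pass through unchanged while only the low-quality tiles are replaced by their fused versions.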
Referring to fig. 3, fig. 3 is a schematic flowchart of another image acquisition method according to an embodiment of the present application. As shown in fig. 3, the image acquisition method includes steps 301-309, which are as follows:
301. projecting initial grating stripes to a measured surface of a target object;
302. receiving initial modulation grating stripes of the initial grating stripes after the initial grating stripes are reflected by the measured surface;
303. determining a reference three-dimensional image of the measured surface according to the initial modulation grating stripes;
304. splitting the reference three-dimensional image according to the area of the measured surface to obtain a plurality of sub-reference three-dimensional images;
305. obtaining an image quality of each sub-reference three-dimensional image of the plurality of sub-reference three-dimensional images;
306. taking the minimum value in the image qualities of the sub-reference three-dimensional images as the image quality of the reference three-dimensional image, and if the image quality of the reference three-dimensional image is lower than a preset image quality threshold, determining a target grating stripe according to the reference three-dimensional image;
307. projecting the target grating stripe to the measured surface;
308. receiving target modulation grating stripes of the target grating stripes after the target grating stripes are reflected by the measured surface;
309. and determining a target three-dimensional image of the measured surface according to the target modulation grating stripes.
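The closed loop of steps 301-309 can be sketched as follows; `project`, `capture`, `reconstruct`, `image_quality`, and `refine` are hypothetical callables standing in for the projection module, the imaging module, and the brightness adjustment module, and the round limit is an added safety bound not stated in the description:

```python
def acquire_target_image(project, capture, reconstruct, image_quality,
                         refine, threshold, initial_stripes, max_rounds=5):
    # Repeat projection and reconstruction until the three-dimensional image
    # reaches the preset quality threshold or the round limit is hit.
    stripes = initial_stripes
    image = None
    for _ in range(max_rounds):
        project(stripes)                       # steps 301 / 307
        modulated = capture()                  # steps 302 / 308
        image = reconstruct(modulated)         # steps 303 / 309
        if image_quality(image) >= threshold:
            break
        stripes = refine(image)                # steps 304-306: target stripes
    return image
```

In a real system the refinement step would recompute the target grating stripes from the reference three-dimensional image (for example via the genetic algorithm and the LED light-intensity formula described earlier).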
In this example, the reference three-dimensional image is split into a plurality of sub-reference three-dimensional images, and the image quality of the reference three-dimensional image is determined according to the image quality of each sub-reference three-dimensional image, so that the accuracy of obtaining the image quality of the reference three-dimensional image can be improved to a certain extent.
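The split-and-minimum rule described above can be sketched as follows, treating an image as a 2-D list of gray values; using the mean gray of each sub-image as its quality is an assumed metric (the description allows gray value, saturation, and the like):

```python
def reference_image_quality(image, rows, cols):
    # Quality of the whole image = minimum quality over its rows*cols tiles
    # (assumes the image divides evenly into the tile grid).
    h, w = len(image), len(image[0])
    th, tw = h // rows, w // cols
    qualities = []
    for r0 in range(0, h, th):
        for c0 in range(0, w, tw):
            vals = [image[r][c]
                    for r in range(r0, r0 + th)
                    for c in range(c0, c0 + tw)]
            qualities.append(sum(vals) / len(vals))  # assumed metric: mean gray
    return min(qualities)
```

Taking the minimum rather than the average makes a single under-exposed region enough to trigger re-projection, which is why this split improves the accuracy of the quality judgment.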
In accordance with the foregoing embodiments, please refer to fig. 4, which is a schematic structural diagram of a terminal provided in the present application. As shown in the figure, the terminal includes a processor 42, an input device 41, an output device 43, and a memory 44, which are connected to each other. The memory 44 is used to store a computer program comprising program instructions, and the processor 42 is configured to call the program instructions, which include instructions for performing the following steps:
projecting initial grating stripes to a measured surface of a target object;
receiving initial modulation grating stripes of the initial grating stripes after the initial grating stripes are reflected by the measured surface;
determining a reference three-dimensional image of the measured surface according to the initial modulation grating stripes;
acquiring the image quality of the reference three-dimensional image, and determining a target grating stripe according to the reference three-dimensional image if the image quality of the reference three-dimensional image is lower than a preset image quality threshold;
projecting the target grating stripe to the measured surface;
receiving target modulation grating stripes of the target grating stripes after the target grating stripes are reflected by the measured surface;
and determining a target three-dimensional image of the measured surface according to the target modulation grating stripes.
According to the embodiment of the application, initial grating stripes are projected onto the measured surface of the target object; the initial modulation grating stripes reflected by the measured surface are received; a reference three-dimensional image of the measured surface is determined from the initial modulation grating stripes; and the image quality of the reference three-dimensional image is obtained. If that image quality is lower than the preset image quality threshold, target grating stripes are determined according to the reference three-dimensional image, and the target three-dimensional image of the measured surface is determined through the target grating stripes. In the prior art, when the measured surface of the detected object is complex, the accuracy of the acquired image of the measured surface is low. Here, by judging the image quality of the reference three-dimensional image and, when that quality falls below the preset threshold, deriving target grating stripes from the reference three-dimensional image and acquiring the target three-dimensional image through those stripes, the accuracy of acquiring the three-dimensional image of the measured surface can be improved to a certain extent.
In accordance with the above, referring to fig. 5, fig. 5 is a schematic structural diagram of an image capturing system according to an embodiment of the present application, the system includes a projection module 501, a brightness adjustment module 502, and an imaging module 503, wherein,
the projection module 501 is configured to project initial grating stripes onto a measured surface of a target object, and project target grating stripes onto the measured surface;
the imaging module 503 is configured to receive an initial modulation grating stripe after the initial grating stripe is reflected by the measured surface, determine a reference three-dimensional image of the measured surface according to the initial modulation grating stripe, receive a target modulation grating stripe after the target grating stripe is reflected by the measured surface, and determine a target three-dimensional image of the measured surface according to the target modulation grating stripe;
the brightness adjustment module 502 is configured to obtain image quality of the reference three-dimensional image, and determine a target grating stripe according to the reference three-dimensional image if the image quality of the reference three-dimensional image is lower than a preset image quality threshold.
Optionally, the measured surface includes a plurality of regions, and in the aspect of obtaining the image quality of the reference three-dimensional image, the brightness adjustment module 502 is specifically configured to:
splitting the reference three-dimensional image according to the area of the measured surface to obtain a plurality of sub-reference three-dimensional images;
obtaining an image quality of each sub-reference three-dimensional image of the plurality of sub-reference three-dimensional images;
and taking the minimum value in the image qualities of the plurality of sub-reference three-dimensional images as the image quality of the reference three-dimensional image.
Optionally, in the aspect of determining the target grating stripe according to the reference three-dimensional image, the brightness adjustment module 502 is further specifically configured to:
obtaining the light intensity of the area corresponding to the sub-reference three-dimensional images in the measured surface according to the sub-reference three-dimensional images;
determining the target light intensity of the regions of the measured surface corresponding to the multiple sub-reference three-dimensional images according to the light intensity of those regions and a preset genetic algorithm;
determining the target luminous intensity of each LED module in the LED microarray generating the target light intensity according to a preset light intensity calculation formula;
and adjusting the luminous intensity of each LED module in the LED microarray to be the target luminous intensity of each LED module to obtain the target grating stripe.
Optionally, the LED micro array is a square array composed of N × M LED modules, and the preset light intensity calculation formula includes:
when N and M are odd numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
E(x, y, z) = z^m · I_LED · Σ_{i=-(N-1)/2}^{(N-1)/2} Σ_{j=-(M-1)/2}^{(M-1)/2} [(x - i·d)² + (y - j·d)² + z²]^(-(m+2)/2)
when N and M are even numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
E(x, y, z) = z^m · I_LED · Σ_{i=1-N/2}^{N/2} Σ_{j=1-M/2}^{M/2} [(x - (2i-1)·d/2)² + (y - (2j-1)·d/2)² + z²]^(-(m+2)/2)
wherein I_LED = S_LED·L_LED is the luminous intensity of the LED micro-array, S_LED is the light-emitting area of the LED micro-array, L_LED is the radiance of the LED micro-array, d is the distance between two adjacent LED modules, x, y and z are real numbers, i and j are the row and column summation indices, m is a constant greater than 1, and m can be calculated by the following formula:
m = -ln 2 / ln(cos θ_(1/2))
wherein θ_(1/2) is the view angle at which the luminous intensity falls to half of its value at 0°.
Optionally, the image acquiring system is further specifically configured to:
if the image quality of the target three-dimensional image is lower than the preset image quality threshold, splitting the target three-dimensional image according to the regions of the measured surface to obtain a plurality of sub-target three-dimensional images, wherein the sub-target three-dimensional images correspond one-to-one to the sub-reference three-dimensional images;
extracting at least one first reference image from the plurality of sub-target three-dimensional images, wherein the first reference image is a sub-target three-dimensional image whose image quality is lower than the preset image quality threshold, and extracting at least one second reference image from the plurality of sub-reference three-dimensional images, wherein the second reference image is the image corresponding to the first reference image among the plurality of sub-reference three-dimensional images;
fusing the at least one first reference image and the at least one second reference image to obtain at least one fused sub-target three-dimensional image;
and combining the at least one fused sub-target three-dimensional image with a third reference image to obtain a real three-dimensional image of the measured surface, wherein the third reference image is an image except the first reference image in the plurality of sub-target three-dimensional images.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the image acquisition methods as described in the above method embodiments.
Embodiments of the present application also provide a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program causes a computer to execute part or all of the steps of any one of the image acquisition methods as described in the above method embodiments.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

  1. An image acquisition method, characterized in that the method comprises:
    projecting initial grating stripes to a measured surface of a target object;
    receiving initial modulation grating stripes of the initial grating stripes after the initial grating stripes are reflected by the measured surface;
    determining a reference three-dimensional image of the measured surface according to the initial modulation grating stripes;
    acquiring the image quality of the reference three-dimensional image, and determining a target grating stripe according to the reference three-dimensional image if the image quality of the reference three-dimensional image is lower than a preset image quality threshold;
    projecting the target grating stripe to the measured surface;
    receiving target modulation grating stripes of the target grating stripes after the target grating stripes are reflected by the measured surface;
    and determining a target three-dimensional image of the measured surface according to the target modulation grating stripes.
  2. The method of claim 1, wherein the surface under test comprises a plurality of regions, and wherein obtaining the image quality of the reference three-dimensional image comprises:
    splitting the reference three-dimensional image according to the area of the measured surface to obtain a plurality of sub-reference three-dimensional images;
    obtaining an image quality of each sub-reference three-dimensional image of the plurality of sub-reference three-dimensional images;
    and taking the minimum value in the image qualities of the plurality of sub-reference three-dimensional images as the image quality of the reference three-dimensional image.
  3. The method of claim 2, wherein determining a target grating stripe from the reference three-dimensional image comprises:
    obtaining the light intensity of the area corresponding to the sub-reference three-dimensional images in the measured surface according to the sub-reference three-dimensional images;
    determining the target light intensity of the regions of the measured surface corresponding to the multiple sub-reference three-dimensional images according to the light intensity of those regions and a preset genetic algorithm;
    determining the target luminous intensity of each LED module in the LED microarray generating the target light intensity according to a preset light intensity calculation formula;
    and adjusting the luminous intensity of each LED module in the LED microarray to be the target luminous intensity of each LED module to obtain the target grating stripe.
  4. The method of claim 3, wherein the LED micro array is a square array of N x M LED modules, and the predetermined light intensity calculation formula comprises:
    when N and M are odd numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
    E(x, y, z) = z^m · I_LED · Σ_{i=-(N-1)/2}^{(N-1)/2} Σ_{j=-(M-1)/2}^{(M-1)/2} [(x - i·d)² + (y - j·d)² + z²]^(-(m+2)/2)
    when N and M are even numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
    E(x, y, z) = z^m · I_LED · Σ_{i=1-N/2}^{N/2} Σ_{j=1-M/2}^{M/2} [(x - (2i-1)·d/2)² + (y - (2j-1)·d/2)² + z²]^(-(m+2)/2)
    wherein I_LED = S_LED·L_LED is the luminous intensity of the LED micro-array, S_LED is the light-emitting area of the LED micro-array, L_LED is the radiance of the LED micro-array, d is the distance between two adjacent LED modules, x, y and z are real numbers, i and j are the row and column summation indices, and m can be calculated by the following formula:
    m = -ln 2 / ln(cos θ_(1/2))
    wherein θ_(1/2) is the view angle at which the luminous intensity falls to half of its value at 0°.
  5. The method according to any one of claims 2 to 4, further comprising:
    if the image quality of the target three-dimensional image is lower than the preset image quality threshold, splitting the target three-dimensional image according to the regions of the measured surface to obtain a plurality of sub-target three-dimensional images, wherein the sub-target three-dimensional images correspond one-to-one to the sub-reference three-dimensional images;
    extracting at least one first reference image from the plurality of sub-target three-dimensional images, wherein the first reference image is a sub-target three-dimensional image whose image quality is lower than the preset image quality threshold, and extracting at least one second reference image from the plurality of sub-reference three-dimensional images, wherein the second reference image is the image corresponding to the first reference image among the plurality of sub-reference three-dimensional images;
    fusing the at least one first reference image and the at least one second reference image to obtain at least one fused sub-target three-dimensional image;
    and combining the at least one fused sub-target three-dimensional image with a third reference image to obtain a real three-dimensional image of the measured surface, wherein the third reference image is an image except the first reference image in the plurality of sub-target three-dimensional images.
  6. An image acquisition system comprising a projection module, a brightness adjustment module, and an imaging module, wherein,
    the projection module is used for projecting the initial grating stripes to a measured surface of a target object and projecting the target grating stripes to the measured surface;
    the imaging module is used for receiving initial modulation grating stripes after the initial grating stripes are reflected by the measured surface, determining a reference three-dimensional image of the measured surface according to the initial modulation grating stripes, receiving target modulation grating stripes after the target grating stripes are reflected by the measured surface, and determining a target three-dimensional image of the measured surface according to the target modulation grating stripes;
    the brightness adjusting module is used for obtaining the image quality of the reference three-dimensional image, and if the image quality of the reference three-dimensional image is lower than a preset image quality threshold, determining a target grating stripe according to the reference three-dimensional image.
  7. The system of claim 6, wherein the surface under test comprises a plurality of regions, and wherein the brightness adjustment module is specifically configured to:
    splitting the reference three-dimensional image according to the area of the measured surface to obtain a plurality of sub-reference three-dimensional images;
    obtaining an image quality of each sub-reference three-dimensional image of the plurality of sub-reference three-dimensional images;
    and taking the minimum value in the image qualities of the plurality of sub-reference three-dimensional images as the image quality of the reference three-dimensional image.
  8. The system of claim 7, wherein the brightness adjustment module is further specifically configured to:
    obtaining the light intensity of the area corresponding to the sub-reference three-dimensional images in the measured surface according to the sub-reference three-dimensional images;
    determining the target light intensity of the regions of the measured surface corresponding to the multiple sub-reference three-dimensional images according to the light intensity of those regions and a preset genetic algorithm;
    determining the target luminous intensity of each LED module in the LED microarray generating the target light intensity according to a preset light intensity calculation formula;
    and adjusting the luminous intensity of each LED module in the LED microarray to be the target luminous intensity of each LED module to obtain the target grating stripe.
  9. The system of claim 8, wherein the LED micro array is a square array of N x M LED modules, and the predetermined light intensity calculation formula comprises:
    when N and M are odd numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
    E(x, y, z) = z^m · I_LED · Σ_{i=-(N-1)/2}^{(N-1)/2} Σ_{j=-(M-1)/2}^{(M-1)/2} [(x - i·d)² + (y - j·d)² + z²]^(-(m+2)/2)
    when N and M are even numbers, the light intensity at a point P(x, y, z) on the measured surface of the target object may be represented as E(x, y, z), and the preset light intensity calculation formula is specifically as follows:
    E(x, y, z) = z^m · I_LED · Σ_{i=1-N/2}^{N/2} Σ_{j=1-M/2}^{M/2} [(x - (2i-1)·d/2)² + (y - (2j-1)·d/2)² + z²]^(-(m+2)/2)
    wherein I_LED = S_LED·L_LED is the luminous intensity of the LED micro-array, S_LED is the light-emitting area of the LED micro-array, L_LED is the radiance of the LED micro-array, d is the distance between two adjacent LED modules, x, y and z are real numbers, i and j are the row and column summation indices, and m can be calculated by the following formula:
    m = -ln 2 / ln(cos θ_(1/2))
    wherein θ_(1/2) is the view angle at which the luminous intensity falls to half of its value at 0°.
  10. The system of any one of claims 7 to 9, wherein the brightness adjustment module is further specifically configured to:
    if the image quality of the target three-dimensional image is lower than the preset image quality threshold, splitting the target three-dimensional image according to the regions of the measured surface to obtain a plurality of sub-target three-dimensional images, wherein the sub-target three-dimensional images correspond one-to-one to the sub-reference three-dimensional images;
    extracting at least one first reference image from the plurality of sub-target three-dimensional images, wherein the first reference image is a sub-target three-dimensional image whose image quality is lower than the preset image quality threshold, and extracting at least one second reference image from the plurality of sub-reference three-dimensional images, wherein the second reference image is the image corresponding to the first reference image among the plurality of sub-reference three-dimensional images;
    fusing the at least one first reference image and the at least one second reference image to obtain at least one fused sub-target three-dimensional image;
    and combining the at least one fused sub-target three-dimensional image with a third reference image to obtain a real three-dimensional image of the measured surface, wherein the third reference image is an image except the first reference image in the plurality of sub-target three-dimensional images.
CN201880068586.1A 2018-12-19 2018-12-19 Image acquisition method and system Active CN111566438B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/122173 WO2020124460A1 (en) 2018-12-19 2018-12-19 Image acquisition method and system

Publications (2)

Publication Number Publication Date
CN111566438A true CN111566438A (en) 2020-08-21
CN111566438B CN111566438B (en) 2022-03-25

Family

ID=71102555

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880068586.1A Active CN111566438B (en) 2018-12-19 2018-12-19 Image acquisition method and system

Country Status (2)

Country Link
CN (1) CN111566438B (en)
WO (1) WO2020124460A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116067306A (en) * 2023-03-07 2023-05-05 深圳明锐理想科技有限公司 Automatic dimming method, three-dimensional measuring method, device and system

Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU2003219433A1 (en) * 2003-04-25 2004-11-23 Ecole Polytechnique Federale De Lausanne (Epfl) Shape and deformation measurements of large objects by fringe projection
CN106595522B (en) * 2016-12-15 2018-11-09 Southeast University Error calibration method for a grating projection three-dimensional measurement system

Patent Citations (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040046966A1 (en) * 2000-06-07 2004-03-11 Hiroo Fujita Lattice pattern projector using liquid crystal lattice
CN101029820A (en) * 2006-01-26 2007-09-05 Koh Young Technology Inc. Method for measuring three-dimensional shape
CN201311277Y (en) * 2007-09-05 2009-09-16 The 711 Research Institute of China Shipbuilding Industry Corporation Variable-frequency projected grating line three-dimensional topography measuring instrument
CN102305601A (en) * 2011-05-18 2012-01-04 Tianjin University High-precision non-contact measurement method and device for the three-dimensional profile of optical free-form surfaces
CN103712576A (en) * 2013-12-20 2014-04-09 上海瑞立柯信息技术有限公司 Program-controlled grating projection device
JP2016142727A (en) * 2015-02-05 2016-08-08 Mitsubishi Heavy Industries, Ltd. Fringe pattern image acquisition device, fringe pattern image acquisition method, three-dimensional position specification device, three-dimensional position specification method, and program
CN104634278A (en) * 2015-03-03 2015-05-20 Hubei University of Automotive Technology Shadow moire measuring system with automatic compensation of fringe pattern contrast
CN105651203A (en) * 2016-03-16 2016-06-08 Guangdong University of Technology High-dynamic-range three-dimensional shape measurement method with adaptive fringe brightness
CN106091986A (en) * 2016-06-08 2016-11-09 Shaoguan University Three-dimensional measurement method applicable to glossy surfaces
CN106289109A (en) * 2016-10-26 2017-01-04 Chang'an University Three-dimensional reconstruction system and method based on structured light
CN106996754A (en) * 2017-03-02 2017-08-01 Tianjin University Adaptive illumination optimization method based on sinusoidal grating projection
CN107339953A (en) * 2017-03-02 2017-11-10 Tianjin University Adaptive illumination optimization method for multiple-reflection scenes
CN106705855A (en) * 2017-03-10 2017-05-24 Southeast University High-dynamic-performance three-dimensional measurement method based on adaptive grating projection
CN107193123A (en) * 2017-05-25 2017-09-22 西安知象光电科技有限公司 Closed-loop modulation method for adaptive line-structured light
CN107917679A (en) * 2017-07-20 2018-04-17 Chongqing University Method for dynamic detection and compensation of highlight and over-dark areas in three-dimensional measurement based on area-structured light
CN107576277A (en) * 2017-08-28 2018-01-12 广东大黄蜂机器人有限公司 3D space scanning imaging system and imaging method thereof
CN108828885A (en) * 2018-05-03 2018-11-16 Heren Technology (Shenzhen) Co., Ltd. Light source module and optical projection system
CN109029294A (en) * 2018-08-21 2018-12-18 Hefei University of Technology Fast grayscale fringe synthesis method based on focused binary patterns

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANG, GUOWEI et al.: "High-speed scanning stroboscopic fringe-pattern projection technology for three-dimensional shape precision measurement", Applied Optics *
CHEN, CHAO et al.: "Three-dimensional shape measurement of color objects based on adaptive fringe projection", Acta Optica Sinica *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116067306A (en) * 2023-03-07 2023-05-05 深圳明锐理想科技有限公司 Automatic dimming method, three-dimensional measuring method, device and system
CN116067306B (en) * 2023-03-07 2023-06-27 深圳明锐理想科技有限公司 Automatic dimming method, three-dimensional measuring method, device and system

Also Published As

Publication number Publication date
CN111566438B (en) 2022-03-25
WO2020124460A1 (en) 2020-06-25

Similar Documents

Publication Publication Date Title
US9683943B2 (en) Inspection apparatus, inspection method, and program
US20070176927A1 (en) Image processing method and image processor
KR100200215B1 (en) Soldering detection apparatus & method thereof using corelated neural network
EP1462992B1 (en) System and method for shape reconstruction from optical images
US7019826B2 (en) Optical inspection system, apparatus and method for reconstructing three-dimensional images for printed circuit board and electronics manufacturing inspection
CN110441323B (en) Product surface polishing method and system
CN108090896B (en) Wood board flatness detection and machine learning method and device and electronic equipment
EP1604194A1 (en) Optical inspection system and method for displaying imaged objects in greater than two dimensions
CN110517265A Product surface defect detection method and device, and storage medium
CN111208147A (en) Stitch detection method, device and system
CN109862346A Focusing test method and device
CN111566438B (en) Image acquisition method and system
CN107003255B (en) Method for inspecting terminal of component formed on substrate and substrate inspection apparatus
KR101757240B1 (en) Method for generating reference pattern of 3D shape measuring device
US10520424B2 (en) Adaptive method for a light source for inspecting an article
CN109804731B (en) Substrate inspection apparatus and substrate inspection method using the same
CN113034427A (en) Image recognition method and image recognition device
CN116008177A (en) SMT component high defect identification method, system and readable medium thereof
KR101750883B1 Method for 3D shape measuring of vision inspection system
CN113570578A (en) Lens ghost phenomenon detection method and device
KR20180037347A (en) Board inspection apparatus and method of compensating board distortion using the same
CN109754365B (en) Image processing method and device
JP2022028344A (en) Control unit, control method, and program
CN107316293B LED blurred image recognition and judgment method and system
CN114450579A (en) Image processing system, setting method, and program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant