CN112361991B - Three-dimensional scanning method and device, computer equipment and storage medium - Google Patents

Three-dimensional scanning method and device, computer equipment and storage medium

Info

Publication number
CN112361991B
CN112361991B (application CN202011218571.4A)
Authority
CN
China
Prior art keywords
projection
stripe
dimensional scanning
patterns
pattern
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011218571.4A
Other languages
Chinese (zh)
Other versions
CN112361991A (en)
Inventor
宋展
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangcheng Innovation Technology Co ltd
Original Assignee
Shenzhen Guangcheng Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangcheng Innovation Technology Co ltd filed Critical Shenzhen Guangcheng Innovation Technology Co ltd
Priority to CN202011218571.4A
Publication of CN112361991A
Application granted
Publication of CN112361991B
Legal status: Active

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01JMEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/28Investigating the spectrum
    • G01J3/2823Imaging spectrometer
    • G01J2003/2826Multispectral imaging, e.g. filter imaging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

Embodiments of the invention provide a three-dimensional scanning method, a three-dimensional scanning device, computer equipment and a storage medium, applied to a three-dimensional scanning system. The system comprises a multispectral camera and at least two projection devices arranged in different directions around the object to be measured, with the multispectral camera connected to the projection devices. The method comprises the following steps: controlling the at least two projection devices to simultaneously project stripe patterns of different wavelengths onto the object to be measured; controlling the multispectral camera to acquire the projected stripe patterns; and processing the projected stripe patterns to obtain a three-dimensional model of the object to be measured. Because the object can be scanned from multiple directions at once, with multiple projections imaged synchronously in a single exposure, three-dimensional scanning efficiency is greatly improved.

Description

Three-dimensional scanning method and device, computer equipment and storage medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a three-dimensional scanning method, a three-dimensional scanning apparatus, a computer device, and a storage medium.
Background
AOI (Automated Optical Inspection) technology has been widely used in industry, particularly in electronic packaging and manufacturing, where it can effectively replace human visual inspection and automatically detect many defect conditions such as the position, type, and absence of circuit board components. Because it is significantly better than manual inspection in efficiency and stability, AOI has become an important market segment in the SMT (Surface Mount Technology) field and standard equipment on SMT production lines.
However, as SMT processing technology has advanced, inspection requirements for PCBs are no longer limited to defects such as component position, absence or type; the quality of the soldered pins must also be inspected, and this cannot be assessed from a 2D image alone. This has given rise to so-called 3D AOI, in which 3D information about the components and their soldered pins on the circuit board is obtained by 3D scanning, and the connection between a pin and the PCB substrate is judged from the height of the soldered pin: an excessive pin height indicates a false solder joint, which may later fracture under the temperature rise or vibration caused by circuit operation and lead to costly scrap. 3D AOI inspection is therefore urgently needed for high-end electronic products. Compared with 2D AOI, however, 3D AOI is far less mature and the market is still in its early stage; both domestic and foreign 3D AOI products suffer from many problems and cannot yet be deployed as universally as 2D AOI. The problems fall mainly into two areas: 1) detection speed: 2D AOI is based mainly on image processing and recognition, so image capture and data processing can be completed in a short time, whereas 3D AOI mainly uses structured light scanning from four directions, its image capture and data processing times are far longer than those of 2D AOI, and the resulting detection speed prevents large-scale use in SMT production lines; 2) 3D measurement stability: current 3D AOI relies mainly on structured light scanning with Moiré fringe or sinusoidal phase-shift patterns, and this kind of structured light coding is sensitive to surface reflectance and color; since the solder pins to be measured are often highly reflective, the 3D imaging quality is poor and the stability is insufficient.
Disclosure of Invention
In view of the above, embodiments of the present invention are proposed in order to provide a three-dimensional scanning method, a three-dimensional scanning apparatus, a computer device and a storage medium that overcome or at least partially solve the above-mentioned problems.
In order to solve the above problems, an embodiment of the present invention discloses a three-dimensional scanning method, which is applied to a three-dimensional scanning system, where the three-dimensional scanning system includes a multispectral camera and at least two projection devices, and the at least two projection devices are disposed in different directions of an object to be measured; the multispectral camera is connected with the projection equipment; the method comprises the following steps:
controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured at the same time;
controlling the multispectral camera to acquire the projected fringe pattern;
and carrying out data processing on the projected stripe pattern to obtain a three-dimensional model of the object to be measured.
Preferably, the projection apparatus comprises a projection apparatus projecting a red structured light stripe pattern, a projection apparatus projecting a green structured light stripe pattern, a projection apparatus projecting a blue structured light stripe pattern, and a projection apparatus projecting a near-infrared structured light stripe pattern.
Preferably, the performing data processing on the projected stripe pattern to obtain a three-dimensional model of the object to be measured includes:
extracting different color channel component patterns of the projected stripe patterns;
and carrying out data processing on the different color channel component patterns to obtain a three-dimensional model of the object to be detected.
Preferably, the controlling the multispectral camera to acquire the projected fringe pattern comprises:
and separating the stripe patterns through a narrow-band filter, and controlling the multispectral camera to acquire the separated stripe patterns.
Preferably, before controlling the at least two projection apparatuses to project stripe patterns with different wavelengths to the object to be measured simultaneously, the method includes:
and coding the sub-pixel coordinates in the stripe pattern by taking the edge of the binary gray code stripe pattern as a coding feature to obtain mutually independent phase coding images under different wavelengths.
Preferably, the controlling the at least two projection apparatuses to project stripe patterns with different wavelengths to the object to be measured simultaneously includes:
and controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously through conjugate projection.
The embodiment of the invention also discloses a three-dimensional scanning system, which comprises:
the system comprises a multispectral camera and at least two projection devices; the at least two projection devices are arranged in different directions of the object to be measured; the multispectral camera is connected with the projection equipment;
the multispectral camera is used for acquiring the projected stripe patterns.
The embodiment of the invention also discloses a three-dimensional scanning device, which is applied to a three-dimensional scanning system, wherein the three-dimensional scanning system comprises a multispectral camera and at least two projection devices, and the at least two projection devices are arranged in different directions of an object to be detected; the multispectral camera is connected with the projection equipment; the device comprises:
the fringe pattern projection module is used for controlling the at least two projection devices to project fringe patterns with different wavelengths to the object to be measured simultaneously;
a fringe pattern acquisition module for controlling the multispectral camera to acquire the projected fringe pattern;
and the three-dimensional model acquisition module is used for carrying out data processing on the projected stripe pattern to obtain a three-dimensional model of the object to be measured.
The embodiment of the invention also discloses computer equipment which comprises a memory and a processor, wherein the memory stores a computer program, and the processor realizes the steps of the three-dimensional scanning method when executing the computer program.
The embodiment of the invention also discloses a computer readable storage medium, wherein a computer program is stored on the computer readable storage medium, and the computer program realizes the steps of the three-dimensional scanning method when being executed by a processor.
The embodiment of the invention has the following advantages:
in the embodiment of the invention, the three-dimensional scanning method is applied to a three-dimensional scanning system comprising a multispectral camera and at least two projection devices arranged in different directions around the object to be measured, with the multispectral camera connected to the projection devices. The method comprises: controlling the at least two projection devices to simultaneously project stripe patterns of different wavelengths onto the object to be measured; controlling the multispectral camera to acquire the projected stripe patterns; and processing the projected stripe patterns to obtain a three-dimensional model of the object to be measured. Because the object can be scanned from multiple directions at once, with multiple projections imaged synchronously in a single exposure, three-dimensional scanning efficiency is greatly improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings required to be used in the description of the embodiments are briefly introduced below, and it is obvious that the drawings in the description below are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to the drawings without creative efforts.
FIG. 1 is a flowchart illustrating steps of an embodiment of a three-dimensional scanning method according to the present invention;
FIG. 2 is a flow chart illustrating a stripe pattern encoding step according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of a conjugate projection step according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of the overall operation of a system according to an embodiment of the present invention;
FIG. 5 is a block diagram of a multi-spectral imaging system in accordance with an embodiment of the present invention;
FIG. 6 is a schematic diagram of Gray code encoding stripes according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of Gray code encoding stripes according to an embodiment of the present invention;
FIG. 8 is a schematic illustration of the positioning of a second derivative zero crossing in accordance with an embodiment of the present invention;
FIG. 9 is a schematic diagram of positioning of intersection of gray scale variation curves of positive and negative stripes according to an embodiment of the present invention;
FIG. 10 is a block diagram of an embodiment of a three-dimensional scanning apparatus according to the present invention;
FIG. 11 is an internal block diagram of a computer device of an embodiment.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects solved by the embodiments of the present invention more clearly apparent, the embodiments of the present invention are further described in detail below with reference to the accompanying drawings and the embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
One of the core ideas of the embodiments of the invention is to exploit the fact that light of different wavelength bands is received by its corresponding image sensor without confusion: structured light projection devices of different wavelengths are arranged in different directions around the object to be measured, each projects coded stripe images onto the object, each image sensor receives only the stripe images emitted by the projection device of its corresponding wavelength, and a three-dimensional model of the object is obtained through data processing steps such as decoding and registration.
Referring to fig. 1, a flowchart illustrating steps of an embodiment of a three-dimensional scanning method according to an embodiment of the present invention is shown, and the flowchart is applied to a three-dimensional scanning system, where the three-dimensional scanning system includes a multispectral camera and at least two projection devices, and the at least two projection devices are disposed in different directions of an object to be measured; the multispectral camera is connected with the projection equipment; the method specifically comprises the following steps:
step 101, controlling the at least two projection devices to project stripe patterns with different wavelengths to an object to be measured at the same time;
the three-dimensional scanning method in the embodiment of the present invention may be applied to a three-dimensional scanning system, where the three-dimensional scanning system may include a multispectral camera and at least two projection devices, the multispectral camera is connected to the projection devices, the multispectral camera may be connected to the at least two projection devices, and the at least two projection devices may be disposed in at least two different directions of an object to be measured, and in addition, the multispectral camera and the projection devices may also be connected to a computer device, respectively, and the computer device may be used to generate a coded stripe pattern or perform other data processing processes.
That is, the three-dimensional scanning system may include a computer device, which may be used for different data processing procedures, such as encoding or decoding of stripe patterns, and the like, which is not limited by the embodiments of the present invention;
in the embodiment of the invention, the object to be detected can comprise a welding pin of a computer chip and can also comprise cultural relics and the like, the establishment of a three-dimensional model of the cultural relics can be realized through projection equipment arranged in two directions, and the embodiment of the invention does not limit the variety of the object to be detected excessively.
In a preferred embodiment of the present invention, the projection device may include a projection device projecting a red structured light stripe pattern, a projection device projecting a green structured light stripe pattern, a projection device projecting a blue structured light stripe pattern, a projection device projecting an ultraviolet light stripe pattern, and a projection device projecting a near-infrared structured light stripe pattern; the embodiments of the present invention do not limit this.
The projection device may be a Digital Light Processing (DLP) projection device, in which the image signal is processed digitally before the structured light is projected; DLP projection devices can emit structured light in different wavelength bands, so different projection devices can be controlled to emit structured light of different wavelengths in different directions at the same time.
For example, the object to be measured may be the solder pins of a computer chip. Four projection devices emitting structured light of different wavelengths are placed in the four directions around the pins: a projector of red structured light stripes at the upper pins, a projector of green structured light stripes at the lower pins, a projector of blue structured light stripes at the left pins, and a projector of near-infrared structured light stripes at the right pins, so that structured light stripe images of different wavelengths are projected from four directions onto the pins in the corresponding direction.
Step 102, controlling the multispectral camera to acquire the projected fringe pattern;
further applied to the embodiment of the invention, the multispectral camera comprises a red light image sensor, a green light image sensor, a blue light image sensor and a near infrared light image sensor.
The number of the CCD (Charge Coupled Device) image sensors may be plural, and the CCD image sensors may respectively acquire a plurality of kinds of fringe patterns of structured light with different wavelengths.
For example, a red light image sensor may acquire a fringe pattern of red structured light; while the green image sensor may acquire a stripe pattern of green structured light.
In a specific example, when the object to be measured is the solder pins of a computer chip, four projection devices emitting structured light of different wavelengths are arranged in the four directions around the pins as above: red structured light stripes at the upper pins, green at the lower pins, blue at the left pins, and near-infrared at the right pins. After the structured light stripe patterns of different wavelengths have been projected from the four directions onto the corresponding pins, the multispectral camera can be controlled to acquire the structured light stripe patterns of the different wavelengths.
In a preferred embodiment of the present invention, the stripe patterns are separated by narrow-band optical filters, and the multispectral camera is controlled to acquire the separated stripe patterns. Specifically, the multispectral camera may be fitted with narrow-band optical filters that separate the structured light stripe patterns of different wavelengths. At the imaging level, the multispectral camera is equipped with several CMOS (Complementary Metal Oxide Semiconductor) or CCD sensors, each behind a narrow-band filter for a specific wavelength band, which ensures that a given sensor captures only the structured light image of its corresponding band, without interference or confusion between bands. In this way, four groups of structured light images projected from different angles can be obtained from what is effectively four scans in a single imaging pass, greatly improving scanning efficiency.
In addition, the multispectral camera is provided with a beam-splitting device that separates the reflected light into visible and near-infrared components, and a near-infrared image sensor is added to capture the structured light stripe patterns projected by the near-infrared DLP, so that structured light stripes in several different wavelength bands can be projected and collected.
In another embodiment of the present invention, the multispectral camera may be an ordinary color industrial camera; after the projected fringe pattern is acquired, each pixel is separated into color channels by a color-channel separation algorithm, typically yielding an R component image, a G component image, a B component image, and so on.
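As an illustration of this channel-separation step, the following minimal sketch (in Python with NumPy, which the patent does not specify) splits a captured BGR frame into per-projector component images; the direct mapping of the B/G/R channels to the three visible-band projectors, and the absence of crosstalk compensation, are simplifying assumptions.

```python
import numpy as np

def split_color_channels(bgr_image: np.ndarray) -> dict:
    """Split a captured color fringe image into per-projector component patterns.

    Assumes the blue, green and red projectors map directly onto the camera's
    B, G and R channels; a real sensor would also need crosstalk compensation.
    """
    return {
        "B": bgr_image[..., 0],   # stripes from the blue-band projector
        "G": bgr_image[..., 1],   # stripes from the green-band projector
        "R": bgr_image[..., 2],   # stripes from the red-band projector
    }

# Usage: each component image is then decoded independently, one per projection direction.
frame = (np.random.rand(480, 640, 3) * 255).astype(np.uint8)   # stand-in for a captured frame
components = split_color_channels(frame)
print({band: img.shape for band, img in components.items()})
```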
And 103, performing data processing on the projected stripe pattern to obtain a three-dimensional model of the object to be measured.
In the embodiment of the invention, after the projected stripe patterns are obtained, the projected stripe patterns are respectively subjected to data processing to obtain point cloud data, and then registration is carried out to obtain the three-dimensional model of the object to be measured.
It should be noted that the calibration may be performed for a structured light projection system in the three-dimensional scanning system; in this embodiment of the present invention, before the step of controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured at the same time is executed, the structured light projection system may be calibrated, and the calibration may be performed in various manners, which is not limited in this embodiment of the present invention.
In a preferred embodiment of the present invention, after obtaining the projected fringe pattern, different color channel component patterns of the projected fringe pattern can be extracted; and carrying out data processing on the different color channel component patterns to obtain a three-dimensional model of the object to be detected.
In the embodiment of the invention, the three-dimensional scanning method is applied to a three-dimensional scanning system comprising a multispectral camera and at least two projection devices arranged in different directions around the object to be measured, with the multispectral camera connected to the projection devices. The method comprises: controlling the at least two projection devices to simultaneously project stripe patterns of different wavelengths onto the object to be measured; controlling the multispectral camera to acquire the projected stripe patterns; and processing the projected stripe patterns to obtain a three-dimensional model of the object to be measured. Because the object can be scanned from multiple directions at once, with multiple projections imaged synchronously in a single exposure, three-dimensional scanning efficiency is greatly improved.
In a preferred embodiment of the present invention, referring to fig. 2, a flow chart of a stripe pattern encoding step in the embodiment of the present invention is shown, before controlling the at least two projection devices to project stripe patterns with different wavelengths to an object to be measured simultaneously, the method includes the following sub-steps:
and S11, coding the sub-pixel coordinates in the stripe pattern by taking the edge of the binary Gray code stripe pattern as a coding feature to obtain mutually independent phase coding images under different wavelengths.
In the embodiment of the invention, the stripe patterns of different wavelengths projected onto the object to be measured may be binary patterns: the edges of binary Gray code stripe patterns are used as coding features, and the sub-pixel coordinates in the stripe patterns are encoded to obtain mutually independent phase-coded images at the different wavelengths. Because only binary stripes are projected, the high-frequency projection capability of the DLP can be fully exploited, and combined with current mainstream high-speed, high-resolution CMOS imaging devices this yields a short single-scan time.
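A minimal sketch of how such binary Gray code stripe patterns could be generated for a projector is given below; the language (Python/NumPy), the 1024-column width and the 10-bit depth are illustrative choices, and the embodiment's exact pattern count and minimum stripe width may differ.

```python
import numpy as np

def gray_code_patterns(width: int, height: int, n_bits: int) -> np.ndarray:
    """Return an (n_bits, height, width) stack of binary Gray code stripe patterns."""
    columns = np.arange(width)
    gray = columns ^ (columns >> 1)                 # binary-reflected Gray code per column
    patterns = []
    for bit in range(n_bits - 1, -1, -1):           # coarsest (most significant) stripes first
        row = ((gray >> bit) & 1).astype(np.uint8) * 255
        patterns.append(np.tile(row, (height, 1)))
    return np.stack(patterns)

# Example: 10 code patterns for a 1024-column projector plus the all-white / all-black frames
stack = gray_code_patterns(1024, 768, 10)
all_white = np.full((768, 1024), 255, dtype=np.uint8)
all_black = np.zeros((768, 1024), dtype=np.uint8)
print(stack.shape)   # (10, 768, 1024)
```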
Referring to fig. 3, a schematic flow chart of a conjugate projection step in an embodiment of the present invention is shown, where the controlling the at least two projection apparatuses to project stripe patterns with different wavelengths to an object to be measured simultaneously includes the following sub-steps:
and S21, controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously through conjugate projection.
For a complex object surface, directly taking the first derivative at the stripe edge makes it difficult to locate the edge position accurately. In that case the stripe patterns of different wavelengths can be projected onto the object by conjugate projection: after a Gray code pattern is projected, its inverse stripe pattern is projected as well. For stripe edge localization, two gray-level transition curves can then be obtained for each edge image point, one from the positively projected image and one from the negatively projected image, and the intersection of the two curves gives the stripe edge position;
it should also be noted that, during data processing, a conventional second-derivative zero-crossing method may be used: the image gray levels in the edge region are curve-fitted, the second-order gradient is computed from the intensity profile, and the image point where the second derivative is 0 is the edge position. This yields sub-pixel edge image coordinates, and the obtained images of the different bands are then decoded.
In order that those skilled in the art will better understand the embodiments of the present invention, the following description is given by way of a specific example:
referring to fig. 4, a general system operation diagram according to an embodiment of the present invention is shown, fig. 4 is a general system operation diagram, fig. 5 is a structural principle of a multispectral imaging system, and the system operation principle is as follows:
1. The system comprises a multispectral camera and four DLP projection devices, which work in the R (red), G (green), B (blue) and IR (near-infrared) bands respectively, i.e. they project red, green, blue and near-infrared structured light stripes;
2. The four DLPs are each responsible for projecting structured light stripes onto the solder pins in one of four directions; because they work in different wavelength bands, they can project at the same time instead of in four separate passes, which markedly improves scanning efficiency (a minimal acquisition sketch follows this list);
3. Multispectral camera system: a 3CCD camera is used; by the 3CCD principle it contains three image sensors that separately acquire only the R, G and B light reflected from the object surface, so the stripes from the three RGB DLP projectors are not mixed in any image;
4. To simultaneously acquire the near-infrared structured light stripes projected by the IR DLP device, a beam splitter separates the reflected light into visible and near-infrared components, and a near-infrared camera is added to capture the stripes projected by the near-infrared DLP, so that structured light stripes in four different bands can be projected and collected in one pass;
5. Structured light encoding: to overcome the poor stability of existing sinusoidal structured light stripes with respect to surface reflection and texture, a binary stripe encoding strategy is adopted: black-and-white stripes replace sinusoidal gray-level stripes, and the sub-pixel coordinates of the Gray code stripe edges are used as coding features, which markedly improves the robustness of the coding algorithm; at the same time, binary stripes can be output at high frequency on DLP devices, so a higher scanning speed can be achieved than with gray-level sinusoidal stripes;
6. Registration and fusion of multi-angle scan data: the 3CCD camera and the R, G and B DLPs form three structured light systems, and the IR camera and the IR DLP form a fourth; each system needs its own parameter calibration, which is a conventional technique and is not described here. After the four structured light systems are calibrated, a calibration checkerboard is placed in their common scanning area, four sets of checkerboard data are scanned, and the extrinsic parameters between the four systems are computed; one system is selected as the reference, and the data of the other three are transformed into the same reference frame using the computed extrinsics, enabling rapid registration and fusion of the multiple data sets.
7. After the three-dimensional models of the welding pins in the four directions of the chip are obtained, the pin height can be quantitatively analyzed according to the process requirements.
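The sketch below, referenced in item 2 above, only illustrates the acquisition flow implied by this list; the Projector and MultispectralCamera classes are hypothetical stand-ins written for this example, not the API of any real DLP or camera SDK.

```python
from dataclasses import dataclass

# Projector and MultispectralCamera are hypothetical stand-ins; real DLP and camera
# SDKs expose their own load/trigger/capture calls.
@dataclass
class Projector:
    band: str                                        # "R", "G", "B" or "IR"

    def load_pattern(self, index: int) -> None:
        print(f"{self.band} DLP latched pattern {index}")

    def trigger(self) -> None:
        print(f"{self.band} DLP projecting")

class MultispectralCamera:
    def capture(self) -> dict:
        # one synchronized exposure returns one image per spectral channel
        return {band: None for band in ("R", "G", "B", "IR")}

projectors = [Projector(b) for b in ("R", "G", "B", "IR")]
camera = MultispectralCamera()

for index in range(12):                              # one pass per coded picture
    for p in projectors:
        p.load_pattern(index)                        # each direction has its own pattern set
    for p in projectors:
        p.trigger()                                  # all four bands are projected together
    images = camera.capture()                        # narrow-band filters keep the bands apart
```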
Specifically, the detailed technical solution of the embodiment of the present invention may include the following components:
(1) Single structured light system calibration
The multispectral imaging system can be composed of four independent structured light systems working in the R, G, B and IR bands respectively; because the spectral bands are distinct, the same object can be scanned simultaneously without the projected stripes overlapping in any image. Each single structured light system used in the invention consists mainly of a DLP light engine and a camera, whose internal and external parameters can be calibrated by conventional means. Based on the FPGA module and trigger port built into the DLP device, fast projection of the structured light images and precisely triggered image capture by the camera can be achieved; the image data acquired by each camera is first stored in the camera's internal RAM and, once all four cameras have finished shooting, is transferred over a data line to the data processing device, ensuring the efficiency and stability of the whole acquisition process.
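Assuming the camera intrinsics and the projector's projection matrix have been obtained from such a calibration, a decoded projector column and a camera pixel determine a 3D point by intersecting the camera ray with the projector's column plane. The sketch below (Python/NumPy, with made-up calibration values) shows this standard column-code triangulation; it is not claimed to be the patent's specific reconstruction routine.

```python
import numpy as np

def reconstruct_point(u_c, v_c, proj_column, K_cam, P_proj):
    """Intersect the camera ray through pixel (u_c, v_c) with the plane of the decoded
    projector column; the camera frame is taken as the world frame."""
    ray = np.linalg.inv(K_cam) @ np.array([u_c, v_c, 1.0])    # ray direction from camera centre
    plane = P_proj[0] - proj_column * P_proj[2]               # plane: a*X + b*Y + c*Z + d = 0
    lam = -plane[3] / (plane[:3] @ ray)                       # ray-plane intersection depth
    return lam * ray                                          # 3D point in camera coordinates

# Illustrative calibration values (real ones come from the calibration described above)
K_cam = np.array([[1200.0, 0, 640], [0, 1200.0, 480], [0, 0, 1]])
K_proj = np.array([[1500.0, 0, 512], [0, 1500.0, 384], [0, 0, 1]])
R, t = np.eye(3), np.array([[-100.0], [0.0], [0.0]])          # projector ~100 mm beside the camera
P_proj = K_proj @ np.hstack([R, t])                           # 3x4 projector projection matrix
print(reconstruct_point(700, 500, 500, K_cam, P_proj))        # point roughly 1.7 m in front
```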
(2) Multiple structured light system joint calibration
To achieve rapid and accurate registration of the 3D data acquired from different angles by the several structured light systems, the systems must be jointly calibrated; this essentially amounts to determining the extrinsic parameters between several 3D cameras, for which mature methods exist.
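One common way to determine such extrinsics, once two systems have reconstructed the same checkerboard corners in 3D, is a least-squares rigid fit (Kabsch/SVD). The sketch below assumes exactly that setup and is offered as one example of the mature methods referred to above, not as the patent's prescribed procedure.

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src points onto dst points (Nx3 each),
    e.g. checkerboard corners reconstructed by two structured light systems."""
    src_c, dst_c = src.mean(0), dst.mean(0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                    # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t

# Synthetic check: recover a known rotation/translation from simulated corner coordinates
rng = np.random.default_rng(0)
pts = rng.uniform(-50, 50, (24, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
t_true = np.array([10.0, -5.0, 200.0])
R_est, t_est = rigid_transform(pts, pts @ R_true.T + t_true)
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))   # True True
```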
(3) Structured light coding method based on stripe edge geometric features of binary Gray code pattern
Traditional sinusoidal phase-shift coding computes the phase value mainly from the gray-level variation of sinusoidal stripes and therefore achieves high phase and coding accuracy. However, on a highly reflective surface such as solder paste the gray-level information is hard to preserve after projection, and overexposure or underexposure occurs in most cases, so this approach is not feasible for highly reflective surfaces.
To give a pure Gray code coding method sub-pixel coding accuracy, the embodiment of the invention uses the Gray code stripe edge as the coding feature. In the projector image coordinate system the Gray code edge can be regarded as an ideal edge with no coding error; in the image captured by the camera, the projected black-and-white stripes are blurred, so the originally rectangular black-and-white edge no longer survives and becomes a smooth transition. The Gray code stripe edge can therefore be detected with sub-pixel positioning accuracy from the gray-level gradient, and the sub-pixel coordinate is encoded with the globally unique Gray code value, which raises the Gray code coding features to sub-pixel accuracy.
Referring to fig. 6 and fig. 7, which show schematic diagrams of Gray code coding stripes according to an embodiment of the present invention, assume a Gray code coding scheme with a total length of 1024 and 12 coded pictures in total (including the all-black and all-white pictures, with a minimum stripe width of 4 pixels); the number of usable edge features is then 1022 + 2 = 1024, where the 2 are the left and right borders of the image.
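A minimal sketch of decoding such a Gray code image stack into per-pixel projector columns follows; it assumes the captured images have already been binarized against the all-white and all-black frames, and uses a synthetic stack as a self-check in place of real camera data.

```python
import numpy as np

def decode_gray_stack(bits: np.ndarray) -> np.ndarray:
    """Decode an (n_bits, H, W) stack of binarized Gray code images (MSB first)
    into the per-pixel projector column index."""
    binary = np.zeros_like(bits)
    binary[0] = bits[0]
    for i in range(1, bits.shape[0]):
        binary[i] = binary[i - 1] ^ bits[i]                    # Gray-to-binary conversion
    weights = 1 << np.arange(bits.shape[0] - 1, -1, -1)        # 2^(n-1) ... 2^0
    return np.tensordot(weights, binary.astype(np.int64), axes=1)

# Synthetic self-check: encode every column of a 1024-wide projector and decode it back
cols = np.tile(np.arange(1024), (4, 1))                        # ground-truth column per pixel
gray = cols ^ (cols >> 1)
stack = np.stack([((gray >> b) & 1) for b in range(9, -1, -1)]).astype(np.uint8)
print(np.array_equal(decode_gray_stack(stack), cols))          # True
```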
(4) Gray code edge sub-pixel positioning method
For the edge point of each stripe, the code value achieves globally unique coding (in the x direction) through the Gray code, and the three-dimensional coordinates of the edge points are reconstructed, so the edge localization algorithm is critical. If the object surface behaves as a Lambertian reflector, the stripe images captured by the camera are of good quality and the first method below can be used for edge localization; if the surface texture and reflections are more severe, the second method is used. 1) Method one: the stripes are inevitably blurred after projection and are convolved once more by the camera lens when captured, which is equivalent to another blur, so the edge in the captured image is no longer a binary rectangular wave but shows a gradual gray-level transition. A conventional second-derivative zero-crossing method can then be used: the image gray levels in the edge region are curve-fitted, the second-order gradient is computed from the intensity profile, and the image point where the second derivative is 0 is the edge position (as shown in fig. 8), giving sub-pixel edge image coordinates that are then encoded. 2) Method two: for a more complex object surface it is difficult to locate the edge accurately by directly taking derivatives. To improve the robustness of the three-dimensional scanning algorithm, the number of projected patterns can be increased by conjugate projection: after a Gray code pattern p is projected, the inverse-color stripes of p are projected as well, which adds 10 projected patterns (the all-black and all-white frames are already conjugates of each other), for a total of 22 patterns. For edge localization, two gray-level transition curves are obtained for each edge image point, one from the positively projected image and one from the negatively projected image, and the intersection of the two curves is the stripe edge position, again giving sub-pixel edge image coordinates.
As shown in fig. 8, for an ideal Lambertian reflecting surface a single stripe image suffices, and sub-pixel positioning of the stripe edge is achieved by the zero crossing of the second derivative of the image; as shown in fig. 9, for a more complex reflecting surface the conjugate projection method is used, i.e. for each Gray code stripe both a positive and a negative pattern are projected, and the sub-pixel coordinates of the coded stripe edge are obtained from the intersection of the gray-level curves of the positive and negative stripes.
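The two localization rules can be sketched on one-dimensional intensity profiles as follows; the sigmoid test profile, the linear interpolation of the crossings, and the absence of any outlier handling are simplifying assumptions made for this illustration.

```python
import numpy as np

def edge_from_conjugate_profiles(pos: np.ndarray, neg: np.ndarray) -> float:
    """Sub-pixel stripe edge from a positive and its conjugate (inverted) intensity profile:
    the edge sits where the two gray-level curves cross."""
    diff = pos.astype(float) - neg.astype(float)
    idx = np.flatnonzero(np.signbit(diff[:-1]) != np.signbit(diff[1:]))[0]
    return idx + diff[idx] / (diff[idx] - diff[idx + 1])        # interpolate the zero crossing

def edge_from_second_derivative(profile: np.ndarray) -> float:
    """Sub-pixel edge for a well-behaved (Lambertian) surface: zero crossing of the
    second derivative of the blurred step profile."""
    d2 = np.diff(profile.astype(float), 2)                      # d2[i] ~ f''(i + 1)
    idx = np.flatnonzero(np.signbit(d2[:-1]) != np.signbit(d2[1:]))[0]
    return (idx + 1) + d2[idx] / (d2[idx] - d2[idx + 1])

# Synthetic blurred edge around x = 5.3 and its conjugate
x = np.arange(12)
pos = 255.0 / (1.0 + np.exp(-(x - 5.3) * 2.0))
neg = 255.0 - pos
print(edge_from_conjugate_profiles(pos, neg), edge_from_second_derivative(pos))
```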
In the embodiment of the invention, fast GPU-based three-dimensional reconstruction can be achieved. 3D-AOI inspection places high demands on image resolution, generally no less than 4 megapixels, and mainstream cameras currently offer 12 megapixels; each structured light system captures about 20 images, and with four systems shooting simultaneously this amounts to structured light images totalling on the order of a hundred million pixels. A conventional CPU-based reconstruction algorithm can hardly reach high reconstruction efficiency, so the method runs on a GPU parallel platform. Because the coding, decoding and reconstruction algorithms in this scheme all operate on independent pixels and are naturally highly parallel, the GPU's parallel high-speed computing capability can be fully exploited, enabling fast processing of high-resolution images and point cloud three-dimensional reconstruction;
in addition, registration and fusion of multi-angle scan data can be achieved. The 3CCD camera and the R, G and B DLPs in the system form three structured light systems, and the IR camera and IR DLP form a fourth; each needs its own parameter calibration, which is conventional and not described further here. After the four structured light systems are calibrated, a calibration checkerboard is placed in their common scanning area, four sets of checkerboard data are scanned, and the extrinsic parameters between the four systems are computed; one system is chosen as the reference and the data of the other three are transformed into the same reference frame using their computed extrinsics, achieving rapid registration of the multiple data sets. To further improve the registration accuracy of the four point clouds, a conventional ICP algorithm can be applied for fine registration of their common regions, after which the point clouds are fused and redundant points removed, yielding a single-layer, high-accuracy complete point cloud and a depth image;
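A minimal sketch of the coarse registration step described here, i.e. mapping each system's point cloud into the reference frame with its calibrated extrinsics before any ICP refinement, might look as follows; the placeholder clouds and identity extrinsics are illustrative only.

```python
import numpy as np

def to_reference_frame(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Map an Nx3 point cloud from one structured light system into the reference system
    using the extrinsics obtained from the joint calibration step."""
    return points @ R.T + t

# Example: merge the clouds of the three non-reference systems into the reference frame,
# assuming extrinsics[i] = (R_i, t_i) were estimated beforehand (e.g. by a rigid fit
# of common checkerboard corners as sketched earlier).
reference_cloud = np.random.rand(1000, 3) * 100
other_clouds = [np.random.rand(1000, 3) * 100 for _ in range(3)]
extrinsics = [(np.eye(3), np.zeros(3)) for _ in range(3)]          # placeholders
merged = np.vstack([reference_cloud] +
                   [to_reference_frame(c, R, t) for c, (R, t) in zip(other_clouds, extrinsics)])
# A conventional ICP pass over the overlapping regions can then refine these initial
# poses before fusing the clouds and removing redundant points.
print(merged.shape)
```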
in the embodiment of the invention, once the three-dimensional coordinates of each point in the image are available, various inspections and judgments can be made from the 3D data, such as computing key parameters like the height and flatness of the chip package surface, the solder pin height, and the solder volume; this goes beyond the traditional 2D-AOI technique and provides a more faithful and accurate inspection process.
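As one illustration of such a 3D measurement, the sketch below estimates a pin's lift-off as its vertical offset above a plane least-squares-fitted to surrounding board points; the region selection, units and pass/fail thresholds a real inspection system would need are omitted, and the data are synthetic.

```python
import numpy as np

def plane_fit(points: np.ndarray) -> np.ndarray:
    """Least-squares plane z = a*x + b*y + c through Nx3 board-surface points."""
    A = np.column_stack([points[:, 0], points[:, 1], np.ones(len(points))])
    coeffs, *_ = np.linalg.lstsq(A, points[:, 2], rcond=None)
    return coeffs                                                  # (a, b, c)

def height_above_board(pin_points: np.ndarray, board_points: np.ndarray) -> float:
    """Lift-off of a solder pin: largest vertical offset of its points above the board plane."""
    a, b, c = plane_fit(board_points)
    predicted = a * pin_points[:, 0] + b * pin_points[:, 1] + c
    return float(np.max(pin_points[:, 2] - predicted))

# Synthetic check: a flat board at z = 0 and a pin lifted about 0.12 mm above it
board = np.column_stack([np.random.rand(200, 2) * 10, np.zeros(200)])
pin = np.array([[5.0, 5.0, 0.12], [5.1, 5.0, 0.11]])
print(round(height_above_board(pin, board), 3))                    # ~0.12
```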
Based on ordinary DLP devices, structured light stripes of different spectra are projected simultaneously and imaged synchronously, so that multiple projections are captured in a single imaging pass and three-dimensional scanning efficiency is greatly improved;
because only binary stripes are projected, the high-frequency projection capability of the DLP can be fully exploited; combined with current mainstream high-speed, high-resolution CMOS imaging devices, a single-FOV scan time of less than 0.2 second can be achieved, far better than the current 0.33 s/FOV;
for ordinary Lambertian reflecting surfaces and for surfaces with complex texture and reflections, two different sub-pixel stripe edge localization methods are provided, which markedly improve the robustness of the three-dimensional scanning algorithm while still meeting the scanning efficiency requirement;
the method comprises the steps of firstly calculating external parameters of three structured light systems relative to a reference system, using the external parameters as initial registration parameters, and further introducing an ICP iterative registration strategy to realize accurate and rapid registration and fusion of four groups of three-dimensional scanning data.
In existing multi-angle three-dimensional scanning systems the stripes have to be projected in sequence and captured in sequence by one or more cameras, so scanning efficiency is low. Here the projected stripes are distinguished at the spectral level, so the four projection devices can project simultaneously; at the imaging level, several CMOS/CCD sensors each carry a narrow-band filter for a specific band, so that each camera captures only the structured light image of its corresponding band without mutual interference or confusion. Four groups of structured light images projected from different angles are therefore obtained from what is effectively four scans in a single imaging pass, greatly improving scanning efficiency;
secondly, the main speed bottleneck of existing structured light systems is the grating projector, because sinusoidal stripes are slow to project. The invention uses binary stripes, so the high-speed projection capability of DLP devices for binary images can be fully exploited, and the bottleneck of the structured light system is no longer the projection speed but the camera frame rate. Leaving aside the bandwidth for transferring images to the computer, existing CMOS imaging combined with on-line processing in an FPGA or dedicated chip can easily reach thousands of FPS, so there is considerable room to increase the overall scanning speed;
in addition, to improve three-dimensional reconstruction accuracy, the black-and-white stripe edge is used as the coding feature on top of the traditional Gray code, and the pixel-level coding accuracy of the traditional Gray code is raised to sub-pixel level by the sub-pixel edge localization method, achieving higher reconstruction accuracy; at the same time, binary coding patterns can be output at high speed on DLP devices, so this technique has unique advantages in both reconstruction accuracy and scanning speed.
In the present embodiment four light sources of different wavelengths (RGB-IR) are used; in a specific application the sources could just as well all be IR sources (of different wavelengths) or ultraviolet sources, and three or five sources could be used, as required; it is only necessary to use narrow-band filters at the camera end to separate the different projection sources and avoid code confusion;
in the embodiment of the invention a PC-side GPU platform is used for high-speed three-dimensional reconstruction; alternatively, an FPGA or dedicated chip can be used at the camera end so that the images are processed directly into 3D data before being transmitted to the computer. Large numbers of images then no longer need to be transferred, the system becomes more efficient and stable, and hardware cost can be reduced significantly;
the embodiment of the invention uses a single camera imaging unit that integrates imaging devices for the four different bands and separately captures the patterns projected by the four structured light projection devices; four independent cameras could also be used, each matched to a structured light projector of a different band, with the same subsequent data processing and calibration principles;
as regards the field of application, the embodiment of the invention provides a fairly general multi-angle synchronous 3D scanning method; the 3D-AOI requirements of industrial inspection are used as the example and elaborated in detail, but the technical scheme can also be used for multi-angle 3D scanning of human bodies, articles, industrial parts and other fields. Further, if the camera speed reaches 300-500 FPS, the 3D scanning speed of the whole system can reach about 30 FPS, so the technique can be used for true 3D reconstruction of dynamic targets and applied in broader fields such as AR/VR and holographic display.
It should be noted that, for simplicity of description, the method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present invention is not limited by the illustrated order of acts, as some steps may occur in other orders or concurrently in accordance with the embodiments of the present invention. Further, those skilled in the art will appreciate that the embodiments described in the specification are presently preferred and that no particular act is required to implement the invention.
Referring to fig. 10, a block diagram of a three-dimensional scanning apparatus according to an embodiment of the present invention is shown; the apparatus is applied to a three-dimensional scanning system, where the three-dimensional scanning system includes a multispectral camera and at least two projection devices, and the at least two projection devices are disposed in different directions of an object to be measured; the multispectral camera is connected with the projection equipment; the apparatus specifically comprises the following modules:
a fringe pattern projection module 301, configured to control the at least two projection devices to project fringe patterns with different wavelengths to the object to be measured at the same time;
a fringe pattern acquisition module 302 for controlling the multispectral camera to acquire the projected fringe pattern;
and the three-dimensional model acquisition module 303 is configured to perform data processing on the projected fringe pattern to obtain a three-dimensional model of the object to be measured.
Preferably, the projection device comprises a projection device projecting a red structured light stripe pattern, a projection device projecting a green structured light stripe pattern, a projection device projecting a blue structured light stripe pattern, a projection device projecting an ultraviolet light stripe pattern and a projection device projecting a near-infrared structured light stripe pattern.
Preferably, the three-dimensional model obtaining module includes:
the extraction submodule is used for controlling and extracting different color channel component patterns of the projected stripe patterns;
and the data processing submodule is used for carrying out data processing on the different color channel component patterns to obtain a three-dimensional model of the object to be detected.
Preferably, the stripe pattern acquisition module includes:
and the separation submodule is used for separating the stripe patterns through a narrow-band filter and controlling the multispectral camera to acquire the separated stripe patterns.
Preferably, a module connected to the stripe pattern projection module includes:
and the coding module is used for coding the sub-pixel coordinates in the stripe pattern by taking the edge of the binary Gray code stripe pattern as a coding characteristic to obtain mutually independent phase coding images under different wavelengths.
Preferably, the stripe pattern projection module includes:
and the conjugate projection submodule is used for controlling the at least two projection devices to project the fringe patterns with different wavelengths to the object to be measured simultaneously through conjugate projection.
Preferably, the apparatus further comprises:
and the calibration module is used for calibrating the structured light projection system in the three-dimensional scanning system.
The modules in the three-dimensional scanning device can be wholly or partially realized by software, hardware and a combination thereof. The modules can be embedded in a hardware form or independent from a processor in the computer device, and can also be stored in a memory in the computer device in a software form, so that the processor can call and execute operations corresponding to the modules.
The three-dimensional scanning device provided by the above can be used for executing the three-dimensional scanning method provided by any of the above embodiments, and has corresponding functions and beneficial effects.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 11. The computer device comprises a processor, a memory, a network interface, a display screen and an input device which are connected through a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a three-dimensional scanning method. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
It will be appreciated by those skilled in the art that the configuration shown in fig. 11 is a block diagram of only a portion of the configuration associated with the present application, and is not intended to limit the computing device to which the present application may be applied, and that a particular computing device may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer device is provided comprising a memory having a computer program stored therein and a processor implementing the steps of the embodiments of fig. 1-3 when executing the computer program.
In one embodiment, a computer readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, performs the steps of the embodiments of figs. 1-3 described above.
The embodiments in the present specification are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other.
As will be appreciated by one of skill in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, embodiments of the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Embodiments of the present invention are described with reference to flowchart illustrations and/or block diagrams of methods, terminal devices (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing terminal to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing terminal, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing terminal to cause a series of operational steps to be performed on the computer or other programmable terminal to produce a computer implemented process such that the instructions which execute on the computer or other programmable terminal provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While preferred embodiments of the present invention have been described, additional variations and modifications of these embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the embodiments of the invention.
Finally, it should also be noted that, herein, relational terms such as first and second may be used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Moreover, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or terminal device that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article or terminal device. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of another identical element in the process, method, article or terminal device that comprises the element.
The three-dimensional scanning method, three-dimensional scanning device, computer device and storage medium provided by the present invention have been described in detail above. Specific examples have been used herein to illustrate the principles and embodiments of the present invention, and the description of the above embodiments is intended only to help in understanding the method and its core ideas. Meanwhile, a person skilled in the art may, following the ideas of the present invention, make changes to the specific embodiments and the scope of application; in summary, the content of this specification should not be construed as limiting the present invention.

Claims (8)

1. A three-dimensional scanning method, applied to a three-dimensional scanning system, wherein the three-dimensional scanning system comprises a multispectral camera and at least two projection devices, the at least two projection devices are arranged in different directions of an object to be measured, and the multispectral camera is connected with the projection devices; the method comprises the following steps:
controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured at the same time;
controlling the multispectral camera to acquire the projected stripe patterns;
performing data processing on the projected stripe patterns to obtain a three-dimensional model of the object to be measured;
wherein a multispectral camera and three projection devices respectively working in the R, G and B bands can form three structured-light systems; in addition, an IR camera and a projection device working in the IR band are added to form a fourth structured-light system, and each structured-light system requires independent parameter calibration;
wherein before controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured at the same time, the method further comprises:
encoding sub-pixel coordinates in the stripe patterns by taking the edges of binary Gray-code stripe patterns as coding features, to obtain mutually independent phase-encoded images at different wavelengths;
the controlling of the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously includes:
controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously through conjugate projection;
wherein the conjugate projection projects a Gray-code pattern p and then projects the inverse-color stripes of the pattern p; 10 projected patterns need to be added, and since pure black and pure white are conjugates of each other, 22 patterns are projected in total; for stripe-edge positioning, for each image point at a stripe-edge position, two grayscale edge-variation curves are obtained from the forward and inverse projection images respectively, and the intersection of the two curves is the stripe-edge position, whereby sub-pixel edge image coordinates are obtained.
2. The three-dimensional scanning method according to claim 1, wherein the projection devices comprise a projection device that projects a red structured-light stripe pattern, a projection device that projects a green structured-light stripe pattern, a projection device that projects a blue structured-light stripe pattern, and a projection device that projects a near-infrared structured-light stripe pattern.
3. The three-dimensional scanning method according to claim 2, wherein performing data processing on the projected stripe patterns to obtain the three-dimensional model of the object to be measured comprises:
extracting the different color-channel component patterns of the projected stripe patterns; and
performing data processing on the different color-channel component patterns to obtain the three-dimensional model of the object to be measured.
4. The three-dimensional scanning method according to claim 1, wherein controlling the multispectral camera to acquire the projected stripe patterns comprises:
separating the stripe patterns through a narrow-band filter, and controlling the multispectral camera to acquire the separated stripe patterns.
5. A three-dimensional scanning system to which the three-dimensional scanning method according to claim 1 is applied, comprising:
a multispectral camera and at least two projection devices, wherein the at least two projection devices are arranged in different directions of the object to be measured, and the multispectral camera is connected with the projection devices;
wherein the multispectral camera is used for acquiring the projected stripe patterns.
6. A three-dimensional scanning device, applied to a three-dimensional scanning system, wherein the three-dimensional scanning system comprises a multispectral camera and at least two projection devices, the at least two projection devices are arranged in different directions of an object to be measured, and the multispectral camera is connected with the projection devices; the device comprises:
a stripe pattern projection module, used for controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously;
a stripe pattern acquisition module, used for controlling the multispectral camera to acquire the projected stripe patterns;
a three-dimensional model acquisition module, used for performing data processing on the projected stripe patterns to obtain a three-dimensional model of the object to be measured;
wherein a multispectral camera and three projection devices respectively working in the R, G and B bands can form three structured-light systems; in addition, an IR camera and a projection device working in the IR band are added to form a fourth structured-light system, and each structured-light system requires independent parameter calibration;
wherein before controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured at the same time, the device is further configured for:
encoding sub-pixel coordinates in the stripe patterns by taking the edges of binary Gray-code stripe patterns as coding features, to obtain mutually independent phase-encoded images at different wavelengths;
the controlling of the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously includes:
controlling the at least two projection devices to project stripe patterns with different wavelengths to the object to be measured simultaneously through conjugate projection;
wherein the conjugate projection projects a Gray-code pattern p and then projects the inverse-color stripes of the pattern p; 10 projected patterns need to be added, and since pure black and pure white are conjugates of each other, 22 patterns are projected in total; for stripe-edge positioning, for each image point at a stripe-edge position, two grayscale edge-variation curves are obtained from the forward and inverse projection images respectively, and the intersection of the two curves is the stripe-edge position, whereby sub-pixel edge image coordinates are obtained.
7. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor realizes the steps of the three-dimensional scanning method of any one of claims 1 to 4 when executing the computer program.
8. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the three-dimensional scanning method of any one of claims 1 to 4.
CN202011218571.4A 2020-11-04 2020-11-04 Three-dimensional scanning method and device, computer equipment and storage medium Active CN112361991B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011218571.4A CN112361991B (en) 2020-11-04 2020-11-04 Three-dimensional scanning method and device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011218571.4A CN112361991B (en) 2020-11-04 2020-11-04 Three-dimensional scanning method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN112361991A CN112361991A (en) 2021-02-12
CN112361991B 2022-12-16

Family

ID=74513010

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011218571.4A Active CN112361991B (en) 2020-11-04 2020-11-04 Three-dimensional scanning method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112361991B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113808097B (en) * 2021-09-14 2024-04-12 北京主导时代科技有限公司 Method and system for detecting loss of key parts of train
CN113936095A (en) * 2021-09-28 2022-01-14 先临三维科技股份有限公司 Scanner and scanning method
WO2023222139A1 (en) * 2022-05-18 2023-11-23 上海图漾信息科技有限公司 Depth data measuring head, measuring apparatus, and measuring method

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014002514B4 (en) * 2014-02-21 2015-10-29 Universität Stuttgart Device and method for multi- or hyperspectral imaging and / or for distance and / or 2-D or 3-D profile measurement of an object by means of spectrometry
CN106500627B (en) * 2016-10-19 2019-02-01 杭州思看科技有限公司 3-D scanning method and scanner containing multiple and different long wavelength lasers
CN107063129B (en) * 2017-05-25 2019-06-07 西安知象光电科技有限公司 A kind of array parallel laser projection three-dimensional scan method
US11605172B2 (en) * 2017-12-08 2023-03-14 Arizona Board Of Regents On Behalf Of The University Of Arizona Digital fringe projection and multi-spectral polarization imaging for rapid 3D reconstruction
CN110470238A (en) * 2019-07-02 2019-11-19 杭州非白三维科技有限公司 A kind of hand-held laser 3 d scanner, scan method and device
CN111811432A (en) * 2020-06-16 2020-10-23 中国民用航空飞行学院 Three-dimensional imaging system and method

Also Published As

Publication number Publication date
CN112361991A (en) 2021-02-12

Similar Documents

Publication Publication Date Title
CN112361991B (en) Three-dimensional scanning method and device, computer equipment and storage medium
JP6457072B2 (en) Integration of point clouds from multiple cameras and light sources in 3D surface shape measurement
KR102111181B1 (en) Systems and methods for parallax detection and correction in images captured using array cameras
JP5256251B2 (en) Inspection method of measurement object
US20140198185A1 (en) Multi-camera sensor for three-dimensional imaging of a circuit board
JP5659396B2 (en) Joint inspection device
US10126252B2 (en) Enhanced illumination control for three-dimensional imaging
TW201724026A (en) Generating a merged, fused three-dimensional point cloud based on captured images of a scene
KR101659302B1 (en) Three-dimensional shape measurement apparatus
CN105372259B (en) Measurement apparatus, base board checking device and its control method, storage media
CN105180836B (en) Control device, robot and control method
JP5124705B1 (en) Solder height detection method and solder height detection device
JP6791631B2 (en) Image generation method and inspection equipment
US10533952B2 (en) Method of inspecting a terminal of a component mounted on a substrate and substrate inspection apparatus
KR20100122558A (en) Method of measuring an area of a measurement target on a printed circuit board
KR20220003977A (en) Wire measuring system and method for board inspection
CN115082538A (en) System and method for three-dimensional reconstruction of surface of multi-view vision balance ring part based on line structure light projection
JP4218283B2 (en) Target projection type three-dimensional shape measuring method and apparatus
JP7459525B2 (en) Three-dimensional shape measuring device, three-dimensional shape measuring method and program
JP2022049269A (en) Three-dimensional shape measuring method and three-dimensional shape measuring device
KR101311255B1 (en) Inspection method of measuring object

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant