CN103486979A - Hybrid sensor - Google Patents

Hybrid sensor

Info

Publication number
CN103486979A
CN103486979A (application CN201310227111.1A)
Authority
CN
China
Prior art keywords
light
article
imager
pattern
field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201310227111.1A
Other languages
Chinese (zh)
Inventor
S·P·凯莎莫西
林承智
D·T·维戈瑞恩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Perceptron Inc
Original Assignee
Perceptron Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 13/492,065 (US9170097B2)
Application filed by Perceptron Inc filed Critical Perceptron Inc
Publication of CN103486979A publication Critical patent/CN103486979A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B 26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B 26/0816 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements
    • G02B 26/0833 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light by means of one or more reflecting elements, the reflecting element being a micromechanical device, e.g. a MEMS mirror, DMD
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 26/00 Optical devices or arrangements for the control of light using movable or deformable optical elements
    • G02B 26/08 Optical devices or arrangements for the control of light using movable or deformable optical elements for controlling the direction of light
    • G02B 26/10 Scanning systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/10 Beam splitting or combining systems
    • G02B 27/1006 Beam splitting or combining systems for splitting or combining different wavelengths
    • G02B 27/102 Beam splitting or combining systems for splitting or combining different wavelengths for generating a colour image from monochromatic image signal sources
    • G02B 27/104 Beam splitting or combining systems for splitting or combining different wavelengths for generating a colour image from monochromatic image signal sources for use with scanning systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/60 Systems using moiré fringes

Abstract

The invention relates to a hybrid sensor, and more particularly to a laser projection system, specifically a system and method employing the laser projection system and an adaptive light device. A system and method are provided for imaging articles in a field of view, projecting an illumination field onto the articles in the field of view, and selectively projecting structured light onto the articles in the field of view. Image data corresponding to the illumination field and the structured light may then be received, and features of the articles may be analyzed based on the illumination field and the structured light.

Description

Hybrid sensor
Cross-reference to related applications
This application is a continuation-in-part of U.S. Patent Application No. 13/205,160, filed August 8, 2011, which is a continuation-in-part of U.S. Patent Application No. 12/416,463, filed April 1, 2009, which claims the benefit of U.S. Provisional Application No. 61/072,607, filed April 1, 2008. The disclosures of the above applications are incorporated herein by reference in their entirety.
Technical field
The present disclosure relates to laser projection systems, and more particularly to systems and methods utilizing a field projection system and an adaptive light device.
Background
Structured light is the process of projecting a known pattern of pixels (for example, a grid or horizontal bars) onto a surface. The way the known pattern deforms when striking the surface allows the sensing system to determine the profile of the surface (for example, the range or distance of features). For example, structured light may be used in structured-light three-dimensional (3D) scanners.
Referring now to FIG. 1, a light detection and ranging (LIDAR) scanning system 10 according to the prior art is illustrated. The LIDAR system 10 measures the profile of a surface 16. The system 10 includes an infrared (IR) source 12, a steering mirror 14, a receiving mirror 18, and an IR receiver 20.
The IR source 12 generates a beam of IR light that is projected onto the surface 16 by the steering mirror 14. IR light reflected by the surface 16 is directed to the IR receiver 20 by the receiving mirror 18. The IR receiver 20 may then generate a grey-mapping corresponding to the profile of the surface 16 based on the phase difference between the projected IR light and the received IR light.
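The patent does not spell out the receiver's math, but phase-based ranging of this kind conventionally recovers distance from the phase lag an amplitude-modulated beam accumulates over the round trip. The sketch below is a minimal illustration under that textbook assumption; the function name and modulation frequency are illustrative, not from the patent:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_phase(delta_phi_rad: float, f_mod_hz: float) -> float:
    """Distance from the phase difference between projected and received
    light for an amplitude-modulated continuous-wave rangefinder.
    The round trip covers 2*d, so d = c * delta_phi / (4 * pi * f_mod)."""
    return C * delta_phi_rad / (4.0 * math.pi * f_mod_hz)

# A quarter-cycle phase lag at 10 MHz modulation corresponds to ~3.75 m.
d = range_from_phase(math.pi / 2, 10e6)
```

Note that the measurement is unambiguous only within one modulation half-wavelength; covering longer ranges typically requires multiple modulation frequencies or a coarse range estimate.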
The background description provided herein is for the purpose of generally presenting the context of the disclosure. Work of the presently named inventors, to the extent it is described in this background section, as well as aspects of the description that may not otherwise qualify as prior art at the time of filing, are neither expressly nor impliedly admitted as prior art against the present disclosure.
Summary of the invention
Systems and methods are provided for imaging articles in a field of view, projecting an illumination field onto the articles in the field of view, and selectively projecting structured light onto the articles in the field of view. Image data corresponding to the illumination field and the structured light may then be received, and features of the articles may be analyzed based on the illumination field and the structured light.
A structured light sensor system for measuring a profile of a surface includes a projection system, an imaging system, and a control module. The projection system is configured to project onto the surface: (i) a point of light, (ii) a first plurality of points of light forming a line of light, or (iii) a second plurality of points of light forming a plurality of lines of light. The imaging system is configured to selectively capture an image of the surface, wherein the image of the surface is based on light reflected by the surface. The control module is configured to coordinately control both the projection system and the imaging system to operate the structured light sensor system according to each of the following modes: (i) a dot mode, during which the projection system projects the point of light during a first period and the imaging system is on during the first period; (ii) a line mode, wherein the projection system projects the first plurality of points of light during a second period and the imaging system is on during the second period; and (iii) an area mode, wherein the projection system projects the second plurality of points of light during a third period and the imaging system is on during the third period.
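The three coordinated modes can be modeled as a small scheduler that gates the imager on exactly while the projector emits. This is an illustrative Python model, not the patent's implementation; `Mode`, `Frame`, `schedule`, and the point counts are hypothetical:

```python
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    DOT = "dot"     # a single point of light
    LINE = "line"   # a first plurality of points forming one line
    AREA = "area"   # a second plurality of points forming many lines

@dataclass
class Frame:
    points: int      # how many points the projector emits this period
    imager_on: bool  # imaging system is gated on for the same period

def schedule(mode: Mode, points_per_line: int = 100, lines: int = 50) -> Frame:
    """Coordinated control: projector output and imager gate are switched
    together, so the imager is on exactly while light is projected."""
    if mode is Mode.DOT:
        return Frame(points=1, imager_on=True)
    if mode is Mode.LINE:
        return Frame(points=points_per_line, imager_on=True)
    return Frame(points=points_per_line * lines, imager_on=True)
```

Pairing each projection period with an imager-on period of the same length matches exposure to the projected energy per mode, which is the idea behind the timing diagrams of FIGS. 11A to 11C.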
A structured light sensor system for measuring a parameter of a feature on a surface includes a projection system, an imaging system, and a control module. The projection system is configured to project light of a first pattern onto the surface, and includes an illumination system having a plurality of light sources, an optical system, and a set of micro-electro-mechanical system (MEMS) mirrors. The imaging system is configured to selectively capture an image of the surface, the image including light reflected by the surface that is indicative of the parameter of the feature. The control module is configured to: (i) generate data corresponding to the captured image; (ii) process the generated data to determine the parameter of the feature; and (iii) control the projection system to project light of a second pattern onto the surface, the light of the second pattern displaying the determined parameter of the feature to a user.
An apparatus for measuring a profile of a surface includes a housing, an imaging lens system within the housing, an image capture device within the housing, a set of micro-electro-mechanical system (MEMS) mirrors within the housing, and a control module within the housing. The imaging lens system is configured to focus light reflected from the surface using at least one lens, wherein the imaging lens system has a corresponding lens focal plane, and wherein the light reflected from the surface is indicative of the profile of the surface. The image capture device is configured to capture the focused light and to generate data corresponding to the captured light, wherein the image capture device has a corresponding image focal plane, and wherein the image focal plane and the lens focal plane are non-parallel. The set of MEMS mirrors is configured to direct the focused light to the image capture device. The control module is configured to receive the data corresponding to the captured light from the image capture device, determine a focus quality of the captured light based on the received data, and control the set of MEMS mirrors based on the focus quality to maintain a Scheimpflug tilt condition between the lens focal plane and the image focal plane.
Further areas of applicability of the present disclosure will become apparent from the detailed description, the claims, and the drawings. The detailed description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the disclosure.
Brief description of the drawings
The present disclosure will become more fully understood from the detailed description and the accompanying drawings, wherein:
FIG. 1 is a schematic diagram of a LIDAR scanning system according to the prior art;
FIG. 2 is a schematic diagram illustrating a Scheimpflug tilt condition between a lens plane and an imaging plane according to the present disclosure;
FIG. 3A is a schematic diagram of a first example structured light profile sensing system according to the present disclosure;
FIG. 3B is a functional block diagram of a second example structured light profile sensing system according to the present disclosure;
FIGS. 4A and 4B are schematic diagrams illustrating an example interferometer according to the prior art and an example interferometry measuring system of the structured light profile sensing system according to the present disclosure, respectively;
FIGS. 5A and 5B are schematic diagrams illustrating example methods for processing holes and slots, respectively, according to the present disclosure;
FIG. 6 is a functional block diagram of an example control module of the structured light profile sensing system according to the present disclosure;
FIG. 7 is a flow diagram of an example method for calibrating an angle of the structured light profile sensing system according to the present disclosure;
FIGS. 8A and 8B are flow diagrams of example methods for compensating for temperature variation of the structured light profile sensing system according to the present disclosure;
FIG. 9 is a flow diagram of an example method of operating the structured light profile sensing system according to the present disclosure;
FIG. 10 is a functional block diagram of an example structured light sensor system according to the present disclosure;
FIGS. 11A to 11C are timing diagrams of coordinated control of the projection system and the imaging system in the dot mode, the line mode, and the area mode, respectively, according to the present disclosure;
FIG. 12 is a flow diagram of an example method of coordinated control of the structured light sensor system operating according to each of the dot mode, the line mode, and the area mode according to the present disclosure;
FIG. 13 is a view of an example projection, by the structured light sensor system according to the present disclosure, of a parameter of a feature determined by the structured light sensor system;
FIG. 14 is a flow diagram of an example method of using the structured light sensor system to determine a parameter of a feature on a surface and to project the determined parameter onto the surface;
FIG. 15 is an illustration of an example apparatus having a housing that contains the structured light sensor system according to the present disclosure;
FIG. 16 is a block diagram of a system including a white light projection module and an adaptive light device module;
FIG. 17 is a flow diagram illustrating a method of acquiring data using the projection system and the adaptive light device;
FIG. 18 is a representation of a fringe pattern produced with a multifaceted mirror;
FIG. 19 is a representative laser spot pattern used together with the fringe data of FIG. 18;
FIG. 20 is a fringe pattern representative of an image having poor fringe resolution;
FIG. 21 is an image of a laser line generated in response to the fringe data of FIG. 20; and
FIG. 22 is a block diagram illustrating sensing and recalibration of the adaptive light device.
Detailed description
The following description is merely exemplary in nature and is in no way intended to limit the disclosure, its application, or uses. For purposes of clarity, the same reference numbers will be used in the drawings to identify similar elements. As used herein, the phrase "at least one of A, B, and C" should be construed to mean a logical (A or B or C) using a non-exclusive logical OR. It should be understood that steps within a method may be executed in a different order without altering the principles of the present disclosure.
As used herein, the term "module" may refer to, be part of, or include an application specific integrated circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory (shared, dedicated, or group) that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable components that provide the described functionality.
Traditional structured light sensing systems have a limited depth of field. In other words, traditional structured light sensing systems can sense only a limited range of surface contour due to the limited sensing depth. A Scheimpflug tilt condition may be implemented to increase the depth of field. The Scheimpflug tilt condition, however, has not previously been implemented in structured light sensing systems (i.e., metrology) due to the difficulty of focusing a linear array over the entire depth of field of the sensing system.
Referring now to FIG. 2, the Scheimpflug principle is illustrated. The Scheimpflug principle is a geometric rule that describes the orientation of the plane of sharp focus 66 of an optical system (lens 60 and imager 62) when the lens plane 61 and the image plane 63 are not parallel. The image plane 63 corresponds to a micro-electro-mechanical system (MEMS) mirror 64 that reflects light from the lens 60 to the imager 62.
In other words, a tilted tangent line extending from the image plane 63 and another tangent line extending from the lens plane 61 converge at a line through which the plane of sharp focus 66 also passes. Under this condition, a planar object that is not parallel to the image plane can be completely in focus. Accordingly, the MEMS mirror 64 may be adjusted to maintain a focused condition on the imager 62. For example, the MEMS mirror 64 may be adjusted to a different angle (represented by MEMS mirror 65) to accommodate a different plane of sharp focus 67.
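The Scheimpflug condition can be checked numerically with a two-dimensional paraxial thin-lens model: the images of points on a tilted object plane remain collinear, and the resulting image plane meets the object plane on the lens plane. A minimal sketch, with all names and numbers illustrative rather than taken from the patent:

```python
def image_of(point, f):
    """Paraxial image through an ideal thin lens in the plane x = 0.
    An object at (-u, h) with u > f images at (v, -h*v/u), 1/u + 1/v = 1/f."""
    x, h = point
    u = -x
    v = 1.0 / (1.0 / f - 1.0 / u)
    return (v, -h * v / u)

f = 1.0
# Object plane tilted relative to the lens plane: y = 1 + 0.2*x.
obj = [(-u, 1.0 + 0.2 * (-u)) for u in (2.0, 3.0, 4.0)]
(x1, y1), (x2, y2), (x3, y3) = [image_of(p, f) for p in obj]

# The images of the tilted object plane are collinear: one tilted image
# plane keeps the entire object plane in sharp focus...
cross = (x2 - x1) * (y3 - y1) - (y2 - y1) * (x3 - x1)

# ...and that image plane intersects the object plane on the lens plane
# x = 0 (both pass through (0, 1)): the Scheimpflug condition.
slope = (y2 - y1) / (x2 - x1)
y_at_lens = y1 - slope * x1
```

Steering the MEMS mirror effectively re-tilts the image plane, which is why a different mirror angle can accommodate a different plane of sharp focus.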
Accordingly, systems and methods for structured light profile sensing are proposed that incorporate a multi-pixel MEMS mirror array in the optical receive path to maintain the Scheimpflug tilt condition. The systems and methods of the present disclosure allow each line, as it is projected by the projector system, to be directed onto the imager in a focused condition in real time. In this manner, a smaller focal length imaging lens having a larger aperture may be used, thereby increasing the optical signal and allowing for more accurate metrology.
Additionally, traditional structured light sensing systems do not generate three-dimensional (3D) data for feature extraction and/or measurement of surface shape (form). In other words, traditional structured light sensing systems merely generate a two-dimensional (2D) pattern for comparison with the originally projected 2D pattern.
Accordingly, systems and methods for structured light profile sensing are proposed that incorporate 3D data generation, feature extraction, and/or shape measurement. The systems and methods of the present disclosure generate a 3D point cloud that can be used for feature extraction/tracking and/or shape measurement. In other words, the systems and methods of the present disclosure allow for more accurate metrology, particularly in the z-direction (i.e., contour depth). Furthermore, the 3D point cloud may be output to external software for additional modeling and/or processing.
Referring now to FIG. 3A, a first exemplary embodiment of a structured light profile sensing system 70 according to the present disclosure is shown. The structured light profile sensing system 70 determines a profile of a surface 88. The structured light profile sensing system 70 may include: a control module 72, an accelerometer 74, an illumination system 76, a first MEMS mirror system 86, a second MEMS mirror system 90, a focusing lens system 92, and an imager 94.
The illumination system 76 includes a first light source 78, a second light source 80, and a third light source 82. In one embodiment, the first light source 78, the second light source 80, and the third light source 82 are lasers. It should be appreciated, however, that other light sources may also be implemented. For example, the first light source 78, the second light source 80, and the third light source 82 may each generate light having a different wavelength. In one embodiment, these wavelengths may correspond to red, green, and blue. It can be appreciated, however, that other colors (i.e., different wavelength ranges) may also be implemented.
The first light source 78, the second light source 80, and the third light source 82 may be combined into one coaxial beam. The illumination system 76 may further include an optical system 84 that generates a pattern of light using the first light source 78, the second light source 80, and the third light source 82. For example, in one embodiment, the optical system 84 may include holographic diffraction elements, beam splitters, and/or prisms. It can be appreciated, however, that the optical system 84 may include other optical elements. The elements in the optical system 84 manipulate the light (offsetting, splitting, diffracting, etc.) to achieve a desired pattern of light.
Additionally, it can be appreciated that the structured light profile sensing system 70 may include an additional illumination system (not shown) and a phase shifting system (not shown) for performing interferometry on the surface 88. More specifically, the structured light profile sensing system 70 may switch between projecting patterned light for purposes of feature extraction and/or shape measurement and projecting fringe lines of light for flatness measurement (i.e., interferometry).
As shown, for purposes of feature extraction and/or shape measurement, the profile sensing system 70 projects light of a first pattern onto the surface 88 and then focuses and captures light of a second pattern reflected from the surface 88. The light of the second pattern is indicative of the profile of the surface 88. The profile sensing system 70 may then compare the captured, focused light of the second pattern to the light of the first pattern projected onto the surface. More specifically, the control module 72 may determine differences between the expected light of the first pattern and the focused light of the second pattern reflected from the surface 88. For example, the control module 72 may determine a phase difference between the light of the second pattern and the light of the first pattern. These differences correspond to features of the surface 88 and collectively define the profile of the surface.
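For fringe-type patterns, a standard small-height triangulation approximation relates the measured phase difference to surface height. The sketch below is generic fringe-projection math under that assumption, not the patent's algorithm; the function name, units, and values are illustrative:

```python
import math

def height_from_phase(delta_phi: float, period_mm: float,
                      theta_rad: float) -> float:
    """Approximate surface height from the phase difference between the
    projected pattern and the pattern observed on the surface, for a
    fringe pattern of the given period projected at angle theta relative
    to the camera axis (small-height triangulation approximation):
    h = delta_phi * period / (2 * pi * tan(theta))."""
    return delta_phi * period_mm / (2.0 * math.pi * math.tan(theta_rad))

# A half-cycle shift (pi) of a 2 mm pattern projected at 45 degrees
# corresponds to about 1 mm of height.
h = height_from_phase(math.pi, 2.0, math.radians(45.0))
```

Evaluating this per pixel over the captured image yields the kind of dense depth map from which a 3D point cloud can be assembled.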
These features may be output to an external system for additional processing, or may be stored and/or tracked by the control module 72. In other words, the control module 72 may continuously control the projection of the light of the first pattern onto the surface 88 based on the focused light of the second pattern reflected from the surface (i.e., feedback), to refine measurement of the particular features that collectively make up the profile of the surface 88. Alternatively, the control module 72 may project light of a third pattern that is different than the light of the first pattern. For example, the control module 72 may include a data store that stores data corresponding to a plurality of different patterns of light (i.e., calibration data).
Referring now to FIG. 3B, a second exemplary embodiment of a structured light profile sensing system 100 according to the present disclosure is shown. The system 100 determines a profile of a surface 102. The system 100 includes: a control module 104, a calibration sensor system 106, a projection system 108, and an imaging system 110.
The calibration sensor system 106 determines various calibration parameters, such as an orientation of the system 100, a global position of the system 100, and a temperature of the system 100. Sensing the orientation and global position of the system 100 may allow the control module 104 to reduce setup time of the system 100 and to improve setup accuracy in fixed installations. Additionally, sensing the temperature of the system 100 may allow the control module 104 to automatically compensate.
In one embodiment, the calibration sensor system 106 includes an accelerometer 112, a plurality of addressable infrared (IR) light emitting diodes (LEDs) 114, and a thermocouple 116. For example, the accelerometer 112 may be a solid-state accelerometer that provides the orientation of the system 100 by measuring the inclination of the system 100 with respect to two axes. The IR LEDs 114 may be placed at predetermined positions on the system 100 and thus may be used to determine and calibrate the position of the system 100 in an external coordinate space (i.e., a system that includes a plurality of different sensors). For example only, the IR LEDs 114 may allow position determination and calibration to be performed by commercial stereometric equipment. Additionally, for example, the thermocouple 116 may provide temperature information to allow the system 100 to compensate automatically.
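Two-axis inclination can be recovered from a static accelerometer's gravity components with standard trigonometry. This is a minimal, hypothetical sketch (unit-normalized readings assumed; the function name is not from the patent):

```python
import math

def tilt_angles(ax: float, ay: float, az: float):
    """Pitch and roll (radians) of a stationary sensor, computed from the
    gravity vector measured by a solid-state accelerometer whose readings
    are normalized so that 1.0 equals one g."""
    pitch = math.atan2(ax, math.hypot(ay, az))
    roll = math.atan2(ay, math.hypot(ax, az))
    return pitch, roll

# Flat and level: gravity lies entirely on the z axis, both tilts are zero.
pitch, roll = tilt_angles(0.0, 0.0, 1.0)
```

A control module could use these angles to pre-orient the projected pattern, shortening setup in a fixed installation.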
Based on commands from the control module 104 (i.e., depending on whether features are being extracted, shape is being measured, or interferometry is being performed), the projection system 108 projects light of a first pattern, or fringe lines of light, onto the surface 102. The projection system 108 may project individual beams of light onto the surface 102, or the projection system 108 may combine multiple beams into a coaxial beam for projection onto the surface 102. In one embodiment, the multiple beams are generated by lasers. The projection system 108 may also control the color, intensity, and pattern of the light of the first pattern projected onto the surface 102.
In one embodiment, the projection system 108 includes an illumination system 118, an interferometry system 120, and a steering MEMS mirror system 122. The illumination system 118 may be used to generate patterned light for projection onto the surface 102, for feature extraction and/or shape measurement by the control module 104. The interferometry system 120 may be used to perform interferometry on the surface 102. More specifically, the interferometry system 120 may be used to generate fringe lines of light on the surface 102 for determining the flatness of the surface.
For example, the illumination system 118 may further include a first light source (LS1) 124, a second light source (LS2) 126, and a third light source (LS3) 128. Alternatively, it can be appreciated that the illumination system 118 may include fewer or more light sources than shown (e.g., a single light source). Additionally, the light sources 124, 126, 128 may be combined into a single coaxial beam. For example, the light sources 124, 126, 128 may be amplitude modulated light sources, pulse frequency modulated light sources, and/or wavelength modulated light sources. Additionally, the light sources 124, 126, and 128 may be wavelength-dithered in real time to reduce speckle effects when projecting onto the surface 102.
For example, in one embodiment, LS1 124 may be a red laser, LS2 126 may be a green laser, and LS3 128 may be a blue laser. More specifically, the red laser 124 may generate a laser beam having a wavelength corresponding to red light (e.g., 600 to 690 nm). The green laser 126 may generate a laser beam having a wavelength corresponding to green light (e.g., 520 to 600 nm). The blue laser 128 may generate a laser beam having a wavelength corresponding to blue light (e.g., 450 to 520 nm). It can be appreciated, however, that the light sources 124, 126, 128 may generate light of different colors (i.e., different wavelength ranges).
Additionally, the illumination system 118 may include an optical system 130 for generating a pattern of light using the light sources 124, 126, 128. For example, the optical system 130 may generate the pattern using holographic diffraction elements, electro-optic elements, and/or beam splitters. Additionally, for example, the optical system 130 may include narrow band-pass filters, mirrors, and/or prisms.
In one embodiment, the single (e.g., coaxial) beam generated by the illumination system 118 may be a flying spot raster. In other words, the coaxial beam may include separate red, green, and blue components. As such, by controlling the intensities of the light sources 124, 126, 128 of the illumination system 118, the control module 104 may control the intensity and/or color of the coaxial beam. For example, the control module 104 may control the intensity and/or color of the coaxial beam according to the distance to the surface 102 or the color of the surface 102, respectively.
More specifically, in one embodiment, the control module 104 may control the color of the coaxial beam based on feedback to match the color of the surface 102. Adjusting the color of the projected light to match the color of the surface 102 may improve the accuracy (i.e., resolution) of the system. As such, the control module 104 may control the three light sources 124, 126, 128 to control the color of the coaxial beam. For example, to increase the level of red in the coaxial beam, the control module 104 may increase the intensity of the light source 124 (wherein the light source 124 generates light having a wavelength corresponding to red light). In this manner, the control module 104 may control the combined color of the coaxial beam projected onto the surface 102 based on feedback from the captured light (reflected by the surface 102).
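One plausible way to model this feedback is a simple proportional controller on the three source intensities. The sketch below is an assumption about how such a loop might look, not the patent's control law; names, gain, and normalization are illustrative:

```python
def adjust_sources(intensities, measured_rgb, target_rgb, gain=0.5):
    """One proportional-feedback step: nudge each laser's drive level so
    the color captured back from the surface moves toward the surface's
    own color. All values are normalized to [0, 1]."""
    out = []
    for drive, meas, target in zip(intensities, measured_rgb, target_rgb):
        drive += gain * (target - meas)
        out.append(min(1.0, max(0.0, drive)))  # clamp to valid drive range
    return out

# The surface is redder than the captured return, so the red laser's
# drive level is raised while green and blue are left unchanged.
new = adjust_sources([0.5, 0.5, 0.5], measured_rgb=(0.3, 0.5, 0.5),
                     target_rgb=(0.8, 0.5, 0.5))
```

Run once per captured frame, the loop converges toward a beam color matching the surface; the clamp keeps the drive within the sources' valid range.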
Although lighting system 118 can produce coaxial beam,, can recognize, lighting system 118 also can produce by the subset of the MEMS catoptron from turning to MEMS mirror system 122 each and all be projected to many light beams on surface 102.More particularly, in one embodiment, from the light beam of LS1124, can use and be projected to from the first group of MEMS catoptron that turns to MEMS mirror system 122 on surface 102.For example, can use and be projected to from the second group of MEMS catoptron that turns to MEMS mirror system 122 on surface 102 from the light beam of LS2126.In addition, for example, from the light beam of LS3128, can use and be projected to from the 3rd group of MEMS catoptron that turns to MEMS mirror system 122 on surface 102.
Perhaps, structured light profile sensing system 100 can be carried out by interferometer measuration system 120 interferometry of effects on surface 102.More particularly, lighting system 132(is different from lighting system 118) can produce the light beam that has carried out phase shift by Phase Shifting System 134, and initial light beam and dephased light beam can be projected on surface 102 by turning to MEMS mirror system 122.In one embodiment, lighting system 132 can comprise a single light source, thereby makes the light beam of two projections keep homophase (not comprising the skew of generation).
For example, in one embodiment, Phase Shifting System 134 can comprise a plurality of beam splitters and/or prism.
When system 100 is being carried out interferometry, the light beam that has two projections of minimum difference (for example, 10 nanometers) on phase place can be used as streak line and appears on surface 102.But the interval between these stripeds can increase along with the scrambling on surface 102.In other words, on flat surfaces, the light beam of projection there will be extremely narrow striped (or, there is no fringe spacing), and, on the surface of very coarse (irregular), the light beam of projection there will be extremely wide striped.
Referring now to Fig. 4 A and Fig. 4 B, two different interferometer measuration systems are shown.
Referring now to Fig. 4 A, traditional interferometer is shown.Light source 50 by light beam projecting to catoptron 51.Catoptron 51 for example passes through beam splitter 152(by beam reflection, prism).Beam splitter 152 is divided into two offset beam by this light beam.The first light beam is from reflecting away with the first surface 153 of beam splitter 152 at a distance of the first distance.The second light beam is from reflecting away with the second surface 154 of beam splitter 152 at a distance of second distance.Second distance is greater than the first distance, and this makes between two folded light beams and produces phase shift.Then, two light beams that are reflected all directed (by beam splitter 152) arrive receiver 155.For example, receiver 155 can be the surface shown corresponding to the candy strip of the phase differential between two folded light beams.
A conventional interferometer, however, is static (that is, fixed), and therefore produces a fringe pattern over only a small, select region of the receiver 155 (that is, the surface). Thus, in order to cover a large region (for example, greater than 100 millimeters by 100 millimeters), multiple light sources and multiple high-resolution cameras are required, which increases system size, complexity, and/or cost.
Referring now to FIG. 4B, an exemplary embodiment of the interferometry system 120 according to the present disclosure is shown in greater detail. A light source 160 projects a light beam onto a MEMS mirror 162. For example, the light source 160 may be the lighting system 132, and the MEMS mirror 162 may be the steering MEMS mirror system 122. The MEMS mirror 162 reflects the beam through a beam splitter 164. For example, the beam splitter 164 may be the phase shifting system 134.
The beam splitter 164 splits the beam into two, passing one beam through and reflecting the other beam off a plurality of surfaces, thereby producing a phase offset between the two beams. The two beams are then projected onto a surface 166. For example, the surface 166 may be the surface 102. The two beams may produce a fringe pattern based on the flatness of the surface 166. More particularly, a more irregular surface may exhibit wider spacing between fringes, whereas a flat surface may exhibit narrow spacing (or no spacing) between fringes.
Due to the precise control of the MEMS mirror 162, the interferometry system can achieve greater resolution than a conventional interferometer. For example only, the interferometry system 120 may have a resolution of 5 microns in the x and z directions. In addition, the interferometry system 120 may continuously adjust the mirror 162 to vary the coverage area of the projected fringe pattern on the surface 166. For example only, the fringe pattern may be steered in real time to cover a 200 millimeter by 200 millimeter region.
Referring again to FIG. 3B, as described above, the steering MEMS mirror system 122 projects one of the light beams (that is, the pattern or the fringes) generated by the lighting system 118 or the interferometry system 120 onto the surface 102. For example, the control module 104 may control the steering MEMS mirror system 122 to project the pattern or the fringes to a specific location on the surface 102.
In one embodiment, the control module 104 may control the optical system 130 to generate a pattern of one or more structured lines for projection onto the surface 102. More particularly, the control module 104 may control the number of structured lines, the width of the structured lines, the spacing between the structured lines, the angle of the structured lines, and/or the intensity of the structured lines. In addition, the control module 104 may control the optical system 130 to generate a pattern of one or more shapes for projection onto the surface 102. For example, the control module 104 may control the optical system 130 to generate a pattern of circles, concentric circles, rectangles, and/or other N-sided polygons (N≥3) for projection onto the surface 102.
The control module 104 may control the projected pattern based on the feature being measured. More particularly, referring now to FIG. 5A and FIG. 5B, two exemplary methods for controlling the pattern according to the present disclosure are shown.
Referring now to FIG. 5A, an exemplary method for handling a hole in a surface is shown. The structured light profile sensing system may project a plurality of lines rotated about the center of the hole. The reflected pattern may thus include a plurality of diametrically opposite points corresponding to the edge of the hole. Compared with using only vertical and/or horizontal lines according to the prior art, this method allows for more accurate feature extraction and/or dimension measurement.
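The rotated-line pattern of FIG. 5A can be sketched as follows. This is an illustrative helper only (the disclosure does not specify an algorithm or API): each generated line spans a full diameter of the hole, so its reflection yields a pair of diametrically opposite edge points.

```python
import math

def radial_line_endpoints(cx, cy, radius, n_lines):
    """Endpoints of n_lines lines rotated about a hole center (cx, cy).

    Hypothetical sketch: angles from 0 to 180 degrees cover every
    diameter once; each line crosses the hole edge at two opposite
    points, which the imaging side can then extract.
    """
    lines = []
    for i in range(n_lines):
        theta = math.pi * i / n_lines  # 0..180 deg covers all diameters
        dx = radius * math.cos(theta)
        dy = radius * math.sin(theta)
        lines.append(((cx - dx, cy - dy), (cx + dx, cy + dy)))
    return lines
```

A projection system could step a steering mirror through these endpoints; the specific scan strategy is left open by the source.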
Referring now to FIG. 5B, an exemplary method for handling a slot in a surface is shown. The structured light profile sensing system may project a plurality of horizontal lines and a vertical line along the dimensions of the slot. The reflected pattern may thus include a plurality of points representing the edge of the slot. Depending on manufacturing tolerances, however, some slots may appear more like holes, in which case they may also be handled according to the method of FIG. 5A.
Referring again to FIG. 3B, the imaging system 110 receives the light of the second pattern or the light of the fringe lines reflected from the surface 102, and captures the received light for profile sensing of the surface 102 by the control module 104. Due to the profile of the surface 102, the received light may differ from the projected light. For example, the surface 102 may include a plurality of features of varying depth. The control module 104 may, for example, determine range from the surface 102 based on a phase difference between the received light and the projected light. More particularly, the imaging system 110 may receive the reflected light, tilt the reflected light, and/or focus the reflected light. The imaging system 110 may then capture the received light and transmit corresponding data to the control module 104 for processing.
In one embodiment, the imaging system 110 includes a receiving MEMS mirror system 140, a focusing lens system 142, and an image capture module 144. The receiving MEMS mirror system 140 receives the reflected light from the surface 102 and directs the received light to the focusing lens system 142. The focusing lens system 142 may include one or more lenses. For example, the control module 104 may control the receiving MEMS mirror system 140 and the focusing lens system 142 to provide sophisticated sensor pointing alignment.
The receiving MEMS mirror system 140 may also tilt the focused light onto the image capture module 144 to maximize focus by maintaining the Scheimpflug tilt condition. Thus, for example, in one embodiment, a subset of the receiving MEMS mirror system 140 may direct the received light to the focusing lens system 142, while a different subset of the receiving MEMS mirror system 140 may tilt the focused light onto the image capture module 144. Alternatively, for example, it can be appreciated that two distinct MEMS mirror systems may be implemented.
The control module 104 controls the receiving MEMS mirror system 140 and the focusing lens system 142 to achieve a precision that can accommodate the optical and image-processing capabilities of future technologies. More particularly, the control module 104 may control the receiving MEMS mirror system 140 and the focusing lens system 142 to produce the Scheimpflug imaging condition by tilting the focused light onto the image capture module 144.
In other words, the control module 104 may control the receiving MEMS mirror system 140 and the focusing lens system 142 to increase the field of view (FOV) of the image capture module 144. The control module 104 may control the receiving MEMS mirror system 140 and the focusing lens system 142 to increase the depth of field of the image capture module 144. In addition, the control module 104 may control the receiving MEMS mirror system 140 and the focusing lens system 142 to maintain the focus condition by controlling the optical path length between the focusing lens system 142 and the image capture module 144.
Accordingly, the image capture module 144 receives the light reflected from the surface 102 via the receiving MEMS mirror system 140 and the focusing lens system 142 (that is, after focusing and/or tilting). Although one image capture module 144 is shown, it can be appreciated that a plurality of image capture modules 144 may be implemented. For example, each of the plurality of image capture modules 144 may receive a portion of the reflected light corresponding to a sub-region of the surface 102.
The image capture module 144 converts the focused light into data (for example, electrical data). In one embodiment, the image capture module 144 is a charge-coupled device (CCD) imager. In another embodiment, the image capture module 144 is a CMOS (complementary metal-oxide-semiconductor) imager. For example, a CCD imager may achieve higher resolution than a CMOS imager, while a CMOS imager may use less power than a CCD imager.
The image capture module 144 transmits the data to the control module 104 for focus adjustment and/or for processing to determine the profile of the surface 102. The control module 104 may determine the focus quality of the captured light by evaluating the laser line profile captured by the imager. For a Gaussian profile, focus is improved by maximizing the peak value and minimizing the width of the laser line. Importantly, the dynamic range of the imager keeps the image from saturating. Based on the focus quality, the control module 104 controls the receiving MEMS mirror system 140 (or subsets thereof) to maintain the Scheimpflug tilt condition. This process may repeat continuously in real time to maximize the focus quality and thereby maintain the Scheimpflug tilt condition.
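The focus-quality evaluation described above can be sketched with a simple metric. This is an assumed formulation, not the disclosure's own algorithm: following the stated goal of maximizing the peak and minimizing the width of the laser line, the metric divides the peak height of a line cross-section by a crude full-width-at-half-maximum measured in samples.

```python
def focus_quality(profile):
    """Focus metric for a captured laser-line cross-section.

    `profile` is a list of intensity samples across the line. A sharper
    focus yields a taller, narrower Gaussian-like peak, so peak height
    divided by an effective width rises as focus improves.
    (Illustrative metric; the exact computation is not specified.)
    """
    peak = max(profile)
    half = peak / 2.0
    # Crude FWHM: number of samples at or above half the peak.
    width = sum(1 for v in profile if v >= half)
    return peak / max(width, 1)
```

A feedback loop could adjust the receiving mirror tilt to maximize this value, repeating each frame.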
In addition, the control module 104 may transform, extract, and/or track features of the surface 102. The control module 104 may also output data to surfacing and/or inspection software for modeling and/or further processing. Furthermore, the control module 104 may adjust the projection system 108 and/or the imaging system 110 based on the extracted 3D features. In other words, for example, the control module 104 may adjust how the pattern light or the fringe light is projected onto the surface 102 for more accurate profile sensing.
Referring now to FIG. 6, an exemplary embodiment of the control module 104 is shown. The control module 104 may include a 2D processing module 170, a 2D extraction/segmentation module 172, a coordinate transformation module 174, a 3D feature extraction module 176, and a feature location tracking module 178.
The 2D processing module 170 receives data corresponding to the projected light and the reflected light. More particularly, the 2D processing module determines differences between the data from the image capture module 144 (the second image) and the data corresponding to the projected light (that is, the pattern light or the fringe lines). In one embodiment, the data corresponding to the projected light may be stored in a data memory within the control module 104.
The 2D extraction/segmentation module 172 receives the processed data from the 2D processing module 170. The 2D extraction/segmentation module 172 extracts features from the 2D data. In other words, the 2D extraction/segmentation module 172 segments the processed data into sub-regions corresponding to different features. For example, the sub-regions may correspond to data exceeding predetermined feature thresholds.
The coordinate transformation module 174 receives the segmented data corresponding to the extracted features. The coordinate transformation module 174 also receives sensor calibration data and mover/tool calibration data. For example, the sensor calibration data may be generated by the accelerometer 112. The mover/tool calibration data may be predetermined calibrations stored in the data memory. It can be appreciated, however, that in one embodiment the mover/tool calibration data may be input by a user.
The coordinate transformation module 174 transforms the coordinates of the 2D sub-regions into 3D coordinates corresponding to the different features. More particularly, the coordinate transformation module 174 determines the depth of particular coordinates (that is, due to the Scheimpflug tilt). For example, the coordinate transformation module 174 may generate a 3D point cloud corresponding to each 2D sub-region. In one embodiment, the 3D point clouds may be transmitted to external 3D surfacing software for modeling of the 3D coordinates.
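The 2D-to-3D step can be sketched as below. All names here are illustrative assumptions (the disclosure gives no concrete API): `depth_fn(u, v)` stands in for whatever depth recovery the Scheimpflug imaging geometry provides for pixel (u, v), and `scale` converts pixel coordinates to physical units.

```python
def to_point_cloud(region_pixels, depth_fn, scale=1.0):
    """Map a segmented 2D sub-region to a 3D point cloud.

    region_pixels : iterable of (u, v) pixel coordinates in one sub-region
    depth_fn      : callable returning the recovered depth at (u, v)
    scale         : pixels-to-mm factor (hypothetical calibration input)
    """
    return [(u * scale, v * scale, depth_fn(u, v)) for (u, v) in region_pixels]
```

Each sub-region from the segmentation stage would be passed through this mapping to yield one point cloud per feature.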
The 3D feature extraction module 176 receives the 3D point clouds. The 3D feature extraction module 176 extracts features from the 3D data. More particularly, the 3D feature extraction module 176 may determine which features exceed predetermined thresholds (for example, a degree of surface curvature), and may therefore extract the out-of-tolerance features. The features extracted in 3D may differ from the features extracted in 2D. In other words, some 2D-extracted features may be discarded after conversion into 3D-extracted features. In one embodiment, the extracted 3D features may be transmitted to visual inspection software for further computation and/or for confirmation of the out-of-tolerance measurements of the extracted 3D features.
The feature location tracking module 178 receives the extracted 3D features. The feature location tracking module 178 stores the extracted 3D features in the data memory. The feature location tracking module 178 may also adjust the steering MEMS mirror system 122 and/or the receiving MEMS mirror system 140 based on the extracted 3D features. In other words, the feature location tracking module 178 may adjust the system (for example, via feedback) for more accurate profile sensing of one or more of the extracted 3D features. When the feature location tracking module 178 adjusts the steering MEMS mirror system 122 and/or the receiving MEMS mirror system 140, however, the change in mirror angle is communicated to the coordinate transformation module 174 for future coordinate transformation operations.
Referring now to FIG. 7, an exemplary method for correcting the angles of a profile sensing system according to the present disclosure begins at step 200.
In step 202, the system determines whether data corresponding to locations in the reflected light are equal to data corresponding to locations in the projected light. If true, control returns to step 202 (that is, no calibration is needed). If false, control proceeds to step 204.
In step 204, the system measures movement using the accelerometer. For example, the system uses the accelerometer to determine the effect of gravity on the system, such as tilt about the x axis and/or the y axis. In step 206, the system adjusts the steering MEMS mirror 122 and the receiving MEMS mirror 140 to compensate for the determined external effects on the system. Control may then return to step 202.
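The tilt determination in step 204 can be sketched with a common static-accelerometer formulation. This is an assumption layered on the disclosure, which states only that gravity's effect (x- and/or y-axis tilt) is determined: with the sensor at rest, the accelerometer reads the gravity vector, and the tilt angles follow from its components.

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate tilt about the x and y axes from a static reading.

    (ax, ay, az) is the accelerometer output in g units. Standard
    roll/pitch-from-gravity equations; the patent does not give these.
    """
    roll = math.atan2(ay, az)                    # tilt about x axis
    pitch = math.atan2(-ax, math.hypot(ay, az))  # tilt about y axis
    return roll, pitch
```

The resulting angles could drive the step-206 compensation of the steering and receiving mirrors.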
Referring now to FIGS. 8A-8B, exemplary methods for compensating for temperature variations of a profile sensing system according to the present disclosure are shown.
Referring now to FIG. 8A, an exemplary method for adjusting projection due to temperature variation begins at step 250. In step 252, the system measures the temperature of the projection system. For example, the temperature may be generated by the thermocouple 116.
In step 254, the system determines whether the measured temperature differs from a calibration temperature. For example, the calibration temperature may be one of a plurality of predetermined temperatures stored in the data memory. If true, control may proceed to step 256. If false, control may return to step 252.
In step 256, the system may adjust the steering MEMS mirror 122 based on the measured temperature. For example, the system may adjust the steering MEMS mirror 122 based on a predetermined relationship (a function g) between MEMS mirror position (y) and temperature (T) (for example, y=g(T)). In one embodiment, the function (g) may comprise a plurality of mirror positions (y) and a plurality of corresponding temperatures (T) stored in the data memory. Control may then return to step 252.
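The stored relationship y=g(T) can be sketched as a lookup table of (T, y) pairs, as the description's stored positions and corresponding temperatures suggest. Linear interpolation between entries and clamping at the table ends are implementation assumptions not stated in the source.

```python
def mirror_position(temp, table):
    """Evaluate y = g(T) from a stored calibration table.

    table : list of (temperature, position) pairs sorted by temperature.
    Values between entries are linearly interpolated; values outside
    the table are clamped to the nearest entry (assumed behavior).
    """
    if temp <= table[0][0]:
        return table[0][1]
    if temp >= table[-1][0]:
        return table[-1][1]
    for (t0, y0), (t1, y1) in zip(table, table[1:]):
        if t0 <= temp <= t1:
            return y0 + (y1 - y0) * (temp - t0) / (t1 - t0)
```

The same scheme would serve the receiving-mirror relationship x=f(T) of FIG. 8B with a second table.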
Referring now to FIG. 8B, an exemplary method for adjusting imaging (reception) due to temperature variation begins at step 260. In step 262, the system measures the temperature of the imaging system. For example, the temperature may be generated by the thermocouple 116.
In step 264, the system determines whether the measured temperature differs from a calibration temperature. For example, the calibration temperature may be one of a plurality of predetermined temperatures stored in the data memory. If true, control may proceed to step 266. If false, control may return to step 262.
In step 266, the system may adjust the receiving MEMS mirror 140 based on the measured temperature. For example, the system may adjust the receiving MEMS mirror 140 based on a predetermined relationship (a function f) between MEMS mirror position (x) and temperature (T) (for example, x=f(T)). In one embodiment, the function (f) may comprise a plurality of mirror positions (x) and a plurality of corresponding temperatures (T) stored in the data memory. Control may then return to step 262.
Referring now to FIG. 9, an exemplary method for operating a structured light profile sensing system according to the present disclosure begins at step 300. In step 302, the system determines whether feature extraction and/or dimension measurement is to be performed or whether interferometry is to be performed. If feature extraction and/or dimension measurement is to be performed, control may proceed to step 304. If interferometry is to be performed, control may proceed to step 314.
In step 304, the system performs calibration based on sensor feedback and/or on previously extracted features or previously measured dimensions (that is, from a prior cycle). For example, the system may calibrate the color and/or intensity of the projected beam, the positioning of the projection or imaging MEMS mirrors, and so on.
In step 306, the system generates the light of the first pattern and projects it onto the surface for profile sensing. More particularly, the system may generate a beam of a particular color and/or intensity, and may project a pattern comprising one or more lines and/or one or more shapes onto the surface.
In step 308, the system receives the light reflected from the surface and directs the reflected light for capture. More particularly, the system receives the reflected light, directs the reflected light, and tilts the reflected light for the imager in order to maximize the focus of the reflected light (that is, the Scheimpflug tilt).
In step 310, the system captures the focused light for processing purposes. For example, the focused light may be captured by a CCD camera or a CMOS camera.
In step 312, the system processes the data corresponding to the focused light for feature extraction and/or dimension measurement of the surface. In addition, the system may store the extracted features or measured dimensions in the data memory, and/or may output data corresponding to the extracted features for external modeling and/or other processing. Control may then return to step 302.
In step 314, the system performs calibration based on sensor feedback and/or on previously extracted features or previously measured dimensions (that is, from a prior cycle). For example, the system may calibrate the color and/or intensity of the projected beam, the positioning of the projection or imaging MEMS mirrors, and so on.
In step 316, the system generates a light beam. For example, the system may generate the beam using a lighting system different from the lighting system used for feature extraction and/or dimension measurement.
In step 318, the system splits and offsets the beam, thereby producing two beams having a small phase difference (for example, 10 nm). For example, the beam may be split and offset using a plurality of beam splitters and/or prisms.
In step 320, the system captures the light of the fringe lines reflected from the surface. In step 322, the system measures the spacing between fringes and determines the flatness of the surface based on the spacing. For example, a flatter surface may exhibit smaller spacing between fringes. Control may then return to step 302.
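The spacing measurement of step 322 can be sketched as a mean gap over detected fringe centers. The input `peaks` (fringe-center positions along a scan line) and the use of the mean gap as the flatness indicator are illustrative assumptions; the disclosure states only that flatter surfaces show smaller inter-fringe spacing.

```python
def mean_fringe_spacing(peaks):
    """Mean spacing between fringe peak positions along a scan line.

    peaks : sorted positions (e.g., pixel columns) of detected fringe
    centers. Fewer than two fringes gives no measurable spacing.
    """
    if len(peaks) < 2:
        return 0.0
    gaps = [b - a for a, b in zip(peaks, peaks[1:])]
    return sum(gaps) / len(gaps)
```

A smaller returned value would then indicate a flatter surface under the relationship stated in the text.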
According to a further feature of the present disclosure, a structured light sensing system configured for multi-mode operation is proposed. More particularly, the structured light sensing system is configured to operate in each of (i) a point mode, (ii) a line mode, and (iii) an area mode. The point mode refers to the projection and imaging of a point of light. The line mode refers to the projection and imaging of a first plurality of points of light, the first plurality of points forming a line. The area mode refers to the projection and imaging of a second plurality of points, the second plurality of points forming a plurality of lines that collectively form an area. The structured light sensing system generally includes a projection system, an imaging system, and a control module. The structured light sensing system, however, may include other suitable components.
By coordinating the operation of the projection system and the imaging system, the projection system is configured to project (i) a point of light or (ii) a plurality of points of light forming one or more lines onto the surface. For example, the control module may control the on-time of the imaging system (that is, how long an image is being captured) relative to the projection system based on the desired operating mode. Although described as coordinated control of the projection system and the imaging system by the control module, it can be appreciated that control circuitry may instead be implemented in the projection system and/or the imaging system, rather than in the control module, to achieve similar coordinated control.
Referring now to FIG. 10, an example structured light sensing system 500 is shown. As described above, the structured light sensing system 500 is configured to operate in each of (i) the point mode, (ii) the line mode, and (iii) the area mode. More particularly, the control module 510 may perform coordinated control of the projection system 520 and the imaging system 530 based on the desired operating mode (point, line, or area). The desired operating mode may be input or selected by a user 540.
In addition, more than one imaging system 530 may be implemented. More particularly, two or more imaging systems 530 may be implemented to increase the field of view (FOV) of the system 500. For example, two imaging systems 530 may be implemented, each having a FOV partially overlapping the other, thereby increasing the overall scan width in the X direction, as disclosed in commonly assigned U.S. patent application Ser. No. 12/943,344, filed Nov. 10, 2010, the entire contents of which are incorporated herein by reference.
The projection system 520 may include one or more light sources and a MEMS mirror system. The one or more light sources generally produce a light beam that is redirected by the MEMS mirror system onto or near a feature 550 on a surface 560. The projection system 520 may also include an optical system for manipulating the beam. During an exposure cycle of the imaging system, the light from the projection system 520 is preferably scanned at a very high frequency. For example, the projection system 520 may generate light pulses at a predetermined frequency.
The scan speed of the projection system is quite fast (for example, 100 times faster) compared with the exposure cycle, such as the shutter speed, of the imaging system. By using the combination of scan speed and shutter speed, the system 500 can obtain point data, line data, multi-line data, or area illumination. For example, the quality of the area illumination is similar to that obtained from a flood illumination source such as an LED. The advantage of this illumination approach is that it uses the ability of a calibrated MEMS device to execute a repeatable, precise path. Thus, each position of the MEMS mirror system during the scan is precisely known by and/or reported to the relevant control module.
The control module 510 controls the projection system 520 to project (i) a point of light or (ii) a plurality of points of light, the plurality of points forming one or more lines of light. In other words, the projection system may project (i) a point of light (the point mode), (ii) a first plurality of points of light forming a line (the line mode), or (iii) a second plurality of points of light forming a plurality of lines (the area mode). In some implementations, the plurality of lines (the area mode) may include a combination of one or more horizontal lines and one or more vertical lines. Alternatively, the plurality of lines may each have the same orientation.
The control module 510 may command one of these three projections based on the desired operating mode. The control module 510, however, may also selectively control the imaging system 530 based on the desired operating mode. For example, the control module 510 may control the on-time of the imaging system 530 based on the desired operating mode (or the operating mode commanded for the projection system 520) (that is, point, line, or multi-line (area)).
In the point mode, the control module 510 commands the projection system 520 to project the point of light onto the feature 550 on the surface 560. The point mode may be provided for maximum intensity while reducing background. For example only, the point mode may be suitable for dark, glossy surfaces such as composites, black paint, and the like. FIG. 11A illustrates an example of coordinated control of the projection system 520 and the imaging system 530 during the point mode.
As shown, the control module 510 enables the imaging system 530 (high state) during the period in which the projection system 520 (also high state) performs the point projection. In other words, the on-time of the imaging system 530 may be approximately one projection period (or pulse), referred to hereinafter as the first projection period. The on-time of the imaging system 530, however, may also be slightly longer than the first projection period so as to overlap both the start and the end of the projection (as shown).
In the line mode, the control module 510 commands the projection system 520 to project the first plurality of points of light (that is, a plurality of projection pulses) forming a line of light. The first plurality of points extends across the feature 550 on the surface 560. The line mode may be provided for 3D imaging of features at maximum resolution. For example, when combined with effective sub-pixel algorithms, the imaging resolution of the system 500 may be improved, that is, an imaging system 530 of finer resolution may be realized. FIG. 11B illustrates an example of coordinated control of the projection system 520 and the imaging system 530 during the line mode.
As shown, the control module 510 enables the imaging system 530 during the line projection period (that is, the period for projecting the two or more points of light forming the line). In other words, the on-time of the imaging system 530 may be approximately two or more point projection periods (or pulses). The on-time of the imaging system 530, however, may also be longer than the line projection period of the projection system 520 (the two or more point projection periods) so as to overlap both the start and the end of the projection (as shown).
In the area mode, the control module 510 commands the projection system 520 to project the second plurality of points forming a plurality of lines of light onto the feature 550. As mentioned before, in some implementations the plurality of lines of light may include at least one horizontal line and at least one vertical line of light (for example, orthogonal lines). Alternatively, each of the plurality of lines of light may have the same orientation (for example, parallel lines). Other quantities and angular configurations of lines, however, may also be projected. The area mode may be provided for imaging a large flat region in a single exposure of the imaging system 530. FIG. 11C illustrates an example of coordinated control of the projection system 520 and the imaging system 530 during the area mode.
As shown, the control module 510 enables the imaging system 530 during the area projection period (that is, the period for projecting the four or more points of light forming the two or more lines of light). In other words, the on-time of the imaging system 530 may be approximately four or more point projection periods (or pulses). As previously mentioned, however, the enable period of the imaging system 530 may also be longer than the area projection period of the projection system 520 (the four or more point projection periods) so as to overlap both the start and the end of the projection (as shown).
In addition, although the line projection period and the area projection period are defined with respect to the point projection period, it can be appreciated that the relative length of each of these projection periods may differ depending on the duty cycle or pulse frequency of the projection system 520 during a given operating mode. In other words, the imaging system 530 may have on-times of a first period, a second period, and a third period for the point mode, the line mode, and the area mode, respectively.
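The mode-dependent on-times described above can be sketched as follows. The pulse counts (1 for point, 2 for line, 4 for area) and the 10% overlap margin are illustrative choices; the disclosure states only that the exposure spans the mode's projection period and may be slightly longer so as to overlap its start and end.

```python
def exposure_time(mode, point_period, margin=0.1):
    """Imager on-time for a given operating mode.

    mode         : "point", "line", or "area"
    point_period : duration of one point projection pulse (seconds)
    margin       : fractional extension so the exposure overlaps both
                   the start and end of projection (assumed value)
    """
    pulses = {"point": 1, "line": 2, "area": 4}[mode]
    return pulses * point_period * (1.0 + margin)
```

In practice the pulse counts would follow from the duty cycle of the projection system for the commanded mode, as the text notes.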
Referring now to FIG. 12, an exemplary method for coordinating a projection system and an imaging system to operate in each of (i) a point mode, (ii) a line mode, and (iii) an area mode begins at 600. At 600, the control module 510 determines the desired operating mode. As described above, the desired operating mode may be input or selected by the user. If the point mode is desired, control may proceed to 604. If the line mode is desired, control may proceed to 608. If the area mode is desired, control may proceed to 612.
At 604, the control module 510 may (i) command the projection system 520 to project a point onto the feature 550 on the surface 560, and (ii) command the imaging system 530 to open for approximately (or slightly longer than) the point projection period. Control may then proceed to 616.
At 608, the control module 510 may (i) command the projection system 520 to project the first plurality of points forming a line onto the surface 560, extending across the feature 550, and (ii) command the imaging system 530 to open for approximately (or slightly longer than) the line projection period (the two or more point projection periods). Control may then proceed to 616.
At 612, the control module 510 may (i) command the projection system 520 to project the second plurality of points forming a plurality of lines of light onto the feature 550 or onto the surface near the feature 550, and (ii) command the imaging system 530 to open for approximately (or slightly longer than) the area projection period (the four or more point projection periods). Control may then proceed to 616.
At 616, the control module 510 may determine a parameter of the feature 550 based on the data collected by the imaging system 530. The parameter may include a dimension or another suitable parameter related to the size of the feature. For example only, other suitable parameters may include area, depth, or volume. Control may then return to 600.
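One concrete parameter determination, consistent with the rotated-line method of FIG. 5A, is estimating a hole diameter from reflected edge points. The pairing convention below (point i opposite point i + n/2) is an assumption for illustration; the disclosure does not specify how the edge points are ordered.

```python
import math

def hole_diameter(edge_points):
    """Estimate a hole diameter from diametrically opposite edge points.

    edge_points : list of (x, y) points on the hole edge, assumed
    ordered so that edge_points[i] and edge_points[i + n//2] lie on
    opposite ends of the same projected line. The pairwise distances
    are averaged into one diameter estimate.
    """
    n = len(edge_points)
    half = n // 2
    dists = [
        math.dist(edge_points[i], edge_points[i + half])
        for i in range(half)
    ]
    return sum(dists) / len(dists)
```

Averaging over several diameters reduces sensitivity to noise in any single reflected line, which is one motivation the text gives for the multi-line pattern.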
According to a further feature of the present disclosure, a structured light sensing system is proposed that is configured to determine a parameter of a feature and to project the parameter of the feature onto a surface. More particularly, the structured light sensing system is configured to determine the parameter of a feature on a surface and then to project the dimension onto the surface. The structured light sensing system may determine the dimension of the feature according to the methods previously described herein. The parameter may include, but is not limited to, a dimension, flatness, area, or volume. For example, the dimension may be a 2D measurement such as a width, height, depth, radius, diameter, perimeter, and the like.
Referring again to FIG. 10, as previously described, the structured light sensing system 500 may first project the light of the first pattern onto the feature 550 on the surface 560. The imaging system 530 may then capture an image of the surface 560, the captured image including the light reflected by the surface 560. The control module 510 may then generate data based on the captured image, that is, digitize the captured image, and then determine the parameter of the feature 550 based on the generated data.
After the parameter of feature 550 has been determined, structured light sensing system 500 can then project light of a second pattern onto surface 560. Rather than being projected for measurement purposes, however, the light of the second pattern can be controlled to display the determined parameter on surface 560. Control module 510 can control projection system 520 based on the determined parameter. In other words, control module 510 can control projection system 520 to manipulate the projected light of the second pattern into a readable display of the determined parameter. For example, the light of the second pattern can include numbers and/or units of measure.
After the determined parameter is projected onto surface 560, a user of structured light sensing system 500 can then easily read the determined parameter from surface 560. Using the same structured light sensing system 500 to communicate the determined parameter to the user reduces the cost associated with other means of communicating with the user, such as a display or a computer. In addition, projecting the determined parameter of feature 550 can be a faster way of communicating the measurement to the user, because the user can remain focused on the same general location (e.g., without looking back and forth between feature 550 and a measurement on an external display or computer).
Figure 13 illustrates an exemplary diagram 700 of a projection 710 of a determined parameter of a feature 720 onto a surface 730. As shown, the projected parameter 710 ("2.54cm") represents the diameter of hole/aperture feature 720. As described above, however, projected parameter 710 can include other combinations of numbers and/or units of measure. While projected parameter 710 is illustrated as being positioned above feature 720, projected parameter 710 can also be positioned at other suitable locations on surface 730.
Specifically, projected parameter 710 can alternatively be positioned to the left of, to the right of, or below feature 720, and can be closer to or further from feature 720. For example, control module 510 can control projection system 520 to project the determined parameter at or within a predefined distance from feature 720. The predefined distance can be predetermined and stored in memory, or input by a user. In addition, the position relative to feature 720 can also be predetermined and stored in memory, or input by a user.
Referring now to Figure 14, an exemplary method for determining and projecting a dimension of a feature on a surface begins at 800. At 800, control module 510 controls projection system 520 to project light of a first pattern onto feature 550 on surface 560 or near it. At 804, control module 510 controls imaging system 530 to capture an image of surface 560, the captured image including light reflected by surface 560. At 808, control module 510 generates data based on the captured image. At 812, control module 510 determines a parameter of feature 550 based on the generated data. At 816, control module 510 controls projection system 520 to project light of a second pattern onto surface 560, the light of the second pattern displaying the determined parameter to the user. Control can then end or return to 800 for additional cycles.
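The 800-816 loop above can be sketched in code. This is a minimal sketch only: the function names, the string formatting, and the stand-in capture/measure callables are assumptions for illustration, not anything specified by the disclosure (which only requires that the second pattern display numbers and/or units of measure, as in Figure 13).

```python
def readout_text(value, unit="cm", decimals=2):
    # Compose the readable string traced by the second-pattern light,
    # e.g. "2.54cm" as in Figure 13. Formatting rules are illustrative.
    return f"{value:.{decimals}f}{unit}"

def measure_and_display(capture, compute_parameter, project_text):
    # Sketch of the 800-816 loop: capture under the first pattern,
    # derive the parameter, then reuse the projector to display it.
    image = capture()                       # 800-804: project + capture
    parameter = compute_parameter(image)    # 808-812: digitize, measure
    project_text(readout_text(parameter))   # 816: second pattern shows result
    return parameter

# Stand-ins for the real capture and measurement steps.
shown = []
value = measure_and_display(lambda: "captured image stand-in",
                            lambda img: 2.54,
                            shown.append)
```

The callables make the sketch testable without hardware; in the described system they would wrap imaging system 530 and projection system 520.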
According to a further feature of the present disclosure, an apparatus for measuring the profile of a surface is proposed. The apparatus includes a housing and a plurality of components of a structured light sensing system disposed within the housing. For example only, the components within the housing can include the system 100 of Fig. 3B. Generally, however, the components disposed within the housing include the following: an imaging lens system, an image capture device, a first set of MEMS mirrors, and a control module.
The imaging lens system is configured to focus light reflected from the surface using at least one lens. The imaging lens system has a corresponding lens focal plane. The light reflected from the surface is indicative of the profile of the surface. The image capture device is configured to capture the focused light and generate data corresponding to the captured light. The image capture device has a corresponding image focal plane that is non-parallel to the lens focal plane.
The first set of MEMS mirrors is configured to direct the focused light to the image capture device. The control module is configured to (i) receive the data corresponding to the captured light from the image capture device, (ii) determine a focus quality of the captured light based on the received data, and (iii) control the set of MEMS mirrors based on the focus quality to maintain the Scheimpflug condition between the lens focal plane and the image focal plane.
Referring now to Figure 15, an exemplary apparatus 900 is shown that includes a housing 904 and the components of a structured light sensing system disposed within the housing. Housing 904 can include any of the various embodiments of the structured light sensing systems previously described herein. In addition, housing 904 can include other suitable components, or can include more or fewer components than the embodiments previously described herein. Generally, however, as previously mentioned, housing 904 can include the following: an imaging lens system, an image capture device, a first set of MEMS mirrors, and a control module (all within housing 904, and therefore not shown).
As described above, apparatus 900 projects patterned light onto a feature 908 on a surface 912 or near it. For example, as shown, feature 908 can be a hole or similar aperture. Apparatus 900 can project the patterned light via a light projection system that includes one or more light sources, an optical system, and a first MEMS mirror system. Apparatus 900 can project the light onto surface 912 through a first opening 916 in apparatus 900. Apparatus 900 can then receive light reflected by surface 912 (indicative of feature 908) through a second opening 920.
The light received via the second opening can be redirected and captured by an imaging system. The imaging system can include a second MEMS mirror system, an imaging lens system, and an image capture device. The control module within housing 904 can (i) determine a focus quality of the captured light, and (ii) control the second set of MEMS mirrors based on the focus quality to maintain the Scheimpflug condition between the lens focal plane and the image focal plane. The control module can also process the data to determine the parameter(s) of feature 908.
Referring now to Figure 16, the system described with respect to Fig. 3B may be implemented to include a field projection system 109. Field projection system 109 can communicate with control module 104. Field projection system 109 can be a white light system. The white light system can produce a coded line pattern, such as a white light moiré fringe projection system. Accordingly, field projection system 109 can include a light source 160, such as a white light source. Light source 160 can be a halogen source, although various other types of white-light-emitting technology can also be used. Field projection system 109 can include a lens system 162 configured to produce a substantially uniform field with appropriate divergence for the surface 102 to be measured. In addition, the system can include a plurality of gratings 164, 166. The light field can be projected through a first grating 164 and a second grating 166 to create a beating pattern on the part that can be seen as moiré fringes. Further, a phase shifter 168 can be attached to one or more of the gratings 164, 166 to change the position or angle of the gratings relative to one another, thereby producing a phase shift of the moiré fringes projected onto surface 102. One or more images can be taken at each phase position. The brightness or gray-scale value at each pixel location can be analyzed for each phase of the moiré fringes imaged on the part. The larger the phase difference captured, the better the software can resolve changes across the system.
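The per-pixel phase analysis described above can be made concrete with a short sketch. Assuming the phase shifter 168 steps the fringes in four 90° increments (an assumption for illustration; the disclosure only requires one or more images per phase position), the wrapped phase at each pixel follows the standard four-step phase-shift formula:

```python
import numpy as np

def wrapped_phase(i0, i90, i180, i270):
    # Recover the wrapped fringe phase from four phase-shifted images,
    # assuming per-pixel intensity I_k = A + B*cos(phi + k*pi/2):
    #   i270 - i90  = 2B*sin(phi),  i0 - i180 = 2B*cos(phi)
    return np.arctan2(i270 - i90, i0 - i180)

# Synthetic moire fringes: known phase ramp, offset A, modulation B.
phi = np.linspace(-3.0, 3.0, 200)          # ground truth, within (-pi, pi)
A, B = 120.0, 60.0
frames = [A + B * np.cos(phi + k * np.pi / 2) for k in range(4)]
phi_est = wrapped_phase(*frames)
```

The recovered wrapped phase would then be unwrapped and converted to 3D depth data as described for the multi-phase acquisition at block 1712.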
In this configuration, imaging system 110 includes a focusing lens system 142 and an image capture module 144. Focusing lens system 142 receives light reflected from surface 102 and directs the received light to image capture module 144. Focusing lens system 142 can include one or more lenses.
In addition, the system can also include a second imaging system 111, which includes a lens system 150 and an image capture module 152. Lens system 150 can direct light reflected from surface 102 to image capture module 152, where an image of the pattern from field projection unit 109 or from adaptive optics 108 is captured. The adaptive optics can be the projection system 108 as described with respect to Fig. 3B, for example, a laser system with programmable MEMS mirrors. By using the first imaging system 110 and the second imaging system 111 together, the system can operate in a stereo mode, thereby providing better coverage of angular variations in surface 102 and allowing the system to better determine the distance from system 100 to surface 102 using triangulation based on the known viewing angles of the first imaging system 110 and the second imaging system 111.
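The stereo triangulation mentioned above, in which the known viewing angles of the two imaging systems fix the distance to surface 102, reduces geometrically to intersecting two viewing rays. A minimal sketch (the ray origins and directions are illustrative stand-ins for calibrated imager poses, not the patent's actual geometry):

```python
import numpy as np

def triangulate(o1, d1, o2, d2):
    # Midpoint of the shortest segment between two viewing rays,
    # each parameterized as origin o + t * unit direction d.
    d1 = np.asarray(d1, float) / np.linalg.norm(d1)
    d2 = np.asarray(d2, float) / np.linalg.norm(d2)
    o1, o2 = np.asarray(o1, float), np.asarray(o2, float)
    # Solve for t1, t2 minimizing |(o1 + t1*d1) - (o2 + t2*d2)|.
    a = np.array([[d1 @ d1, -d1 @ d2],
                  [d1 @ d2, -d2 @ d2]])
    b = np.array([(o2 - o1) @ d1, (o2 - o1) @ d2])
    t1, t2 = np.linalg.solve(a, b)
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

# Two imagers at known positions, both seeing the point (0, 0, 5).
p = np.array([0.0, 0.0, 5.0])
o1, o2 = np.array([-1.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
est = triangulate(o1, p - o1, o2, p - o2)
```

In practice the rays would come from back-projecting matched pixels of image capture modules 144 and 152 through their calibrated lens models; the midpoint formulation also tolerates slightly skew rays caused by noise.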
Referring now to Figure 17, a method of acquiring data using a field projection system and adaptive optics is shown. Method 1700 begins at block 1710. At block 1712, the field projection system illuminates surface 102 and performs an area acquisition. The system can be a white light moiré fringe projection system, such that the acquisition can be performed in a stereo mode by a single imager or multiple imagers. In addition, multiple images can be captured during the acquisition. For example, multiple images can be acquired, each capturing a different phase shift of the projected moiré fringes. 3D depth data can therefore be calculated from the images captured at the multiple phases. As denoted by block 1714, by analyzing the data, the system can identify regions with too little data caused by specular reflection. For example, the brightness or gray scale of each pixel can be analyzed, and a gray-scale value above a provided threshold can indicate that specular reflection is being received from the surface. Specular reflection will typically saturate the pixels of the imager, so little or no change in pixel brightness is seen as the phase of the moiré fringes changes. An example of specular reflection can be seen in Figure 18. The moiré fringes can include a plurality of alternating light fringes 1812, 1816 and dark fringes 1810, 1814, 1820. A specular reflection region 1830 can appear as a bright spot in the image. The bright spot within region 1830 can interrupt the fringe pattern and, because the pixels of the detector may be saturated or near saturation, can cause little or no change in pixel brightness as the phase of the fringe pattern changes.
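The specular test at block 1714 — gray value above a threshold, and little brightness change across the phase shifts — can be sketched per pixel over a stack of phase-shifted frames. The threshold values below are illustrative assumptions, not values from the disclosure:

```python
import numpy as np

def specular_mask(frames, saturation=250.0, max_swing=5.0):
    # Flag pixels that look specular: brightness at or above a saturation
    # threshold in every phase-shifted frame AND nearly constant across
    # the phase shifts (the fringe modulation is lost).
    stack = np.asarray(frames, dtype=float)      # shape (n_phases, H, W)
    bright = stack.min(axis=0) >= saturation
    flat = (stack.max(axis=0) - stack.min(axis=0)) <= max_swing
    return bright & flat

# Synthetic example: fringes everywhere except a saturated blob.
h, w = 32, 32
phi = np.tile(np.linspace(-3.0, 3.0, w), (h, 1))
frames = np.stack([100.0 + 80.0 * np.cos(phi + k * np.pi / 2)
                   for k in range(4)])
frames[:, 10:14, 10:14] = 255.0   # specular highlight: pinned at saturation
mask = specular_mask(frames)
```

The resulting mask marks the region analogous to region 1830, where the adaptive optics would then place laser spots (block 1718).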
Referring again to Figure 17, as denoted by block 1718, the system can program the adaptive optics to position a pattern of spots within the specular region. For example, as shown in Figure 19, the adaptive optics can be programmed to place a plurality of laser spots 1910 within specular region 1830. Laser spots 1910 can be imaged by one or more image capture modules of the system to determine the distance from the sensor to the surface of the part using various triangulation or interferometric techniques. Using small, high-intensity spots from the laser of the adaptive optics provides the image capture module with a much larger return than can be obtained from the field of the projection module. This is especially true in the case of a white light moiré fringe projector.
Referring again to Figure 17, as denoted by block 1716, the system can identify regions with too little usable data caused by poor fringe contrast or poor fringe resolution. In this case, the return light from the surface may be very low, for example where the part geometry does not allow proper transmission of the projected light field at an interior corner of the part. As a result, there may be little difference in brightness between pixels at one phase of the moiré fringes and at another phase. Alternatively, the depth of a region of the surface may change sharply, so that many fringe changes occur over a very small region. As such, the imager may not properly resolve the phase changes. An example can be seen in Figure 20.
Similar to Figure 18, Figure 20 includes white light moiré fringes. The white light moiré fringes include alternating light fringes 2010, 2014, 2018 and dark fringes 2012, 2016. The image includes a region 2030 in which the slope of the surface changes sharply. As a result, multiple fringe changes occur within a very small region of the image. These regions can therefore be identified by thresholding the difference between the brightness at each pixel location across the various phases of the moiré fringes.
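The contrast test at block 1716 can be expressed as a per-pixel fringe-modulation map. Assuming the same four 90° phase steps as before (an illustrative assumption), the modulation B falls out of the phase-shift algebra, and thresholding it flags low-contrast regions; the threshold value here is a stand-in, not from the disclosure:

```python
import numpy as np

def fringe_modulation(i0, i90, i180, i270):
    # Per-pixel fringe modulation B for four 90-degree phase steps.
    # With I_k = A + B*cos(phi + k*pi/2):
    #   B = 0.5 * sqrt((I270 - I90)^2 + (I0 - I180)^2)
    return 0.5 * np.hypot(i270 - i90, i0 - i180)

def low_contrast_mask(frames, min_modulation=10.0):
    # Flag pixels whose fringe contrast is too poor to trust the phase.
    return fringe_modulation(*frames) < min_modulation

phi = np.linspace(-3.0, 3.0, 100)
good = [100.0 + 50.0 * np.cos(phi + k * np.pi / 2) for k in range(4)]
bad = [100.0 + 2.0 * np.cos(phi + k * np.pi / 2) for k in range(4)]
```

Pixels flagged by this mask correspond to regions like 2030, which the adaptive optics then re-illuminates with laser lines at block 1720.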
As denoted by block 1720, the program can use the adaptive optics to place a structured pattern over the regions with poor fringe contrast or poor fringe resolution to acquire data in region 2030. The adaptive optics can use laser lines projected in a parallel line pattern. The line pattern can be used to acquire data in region 2030 by triangulation techniques. This technique can be better understood with respect to Figure 21.
Figure 21 illustrates a plurality of laser lines 2110 projected into region 2030 using the adaptive optics. Laser lines 2110 can be positioned such that each line is generally perpendicular to the local fringe pattern in region 2030, or, in other implementations, as shown, a series of parallel lines can be used, each generally perpendicular to the fringes. In addition, if additional resolution is needed, multiple line orientations can be used in successive images.
Referring again to Figure 17, as denoted by block 1722, the system can identify geometric features of the part, for example from the field projection data of the moiré fringe analysis. The geometric features can include holes, slots, corners, edges, and various other geometric features of the part. The analysis to determine geometric features can include moving a sphere or other sampling volume through the three-dimensional data cloud extracted from the field projection data and/or the adaptive optics data. The data within the sphere can be compared with a plurality of predefined templates to determine whether a geometric feature corresponding to a template has been identified on the surface.
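The moving-sphere sampling at block 1722 can be sketched as follows. The scalar "signature" used for template comparison here (RMS distance of the neighborhood from its centroid) is purely an illustrative stand-in — the disclosure does not specify what the predefined templates contain or how they are matched:

```python
import numpy as np

def points_in_sphere(cloud, center, radius):
    # Points of an (N, 3) cloud inside the moving sampling sphere.
    cloud = np.asarray(cloud, float)
    d = np.linalg.norm(cloud - np.asarray(center, float), axis=1)
    return cloud[d <= radius]

def matches_template(local, expected_signature, tol):
    # Crude stand-in for the predefined templates: compare one scalar
    # signature of the neighborhood (RMS distance from its centroid)
    # against the template's expected value.
    if len(local) == 0:
        return False
    sig = np.sqrt(((local - local.mean(axis=0)) ** 2).sum(axis=1).mean())
    return abs(sig - expected_signature) <= tol

# Flat 5x5 grid patch (z = 0) standing in for field-projection data.
xs, ys = np.meshgrid(np.arange(-2.0, 3.0), np.arange(-2.0, 3.0))
cloud = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(25)])
local = points_in_sphere(cloud, (0.0, 0.0, 0.0), 1.5)
```

A real implementation would sweep the sphere across the whole cloud and match each neighborhood against richer templates (hole, slot, corner, edge), but the sample-then-compare structure is the same.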
In addition, as denoted by block 1724, the adaptive optics can be controlled to generate a structured pattern based on the identified geometric features of surface 102. For example, a line pattern can be generated perpendicular to the edge of a slot, or perpendicular to an edge or corner formed in the part. These geometric features can be identified based on the area acquisition alone, or based on the area acquisition together with other acquisitions, such as those described with respect to blocks 1718 and 1720.
The techniques described for blocks 1710 to 1724, however, can be performed adaptively for a given part. Cycle time may be hampered by the complex analyses required and the multiple images used in the processing described above. As such, the steps of blocks 1710 to 1724 can be used as part of a teach mode of the system. Thus, as denoted by block 1726, an acquisition model can be generated. The acquisition model can include an area acquisition and one or more adaptive optics acquisitions, for example, acquisitions to gather data in specular regions, to gather data in regions of poor fringe contrast or resolution, and to gather data based on geometric features. Alternatively, the area acquisition can be used only in the teach mode, and the acquisition model used in the run-time environment can include only the adaptive optics acquisitions in order to reduce cycle time. In this case, one or more adaptive optics acquisitions including structured patterns based on geometric features can be used. At block 1728, the system can allow manual adjustment of the acquisition model. Manual adjustment can allow a user to graphically add, remove, or modify lines or spots of the pattern projected by the adaptive optics. At block 1730, the system can store the acquisition model such that the acquisition model can be retrieved for run-time execution. The method ends at block 1732. Further, it is understood that the system can store multiple acquisition models, and a specific acquisition model can be triggered based on a part identifier sensed by the system. The part identifier can be received via a bar code or other sensing mechanism, or provided to the system by a part tracker that tracks the part throughout the manufacturing process.
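The teach-then-run flow above — store an acquisition model per part identifier, retrieve it when the part is sensed — can be sketched as a small model store. All names and fields here are illustrative assumptions; the disclosure does not specify a storage format:

```python
from dataclasses import dataclass, field

@dataclass
class AcquisitionModel:
    # One stored acquisition plan: an optional area (field) acquisition
    # plus a list of adaptive-optics acquisitions (spot/line patterns).
    part_id: str
    use_area_acquisition: bool = True
    adaptive_steps: list = field(default_factory=list)

class ModelStore:
    # Teach-mode models are saved per part identifier (e.g. from a bar
    # code or a part tracker) and retrieved at run time.
    def __init__(self):
        self._models = {}

    def save(self, model):
        self._models[model.part_id] = model

    def for_part(self, part_id):
        return self._models[part_id]

store = ModelStore()
store.save(AcquisitionModel("bracket-A7",
                            use_area_acquisition=False,
                            adaptive_steps=["spots:specular-region",
                                            "lines:low-contrast-region"]))
model = store.for_part("bracket-A7")
```

Disabling the area acquisition at run time, as in this example, mirrors the cycle-time optimization described at block 1726.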
In addition to or instead of the area projection system, the system can employ the adaptive optics to scan the surface in a dot pattern and employ simultaneously acting stereo cameras to produce a coarse 3D surface of the scene. Using the coarse point cloud, the controller can generate a 3D surface. The system can then switch the adaptive optics to a line mode and capture lines projected onto the surface. The system can use triangulation to calculate data points on the surface. The 3D surface previously captured in the dot pattern can be used to resolve any depth degeneracy when acquiring data with the lines.
Statistical analysis can then be performed on the digitized scene to find quality-of-data improvements. This can take the form described above, in which a sphere or other volume is moved through the 3D point cloud data. Various statistics can be obtained for the points within each volume. The adaptive optics can be controlled to illuminate, for high-density data capture, localized regions where the data are sparse or, where appropriate, regions selected based on template matching of specific geometries. This can be done in the same manner as described above. For example, laser spots can be used for specular regions, and laser lines can be used for regions of low contrast or steep slope. In addition, laser patterns can be generated based on specific geometries identified, for example, by template matching. Further, a 3D registration algorithm can be used to minimize non-overlap problems, and a 3D surface generation algorithm can generate a 3D NURBS surface from the point cloud generated in either case. The 3D NURBS surface and/or the point cloud data can be output to third-party analysis or display software.
The device can be built with minimal supporting structure, so its size is suited to the hand-held market. In addition, the device can be offered in multiple models with near-IR, red, and blue lasers directed to the MEMS mirror unit by fiber-optic coupling.
In addition, the device can capture large scenes without significant loss of resolution. The described system does not require retro-reflectors or photogrammetry targets to register the 3D scene. Current camera and lens technologies allow the device to automatically adjust focus and zoom to improve image quality. Modulation of the lasers can allow the illumination of each part to be optimized without external intervention.
In addition, as illustrated with respect to Figure 22, the sensors can be mounted independently of the adaptive optics unit. In this case, one or more sensors 2210, for example as described above with respect to Figure 16, can be provided for inspecting a part 2214. As mentioned above, adaptive optics 2212 can also be provided for inspecting the same part. Adaptive optics 2212 can be mounted independently of the one or more sensors 2210. Adaptive optics 2212 can also be factory calibrated to map projection angles to the projector space in front of adaptive optics 2212. Light from the adaptive optics can be projected onto part 2214 and received by the one or more sensors 2210. However, because each device is mounted independently, the relationship between adaptive optics 2212 and the one or more sensors 2210 will not be known. Accordingly, adaptive optics 2212 can be calibrated together with the one or more sensors 2210: by placing a known surface, such as a calibration artifact, in the projection field of adaptive optics 2212 and in the field of view of each of the one or more sensors 2210, a transform between each of the one or more sensors 2210 and adaptive optics 2212 can be determined. The transform can be saved and used in a production mode to measure each part to be inspected by the system. This is particularly helpful if the adaptive optics is programmable and the projection can be controlled in the adaptive optics projector space, because the measurement system can then retrieve a projection position in the adaptive optics space and transform the light projection into the sensor space, determining the position of the surface from the interaction of the projection with the surface as viewed in the sensor space.
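Determining the transform between the adaptive optics space and a sensor space from a calibration artifact reduces, for matched 3D points, to a rigid-transform fit. A standard Kabsch/Procrustes sketch follows; the four-point setup and the specific rotation/translation are illustrative, not the patent's actual calibration procedure:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    # Least-squares rotation R and translation t with dst ~= R @ src + t,
    # from matched points seen in both spaces (e.g. calibration-artifact
    # targets in the adaptive-optics projector space and a sensor space).
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    h = (src - cs).T @ (dst - cd)
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))   # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cd - r @ cs
    return r, t

# Recover a known transform from four non-coplanar artifact points.
rng = np.random.default_rng(0)
angle = 0.3
r_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
src = rng.normal(size=(4, 3))
dst = src @ r_true.T + t_true
r, t = fit_rigid_transform(src, dst)
```

Once saved, such a transform lets a projection position retrieved in the adaptive optics space be mapped into the sensor space, as the production-mode use above describes.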
The broad teachings of the present disclosure can be implemented in a variety of forms. Therefore, while this disclosure includes particular examples, the true scope of the disclosure should not be so limited, since other modifications will become apparent upon a study of the drawings, the specification, and the following claims.

Claims (22)

1. A system for inspecting an article, the system comprising:
at least one imager configured to receive an image having a field of view that includes the article;
a field projection device configured to project an illumination field onto the article in the field of view;
adaptive optics configured to project a light structure onto the article in the field of view; and
a processor configured to receive image data corresponding to the illumination field and the light structure from the imager, the processor being configured to analyze features of the article based on the illumination field and the light structure.
2. The system according to claim 1, wherein the illumination field comprises a pattern.
3. The system according to claim 2, wherein the pattern is a coded line pattern.
4. The system according to claim 3, wherein the coded line pattern is moiré fringes.
5. The system according to claim 1, wherein the adaptive optics is a laser light source.
6. The system according to claim 5, wherein the light structure is programmable.
7. The system according to claim 1, wherein the adaptive optics is configured to automatically project the light structure onto the article based on an analysis of the illumination field.
8. The system according to claim 7, wherein the adaptive optics is configured to generate a pattern of one or more lines to acquire data in regions where the data from the analysis of the field projection are sparse.
9. The system according to claim 8, wherein the adaptive optics is configured to generate a pattern of one or more spots to acquire data in regions where the analysis of the field projection data determines that scattering of the illumination field is inhibiting the analysis of the illumination field.
10. The system according to claim 1, wherein the processor is configured to determine a geometry of the article.
11. The system according to claim 1, wherein the geometry is identified based on an analysis of the illumination field.
12. The system according to claim 1, wherein the geometry is identified based on a plurality of predefined templates.
13. The system according to claim 1, wherein the adaptive optics is configured to project a structure onto the article based on the geometry.
14. The system according to claim 1, wherein the structure is analyzed to define a template for the feature of the article.
15. A method of inspecting an article, the method comprising the steps of:
imaging the article in a field of view;
projecting an illumination field onto the article in the field of view;
selectively projecting a light structure onto the article in the field of view;
receiving image data corresponding to the illumination field and the light structure; and
analyzing features of the article based on the illumination field and the light structure.
16. The method according to claim 15, wherein the illumination field comprises moiré fringes, and the light structure is a programmable laser illumination structure.
17. The method according to claim 15, further comprising automatically projecting the light structure onto the article based on an analysis of the illumination field.
18. The method according to claim 15, further comprising determining a geometry of the article based on an analysis of the illumination field and a plurality of predefined templates.
19. A system for inspecting an article, the system comprising:
a first imager and a second imager configured as a stereo pair, each of the first imager and the second imager being configured to receive an image having a field of view that includes the article;
a projection device configured to project an illumination field comprising a predetermined coded laser pattern onto the article in the field of view of both the first imager and the second imager;
adaptive optics configured to project a programmable laser illumination structure onto the article in the field of view of both the first imager and the second imager; and
a processor configured to receive image data corresponding to the illumination field and the light structure from the imagers, the processor being configured to analyze features of the article based on the illumination field and the light structure, and the processor being configured to control the adaptive optics to automatically project the light structure onto the article based on an analysis of the illumination field.
20. The system according to claim 19, wherein the processor is configured to determine a geometry of the article based on an analysis of the illumination field and a plurality of predefined templates.
21. The system according to claim 20, wherein the adaptive optics is configured to project a structure onto the article based on the geometry, and the structure is analyzed to define a template for the feature of the article.
22. A system for inspecting an article, the system comprising:
at least one imager configured to receive an image having a field of view that includes the article;
adaptive optics configured to project a light structure onto the article in the field of view, wherein the adaptive optics and the at least one imager are mounted independently and, after mounting, are calibrated to determine a transform between an imager space of the at least one imager and an adaptive optics space; and
a processor configured to receive image data corresponding to the light structure from the imager, the processor being configured to analyze features of the article based on the light structure and the transform.
CN201310227111.1A 2012-06-08 2013-06-08 Hybrid sensor Pending CN103486979A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US13/492,065 US9170097B2 (en) 2008-04-01 2012-06-08 Hybrid system
US13/492,065 2012-06-08

Publications (1)

Publication Number Publication Date
CN103486979A true CN103486979A (en) 2014-01-01

Family

ID=49626009

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310227111.1A Pending CN103486979A (en) 2012-06-08 2013-06-08 Hybrid sensor

Country Status (2)

Country Link
CN (1) CN103486979A (en)
DE (1) DE102013105828A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105700280A (en) * 2014-12-16 2016-06-22 原子能及能源替代委员会 Structured-Light Projector And Three-Dimensional Scanner Comprising Such A Projector
WO2017173744A1 (en) * 2016-04-08 2017-10-12 杭州先临三维科技股份有限公司 Multi-measurement-mode three-dimensional measurement system and measurement method
CN110542403A (en) * 2019-09-19 2019-12-06 上海兰宝传感科技股份有限公司 MEMS (micro-electromechanical systems) measuring sensor and triangular area measuring method

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9686517B2 (en) 2014-12-15 2017-06-20 Test Research, Inc. Optical system and image compensating method of optical apparatus
DE102016221040A1 (en) * 2016-10-26 2018-04-26 Robert Bosch Gmbh Method for locating a device in a system
CN108169981A (en) * 2018-01-15 2018-06-15 深圳奥比中光科技有限公司 Multi-functional lighting module
CN110017795B (en) * 2019-04-24 2020-12-29 中国科学院国家天文台南京天文光学技术研究所 Relative swing arm type contourgraph for mirror surface inspection and detection method
US20210404874A1 (en) * 2020-06-24 2021-12-30 Airmar Technology Corporation Software Defined Lighting
CN116593282B (en) * 2023-07-14 2023-11-28 四川名人居门窗有限公司 Glass impact resistance reaction test system and method based on structured light

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6262803B1 (en) * 1998-09-10 2001-07-17 Acuity Imaging, Llc System and method for three-dimensional inspection using patterned light projection
US6564166B1 (en) * 1999-10-27 2003-05-13 Georgia Tech Research Corporation Projection moiré method and apparatus for dynamic measuring of thermal induced warpage
US20070124949A1 (en) * 2005-11-01 2007-06-07 Hunter Engineering Company Method and Apparatus for Wheel Alignment System Target Projection and Illumination
CN201974159U (en) * 2008-04-01 2011-09-14 感知器公司 Contour sensor with MEMS reflector


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105700280A (en) * 2014-12-16 2016-06-22 原子能及能源替代委员会 Structured-Light Projector And Three-Dimensional Scanner Comprising Such A Projector
CN105700280B (en) * 2014-12-16 2019-11-19 原子能及能源替代委员会 Structured light projector and spatial digitizer with this projector
WO2017173744A1 (en) * 2016-04-08 2017-10-12 杭州先临三维科技股份有限公司 Multi-measurement-mode three-dimensional measurement system and measurement method
CN110542403A (en) * 2019-09-19 2019-12-06 上海兰宝传感科技股份有限公司 MEMS (micro-electromechanical systems) measuring sensor and triangular area measuring method

Also Published As

Publication number Publication date
DE102013105828A1 (en) 2013-12-12

Similar Documents

Publication Publication Date Title
US9170097B2 (en) Hybrid system
CN103486979A (en) Hybrid sensor
CN201974159U (en) Contour sensor with MEMS reflector
US9013711B2 (en) Contour sensor incorporating MEMS mirrors
US9858682B2 (en) Device for optically scanning and measuring an environment
US11067692B2 (en) Detector for determining a position of at least one object
US10571668B2 (en) Catadioptric projector systems, devices, and methods
US6600168B1 (en) High speed laser three-dimensional imager
CN100592029C (en) Ranging apparatus
JP6309459B2 (en) Time-of-flight camera with stripe lighting
US20170280132A1 (en) System and method of acquiring three-dimensional coordinates using multiple coordinate measurment devices
JP6484072B2 (en) Object detection device
US20170310948A1 (en) Scanning Illuminated Three-Dimensional Imaging Systems
CN101765755A (en) Three-dimensional shape measuring device, three-dimensional shape measuring method, three-dimensional shape measuring program, and recording medium
CN107783353A (en) For catching the apparatus and system of stereopsis
JPWO2006013635A1 (en) Three-dimensional shape measuring method and apparatus
CN107525466A (en) Automatic pattern switching in volume size annotator
US20150324991A1 (en) Method for capturing images of a preferably structured surface of an object and device for image capture
EP3435026B1 (en) Dual-pattern optical 3d dimensioning
CN104215200A (en) Device and method for the simultaneous three-dimensional measurement of surfaces with several wavelengths
WO2014011182A1 (en) Convergence/divergence based depth determination techniques and uses with defocusing imaging
US11326874B2 (en) Structured light projection optical system for obtaining 3D data of object surface
CN115248440A (en) TOF depth camera based on dot matrix light projection
JP6362058B2 (en) Test object measuring apparatus and article manufacturing method
Van Wolputte et al. Embedded line scan image sensors: The low cost alternative for high speed imaging

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20140101