CN207600393U - Pattern projection module, three-dimensional information acquisition system and processing unit - Google Patents

Pattern projection module, three-dimensional information acquisition system and processing unit

Info

Publication number
CN207600393U
Authority
CN
China
Prior art keywords
pattern
measurement
lines
projection module
intersection point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201721745583.6U
Other languages
Chinese (zh)
Inventor
朱庆峰
苑京立
田克汉
尹晓东
张国伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Yu Light Technology Development Co Ltd
Original Assignee
Beijing Yu Light Technology Development Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Yu Light Technology Development Co Ltd filed Critical Beijing Yu Light Technology Development Co Ltd
Priority to CN201721745583.6U priority Critical patent/CN207600393U/en
Application granted granted Critical
Publication of CN207600393U publication Critical patent/CN207600393U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

This application discloses a pattern projection module for measuring the dimensions of an object, including a coherent light source and a pattern generator. The pattern generator includes a diffractive optical element and is arranged in the optical path of the illumination light propagating from the coherent light source toward the object, for generating, from the illumination light, a measurement pattern to be projected onto the object. The measurement pattern includes at least two lines that intersect within the measurement pattern to form at least one central intersection point. Also disclosed is a three-dimensional information acquisition system including the pattern projection module. According to the utility model, the measurement pattern helps achieve fast, high-precision dimensional measurement of approximately cuboid objects, allows the measuring device to be portable, and reduces manufacturing cost.

Description

Pattern projection module, three-dimensional information acquisition system and processing unit
Technical field
The utility model relates generally to three-dimensional measurement methods and devices for measuring object dimensions, and in particular to a pattern projection module for measuring object dimensions, a three-dimensional information acquisition system for measuring the dimensions of an approximately cuboid object, a three-dimensional information processing device, and a three-dimensional measurement method for measuring the dimensions of an approximately cuboid object.
Background technology
The logistics industry has developed rapidly in recent years, and the three-dimensional dimensions of logistics parcels (usually packed in boxes) are needed at many stages such as warehousing, transport and billing. Traditional manual measurement and recording obviously cannot meet these requirements. Some warehouses have adopted industrial three-dimensional scanning equipment. Such equipment mostly uses line-structured-light scanning technology; its accuracy is high, but it requires fixed installation and a fixed measuring environment, requires that the position of the measured object remain unchanged during scanning, and is expensive.
Some three-dimensional sensors intended for everyday use have appeared on the market, such as Microsoft's Kinect somatosensory sensor. These devices mostly use infrared structured-light patterns or TOF (time of flight) technology and can be constructed to be portable, but their measurement error is large, on the order of 2~10%, so they cannot measure parcel dimensions accurately and stably, and their cost is still relatively high.
Summary of the utility model
The purpose of the utility model is to provide a novel three-dimensional measurement scheme, in particular a scheme suitable for measuring the dimensions of approximately cuboid objects (such as logistics parcels), so as to at least partially solve the above problems in the prior art.
According to one aspect of the utility model, a pattern projection module for measuring object dimensions is provided, including: a coherent light source; and a pattern generator, the pattern generator including a diffractive optical element and being arranged in the optical path of the illumination light propagating from the coherent light source toward an object, for generating, from the illumination light, a measurement pattern to be projected onto the object, the measurement pattern including at least two lines that intersect within the measurement pattern to form at least one central intersection point.
According to another aspect of the utility model, a three-dimensional information acquisition system for measuring the dimensions of an approximately cuboid object is provided, the system including: the pattern projection module as described above, for projecting the measurement pattern onto an object in a target area; and an imaging unit including at least one camera, for acquiring an image of the object onto which the measurement pattern has been projected, the pattern projection module and the imaging unit having a determined position relative to each other.
By using a measurement pattern that includes at least two lines intersecting to form at least one central intersection point, the module, system, method and device according to the utility model help achieve fast, high-precision dimensional measurement of objects, in particular approximately cuboid objects. In addition, a three-dimensional measuring device based on such a measurement pattern can be made portable and is inexpensive to manufacture.
Description of the drawings
Other features, objects and advantages of the utility model will become more apparent upon reading the following detailed description of non-limiting embodiments, made with reference to the accompanying drawings, in which:
Fig. 1 is a schematic block diagram of a three-dimensional information acquisition system according to an embodiment of the utility model;
Fig. 2 is a schematic diagram of a pattern projection module according to an embodiment of the utility model, showing two different constructions;
Fig. 3 schematically shows a three-dimensional information acquisition system according to an embodiment of the utility model implemented as a handheld measuring device;
Fig. 4A and Fig. 4B respectively show two arrangements of the pattern projection module and the imaging unit in a three-dimensional information acquisition system according to an embodiment of the utility model;
Fig. 5 schematically shows an application example of a three-dimensional information acquisition system according to an embodiment of the utility model;
Fig. 6 shows a schematic flowchart of a three-dimensional measurement method according to an embodiment of the utility model;
Fig. 7 shows an example of a measurement pattern projected by a pattern projection module according to an embodiment of the utility model;
Fig. 8 schematically shows an object onto which the measurement pattern shown in Fig. 7 has been projected;
Fig. 9A, Fig. 9B and Fig. 9C respectively show different modifications of the measurement pattern shown in Fig. 7;
Fig. 10 shows another example of a measurement pattern projected by a pattern projection module according to an embodiment of the utility model;
Fig. 11 schematically shows an object onto which the measurement pattern shown in Fig. 10 has been projected;
Fig. 12 is a schematic diagram marking the object top-surface corner points that are used, by image processing, to calculate the spatial orientation of the object;
Fig. 13 shows a modification of the measurement pattern shown in Fig. 10;
Fig. 14 schematically shows an object onto which the measurement pattern shown in Fig. 13 has been projected;
Fig. 15 and Fig. 16 respectively show two further examples of measurement patterns usable in a three-dimensional measurement method according to an embodiment of the utility model.
Specific embodiment
The utility model is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are only intended to explain the related invention and do not limit it. It should also be noted that, for ease of description, only the parts related to the invention are shown in the drawings.
It should be noted that, provided there is no conflict, the embodiments in the application and the features in the embodiments may be combined with one another. The utility model is described in detail below with reference to the drawings and in conjunction with the embodiments.
First, a three-dimensional information acquisition system 10 according to an embodiment of the utility model is introduced with reference to Fig. 1 to Fig. 5. Fig. 1 is a schematic block diagram of the three-dimensional information acquisition system 10.
As shown in Fig. 1, the three-dimensional information acquisition system 10 includes a pattern projection module 11 and an imaging unit 12, wherein the pattern projection module 11 is used for projecting a measurement pattern onto an object in a target area, and the imaging unit 12 includes at least one camera (not shown) for acquiring an image of the object in the target area onto which the measurement pattern has been projected. The imaging unit 12 has a determined positional relationship relative to the pattern projection module 11, which is described in more detail below with reference to Fig. 4A and Fig. 4B.
Fig. 2 schematically shows the construction of the pattern projection module 11. As shown in the figure, the pattern projection module 11 includes a coherent light source 11a and a pattern generator 11b. The coherent light source 11a may be a laser, for example a red laser. The pattern generator 11b preferably includes a diffractive optical element (DOE) and is arranged in the optical path of the illumination light propagating from the coherent light source 11a toward the object, for generating, from the illumination light, the measurement pattern to be projected onto the object. In the three-dimensional information acquisition system according to an embodiment of the utility model, the measurement pattern generated by the pattern generator of the pattern projection module includes at least two lines that intersect within the measurement pattern to form at least one central intersection point; this is described in more detail below with reference to, for example, Fig. 7, Fig. 9, Fig. 10, Fig. 15 and Fig. 16. In this application, the measurement pattern is the pattern projected by the pattern projection module 11 as obtained on a plane perpendicular to the propagation path of the illumination light.
A typical construction of the pattern projection module 11 is shown in Fig. 2(a), in which the laser serving as the coherent light source 11a is a vertical-cavity surface-emitting laser (VCSEL) whose emitting surface faces the pattern generator 11b (for example a diffractive optical element). Another typical construction is shown in Fig. 2(b), in which the laser serving as the coherent light source 11a is an edge-emitting laser; in order to guide the laser light from the edge-emitting laser to the pattern generator 11b, the pattern projection module 11 may further include an optical-path deflecting element 11c, such as a mirror or a reflecting prism.
As shown in Fig. 3, the three-dimensional information acquisition system 10 according to an embodiment of the utility model may be implemented as a handheld measuring device for measuring the three-dimensional dimensions of an approximately cuboid object OB. In use, the handheld measuring device 10 is held against the object OB at a certain depression angle, so that the pattern projection module 11 projects the measurement pattern onto the top surface P1 of the object OB and onto a first side surface P2 and a second side surface P3, the first side surface P2 and the second side surface P3 being two side surfaces adjoining the top surface P1 and adjoining each other (see Fig. 8, Fig. 11 and Fig. 14). At the same time, the operator may control the imaging unit 12 (for example by means of a button) to capture an image of the object OB onto which the measurement pattern has been projected. The image can then be used in subsequent processing to extract the three-dimensional information of the object OB.
Fig. 4A and Fig. 4B respectively show two arrangements of the pattern projection module 11 and the imaging unit 12 in the three-dimensional information acquisition system 10 according to an embodiment of the utility model. In the arrangement shown in Fig. 4A, the projection center axis CA of the pattern projection module 11 and the optical axis OA of the imaging unit 12 are parallel to each other, and the pattern projection module 11 and the imaging unit 12 are separated by a certain distance (also known as the baseline distance). In the arrangement shown in Fig. 4B, the pattern projection module 11 and the imaging unit 12 are likewise separated by a certain distance (also known as the baseline distance), but the projection center axis CA of the pattern projection module 11 and the optical axis OA of the imaging unit 12 are not parallel to each other.
Those skilled in the art will appreciate that, whichever of the above arrangements is used, the spatial position of the point at which a feature point of the measurement pattern lands on the object (here, the spatial position with respect to the camera coordinate system of the imaging unit 12) can be calculated from the position of that feature point within the measurement pattern, its position in the image acquired by the imaging unit 12, and the determined positional relationship between the pattern projection module 11 and the imaging unit 12 (including the angle between the projection center axis CA and the optical axis OA and the spacing between the pattern projection module 11 and the imaging unit 12).
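As an illustration of this calculation (not the specific algorithm of the utility model), the following Python sketch treats the projection module as an inverse camera: the feature point's position in the measurement pattern defines a ray leaving the projection center, its detected pixel defines a ray leaving the camera center, and the spatial position is taken as the point closest to both rays. All intrinsics, the baseline and the axis tilt below are assumed values.

```python
import numpy as np

def pixel_to_ray(u, v, fx, fy, cx, cy):
    """Unit ray through pixel (u, v), pinhole model, in the device's own frame."""
    d = np.array([(u - cx) / fx, (v - cy) / fy, 1.0])
    return d / np.linalg.norm(d)

def triangulate(cam_origin, cam_dir, proj_origin, proj_dir):
    """Midpoint of the common perpendicular of two (possibly skew) rays."""
    w0 = cam_origin - proj_origin
    a, b, c = cam_dir @ cam_dir, cam_dir @ proj_dir, proj_dir @ proj_dir
    d, e = cam_dir @ w0, proj_dir @ w0
    denom = a * c - b * b                      # ~0 only if the two rays are parallel
    s = (b * e - c * d) / denom                # parameter along the camera ray
    t = (a * e - b * d) / denom                # parameter along the projector ray
    return 0.5 * ((cam_origin + s * cam_dir) + (proj_origin + t * proj_dir))

# Assumed geometry: 60 mm baseline along +x, projection axis tilted 5 degrees
# toward the camera's field of view (a Fig. 4B-style arrangement).
baseline = np.array([0.060, 0.0, 0.0])         # projection center in camera coordinates
tilt = np.deg2rad(-5.0)
R_proj = np.array([[np.cos(tilt), 0.0, np.sin(tilt)],
                   [0.0,          1.0, 0.0         ],
                   [-np.sin(tilt), 0.0, np.cos(tilt)]])

# One feature point of the measurement pattern: its detected pixel in the camera
# image and its position in the (virtual) projector image plane, both assumed.
cam_ray  = pixel_to_ray(675, 378, fx=1400, fy=1400, cx=640, cy=360)
proj_ray = R_proj @ pixel_to_ray(350, 250, fx=800, fy=800, cx=320, cy=240)

point = triangulate(np.zeros(3), cam_ray, baseline, proj_ray)
print(point)   # spatial position in camera coordinates, roughly [0.02, 0.01, 0.80] m here
```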
The relative positional relationship between the pattern projection module 11 and the imaging unit 12 may be set during installation, or may be determined after installation by a calibration process; the utility model is not limited in this respect.
Fig. 5 schematically shows an application example of the three-dimensional information acquisition system 10, in which the imaging unit 12 is formed by a camera 12a integrated in a portable device such as a mobile phone, PDA or tablet computer, and the pattern projection module 11 is mounted to the portable device by means of a mounting bracket 14 so as to be detachably fixed relative to the imaging unit 12. The mounting bracket 14 has, for example, two opposed elastic clamping arms for fastening it to the portable device. The pattern projection module 11 may be detachably mounted in the mounting bracket 14, or may be integrated with the mounting bracket 14. In some embodiments, the pattern projection module 11 may include a laser light source 11a integrated in the portable device and a pattern generator 11b mounted in, or integrated with, the mounting bracket 14.
It should be understood that, although the three-dimensional information acquisition system 10 according to embodiments of the utility model has been described with reference to Fig. 3 and Fig. 5 as being suitable for implementation as handheld or portable equipment, it may also be implemented as equipment mounted at a fixed position and used in a fixed environment, for example a detection device fixedly installed in a warehouse or in a goods-sorting system.
A three-dimensional measurement method according to an embodiment of the utility model and measurement patterns usable in the method are introduced below with reference to Fig. 6 to Fig. 16.
Fig. 6 shows a schematic flowchart of a three-dimensional measurement method 100 according to an embodiment of the utility model. The three-dimensional measurement method 100 is suitable for measuring the three-dimensional dimensions of an object of approximately cuboid shape and includes the following processing:
S110: projecting a measurement pattern onto the object using a pattern projection module, the measurement pattern including at least two lines that intersect within the measurement pattern to form at least one central intersection point, the projection being such that the central intersection point is projected onto the top surface of the object and the lines are projected onto the top surface and onto first and second side surfaces adjoining the top surface, the lines forming top-edge intersection points where they cross the edges of the top surface and forming a lower-edge intersection point where they cross the lower edge of at least one of the first and second side surfaces;
S120: acquiring, using an imaging unit having a determined relative position with respect to the pattern projection module, an image of the object onto which the measurement pattern has been projected, the image showing the top surface and the first and second side surfaces of the object;
S130: determining the positions of the central intersection point, the top-edge intersection points and the lower-edge intersection point in the image;
S140: calculating the length and width of the object based at least in part on the central intersection point and the top-edge intersection points; and
S150: calculating the height of the object based at least in part on the lower-edge intersection point.
Referring back to Fig. 1, the three-dimensional information acquisition system 10 according to an embodiment of the utility model may further include a processing unit 13. The processing unit 13 receives the image from the imaging unit 12 and is configured to perform, on the image, the processing S130 to S150 of the above three-dimensional measurement method 100. The processing unit 13 may include a processor 13a and a memory 13b storing program instructions that, when executed by the processor 13a, cause the processor 13a to perform the processing S130 to S150.
In some embodiments, the processing unit 13 may also be formed separately, as a three-dimensional information processing device.
In some embodiments, the processing unit 13 may be integrated together with the pattern projection module 11 and/or the imaging unit 12 in a single hardware device; for example, as shown in Fig. 5, it may be integrated in the mobile phone together with the imaging unit 12. In other embodiments, the processing unit 13 may instead be provided in a separate device, such as a cloud server, that communicates with the pattern projection module 11 and/or the imaging unit 12 in a wired and/or wireless manner.
A first embodiment of the three-dimensional measurement method 100 and a measurement pattern usable in this first embodiment are introduced below with reference to Fig. 7 and Fig. 8.
Fig. 7 shows a measurement pattern PA10 usable in the first embodiment of the three-dimensional measurement method 100. As shown in Fig. 7, the measurement pattern PA10 includes four lines L11, L12, L21, L22, wherein the lines L11 and L12 are parallel to each other and form one group of parallel lines, the lines L21 and L22 are parallel to each other and form another group of parallel lines, and the lines L11, L12 and the lines L21, L22 form four central intersection points a1, a2, a3, a4 within the measurement pattern.
Fig. 8 schematically shows an object onto which the measurement pattern PA10 has been projected. According to the three-dimensional measurement method 100, in processing S110 the measurement pattern is projected onto the object using the pattern projection module. In the preferred example shown in Fig. 8, the measurement pattern PA10 is projected in processing S110 so that the central intersection points a1, a2, a3, a4 are projected onto the top surface P1 of the object OB, the lines L11, L12 are projected onto the top surface P1 and the first side surface P2, and the lines L21, L22 are projected onto the top surface P1 and the second side surface P3, so that top-edge intersection points b1~b8 are formed where the lines cross the edges of the top surface P1, and lower-edge intersection points c1~c4 are formed where the lines cross the lower edges of the first and second side surfaces P2, P3.
As can be seen in connection with Fig. 3, because the pattern projection module 11 projects the measurement pattern onto the object at a certain depression angle, while the measurement pattern itself is the pattern projected by the pattern projection module 11 as obtained on a plane perpendicular to the propagation path of the illumination light, the angles of the lines of the pattern as projected onto the object top surface will change. To make it easier for the lines L11, L12 and the lines L21, L22 to intersect all four edges of the object top surface P1, in some preferred embodiments the non-obtuse angle α formed by the lines L11, L12 and the lines L21, L22 satisfies 30° ≤ α ≤ 90°, preferably 50° ≤ α ≤ 70°, more preferably α = 60°.
In addition, in order for the lines L11, L12 and the lines L21, L22 to be projected onto the top surface of the object and onto the lower edges of the first and second side surfaces, the pattern generator 11b of the pattern projection module 11 according to an embodiment of the utility model, for example the diffractive optical element, is configured to project the measurement pattern with a subtended angle of not less than 20°, preferably not less than 30°, along the length direction of the lines. If the projection subtended angle is too small, the projected lines are likely to be unable to cover or cross the edges of the object top surface and the lower edges of the first and second side surfaces, so that the top-edge intersection points and lower-edge intersection points used in the three-dimensional measurement method according to embodiments of the utility model cannot be formed.
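For a rough sense of scale (illustrative numbers only, not values from the utility model): a projection fan of full angle θ spans about 2·d·tan(θ/2) at working distance d, which is why a subtended angle much below 20° quickly becomes too narrow to run across the top surface and past a lower edge.

```python
import math

def fan_span(theta_deg, distance_m):
    """Approximate length covered by a projection fan of full angle theta at a given distance."""
    return 2.0 * distance_m * math.tan(math.radians(theta_deg) / 2.0)

for theta in (10, 20, 30):
    print(f"{theta:2d} deg at 1.0 m -> {fan_span(theta, 1.0):.2f} m")
# ~0.17 m, ~0.35 m, ~0.54 m: a 10 deg fan is unlikely to reach from the top surface
# past the lower edge of a side surface of a typical parcel, while 20-30 deg can.
```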
While the measurement pattern is being projected, an image of the object OB onto which the measurement pattern PA10 has been projected is acquired using the imaging unit 12 (processing S120). The image shows the top surface and the first and second side surfaces of the object.
In processing S130, the image acquired by the imaging unit 12 is processed: the central intersection points a1, a2, a3, a4, the top-edge intersection points b1~b8 and the lower-edge intersection points c1~c4 are detected in the image, and the position of each intersection point in the image is determined.
In some embodiments, the positions of the top-edge intersection points and/or lower-edge intersection points in the image may be determined by extracting, in the image, feature points corresponding to the top-edge intersection points and/or lower-edge intersection points.
In other embodiments, the straight lines formed by a line of the measurement pattern projected onto two adjoining surfaces may be detected, and the intersection point of these two straight lines may be calculated as the top-edge intersection point or lower-edge intersection point formed where that line crosses the edge shared by the two surfaces. For example, in the example shown in Fig. 8, the straight line formed by the line L12 projected onto the top surface P1 and the straight line formed by it projected onto the first side surface P2 may be detected, and the intersection point of these two straight lines may be calculated as the intersection point of the line L12 with the edge shared by the top surface P1 and the first side surface P2, i.e. the top-edge intersection point b1. This way of detecting edge intersection points is particularly advantageous when the edge of the object is rounded, damaged or deformed, so that the line of the measurement pattern does not form a clear or accurately positioned intersection point with that edge; it helps avoid measurement errors in such situations and improves measurement accuracy.
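A minimal sketch of this line-intersection approach, assuming the pixels belonging to the projected line on each surface have already been segmented by some earlier image-processing step (not shown):

```python
import numpy as np

def fit_line(points):
    """Least-squares 2D line through pixel points, returned as (point on line, unit direction)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]                      # principal direction of the point set

def intersect_lines(p1, d1, p2, d2):
    """Intersection of two 2D lines p + t*d (assumed not parallel)."""
    t1, _ = np.linalg.solve(np.column_stack([d1, -d2]), p2 - p1)
    return p1 + t1 * d1

# Pixels of one projected line (e.g. L12) on the top surface and on the adjoining side surface.
top_pixels  = [(102, 240), (160, 251), (221, 263), (280, 274)]
side_pixels = [(301, 300), (305, 352), (309, 401), (313, 452)]

b1 = intersect_lines(*fit_line(top_pixels), *fit_line(side_pixels))
print(b1)   # sub-pixel estimate of the top-edge intersection point, even if the edge itself is rounded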
In some embodiments, the pattern projection module 11 may project the measurement pattern intermittently, in a strobe manner, and the imaging unit 12 may be controlled to acquire both an image of the object onto which the measurement pattern is projected and an image of the object onto which the measurement pattern is not projected. In this way, during image processing, the measurement pattern projected onto the object can be detected more quickly and/or more accurately by comparing the image with the projected measurement pattern against the image without it, so that the central intersection points, top-edge intersection points and lower-edge intersection points can be detected.
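A minimal sketch of how the strobed pair of frames could be used, assuming OpenCV is available; the file names are placeholders:

```python
import cv2

with_pattern    = cv2.imread("frame_pattern_on.png",  cv2.IMREAD_GRAYSCALE)
without_pattern = cv2.imread("frame_pattern_off.png", cv2.IMREAD_GRAYSCALE)

# Saturated subtraction keeps only the light added by the projector, suppressing
# the parcel's own texture and the ambient illumination.
diff = cv2.subtract(with_pattern, without_pattern)
_, stripe_mask = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# The central, top-edge and lower-edge intersection points are then detected on this mask.
cv2.imwrite("pattern_mask.png", stripe_mask)
```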
In the example shown in Fig. 8, the following processing may be carried out in processing S140:
based on the positions in the image of at least three of the central intersection points a1, a2, a3, a4, the positions of these at least three central intersection points within the measurement pattern, and the determined positional relationship between the pattern projection module 11 and the imaging unit 12, the spatial positions of the at least three central intersection points are calculated by triangulation;
based on the spatial positions of the at least three central intersection points, the spatial position of the top surface P1 of the object OB is determined (three points determine a plane); and
based on the spatial position of the top surface P1 and the positions of the top-edge intersection points b1~b8 in the image, the length and width of the top surface are calculated.
In some embodiments, the spatial positions of the top-edge intersection points may first be calculated from the spatial position of the top surface P1 and the positions of the top-edge intersection points b1~b8 in the image.
In the example shown in Fig. 8, because the same group of parallel lines, i.e. the lines L11, L12 or the lines L21, L22, forms two top-edge intersection points with the same edge of the top surface P1, the position of that edge can be calculated from these two top-edge intersection points (two points determine a straight line).
In some embodiments, the positions of the four vertices of the top surface can then be calculated from the positions of the four edges of the top surface P1 of the object, and the length and width of the top surface can be calculated from them, as sketched below.
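A minimal sketch of processing S140 along these lines (illustrative coordinates and camera-ray directions throughout; not the exact implementation of the utility model):

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Plane of the top surface from three triangulated central intersection points."""
    n = np.cross(p2 - p1, p3 - p1)
    return n / np.linalg.norm(n), p1             # (unit normal, point on the plane)

def backproject(ray, normal, p0):
    """Intersect a camera ray (from the origin) with the top-surface plane."""
    t = (normal @ p0) / (normal @ ray)
    return t * ray

def intersect_coplanar(a1, a2, b1, b2):
    """Intersection of two coplanar 3D lines, each given by two points lying on it."""
    d1, d2, w0 = a2 - a1, b2 - b1, a1 - b1
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    s = (b * (d2 @ w0) - c * (d1 @ w0)) / (a * c - b * b)
    return a1 + s * d1

# Triangulated central intersection points (three of a1..a4), illustrative values in metres.
normal, p0 = plane_from_points(np.array([0.10, 0.05, 0.80]),
                               np.array([0.25, 0.08, 0.78]),
                               np.array([0.12, 0.20, 0.79]))

# Camera-ray directions through the top-edge intersection pixels, two per physical edge
# (pairs of b1..b8 in Fig. 8); back-projecting them onto the plane gives two 3D points per edge.
edge_rays = [
    ([-0.35, -0.20, 1.0], [-0.10, -0.25, 1.0]),  # edge shared with the first side surface P2
    ([ 0.30, -0.18, 1.0], [ 0.45, -0.10, 1.0]),  # opposite edge
    ([-0.38,  0.15, 1.0], [-0.30,  0.30, 1.0]),  # edge shared with the second side surface P3
    ([ 0.35,  0.12, 1.0], [ 0.42,  0.28, 1.0]),  # opposite edge
]
edges = [(backproject(np.asarray(r1), normal, p0), backproject(np.asarray(r2), normal, p0))
         for r1, r2 in edge_rays]

# Adjacent edges intersect at the top-surface vertices; vertex distances give length and width.
A = intersect_coplanar(*edges[0], *edges[2])
B = intersect_coplanar(*edges[0], *edges[3])   # A and B lie on the same edge -> its length
C = intersect_coplanar(*edges[1], *edges[2])   # A and C lie on the same edge -> its width
print(np.linalg.norm(B - A), np.linalg.norm(C - A))
```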
In the example shown in Fig. 8, the following processing may be carried out in processing S150:
based on the spatial position of the top surface P1, the mutually perpendicular relationship between the first and second side surfaces P2, P3 and the top surface, and the positions of the edges shared by the top surface P1 and the first and second side surfaces P2, P3, the spatial positions of the first and second side surfaces P2, P3 are calculated;
based on the lower-edge intersection points c1~c4, the positions of the lower edges of the first and second side surfaces P2, P3 are calculated; and
based on the spatial positions of the first and second side surfaces P2, P3 and the positions of their lower edges, the height of the object OB is calculated.
In some embodiments, the intersection point of the lower edges of the first and second side surfaces may be calculated from the positions of the lower edges of the first and second side surfaces P2, P3, and the height of the object may then be calculated; a sketch of one possible height calculation follows.
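A minimal sketch of one possible height calculation for processing S150, under the same illustrative assumptions (the top-surface plane and the shared-edge geometry are taken as given from S140):

```python
import numpy as np

def backproject(ray, normal, p0):
    """Intersect a camera ray (from the origin) with a plane given by (normal, point p0)."""
    t = (normal @ p0) / (normal @ ray)
    return t * ray

# Results assumed to come from processing S140: the top-surface plane and one shared edge.
top_normal = np.array([0.12, 0.05, 0.99]); top_normal /= np.linalg.norm(top_normal)
top_point  = np.array([0.10, 0.05, 0.80])                 # any point on the top surface
edge_point = np.array([-0.05, 0.02, 0.82])                # point on the edge shared with P2
edge_dir   = np.array([0.05, 0.99, 0.01]); edge_dir /= np.linalg.norm(edge_dir)

# First side surface P2: contains the shared edge and is perpendicular to the top surface.
side_normal = np.cross(top_normal, edge_dir)
side_normal /= np.linalg.norm(side_normal)

# Camera-ray directions through the lower-edge intersection pixels (c1, c2 on P2).
lower_rays = [np.array([-0.30, 0.05, 1.0]), np.array([-0.28, 0.22, 1.0])]

heights = []
for ray in lower_rays:
    q = backproject(ray, side_normal, edge_point)          # 3D point on the lower edge of P2
    heights.append(abs(top_normal @ (q - top_point)))      # its distance from the top-surface plane
print(np.mean(heights))                                    # height estimate for the object
```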
A first embodiment of the three-dimensional measurement method 100 and an example of a measurement pattern usable in this embodiment have been described above in connection with Fig. 7 and Fig. 8. However, the first embodiment of the three-dimensional measurement method 100 is not limited to using the measurement pattern PA10 shown in Fig. 7. Fig. 9A, Fig. 9B and Fig. 9C respectively show different modifications of the measurement pattern PA10.
In the measurement pattern PA11 shown in Fig. 9A, a line may include several separated line-segment portions, for example a line in the form of a dashed line. The central intersection points and/or top-edge and lower-edge intersection points formed by such lines can be determined by calculating the intersection point of two straight lines.
A speckle pattern may be combined with the intersecting lines of the measurement pattern PA10 to help observe or measure deviations of the object from the cuboid shape, in particular deformation at the edges of the object, so as to improve the accuracy of the dimensional measurement. For example, in the measurement pattern PA12 shown in Fig. 9B, speckles SP1 arranged in an array are combined with the lines. Preferably, the speckles are distributed in the region outside the rhombic area whose vertices are the four central intersection points of the two groups of parallel lines. In some preferred embodiments of the three-dimensional measurement method 100, the method may further include placing a label of known dimensions, such as a bar code or QR code, on the top surface P1 of the object, aligning the open rhombic area whose vertices are the four central intersection points with the label when projecting the measurement pattern, and using the label of known dimensions to correct the dimension measurement results in the subsequent image processing and calculation (a correction sketch follows). Alternatively, a measurement pattern PA13 as shown in Fig. 9C may be used, in which irregularly distributed speckles SP2 are combined with the lines. The speckles are preferably arranged in an array or otherwise regularly arranged.
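The utility model does not state how the known-size label is used for the correction; the sketch below assumes the simplest possibility, a single scale factor obtained by comparing the label's measured and true sizes:

```python
def corrected_dimensions(measured_lwh_mm, label_measured_mm, label_true_mm):
    """Rescale measured length/width/height by the known size of the label on the top surface."""
    scale = label_true_mm / label_measured_mm
    return tuple(round(d * scale, 1) for d in measured_lwh_mm)

# Example: a 50 mm QR code is measured as 48.8 mm, so every dimension is scaled up by ~2.5%.
print(corrected_dimensions((402.0, 298.5, 251.0), 48.8, 50.0))
```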
A second embodiment of the three-dimensional measurement method 100 and a measurement pattern usable in this second embodiment are introduced below with reference to Fig. 10 to Fig. 12.
Fig. 10 shows a measurement pattern PA20 usable in the second embodiment of the three-dimensional measurement method 100. As shown in Fig. 10, the measurement pattern PA20 includes two lines L1, L2, which form one central intersection point a within the measurement pattern PA20.
Fig. 11 schematically shows an object onto which the measurement pattern PA20 has been projected. In the example shown in Fig. 11, in processing S110 of the three-dimensional measurement method 100 the measurement pattern PA20 is projected onto the approximately cuboid object so that the central intersection point a is projected onto the object top surface P1, the line L1 is projected onto the top surface P1 and the first side surface P2, and the line L2 is projected onto the top surface P1 and the second side surface P3, so that top-edge intersection points b1~b4 are formed where the lines cross the edges of the top surface P1, and lower-edge intersection points c1, c2 are formed where the lines cross the lower edges of the first and second side surfaces P2, P3.
Similarly to the measurement pattern PA10, in some preferred embodiments the non-obtuse angle α formed by the lines L1, L2 of the measurement pattern PA20 satisfies 30° ≤ α ≤ 90°, preferably 50° ≤ α ≤ 70°, more preferably α = 60°.
In addition, in order for the lines L1, L2 to be projected onto the top surface of the object and onto the lower edges of the first and second side surfaces, the pattern generator 11b of the pattern projection module 11 according to an embodiment of the utility model, for example the diffractive optical element, is configured to project the measurement pattern PA20 with a subtended angle of not less than 20°, preferably not less than 30°, along the length direction of the lines.
While the measurement pattern is being projected, an image of the object OB onto which the measurement pattern PA20 has been projected is acquired using the imaging unit 12 (processing S120).
In the second embodiment of the three-dimensional measurement method 100, processing S130 may be carried out in the same way as in the first embodiment of the method, and is not described again here.
In the second embodiment of the three-dimensional measurement method 100, in processing S140, feature points of the object OB in the image may be extracted by image processing and the edges and vertices of the object OB reconstructed, in particular the vertices A, B, C, D and the edges AB, BC and BD marked in the two-dimensional image of Fig. 12; then, based on the mutually perpendicular positional relationships between the object top surface P1 and the first and second side surfaces P2, P3, for example the spatial angle of the top surface P1 relative to the camera is calculated. According to an embodiment of the utility model, the top-edge intersection points b1~b4 may serve as key feature points in the above reconstruction process; for example, the reconstructed top edges may be required to pass through the corresponding top-edge intersection points b1~b4. In some embodiments, the spatial position of the central intersection point a is calculated, and the spatial position of the top surface P1 is calculated from the spatial angle of the top surface P1 and the spatial position of the central intersection point a, so that the spatial positions of the vertices A, B, C can be calculated and the length and width of the object (of the top surface P1) obtained.
In some embodiments, in processing S150, the spatial position of the vertex D may be calculated based on the perpendicularity of the edge BD to the top edges and the position of the vertex D in the image, so that the height of the object (the length of the edge BD) can be calculated. According to an embodiment of the utility model, the lower-edge intersection points c1, c2 may be used to correct the position of the vertex D, for example by using the parallel relationship between the lower edges of the first and second side surfaces and the top edges.
It should be understood that the above description of the processing, in particular the description of the specific operations of processing S140 and S150, is merely exemplary and not limiting; for example, the calculation methods and steps used therein may be modified in various ways, which those skilled in the art can readily do on the basis of common knowledge in the field, and the utility model is not limited in this respect.
Fig. 13 shows a measurement pattern PA21, which is a modification of the measurement pattern shown in Fig. 10. In the measurement pattern PA21, interruptions are formed in the two intersecting lines, and the end points of the lines at these interruptions can be regarded as feature points. As shown in Fig. 13, the measurement pattern PA21 may include a central intersection point a1 and four feature points a2~a5.
As shown in Fig. 14, in processing S110 the measurement pattern PA21 may be projected so that the central intersection point a1 and the four feature points a2~a5 are projected onto the top surface P1 of the object.
According to an embodiment of the utility model, when the measurement pattern PA21 is used, the three-dimensional measurement method 100 may be implemented in a manner different from its first and second embodiments. For example, in some embodiments, the spatial positions of the central intersection point a1 and the four feature points a2~a5 may be calculated from their positions within the measurement pattern PA21 and their positions in the image, based on the determined positional relationship between the pattern projection module 11 and the imaging unit 12, so that the spatial position of the object top surface P1 can be calculated. In some embodiments, the top-edge intersection points and lower-edge intersection points, together with other feature points on the respective edges extracted by image processing, may be used to reconstruct/determine those edges. Other specific processing can easily be conceived by those skilled in the art on the basis of the above description and common knowledge in the field, and is not described again here.
Finally, Fig. 15 and Fig. 16 respectively show two further examples of measurement patterns usable in the three-dimensional measurement method according to embodiments of the utility model.
Fig. 15 shows a measurement pattern PA30 including three lines L1, L2, L3, which intersect one another at central intersection points a1, a2, a3. When the three-dimensional measurement method according to an embodiment of the utility model is implemented on the basis of the measurement pattern PA30, the spatial position of the object top surface can be calculated from the central intersection points a1, a2, a3, and the edges of the object can be calculated/determined using the intersection points formed by the lines L1, L2, L3 with the object edges (top-edge intersection points or lower-edge intersection points), or a combination of these intersection points with other feature points located on the edges and extracted from the image, so that the length, width and height of the object can be calculated.
Fig. 16 shows a measurement pattern PA40 including two groups of parallel lines, each group including three lines. It can be seen that the measurement pattern PA40 provides redundant central intersection points usable for calculating the spatial position of the object top surface, and provides redundant top-edge and/or lower-edge intersection points usable for determining the object edges. This is particularly beneficial when, for example, the measured object has defects or is deformed (which is common for logistics parcels). For example, the top surface and the edges can be fitted by least squares using the redundant central intersection points or top/lower-edge intersection points, improving the accuracy of the dimensional measurement.
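One simple way to use the redundant central intersection points is an SVD-based least-squares plane fit, sketched below with illustrative coordinates:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through 3D points, returned as (unit normal, centroid)."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return vt[-1], centroid                      # direction of least variance = plane normal

# Nine triangulated central intersection points from pattern PA40 (illustrative values, metres).
central_points = [
    [0.10, 0.05, 0.800], [0.18, 0.06, 0.795], [0.26, 0.07, 0.791],
    [0.11, 0.13, 0.797], [0.19, 0.14, 0.793], [0.27, 0.15, 0.788],
    [0.12, 0.21, 0.794], [0.20, 0.22, 0.790], [0.28, 0.23, 0.786],
]
normal, centroid = fit_plane(central_points)
print(normal, centroid)   # fitted top-surface plane, less sensitive to any single dented point
```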
It should be understood, however, that if the number of central intersection points and top/lower-edge intersection points is too large, a certain amount of structured-light coding is needed to tell the different intersection points apart, which instead slows down the measurement and increases the error rate. Therefore, according to embodiments of the utility model, the number of lines included in the measurement pattern is preferably no more than 10, more preferably no more than 6, and the number of central intersection points formed is no more than 9.
The above description is only a preferred embodiment of the application and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the application is not limited to technical solutions formed by the specific combination of the above technical features, but also covers other technical solutions formed by any combination of the above technical features or their equivalents without departing from the inventive concept, for example technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the application.

Claims (15)

1. A pattern projection module for measuring object dimensions, comprising:
a coherent light source; and
a pattern generator, the pattern generator comprising a diffractive optical element and being arranged in the optical path of the illumination light propagating from the coherent light source toward an object, for generating, from the illumination light, a measurement pattern to be projected onto the object, the measurement pattern comprising at least two lines that intersect within the measurement pattern to form at least one central intersection point.
2. The pattern projection module according to claim 1, wherein the diffractive optical element is configured to project the measurement pattern with a subtended angle of not less than 20° along the length direction of the lines.
3. The pattern projection module according to claim 1, wherein the non-obtuse angle α formed by the at least two lines satisfies 30° ≤ α ≤ 90°.
4. The pattern projection module according to claim 3, wherein the non-obtuse angle α formed by the at least two lines satisfies 50° ≤ α ≤ 70°.
5. The pattern projection module according to claim 1, wherein the measurement pattern comprises a plurality of lines forming at least three central intersection points within the region of the measurement pattern.
6. The pattern projection module according to claim 5, wherein the measurement pattern comprises two groups of parallel lines, each group of parallel lines comprising two lines, the two groups of parallel lines forming four central intersection points within the measurement pattern.
7. The pattern projection module according to claim 6, wherein the measurement pattern further comprises speckles, the speckles being distributed in the region outside the rhombic area whose vertices are the four central intersection points.
8. The pattern projection module according to claim 1, wherein the measurement pattern further comprises speckles, the speckles preferably being arranged in an array.
9. The pattern projection module according to any one of claims 1-8, wherein at least one of the lines comprises several line-segment portions spaced apart from one another.
10. A three-dimensional information acquisition system for measuring the dimensions of an approximately cuboid object, the system comprising:
the pattern projection module according to any one of claims 1-9, for projecting the measurement pattern onto an object in a target area; and
an imaging unit comprising at least one camera, for acquiring an image of the object onto which the measurement pattern has been projected, the pattern projection module and the imaging unit having a determined position relative to each other.
11. The three-dimensional information acquisition system according to claim 10, further comprising a processing unit, the processing unit receiving the image from the imaging unit and being configured to perform the following operations:
processing the image to detect the central intersection point projected onto the object top surface, the top-edge intersection points formed where the lines of the measurement pattern cross the edges of the top surface, and the lower-edge intersection point formed where the lines cross the lower edge of at least one of first and second side surfaces, wherein the first and second side surfaces are two side surfaces of the object adjoining the top surface and adjoining each other;
calculating the length and width of the object based at least in part on the central intersection point and the top-edge intersection points; and
calculating the height of the object based at least in part on the lower-edge intersection point.
12. The three-dimensional information acquisition system according to claim 10 or 11, wherein the three-dimensional information acquisition system is constructed as a handheld device for measuring packing-case dimensions.
13. The three-dimensional information acquisition system according to claim 12, wherein the pattern projection module is detachably fixed relative to the imaging unit by means of a mounting bracket.
14. The three-dimensional information acquisition system according to claim 11, wherein the imaging unit and the processing unit are integrated in a mobile phone, PDA or tablet computer.
15. The three-dimensional information acquisition system according to claim 10 or 11, wherein the pattern projection module projects the measurement pattern intermittently in a strobe manner, and the imaging unit acquires an image of the object onto which the measurement pattern is projected and an image of the object onto which the measurement pattern is not projected.
CN201721745583.6U 2017-12-14 2017-12-14 Pattern projection module, three-dimensional information acquisition system and processing unit Active CN207600393U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201721745583.6U CN207600393U (en) 2017-12-14 2017-12-14 Pattern projection module, three-dimensional information acquisition system and processing unit

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201721745583.6U CN207600393U (en) 2017-12-14 2017-12-14 Pattern projection module, three-dimensional information acquisition system and processing unit

Publications (1)

Publication Number Publication Date
CN207600393U true CN207600393U (en) 2018-07-10

Family

ID=62764196

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201721745583.6U Active CN207600393U (en) 2017-12-14 2017-12-14 Pattern projection module, three-dimensional information acquisition system and processing unit

Country Status (1)

Country Link
CN (1) CN207600393U (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107907055A (en) * 2017-12-14 2018-04-13 北京驭光科技发展有限公司 Pattern projection module, three-dimensional information obtain system, processing unit and measuring method
CN107907055B (en) * 2017-12-14 2024-01-26 北京驭光科技发展有限公司 Pattern projection module, three-dimensional information acquisition system, processing device and measuring method
US20210052195A1 (en) * 2019-08-19 2021-02-25 National Central University Tremor identification method and system thereof

Similar Documents

Publication Publication Date Title
CN107907055A (en) Pattern projection module, three-dimensional information obtain system, processing unit and measuring method
CN101821580B (en) System and method for the three-dimensional measurement of shape of material objects
US11022692B2 (en) Triangulation scanner having flat geometry and projecting uncoded spots
US9529945B2 (en) Robot simulation system which simulates takeout process of workpieces
US10582188B2 (en) System and method for adjusting a baseline of an imaging system with microlens array
US20040008259A1 (en) Optical methods for remotely measuring objects
US20150070468A1 (en) Use of a three-dimensional imager's point cloud data to set the scale for photogrammetry
US20140268178A1 (en) System and method of acquiring three dimensional coordinates using multiple coordinate measurement devices
JP2004170412A (en) Method and system for calibrating measuring system
CN207600393U (en) Pattern projection module, three-dimensional information obtain system and processing unit
CN108458731A (en) For by three-dimensional data visualization method
CN102881040A (en) Three-dimensional reconstruction method for mobile photographing of digital camera
US20200041262A1 (en) Method for performing calibration by using measured data without assumed calibration model and three-dimensional scanner calibration system for performing same
JP2896539B2 (en) How to detect unique information such as object position and angle
US7046839B1 (en) Techniques for photogrammetric systems
JP6486083B2 (en) Information processing apparatus, information processing method, and program
US20160349045A1 (en) A method of measurement of linear dimensions of three-dimensional objects
JP2021038939A (en) Calibration device
TW201349171A (en) System and method for establishing a coordinate system on a curved surface
Ahrnbom et al. Calibration and absolute pose estimation of trinocular linear camera array for smart city applications
RU164082U1 (en) DEVICE FOR MONITORING LINEAR SIZES OF THREE-DIMENSIONAL OBJECTS
JP3781762B2 (en) 3D coordinate calibration system
JP2019184430A (en) Three-dimensional position measurement system
JP2003294433A (en) Method for synchronizing coordinate systems of plural three dimensional shape measuring devices
Yasuda et al. 360 degrees Three-Dimensional measurement using five measuring systems

Legal Events

Date Code Title Description
GR01 Patent grant