CN1300551C - Apparatus and method for automatically arranging three dimensional scan data using optical marker - Google Patents


Info

Publication number
CN1300551C
CN1300551C CNB038178915A CN03817891A
Authority
CN
China
Prior art keywords
marker
data
scan
image
image data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CNB038178915A
Other languages
Chinese (zh)
Other versions
CN1672013A (en)
Inventor
张敏镐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SOLUTIONIX CORP
Original Assignee
SOLUTIONIX CORP
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR10-2003-0022624A (see KR100502560B1)
Application filed by SOLUTIONIX CORP
Publication of CN1672013A
Application granted
Publication of CN1300551C
Anticipated expiration (legal status: Critical)
Expired - Fee Related (legal status: Critical, Current)


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An apparatus and method for automatically arranging 3D scan data using optical markers are disclosed, in which non-contact markers are adopted so that the 3D data are arranged automatically while the scanned portions of the object are neither lost nor damaged. The apparatus, which uses optical markers to automatically arrange 3D scan data obtained by photographing an object at various angles, comprises: marker generating means for projecting a plurality of optical markers onto a surface of the object; pattern projecting means for projecting patterns onto the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object, including the markers projected on its surface, and for obtaining 3D scan data of the object through the patterns projected on its surface; and control means for calculating the 3D positions of the markers from the relation between the 2D image data and the 3D scan data, and for calculating the relative positions of the 3D scan data based on the 3D positions of the markers.

Description

Apparatus and method for automatically arranging three-dimensional scan data using optical markers
Technical field
The present invention relates to an apparatus and method for automatically arranging three-dimensional (3D) scan data using optical markers and, more particularly, to an apparatus and method for automatically arranging, in a single coordinate system, the relative positions of a plurality of 3D scan data sets captured at different positions and angles.
Background Art
In general, an optical 3D scanner can acquire 3D data only for the portion of an object's surface that lies within the scanner's field of view. To scan other regions of the object that are occluded from the scanner's line of sight, the object must be rotated or moved, or the scanner itself must be moved and positioned where the hidden portion or portions become visible. A complete 3D scan can then be obtained by observing the object at different orientations and angles, after which the 3D data so obtained can be sorted and integrated into a single unified coordinate system.
The reason the acquired 3D data must be integrated into a single unified coordinate system is that, when the scanner is moved to different positions and angles to record the object, each 3D scan data set is defined in a different coordinate system determined by the scanner's position.
To match these different coordinate systems, the distance the scanner has been moved must be known. There are two kinds of algorithms for calculating this distance: one obtains the absolute distance by means of a numerically controlled device that moves the scanner; the other calculates the distance by referring only to the scan data itself.
In the latter case, scanning must be performed so that the scan data sets overlap one another, and corresponding points are entered at the overlapping portions of the data. The scan data are then aligned with reference to these corresponding points, so that the coordinate systems defining the individual scan data sets are merged into one unified coordinate system.
When the corresponding points are entered by hand, errors occur easily, depending on how accurately the operator enters them. In particular, if there are no distinctive features on the object's surface, still more mistakes are likely. Moreover, a large or complex object requires a great number of scanning operations with changing scanner positions and angles, which slows the entry of corresponding points and makes the scanning operation more error-prone. A further drawback is that operator carelessness can lead to wrongly entered or omitted corresponding points, frequently resulting in inaccurate alignment.
To overcome these drawbacks, a method has recently been introduced in which small detectable features, called markers or targets, are attached to the object's surface so that the operator can identify them and enter the corresponding points accurately. A technique has also been developed that identifies the corresponding points automatically by means of an image-processing algorithm.
As shown in Fig. 1, conventional markers 4 are attached at random to the surface of an object 2, and the surface of the object 2 is scanned part by part in an overlapping manner.
When a plurality of scan data sets have been obtained by the above scanning method, the operator manually designates the corresponding markers, as shown in Fig. 2.
After first and second scan data sets I1 and I2 are obtained by scanning the surface of the object 2, the operator finds the markers M1 and M2 that appear in both scan data sets I1 and I2, as shown in Fig. 2, and aligns the two scan data sets I1 and I2 by matching these markers as corresponding points.
Meanwhile, in the technique that identifies corresponding points automatically, markers carrying mutually distinguishable patterns are located by image processing, and if markers with the same pattern are found in two different scan data sets, the two scan data sets can be aligned automatically according to those markers.
However, the conventional technique of identifying corresponding points of the scan data using the above markers has a drawback: because of the physical size of the markers attached to the object's surface, part of the surface data covered by the markers may be lost.
Although the lost data can be recovered by rescanning after inserting or moving the markers, a further drawback is that this approach is inaccurate and requires many working hours.
Summary of the invention
The present invention provides an apparatus and method for automatically arranging 3D scan data using optical markers, wherein the optical markers are of a non-contact type so that the scanned portions of the object are neither lost nor damaged.
According to one aspect of the present invention, there is provided an apparatus using optical markers for automatically arranging 3D scan data obtained by photographing an object at various different angles, comprising: marker generating means for projecting a plurality of optical markers onto the surface of the object; pattern projecting means for projecting patterns onto the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object, including the markers projected on its surface, and for obtaining 3D scan data of the object through the patterns projected on its surface; and control means for calculating the 3D positions of the optical markers from the relation between the 2D image data and the 3D scan data, and for calculating the relative positions of the 3D scan data based on the 3D positions of the optical markers.
According to another aspect of the present invention, there is provided a method of automatically arranging 3D scan data using optical markers, the method comprising the steps of: moving an image obtaining means to a position suitable for capturing an image of a portion of the object; projecting the optical markers onto the surface of the object by a marker generating means, and obtaining, by the image obtaining means, 2D image data of the portion of the object including the optical markers projected on its surface; projecting patterns onto the surface of the object by a pattern projecting means, and obtaining, by the image obtaining means, 3D scan data of the portion of the object while the patterns are projected on it; and calculating the 3D positions of the optical markers from the relation between the 2D image data and the 3D scan data, and arranging the 3D scan data obtained from different portions of the object according to the 3D positions of the optical markers.
Brief Description of the Drawings
For a better understanding of the features and objects of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings, in which:
Fig. 1 is an exemplary view illustrating 3D scanning of an object with conventional adhesive-type markers attached to it;
Fig. 2 is an exemplary view illustrating the alignment of different scan data sets using adhesive-type markers as reference points;
Fig. 3 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a first embodiment of the present invention;
Figs. 4a-4c are schematic diagrams illustrating, according to the first embodiment, examples of the state in which 2D image data are obtained using the optical markers and the state in which 3D scan data are obtained using the patterns;
Fig. 5 is a schematic diagram illustrating, according to the first embodiment, an example of deriving the 2D positions of the markers from 2D image data obtained by switching the optical markers on and off;
Fig. 6 is a schematic diagram illustrating an example of deriving the 3D position of a marker from its 2D position and the center of the camera lens;
Figs. 7a-7b are schematic diagrams illustrating, according to the first embodiment, the operation of searching for corresponding markers by comparing triangles in mutually different image data;
Figs. 8a-8d are schematic diagrams illustrating, according to the first embodiment, the transformation operation that matches two triangles located at mutually different positions when the triangles of two different image data sets are compared;
Fig. 9 is a schematic diagram illustrating, according to the first embodiment, the operation of retrieving corresponding markers by means of virtual markers obtained in mutually different image data;
Figs. 10a-10b are flowcharts illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to the first embodiment;
Fig. 11 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a second embodiment of the present invention;
Fig. 12 is a flowchart illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to the second embodiment;
Fig. 13 is a flowchart illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to a third embodiment of the present invention;
Fig. 14 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a fourth embodiment of the present invention;
Fig. 15 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a fifth embodiment of the present invention;
Fig. 16 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a sixth embodiment of the present invention;
Figs. 17a-17b are flowcharts illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to the sixth embodiment;
Fig. 18 is a schematic diagram illustrating errors occurring in the process of aligning scan data with respect to a single reference coordinate system;
Fig. 19 is a schematic diagram illustrating errors occurring in the process of aligning scan data with respect to an absolute coordinate system;
Fig. 20 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention;
Figs. 21a-21b are flowcharts illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to the seventh embodiment;
Fig. 22 is a view of an example image obtained using the large-area image obtaining part described in Fig. 20;
Fig. 23 is a view of an example image obtained using the large-area image obtaining part and the image obtaining part described in Fig. 20;
Fig. 24 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to an eighth embodiment of the present invention;
Figs. 25a and 25b are flowcharts illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to the eighth embodiment;
Fig. 26a is a view of an example image obtained using the pair of large-area image obtaining parts described in Fig. 24;
Fig. 26b is a view of an example image obtained using the pair of large-area image obtaining parts and the image obtaining part described in Fig. 24;
Fig. 27 is a schematic diagram illustrating the principle of the eighth embodiment of the present invention;
Fig. 28 is a schematic diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a ninth embodiment of the present invention;
Fig. 29 is a flowchart illustrating the operation of the method of automatically arranging 3D scan data using optical markers according to the ninth embodiment;
Fig. 30 is a diagram illustrating the structure of an apparatus for automatically arranging 3D scan data using optical markers according to a tenth embodiment of the present invention;
Fig. 31 is a schematic diagram illustrating the structure of the marker generator according to an eleventh embodiment of the present invention.
Detailed Description of the Embodiments
A first embodiment of the present invention is described in detail below with reference to the accompanying drawings.
Fig. 3 is a structural diagram illustrating the apparatus for automatically arranging 3D scan data using optical markers according to the first embodiment, wherein the apparatus comprises a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement drive part 20, a movement mechanism 22, an image input part 24, a marker flash controller 26, a pattern projector controller 28, a microprocessor 30, and a buffer 32.
The marker generator 12, which is designed to project markers recognizable by the image obtaining part 18 onto the surface of the object 10, comprises a plurality of marker outputs 14 for simultaneously projecting a plurality of optical markers onto the surface of the object 10 in irregular directions.
Preferably, the marker outputs 14 are laser pointers capable of projecting a plurality of red dots onto the surface of the object 10, so that the positions of the dots projected on the surface of the object 10 can easily be distinguished in the images obtained by the image obtaining part 18, which may be, for example, a camera.
The marker generator 12 is by no means limited to laser pointers; any laser optical marker may be adopted, provided that it focuses correctly on the object's surface and can easily be controlled to flash repeatedly.
A plurality of marker generators 12 may be arranged around the object so that optical markers are projected over the entire surface of the object 10, and the number of optical markers may be varied according to the size and shape of the object 10. In addition, during scanning the marker generators 12 should be held fixed relative to the object, so that the positions of the markers on the object's surface do not change.
The pattern projector 16 shown in the figure projects predetermined patterns so that 3D scan data of the object 10 can be obtained. That is, using a projector, for example an LCD projector, a space-encoded light beam is projected onto the surface of the object 10, or a laser beam is projected onto the surface of the object 10, so that the 3D scan data of the object can be obtained by the image obtaining part 18.
Preferably, the pattern projector 16 is a slide projector comprising a light source, a pattern film, and a lens for projecting the predetermined patterns, or alternatively an electronic LCD projector, or a laser diode for projecting laser stripe patterns. The pattern film carrying the stripe patterns is fed between the light source and the lens by a predetermined feeding device, allowing a series of stripe patterns to be projected onto the object 10.
The pattern film may carry stripe patterns of different pitches, as disclosed in Korean Patent Application No. 2002-10839, filed by the applicant on February 28, 2002 and entitled "3D Scanning Apparatus and Method Using a Plurality of Stripe Patterns". The same applies to scanning devices using laser stripe patterns.
In addition, the markers should preferably not be projected onto the object 10 while the 3D scan data are being obtained, because the scan data may be corrupted by the markers on the object 10.
The image obtaining part 18 comprises an image sensor capable of receiving images, for example a charge-coupled device (CCD) camera or a CMOS camera. The image obtaining part 18 obtains images by photographing the object 10 while the markers are optically projected onto its surface.
The image obtaining part 18 may be configured separately from the pattern projector 16, but preferably the two are installed as an integrated unit: the structure is then simple, and with the image obtaining part 18 integral with the projector 16 the 2D image data and the 3D scan data can easily be matched without calibration.
The image obtaining part 18 obtains the 2D image data and the 3D scan data in synchronization with the flashing cycle of the optical markers and the projection cycle of the patterns, the details of which are shown in Figs. 4a, 4b, and 4c.
As shown in Fig. 4a, the image obtaining part 18 obtains first image data 40 by photographing a specific portion of the object 10 onto which a plurality of optical markers (RM) are arbitrarily projected.
Next, as shown in Fig. 4b, the image obtaining part 18 obtains second image data 42 by photographing the same portion of the object 10 shown in Fig. 4a while the marker generator 12 is switched off, so that no laser optical markers are projected on the surface of the object 10.
Next, as shown in Fig. 4c, the image obtaining part 18 obtains the 3D scan data by photographing the object 10 while stripe patterns from the pattern projector 16 are projected onto it and the marker generator 12 is switched off. Specifically, the 3D scan data are obtained in the form of first to fifth scan data 44a-44e, corresponding respectively to the same portion of the surface of the object 10 under different stripe patterns PT1-PT5. Although in the present embodiment the pattern film has five different stripe patterns, it is not limited thereto and may have more.
The movement drive part 20 moves the pattern projector 16 and the image obtaining part 18 relative to the object 10 under the drive control of the microprocessor 30, so that images of the entire object 10 can be obtained.
The movement mechanism 22 receives a signal from the movement drive part 20 and moves the pattern projector 16 and the image obtaining part 18 in a predetermined direction relative to the object. Although the present embodiment employs the movement drive part 20 to move the pattern projector 16 and the image obtaining part 18 electrically, the movement mechanism 22 may obviously be operated manually instead.
The image input part 24 shown in the figure receives the image data obtained by the image obtaining part 18, and the marker flash controller 26 flashes the optical markers of the marker generator 12 under the control of the microprocessor 30.
The pattern projector controller 28 controls the feed rate and feed direction of the pattern film of the pattern projector 16, and also controls the flashing of the light source used for projection.
The microprocessor 30 receives and analyzes, through the image input part 24, the 2D image data and the 3D scan data photographed at different angles, and automatically arranges the 3D scan data in a single consistent coordinate system.
As shown in Fig. 5, to find the 2D positions of the laser optical markers, the microprocessor 30 performs image processing on the first image data 40 containing the laser optical markers (RM) and the second image data 42 without the laser optical markers (RM); as a result, third image data 46 containing only the laser optical markers (RM) are obtained.
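The differencing step of Fig. 5 can be sketched as follows: the marker-off frame is subtracted from the marker-on frame, the residual bright spots are thresholded, and each spot's centroid is taken as a marker's 2D position. This is an illustrative Python/NumPy sketch only, not the patented implementation; the function name, the brightness threshold, and the simple 4-connected labelling pass are all our own assumptions.

```python
import numpy as np

def marker_centroids(img_on, img_off, thresh=50):
    """Isolate projected laser markers by differencing two frames.

    img_on  -- grayscale frame captured with the marker generator on
    img_off -- same view captured with the markers switched off
    Returns a list of (row, col) centroids, one per bright blob.
    """
    diff = img_on.astype(np.int32) - img_off.astype(np.int32)
    mask = diff > thresh
    labels = np.zeros(mask.shape, dtype=np.int32)
    current = 0
    centroids = []
    for r in range(mask.shape[0]):
        for c in range(mask.shape[1]):
            if mask[r, c] and labels[r, c] == 0:
                current += 1
                labels[r, c] = current
                stack, pixels = [(r, c)], []
                while stack:            # 4-connected flood fill
                    y, x = stack.pop()
                    pixels.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            stack.append((ny, nx))
                ys, xs = zip(*pixels)
                centroids.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return centroids
```

In practice a connected-component routine from an imaging library would replace the hand-rolled flood fill; it is written out here only to keep the sketch self-contained.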
As shown in Fig. 6, the microprocessor 30 calculates the 3D position of a marker from the relation between the lens center 50 of the image obtaining part 18 and the marker position found in the 2D image data 52. The 3D position (a', b', c') of each marker can be obtained by evaluating the intersection points at which the straight lines connecting the lens center 50 of the image obtaining part 18 with the marker positions (a, b, c) in the 2D image data intersect the 3D scan data 54.
With the pattern projector 16 and the image obtaining part 18 configured as an integrated unit, the 3D positions of the markers can be obtained quickly. If, however, the pattern projector 16 and the image obtaining part 18 are configured separately, the coordinates of the pattern projector 16 and the image obtaining part 18 must be calibrated in order to obtain the 3D positions of the markers. Through the above steps, the 3D positions can be obtained for the 3D scan data photographed at different angles.
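The intersection described above — a ray from the lens center 50 through a marker's 2D position, intersected with the triangulated scan data 54 — can be sketched with a standard ray/triangle test. This is an illustrative sketch using the Möller-Trumbore algorithm; the patent does not specify this particular algorithm, and the function names and tolerance are our own.

```python
def _sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def _cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def _dot(a, b):
    return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Intersect the marker ray with one triangle of the scan mesh.

    origin    -- lens center of the image obtaining part
    direction -- ray direction through the marker's 2D image position
    Returns the 3D hit point, or None if the ray misses this triangle.
    """
    e1, e2 = _sub(v1, v0), _sub(v2, v0)
    p = _cross(direction, e2)
    det = _dot(e1, p)
    if abs(det) < eps:              # ray parallel to the triangle plane
        return None
    inv = 1.0 / det
    s = _sub(origin, v0)
    u = _dot(s, p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = _cross(s, e1)
    v = _dot(direction, q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    t = _dot(e2, q) * inv
    if t < eps:                     # intersection behind the lens center
        return None
    return (origin[0] + t*direction[0],
            origin[1] + t*direction[1],
            origin[2] + t*direction[2])
```

Running this test over every triangle of the scan data 54 and keeping the nearest hit yields the marker's 3D position (a', b', c').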
Meanwhile, each scan data set should preferably contain more than 4-5 markers, and two adjacent scan data sets should share 3 or more markers in common. This is because 3 or more points are needed to define a unique position in 3D space, and finding the corresponding markers also requires 3 or more points (described below).
In addition, it would be possible in the present invention to give each marker a different pattern so that the markers can be distinguished from one another. However, the configuration and manufacture of the apparatus could then become complicated, because hundreds or thousands of marker outputs would each have to project a differently shaped marker.
In the present invention, when automatically arranging the 3D scan data obtained by photographing adjacent regions of the object in an overlapping manner, the markers are distinguished from one another by using the relative positions of the markers calculated at the microprocessor 30 on the basis of each 3D scan data set. For example, three marker points form a triangle, and triangles formed from three different points differ from one another, so each triangle can be distinguished by comparing its angles and side lengths. A marker corresponding to a triangle vertex can thereby be distinguished from another marker. This process is described in detail below.
As shown in Figs. 7a and 7b, where one scan data set 60 contains M marker points and the adjacent scan data set 62 contains N marker points, the scan data set 60 contains C(M,3) distinct triangles and the other scan data set contains C(N,3) distinct triangles. Corresponding triangle pairs can then be found by comparing the two scan data sets a total of C(M,3) × C(N,3) times.
First, as shown in Fig. 7a, the microprocessor 30 constructs a plurality of triangles T1 and T2 from the marker points contained in one scan data set 60, and a plurality of triangles T3 and T4 from the marker points contained in the other scan data set 62.
Next, as shown in Fig. 7b, the microprocessor 30 finds a pair of mutually corresponding triangles, for example T1 and T3, contained respectively in the two scan data sets 60 and 62.
Various methods can be used to compare the triangles. One is to find a corresponding pair by comparing the lengths of their sides. In other words, the three sides (a1, a2, a3) and (b1, b2, b3) of each triangle are compared, and if the length of each side equals that of its counterpart, and the sides occur in the same order, the two triangles are determined to correspond.
When searching for triangles having three equal sides, the sides of each triangle are arranged in decreasing order of length; if, for example, two or more triangles are found to have the same sides, the order of the sides is then checked. That is, if the sides compared in counterclockwise or clockwise order starting from the longest side are identical, the two triangles are judged to correspond.
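The side-length comparison above can be sketched as follows: every triple of markers in each data set is formed, each triangle's side lengths are sorted in decreasing order, and triangles whose sorted lengths agree within a tolerance are paired. This is a minimal sketch under our own naming; the orientation (side-order) check described above for disambiguating triangles with identical side sets is omitted for brevity.

```python
from itertools import combinations
import math

def edge_signature(p1, p2, p3):
    """Side lengths of the triangle (p1, p2, p3), sorted in decreasing order."""
    d = math.dist
    return tuple(sorted((d(p1, p2), d(p2, p3), d(p3, p1)), reverse=True))

def match_triangles(markers_a, markers_b, tol=1e-6):
    """Pair marker triples from two scan data sets whose side lengths agree.

    markers_a, markers_b -- lists of 3D marker positions (x, y, z)
    Returns a list of (triple_from_a, triple_from_b) candidate matches.
    """
    matches = []
    for ta in combinations(markers_a, 3):
        sa = edge_signature(*ta)
        for tb in combinations(markers_b, 3):
            sb = edge_signature(*tb)
            if all(abs(x - y) <= tol for x, y in zip(sa, sb)):
                matches.append((ta, tb))
    return matches
```

The nested loop realizes the C(M,3) × C(N,3) comparisons mentioned above; with the handful of markers per scan that the text recommends, this brute-force search is inexpensive.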
As explained above, after the corresponding triangles, or markers, in each scan data set have been identified, the scan data are moved so that these markers are located at the same points in a single consistent coordinate system. That is to say, one of the triangles is used as a reference, the corresponding triangle is moved onto it, and as a result the two coordinate systems are matched.
The process of matching two triangles located in two different scan data sets is shown in Figs. 8a-8d. As shown in Fig. 8a, given two triangles of identical size and shape located in different scan data sets, the information on their vertices and sides is known once the two corresponding triangles have been determined.
As shown in Fig. 8b, one selected vertex of each of the two corresponding triangles is matched by using a translation matrix (T); here, the reference coordinate system associated with one triangle is denoted A, and the other coordinate system is denoted B. The translation matrix (T) effecting this is defined by Formula 1:
Formula 1: T = T(A1 - B1)
Next, as shown in Fig. 8c, a rotational transformation is performed using a rotation matrix (R1) to match one corresponding side, where the rotation matrix (R1) is defined by Formula 2:
Formula 2: R1 = R(θ1)
Fig. 8d shows the rotational transformation performed using a rotation matrix (R2) to match the remaining corresponding vertex, where the rotation matrix (R2) is defined by Formula 3:
Formula 3: R2 = R(θ2)
As a result, the total matching process is carried out by the transformation matrix M, which is defined by Formula 4:
Formula 4: M = T · R1 · R2
Thus, a point (P) contained in one scan data set is moved to its new position in the other scan data set by Formula 5:
Formula 5: P′ = M × P
Meanwhile, calculation errors may occur, because a physical marker cannot be mathematically defined as a point but has an actual size. Thus, after the microprocessor 30 has matched the different scan data sets as described above, it carries out more elaborate fine processing to obtain a more accurate match. That is, the positions are further adjusted on the basis of the mesh data, a step called "registration". Through this processing, each scan data set can be merged more accurately into the single consistent coordinate system. This is described in detail below.
Where point-cloud data A comprising a plurality of points and point-cloud data B comprising a plurality of points are to be merged into a single consistent coordinate system, coordinate system A is translated and rotated so as to be merged into coordinate system B. In this case, if the n points of A are P = {pi} and the corresponding points of B are Q = {xi}, the translation and rotation transformation can be obtained by the least-squares method minimizing the distance between P and Q, and this transformation is applied to A. As a result, the mean distance between the point clouds represented by A and B is minimized, and the process is repeated until the mean distance between P and Q falls within tolerance.
The corresponding points from the point-cloud data can also be obtained by using the least-squares method, and the objective function applied is given in Formula 6:
Formula 6:
$$f(\vec{q}) = \frac{1}{N_p} \sum_{i=1}^{N_p} \left\| x_i - \left( R(\vec{q}_R)\, p_i + \vec{q}_T \right) \right\|^2$$
Here, $\vec{q}$ is the registration state vector, defined as $\vec{q} = [\vec{q}_R \mid \vec{q}_T]^t$, where $\vec{q}_R$ is a unit quaternion $\vec{q}_R = [q_0\ q_1\ q_2\ q_3]^t$ with $q_0 \ge 0$ and $q_0^2 + q_1^2 + q_2^2 + q_3^2 = 1$, and $\vec{q}_T$ is the translation vector, defined as $[q_4\ q_5\ q_6]^t$.
In Formula 6 above, $f(\vec{q})$ is the mean squared distance between $x_i$ and $R(\vec{q}_R) p_i + \vec{q}_T$, and $R(\vec{q}_R)$ and $\vec{q}_T$ can be calculated by the least-squares method. In Formula 6, $R(\vec{q}_R)$ is the 3 × 3 rotation matrix defined by Formula 7:
Formula 7:
$$R(\vec{q}_R) = \begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) \\ 2(q_1 q_2 + q_0 q_3) & q_0^2+q_2^2-q_1^2-q_3^2 & 2(q_2 q_3 - q_0 q_1) \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2+q_3^2-q_1^2-q_2^2 \end{bmatrix}$$
In addition, where a set of scan data is defined by P = {pi} and a set of reference data (X) is defined by X = {xi}, the centroids of P and X are given by Formula 8:
Formula 8:
$$\mu_p = \frac{1}{N_p} \sum_{i=1}^{N_p} p_i, \qquad \mu_x = \frac{1}{N_x} \sum_{i=1}^{N_x} x_i$$
In addition, the cross-covariance matrix of P and X is given by the following Formula 9:
Formula 9
\Sigma_{px} = \frac{1}{N_p} \sum_{i=1}^{N_p} \left[ (p_i - \mu_p)(x_i - \mu_x)^T \right] = \frac{1}{N_p} \sum_{i=1}^{N_p} \left[ p_i x_i^T \right] - \mu_p \mu_x^T
The cyclic components of the antisymmetric matrix A_ij are used to form a column vector Δ. This vector Δ is then used to form a symmetric 4×4 matrix Q(\Sigma_{px}), which is given by the following Formula 10:
Formula 10
A_{ij} = \left( \Sigma_{px} - \Sigma_{px}^T \right)_{ij}

\Delta = [A_{23}\ A_{31}\ A_{12}]^T

Q(\Sigma_{px}) = \begin{bmatrix} \operatorname{tr}(\Sigma_{px}) & \Delta^T \\ \Delta & \Sigma_{px} + \Sigma_{px}^T - \operatorname{tr}(\Sigma_{px}) I_3 \end{bmatrix}
where I_3 is the 3×3 identity matrix. In the above Formula 10, the quaternion Q_R = (q_0, q_1, q_2, q_3) is obtained as the unit eigenvector corresponding to the largest eigenvalue of Q(\Sigma_{px}), and the rotation matrix is obtained by substituting this quaternion into Formula 7. Meanwhile, by using R(Q_R) given by Formula 7 above, the translation Q_T = (q_4, q_5, q_6) can be obtained from the following Formula 11:
Formula 11
Q_T = \mu_x - R(Q_R)\, \mu_p
As a result, the final transformation matrix is given by the following Formula 12:
Formula 12
\begin{bmatrix} q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3) & 2(q_1 q_3 + q_0 q_2) & q_4 \\ 2(q_1 q_2 + q_0 q_3) & q_0^2+q_2^2-q_1^2-q_3^2 & 2(q_2 q_3 - q_0 q_1) & q_5 \\ 2(q_1 q_3 - q_0 q_2) & 2(q_2 q_3 + q_0 q_1) & q_0^2+q_3^2-q_1^2-q_2^2 & q_6 \\ 0 & 0 & 0 & 1 \end{bmatrix}
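As a non-authoritative sketch, the closed-form registration of Formulas 8 through 12 can be written out as follows; the function names and test values are ours, and NumPy's symmetric eigensolver stands in for whatever eigenvalue routine an implementation would use:

```python
import numpy as np

def quat_to_rot(q):
    # Formula 7: 3x3 rotation matrix from a unit quaternion [q0, q1, q2, q3]
    q0, q1, q2, q3 = q
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3),             2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3),             q0*q0 + q2*q2 - q1*q1 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2),             2*(q2*q3 + q0*q1),             q0*q0 + q3*q3 - q1*q1 - q2*q2],
    ])

def register(P, X):
    """Closed-form rotation R and translation t minimizing Formula 6
    for corresponding (N, 3) point sets P and X (Formulas 8-11)."""
    mu_p, mu_x = P.mean(axis=0), X.mean(axis=0)              # Formula 8
    sigma_px = (P - mu_p).T @ (X - mu_x) / len(P)            # Formula 9
    A = sigma_px - sigma_px.T
    delta = np.array([A[1, 2], A[2, 0], A[0, 1]])            # Formula 10
    Q = np.empty((4, 4))
    Q[0, 0] = np.trace(sigma_px)
    Q[0, 1:] = Q[1:, 0] = delta
    Q[1:, 1:] = sigma_px + sigma_px.T - np.trace(sigma_px) * np.eye(3)
    w, v = np.linalg.eigh(Q)
    q = v[:, np.argmax(w)]            # eigenvector of the largest eigenvalue
    R = quat_to_rot(q)
    t = mu_x - R @ mu_p                                      # Formula 11
    return R, t

# demo: recover a hypothetical rigid motion from noiseless correspondences
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
P = np.random.default_rng(0).normal(size=(20, 3))
X = P @ R_true.T + t_true
R_est, t_est = register(P, X)
```

With exact correspondences and no noise the recovered motion matches the true one to machine precision; with measurement noise the result is the least-squares optimum of Formula 6.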
When the corresponding markers of each 3D scan data have been obtained, the microprocessor 30 takes one scan data 60 as the reference coordinate system and calculates the total transformation matrix for the conversion, thereby automatically aligning all the 3D scan data to this reference coordinate system.

Meanwhile, besides the method shown in Figs. 8a to 8d, it is also possible to map a coordinate system itself onto the reference coordinate system by using the least-squares method after finding the corresponding triangles, which represents aligning the scan data.

In the process of finding the corresponding triangles, the information of the corresponding markers lying at the vertices is obtained, for example P and X in Formula 6, and the optimal transformation matrix is given by the following Formula 13:
Formula 13
T = \begin{bmatrix} R & T \\ 0 & 1 \end{bmatrix}
The formula for the point-cloud data P, which is to be aligned by this coordinate-mapping method, can be defined by the following Formula 14:
Formula 14
P′=TP
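As an illustration, the homogeneous-matrix form of Formulas 12 to 14 can be sketched as follows (a minimal NumPy example; the rotation and translation values are hypothetical, not taken from the patent):

```python
import numpy as np

# Assemble R and the translation into one 4x4 matrix (Formulas 12/13),
# then apply it to a point cloud in homogeneous coordinates (Formula 14).
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])          # sample 90-degree rotation about z
t = np.array([1.0, 2.0, 3.0])             # sample translation
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = t

P = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])           # two sample points
P_h = np.hstack([P, np.ones((len(P), 1))])  # homogeneous coordinates
P_prime = (T @ P_h.T).T[:, :3]              # P' = T P
```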
Meanwhile, in the method of finding a pair of corresponding triangles explained above, three or more corresponding markers should be included in the overlapping region of each scan data. Therefore, in the case where only two corresponding markers are included in this region, another method should be adopted to find the corresponding markers.

Since each scan data having markers contains 3D information, it is possible to find the corresponding markers even if only two corresponding markers are included in the overlapping region. As shown in Fig. 9, there are only two corresponding markers in the overlapping region of scan data 64 and 66, namely the two marker pairs (RM1, RM2) and (RM3, RM4). The corresponding markers are found by comparing the normal vectors at the marker positions and the distance between the two markers.
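One way to realize this comparison is to build a signature for each marker pair that is invariant under rigid motion, for example the inter-marker distance together with the angles between each surface normal and the line joining the markers. The following sketch is our own illustration of that idea, not code from the patent; all names, values, and the tolerance are hypothetical:

```python
import numpy as np

def pair_signature(p1, p2, n1, n2):
    """Rigid-motion-invariant signature of a marker pair: the distance
    between the markers and the angles their unit surface normals make
    with the line joining them."""
    d = p2 - p1
    length = np.linalg.norm(d)
    u = d / length
    ang = lambda n: np.arccos(np.clip(n @ u, -1.0, 1.0))
    return np.array([length, ang(n1), ang(n2)])

def pairs_correspond(sig_a, sig_b, tol=1e-3):
    # two pairs correspond when their signatures agree within tolerance
    return bool(np.all(np.abs(sig_a - sig_b) < tol))

# demo: the signature is unchanged by a rigid motion of the pair
Rz = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
t = np.array([5.0, -1.0, 2.0])
p1, p2 = np.array([0.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])
n1 = np.array([0.0, 0.0, 1.0])
n2 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
sig_a = pair_signature(p1, p2, n1, n2)
sig_b = pair_signature(Rz @ p1 + t, Rz @ p2 + t, Rz @ n1, Rz @ n2)
```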
Meanwhile, if too many markers are projected onto each scan data, or if the markers are projected uniformly, the possibility of obtaining inaccurate corresponding markers increases. In this case, additional markers and the 3D scan data are used to generate additional references. For example, in the case where there are three corresponding markers, a triangle is formed by the three markers, and a perpendicular line is then drawn from the centroid of this triangle. The intersection point of this perpendicular line with the 3D scan data is then obtained as a fourth reference point. Next, the corresponding markers can be found by utilizing the markers or the mean normal vector of the object surface around the markers.

In addition, in the case where only two corresponding markers are available, a straight line is drawn by connecting the two markers, and a circle perpendicular to this line is drawn about its midpoint. The intersection points of this circle with the 3D scan data are then obtained as fourth and fifth reference points.

According to the preferred embodiments of the present invention, it is apparent that, besides the methods described above, it is also possible to accomplish the automatic alignment of the 3D scan data by generating additional reference points around the markers.

As shown in Fig. 3, the information about the markers newly obtained through the automatic alignment processing of the 3D scan data is stored in the buffer 32.
Hereinafter, the operation according to the above-described first embodiment of the present invention will be explained in detail with reference to Figs. 10a and 10b, where S denotes a step.

First, the microprocessor 30 controls the movement driving part 20 to start the movement mechanism 22, which moves the image acquisition part 18, integrated with the pattern projector 16, to a position suitable for scanning the object 10 (S10).

Then, the microprocessor 30 controls the marker flash controller 26 to allow the plurality of marker outputs 14 provided on the marker generator 12 to project markers arbitrarily onto the surface of the object 10 (S11).

Next, the image acquisition part 18 photographs a designated region of the object 10 to obtain a 2D image including the optical markers projected onto the surface of the object 10, and the microprocessor 30 then receives this 2D image data via the image input part 24 (S12).

Next, the microprocessor 30 controls the marker flash controller 26 to turn off the marker generator 12 so that no markers are projected onto the object 10 (S13). In this state, the image acquisition part 18 photographs the same region as above to obtain a 2D image without markers, and the microprocessor 30 then receives this 2D image data via the image input part 24 (S14).

The microprocessor 30 controls the projection controller 28 to start the pattern projector 16 while the marker generator 12 is turned off. Then, a predetermined pattern (for example, a pattern of stripes having different gaps therebetween, or a multi-stripe pattern) is projected from the pattern projector 16 onto the surface of the object 10. Next, the image acquisition part 18 photographs the object 10 with the stripe pattern projected onto it to obtain the 3D scan data, and the microprocessor 30 then receives this 3D scan data via the image input part 24 (S15).
In this state, the microprocessor 30 calculates the 2D positions of the markers through image processing of the 2D image data with markers and the 2D image data without markers (S16).

Next, the microprocessor 30 calculates the 3D positions of the markers by using the 2D positions of the markers and the 3D scan data. That is, the 3D position of a marker can be obtained by estimating the intersection point where the straight line connecting the lens center of the image acquisition part 18 and the position of an arbitrary marker in the 2D image data intersects the 3D scan data (S17).
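A simple way to approximate the intersection described in S17 is to pick the scan point closest to the line through the lens center and the marker's image position. The sketch below assumes the ray direction has already been derived from the camera calibration; the function name, inputs, and tolerance are illustrative, not from the patent:

```python
import numpy as np

def marker_3d_position(lens_center, pixel_dir, scan_points, tol=0.05):
    """Approximate the marker's 3D position as the scan point nearest to
    the ray from the lens center through the marker's 2D image position.
    scan_points is an (N, 3) point cloud."""
    u = pixel_dir / np.linalg.norm(pixel_dir)
    v = scan_points - lens_center
    s = v @ u                               # signed distance along the ray
    foot = lens_center + np.outer(s, u)     # foot of perpendicular on the line
    dist = np.linalg.norm(scan_points - foot, axis=1)
    i = int(np.argmin(dist))
    return scan_points[i] if dist[i] < tol else None

# demo: a ray straight down the z axis hits the point (0, 0, 5)
pts = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 5.0], [0.0, 1.0, 5.0]])
hit = marker_3d_position(np.zeros(3), np.array([0.0, 0.0, 1.0]), pts)
```

A production implementation would intersect the ray with the triangulated surface rather than snap to the nearest raw point.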
Meanwhile, the microprocessor 30 checks whether the register of the buffer 32 is empty (S18). If the register of the buffer 32 is not empty, the 3D positions of the markers obtained in S17 (the current 3D scan data) are compared with the 3D positions of the markers stored in the register of the buffer 32 (in other words, 3D data partially overlapping the current 3D scan data) to search for corresponding markers (S19).

When corresponding markers are found by the aforementioned search procedure of comparing the markers included in the current 3D scan data with the markers stored in the register of the buffer 32, the microprocessor 30 calculates the transformation matrix for matching the two 3D scan data (S20). The position of the 3D scan data stored in the register of the buffer 32 is provided as the reference coordinate system, and the current scan data is transformed into this reference coordinate system (S21).

Next, the microprocessor 30 stores the markers newly obtained from the current scan data in the register of the buffer 32 (S22). Then, the microprocessor 30 checks whether the automatic alignment of the 3D scan data is finished (S23). If the alignment is not finished, the process returns to S10, and steps S10 to S23 are then repeated.
Now, the second embodiment of the present invention will be described in detail with reference to the accompanying drawings.

Fig. 11 is a structural diagram of the apparatus for automatically aligning 3D scan data using optical markers according to the second embodiment of the present invention. Throughout the drawings, the same reference numerals and symbols as in the first embodiment denote parts equivalent in function and operation, and for simplicity, explanations of these parts are omitted.

The apparatus for automatically aligning 3D scan data according to the second embodiment of the present invention comprises a marker generator 70, the pattern projector 16, the image acquisition part 18, the movement driving part 20, the movement mechanism 22, the image input part 24, an individual marker flash controller 74, the projection controller 28, a microprocessor 76, and the buffer 32.

The marker generator 70 comprises a plurality of marker outputs 72 and projects, at random positions onto the surface of the object 10, markers that can be recognized by the image acquisition part 18.

The marker generator 70 turns on the first to N-th marker outputs 72 one by one in order in response to the individual marker flash controller 74, so that each image obtained by the image acquisition part 18 can include a different marker.

The individual marker flash controller 74 individually flashes, in a predefined order according to the control of the microprocessor 76, the plurality of marker outputs 72 installed on the marker generator 70. The microprocessor 76 analyzes the 2D image data and the 3D scan data input through the image input part 24 and aligns the coordinate systems corresponding to the plural 3D scan data photographed at plural angles. That is, the microprocessor 76 establishes as a reference image the image photographed by the image acquisition part 18 when all markers are turned off, and then compares this reference image one by one with the plural images photographed when the markers are turned on. Through the above process, the 2D position of each marker is obtained.

Next, the microprocessor 76 performs the same process as performed in the first embodiment. In other words, the microprocessor analyzes the 2D image data and the 3D scan data to calculate the 3D positions of the markers, finds the corresponding markers, obtains the transformation matrix, and transforms the plural 3D scan data into the reference coordinate system.
The operation according to the second embodiment of the present invention as described herein will now be explained in detail in conjunction with the flowchart shown in Fig. 12.

First, the microprocessor 76 controls the movement driving part to start the movement mechanism 22, which moves the image acquisition part 18, integrated with the pattern projector 16, to a position suitable for scanning the object 10 (S30).

Under these circumstances, the microprocessor 76 obtains, as a reference image, the image data photographed by the image acquisition part 18 when all optical markers are turned off. Then, the microprocessor 76 controls the individual marker flash controller 74 to turn on the first designated marker output among the plurality of marker outputs 72 installed on the marker generator 70, and the first marker is projected onto the surface of the object 10 (S31). Then, the image photographed by the image acquisition part 18 is obtained as the first image data (S32).

Next, the microprocessor 76 controls the individual marker flash controller 74 to turn on the second designated marker output according to the predetermined order so that the second optical marker is projected onto the object 10 (S33). Then, the second image data is obtained (S34). Next, the microprocessor 76 checks whether the marker included in the image is the last (N-th) of the predetermined plurality of markers (S35). If the marker is not the last marker, steps S33 and S34 are repeatedly executed until the N-th image data is obtained.

During this time, if the marker is identified as the last one, the marker generator 70 is turned off to prevent the optical markers from being projected, and the microprocessor 76 controls the projection controller 28 to start the pattern projector 16 so that the predetermined pattern for 3D scanning (for example, stripes having different intervals, or a multi-stripe pattern) is projected onto the object 10 by the pattern projector 16.

At this time, if the image acquisition part 18 photographs the object 10 with the pattern projected onto it to obtain the 3D scan data, the microprocessor 76 receives the 3D scan data from the image input part 24 (S36).

The microprocessor 76 compares each of the first to N-th image data with the reference image and finds, in each comparison, the bright spot formed by the optical marker, so that the 2D position of each marker can easily be found (S37).
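The bright-spot search of S37 can be sketched as an image difference followed by an intensity-weighted centroid. This is a minimal illustration under our own assumptions (grayscale images, a single lit marker, a hand-picked threshold), not the patent's actual processing:

```python
import numpy as np

def marker_2d_position(reference, lit_image, threshold=50):
    """Locate the bright spot of one lit marker by subtracting the
    reference image (all markers off) and taking the intensity-weighted
    centroid of the pixels that brightened.  Returns (x, y) or None."""
    diff = lit_image.astype(np.int64) - reference.astype(np.int64)
    ys, xs = np.nonzero(diff > threshold)
    if xs.size == 0:
        return None
    w = diff[ys, xs].astype(np.float64)
    return float(xs @ w / w.sum()), float(ys @ w / w.sum())

# demo: a two-pixel bright spot centered at x = 5.5, y = 2.0
ref = np.zeros((8, 8), dtype=np.uint8)
lit = ref.copy()
lit[2, 5] = 200
lit[2, 6] = 200
pos = marker_2d_position(ref, lit)
```

Real detection would additionally filter sensor noise and refine the centroid to sub-pixel accuracy.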
Next, the microprocessor 76 calculates the 3D positions of the markers from their 2D positions and the 3D scan data, finds the corresponding markers included in the overlapping region with reference to the 3D positions of the markers, calculates the transformation matrix, and transforms the plural 3D scan data into the reference coordinate system (S38), in the same manner as described in the first embodiment.

Next, the microprocessor 76 checks whether the automatic alignment of the 3D scan data is finished (S39). If the alignment is not yet finished, the flow returns to S30, and the pattern projector 16 and the image acquisition part 18 are moved to a proper position by starting the movement mechanism 22 under the control of the movement driving part 20. Steps S30 to S38 are repeated.

Next, the third embodiment of the present invention will be described in detail with reference to the accompanying drawings.

The structure of the apparatus for automatically aligning 3D scan data according to the third embodiment of the present invention is the same as that shown in Fig. 11. However, the method differs between the second embodiment and the third embodiment. That is, in the second embodiment, N images, each including one marker different from the others, must be photographed respectively. In the third embodiment, however, log2(N+1) images are photographed, each including one group of markers used for binarization.

Under the control of the microprocessor 76, the individual marker flash controller 74 divides the marker outputs 72 placed on the marker generator 70 into several groups for binarization, and turns on the markers group by group.

For example, if the number of marker outputs 72 is 16, the individual marker flash controller 74 divides the 16 marker outputs 72 into 4 groups in an overlapping manner.

In other words, for example, the first group comprises the 9th to 16th markers, the second group comprises the 5th to 8th and the 13th to 16th markers, the third group comprises the 3rd, 4th, 7th, 8th, 11th, 12th, 15th, and 16th markers, and the fourth group comprises the even-numbered markers (the 2nd, 4th, 6th, 8th, 10th, 12th, 14th, and 16th), all of which are given in Table 1.
Table 1

Marker        1  2  3  4  5  6  7  8  9  10 11 12 13 14 15 16
First image   0  0  0  0  0  0  0  0  1  1  1  1  1  1  1  1
Second image  0  0  0  0  1  1  1  1  0  0  0  0  1  1  1  1
Third image   0  0  1  1  0  0  1  1  0  0  1  1  0  0  1  1
Fourth image  0  1  0  1  0  1  0  1  0  1  0  1  0  1  0  1

"0" indicates that the marker is turned off, and "1" indicates that the marker is turned on.

As defined in Table 1, the first marker always remains in the off state and the 16th marker always remains in the on state, and every marker has its own intrinsic value.
The microprocessor 76 controls the individual marker flash controller 74 to project the N markers group by group, compares the log2(N) image data obtained by the image acquisition part 18, and calculates the 2D positions of the markers.

In the case where the 16 markers are projected group by group to obtain the first to fourth image data shown in Table 1, the 16 markers are distinguished by their binary codes, which represent their on/off states, that is, their identifiers (IDs). Therefore, the 2D positions of the 16 markers can be obtained. For example, the 10th marker is identified by the binary number "1001", and the 13th marker is identified by the binary number "1100". Meanwhile, the first marker, which always remains in the off state, is not used, so that a total of 15 markers can be used in actual sensing.
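The decoding of marker IDs from the bit-plane images can be sketched as follows; the rows of Table 1 are reproduced directly, and the helper name is ours:

```python
def decode_marker_ids(bitplanes):
    """Recover each marker's binary ID from the per-image on/off rows
    of Table 1, most significant bit-plane first.  Pure-Python sketch."""
    ids = []
    for column in zip(*bitplanes):      # one column = one marker's bits
        code = 0
        for bit in column:
            code = (code << 1) | bit
        ids.append(code)
    return ids

# The four rows of Table 1 for 16 markers:
table1 = [
    [0] * 8 + [1] * 8,        # first image
    ([0] * 4 + [1] * 4) * 2,  # second image
    [0, 0, 1, 1] * 4,         # third image
    [0, 1] * 8,               # fourth image
]
ids = decode_marker_ids(table1)
```

Running this yields ID 0 for the always-off first marker and, as in the text, "1001" (9) for the 10th marker and "1100" (12) for the 13th.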
As a result, even if 1024 (2^10) markers are used, with the above binarization of the markers, 10 image data are sufficient to distinguish these markers. In addition, the microprocessor 76 calculates the 3D positions of the markers from their 2D positions and the 3D scan data, finds the corresponding markers, calculates the transformation matrix, and moves the plural 3D scan data by this transformation matrix. The above process is the same as described in the first embodiment.

The operation of the third embodiment of the present invention will now be explained in detail with reference to the flowchart shown in Fig. 13.

As shown in Table 1, in the embodiment to be explained, the 16 marker outputs installed at the marker generator 70 can be used to project 16 markers, and a total of four 2D image data are obtained from the image acquisition part 18.
First, the microprocessor 76 controls the movement driving part 20 to drive the movement mechanism 22, which moves the image acquisition part 18, integrated with the pattern projector 16, to a position suitable for scanning the object 10 (S40).

Next, the microprocessor 76 controls the individual marker flash controller 74 to turn on the marker outputs 72 so that the markers belonging to the first group (the 9th to 16th markers) can be projected (S41). The first image data photographed by the image acquisition part 18 is obtained via the image input part 24 (S42).

Next, the microprocessor 76 controls the individual marker flash controller 74 to turn on the marker outputs 72 so that the N-th group of markers, for example the 5th to 8th and the 13th to 16th markers, is projected (S43), thereby obtaining the N-th image data photographed by the image acquisition part 18 (S44).

Then, the microprocessor 76 checks whether the marker group included in this image data is the last one (S45); if it is not the last, the flow returns to S43 and the process is repeated.

During this time, if the marker group is identified as the last group as a result of S45, the projection controller 28 drives the pattern projector 16 to project the pattern onto the surface of the object 10, while the marker generator 70 is turned off to prevent the optical markers from being projected.

At this time, when the image acquisition part 18 obtains the 3D scan data by photographing the object 10 onto which the pattern is projected, the microprocessor 76 receives this 3D scan data through the image input part 24 (S46).

Next, the microprocessor 76 compares the first to N-th images obtained from the image acquisition part 18, thereby obtaining the binary information of the markers with respect to the first to fourth image data. Accordingly, the ID of each marker, that is, the 2D position of each marker, is obtained (S47).

Meanwhile, the microprocessor 76 calculates the 3D positions of the markers by analyzing their 2D positions and the 3D scan data, finds the corresponding markers included in the overlapping region of two different 3D scan data with reference to the 3D positions of the markers, calculates the transformation matrix, and transforms one of the 3D scan data into the reference coordinate system by this transformation matrix (S48), in the same manner as described in the first embodiment.

The microprocessor 76 checks whether the automatic alignment of the 3D scan data is finished (S49). If the automatic alignment of the 3D scan data is not yet finished, the flow returns to S40. Accordingly, steps S40 to S48 are repeated.
Next, the fourth embodiment of the present invention will be described in detail with reference to the accompanying drawings.

As shown in Fig. 14, the apparatus for automatically aligning 3D scan data according to the fourth embodiment of the present invention comprises the pattern projector 16, the image acquisition part 18, the movement driving part 20, the movement mechanism 22, the image input part 24, the projection controller 28, the buffer 32, a marker generator 80, an individual marker flash controller 84, and a microprocessor 86.

Throughout the drawings, the same reference numerals and symbols as in the first embodiment denote parts equivalent in function and operation, and for simplicity, explanations of these parts are omitted.

The marker generator 80 projects markers that can be recognized by the image acquisition part 18 onto the surface of the object 10. The marker generator 80 is furnished with a plurality of marker outputs 82 for projecting a plurality of optical markers at random angles over the whole surface of the object 10.

The marker generator 80 selectively flashes the plurality of marker outputs 82 according to the control of the individual marker flash controller 84. The individual marker flash controller 84 controls the plurality of marker outputs 82 individually according to the control of the microprocessor 86.

The microprocessor 86 analyzes the scanned data obtained from the object 10 for automatically aligning the 3D scan data in a single consistent coordinate system. The microprocessor 86 receives, via the image input part 24, the 2D image data and 3D scan data photographed at plural angles by the image acquisition part 18 and analyzes them for automatic alignment on one coordinate system, and its detailed operating procedure is the same as that of the microprocessor in the first embodiment.

However, the difference between the first embodiment and the fourth embodiment is that, while the 2D image data and the 3D scan data of one region of the object 10 are being obtained, the markers projected in this region are flashed at a predetermined period (for example, about 0.5 second), while the markers projected in the other regions are kept in the "on" state under the control of the individual marker flash controller 84.

Conversely, it is also possible to flash the markers in the other regions at the predetermined period while keeping the markers in the region whose acquisition process has been completed in the "on" state.

In other words, the condition of the markers is set differently between the region where the image data and the scan data are being obtained and the other regions. Therefore, the operator can easily distinguish the regions.
Next, the fifth embodiment of the present invention will be described in detail with reference to the accompanying drawings. As shown in Fig. 15, the apparatus for automatically aligning 3D scan data according to the fifth embodiment of the present invention comprises the pattern projector 16, the image acquisition part 18, the movement driving part 20, the movement mechanism 22, the image input part 24, the projection controller 28, the buffer 32, a marker generator 90, an individual marker flash/color controller 94, and a microprocessor 96.

Throughout the drawings, the same reference numerals and symbols as in the first embodiment denote parts equivalent in function and operation, and for simplicity, explanations of these parts are omitted.

The marker generator 90 projects markers that can be recognized by the image acquisition part 18 onto the surface of the object. The marker generator 90 is equipped with a plurality of marker outputs 92 for projecting a plurality of optical markers onto the surface of the object 10 at arbitrary angles.

The marker generator 90 is configured such that at least two or more different colors can be selectively projected from each marker output 92 according to the control of the individual marker flash/color controller 94. For example, each marker output 92 is equipped with two or more light sources, each having a different color, so that these light sources can be made to emit light selectively.

The individual marker flash/color controller 94 controls, according to the control of the microprocessor 96, the flashing and the color of each of the plurality of marker outputs 92 provided on the marker generator 90 individually.

The microprocessor 96 analyzes the 2D image data and 3D scan data photographed at plural angles by the image acquisition part 18 for automatically aligning the 3D scan data in one coordinate system. The detailed operating procedure is the same as in the first embodiment of the present invention.

However, there are some differences between the fifth embodiment and the first embodiment. The microprocessor 96 according to the fifth embodiment controls the individual marker flash/color controller 94 so that the markers projected in the region where the image data and the scan data are being obtained are projected in a color different from that of the markers projected in the other regions.

By such color coding by region, the operator can easily check with the naked eye whether the 2D image data and the 3D scan data have been obtained from a given region, which provides convenience to the scanning operation.
Next, the sixth embodiment of the present invention will be described in detail with reference to Fig. 16.

As shown in Fig. 16, the apparatus for automatically aligning 3D scan data comprises the marker generator 12, the pattern projector 16, the image acquisition part 18, the image input part 24, the marker flash controller 26, the projection controller 28, the buffer 32, a rotary table 100, a rotation driving part 102, a rotation mechanism 104, and a microprocessor 106.

Throughout the drawings, the same reference numerals and symbols as in the first embodiment denote parts equivalent in function and operation, and for simplicity, explanations of these parts are omitted.

The rotary table 100 rotates the object 10 placed on the upper disc of the rotary table 100, and also rotates the plurality of marker generators 12 arranged around the upper disc.

According to the control of the microprocessor 106, the rotation driving part 102 drives the rotation mechanism 104 to rotate the rotary table 100, so that the object can be set to an angle suitable for scanning.

Although the rotation driving part 102 in the sixth embodiment of the present invention is used to rotate the rotary table 100 electrically, it should be apparent here that the rotation mechanism 104 may be turned manually so that the operator can control the rotary table 100 arbitrarily.

In addition, as long as the marker generator and the object can be rotated together in a fixed relative state, not only the rotary table 100 but also other parts may be employed here.

The microprocessor 106 receives the 2D image data and 3D scan data photographed at plural angles by the image acquisition part 18 and analyzes these data for automatically aligning the 3D scan data in one coordinate system, and its detailed operating procedure is the same as that of the microprocessor in the first embodiment.

However, the sixth embodiment of the present invention differs in that what rotates during the scanning process is the object 10 and the marker generator 12, rather than the image acquisition part 18 and the pattern projector 16.
The operating procedure of the apparatus for automatically aligning 3D scan data according to the sixth embodiment as described herein will now be explained in detail with reference to Figs. 17a and 17b.

First, the microprocessor 106 controls the rotation driving part 102 to drive the rotation mechanism 104, thereby rotating the rotary table 100 by a predetermined angle so that the object 10 can be rotated to a position suitable for scanning (S50).

Under such a condition, the microprocessor 106 controls the marker flash controller 26 to turn on the plurality of marker outputs 14 installed on the marker generator 12, thereby allowing a plurality of markers to be projected onto the surface of the object 10 (S51).

While the optical markers are being projected, the image acquisition part 18 photographs the object 10 to obtain a 2D image including the optical markers, and the microprocessor 106 receives, via the image input part 24, the 2D image data obtained by the image acquisition part 18 (S52).

Subsequently, the microprocessor 106 controls the marker flash controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected onto the object 10 (S53). Next, the same region of the object 10 without markers is photographed, and its 2D image data is received via the image input part 24 (S54).

In addition, the microprocessor 106 controls the projection controller 28 to drive the pattern projector 16 while the marker generator 12 is turned off to prevent the optical markers from being projected. Accordingly, a designated pattern (for example, a pattern of stripes with gaps, or a multi-stripe pattern) is projected onto the surface of the object 10 for 3D scanning.

When the image acquisition part 18 photographs the object 10 onto which the given pattern is projected to obtain the 3D scan data, the microprocessor 106 receives this 3D scan data via the image input part 24 (S55).
The microprocessor 106 calculates the 2D positions of the markers by image processing of the 2D image data including the optical markers and the 2D image data lacking them (S56). Then, the microprocessor 106 calculates the 3D positions of the markers from their 2D positions and the 3D scan data (S57). That is, the 3D position of a marker can be obtained by estimating the intersection point where the straight line connecting the lens center of the image acquisition part 18 and the position of an arbitrary marker in the 2D image data intersects the 3D scan data.

Meanwhile, the microprocessor 106 checks whether the register of the buffer 32 is empty (S58).

As a result of the check in S58, if the register of the buffer 32 is not empty, the microprocessor 106 compares the 3D positions of the markers obtained in S57 with the 3D positions of the markers included in the 3D scan data stored in the register of the buffer 32, thereby finding the markers that correspond to each other (S59).

After the corresponding markers are obtained in the search process of S59 by comparing the markers included in the current 3D scan data with the markers stored in the register of the buffer 32, the microprocessor 106 calculates the transformation matrix by analyzing the relationship of the corresponding markers (S60), and transforms the current scan data into the reference coordinate system defined by the 3D scan data listed in the register of the buffer 32 (S61), in the same manner as described in the first embodiment.

Then, the microprocessor 106 stores the markers in the register of the buffer 32 as a reference for the next calculation (S62). Then, the microprocessor 106 checks whether the automatic alignment of the 3D scan data obtained from the object 10 is finished (S63).

As a result of the check, if it is identified that the automatic alignment of the 3D scan data obtained from the object 10 is not finished, the flow returns to S50. Accordingly, the rotation driving part 102 starts the rotation mechanism 104 to turn the rotary table by a similarly specified angle, and the 2D image data and 3D scan data of another region of the object 10 are obtained. Steps S50 to S62 are repeatedly executed.

As is obvious from the above description, the sixth embodiment of the present invention is constructed such that the object 10 is moved, so that the 3D scan data can easily be obtained and aligned from a relatively smaller object than in the first embodiment of the present invention, in which the projector and the image acquisition part are configured to move.

Here, the marker generators are fixed on the rotary table to prevent relative movement between them until the scanning process is finished.
At this moment, use the aligning method of reference coordinate that shortcoming is arranged among the embodiment that mentions in front, if it is very big promptly to be scanned the number in district, mistake will increase.Because in the above methods, by being attached to the frame of reference, a 3D scan-data carries out arrangement, and its adjacent 3D scan-data that has obtained is defined therein, and alignment processes is repeated in all districts of this object.Therefore, the careless mistake in a process can be exaggerated when this arrangement finishes.
For example, Figure 18 a and 18b illustrate by scanning two scan-datas that two adjacent regions that overlap each other obtain.Dotted line is represented the real data of object, and solid line is represented and the inconsistent scan-data of real data.
Under these circumstances, if any one scan-data among Figure 18 a and Figure 18 b be as a reference and another data by additional (arrangements) to this reference, mistake may increase in adding, this causes Fig. 9 c.In other words, the number increase that is scanned the district may increase the chance of makeing mistakes.
To solve this problem, the seventh and eighth embodiments of the present invention introduce a method of aligning the 3D scan data with an absolute coordinate system instead of a reference coordinate system.
The absolute coordinate system in these embodiments differs from the reference coordinate system described above in that each 3D scan data of a region of the object is mapped directly into the absolute coordinate system. An error that occurs while obtaining one 3D scan data is therefore not propagated to the neighbouring 3D scan data obtained next.
For example, Figures 19a and 19b show two scan data obtained by scanning two adjacent regions whose parts overlap. If the two scan data of Figures 19a and 19b are each transformed into the absolute coordinate system and merged as shown in Figure 19d, the errors occurring in the two scan data are not added together as in Figure 19c, so that the error-amplification problem caused by inaccuracies of the image acquisition section is prevented.
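The contrast between the two alignment strategies can be sketched numerically. The following Python fragment is an illustration only; the function names and the constant per-step error are hypothetical and not part of the disclosed apparatus. It compares the worst-case error of chained relative alignment with that of direct mapping into an absolute frame:

```python
def chained_max_error(step_errors):
    """Relative alignment: each new scan is attached to the previous one,
    so every scan inherits the accumulated error of the whole chain."""
    total, worst = 0.0, 0.0
    for e in step_errors:
        total += e
        worst = max(worst, abs(total))
    return worst

def absolute_max_error(step_errors):
    """Absolute alignment: each scan is mapped straight into the
    marker-defined absolute frame, so it carries only its own error."""
    return max(abs(e) for e in step_errors)

errors = [0.01] * 20  # a small consistent bias in every single registration
print(round(chained_max_error(errors), 6))   # 0.2  -> grows with the region count
print(round(absolute_max_error(errors), 6))  # 0.01 -> stays at the one-step level
```

With 20 regions and a consistent 0.01 bias per registration, the chained method ends up twenty times worse, while the absolute method stays at the single-step error level.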
First, the seventh embodiment of the present invention will be described in detail with reference to the accompanying drawings.
The apparatus for automatically aligning 3D scan data according to the seventh embodiment, shown in Figure 20, comprises a marker generator 12, a projector 16, an image acquisition section 18, a first movement driving section 20, a first moving mechanism 22, a marker flicker controller 26, a projection controller 28, a buffer 32, a wide-area image acquisition section 110, an image input section 112, a second movement driving section 114, a second moving mechanism 116, a microprocessor 118 and a reference object 120. Throughout the figures, the same reference numbers and symbols as in the first embodiment denote parts equivalent in function and operation; for simplicity, their description is omitted.
The wide-area image acquisition section 110 comprises an image sensor for receiving images, such as a CCD camera or a complementary metal-oxide-semiconductor (CMOS) camera. While markers are projected from the marker generator 12 onto the surface of the object 10, it photographs and obtains their image. The wide-area image acquisition section 110 is located apart from the image acquisition section 18 and photographs and obtains the image of a wide region of the object 10.
Since it covers the entire scanning region rather than only a part of it, the wide-area image acquisition section 110 preferably employs an image sensor of relatively higher resolution than that of the image acquisition section 18.
The image input section 112 receives image data from the image acquisition section 18 and the wide-area image acquisition section 110.
Under the drive control of the microprocessor 118, the second movement driving section 114 drives the second moving mechanism 116 to move the wide-area image acquisition section 110 to a position best suited for obtaining the image of most of the object 10.
It should be noted that although in the seventh embodiment of the present invention the wide-area image acquisition section 110 is moved electrically by the second movement driving section 114, it may also be moved manually by operating the second moving mechanism.
While the marker generator 12 projects a plurality of optical markers onto the surface of the object 10, the wide-area image acquisition section 110 photographs the object 10 and the reference object 120 from two or more directions; by analyzing this image data, the microprocessor 118 calculates the 3D position of each marker in the projected region. The 3D marker positions thus obtained serve as the absolute coordinate system.
In addition, the microprocessor 118 receives, via the image input section 112, a plurality of 2D image data and 3D scan data photographed at various angles by the image acquisition section 18, analyzes them, and transforms each 3D scan data into the absolute coordinate system, which results in the alignment of all the 3D scan data of the object 10.
The reference object 120 is an object of specified shape whose size (dimension) information is input to the microprocessor 118 in advance; it is placed near the object 10. The image of the reference object 120 is obtained together with the image of the object 10 via the wide-area image acquisition section 110.
The operation of the apparatus for automatically aligning 3D scan data according to the seventh embodiment will now be described in detail with reference to the flowcharts shown in Figures 21a and 21b.
First, the object 10 is placed beside the marker generator 12, and the reference object 120 is set at a designated position near the object 10. The microprocessor 118 then controls the second movement driving section 114 to drive the second moving mechanism 116, so that the wide-area image acquisition section 110 moves to a position suitable for scanning the object 10.
Next, the microprocessor 118 controls the marker flicker controller 26 to turn on the plurality of marker output units 14 fitted to the marker generator 12, so that a plurality of markers are projected arbitrarily onto the surface of the object 10 (S70).
The wide region of the object 10 and the reference object 120 are photographed by the wide-area image acquisition section 110 to obtain 2D image data containing the optical markers. The microprocessor 118 then receives, via the image input section 112, the 2D image data obtained by the wide-area image acquisition section 110 (S71).
Figure 23 shows an example of an image of the entire region of the object 10 and the reference object 120 obtained via the wide-area image acquisition section 110. The symbol "RM" denotes an optical marker projected onto the surface of the object 10, and the reference symbol "BI" denotes the image obtained by the wide-area image acquisition section 110.
Next, the microprocessor 118 controls the second movement driving section 114 to drive the second moving mechanism, so that the wide-area image acquisition section 110 moves to a position suitable for scanning another part of the object 10 (S72).
Next, the microprocessor 118 controls the wide-area image acquisition section 110 to photograph the wide region of the object 10 including the reference object 120, thereby obtaining a 2D image containing the optical markers from a direction different from that of S71. This 2D image is received by the microprocessor 118 via the image input section 112 (S73).
The microprocessor 118 then controls the marker flicker controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected onto the surface of the object 10 (S74).
The microprocessor 118 combines the 2D images of the wide region of the object 10 obtained from the different directions by the wide-area image acquisition section 110 and, using the known dimensions of the reference object 120, calculates the 3D position of each marker contained in the combined 2D images (S75). Next, the microprocessor 118 stores the 3D position of each marker thus calculated in the buffer 32 (S76).
Then, the microprocessor 118 controls the first movement driving section 20 to drive the first moving mechanism 22, so that the image acquisition section 18, integrated with the pattern projector 16, is moved to a position suitable for scanning the object 10 (S77).
Under the above conditions, the microprocessor 118 controls the marker flicker controller 26 to turn on the plurality of marker output units 14 fitted to the marker generator 12, allowing a plurality of markers to be projected arbitrarily onto the surface of the object 10 (S78).
A part of the wide region of the object 10 (see "NI" in Figure 23) is photographed by the image acquisition section 18 while the optical markers are projected onto the surface of the object 10, and the microprocessor 118 receives, via the image input section 112, the 2D image data obtained by the image acquisition section 18 (S79).
Next, the microprocessor 118 controls the marker flicker controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected onto the surface of the object 10 (S80). Under this condition, the same part of the wide region is photographed by the image acquisition section 18 to obtain a 2D image not containing the optical markers. The 2D image data thus obtained is input to the microprocessor via the image input section 112 (S81).
Further, the microprocessor 118 controls the projection controller 28 to activate the pattern projector 16 while the marker generator 12 is kept off so that no optical markers are projected; a predetermined pattern (for example, a pattern of stripes with different gaps between them, or multiple stripe patterns) is thereby projected onto the surface of the object 10.
When the object 10 with the pattern projected on it is photographed by the image acquisition section 18 to obtain 3D scan data, the microprocessor 118 receives this 3D scan data via the image input section 112 (S82). The microprocessor 118 then calculates the 2D positions of the markers by performing image processing on the 2D image data containing the markers and the 2D image data not containing them (S83).
The microprocessor 118 calculates the 3D position of each marker from its 2D position and the 3D scan data. That is, the 3D position of a marker can be obtained by finding the intersection between the 3D scan data and the straight line connecting the lens center of the image acquisition section 18 with the marker's position in the 2D image data (S84).
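The ray-intersection step described above can be sketched as follows. In this Python/NumPy fragment (hypothetical names, a sketch only) the 3D scan data is approximated by a point cloud, and the scan point closest to the viewing ray through the marker is taken as the intersection:

```python
import numpy as np

def marker_3d_position(lens_center, marker_dir, scan_points, tol=1e-3):
    """Estimate a marker's 3D position as the scan point closest to the ray
    from the camera lens center through the marker's 2D image position.
    marker_dir is the unit direction of that ray in scanner coordinates."""
    v = scan_points - lens_center                 # vectors to every scan point
    t = v @ marker_dir                            # projection of each onto the ray
    foot = lens_center + np.outer(t, marker_dir)  # closest ray point per scan point
    dist = np.linalg.norm(scan_points - foot, axis=1)
    i = np.argmin(dist)
    if dist[i] > tol:
        return None  # the ray misses the scanned surface
    return scan_points[i]

# toy scan data: a grid of points on the plane z = 2
grid = np.array([[x, y, 2.0] for x in np.linspace(-1, 1, 21)
                             for y in np.linspace(-1, 1, 21)])
cam = np.array([0.0, 0.0, 0.0])
ray = np.array([0.0, 0.0, 1.0])  # looking straight at the plane
print(marker_3d_position(cam, ray, grid))  # [0. 0. 2.]
```

A production implementation would of course intersect the ray with the reconstructed surface (e.g. a triangle mesh) rather than picking the nearest raw point, but the geometric idea is the same.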
Next, the microprocessor 118 compares the 3D positions of the markers found at S84 with the markers stored at S76 in the register of the buffer 32, in order to search for corresponding markers, in other words, markers whose 3D positions coincide (S85).
When corresponding markers are found by comparing the optical markers contained in the current 3D scan data with the markers stored in the register of the buffer 32, the microprocessor 118 calculates a transformation matrix for converting the markers in the current 3D scan data into the absolute coordinate system (S86). The current scan data is then moved by the transformation matrix so as to be aligned in the absolute coordinate system, whereby the 3D positions of the markers stored in the register of the buffer 32 are confirmed (S87).
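One standard way to obtain such a transformation from corresponding marker positions is the Kabsch (orthogonal Procrustes) method; the sketch below is hypothetical Python/NumPy, not the algorithm specified by the patent, and recovers the rotation and translation that map the markers of the current scan onto their absolute-frame counterparts:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t with R @ src_i + t ~ dst_i
    (Kabsch / Procrustes). src, dst: (N, 3) corresponding marker positions."""
    sc, dc = src.mean(0), dst.mean(0)
    H = (src - sc).T @ (dst - dc)                 # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflection
    R = Vt.T @ D @ U.T
    t = dc - R @ sc
    return R, t

# markers measured in the current scan's local frame ...
local = np.array([[0., 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
# ... and the same markers in the absolute frame (rotated 90 deg about z, shifted)
Rz = np.array([[0., -1, 0], [1, 0, 0], [0, 0, 1]])
absolute = local @ Rz.T + np.array([5., 2, 0])

R, t = rigid_transform(local, absolute)
print(np.allclose(local @ R.T + t, absolute))  # True
```

Applying the recovered (R, t) to every point of the current scan data moves that scan into the absolute coordinate system, which is exactly the role of the transformation matrix at S86 and S87.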
Next, the microprocessor 118 determines whether the automatic alignment of the 3D data obtained from the object 10 is complete, in other words, whether the 3D data obtained from all parts of the object 10 have been aligned (S88).
If not complete, the flow returns to S77. The microprocessor controls the first movement driving section 20 to drive the first moving mechanism 22, whereby the projector 16 and the image acquisition section 18 are moved to a position suitable for scanning a part of the object not yet scanned. Steps S77 through S88 are then repeated.
Although in the seventh embodiment the wide-area image acquisition section and the image acquisition section are introduced as two different elements, the image acquisition section may preferably be used both for obtaining the image of the wide region of the object and for obtaining partial images of that wide region.
Next, the eighth embodiment of the present invention will be described in detail with reference to Figure 24.
The apparatus for automatically aligning 3D scan data using optical markers according to the eighth embodiment of the present invention, shown in Figure 24, comprises a marker generator 12, a pattern projector 16, an image acquisition section 18, a movement driving section 20, a moving mechanism 22, a marker flicker controller 26, a projection controller 28, a buffer 32, a pair (or more) of wide-area image acquisition sections 130 and 132, an image input section 134 and a microprocessor 136. In these figures, the same reference numbers and symbols denote parts equivalent in function and operation to those of the first embodiment; for the sake of simplicity, their description is omitted.
The pair of wide-area image acquisition sections 130 and 132 comprise image sensors for receiving images, such as CCD cameras or CMOS cameras. The cameras are fixed relative to each other and capture images of the same object from different angles, a method known as stereo vision.
The wide-area image acquisition sections 130 and 132 preferably employ image sensors of relatively higher resolution than that of the image acquisition section 18, which obtains only images of parts of the region. The image input section 134 receives the image data obtained by the image acquisition section 18 and the wide-area image acquisition sections 130 and 132.
While the marker generator 12 projects a plurality of optical markers onto the surface of the object 10, the microprocessor 136 calculates the 3D position of each marker in the projected region by analyzing the image data of the object 10 photographed from two different directions by the wide-area image acquisition sections 130 and 132. The 3D marker positions thus obtained serve as the absolute coordinate system.
Further, the microprocessor 136 receives, via the image input section 134, a plurality of 2D image data and 3D scan data photographed at various angles by the image acquisition section 18, analyzes them, and converts each 3D scan data into the absolute coordinate system, which results in the alignment of all the 3D scan data of the object 10.
Next, the operation of the apparatus for automatically aligning 3D scan data according to the eighth embodiment will be described in detail with reference to the flowcharts shown in Figures 25a and 25b.
First, a given object 10 is placed beside the marker generator 12, and the microprocessor 136 controls the marker flicker controller 26 to turn on the plurality of marker output units 14 fitted to the marker generator 12, allowing a plurality of markers to be projected arbitrarily onto the surface of the object 10 (S90).
When the wide region of the object 10 is photographed in an overlapping manner from different directions by the wide-area image acquisition sections 130 and 132 while the optical markers from the marker generator 12 are projected onto the object 10, the microprocessor 136 receives the two 2D image data from the wide-area image acquisition sections 130 and 132 via the image input section 134 (S91).
Figure 26 shows an example of images of the projected region of the object 10 obtained by the wide-area image acquisition sections 130 and 132. The reference symbol "RM" denotes an optical marker projected onto the surface of the object 10, "BI-1" denotes the image obtained by the wide-area image acquisition section 132, and "BI-2" denotes the image obtained by the wide-area image acquisition section 130.
Next, the microprocessor 136 controls the marker flicker controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected onto the surface of the object 10 (S92).
From the two 2D image data photographed from two different directions by the wide-area image acquisition sections 130 and 132 (S93), the microprocessor 136 calculates the 3D positions of the markers contained in the projected region of the object. In other words, from the positions of the wide-area image acquisition sections 130 and 132 and the 2D positions of each marker projected onto the object 10, the 3D position of each marker is computed by triangulation, the details of which are explained later. Next, the microprocessor 136 stores the 3D position of each marker thus calculated in the register of the buffer 32 (S94).
The microprocessor 136 then controls the movement driving section 20 to drive the moving mechanism 22, whereby the image acquisition section 18 combined with the pattern projector 16 moves to a position suitable for scanning the object 10 (S95).
Under this condition, the microprocessor 136 controls the marker flicker controller 26 to turn on the plurality of marker output units 14 fitted to the marker generator 12, so that a plurality of markers are projected arbitrarily onto the surface of the object 10 (S96).
When a part of the projected region of the object 10 ("NI" in Figure 26b) is photographed by the image acquisition section 18 to obtain 2D image data containing the optical markers, the microprocessor 136 receives this 2D image data via the image input section 134 (S97).
Next, the microprocessor 136 controls the marker flicker controller 26 to turn off the marker generator 12, preventing the optical markers from being projected onto the object 10 (S98). Under this condition, when the same part as above is photographed by the image acquisition section 18 to obtain 2D image data not containing the optical markers, the microprocessor 136 receives this 2D image data via the image input section 134 (S99).
Further, the microprocessor 136 controls the projection controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off so that no optical markers are projected. A predetermined pattern (for example, a pattern of stripes with different gaps between them, or multiple stripe patterns) is thereby projected onto the surface of the object 10.
When the object 10 with the pattern projected on it is photographed by the image acquisition section 18 to obtain 3D scan data, the microprocessor 136 receives this 3D scan data via the image input section 134 (S100).
Under these conditions, the microprocessor 136 analyzes the 2D image data containing the markers and the 2D image data not containing them to calculate the 2D positions of the markers (S101).
Further, the microprocessor 136 calculates the 3D position of each marker from its 2D position and the 3D scan data (S102). That is, the 3D position of a marker can be obtained by finding the intersection between the 3D scan data and the straight line connecting the lens center of the image acquisition section 18 with the marker's position in the 2D image data.
Next, the microprocessor 136 compares the 3D positions of the markers found at S102 with the 3D positions of the markers stored at S94 in the register of the buffer 32, in order to search for corresponding markers (S103).
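Because a rigid motion preserves the distances between markers, corresponding markers can be identified by comparing inter-marker distance patterns even though the two coordinate systems differ. A brute-force sketch follows (hypothetical Python/NumPy; a real implementation would prune the search rather than try every permutation):

```python
import numpy as np
from itertools import permutations

def match_markers(scan_markers, stored_markers, tol=1e-6):
    """Find which stored marker corresponds to each scan marker.
    A rigid transform preserves inter-marker distances, so the pairwise
    distance pattern identifies corresponding markers regardless of the
    unknown rotation/translation between the two coordinate systems."""
    def dists(pts):
        return np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    d_scan = dists(scan_markers)
    for perm in permutations(range(len(stored_markers)), len(scan_markers)):
        if np.allclose(d_scan, dists(stored_markers[list(perm)]), atol=tol):
            return list(perm)  # scan marker i corresponds to stored marker perm[i]
    return None  # no consistent correspondence found

stored = np.array([[0., 0, 0], [3, 0, 0], [0, 4, 0]])
# the same three markers seen in another frame, reordered and translated
scan = np.array([[10., 4, 0], [10, 0, 0], [13, 0, 0]])
print(match_markers(scan, stored))  # [2, 0, 1]
```

Once such a correspondence list is available, the transformation matrix of the following step can be computed from the matched pairs.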
When corresponding markers are found by the above marker search step, the microprocessor 136 calculates a transformation matrix for converting the markers in the current 3D scan data (S104). The current scan data can be moved by the transformation matrix so as to be aligned in the absolute coordinate system, whereby the 3D positions of the markers stored in the register of the buffer 32 are confirmed (S105).
Next, the microprocessor 136 determines whether the automatic alignment of the 3D data obtained from the object 10 is complete, in other words, whether the 3D scan data obtained from all parts of the entire scanning region of the object 10 have been aligned (S106).
If not complete, the flow returns to S95. The microprocessor controls the movement driving section 20 to drive the moving mechanism 22, whereby the pattern projector 16 and the image acquisition section 18 are moved to a position suitable for scanning a part of the object not yet scanned. Steps S95 through S106 are then repeated.
Although in the eighth embodiment the pair of wide-area image acquisition sections, the image acquisition section and the marker generator are arranged separately, as a modification the pair of wide-area image acquisition sections and the marker generator may be combined. In that case, positioning becomes easier because the wide-area image acquisition sections need not be placed according to the region onto which the optical markers are projected.
As another modification of the eighth embodiment, the pair of wide-area image acquisition sections and the image acquisition section may be combined. In that case, the scanning field becomes slightly smaller and the precision may decrease in the process of obtaining the absolute coordinates; however, in the process of obtaining partial images of the scanning region there is no need to photograph in an overlapping manner, so the number of scanning operations can be reduced.
The principle of the eighth embodiment of the invention described above will now be explained in detail.
The wide-area image acquisition sections 130 and 132 disclosed in the eighth embodiment of the invention can be modeled as two cameras facing one object, and this model may be modified according to the application. In the eighth embodiment the two cameras are arranged parallel to each other, as shown in Figure 27. The variables in Figure 27 are defined as follows.
x: the position of the point to be measured
b: the baseline distance between the camera lens centers
f: the focal length of the cameras
x'l, x'r: the positions of the point's image on the image plane obtained by each camera
P, Q: the lens centers of the two cameras
A method of obtaining the position of a point from the stereo images is given by Equations 15 and 16.
Equation 15:

$$\frac{x'_l}{f} = \frac{x + b/2}{z}, \qquad \frac{x'_r}{f} = \frac{x - b/2}{z}$$

$$\frac{y'_l}{f} = \frac{y'_r}{f} = \frac{y}{z}$$

$$\frac{x'_l - x'_r}{f} = \frac{b}{z}$$

Equation 16:

$$x = \frac{b\,(x'_l + x'_r)/2}{x'_l - x'_r}, \qquad y = \frac{b\,(y'_l + y'_r)/2}{x'_l - x'_r}, \qquad z = \frac{b\,f}{x'_l - x'_r}$$
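Equations 15 and 16 can be checked directly in code. The sketch below (hypothetical Python; a parallel stereo rig with baseline b and focal length f is assumed, as in Figure 27) first projects a known point with Equation 15 and then recovers it with Equation 16:

```python
def triangulate(xl, yl, xr, yr, b, f):
    """Recover (x, y, z) from matching image coordinates in the left and
    right cameras of a parallel stereo rig (baseline b, focal length f),
    following Equation 16: disparity d = x'_l - x'_r gives z = b*f/d."""
    d = xl - xr                 # disparity
    x = b * (xl + xr) / 2 / d
    y = b * (yl + yr) / 2 / d
    z = b * f / d
    return x, y, z

# a point at (0.5, 0.2, 2.0), projected by the Equation 15 forward model
b, f, X, Y, Z = 0.1, 1.0, 0.5, 0.2, 2.0
xl, xr = f * (X + b / 2) / Z, f * (X - b / 2) / Z
yl = yr = f * Y / Z
print([round(v, 9) for v in triangulate(xl, yl, xr, yr, b, f)])  # [0.5, 0.2, 2.0]
```

Note that the depth z depends only on the disparity, the baseline and the focal length, which is why a fixed, calibrated camera pair suffices to recover the 3D positions of the markers.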
Next, the ninth embodiment of the present invention will be described with reference to the accompanying drawings.
In the ninth embodiment of the present invention, a plurality of projectors, image acquisition sections and marker generators are arranged around the object, so that the projectors and image acquisition sections need not be moved in order to obtain the 2D images and 3D scan data covering the entire scanning region of the object. A single scanning operation suffices to obtain the 2D images and 3D scan data, which simplifies the work and shortens the time consumed.
Figure 28 is a schematic diagram of the apparatus for automatically aligning 3D scan data using optical markers according to the ninth embodiment of the invention, which comprises N marker generators 142, M pattern projectors 146, L image acquisition sections 148, an image input section 150, a projection controller 152, a marker flicker controller 154, a microprocessor 156 and a buffer 158.
The N marker generators 142, intended to project markers recognizable by the image acquisition sections 148 onto the surface of the object, are each equipped with a plurality of marker output units 144 for projecting a plurality of optical markers onto the entire surface of the object 10 in arbitrary scanning directions.
The N marker generators 142 face the object 10 at predetermined intervals from one another, and the markers are arranged so as to cover the whole object.
The M pattern projectors 146 project a predetermined pattern or a laser stripe pattern onto the surface of the object 10 in order to obtain the 3D scan data. An LCD projector can be used to project a space-encoded beam or a laser beam onto the surface of the object 10, whereby the 3D scan data is obtained via the image acquisition sections 148.
The M pattern projectors 146 face the object 10 at predetermined intervals from one another, so that the space-encoded beams projected from the pattern projectors 146 cover the entire region of the object 10.
The L image acquisition sections 148, each comprising an image sensor capable of receiving images, such as a CCD camera or a CMOS camera, photograph and obtain images of the object 10. Preferably, each of the L image acquisition sections 148 is combined with a separate pattern projector 146 rather than being placed separately.
Further, as shown in Figure 28, the L image acquisition sections 148 face the object 10 at predetermined intervals from one another, and their scanning fields cover the entire region of the object 10.
The image input section 150 receives the image data obtained from each of the L image acquisition sections 148, and the projection controller 152 controls the transfer rate and transfer direction of the pattern film and the flicker period of the light source in order to project the pattern film.
The marker flicker controller 154 periodically flickers the optical markers from the N marker generators 142 under the control of the microprocessor 156.
The microprocessor 156 calculates the 3D position of each marker in each region from the 2D image data and 3D scan data obtained from the L image acquisition sections 148, searches for corresponding markers in each overlapping region, and calculates transformation matrices from the 3D positions of the corresponding markers. As a result, the microprocessor 156 aligns each 3D scan data by means of the transformation matrices. The buffer 158 stores the data necessary for the calculations and the resulting data.
Next, the operation of the apparatus for automatically aligning 3D scan data using optical markers according to the ninth embodiment of the invention will be described in detail with reference to the flowchart shown in Figure 29.
First, the object 10 is placed at a position suitable for scanning, and the N marker generators 142, the M pattern projectors 146 and the L image acquisition sections 148 are set around the object 10. Then, the microprocessor 156 controls the marker flicker controller 154 to activate the plurality of marker output units 144 fitted to each of the N marker generators 142, thereby allowing a plurality of markers to be projected arbitrarily onto the surface of the object 10 (S110).
When the scanning region of the object 10 is photographed by each of the L image acquisition sections 148 while the optical markers are projected onto the surface of the object 10, thereby obtaining 2D images, the microprocessor 156 receives the L 2D image data from the L image acquisition sections 148 via the image input section 150 (S111).
Next, the microprocessor 156 controls the marker flicker controller 154 to turn off the N marker generators 142, thereby preventing the optical markers from being projected onto the surface of the object (S112). Under this condition, when the same region of the object 10 is photographed by each of the L image acquisition sections 148 to obtain L 2D images not containing the optical markers, the microprocessor 156 receives the L 2D image data via the image input section 150 (S113).
Further, the microprocessor 156 controls the projection controller 152 to operate the M pattern projectors 146 while the N marker generators 142 are turned off so that no optical markers are projected. A predetermined pattern (for example, a pattern of stripes with different gaps between them, or multiple stripe patterns) is thereby projected from the M pattern projectors 146 onto the surface of the object 10.
When the object 10 with the pattern projected on it is photographed by the L image acquisition sections 148 to obtain L 3D scan data, the microprocessor 156 receives the L 3D scan data via the image input section 150 (S114). Under these conditions, the microprocessor 156 calculates the 2D positions of the markers by performing image processing on the 2D image data containing the markers and the 2D image data not containing them (S115).
Further, the microprocessor 156 calculates the 3D position of each marker from its 2D position and the 3D scan data (S116). That is, the 3D position of a marker can be obtained by finding the intersection between the 3D scan data and the straight line connecting the lens center of each of the L image acquisition sections 148 with the marker's position in the 2D image data.
Next, the microprocessor 156 compares the 3D positions of the markers across the L 3D scan data to search for corresponding markers (S117).
When corresponding markers are found by the search process, the microprocessor 156 calculates transformation matrices for converting the markers in the current 3D scan data (S118). One of the L 3D scan data is established as the reference coordinate system, and the current 3D scan data is moved according to the obtained transformation matrix for alignment (S119).
Next, the tenth embodiment of the present invention will be described in detail with reference to the accompanying drawings. The elements of the tenth embodiment are almost identical to those of the ninth embodiment, but the operating processes differ. The tenth embodiment will therefore be described on the basis of the configuration of the ninth embodiment shown in Figure 28 and the flowchart shown in Figure 30.
First, a reference object whose dimensions are known is placed at a position suitable for scanning, and the N marker generators 142, the M pattern projectors 146 and the L image acquisition sections 148 are set around the reference object. For calibration, the reference object may be an actual object, or one specifically manufactured so that its dimensions are known.
Under this condition, the microprocessor 156 controls the marker flicker controller 154 to turn on the corresponding marker output units 144 fitted to the N marker generators 142, allowing a plurality of markers to be projected arbitrarily onto the surface of the reference object (S120).
The microprocessor 156 performs calibration to find the relations between the reference object and the L image acquisition sections 148 (S121). The detailed procedure is described below.
At step S121, the optical markers from the N marker generators 142 are projected onto the surface of the reference object, and when the scanning region is photographed by the L image acquisition sections 148 to obtain 2D images containing the optical markers, the microprocessor 156 receives the L 2D image data via the image input section 150.
Next, the microprocessor 156 controls the projection controller 152 to operate the M pattern projectors 146. A predetermined pattern (for example, a pattern of stripes with different gaps between them, or multiple stripe patterns) is thereby projected from the M pattern projectors 146 onto the surface of the reference object. When the reference object with the pattern projected on it is photographed by the L image acquisition sections 148 to obtain L 3D scan data, the microprocessor 156 receives the L 3D scan data via the image input section 150.
Under this condition, the microprocessor 156 estimates the 3D positions of the markers in the L 3D scan data by calculating the intersections between each 3D scan data and the straight lines connecting the center of the camera fitted to each of the L image acquisition sections 148 with the markers contained in each 2D image data.
Next, the microprocessor 156 compares the 3D positions of the markers across the L 3D scan data to search for corresponding markers, and calculates transformation matrices from the relations of the corresponding markers. The microprocessor 156 then stores the obtained transformation matrices in the register of the buffer 158, thereby completing the calibration of S121.
After the calibration of S121 is completed, the reference object is removed and the object 10 is placed where the reference object was. The microprocessor 156 controls the marker flicker controller 154 to turn off the N marker generators 142, thereby preventing the optical markers from being projected onto the surface of the object 10 (S122).
Under these conditions, when the scanning region of the object 10 is photographed by the L image acquisition sections 148 to obtain L 2D images not containing the optical markers, the microprocessor 156 receives the L 2D image data via the image input section 150.
Further, the microprocessor 156 controls the projection controller 152 to activate the M pattern projectors 146 while the N marker generators 142 are turned off so that no optical markers are projected. A predetermined pattern (for example, a pattern of stripes with different gaps between them, or multiple stripe patterns) is thereby projected from the M pattern projectors 146 onto the surface of the object 10.
When the object 10 with the pattern projected on it is photographed by the L image acquisition sections 148 to obtain L 3D scan data, the microprocessor 156 receives the L 3D scan data via the image input section 150 (S124).
The microprocessor 156 reads the transformation matrices stored in the register of the buffer 158, sets one of the L 3D scan data as the reference, and moves the other 3D scan data by the transformation matrices (S125).
When another object, or the same object, needs to be scanned again, the calibration of S121 to S123 can be omitted: since the 3D scan data can be aligned by the transformation matrices stored in the register of the buffer 158, the scanning time can be reduced. If desired, however, the calibration of S121 to S123 can be performed for each scan, and it can easily be changed, improved or modified according to the operator's wishes or the system configuration.
Next, the eleventh embodiment of the present invention is described. The eleventh embodiment provides a marker generator and peripheral devices different from those used in the first to tenth embodiments of the present invention.
The marker generator of the eleventh embodiment, as shown in FIG. 31, comprises a plurality of X-axis light sources 160, a flicker controller 162, a polygon mirror 164 rotating about the X-axis, a rotation driving part 166, a rotating mechanism 168, a plurality of Y-axis light sources 170, a flicker controller 172, a polygon mirror 174 rotating about the Y-axis, a rotation driving part 176, and a rotating mechanism 178.
The plurality of X-axis light sources 160 produce beams with excellent rectilinear propagation characteristics, for example laser beams, which are emitted toward the reflecting surfaces of the polygon mirror 164. The X-axis light sources can be, for example, laser pointer devices. The flicker controller 162 flashes each light source 160 under the control of a microprocessor (not shown).
The polygon mirror 164, which is assembled with a plurality of reflecting surfaces, is rotated by the rotating mechanism 168 to reflect the beams, thereby projecting them onto the surface of an object (OB). The rotation driving part 166 drives the rotating mechanism 168 to rotate the polygon mirror in one direction in response to the control of the microprocessor.
The plurality of Y-axis light sources 170 likewise produce beams with excellent rectilinear propagation characteristics, for example laser beams, which are emitted toward the reflecting surfaces of the polygon mirror 174. The light sources can be, for example, laser pointer devices. The flicker controller 172 flashes each light source under the control of a microprocessor (not shown).
The polygon mirror 174, assembled with a plurality of reflecting surfaces, is rotated by the rotating mechanism 178 to reflect the beams, thereby projecting them onto the surface of the object (OB). The rotation driving part 176 drives the rotating mechanism 178 to rotate the polygon mirror in one direction in response to the control of the microprocessor.
Next, the operation of the marker generator of the eleventh embodiment of the present invention is explained in detail.
First, driving power generated by the rotation driving parts 166 and 176 is applied to the rotating mechanisms 168 and 178 according to the control signal of the microprocessor, and the rotating mechanisms 168 and 178, driven by this power, rotate the polygon mirrors 164 and 174, respectively.
When the light sources 160 and 170 are lit by the flicker controllers 162 and 172 in response to the control signal of the microprocessor, the beams produced by the plurality of light sources 160 and 170 travel to the reflecting surfaces of the polygon mirrors 164 and 174. The beams are then projected onto the surface of the object (OB).
The polygon mirrors 164 and 174 are rotated so that the angles of their reflecting surfaces differ. Therefore, lines formed by the beams are drawn in the X-axis and Y-axis directions on the surface of the object (OB), and the points where the X-axis lines and Y-axis lines intersect become the optical markers (RM).
For example, if the number of X-axis light sources is m and the number of Y-axis light sources is n, then m*n intersection points can be formed on the surface of the object (OB), and these m*n intersection points become the corresponding optical markers (RM). It is therefore possible to produce a relatively large number of optical markers with a small number of light sources.
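The m*n marker count can be illustrated with a minimal sketch. On a flat target, the beam lines reduce to m lines parallel to one axis and n parallel to the other, so every pairing of one line from each family yields a marker; the coordinates and names below are illustrative only:

```python
def marker_grid(xs, ys):
    """Crossings of len(xs) lines (each at a fixed x, parallel to the
    Y axis) with len(ys) lines (each at a fixed y, parallel to the X
    axis) on a flat target: m*n candidate optical markers (RM)."""
    return [(x, y) for y in ys for x in xs]

# m = 3 light sources for one family, n = 2 for the other -> 6 markers.
grid = marker_grid([10, 20, 30], [5, 15])
```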
Although the present invention has been fully described with reference to the accompanying drawings in conjunction with its preferred embodiments, it should be noted that various modifications and improvements will be apparent to those skilled in the art. Such modifications and improvements should be understood to fall within the scope of the present invention as defined by the accompanying claims, unless they depart therefrom.
As is apparent from the foregoing, the present disclosure provides an apparatus and method for using optical markers to automatically arrange 3D scan data obtained from positions at different angles. An advantage of the present invention is that optical markers, which are not physical objects, are used to find the relative positions of different sets of scan data, so that the scan data is neither lost nor damaged, even at the positions of the markers. Another advantage is that scanning an object requires neither placing markers on the object nor removing them from it, which provides convenience and safety during scanning and prevents the damage to the object that results from attaching and removing markers. A further advantage is that the markers of the present invention can be reused without limit.

Claims (36)

1. An apparatus for automatically arranging 3D scan data using optical markers, the 3D scan data being obtained by photographing an object at different angles, the apparatus comprising:
a marker generating device for projecting a plurality of optical markers onto the surface of the object;
a pattern projecting device for projecting a pattern onto the surface of the object in order to obtain 3D scan data of the object;
an image acquisition device for obtaining 2D image data of the object, including the markers projected onto the surface of the object, and for obtaining 3D scan data of the object by means of the pattern projected onto the surface of the object; and
a control device for calculating the 3D positions of the markers from the relation between the 2D image data and the 3D scan data, and for calculating the relative positions of the 3D scan data based on the 3D positions of the markers.
2. The apparatus of claim 1, wherein said pattern projecting device and said image acquisition device are fixedly integrated.
3. The apparatus of claim 2, further comprising:
a movement driving part controlled by said control device; and
a moving mechanism driven by said movement driving part so as to move said pattern projecting device and said image acquisition device relative to said object.
4. The apparatus of claim 1, further comprising:
a marker flicker control device for periodically flashing the markers produced by said marker generating device under the control of said control device, wherein said control device turns the markers on, by means of said marker flicker control device, to obtain through said image acquisition device 2D image data of a part of the object including the markers, and turns the markers off to obtain a 2D image of the same part of the object without the markers.
5. The apparatus of claim 4, wherein the 2D positions of said markers are obtained by said control device by comparing the 2D image including the markers with the 2D image of the same part of the object obtained without the markers.
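The comparison in claim 5 is, in essence, image differencing: pixels that brighten only when the marker is lit locate the marker in 2D. The patent does not specify the image processing; the following is a minimal sketch under the assumption of a single marker per compared region, with all names our own:

```python
import numpy as np

def marker_centroid(img_on, img_off, thresh=30):
    """Centroid (x, y) of the pixels that brighten between the marker-off
    and marker-on images of the same part of the object; None if no pixel
    clears the difference threshold."""
    diff = np.abs(img_on.astype(int) - img_off.astype(int))
    ys, xs = np.nonzero(diff > thresh)
    if xs.size == 0:
        return None
    return xs.mean(), ys.mean()
```

With several markers in view, a connected-component labeling step would replace the single global centroid, but the differencing idea is the same.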
6. The apparatus of claim 1, further comprising:
an individual marker flicker control device for flashing said markers individually and successively in a predetermined order under the control of said control device, wherein said control device sets the image data without markers as basic image data, and compares the basic image data with the plurality of image data each containing a different marker, thereby obtaining the 2D positions of the markers.
7. The apparatus of claim 1, further comprising:
an individual marker flicker control device for flashing said markers individually under the control of said control device, wherein said control device controls said individual marker flicker control device so that the markers projected onto the field being photographed are distinguished from the markers projected onto the next field to be photographed.
8. The apparatus of claim 1, further comprising an individual marker flicker/color control device, wherein said marker generating device comprises a plurality of multicolor light-emitting elements whose colors can be selectively changed, wherein said individual marker flicker/color control device controls the flicker and color of the plurality of multicolor light-emitting elements, and wherein said control device controls said individual marker flicker/color control device so that the markers projected onto a field that has been photographed are distinguished from the markers projected onto a field that has not yet been photographed.
9. The apparatus of claim 1, further comprising an individual marker flicker control device, wherein said markers are divided into a plurality of groups in a predetermined overlapping pattern for binarization of said markers, wherein said individual marker flicker control device turns said markers on and off successively by group under the control of said control device, and wherein said control device obtains the 2D positions of the markers contained in a plurality of 2D image data by searching for the binary information of the markers, which represents the on or off state of each marker.
10. The apparatus of claim 1, further comprising a projection controller for controlling the state of the pattern projected by the pattern projecting device under the control of said control device, wherein said control device prevents the markers from being projected while the pattern is projected.
11. The apparatus of claim 1, wherein said marker generating device is fixedly disposed with respect to said object so as to be able to move together with said object.
12. The apparatus of claim 11, wherein a plurality of said marker generating devices are provided and are fixedly disposed on a turntable on which the object is placed.
13. The apparatus of claim 1, wherein said marker generating device projects markers onto the surface of the object at random.
14. The apparatus of claim 1, wherein said image acquisition device obtains 2D image data and 3D scan data from overlapping fields of the object, and said control device calculates the 3D positions of the markers using at least two or more feature points contained in the 2D image data and the 3D scan data.
15. The apparatus of claim 14, wherein said image acquisition device obtains a plurality of 2D image data and 3D scan data from a plurality of overlapping fields of the object, said control device searches for the corresponding markers in each of said overlapping fields according to the 3D positions of the markers, said control device calculates transformation matrices based on the relations between the corresponding markers, and said control device moves each set of 3D scan data to the reference coordinates by using the transformation matrices.
16. The apparatus of claim 1, wherein a plurality of said image acquisition devices, said pattern projecting devices, and said marker generating devices are respectively arranged around the object, and wherein said control device calculates the 3D positions of the markers from the 2D image and 3D scan data of each zone assigned to each image acquisition device, searches for the corresponding markers contained in adjacent zones according to the 3D positions of the markers, and calculates transformation matrices from the relations between the corresponding markers, thereby arranging the plurality of 3D scan data by using the transformation matrices.
17. The apparatus of claim 16, wherein said control device stores the transformation matrices for arranging the 3D scan data in the next scanning operation.
18. The apparatus of claim 1, wherein said image acquisition device obtains a plurality of 2D image data of the whole scanning field by photographing from a plurality of scanning positions, the distances between the scanning positions being known, and obtains a plurality of 2D image data and 3D scan data from a plurality of partially overlapping parts of the whole scanning field, and wherein said control device calculates the 3D positions of the markers by analyzing the plurality of 2D image data of the whole scanning field of the object together with the information on the distances between the scanning positions, establishes an absolute coordinate system based on the 3D positions of the markers thus obtained, then calculates the 3D positions of the markers by analyzing, part by part, the 2D image data and 3D scan data of each part of the whole scanning field of the object, searches, using the 3D position information of the markers, for the corresponding markers contained jointly in the 2D image data of the whole scanning field and in the 2D image data of the plurality of parts of the whole scanning field, calculates transformation matrices according to the corresponding markers, and moves each set of 3D scan data so that it is arranged in the absolute coordinate system.
19. The apparatus of claim 18, wherein said image acquisition device is configured with a plurality of large-field scanning devices for obtaining images of large areas of the object and a scanning device for obtaining detailed images of the various parts of the field, and wherein the plurality of large-field scanning devices are separated from each other by fixed distances.
20. The apparatus of any one of claims 1-19, wherein said marker generating device comprises:
a plurality of light sources for producing rectilinear light beams in the X-axis direction;
a plurality of light sources for producing rectilinear light beams in the Y-axis direction;
an X-axis polygon mirror for reflecting the rectilinear light beams onto the surface of the object;
a Y-axis polygon mirror for reflecting the rectilinear light beams onto the surface of the object;
rotating mechanisms for rotating the X-axis and Y-axis polygon mirrors; and
a light source flicker controller for controlling the flicker of the X-axis and Y-axis light sources.
21. A method of automatically arranging 3D scan data using optical markers, the method comprising the steps of:
moving an image acquisition device to a position suitable for obtaining an image of a part of an object;
projecting optical markers onto the surface of the object by a marker generating device, and obtaining, by said image acquisition device, 2D image data of the part of the object including the optical markers projected onto the surface of the object;
projecting a pattern onto the surface of the object by a pattern projecting device, and obtaining, by said image acquisition device, 3D scan data of the object with the pattern projected on it; and
obtaining the 3D positions of the markers from the relation between the 2D image data and the 3D scan data, and arranging a plurality of sets of 3D scan data obtained from different parts of the object according to the relative positions of the 3D scan data calculated from the 3D positions of the markers.
22. the method for claim 21, the step that wherein obtains the 2D view data further may further comprise the steps:
At first obtain to comprise the 2D view data of mark from this part of this object; With
Secondly obtain not comprise coming the 2D view data of mark from the same section of this object by closing described mark.
23. the method for claim 22, the step that wherein obtains the 2D view data further may further comprise the steps: the 2D view data to the mark of the 2D view data that comprises mark and the same section that do not comprise this object is carried out the 2D position that Flame Image Process is asked for mark.
24. the method for claim 21, the step that wherein obtains the 2D view data further may further comprise the steps:
Each part by shot object when all optical markings is closed obtains the base image data;
A plurality of optical markings are projected on each part of object successively separately according to predetermined order by described tag producing apparatus, and take the view data that obtains the every part of this object by described part under each state that throws mark separately and successively to this object; With
By comparing the 2D position of asking for mark with each view data and the base image data that the number corresponding with optical markings taken.
25. the method for claim 21, the step that wherein obtains the 2D view data further may further comprise the steps:
With overlap mode described a plurality of optical markings are divided into predetermined group, and open optical markings successively by group;
Search the binary message of packet tagging, promptly search each the opening in the packet tagging, thereby ask for the intrinsic sign of the 2D position of mark as each mark.
26. the method for claim 21, the step that wherein obtains the 3D scan-data further may further comprise the steps: prevent to produce optical markings in described tag producing apparatus when the pattern projecting device projective patterns.
27. the method for claim 21, wherein 2D view data that obtains from described image-acquisition device and the relation between the 3D scan-data step of asking for the 3D position of mark further may further comprise the steps: estimate that the connection layout picture obtains the camera lens center of part and the 2D view data the straight line of the position of mark and the crossing point of crossing of 3D scan-data arbitrarily, thereby make it possible to find the 3D position of mark.
28. the method for claim 21, the step of wherein arranging the 3D scan-data further may further comprise the steps:
The respective markers that comprises is jointly searched in 3D position according to mark in overlapped adjacent 3D scan-data;
Calculate the transition matrix that is used to arrange each 3D scan-data based on corresponding mark; With
Set one of 3D scan-data as benchmark, and utilize transition matrix to move and arrange each scan-data.
29. the method for claim 28, the step of wherein searching respective markers further may further comprise the steps: utilize about the information of the distance between per two marks with relevant per two mark or near the corresponding mark of information search of average vertical vector.
30. the method for claim 28, the step of wherein searching respective markers further may further comprise the steps: select near the additional reference point the mark in the 3D scan-data, and use these relative 3D positional informations together with mark to obtain corresponding mark.
31. the method for claim 28, the step of wherein searching respective markers further may further comprise the steps: form triangle by the 3D position that utilizes per three marks, obtain the leg-of-mutton length of side, by each limit of descending sort, and each length on each limit of comparison and order, obtaining corresponding triangle, thereby search corresponding mark.
32. the method for claim 28 is wherein set one of 3D scan-data and be may further comprise the steps as benchmark and the step that moves and arrange each 3D scan-data:
According to the information on vertex of a triangle and limit, make by relevant with 3D scan-data vertex of a triangle that forms by three marks and the vertex of a triangle in the frame of reference to be complementary;
Keep the summit of described coupling to be rotated conversion and mate a limit; With
Mate two triangles to the summit in the frame of reference by the summit that does not comprise in the limit of rotation as axle.
33. the method for claim 21, further may further comprise the steps: by taking whole scanning field from a plurality of scanning positions spaced apart from each other at predetermined intervals, obtain a plurality of 2D view data, by analyzing the 3D position of this 2D view data and described relevant each mark of information calculations at interval, and set up absolute coordinate system based on the 3D position of the mark of calculating like this, the step of wherein arranging a plurality of 3D scan-datas further may further comprise the steps:
According to the 3D position of mark, search the respective markers that in each adjacent 3D scan-data, comprises jointly, the 3D scan-data that each of described acquisition is adjacent is overlapped;
Calculate the transition matrix that is used to arrange each 3D scan-data according to corresponding mark; With
Utilize transition matrix to move each 3D scan-data, so that fasten this 3D scan-data of arrangement at absolute coordinates.
34. the method for claim 21 further may further comprise the steps: when the described tag producing apparatus of control is used to take other territory, the mark that throws on the territory of having taken of periodically glimmering.
35. the method for claim 21 further may further comprise the steps: when the described tag producing apparatus of control when being used to take other territory, project on the territory that has been taken mark with project the territory that is not taken as yet on mark between the different color of generation.
36. use the method for optical markings auto arrangement 3D scan-data, may further comprise the steps:
Around object, arrange a plurality of image-acquisition devices, pattern projecting device and tag producing apparatus;
The described a plurality of tag producing apparatus that glimmers makes optical markings be incident upon on the body surface, and obtains a plurality of 2D view data that projection has the object of optical markings by a plurality of described image-acquisition devices;
Start a plurality of described pattern projecting devices pattern is incident upon on the body surface, and obtain each 3D scan-data of the figuratum object of projection by a plurality of described image-acquisition devices; And
From the 3D position that each 2D view data of being obtained by a plurality of described image-acquisition devices and the relation between the 3D scan-data are asked for each mark, and by the relative position of each 3D scan-data of 3D position calculation of mark, and arrange each 3D scan-data.
CNB038178915A 2002-07-25 2003-06-03 Apparatus and method for automatically arranging three dimensional scan data using optical marker Expired - Fee Related CN1300551C (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR1020020043830 2002-07-25
KR20020043830 2002-07-25
KR1020030022624 2003-04-10
KR10-2003-0022624A KR100502560B1 (en) 2002-07-25 2003-04-10 Apparatus and Method for Registering Multiple Three Dimensional Scan Data by using Optical Marker

Publications (2)

Publication Number Publication Date
CN1672013A CN1672013A (en) 2005-09-21
CN1300551C true CN1300551C (en) 2007-02-14

Family

ID=31190416

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB038178915A Expired - Fee Related CN1300551C (en) 2002-07-25 2003-06-03 Apparatus and method for automatically arranging three dimensional scan data using optical marker

Country Status (4)

Country Link
JP (1) JP4226550B2 (en)
CN (1) CN1300551C (en)
AU (1) AU2003241194A1 (en)
WO (1) WO2004011876A1 (en)

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070057946A1 (en) * 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
FR2870935A1 (en) * 2004-05-25 2005-12-02 Insidix Sarl DEVICE FOR MEASURING SURFACE DEFORMATIONS
US8111904B2 (en) * 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
JP4316668B2 (en) * 2006-05-30 2009-08-19 パナソニック株式会社 Pattern projection light source and compound eye distance measuring device
GB0615956D0 (en) * 2006-08-11 2006-09-20 Univ Heriot Watt Optical imaging of physical objects
JP5307549B2 (en) * 2006-11-08 2013-10-02 有限会社テクノドリーム二十一 Three-dimensional shape measuring method and apparatus
KR20080043047A (en) * 2006-11-13 2008-05-16 주식회사 고영테크놀러지 Three-dimensional image measuring apparatus using shadow moire
US8126260B2 (en) 2007-05-29 2012-02-28 Cognex Corporation System and method for locating a three-dimensional object using machine vision
CA2606267A1 (en) * 2007-10-11 2009-04-11 Hydro-Quebec System and method for three-dimensional mapping of a structural surface
JP5322206B2 (en) * 2008-05-07 2013-10-23 国立大学法人 香川大学 Three-dimensional shape measuring method and apparatus
JP2010025759A (en) 2008-07-18 2010-02-04 Fuji Xerox Co Ltd Position measuring system
US9734419B1 (en) 2008-12-30 2017-08-15 Cognex Corporation System and method for validating camera calibration in a vision system
JP5435994B2 (en) * 2009-03-18 2014-03-05 本田技研工業株式会社 Non-contact shape measuring device
US9533418B2 (en) 2009-05-29 2017-01-03 Cognex Corporation Methods and apparatus for practical 3D vision system
DE102009032771B4 (en) * 2009-07-10 2017-06-29 Gom Gmbh Measuring device and method for the three-dimensional optical measurement of objects
JP5375479B2 (en) * 2009-09-17 2013-12-25 コニカミノルタ株式会社 Three-dimensional measurement system and three-dimensional measurement method
CN101813461B (en) * 2010-04-07 2011-06-22 河北工业大学 Absolute phase measurement method based on composite color fringe projection
EP2568253B1 (en) * 2010-05-07 2021-03-10 Shenzhen Taishan Online Technology Co., Ltd. Structured-light measuring method and system
US9393694B2 (en) 2010-05-14 2016-07-19 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
CN102346011A (en) * 2010-07-29 2012-02-08 上海通用汽车有限公司 Measuring tools and measuring methods
US9124873B2 (en) 2010-12-08 2015-09-01 Cognex Corporation System and method for finding correspondence between cameras in a three-dimensional vision system
DE102011114674C5 (en) 2011-09-30 2020-05-28 Steinbichler Optotechnik Gmbh Method and device for determining the 3D coordinates of an object
US9163938B2 (en) 2012-07-20 2015-10-20 Google Inc. Systems and methods for image acquisition
DE102013203399A1 (en) * 2013-02-28 2014-08-28 Siemens Aktiengesellschaft Method and projection device for marking a surface
US9789462B2 (en) * 2013-06-25 2017-10-17 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
DE102013110667B4 (en) 2013-09-26 2018-08-16 Deutsches Zentrum für Luft- und Raumfahrt e.V. Method for the non-destructive inspection of three-dimensional workpieces and apparatus for carrying out such a method
CN104315975A (en) * 2014-10-22 2015-01-28 合肥斯科尔智能科技有限公司 Linear three dimension and high precision scan method
KR101612254B1 (en) 2014-10-30 2016-04-15 한국생산기술연구원 A multi-channel head assembly for 3D printer comprising polygon mirrors rotating in single direction, and a scanning method therewith and a 3D printer therewith
CN104359405B (en) * 2014-11-27 2017-11-07 上海集成电路研发中心有限公司 Three-dimensional scanner
CN105232161B (en) * 2015-10-16 2017-05-17 北京天智航医疗科技股份有限公司 Surgical robot mark point recognition and location method
US20170116462A1 (en) * 2015-10-22 2017-04-27 Canon Kabushiki Kaisha Measurement apparatus and method, program, article manufacturing method, calibration mark member, processing apparatus, and processing system
KR101788131B1 (en) * 2016-01-18 2017-10-19 한화첨단소재 주식회사 Apparatus for displaying a die mounting location of the thermosetting resin composition sheet for an electric vehicle
US20180108178A1 (en) * 2016-10-13 2018-04-19 General Electric Company System and method for measurement based quality inspection
DE102017109854A1 (en) * 2017-05-08 2018-11-08 Wobben Properties Gmbh Method for referencing a plurality of sensor units and associated measuring device
CN107169964B (en) * 2017-06-08 2020-11-03 广东嘉铭智能科技有限公司 Method and device for detecting surface defects of cambered surface reflecting lens
CN108225218A (en) * 2018-02-07 2018-06-29 苏州镭图光电科技有限公司 3-D scanning imaging method and imaging device based on optical micro electro-mechanical systems
DE102020213141A1 (en) * 2020-10-19 2022-04-21 Robert Bosch Gesellschaft mit beschränkter Haftung Method for generating an optical marking, method for detecting an optical marking and marking device with the optical marking
CN112461138B (en) * 2020-11-18 2022-06-28 苏州迈之升电子科技有限公司 Cross scanning measurement method, measurement grating and application thereof

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
US5276613A (en) * 1988-12-14 1994-01-04 Etienne Schlumberger Process and device for coordinating several images of the same object
WO1998013746A1 (en) * 1996-09-27 1998-04-02 Samsung Electronics Co. Ltd. Method for feeding information into a computer
US6239868B1 (en) * 1996-01-02 2001-05-29 Lj Laboratories, L.L.C. Apparatus and method for measuring optical characteristics of an object


Also Published As

Publication number Publication date
WO2004011876A1 (en) 2004-02-05
JP4226550B2 (en) 2009-02-18
JP2005534026A (en) 2005-11-10
CN1672013A (en) 2005-09-21
AU2003241194A1 (en) 2004-02-16

Similar Documents

Publication Publication Date Title
CN1300551C (en) Apparatus and method for automatically arranging three dimensional scan data using optical marker
CN1192249C (en) Automatic-tracing lighting equipment, lighting controller and tracing apparatus
CN1205847C (en) Component recognizing method and apparatus
CN100338632C (en) Marker placement information estimating method and information processing device
CN1164072C (en) Image scanning device and method
CN1912537A (en) Three-dimensional measurement system and method of the same, and color-coded mark
CN1287329C (en) Monitor device
CN1208190A (en) Optical scanning-type touch panel
CN1334913A (en) Apparatus and method to measure three-dimensional data
CN1132123C (en) Methods for creating image for three-dimensional display, for calculating depth information, and for image processing using depth information
CN2622674Y (en) Motion transducer
CN1847789A (en) Method and apparatus for measuring position and orientation
CN1860837A (en) Component mounting method and apparatus
CN1667567A (en) Coordinate input apparatus and its control method
CN1831519A (en) Brightness measuring apparatus and measuring method thereof
CN1493053A (en) Data input device
CN1577051A (en) Image processing system, projector,and image processing method
CN1846232A (en) Object posture estimation/correlation system using weight information
CN1526118A (en) Method and system for producing formatted data related to geometric distortions
CN1659418A (en) Camera corrector
CN1274839A (en) Lens evaluation method and device, optical unit and lens-adjusting method and device thereof
CN1601369A (en) Image processing system, projector and image processing method
CN1894557A (en) Calibration of a surveying instrument
CN1445513A (en) Sensor calibrating device and method, program, memory medium, information processing method and device
CN1453597A (en) Optical elements, it metal mould and method for processing optical elements

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
C17 Cessation of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20070214

Termination date: 20120603