WO2004011876A1 - Method and apparatus for automatically arranging 3D optical scan data using optical markers


Info

Publication number
WO2004011876A1
Authority
WO
WIPO (PCT)
Prior art keywords
markers
scan data
image
positions
marker
Application number
PCT/KR2003/001087
Other languages
English (en)
Inventor
Min-Ho Chang
Original Assignee
Solutionix Corporation
Priority claimed from KR10-2003-0022624A (KR100502560B1)
Application filed by Solutionix Corporation
Priority to JP2004524349A (JP4226550B2)
Priority to AU2003241194A (AU2003241194A1)
Publication of WO2004011876A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object

Definitions

  • the present invention relates to an apparatus and method for automatically arranging three-dimensional (3D) scan data by using optical markers and, more particularly, to an apparatus and method for automatically arranging, with reference to one coordinate system, the relative positions of a plurality of 3D scan data captured at various positions and angles.
  • an optical 3D scanner can extract 3D data only from the part of an object's surface that lies within the scanner's field of view.
  • accordingly, the object to be scanned should be rotated or moved, or the scanner itself should be moved to a position from which the remaining part or parts of the object can be seen.
  • a complete 3D scan may then be captured by surveying the object at various orientations and angles, and the 3D data thus obtained are arranged and integrated into a single uniform coordinate system.
  • each 3D scan data is defined by a different coordinate system according to the position of the scanner.
  • to integrate them, the distance by which the scanner has been moved must be known. There are two methods of calculating this distance: one is to obtain an absolute distance by using a numerically-controlled device for moving the scanner, and the other is to calculate the distance by referring only to the scanned data.
  • the scanning must be carried out so that the plurality of scanned data overlap each other and corresponding points are inputted at the overlapping positions of the scanned data.
  • the scanned data is then arranged in reference to the corresponding points such that the respective coordinate systems defining the plurality of scanned data are integrated into one uniform coordinate system.
  • conventional markers 4 are attached arbitrarily onto the surface of an object 2, and the surface of the object 2 is scanned part by part in an overlapping manner.
  • when first and second scanned data I1 and I2 are obtained from scanning the surface of the object 2, as shown in Fig. 2, the operator searches for the markers M1 and M2 positioned commonly in the two scanned data I1 and I2 and arranges the two scanned data by matching the markers as corresponding points. Meanwhile, in a technique where corresponding points are automatically recognized, markers having patterns different from each other are searched for via image processing, and if markers having identical patterns are positioned in two different scanned data, the two scanned data are arranged automatically in reference to the markers.
  • the present invention provides an apparatus and method for automatically arranging 3D scan data by using optical markers which are of non-contact type so that scanned parts of an object are preserved and not deformed.
  • an apparatus using optical markers for automatically arranging 3D scan data which is obtained by scanning an object at various angles, comprising: marker generating means for projecting a plurality of optical markers on the surface of an object; pattern projecting means for projecting patterns on the surface of the object in order to obtain 3D scan data of the object; image obtaining means for obtaining 2D image data of the object including the optical markers projected on the surface of the object and for obtaining 3D scan data of the object through the patterns projected on the surface of the object; and control means for extracting 3D positions of the optical markers from the relation between the 2D image data and the 3D scan data and calculating relative positions of the 3D scan data in reference to the 3D positions of the optical markers.
  • a method for automatically arranging 3D scan data using optical markers comprising the steps of: moving the image obtaining means to a position appropriate for obtaining an image of parts of the object; projecting the optical markers on the surface of the object by marker generating means and obtaining 2D image data of parts of the object including the optical markers projected on the surface of the object by image obtaining means; projecting patterns on the surface of the object by pattern projecting means and obtaining 3D scan data of parts of the object on which the patterns are projected by the image obtaining means; and extracting 3D positions of the optical markers from the relationship between the 2D image data and the 3D scan data and arranging 3D scan data obtained from different parts of the object in reference to the 3D positions of the optical markers.
  • Fig. 1 is a drawing for illustrating an example of a 3D scan of an object by attaching conventional sticker type markers on the object;
  • Fig. 2 is a drawing for illustrating an example of an arrangement of different scan data by using sticker type markers as reference markers;
  • Fig. 3 is a schematic drawing for illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to a first embodiment of the present invention
  • Figs. 4a to 4c are schematic drawings for illustrating examples of a state for obtaining a 2D image data using optical markers according to the first embodiment of the present invention and a state for obtaining a 3D scan data by using patterns;
  • Fig. 5 is a schematic drawing for illustrating an example of a state for deriving 2D positions of markers from the 2D image data obtained by activating and deactivating the optical markers according to the first embodiment of the present invention;
  • Fig. 6 is a schematic drawing for illustrating an example of a state for deriving a 3D position of a marker from a 2D position of a marker and a center position of a camera lens;
  • Figs. 7a to 7b are schematic drawings for exemplarily illustrating an operation of searching corresponding markers by way of a triangular comparison at mutually different image data according to the first embodiment of the present invention
  • Figs. 8a to 8d are schematic drawings for exemplarily illustrating a conversion operation of matching two triangular structures at mutually different positions in triangular comparison within two different image data according to the first embodiment of the present invention
  • Fig. 9 is a schematic drawing for exemplarily illustrating an operation of searching corresponding markers by way of obtaining an imaginary marker at mutually different image data according to the first embodiment of the present invention.
  • Figs. 10a to 10b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention
  • Fig. 11 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a second embodiment of the present invention
  • Fig. 12 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the second embodiment of the present invention
  • Fig. 13 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a third embodiment of the present invention
  • Fig. 14 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fourth embodiment of the present invention.
  • Fig. 15 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a fifth embodiment of the present invention.
  • Fig. 16 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a sixth embodiment of the present invention.
  • Figs. 17a and 17b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the sixth embodiment of the present invention;
  • Fig. 18 is a schematic drawing for illustrating an error occurring in the process of arranging a scanned data by way of a reference coordinate system
  • Fig. 19 is a schematic drawing for illustrating an error occurring in the process of arranging a scanned data by way of an absolute coordinate system
  • Fig. 20 is a schematic drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a seventh embodiment of the present invention.
  • Figs. 21a and 21b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the seventh embodiment of the present invention.
  • Fig. 22 is a drawing of an example of an image obtained by using a large domain image obtaining part depicted in Fig. 20;
  • Fig. 23 is a drawing of an example of an image obtained by using the large domain image obtaining part depicted in Fig. 20 and an image obtaining part;
  • Fig. 24 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to an eighth embodiment of the present invention
  • Figs. 25a and 25b are flow charts for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the eighth embodiment of the present invention
  • Fig. 26a is a drawing of an example of an image obtained by using a pair of large domain image obtaining parts illustrated in Fig. 24;
  • Fig. 26b is a drawing of an example of an image obtained by using the pair of large domain image obtaining parts illustrated in Fig. 24 and an image obtaining part;
  • Fig. 27 is a schematic drawing for illustrating a principle of the eighth embodiment of the present invention
  • Fig. 28 is a drawing for illustrating a construction of an apparatus for automatically arranging 3D scan data using optical markers according to a ninth embodiment of the present invention
  • Fig. 29 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention.
  • Fig. 30 is a flow chart for illustrating an operation of a method for automatically arranging 3D scan data using optical markers according to a tenth embodiment of the present invention.
  • Fig. 31 is a schematic drawing for illustrating a construction of a marker generator according to an eleventh embodiment of the present invention.
  • Fig. 3 is a drawing for illustrating a structure of an apparatus for automatically arranging 3D scan data using optical markers according to the first embodiment of the present invention, wherein the apparatus includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a marker blinking controller 26, a pattern projector controller 28, a microprocessor 30 and a buffer 32.
  • the marker generator 12 which is designed to project markers recognizable by the image obtaining part 18 on the surface of an object 10 includes a plurality of marker output parts 14 for projecting plural optical markers simultaneously on the surface of the object 10 in irregular directions.
  • the plurality of marker output parts 14 adopts laser pointers capable of projecting a plurality of red spots on the surface of the object 10, such that the positions of spots projected on the surface of the object 10 may be easily distinguished from images obtained by the image obtaining part 18, such as a camera or the like.
  • the marker generator 12 is by no means limited to a laser pointer, and any optical marker may be adopted as long as it can be properly focused on the surface of the object and can be easily controlled to blink repeatedly.
  • a plurality of marker generators 12 may be disposed along a circumference of an object in order to project the optical markers on the surface of the entire object 10, and the number of the optical markers may be changed in accordance with the size and shape of the object 10. Furthermore, the marker generators 12 should be fixed on the object during the scanning operation so that the positions of the markers do not vary on the surface of the object.
  • the pattern projector 16 shown in the drawing projects predetermined patterns so that 3D scan data of the object 10 can be obtained. Namely, space-encoded beams are projected on the surface of the object 10 by using projectors such as LCD projectors and the like, or a laser beam is projected on the surface of the object 10, so that 3D scan data of the object 10 can be obtained via the image obtaining part 18.
  • the pattern projector 16 adopts a slide projector comprising a light source, a pattern film and a lens for projecting a predetermined pattern, or an electronic LCD projector, or a laser diode for projecting laser striped patterns.
  • a pattern film equipped with striped patterns is fed between a light source and a lens by a predetermined feeding means, which allows a series of striped patterns to be projected on the object 10.
  • the pattern film may have striped patterns of varying gaps, as disclosed in Korean Patent Application No. 2002-10839 filed on February 28, 2002 by the present applicant, entitled "3D Scan Apparatus and Method Using Multiple Striped Patterns." The same applies to a scanning device using laser striped patterns.
  • markers should not be projected on the object 10 while obtaining 3D scan data because scanned data might be deformed by the markers on the object 10.
  • the image obtaining part 18 comprises image sensors capable of receiving images such as a Charge Coupled Device (CCD) camera or a CMOS camera.
  • the image obtaining part 18 may be configured separately from the pattern projector 16 but it is preferable that the image obtaining part 18 be integrally built in with the pattern projector 16 because the image obtaining part 18 integrated with the projector 16 is simple in structure and it is easy to match the 2D image data with the 3D scan data without calibration.
  • the image obtaining part 18 obtains 2D image data and 3D scan data while synchronizing the blinking period of the optical markers with the blinking period of the pattern, details of which are illustrated in Figs. 4a, 4b and 4c.
  • the image obtaining part 18 obtains a first image data 40 by photographing a certain part of the object 10 on which a plurality of optical markers (RM) are arbitrarily projected.
  • the image obtaining part 18 obtains a second image data 42 by photographing the same part of the object 10 as in Fig. 4a while the marker generator 12 is turned off, so that the laser markers are not projected on the surface of the object 10.
  • the image obtaining part 18 obtains 3D scan data by photographing the object 10, on which striped patterns are projected from the pattern projector 16, while the marker generator 12 is turned off.
  • the 3D scan data is obtained in the form of first to fifth scanned data 44a - 44e that correspond to the same part of the object 10 with different striped patterns PT1 - PT5 on the surface thereof, respectively.
  • although the pattern film has five different striped patterns here, it is not limited thereto and may have more patterns.
  • the movement driving part 20 moves the pattern projector 16 and the image obtaining part 18 relative to the object 10 according to the driving control of the microprocessor 30 so that images of the whole object 10 can be obtained.
  • the moving mechanism 22 receives a signal from the movement driving part 20 to move the pattern projector 16 and the image obtaining part 18 in a prescribed direction relative to the object 10.
  • although the movement driving part 20 is adopted in the present embodiment for electrically moving the pattern projector 16 and the image obtaining part 18, it should be apparent that the moving mechanism 22 may also be manipulated manually.
  • the image input part 24 shown in the drawing receives the image data obtained from the image obtaining part 18, and the marker blinking controller 26 serves to blink the optical markers of the marker generator 12 according to the control of the microprocessor.
  • the projector controller 28 controls the feeding speed and direction of the pattern film of the pattern projector 16, and also controls the blinking of the light source for the projection.
  • the microprocessor 30 receives and analyzes the 2D image data and the 3D scan data photographed at various angles through the image input part 24 and arranges the 3D scan data automatically on a single uniform coordinate system.
  • the microprocessor 30 carries out image processing on the first image data 40 with the laser markers (RM) and the second image data 42 without the laser markers (RM), and as a result the third image data 46 comprising only the laser markers (RM) is obtained, as sketched below.
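The differencing step just described can be sketched as follows. This is a minimal illustration assuming NumPy/SciPy images; the function name, threshold value, and blob-centroid choice are illustrative assumptions, not details given in the patent.

```python
import numpy as np
from scipy import ndimage

def extract_marker_positions(img_on, img_off, threshold=50):
    """2D marker centroids from a marker-on / marker-off image pair."""
    # Only the projected laser spots survive the subtraction.
    diff = img_on.astype(np.int16) - img_off.astype(np.int16)
    mask = diff > threshold
    # Label the connected bright blobs and take each blob's intensity centroid.
    labels, n = ndimage.label(mask)
    return ndimage.center_of_mass(diff, labels, range(1, n + 1))  # (row, col) pairs
```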
  • the microprocessor 30 calculates 3D positions of the markers from the relation between the camera lens center 50 of the image obtaining part 18 and the 2D image data 52 from which the marker positions have been extracted.
  • the 3D positions of the markers (a', b', c') corresponding to relevant markers can be obtained by estimating intersection points where straight lines connecting the camera lens center 50 of the image obtaining part 18 and the positions (a, b, c) of arbitrary markers in the 2D image data intersect the 3D scan data 54.
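A sketch of this back-projection step follows. It assumes the scan points are expressed in the camera frame (lens center at the origin) with a known intrinsic matrix K, and approximates the ray/scan-data intersection by the scan point nearest to the viewing ray; both are simplifying assumptions, not the patent's stated procedure.

```python
import numpy as np

def marker_3d_position(pixel, K, scan_points):
    """Estimate a marker's 3D position as the scan point closest to the
    viewing ray through the marker's pixel (camera at the origin)."""
    u, v = pixel
    # Back-project the pixel to a unit ray direction using the intrinsics K.
    d = np.linalg.inv(K) @ np.array([u, v, 1.0])
    d /= np.linalg.norm(d)
    # Distance of every scan point from the ray through the lens center.
    t = scan_points @ d                              # projection onto the ray
    dist = np.linalg.norm(scan_points - np.outer(t, d), axis=1)
    return scan_points[np.argmin(dist)]
```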
  • each scan data should preferably include four to five or more markers, and two neighboring scan data should include three or more common markers. This is because three or more points are necessary for defining a unique position in 3D space and are also a requirement for obtaining corresponding markers (described later).
  • in one known technique, each marker is given a different pattern so that the markers can be distinguished from each other.
  • in this case, however, the configuration of the equipment and its manufacture may be complicated because scores or hundreds of marker output parts should project differently shaped markers.
  • the markers are distinguished from each other by using information on the relative positions of the markers calculated based on respective 3D scan data at the microprocessor 30.
  • three points formed by markers construct a triangle, and triangles constructed from three different points generally differ from each other, so that each triangle can be identified by comparing its angles and side lengths. Therefore, one marker corresponding to a vertex of a triangle can be distinguished from another marker.
  • suppose one scan data 60 contains C(M,3) different triangles and another scan data contains C(N,3) different triangles. Then, by comparing the triangles between the two scan data a total of C(M,3) x C(N,3) times, corresponding pairs of triangles can be obtained.
  • the microprocessor 30 constructs plural triangles T1 and T2 according to points obtained by markers included in one scan data 60 and plural triangles T3 and T4 included in another scan data 62.
  • the microprocessor 30 seeks a pair of mutually corresponding triangles, e.g., T1 and T3, which are contained in the two scan data 60 and 62, respectively.
  • various methods can be used to compare the triangles. One of them is to seek a pair of corresponding triangles by comparing the lengths of their sides. In other words, the three sides of each triangle (a1, a2, a3) and (b1, b2, b3) are compared, and if the length of each side is identical to its counterpart and the order of the sides is the same, the two triangles can be determined to correspond.
  • in seeking triangles with three identical sides, the lengths of the sides are arranged in descending order, for example, and if two or more triangles having identical sides are detected, the order of the sides is checked. That is, two triangles are judged to correspond if the sides compared in a counterclockwise or clockwise order from the longest side are all identical, as sketched below.
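A minimal sketch of this side-length comparison, assuming NumPy arrays of 3D marker positions. Rounding is used as a crude length tolerance (the patent does not specify one), and the clockwise/counterclockwise ordering check is omitted for brevity.

```python
import itertools
import numpy as np

def triangle_signature(p, q, r):
    """Side lengths in descending order; congruent triangles share it."""
    sides = sorted((np.linalg.norm(p - q), np.linalg.norm(q - r),
                    np.linalg.norm(r - p)), reverse=True)
    return tuple(round(s, 2) for s in sides)   # rounding as a crude tolerance

def corresponding_triangles(markers_a, markers_b):
    """All triangle pairs (one triple from each scan) with equal signatures."""
    sig_b = {}
    for tri in itertools.combinations(range(len(markers_b)), 3):
        key = triangle_signature(*(markers_b[i] for i in tri))
        sig_b.setdefault(key, []).append(tri)
    return [(tri_a, tri_b)
            for tri_a in itertools.combinations(range(len(markers_a)), 3)
            for tri_b in sig_b.get(
                triangle_signature(*(markers_a[i] for i in tri_a)), [])]
```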
  • the scan data are moved so that these markers can be positioned on the same point in a single uniform coordinate system. Namely, one of the triangles serves as a reference and the corresponding triangle moves to the reference, and as a result, the two different coordinate systems are matched.
  • the matching process for the two triangles located in two different scan data is illustrated in Figs. 8a to 8d.
  • as shown in Fig. 8a, two triangles are given which are identical in size and shape but located in different scan data; information on their vertices and sides is known once the two corresponding triangles are determined.
  • Fig. 8b depicts a translation carried out by a translation matrix (T), where the reference coordinate system relating to one triangle is set up as A and the other coordinate system is set up as B. The translation matrix (T) is defined in Formula 1.
  • Fig. 8d depicts a rotation carried out in order to match the remaining corresponding vertices by using a rotation matrix (R2), where the rotation matrix (R2) is defined by Formula 3.
  • through these operations, a point (P) included in one scan data is moved to a new position in the other scan data by Formula 5; an equivalent single rigid transform is sketched below.
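The net effect of Formulas 1 to 5 (a translation followed by two rotations) is a single rigid transform. The sketch below computes an equivalent transform from one pair of corresponding triangles using the standard Kabsch/SVD method instead of the patent's stepwise construction; this is a substitution, not the patent's own formulas.

```python
import numpy as np

def rigid_transform_from_triangles(tri_b, tri_a):
    """Rotation R and translation t mapping triangle B onto triangle A
    (the combined effect of the moves shown in Figs. 8b to 8d)."""
    A, B = np.asarray(tri_a, float), np.asarray(tri_b, float)
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    # Kabsch: SVD of the cross-covariance of the centred vertex pairs.
    U, _, Vt = np.linalg.svd((B - cb).T @ (A - ca))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = ca - R @ cb
    return R, t                       # a point P of scan B maps to R @ P + t
```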
  • Q is a state vector of the registration, defined as Q = [Q_R | Q_T]^t, where Q_R is a quaternion vector [q0 q1 q2 q3]^t and Q_T is a translation vector defined as [q4 q5 q6]^t.
  • f(Q) is defined as the average of the squared distances between xi and (R(Q_R)pi + Q_T); the R(Q_R) and Q_T minimizing f(Q) can be calculated by the least squares method.
  • R(Q R ) can be defined by a 3x3 rotation matrix as in the following Formula 7:
  • cyclic components of the anti-symmetric matrix (Aij) are used to form the column vector (Δ).
  • this vector (Δ) is then used to form a symmetric 4x4 matrix Q(Σpx), which is given by the following Formula 10:
  • Q_R is the eigenvector corresponding to the maximum eigenvalue of Q(Σpx); the quaternion (q0, q1, q2, q3) is obtained from this eigenvector, and the rotation matrix is obtained by substituting the quaternion into Formula 7. Meanwhile, Q_T = (q4, q5, q6) can be obtained from the following Formula 11 by utilizing R(Q_R) given from Formula 7.
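The quantities in the preceding bullets match the standard quaternion-based least-squares registration of Besl and McKay, which this passage appears to follow. Since the formula images are not reproduced here, the block below is a reconstruction under that assumption; the formula numbers are aligned with the references in the text as far as they can be inferred.

```latex
% Reconstruction assuming the Besl-McKay quaternion registration method.
% Error function to be minimized (Formula 6):
f(\mathbf{q}) = \frac{1}{N}\sum_{i=1}^{N}
  \bigl\lVert \mathbf{x}_i - \mathbf{R}(\mathbf{q}_R)\,\mathbf{p}_i - \mathbf{q}_T \bigr\rVert^2

% Rotation matrix from the unit quaternion q_R = [q_0\,q_1\,q_2\,q_3]^t (Formula 7):
\mathbf{R}(\mathbf{q}_R) =
\begin{bmatrix}
q_0^2+q_1^2-q_2^2-q_3^2 & 2(q_1 q_2 - q_0 q_3)    & 2(q_1 q_3 + q_0 q_2)\\
2(q_1 q_2 + q_0 q_3)    & q_0^2-q_1^2+q_2^2-q_3^2 & 2(q_2 q_3 - q_0 q_1)\\
2(q_1 q_3 - q_0 q_2)    & 2(q_2 q_3 + q_0 q_1)    & q_0^2-q_1^2-q_2^2+q_3^2
\end{bmatrix}

% Cross-covariance of the marker sets, its anti-symmetric part, and the
% column vector Delta (presumably Formulas 8 and 9):
\Sigma_{px} = \frac{1}{N}\sum_{i=1}^{N} \mathbf{p}_i \mathbf{x}_i^{t}
            - \boldsymbol{\mu}_p \boldsymbol{\mu}_x^{t},
\qquad A = \Sigma_{px} - \Sigma_{px}^{t},
\qquad \Delta = [A_{23}\; A_{31}\; A_{12}]^{t}

% Symmetric 4x4 matrix whose maximum-eigenvalue eigenvector is q_R (Formula 10):
Q(\Sigma_{px}) =
\begin{bmatrix}
\operatorname{tr}(\Sigma_{px}) & \Delta^{t}\\
\Delta & \Sigma_{px} + \Sigma_{px}^{t} - \operatorname{tr}(\Sigma_{px})\, I_3
\end{bmatrix}

% Optimal translation (Formula 11):
\mathbf{q}_T = \boldsymbol{\mu}_x - \mathbf{R}(\mathbf{q}_R)\,\boldsymbol{\mu}_p

% Applied to every point P of a scan data (cf. Formulas 5 and 14):
P' = \mathbf{R}(\mathbf{q}_R)\,P + \mathbf{q}_T
```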
  • the microprocessor 30 computes a matrix for the total translation based on said one scan data 60 as a reference coordinate system, thereby arranging all of the 3D scan data automatically in the reference coordinate system.
  • a formula applied to the point cloud data (P), which are to be arranged by the method of coordinate mapping, can be defined by the below-mentioned Formula 14:
  • since each scan data with markers has 3D information, it is possible to seek corresponding markers even if only two corresponding markers are included in the overlapped region.
  • corresponding markers are sought by comparing the two surface normal vectors at the points where the markers are positioned and the distance between the two markers.
  • additional markers and 3D scan data are used for creating additional references. For example, in case there are three corresponding markers, a triangle is formed by the three markers, and then a line is drawn perpendicularly from the center of gravity of the triangle. Then, the intersecting point of the perpendicular line and the 3D scan data is obtained as a fourth reference point. Next, corresponding markers can be found by utilizing information on the average perpendicular (normal) vector of the object surface at or around the markers.
  • in case there are only two corresponding markers, a straight line is drawn by connecting the two markers, and a circle is drawn on a plane perpendicular to the straight line, centered at the midpoint of that line. Then, intersecting points of the circle and the 3D scan data are obtained as fourth and fifth reference points. The three-marker case is sketched below.
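A sketch of the three-marker case, assuming NumPy and approximating the intersection of the perpendicular line with the 3D scan data by the nearest scan point to that line (an assumption, as the patent does not give its intersection routine).

```python
import numpy as np

def fourth_reference_point(m1, m2, m3, scan_points):
    """Imaginary fourth marker: the scan point nearest to the line dropped
    from the triangle's center of gravity along the triangle normal."""
    centroid = (m1 + m2 + m3) / 3.0
    normal = np.cross(m2 - m1, m3 - m1)
    normal /= np.linalg.norm(normal)
    # Distance of every scan point from the perpendicular line.
    rel = scan_points - centroid
    t = rel @ normal
    dist = np.linalg.norm(rel - np.outer(t, normal), axis=1)
    return scan_points[np.argmin(dist)]
```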
  • the microprocessor 30 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S10).
  • the microprocessor 30 controls the marker blinking controller 26 to allow the plurality of marker output parts 14 disposed at the marker generator 12 to project the markers arbitrarily on the surface of the object 10 (S11).
  • the image obtaining part 18 photographs a prescribed domain of the object 10 to obtain a 2D image including optical markers projected on the surface of the object 10, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S12).
  • the microprocessor 30 controls the marker blinking controller 26 to turn off the marker generator 12 so that the markers may not be projected on the object 10 (S13).
  • the image obtaining part 18 photographs the same domain as the above to obtain a 2D image without the markers, and then the microprocessor 30 receives the 2D image data via the image input part 24 (S14).
  • the microprocessor 30 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off. Then, prescribed patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10 from the pattern projector 16. Next, the image obtaining part 18 photographs the object 10 with the striped patterns projected on it to get 3D scan data, and then the microprocessor 30 receives the 3D scan data via the image input part 24 (S15).
  • the microprocessor 30 computes 2D positions of the markers by image processing the 2D image data with the markers and the 2D image data without the markers (S16).
  • the microprocessor 30 computes 3D positions of the markers by using the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data (S17).
  • the microprocessor 30 discriminates whether the register of the buffer 32 is vacant or not (S18). If the register of the buffer 32 is not vacant, the 3D positions of the markers obtained at S17 (current 3D scan data) and the 3D positions of the markers stored in the register of the buffer 32 (in other words, 3D data overlapping partly with the current 3D scan data) are compared for searching corresponding markers (S19).
  • the microprocessor 30 computes a translation matrix for matching the two 3D scan data (S20).
  • the positions of 3D scan data registered at the register of the buffer 32 are given as a reference coordinate system, to which the current scan data are translated (S21).
  • the microprocessor 30 registers the markers newly obtained from the current scan data in the register of the buffer 32 (S22). Then, the microprocessor 30 discriminates whether the automatic arrangement of the 3D scan data has been completed or not.
  • Fig. 11 illustrates a structure of an apparatus for automatically arranging 3D scan data using the optical markers according to the second embodiment of the present invention.
  • an apparatus for arranging 3D scan data automatically according to the second embodiment of the present invention includes a marker generator 70, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a moving mechanism 22, an image input part 24, an individual marker blinking controller 74, a projector controller 28, a microprocessor 76, and a buffer 32.
  • the marker generator 70 comprises a plurality of marker output parts 72 and projects markers recognizable by the image obtaining part 18 on the surface of object 10 randomly.
  • the marker generator 70 turns on the 1 - N marker output parts 72 one by one sequentially in response to the control of the individual marker blinking controller 74, which enables a different marker to be included for each image obtained from the image obtaining part 18.
  • the individual marker blinking controller 74 sequentially and individually blinks the plurality of marker output parts 72 mounted at the marker generator 70 in a predetermined order according to the control of the microprocessor 76.
  • the microprocessor 76 carries out the arrangement of the coordinate systems corresponding to the plurality of 3D scan data photographed at various angles by analyzing the 2D image data and 3D scan data inputted through the image input part 24. That is, the microprocessor 76 sets up an image photographed by the image obtaining part 18 while all the markers turn off as a reference image, and then compares the reference image with a plurality of images photographed while the markers turn on one by one sequentially. Through the above process, the 2D position of each marker can be obtained.
  • the microprocessor 76 carries out the same processes as performed in the first embodiment.
  • the microprocessor analyzes the 2D image data and the 3D scan data to compute the 3D positions of the markers, searches corresponding markers to obtain a translation matrix, and translates the plurality of 3D scan data to the reference coordinate system.
  • the microprocessor 76 controls the movement driving part 20 to activate the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning the object 10 (S30).
  • the microprocessor 76 obtains the image data photographed from the image obtaining part 18 as a reference image while all of the optical markers are turned off. Then, the microprocessor 76 controls the individual marker blinking controller 74 to turn on a firstly-designated marker output part 72 out of the plurality of marker output parts 72 equipped in the marker generator 70, which allows the first marker to be projected on the surface of the object 10 (S31). Then, an image is obtained as a first image data by photographing by the image obtaining part 18 (S32).
  • the microprocessor 76 controls the individual marker blinking controller 74 to turn on the secondly-designated marker output part according to a predetermined order and to allow the second optical marker to be projected on the object 10 (S33). Then, the second image data is obtained (S34). Next, the microprocessor 76 discriminates whether the marker contained in the image is the last (N-th) marker out of the predetermined plural markers (S35). If the marker is not the last marker, steps S33 and S34 are repeatedly carried out until N-th image data is obtained.
  • the microprocessor 76 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 70 is turned off to prevent the optical marker from being projected and to allow a prescribed pattern (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) for 3D scanning to be projected on the object 10 by the pattern projector 16.
  • a prescribed pattern for example, patterns having stripes with different gaps therebetween or multiple striped pattern
  • the microprocessor 76 receives the 3D scan data from the image input part 24 (S36).
  • the microprocessor 76 compares each of the first to N-th image data with the reference image and searches a bright spot formed by the optical markers in each comparison, which helps the 2D positions of each marker to be found easily (S37).
  • the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, and searches corresponding markers included in overlapped regions in reference to the 3D positions of the markers, and calculates translation matrices, and translates the plurality of 3D scan data to the reference coordinate system (S38), which is the same as described in the first embodiment.
  • finally, the microprocessor 76 discriminates whether the automatic arrangement of the 3D scan data has been completed or not.
  • the construction of the apparatus for automatically arranging the 3D scan data according to the third embodiment of the present invention is identical to the one shown in Fig. 11. However, the method differs between the second embodiment and the third embodiment. That is, in the second embodiment, N images, each of which includes one marker different from the others, have to be photographed respectively. In the third embodiment, however, log2(N+1) images are photographed, each of which includes a group of markers for binarization.
  • the individual marker blinking controller 74 divides the marker output parts 72 disposed at the marker generator 70 into several groups for binarization, and turns on the markers group by group.
  • the individual marker blinking controller 74 divides the 16 marker output parts 72 into 4 groups in an overlapping way.
  • a first group comprises 9th to 16th markers
  • a second group comprises 5th to 8th markers and 13th to 16th markers
  • a third group comprises 3rd, 4th, 7th, 8th, 11th, 12th, 15th and 16th markers
  • a fourth group comprises the even-numbered markers (2nd, 4th, 6th, 8th, 10th, 12th, 14th, 16th), which are all represented in Table 1.
  • in Table 1, "0" represents that the marker is turned off while "1" represents that the marker is turned on.
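The table image itself is not reproduced in this text; the following reconstruction is derived from the group definitions above, in which marker n carries the 4-bit binary code of n-1, with the first group as the most significant bit:

Table 1 (reconstructed)

    Marker :  1  2  3  4  5  6  7  8  9 10 11 12 13 14 15 16
    Group 1:  0  0  0  0  0  0  0  0  1  1  1  1  1  1  1  1
    Group 2:  0  0  0  0  1  1  1  1  0  0  0  0  1  1  1  1
    Group 3:  0  0  1  1  0  0  1  1  0  0  1  1  0  0  1  1
    Group 4:  0  1  0  1  0  1  0  1  0  1  0  1  0  1  0  1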
  • the first marker maintains a turned-off state at all times while the 16th marker always maintains a turned-on state and all markers have intrinsic values respectively.
  • the microprocessor 76 controls the individual marker blinking controller 74 such that the N markers are projected group by group, compares the log2(N) image data obtained by the image obtaining part 18, and computes the 2D positions of the markers.
  • when the 16 markers are projected group by group to obtain the first to fourth image data as shown in Table 1, the 16 markers are differentiated by their binary codes, which represent their turned-on or turned-off states, that is, their identifications (IDs). Therefore, the 2D positions of the 16 markers can be obtained. For example, the 10th marker is recognized as binary "1001" and the 13th marker is recognized as binary "1100". Meanwhile, the first marker, always maintaining a turned-off state, is not used, such that a total of 15 markers can be utilized in practice. A decoding sketch follows.
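A sketch of this decoding, assuming NumPy/SciPy, a marker-off reference image, and the group images ordered most significant first; the function name and threshold are illustrative.

```python
import numpy as np
from scipy import ndimage

def decode_marker_ids(group_images, reference_image, threshold=50):
    """Recover each marker's number and 2D position from the group images.

    Returns {marker_number: (row, col)} for every detected marker blob.
    """
    # A blob's on/off pattern across the group images is its binary code.
    masks = [(img.astype(np.int16) - reference_image.astype(np.int16)) > threshold
             for img in group_images]
    labels, n = ndimage.label(np.logical_or.reduce(masks))
    ids = {}
    for blob in range(1, n + 1):
        where = labels == blob
        bits = ''.join('1' if m[where].any() else '0' for m in masks)
        ids[int(bits, 2) + 1] = ndimage.center_of_mass(where)  # '1001' -> marker 10
    return ids
```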
  • the microprocessor 76 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data, searches corresponding markers, calculates the translation matrix, and moves the plurality of 3D scan data by the translation matrix. The above process is the same as that described in the first embodiment.
  • the microprocessor 76 controls the movement driving part 20 to drive the movement mechanism 22, which moves the image obtaining part 18 integrated with the pattern projector 16 to a position appropriate for scanning an object 10 (S40).
  • the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output part 72 so that the markers (9th - 16th markers) belonging to the first group can be projected (S41).
  • a first image data photographed by the image obtaining part 18 is obtained via the image input part 24 (S42).
  • the microprocessor 76 controls the individual marker blinking controller 74 to turn on the marker output part 72 so that Nth group of markers, for example, 5th - 8th and 13th - 16th markers are projected (S43), to thereby obtain Nth image data photographed by the image obtaining part 18 (S44).
  • the microprocessor 76 discriminates whether the marker group contained in the image data is the last one or not (S45), and if it is not the last one, the flow is returned to S43 to repeat the process.
  • the projector controller 28 drives the pattern projector 16 to project patterns on the surface of the object 10 while the marker generator 70 is turned off to prevent the optical markers from being projected.
  • the microprocessor 76 receives the 3D scan data through the image input part 24 (S46).
  • the microprocessor 76 compares the first to N-th images obtained from the image obtaining part 18, which results in obtaining the binary information of the markers relative to the first to fourth image data. Therefore, the ID of each marker, that is, the 2D positions of the markers, are obtained (S47). Meanwhile, the microprocessor 76 computes the 3D positions of the markers by analyzing the 2D positions of the markers and the 3D scan data, searches the corresponding markers included in the overlapped region of two different 3D scan data in reference to the 3D positions of the markers, calculates the translation matrix, and translates one of the 3D scan data to the reference coordinate system by the translation matrix (S48), which is the same as described in the first embodiment.
  • the microprocessor 76 discriminates whether the automatic arrangement of the 3D scan data has been completed or not (S49). If the automatic arrangement of the 3D scan data has not been completed, the flow is returned to S40. Therefore, steps S40 to S48 are repeated.
  • an apparatus for automatically arranging 3D scan data includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 80, an individual marker blinking controller 84, and a microprocessor 86.
  • the marker generator 80 projects markers recognizable by the image obtaining part 18 on the surface of the object 10.
  • the marker generator 80 is disposed with a plurality of marker output parts 82 for projecting a plurality of optical markers at irregular angles on the entire surface of the object 10.
  • the marker generator 80 selectively blinks the plural marker output parts 82 according to the control of the individual marker blinking controller 84.
  • the individual marker blinking controller 84 individually controls the plural marker output parts 82 according to the control of the microprocessor 86.
  • the microprocessor 86 analyzes the scanned data obtained from the object 10 for automatically arranging the 3D scan data in a single uniform coordinate system.
  • the microprocessor 86 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles via the image input part 24 to analyze them for automatic arrangement on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
  • the markers projected on the region blink at a predetermined cycle (for example, approximately 0.5 second), while the markers projected on the other region maintain an "on" state by the control of the individual marker blinking controller 84.
  • markers for a region for which the obtaining process is already terminated maintain an "on" state, while the markers for the other region blink at a predetermined cycle.
  • the conditions of the markers are differently set up between one region where image data and scan data have been already obtained and the other region. Therefore, regions are easily differentiated by operators.
  • an apparatus for automatically arranging 3D scan data includes a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, an image input part 24, a projector controller 28, a buffer 32, a marker generator 90, a marker individual blinking/color controller 94 and a microprocessor 96.
  • the marker generator 90 projects markers recognizable by the image obtaining part 18 on a surface of an object.
  • the marker generator 90 is disposed with a plurality of marker output parts 92 for projecting a plurality of optical markers at arbitrary angles on the surface of the object 10.
  • the marker generator 90 is constructed such that at least more than two different colors can be selectively projected from each marker output part 92 according to the control of the marker individual blinking/color controller 94.
  • each marker output part 92 is equipped with more than two light sources each having different colors such that these light sources can be selectively lighted.
  • the marker individual blinking/color controller 94 controls the blinking and individual coloring of the plurality of marker output parts 92 disposed at the marker generator 90 according to the control of the microprocessor 96.
  • the microprocessor 96 analyzes the 2D image data and 3D scan data photographed from various angles by the image obtaining part 18 for automatically arranging the 3D scan data on one coordinate system.
  • the detailed operation procedures thereto are the same as those in the first embodiment of the present invention.
  • the microprocessor 96 controls the marker individual blinking/color controller 94 so that markers projected on a region where image data and scan data have already been obtained have different colors from that of the markers projected on the other region.
  • an apparatus for automatically arranging 3D scan data includes marker generators 12, a pattern projector 16, an image obtaining part 18, an image input part 24, a marker blinking controller 26, a projector controller 28, a buffer 32, a rotating table 100, a rotating drive part 102, a rotating mechanism 104 and a microprocessor 106.
  • the rotating table 100 rotates with an object 10 placed on the upper plate of the rotating table 100, and also rotates with a plurality of marker generators 12 disposed at a circumference of the upper plate.
  • the rotating drive part 102 drives the rotating mechanism 104 to rotate the rotating table 100 according to the control of the microprocessor 106 such that the object can be set at an angle appropriate for scanning.
  • although the rotating drive part 102 is utilized to electrically rotate the rotating table 100 in the sixth embodiment of the present invention, it should be apparent that the rotating mechanism 104 may be manually rotated to allow an operator to control the rotating table 100 arbitrarily.
  • as long as the marker generators and the object can be rotated together in a fixed state, not only the rotating table 100 but also other devices may be applied herein.
  • the microprocessor 106 receives 2D image data and 3D scan data photographed by the image obtaining part 18 at various angles and analyzes the data for arranging the 3D scan data automatically on one coordinate system, the detailed operation procedures of which are the same as those of the microprocessor in the first embodiment.
  • the object 10 and the marker generator 12 are rotated during the scanning process instead of the image obtaining part 18 and the pattern projector 16.
  • the microprocessor 106 controls the rotating drive part 102 to drive the rotating mechanism 104, thus rotating the rotating table 100 at a predetermined angle so that the object 10 can be rotated to a position appropriate for scanning (S50).
  • the microprocessor 106 controls the marker blinking controller 26 to turn on the marker generators 12 so that the optical markers are projected on the surface of the object 10 (S51).
  • the microprocessor 106 receives the 2D image data obtained by the image obtaining part 18 via the image input part 24 (S52).
  • the microprocessor 106 controls the marker blinking controller 26 to turn off the marker generators 12, thereby preventing the optical markers from being projected on the object 10 (S53).
  • the same region of the object 10 without the markers is photographed and the 2D image data thereof is received via the image input part 24 (S54).
  • the microprocessor 106 controls the projector controller 28 to activate the pattern projector 16 while the marker generators 12 are turned off to prevent the optical markers from being projected. Therefore, prescribed patterns (for example, patterns having stripes with different gaps therebetween or multiple striped pattern) are projected on the surface of the object 10 for 3D scanning.
  • the microprocessor 106 receives the 3D scan data via the image input part 24 (S55).
  • the microprocessor 106 calculates 2D positions of the markers by image-processing the 2D image data with and without the optical markers (S56).
  • the microprocessor 106 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S57). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
  • the microprocessor 106 discriminates whether the register of buffer 32 is vacant or not (S58).
  • the microprocessor 106 compares the 3D positions of the markers obtained at S57 with those of the markers included in the 3D scan data stored in the registers of the buffer 32, thereby searching markers that correspond to each other (S59).
  • the microprocessor 106 computes translation matrices (S60) by analyzing the relation of the corresponding markers, and the current scan data are translated to the reference coordinate system in which the 3D scan data registered in the buffer 32 are defined (S61), which is the same as described in the first embodiment.
  • the microprocessor 106 registers the markers at the register of the buffer 32, which serves as a reference in the next calculation (S62). Successively, the microprocessor 106 checks whether the automatic arrangement of the 3D scan data obtained from the object 10 is completed (S63).
  • the sixth embodiment of the present invention is so constructed as to allow the object 10 to be moved, and thus, it is easy to obtain and arrange 3D scan data from an object relatively smaller than that of the first embodiment of the present invention where the projector and the image obtaining part are structured to move.
  • the marker generators are fixed on the rotating table for preventing relative movement between them, until the scanning process is completed.
  • the arrangement method using reference coordinates in the aforementioned embodiments has a drawback in that errors can increase if the number of regions to be scanned is great. This is because, in the above methods, arrangement is carried out by integrating one 3D scan data into the reference coordinate system in which its already obtained neighboring 3D scan data is defined, and this arrangement process is repeated over all regions of the object. Therefore, an error made at one step can be amplified by the end of the arrangement.
  • Figs. 18a and 18b illustrate two scan data obtained by scanning two adjacent regions that overlap each other.
  • the dotted lines indicate real data of an object and the solid lines indicate scan data which are not identical to the real data.
  • to address this, a method of arranging 3D scan data on an absolute coordinate system instead of a reference coordinate system is presented in the seventh and eighth embodiments of the present invention.
  • the absolute coordinate system in these embodiments differs from the reference coordinate system in that every 3D scan data of the regions of an object is mapped to absolute coordinates. Therefore, errors occurring in obtaining one 3D scan data are not propagated to adjacent 3D scan data.
  • Figs. 19a and 19b illustrate two scan data obtained from scanning two adjacent regions, a part of which overlaps. If the two scan data of Figs. 19a and 19b are translated to an absolute coordinate system respectively and are attached to each other, as shown in Fig. 19d, the errors occurring in the two scan data are not added up as in Fig. 19c, such that the error amplification problem caused by inaccuracy of the image obtaining part described above can be prevented.
  • an apparatus for automatically arranging 3D scan data includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a first movement driving part 20, a first movement mechanism 22, a marker blinking controller 26, a projector controller 28, a buffer 32, a large domain image obtaining part 110, an image input part 112, a second movement driving part 114, a second movement mechanism 116, a microprocessor 118, and a reference object 120.
  • the large domain image obtaining part 110 comprises an image sensor for receiving images such as CCD camera or Complementary Metal Oxide Semiconductor (CMOS) camera.
  • the large domain image obtaining part 110 is positioned separately from the image obtaining part 18 to photograph and obtain an image of a large domain of the object 10.
  • the large domain image obtaining part 110 preferably adopts an image sensor having a relatively higher accuracy than that of the image obtaining part 18, which obtains images of only part of the scan domain.
  • the image input part 112 receives image data from the image obtaining part 18 and the large domain image obtaining part 110.
  • the second movement driving part 114 drives the second movement mechanism 116 to move the large domain image obtaining part 110 to a position suitable for obtaining the image of large part of the object 10 according to the driving control of the microprocessor 118.
  • although the large domain image obtaining part 110 is moved electrically by the second movement driving part 114 in the seventh embodiment of the present invention, it may also be moved by manually manipulating the second movement mechanism 116.
  • the microprocessor 118 computes 3D positions of each marker for the large scan domain by analyzing image data of the object 10 and a reference object 120 photographed by the large domain image obtaining part 110 in two or more different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12.
  • the 3D positions of the markers thus obtained serve as an absolute coordinate system.
  • the microprocessor 118 receives via the image input part 112 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, and analyzes them, and translates every 3D scan data to the absolute coordinate system, which results in an arrangement of a complete 3D scan data of the object 10.
  • the reference object 120, an object of a prescribed shape whose size (dimension) information is pre-inputted into the microprocessor 118, is arranged close to the object 10. An image of the reference object 120 is obtained along with that of the object 10 via the large domain image obtaining part 110.
  • an object 10 is placed beside the marker generator 12, and a reference object 120 is arranged close to the object 10.
  • the microprocessor 118 controls a second movement driving part 114 to drive the second movement mechanism 116 so that the large domain image obtaining part 110 moves to a position suitable for scanning the object 10.
  • the microprocessor 118 controls the marker blinking controller 26 to turn on a plurality of marker output parts 14 equipped at the marker generator 12, whereby a plurality of markers can be arbitrarily projected on the surface of the object 10 (S70).
  • the large domain of the object 10 and the reference object 120 are photographed by the large domain image obtaining part 110 to obtain 2D image data including the optical markers. Then the microprocessor 118 receives the 2D image data obtained from the large domain image obtaining part 110 via the image input part 112 (S71).
  • in Fig. 23, an example of an image including the entire domain of the object 10 and the reference object 120 obtained by the large domain image obtaining part 110 is shown.
  • RM indicates an optical marker projected on the surface of the object 10 while another reference symbol "B1" refers to an image obtained by the large domain image obtaining part 110.
  • the microprocessor 118 controls the second movement driving part 114 to drive the second movement mechanism 116 for moving the large domain image obtaining part 110 to a position suitable for scanning another part of the object 10 (S72).
  • the microprocessor 118 controls the large domain image obtaining part 110 to photograph the large domain of the object 10 including the reference object 120, whereby a 2D image including the optical markers in a different direction from that of S71 is obtained.
  • the 2D image is received by the microprocessor 118 via the image input part 112 (S73).
  • the microprocessor 118 then controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S74).
  • the microprocessor 118 combines the 2D images of the large domain of the object 10 obtained in different directions by the large domain image obtaining part 110 and computes the 3D positions of the markers included in the combined 2D images in reference to the already-known dimensions of the reference object 120 (S75). Next, the microprocessor 118 registers the 3D positions of the markers thus computed in the register of the buffer 32, where they define the absolute coordinate system (S76).
  • the microprocessor 118 controls the first movement driving part 20 to drive the first movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 is moved to a position suitable for scanning the object 10 (S77).
  • the microprocessor 118 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12 to allow the plural markers to be arbitrarily projected on the surface of the object 10 (S78).
  • the microprocessor 118 receives the 2D image data obtained by the image obtaining part 18 via the image input part 112 (S79).
  • the microprocessor 118 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the object 10 (S80). Under this condition, the same part of the large domain mentioned above is photographed by the image obtaining part 18 to obtain a 2D image without optical markers. The 2D image data thus obtained is inputted to the microprocessor via the image input part 112 (S81).
  • the microprocessor 118 controls the projector controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to prevent the optical markers from being projected, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or multiple striped patterns) are projected on the surface of the object 10.
  • the microprocessor 118 receives the 3D scan data via the image input part 112 (S82). Under such circumstances, the microprocessor 118 computes the 2D positions of the markers by image-processing the 2D image data including the markers and the 2D image data not including the markers (S83).
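  • the patent does not detail this image-processing step; the following is a minimal sketch of one common approach, assuming both exposures are 8-bit grayscale NumPy arrays of identical size: subtracting the marker-off image from the marker-on image isolates the projected markers, and the centroid of each bright blob gives a sub-pixel 2D marker position. All function and parameter names below are illustrative, not from the patent.

      import numpy as np
      from scipy import ndimage

      def find_marker_centroids(img_markers_on, img_markers_off, threshold=40):
          """Estimate 2D marker positions from a marker-on / marker-off image pair."""
          # Only the projected markers differ between the two exposures,
          # so they dominate the difference image. Cast to int16 first to
          # avoid uint8 wraparound on subtraction.
          diff = img_markers_on.astype(np.int16) - img_markers_off.astype(np.int16)
          mask = diff > threshold                  # pixels brightened by a marker
          labels, n = ndimage.label(mask)          # connected bright blobs
          # Centroid of each blob -> sub-pixel (row, col) marker position.
          return ndimage.center_of_mass(mask, labels, list(range(1, n + 1)))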
  • the microprocessor 118 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data. That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of three arbitrary markers in the 2D image data intersect the 3D scan data (S84).
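  • a minimal sketch of this estimation, assuming an ideal pinhole camera with the lens center at the origin of the camera frame and the 3D scan data given as a dense point cloud in that same frame; the scan point nearest to each ray serves as the intersection estimate (names and parameters are illustrative):

      import numpy as np

      def marker_3d_positions(markers_2d, scan_points, f_px, principal_point):
          """Intersect camera rays through 2D marker positions with the scan data.

          markers_2d  : (M, 2) pixel coordinates (x, y) of detected markers
          scan_points : (N, 3) 3D scan data expressed in the camera frame
          f_px        : focal length in pixels (pinhole model)
          """
          cx, cy = principal_point
          result = []
          for x, y in markers_2d:
              d = np.array([(x - cx) / f_px, (y - cy) / f_px, 1.0])
              d /= np.linalg.norm(d)                       # unit ray direction
              t = scan_points @ d                          # projection onto the ray
              dist2 = (scan_points ** 2).sum(axis=1) - t ** 2  # squared distance to ray
              result.append(scan_points[np.argmin(dist2)])     # nearest scan point
          return np.array(result)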
  • the microprocessor 118 compares the 3D positions of the markers found at S84 with the 3D positions of the markers stored in the register of the buffer 32 to search corresponding markers (S85).
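  • the patent does not specify the matching criterion; since distances between markers are invariant under the rigid motion separating two scans, one simple sketch matches a triple of markers whose sorted pairwise distances agree within a tolerance (all names and the tolerance value are illustrative):

      import numpy as np
      from itertools import combinations

      def match_marker_triples(markers_a, markers_b, tol=0.5):
          """Find a triple of markers present in both scans.

          markers_a, markers_b : (Na, 3) and (Nb, 3) marker positions given in
          the coordinate frames of two different scans. Returns index triples
          (ia, ib) whose pairwise distances agree within tol, or None.
          Brute force, which is fine for the small marker counts involved.
          """
          def dists(pts, idx):
              return sorted(np.linalg.norm(pts[i] - pts[j])
                            for i, j in combinations(idx, 2))
          for ia in combinations(range(len(markers_a)), 3):
              da = dists(markers_a, ia)
              for ib in combinations(range(len(markers_b)), 3):
                  if np.allclose(da, dists(markers_b, ib), atol=tol):
                      return ia, ib  # the per-marker pairing can then be fixed
                                     # by testing the 3! orderings of ib
          return None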
  • the microprocessor 118 calculates the translation matrices for translating the markers in the current 3D scan data to the absolute coordinate system (S86). Then, the current scan data are moved by the translation matrices to be arranged on the absolute coordinate system by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S87).
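  • such translation matrices can be computed, for example, as the least-squares rigid transform that maps the corresponding markers of the current scan onto their counterparts in the absolute coordinate system; a sketch using the standard SVD-based (Kabsch) solution follows — this specific algorithm is an assumption, as the patent does not name one:

      import numpy as np

      def rigid_transform(src, dst):
          """Best-fit rotation R and translation t with dst ~= R @ src + t.

          src, dst : (K, 3) corresponding marker positions, K >= 3.
          Returns a 4x4 homogeneous matrix that moves the current scan
          into the absolute coordinate system.
          """
          c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
          H = (src - c_src).T @ (dst - c_dst)            # 3x3 covariance matrix
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
          R = Vt.T @ D @ U.T                             # rotation, reflection-safe
          t = c_dst - R @ c_src
          M = np.eye(4)
          M[:3, :3], M[:3, 3] = R, t
          return M

  • applying M to the homogeneous coordinates of the current scan then realizes steps S86 and S87 as one matrix multiplication per scan point.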
  • the microprocessor 118 discriminates whether an automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether 3D scan data obtained from parts of the object 10 are all arranged or not (S88).
  • if the arrangement is not completed, steps S77 to S88 are repeatedly carried out.
  • although the large domain image obtaining part and the image obtaining part are introduced here as two different elements, one image obtaining part may preferably be utilized for obtaining both the images of the large domain of the object and the images of parts of that large domain.
  • in the eighth embodiment, the apparatus for automatically arranging 3D scan data using optical markers includes a marker generator 12, a pattern projector 16, an image obtaining part 18, a movement driving part 20, a movement mechanism 22, a marker blinking controller 26, a project controller 28, a buffer 32, a pair of (or plural) large domain image obtaining parts 130 and 132, an image input part 134 and a microprocessor 136.
  • the pair of large domain image obtaining parts 130 and 132 comprise image sensors for receiving images such as CCD cameras or CMOS cameras.
  • the cameras are fixed relative to each other and capture images of the same object from different angles, a method known as stereo vision.
  • the large domain image obtaining parts 130 and 132 preferably adopt image sensors of relatively higher resolution than the image obtaining part 18, which obtains images of parts of the domain.
  • the image input part 134 is intended to receive image data obtained by the image obtaining part 18 and the large domain image obtaining parts 130 and 132.
  • the microprocessor 136 computes 3D positions of each marker for large scan domains by analyzing image data of the object 10 photographed by the large domain image obtaining parts 130 and 132 in two different directions, while a plurality of optical markers are projected on the surface of the object 10 by the marker generator 12.
  • the 3D positions of markers thus obtained serve as an absolute coordinate system.
  • the microprocessor 136 receives via the image input part 134 a plurality of 2D image data and 3D scan data photographed at various angles by the image obtaining part 18, analyzes them, and translates every 3D scan data to the absolute coordinates, which results in an arrangement of the whole 3D scan data of the object 10.
  • a predetermined object 10 is placed beside the marker generator 12, and the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, allowing plural markers to be arbitrarily projected on the surface of the object 10 (S90).
  • the microprocessor 136 receives two 2D image data obtained from the large domain image obtaining parts 130 and 132 via the image input part 134 when the large domain of the object 10 is photographed in an overlapping manner in different directions by the large domain image obtaining parts 130 and 132, respectively, while the optical markers from the marker generator 12 are projected on the object 10 (S91).
  • Fig. 26 illustrates an example of images of the large scan domain of the object 10 obtained by the large domain image obtaining parts 130 and 132.
  • RM indicates an optical marker projected on the surface of the object 10, BI-1 is an image obtained by the large domain image obtaining part 132, and BI-2 is an image obtained by the large domain image obtaining part 130.
  • the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, thereby preventing the optical markers from being projected on the surface of the object 10 (S92).
  • the microprocessor 136 computes the 3D positions of the markers included in the large scan domain of the object in reference to the two 2D image data photographed in two different directions by the large domain image obtaining parts 130 and 132 (S93). In other words, from the relation between the positions of the pair of large domain image obtaining parts 130 and 132 and the 2D positions of each marker projected on the object 10, the 3D positions of each marker are calculated by triangulation, the details of which will be explained later. Next, the microprocessor 136 registers the 3D positions of each marker thus calculated in the register of the buffer 32 (S94).
  • the microprocessor 136 then controls the movement driving part 20 to drive the movement mechanism 22, whereby the image obtaining part 18 integrated with the pattern projector 16 moves to a position suitable for scanning the object 10 (S95).
  • the microprocessor 136 controls the marker blinking controller 26 to turn on the plurality of marker output parts 14 equipped at the marker generator 12, thereby projecting the plural markers arbitrarily on the surface of the object 10 (S96).
  • the microprocessor 136 receives the 2D image data via the image input part 134 (S97).
  • the microprocessor 136 controls the marker blinking controller 26 to turn off the marker generator 12, preventing the optical markers from being projected on the object 10 (S98). Under this condition, when the same part as mentioned above is photographed by the image obtaining part 18 to obtain 2D image data not including the optical markers, the microprocessor 136 receives the 2D image data via the image input part 134 (S99).
  • the microprocessor 136 controls the project controller 28 to activate the pattern projector 16 while the marker generator 12 is turned off to prevent the optical markers from being projected. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10.
  • the microprocessor 136 receives the 3D scan data via the image input part 134 (S100).
  • the microprocessor 136 analyzes the 2D image data including the markers and the 2D image data not including the markers to calculate the 2D positions of the markers (S101).
  • the microprocessor 136 computes the 3D positions of the markers in reference to the 2D positions of the markers and the 3D scan data (S102). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens center of the image obtaining part 18 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data. Successively, the microprocessor 136 compares the 3D positions of the markers found at S102 with the 3D positions of the markers stored in the register of the buffer 32 at S94 to search corresponding markers (S103).
  • the microprocessor 136 calculates the translation matrices for translating the markers in the current 3D scan data to the absolute coordinate system (S104).
  • the current scan data are moved by the translation matrices to be arranged on the absolute coordinates by which the 3D positions of the markers stored in the register of the buffer 32 are defined (S105).
  • the microprocessor 136 discriminates whether an automatic arrangement in relation to the 3D data obtained from the object 10 is completed or not, in other words, whether the 3D scan data obtained from parts of the entire scan domain of the object 10 are all arranged or not (S106).
  • if the arrangement is not completed, steps S95 to S106 are repeatedly carried out.
  • although the pair of large domain image obtaining parts, the image obtaining part and the marker generator are configured separately here, the pair of large domain image obtaining parts and the marker generator may be integrally configured as a modification. In this case, it is more convenient because there is no need to set the positions of the pair of large domain image obtaining parts according to the domain where the optical markers are projected.
  • furthermore, the pair of large domain image obtaining parts and the image obtaining part may be integrally constructed.
  • in this case, the scan domain in the process of obtaining the absolute coordinate system may become somewhat smaller and accuracy may also decrease; however, in the process of obtaining images of parts of the scan domain, it is not necessary to photograph in an overlapping manner, so the number of scanning operations can be reduced.
  • the large domain image obtaining parts 130 and 132 disclosed in the eighth embodiment of the present invention can be modeled by two cameras facing one object, which can be modified according to the field of applications.
  • two cameras are arranged in parallel as shown in Fig. 27.
  • the variables in Fig. 27 are defined below.
  • A, B: image planes obtained by each camera
  • a method for obtaining the position of a point by using the stereo images is defined in Formulas 15 and 16.
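  • Formulas 15 and 16 themselves did not survive in this text. For two identical cameras in the parallel arrangement of Fig. 27 they conventionally take the standard triangulation form below; this is a reconstruction under that assumption, with f the focal length, b the baseline between the two lens centers, and (x_A, y_A), (x_B, y_B) the projections of a point P = (X, Y, Z) onto the image planes A and B:

      $$Z = \frac{f\,b}{x_A - x_B}, \qquad X = \frac{x_A\,Z}{f}, \qquad Y = \frac{y_A\,Z}{f}$$

  • the denominator x_A - x_B is the disparity; in the ideal parallel setup y_A = y_B, and the depth Z follows from the disparity by similar triangles.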
  • in the ninth embodiment, a plurality of projectors, image obtaining parts and marker generators are arranged around an object, so that the projectors and image obtaining parts need not be moved to obtain 2D images and 3D scan data in relation to the entire scan domain of the object; one scanning operation makes it possible to obtain the 2D images and 3D scan data, thereby simplifying the job and shortening the time consumed.
  • Fig. 28 is a schematic drawing illustrating the construction of an apparatus for automatically arranging 3D scan data using optical markers according to the ninth embodiment of the present invention, wherein the apparatus comprises N number of marker generators 142, M number of pattern projectors 146, L number of image obtaining parts 148, an image input part 150, a project controller 152, a marker blinking controller 154, a microprocessor 156 and a buffer 158.
  • the N number of marker generators 142, intended to project markers recognizable by the image obtaining parts 148 on the surface of an object, are provided with a plurality of marker output parts 144 for projecting a plurality of optical markers on the entire surface of the object 10 in arbitrary scanning directions.
  • the N number of marker generators 142 are directed to the object 10, are spaced apart from each other at a predetermined interval, and the markers are arranged so as to cover the entire object.
  • the M number of pattern projectors 146 project predetermined patterns or laser striped patterns on the surface of the object 10 for obtaining 3D scan data.
  • LCD projectors may be utilized to project space-coded beams or laser beams on the surface of the object 10, thereby obtaining 3D scan data via the image obtaining part 148.
  • the M number of pattern projectors 146 are directed to the object 10, are spaced apart from each other at a predetermined interval, and the space-coded beams projected from each pattern projector 146 are made to cover the entire domain of the object 10.
  • the L number of image obtaining parts 148 which comprise image sensors capable of receiving images, such as CCD cameras or CMOS cameras, photograph and obtain images of the object 10. It is preferable that each of the L number of image obtaining parts 148 is integrated with an individual pattern projector 146 instead of being separated.
  • the L number of image obtaining parts 148 are directed to the object 10, are spaced apart from each other at a predetermined interval, and the scanning domains of the image obtaining parts 148 cover the entire domain of the object 10.
  • the image input part 150 receives each image data obtained from the L number of image obtaining parts 148, and the project controller 152 controls the transfer speed and transfer directions of pattern films and the blinking cycle of light sources for projecting the pattern films.
  • the marker blinking controller 154 periodically blinks the optical markers from the N number of marker generators 142 according to the control of the microprocessor 156.
  • the microprocessor 156 computes the 3D positions of the markers in each domain in reference to the 2D image data and 3D scan data obtained from the L number of image obtaining parts 148, respectively, searches corresponding markers in every overlapping domain in reference to the 3D positions of the markers, and calculates translation matrices by using the corresponding markers. As a result, the microprocessor 156 arranges each of the 3D scan data by the translation matrices.
  • the buffer 158 stores data necessary for computing and the resultant data thereof.
  • the object 10 is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are arranged around the object 10. Then, the microprocessor 156 controls the marker blinking controller 154 to turn on the plural marker output parts 144 each equipped at the N number of marker generators 142, thereby allowing the plural markers to be arbitrarily projected on the surface of the object 10 (S110).
  • the microprocessor 156 receives the L number of 2D image data obtained from the L number of image obtaining parts 148 via the image input part 150 (S111).
  • the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object (S112). Under this condition, when the same domain of the object 10 as mentioned above is photographed by each of the L number of image obtaining parts 148 to obtain L number of 2D images not including the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150 (S113).
  • the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10 from the M number of pattern projectors 146.
  • the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S114). Under such circumstances, the microprocessor 156 calculates the 2D positions of the markers by image-processing the 2D image data including the optical markers and the 2D image data not including the markers (S115).
  • the microprocessor 156 computes the 3D positions of the markers from the 2D positions of the markers and the 3D scan data (S116). That is, the 3D positions of the markers can be obtained by estimating intersection points where straight lines connecting the camera lens centers of the L number of image obtaining parts 148 and the positions of arbitrary markers in the 2D image data intersect the 3D scan data.
  • the microprocessor 156 compares the 3D positions of the markers for each of the L number of 3D scan data to search corresponding markers (S117).
  • the microprocessor 156 calculates the translation matrices for translating the markers in the current 3D scan data (S118).
  • one of the L number of 3D scan data is set up as a reference coordinate system, and the current 3D scan data are moved according to the obtained translation matrices for arrangement (S119).
  • in the tenth embodiment, a reference object whose dimensions are already known is placed in a position suitable for scanning, and the N number of marker generators 142, M number of pattern projectors 146 and L number of image obtaining parts 148 are respectively arranged around the reference object.
  • the reference object may be specially manufactured for calibration, or may be an actual object if dimensions thereof are already known.
  • the microprocessor 156 controls the marker blinking controller 154 to turn on the plural marker output parts 144 equipped at the N number of marker generators 142, thereby allowing the plural markers to be arbitrarily projected on the surface of the reference object (S120).
  • the microprocessor 156 carries out calibration for seeking a correlation between the reference object and the L number of image obtaining parts 148 (S121). The detailed operating process is described below.
  • in step S121, optical markers from the N number of marker generators 142 are projected on the surface of the reference object, and when the scan domains of the reference object are photographed by the L number of image obtaining parts 148 to obtain 2D images containing the optical markers, the microprocessor 156 receives the L number of 2D image data via the image input part 150.
  • the microprocessor 156 controls the project controller 152 to operate the M number of pattern projectors 146. Therefore, predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the reference object from the M number of pattern projectors 146.
  • the microprocessor 156 receives the L number of 3D scan data via the image input part 150.
  • the microprocessor 156 estimates the 3D positions of the markers in the L number of 3D scan data by calculating intersection points where straight lines connecting each center of the cameras equipped in the L number of image obtaining parts 148 with the markers included in each 2D image data intersect the 3D scan data, respectively.
  • the microprocessor 156 compares the 3D positions of the markers for each of the L number of 3D scan data to search corresponding markers, and calculates translation matrices from the relations of the corresponding markers. Then, the microprocessor 156 registers the obtained translation matrices in the register of the buffer 158, whereby the calibration of S121 is completed.
  • the reference object is removed and the object 10 is placed where the reference object has been, and the microprocessor 156 controls the marker blinking controller 154 to turn off the N number of marker generators 142, thereby preventing the optical markers from being projected on the surface of the object 10 (S122).
  • the microprocessor 156 receives the L number of 2D image data via the image input part 150 when the scan domains of the object 10 are photographed by the L number of image obtaining parts 148 to obtain the L number of 2D images not including the optical markers (S123).
  • the microprocessor 156 controls the project controller 152 to activate the M number of pattern projectors 146, whereby predetermined patterns (for example, patterns having stripes with different gaps therebetween or a multiple striped pattern) are projected on the surface of the object 10 from the M number of pattern projectors 146.
  • the microprocessor 156 receives the L number of 3D scan data via the image input part 150 (S124).
  • the microprocessor 156 reads out the translation matrices stored in the register of the buffer 158, sets one of the L number of 3D scan data as a reference, and moves the L-1 number of 3D scan data by the translation matrices (S125).
  • in subsequent scans, steps S121 to S123 can be omitted; since the 3D scan data can be arranged by the translation matrices stored in the register of the buffer 158, the scanning time can be reduced. However, if needed, the calibration of S121 to S123 may be executed for every scan, and this can be easily changed, modified or altered according to the operator's intention or the system configuration.
  • the eleventh embodiment provides marker generators and peripherals different from those used in the first to tenth embodiments of the present invention.
  • a marker generator of the eleventh embodiment, as shown in Fig. 31, includes a plurality of light sources of the X-axis 160, a blinking controller 162, a polygon mirror 164 rotating about the X-axis, a rotary driving part 166, a rotary mechanism 168, a plurality of light sources of the Y-axis 170, a blinking controller 172, a polygon mirror 174 rotating about the Y-axis, a rotary driving part 176 and a rotary mechanism 178.
  • the plurality of light sources of the X-axis 160 generate beams having excellent straight traveling properties such as laser beams to be emitted to the reflecting surface of the polygon mirror 164.
  • the light sources of the X-axis may be, for example, laser pointers.
  • the blinking controller 162 blinks each light source 160 according to the control of a microprocessor (not shown).
  • the polygon mirror 164 equipped with a plurality of reflecting surfaces is rotated by the rotary mechanism 168 in order to reflect the plural beams, thereby projecting the plural beams on the surface of an object (OB).
  • the rotary driving part 166 drives the rotary mechanism 168 to rotate the polygon mirror 164 in one direction in response to the control of a microprocessor.
  • the plurality of light sources of Y-axis 170 generate beams of excellent straight traveling properties such as laser beams to be emitted to the reflecting surface of the polygon mirror 174.
  • the light sources may be, for example, laser pointers.
  • the blinking controller 172 blinks each light source according to the control of a microprocessor (not shown).
  • the polygon mirror 174, equipped with a plurality of reflecting surfaces, is rotated by the rotary mechanism 178 in order to reflect the plurality of beams, thereby projecting the plural beams on the surface of the object (OB).
  • the rotary driving part 176 drives the rotary mechanism 178 to rotate the polygon mirror 174 in one direction in response to the control of a microprocessor.
  • a driving power generated by the rotary driving part 166 and the rotary driving part 176 is applied to the rotary mechanism 168 and the rotary mechanism 178 according to a control signal of a microprocessor.
  • the rotary mechanism 168 and the rotary mechanism 178 respectively driven by the driving power rotate the polygon mirrors 164 and 174.
  • when the light sources 160 and 170 are lit by the blinking controllers 162 and 172 in response to the control signal of the microprocessor, the beams created by the plurality of light sources 160 and 170 are emitted to the reflecting surfaces of the polygon mirrors 164 and 174. Then, the beams are projected on the surface of the object (OB).
  • the polygon mirrors 164 and 174 are rotated so that the angles of their reflecting surfaces vary. Therefore, lines of a plural number of beams are formed on the surface of the object (OB) in the directions of the X-axis and the Y-axis, and the intersecting points where the X-axis lines and the Y-axis lines intersect become the optical markers (RM).
  • OB: object
  • RM: optical markers
  • when m number of light sources are provided along the X-axis and n number along the Y-axis, m*n number of intersecting points can be formed on the surface of the object (OB), and these m*n intersecting points become respective optical markers (RM). Therefore, it is possible to employ a small number of light sources to generate a relatively large number of optical markers.
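  • as an illustrative check of this count (the specific numbers are hypothetical, not from the patent), m light sources on the X-axis and n on the Y-axis require only m + n sources yet yield m * n markers:

      # Hypothetical counts: 8 X-axis and 8 Y-axis laser lines.
      m, n = 8, 8
      markers = [(i, j) for i in range(m) for j in range(n)]  # one marker per crossing
      print(len(markers))  # 64 markers from only m + n = 16 light sources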
  • as described above, the present invention provides an apparatus and method for automatically arranging 3D scan data obtained from different angles and positions by using optical markers.
  • optical markers having no physical volume are used to find the relative positions of mutually different scan data, so that scan data are not lost or damaged even in portions where the markers exist.
  • there is no need to place markers on, or remove markers from, an object in order to scan it, thereby providing convenience and safety in scanning and preventing the damage to the object that can result from attaching and removing markers.
  • in addition, since the optical markers are projected rather than physically attached, they can be used repeatedly without limit.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to an apparatus and method for automatically arranging 3D scan data by using optical markers, non-contact markers being preferred so that the 3D data can be arranged automatically without damaging the parts being scanned. The apparatus, which uses optical markers to automatically arrange 3D scan data obtained by photographing an object at various angles, comprises several modules. A marker generator projects a plurality of optical markers onto a surface of the object. A pattern projector projects patterns onto the surface of the object so as to produce 3D scan data. An image obtaining part obtains, on the one hand, 2D digital images of the object with the markers projected on its surface and, on the other hand, 3D scan data of the object by means of the patterns projected on its surface. Finally, a controller computes the 3D positions of the markers from the relation between the 2D digital image and the 3D scan data, and then computes the relative positions of the 3D scan data on the basis of the 3D positions of the markers.
PCT/KR2003/001087 2002-07-25 2003-06-03 Procede et appareil pour agencer automatiquement les donnees de capture optique 3d au moyen de marques optiques WO2004011876A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2004524349A JP4226550B2 (ja) 2002-07-25 2003-06-03 光学式マーカーを用いた三次元測定データ自動整列装置及びその方法
AU2003241194A AU2003241194A1 (en) 2002-07-25 2003-06-03 Apparatus and method for automatically arranging three dimensional scan data using optical marker

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
KR10-2002-0043830 2002-07-25
KR20020043830 2002-07-25
KR10-2003-0022624 2003-04-10
KR10-2003-0022624A KR100502560B1 (ko) 2002-07-25 2003-04-10 광학식 마커를 이용한 3차원 측정 데이터 자동 정렬장치및 그 방법

Publications (1)

Publication Number Publication Date
WO2004011876A1 true WO2004011876A1 (fr) 2004-02-05

Family

ID=31190416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2003/001087 WO2004011876A1 (fr) 2002-07-25 2003-06-03 Procede et appareil pour agencer automatiquement les donnees de capture optique 3d au moyen de marques optiques

Country Status (4)

Country Link
JP (1) JP4226550B2 (fr)
CN (1) CN1300551C (fr)
AU (1) AU2003241194A1 (fr)
WO (1) WO2004011876A1 (fr)

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008500524A (ja) * 2004-05-25 2008-01-10 アンシディス 表面歪み測定装置
WO2008017878A2 (fr) * 2006-08-11 2008-02-14 The University Of Leeds Imagerie optique d'objets physiques
WO2009046519A1 (fr) * 2007-10-11 2009-04-16 Hydro-Quebec Système et méthode de cartographie tridimensionnelle d'une surface structurelle
US8170329B2 (en) 2008-07-18 2012-05-01 Fuji Xerox Co., Ltd. Position measuring system, position measuring method and computer readable medium
EP2568253A1 (fr) * 2010-05-07 2013-03-13 Shenzhen Taishan Online Technology Co., Ltd. Procédé et système de mesure de lumière structurée
EP2574876A1 (fr) 2011-09-30 2013-04-03 Steinbichler Optotechnik GmbH Procédé et dispositif destinés à déterminer les coordonnées 3D d'un objet
US8434874B2 (en) 2006-05-30 2013-05-07 Panasonic Corporation Pattern projection light source and compound-eye distance measurement apparatus
WO2014014635A1 (fr) * 2012-07-20 2014-01-23 Google Inc. Systèmes et procédés d'acquisition d'image
US20140375794A1 (en) * 2013-06-25 2014-12-25 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
CN105232161A (zh) * 2015-10-16 2016-01-13 北京天智航医疗科技股份有限公司 一种手术机器人标志点识别定位方法
WO2016068598A1 (fr) * 2014-10-30 2016-05-06 한국생산기술연구원 Ensemble tête multivoie pour appareil de modélisation en trois dimension, ayant un miroir polygonal tournant dans une seule direction et appareil de modélisation en trois dimensions utilisant ce dernier
DE102016120026A1 (de) * 2015-10-22 2017-04-27 Canon Kabushiki Kaisha Messvorrichtung und Verfahren, Programm, Produktherstellungsverfahren, Kalibrierungsmarkierungselement, Verarbeitungsvorrichtung und Verarbeitungssystem
DE102009032771B4 (de) * 2009-07-10 2017-06-29 Gom Gmbh Messeinrichtung und Verfahren zum dreidimensionalen optischen Vermessen von Objekten
CN108225218A (zh) * 2018-02-07 2018-06-29 苏州镭图光电科技有限公司 基于光学微机电系统的三维扫描成像方法及成像装置
DE102013110667B4 (de) 2013-09-26 2018-08-16 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren zum bildgebenden zerstörungsfreien Prüfen von dreidimensionalen Werkstücken und Vorrichtung zur Durchführung eines derartigen Verfahrens
DE102017109854A1 (de) * 2017-05-08 2018-11-08 Wobben Properties Gmbh Verfahren zur Referenzierung mehrerer Sensoreinheiten und zugehörige Messeinrichtung
US11077557B2 (en) 2010-05-14 2021-08-03 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
US20220122282A1 (en) * 2020-10-19 2022-04-21 Robert Bosch Gmbh Method for generating an optical marker, method for recognizing an optical marker, and marker device that includes the optical marker

Families Citing this family (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070057946A1 (en) * 2003-07-24 2007-03-15 Dan Albeck Method and system for the three-dimensional surface reconstruction of an object
US8111904B2 (en) 2005-10-07 2012-02-07 Cognex Technology And Investment Corp. Methods and apparatus for practical 3D vision system
WO2008056427A1 (fr) * 2006-11-08 2008-05-15 Techno Dream 21 Co., Ltd. Procédé de mesure de forme tridimensionnelle et dispositif correspondant
KR20080043047A (ko) * 2006-11-13 2008-05-16 주식회사 고영테크놀러지 새도우 모아레를 이용한 3차원형상 측정장치
US8126260B2 (en) 2007-05-29 2012-02-28 Cognex Corporation System and method for locating a three-dimensional object using machine vision
JP5322206B2 (ja) * 2008-05-07 2013-10-23 国立大学法人 香川大学 3次元形状の計測方法および装置
US9734419B1 (en) 2008-12-30 2017-08-15 Cognex Corporation System and method for validating camera calibration in a vision system
JP5435994B2 (ja) * 2009-03-18 2014-03-05 本田技研工業株式会社 非接触形状測定装置
US9533418B2 (en) 2009-05-29 2017-01-03 Cognex Corporation Methods and apparatus for practical 3D vision system
JP5375479B2 (ja) * 2009-09-17 2013-12-25 コニカミノルタ株式会社 三次元測定システムおよび三次元測定方法
CN101813461B (zh) * 2010-04-07 2011-06-22 河北工业大学 基于复合彩色条纹投影的绝对相位测量方法
CN102346011A (zh) * 2010-07-29 2012-02-08 上海通用汽车有限公司 测量工具和测量方法
US9124873B2 (en) 2010-12-08 2015-09-01 Cognex Corporation System and method for finding correspondence between cameras in a three-dimensional vision system
DE102013203399A1 (de) * 2013-02-28 2014-08-28 Siemens Aktiengesellschaft Verfahren und Projektionsvorrichtung zur Markierung einer Oberfläche
CN104315975A (zh) * 2014-10-22 2015-01-28 合肥斯科尔智能科技有限公司 一种线性三维高精度扫描方法
CN104359405B (zh) * 2014-11-27 2017-11-07 上海集成电路研发中心有限公司 三维扫描装置
KR101788131B1 (ko) * 2016-01-18 2017-10-19 한화첨단소재 주식회사 전기자동차의 열경화성수지 조성물 시트의 금형 안착위치를 표시하는 장치
US20180108178A1 (en) * 2016-10-13 2018-04-19 General Electric Company System and method for measurement based quality inspection
CN107169964B (zh) * 2017-06-08 2020-11-03 广东嘉铭智能科技有限公司 一种检测弧面反光镜片表面缺陷的方法和装置
CN112461138B (zh) * 2020-11-18 2022-06-28 苏州迈之升电子科技有限公司 一种交叉扫描测量方法及其测量光栅和应用

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6239868B1 (en) * 1996-01-02 2001-05-29 Lj Laboratories, L.L.C. Apparatus and method for measuring optical characteristics of an object

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5276613A (en) * 1988-12-14 1994-01-04 Etienne Schlumberger Process and device for coordinating several images of the same object
US5227985A (en) * 1991-08-19 1993-07-13 University Of Maryland Computer vision system for position monitoring in three dimensions using non-coplanar light sources attached to a monitored object
RU2123718C1 (ru) * 1996-09-27 1998-12-20 Кузин Виктор Алексеевич Способ ввода информации в компьютер

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6239868B1 (en) * 1996-01-02 2001-05-29 Lj Laboratories, L.L.C. Apparatus and method for measuring optical characteristics of an object
US6417917B1 (en) * 1996-01-02 2002-07-09 Lj Laboratories, Llc Apparatus and method for measuring optical characteristics of an object

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008500524A (ja) * 2004-05-25 2008-01-10 アンシディス 表面歪み測定装置
JP4739330B2 (ja) * 2004-05-25 2011-08-03 アンシディス 表面歪み測定装置
US8434874B2 (en) 2006-05-30 2013-05-07 Panasonic Corporation Pattern projection light source and compound-eye distance measurement apparatus
WO2008017878A2 (fr) * 2006-08-11 2008-02-14 The University Of Leeds Imagerie optique d'objets physiques
WO2008017878A3 (fr) * 2006-08-11 2008-04-03 Univ Heriot Watt Imagerie optique d'objets physiques
WO2009046519A1 (fr) * 2007-10-11 2009-04-16 Hydro-Quebec Système et méthode de cartographie tridimensionnelle d'une surface structurelle
US8462208B2 (en) 2007-10-11 2013-06-11 Hydro-Quebec System and method for tridimensional cartography of a structural surface
US8170329B2 (en) 2008-07-18 2012-05-01 Fuji Xerox Co., Ltd. Position measuring system, position measuring method and computer readable medium
DE102009032771B4 (de) * 2009-07-10 2017-06-29 Gom Gmbh Messeinrichtung und Verfahren zum dreidimensionalen optischen Vermessen von Objekten
EP2568253A1 (fr) * 2010-05-07 2013-03-13 Shenzhen Taishan Online Technology Co., Ltd. Procédé et système de mesure de lumière structurée
JP2013525821A (ja) * 2010-05-07 2013-06-20 深▲せん▼泰山在線科技有限公司 構造化光測定方法及びシステム
EP2568253A4 (fr) * 2010-05-07 2013-10-02 Shenzhen Taishan Online Tech Procédé et système de mesure de lumière structurée
US11077557B2 (en) 2010-05-14 2021-08-03 Cognex Corporation System and method for robust calibration between a machine vision system and a robot
DE102011114674B4 (de) * 2011-09-30 2015-01-29 Steinbichler Optotechnik Gmbh Verfahren und Vorrichtung zum Bestimmen der 3D-Koordinaten eines Objekts
US20130271573A1 (en) * 2011-09-30 2013-10-17 Steinbichler Optotechnik Gmbh Method and apparatus for determining the 3d coordinates of an object
EP2574876A1 (fr) 2011-09-30 2013-04-03 Steinbichler Optotechnik GmbH Procédé et dispositif destinés à déterminer les coordonnées 3D d'un objet
DE102011114674C5 (de) 2011-09-30 2020-05-28 Steinbichler Optotechnik Gmbh Verfahren und Vorrichtung zum Bestimmen der 3D-Koordinaten eines Objekts
DE102011114674A1 (de) * 2011-09-30 2013-04-04 Steinbichler Optotechnik Gmbh Verfahren und Vorrichtung zum Bestimmen der 3D-Koordinaten eines Objekts
EP2574876B1 (fr) 2011-09-30 2017-11-15 Carl Zeiss Optotechnik GmbH Procédé et dispositif destinés à déterminer les coordonnées 3D d'un objet
DE202012013561U1 (de) 2011-09-30 2017-12-21 Carl Zeiss Optotechnik GmbH Vorrichtung zur Bestimmung der 3D-Koordinaten eines Objekts
US10200670B2 (en) 2011-09-30 2019-02-05 Carl Zeiss Optotechnik GmbH Method and apparatus for determining the 3D coordinates of an object
US9163938B2 (en) 2012-07-20 2015-10-20 Google Inc. Systems and methods for image acquisition
WO2014014635A1 (fr) * 2012-07-20 2014-01-23 Google Inc. Systèmes et procédés d'acquisition d'image
US20140375794A1 (en) * 2013-06-25 2014-12-25 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
US9789462B2 (en) * 2013-06-25 2017-10-17 The Boeing Company Apparatuses and methods for accurate structure marking and marking-assisted structure locating
DE102013110667B4 (de) 2013-09-26 2018-08-16 Deutsches Zentrum für Luft- und Raumfahrt e.V. Verfahren zum bildgebenden zerstörungsfreien Prüfen von dreidimensionalen Werkstücken und Vorrichtung zur Durchführung eines derartigen Verfahrens
WO2016068598A1 (fr) * 2014-10-30 2016-05-06 한국생산기술연구원 Ensemble tête multivoie pour appareil de modélisation en trois dimension, ayant un miroir polygonal tournant dans une seule direction et appareil de modélisation en trois dimensions utilisant ce dernier
US10810788B2 (en) 2014-10-30 2020-10-20 Korea Institute Of Industrial Technology Multichannel head assembly for three-dimensional modeling apparatus, having polygon mirror rotating in single direction, and three-dimensional modeling apparatus using same
CN105232161A (zh) * 2015-10-16 2016-01-13 北京天智航医疗科技股份有限公司 一种手术机器人标志点识别定位方法
DE102016120026B4 (de) * 2015-10-22 2019-01-03 Canon Kabushiki Kaisha Messvorrichtung und Verfahren, Programm, Produktherstellungsverfahren, Kalibrierungsmarkierungselement, Verarbeitungsvorrichtung und Verarbeitungssystem
DE102016120026A1 (de) * 2015-10-22 2017-04-27 Canon Kabushiki Kaisha Messvorrichtung und Verfahren, Programm, Produktherstellungsverfahren, Kalibrierungsmarkierungselement, Verarbeitungsvorrichtung und Verarbeitungssystem
DE102017109854A1 (de) * 2017-05-08 2018-11-08 Wobben Properties Gmbh Verfahren zur Referenzierung mehrerer Sensoreinheiten und zugehörige Messeinrichtung
WO2018206527A1 (fr) 2017-05-08 2018-11-15 Wobben Properties Gmbh Procédé pour référencer plusieurs unités de détection et dispositif de mesure associé
CN108225218A (zh) * 2018-02-07 2018-06-29 苏州镭图光电科技有限公司 基于光学微机电系统的三维扫描成像方法及成像装置
US20220122282A1 (en) * 2020-10-19 2022-04-21 Robert Bosch Gmbh Method for generating an optical marker, method for recognizing an optical marker, and marker device that includes the optical marker

Also Published As

Publication number Publication date
CN1672013A (zh) 2005-09-21
JP4226550B2 (ja) 2009-02-18
CN1300551C (zh) 2007-02-14
AU2003241194A1 (en) 2004-02-16
JP2005534026A (ja) 2005-11-10

Similar Documents

Publication Publication Date Title
WO2004011876A1 (fr) Procede et appareil pour agencer automatiquement les donnees de capture optique 3d au moyen de marques optiques
US9672630B2 (en) Contour line measurement apparatus and robot system
US7456842B2 (en) Color edge based system and method for determination of 3D surface topology
CN100518488C (zh) 具有元件布局检查功能的抓取式设备
US7423666B2 (en) Image pickup system employing a three-dimensional reference object
JP2919284B2 (ja) 物体認識方法
US20070091174A1 (en) Projection device for three-dimensional measurement, and three-dimensional measurement system
US20060044546A1 (en) Ranging apparatus
CN101013028A (zh) 图像处理方法以及图像处理装置
KR100502560B1 (ko) 광학식 마커를 이용한 3차원 측정 데이터 자동 정렬장치및 그 방법
JPWO2008026722A1 (ja) 3次元モデルデータ生成方法及び3次元モデルデータ生成装置
WO2006135040A1 (fr) Dispositif et procédé de traitement d'image effectuant une mesure tridimensionnelle
CN108965690A (zh) 图像处理系统、图像处理装置及计算机可读存储介质
CN113954085A (zh) 一种基于双目视觉与线激光传感数据融合的焊接机器人智能定位与控制方法
EP1286309A2 (fr) Procédé automatique de planification de capteurs par CAO
JPH11166818A (ja) 三次元形状計測装置の校正方法及び校正装置
JPS60200111A (ja) 3次元物体認識装置
US6505148B1 (en) Method for combining the computer models of two surfaces in 3-D space
US11763473B2 (en) Multi-line laser three-dimensional imaging method and system based on random lattice
JPH09212643A (ja) 三次元物体認識方法及びその装置
CN100518487C (zh) 一种获取抓取式设备中多个图像的方法
JPH09329440A (ja) 複数枚の画像の各計測点の対応づけ方法
El-Hakim A hierarchical approach to stereo vision
CN114494316A (zh) 角点标记方法、参数标定方法、介质及电子设备
JPH0820207B2 (ja) 光学式3次元位置計測方法

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NI NO NZ OM PH PL PT RO RU SC SD SE SG SK SL TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2004524349

Country of ref document: JP

Ref document number: 20038178915

Country of ref document: CN

122 Ep: pct application non-entry in european phase