WO2011145285A1 - Image processing device, image processing method and program - Google Patents

Image processing device, image processing method and program

Info

Publication number
WO2011145285A1
Authority
WO
WIPO (PCT)
Prior art keywords
pattern
dimensional
correspondence
image processing
curve
Prior art date
Application number
PCT/JP2011/002561
Other languages
French (fr)
Japanese (ja)
Inventor
Hiroshi Kawasaki
Ryo Furukawa
Ryusuke Sagawa
Yasushi Yagi
Original Assignee
Techno Dream 21 Co., Ltd. (有限会社テクノドリーム二十一)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Techno Dream 21 Co., Ltd.
Publication of WO2011145285A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • G01B11/2522Projection by scanning of the object the position of the object changing and being recorded
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Definitions

  • the present invention relates to an image processing device, an image processing method, and a program, and more particularly, to an image processing device, an image processing method, and a program for restoring a three-dimensional shape of an object displayed in an input two-dimensional image.
  • the active measurement method has been considered as a main solution.
  • a method using a projector capable of acquiring a wide range in a short time is suitable.
  • Since the time encoding method projects a plurality of patterns, it is essentially unsuitable for measuring a moving object at a high frame rate. Also, when a plurality of projectors are used to cover a wide range, it is difficult to synchronize them.
  • In the spatial encoding method, the projection pattern of each projector is fixed and only a single shot is required, so it is suitable for measuring a moving object; however, because the pattern itself is complicated, problems such as interference with the surface color and shape of the object occur, and image processing is often difficult.
  • In practical three-dimensional measurement, active methods that project light from the sensor side have been widely used. In particular, many methods using a video projector have been proposed for efficiency (Non-Patent Document 4), (Non-Patent Document 5). For active methods using a projector, two types of approaches have been proposed: the time encoding method and the spatial encoding method.
  • Non-Patent Document 6 describes a system combining pattern projection and stereo vision based on the phase shift method.
  • Non-Patent Document 7 describes a method for restoring shape by identifying the high-speed time-series patterns generated by a DLP projector.
  • Another method for dealing with the above-described problems is described in Patent Document 1.
  • Hiroshi Kawasaki, Ryo Furukawa, Ryusuke Sagawa, and Yasushi Yagi. Dynamic scene shape reconstruction using a single structured light pattern. In CVPR, pages 1-8, June 23-28, 2008. Ryusuke Sagawa, Yuichi Ota, Yasushi Yagi, Ryo Furukawa, Naoki Asada, and Hiroshi Kawasaki. Dense 3D reconstruction method using a single pattern for fast moving object. In ICCV, 2009. Ali Osman Ulusoy, Fatih Calakli, and Gabriel Taubin. One-shot scanning using De Bruijn spaced grids. In The 7th IEEE Conf. 3DIM, 2009. J. Batlle, E. M.
  • In the techniques described in Non-Patent Document 4 and Non-Patent Document 5, when the time encoding method is used, stable, high-precision restoration is possible in static scenes, but because a plurality of different patterns must be projected, it is inherently difficult to use for dynamic scenes.
  • The methods described in Non-Patent Document 6 to Non-Patent Document 8 can acquire depth information at a high frame rate, but because a time-series code must be recognized, the motion of the observed object in the image must remain below a certain speed. Furthermore, high-precision synchronization of the equipment is required.
  • The methods described in Non-Patent Document 9 to Non-Patent Document 13 often use complicated patterns, so they are affected by the texture of the observed object, or spatial pattern information cannot be identified at depth edges and the error becomes large. Furthermore, when patterns are projected from a plurality of projectors onto the same target in order to measure a wide range, the patterns interfere with each other and are not easy to separate.
  • The present invention has been made to solve the various problems described above. Its main object is to provide an image processing apparatus, an image processing method, and a program that measure the shape of a moving object at high density and a high frame rate and appropriately restore the three-dimensional shape of the object shown in the resulting two-dimensional image.
  • The present invention is an image processing device that restores a three-dimensional shape from a two-dimensional image. The two-dimensional image is acquired by first light projecting means for projecting a first pattern onto an object existing in a three-dimensional space, second light projecting means for projecting onto the object a second pattern that intersects the first pattern on the surface of the object, and photographing means for photographing the first pattern light and the second pattern light reflected by the object to obtain the two-dimensional image. The device comprises: a first calculation unit that detects a first curve, which is the first pattern projected onto the object in the two-dimensional image, and a second curve, which is the second pattern projected onto the object in the two-dimensional image, and calculates intersection coordinates, which are the coordinates of the intersections of the first curve and the second curve; a second calculation unit that determines, from the intersection coordinates and the parameters of the first light projecting means, the second light projecting means, and the photographing means, a first correspondence between the first curve and the first pattern and a second correspondence between the second curve and the second pattern; and a third calculation unit that restores the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portions of the object irradiated with the first pattern light and the second pattern light.
  • Furthermore, the present invention is an image processing method for restoring a three-dimensional shape from a two-dimensional image, the two-dimensional image being acquired as described above. The method comprises: a first step of detecting the first curve and the second curve in the two-dimensional image and calculating the intersection coordinates of the two curves; a second step of determining, from the intersection coordinates and the parameters of the first light projecting means, the second light projecting means, and the photographing means, the first correspondence between the first curve and the first pattern and the second correspondence between the second curve and the second pattern; and a third step of restoring the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portions of the object irradiated with the first pattern light and the second pattern light.
  • Furthermore, the present invention is a program that causes an image processing apparatus to execute a function of restoring a three-dimensional shape from a two-dimensional image acquired as described above. The program causes the apparatus to execute: a first function of detecting the first curve and the second curve in the two-dimensional image and calculating the intersection coordinates of the two curves; a second function of determining, from the intersection coordinates and the parameters of the first light projecting means, the second light projecting means, and the photographing means, the first correspondence between the first curve and the first pattern and the second correspondence between the second curve and the second pattern; and a third function of restoring the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portions of the object irradiated with the first pattern light and the second pattern light.
  • The present invention can obtain an unambiguous solution using only information on the intersections of the patterns projected onto the object. Therefore, since pattern-density information need not be used, both the vertical and horizontal patterns can be made dense. In the background art, detection of the connection relations of the grid pattern becomes difficult, particularly when restoring thin areas, and restoration may fail; the present invention greatly reduces this problem.
  • To improve the accuracy of the solution, a technique using line IDs based on a de Bruijn sequence is adopted, together with a technique using adjacency information of the detected lines. In this way, dense one-shot shape measurement using a plurality of projectors and a single camera is realized, a linear solution for one-shot restoration using a grid pattern is realized, and moving objects can be measured at a high frame rate.
  • FIG. 1(A) shows the state in which a two-dimensional image of an object is acquired using the image processing apparatus of the present invention, and FIG. 1(B) shows the configuration of the image processing apparatus.
  • FIG. 2 is a flowchart showing the image processing method of the present invention.
  • FIG. 3 shows the positional relationship between the camera and the projectors included in the image processing apparatus of the present invention.
  • FIG. 4 shows the pattern positions on the image plane of a projector in the image processing apparatus of the present invention.
  • FIG. 5 shows an experimental scene using the image processing apparatus of the present invention.
  • FIG. 6 shows the setup of the evaluation experiment: (A) an input image, (B) a top view of the projector arrangement, and (C) a side view.
  • FIG. 7 shows the restoration results of three methods displayed in the same coordinate system: (A) the result of method A, which uses only the grid-point constraints; (B) the result of method B, which adds adjacency constraints; and (C) the result of method C, which uses adjacency information and line IDs.
  • FIG. 8 shows an evaluation experiment for the case where the distance between the axes of the two pencils is short: (A) an input image, (B) the projector arrangement, and (C) a side view.
  • FIGS. 9(A) and 9(B) are tables showing the accuracy improvement obtained by using adjacency information and line IDs.
  • FIG. 10 shows the effect of the present invention: the images on the left are from the background art and those on the right from the present embodiment; the top row shows the input images, the middle row the grid graphs obtained by line detection, and the bottom row the three-dimensional shape restoration results.
  • FIG. 11 shows the effect of the present invention: the left column shows the input image, the center column the detected grid pattern, and the right column the three-dimensional shape restoration result.
  • FIG. 1A is a diagram illustrating an example of the entire apparatus according to the present embodiment
  • FIG. 1B is a diagram illustrating a configuration of the image processing apparatus 10.
  • In this embodiment, an image processing apparatus 10, a camera 28 (photographing means), a projector 24 (first light projecting means), and a projector 26 (second light projecting means) are used to photograph a two-dimensional image of an object 30 existing in three-dimensional space, and the three-dimensional shape of the object is restored from the photographed two-dimensional image.
  • The camera 28 and the projectors 24 and 26 have been calibrated; that is, the internal parameters of each device and the rigid-body transformation parameters between the devices are known.
  • the camera 28, the projectors 24 and 26, and the image processing apparatus 10 may be regarded as the image processing apparatus of this embodiment.
  • the projector 24 has a function of projecting light including a horizontal pattern onto the object 30 that is a subject.
  • an apparatus such as a video projector can be considered.
  • line laser projectors may be arranged or combined.
  • the laser light source may be irradiated in a plurality of directions by a prism or a beam splitter.
  • the projector 26 projects a vertical pattern on the object 30. Other features of the projector 26 are the same as those of the projector 24. Since a fixed pattern is projected from each projector, there is no need for synchronization between the camera 28 and the projectors 24 and 26.
  • the pattern light projected from the projector 24 and the pattern light projected from the projector 26 intersect on the surface of the object 30.
  • the intersecting angle is arbitrary.
  • The two projectors can in principle restore the shape with a monochromatic pattern, but a color pattern is used to improve accuracy and stability.
  • Specifically, stable detection and separation of the vertical and horizontal lines are realized by using line detection based on belief propagation and a color code based on a de Bruijn sequence, which is a periodic pattern.
  • This method is described in Non-Patent Document 10 and Non-Patent Document 13, for example.
  • This matter is also described in Joaquim Salvi, Joan Batlle, and El Mustapha Mouaddib. A robust-coded pattern projection for dynamic 3D scene measurement. Pattern Recognition, 19(11):1055-1065, 1998.
  • the camera 28 is a means for photographing the object 30 and employs a solid-state imaging device such as a CCD image sensor. Specifically, the camera 28 images pattern light in which the pattern projected from the projectors 24 and 26 is reflected by the object 30. Furthermore, the intersection point where these pattern lights intersect is also photographed by the camera 28. A two-dimensional image is taken by the camera 28, and data based on the two-dimensional image is subjected to image processing by the image processing apparatus 10, whereby the three-dimensional shape of the object 30 is restored.
  • the image processing apparatus 10 mainly includes an image processing unit 12, a control unit 14, an input unit 16, a storage unit 18, a display unit 20, and an operation unit 22.
  • the general function of the image processing apparatus 10 is to perform image processing on an input two-dimensional image, restore a three-dimensional shape, and output the image.
  • the embodied image processing apparatus 10 may be a computer such as a personal computer in which an application (program) for executing a predetermined function is installed, or dedicated to image processing configured to execute the predetermined function. It may be configured as a device.
  • the respective parts constituting the image processing apparatus 10 are electrically connected to each other via a bus.
  • the image processing unit 12 is a part that performs a main image processing function, and includes a first calculation unit 12A, a second calculation unit 12B, and a third calculation unit 12C.
  • The first calculation unit 12A has a function of detecting, from the captured two-dimensional image, the vertical curves of the vertical pattern and the horizontal curves of the horizontal pattern projected on the object 30, and of calculating the intersection coordinates of the two patterns.
  • The second calculation unit 12B has a function of determining the correspondences from the parameters of each projector, the parameters of the camera, and the intersection coordinates. Specifically, it determines the first correspondence between the vertical pattern projected from the projector 26 and the vertical curves detected from the image, and the second correspondence between the horizontal pattern projected from the projector 24 and the horizontal curves detected from the image.
  • the third calculator 12C has a function of determining the three-dimensional coordinates of the portion irradiated with both pattern lights from the obtained first correspondence, second correspondence, or both.
  • the number of calculation units included in the image processing unit 12 is not necessarily the above-described three, and the number of calculation units may be increased or decreased as necessary.
  • the control unit 14 is a part that controls the operation of the entire image processing apparatus 10 (the image processing unit 12, the input unit 16, the storage unit 18, and the display unit 20).
  • the input unit 16 is a part where information is input to the image processing apparatus 10 from the outside.
  • a moving image or a still image that is a two-dimensional image is input.
  • The storage unit 18 is, for example, a fixed storage disk such as an HDD (Hard Disk Drive), a removable storage disk such as a CD (Compact Disc) or DVD (Digital Versatile Disc), or a fixed or removable semiconductor memory.
  • The storage unit 18 stores the two-dimensional image before processing, the three-dimensional shape restored from the two-dimensional image, data during processing, and the like.
  • the storage unit 18 stores a program for causing the image processing apparatus 10 to execute the following image processing method. This program is called when the user operates the operation unit 22, and executes the functions of the respective parts described above so as to restore the three-dimensional shape data from the input two-dimensional image data.
  • the display unit 20 is, for example, a liquid crystal display, a CRT (Cathode Ray Tube), or a video projector, and displays an input two-dimensional image and a three-dimensional shape restored based on the two-dimensional image.
  • the operation unit 22 is, for example, a keyboard or a mouse.
  • the image processing apparatus 10 restores a three-dimensional shape from the two-dimensional image.
  • The image processing method of the present embodiment includes step S10 of capturing a two-dimensional image of the object, step S11 of extracting vertical pattern curves and horizontal pattern curves from the captured two-dimensional image, step S12 of detecting the intersections of the pattern curves, step S13 of deriving simultaneous equations for the pattern curves from the intersections, and step S14 of obtaining the three-dimensional shape along the pattern curves from the simultaneous equations.
  • deriving simultaneous equations in step S13 is equivalent to determining the correspondence between each pattern and each curve.
  • In step S10, pattern light is projected toward the object 30 from the projector 24 and the projector 26 using the system shown in FIG. 1(A).
  • the projector 24 projects lateral pattern light that extends linearly in the lateral direction onto the object 30.
  • Similarly, vertical pattern light extending linearly in the vertical direction is projected from the projector 26 onto the object 30.
  • both pattern lights are projected onto the surface of the object 30, and the intersection of both pattern lights exists on the surface of the object 30.
  • the object 30 in this state is photographed by the camera 28. Thereby, a two-dimensional image of the object 30 irradiated with each pattern light is obtained. Image data based on the two-dimensional image is transmitted to the image processing apparatus 10.
  • step S11 a vertical pattern curve and a horizontal pattern curve are extracted from the two-dimensional image.
  • the line detection method proposed in Non-Patent Document 2 is used to identify the vertical and horizontal patterns while using the same color.
  • The de Bruijn sequence used in this embodiment determines the line ID given to each line.
  • A de Bruijn sequence with q symbols and window length n is a sequence of length q^n in which, if a subsequence of length n is observed, its position in the sequence can be uniquely identified.
  • Therefore, if the projection pattern is coded with q symbols that can be distinguished in the image, the correspondence between the projection pattern and the observed pattern is uniquely determined by matching a subsequence of length n.
  • In Non-Patent Document 10 and Non-Patent Document 13, a pattern generated using large values of q and n is used so that the ID is globally unique.
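  • As an illustration of the uniqueness property described above, the following Python sketch generates a de Bruijn sequence by the standard Lyndon-word construction and recovers a position in the pattern from an observed window of length n. The alphabet size and window length shown are arbitrary examples, not the values used in the experiments.

```python
def de_bruijn(q, n):
    """Cyclic de Bruijn sequence over symbols 0..q-1 with window length n (length q**n)."""
    a = [0] * (q * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, q):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

q, n = 3, 3                       # e.g. 3 distinguishable colors, window length 3
s = de_bruijn(q, n)
assert len(s) == q ** n           # 27 symbols

# Every window of n consecutive symbols occurs exactly once (cyclically),
# so an observed window identifies its position in the projected pattern.
cyclic = s + s[:n - 1]
position = {tuple(cyclic[i:i + n]): i for i in range(len(s))}
assert len(position) == len(s)    # all windows distinct

observed = tuple(s[5:5 + n])      # a window read off the captured image
print(position[observed])         # -> 5
```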
  • the position of the intersection of the vertical pattern and the horizontal pattern is calculated with sub-pixel accuracy (step S12). Further, since adjacent intersections can be found by using the continuity of the detected lines, a grid graph in which the intersections are connected in a lattice shape is obtained as a result of line detection. By using the intersection coordinates and the grid graph that is the connection information of the intersections, three-dimensional reconstruction is realized by the method described below.
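  • A minimal sketch of assembling such a grid graph, assuming the line detector has already labeled each curve and each detected intersection is given as (vertical-curve ID, horizontal-curve ID, sub-pixel image point); this data layout and the coordinates are illustrative assumptions, not the patent's data structures.

```python
import numpy as np

# Detected grid points: (vertical curve id, horizontal curve id, sub-pixel coords).
# The coordinates are made-up placeholder values.
grid_points = [
    (0, 0, np.array([102.3, 210.8])),
    (0, 1, np.array([103.1, 250.2])),
    (1, 0, np.array([141.7, 211.5])),
    (1, 1, np.array([142.4, 251.0])),
]

# Two grid points are connected when they are consecutive intersections along
# the same detected curve; in this toy 2x2 example every pair sharing a curve
# ID is consecutive, so sharing an ID is used directly as the adjacency test.
edges = [(a, b)
         for a in range(len(grid_points))
         for b in range(a + 1, len(grid_points))
         if grid_points[a][0] == grid_points[b][0]      # same vertical curve
         or grid_points[a][1] == grid_points[b][1]]     # same horizontal curve

print(edges)   # lattice-shaped connectivity handed to the reconstruction step
```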
  • A plane in a specific pencil can be expressed with one parameter, in the same way that a point on a straight line is expressed with one parameter.
  • Non-Patent Document 2 proposed a method of projecting vertical and horizontal lines from a single projector and restoring the shape three-dimensionally. This method is also described in Ryo Furukawa, Hiroshi Kawasaki, Ryusuke Sagawa, and Yasushi Yagi. Shape from grid pattern based on coplanarity constraints for one-shot scanning. IPSJ Transactions on Computer Vision and Applications, 1:139-157, 2009.
  • the intersection of vertical and horizontal lines is called a grid point.
  • In that method, the axis shared by the vertical pattern planes and the axis shared by the horizontal pattern planes intersect at the optical center of the single projector.
  • As a result, the constant terms of the linear equations obtained from the grid-point information vanish, so the resulting simultaneous equations are always indeterminate.
  • Non-Patent Document 3 eliminates this indeterminacy by using the de Bruijn ID.
  • In the present embodiment, a unique solution is obtained by using a plurality of projectors and projecting the vertical planes and the horizontal planes from different projectors. That is, as shown in FIG. 3, the vertical pattern planes share a line l_v passing through the optical center of their projector (i.e., they form a pencil with axis l_v). Similarly, the horizontal pattern planes share a line l_h. Which pattern projected from a projector corresponds to each of the vertical and horizontal pattern curves observed by the camera is unknown; this information is recovered from the equations for the observed curves. When the projectors are positioned so that l_v and l_h are skew, the resulting equations become linear equations with constant terms, and in general the solution has no ambiguity. For this reason, the solution can be determined uniquely from the linear equations alone.
  • A pattern plane is represented by a three-dimensional plane parameter vector p = (p_1, p_2, p_3)^T. The vector p is also called the dual vector of the original plane, and the space of the vectors p is called the dual space of the original space.
  • The set of planes containing l_v, where l_v is the line of intersection of two planes v_a and v_b, is expressed by equation (1). This equation has the same form as the equation of a straight line in three-dimensional space, and represents that the set of vertical pattern planes forms a straight line passing through v_a and v_b in the dual space.
  • In equation (2), η is a parameter that can be defined for each line on the image plane of the projector. v_0 can be selected arbitrarily from the set of planes containing l_v. Among the planes containing l_v, exactly one plane passes through the optical center of the camera, and v_inf coincides with the normal vector of that plane. Equation (1) assumes that the plane does not pass through the optical center of the camera; as a plane approaches the optical center of the camera, its parameter p approaches the point at infinity in the direction of v_inf in the dual space.
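  • Equations (1) and (2) themselves are not reproduced in this text. A plausible reconstruction from the surrounding description, assuming a plane is represented as {x : p · x = -1} so that p is its dual vector, is the following sketch; the exact notation of the original may differ.

```latex
% (1) The pencil of planes through l_v, the intersection line of planes v_a, v_b:
p = (1 - s)\, v_a + s\, v_b, \qquad s \in \mathbb{R}
% i.e., a straight line through v_a and v_b in the dual space.

% (2) The one-parameter form, choosing v_0 on this line and taking v_inf as the
% dual direction associated with the plane through the camera's optical center:
p = v_0 + \eta\, v_\infty
```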
  • For the i-th vertical pattern plane, the parameter η in equation (2) is written η_i; for the j-th horizontal pattern plane, the parameter ρ in equation (4) is written ρ_j.
  • Suppose that the k-th grid point u_k = (s_k, t_k, -1)^T is the intersection of the σ(k)-th vertical plane and the ρ(k)-th horizontal plane. At this time, equation (7) holds.
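  • Equation (7) is likewise not reproduced in this text. Under the same assumed representation p · x = -1, requiring that the σ(k)-th vertical plane and the ρ(k)-th horizontal plane meet the line of sight x = t u_k at the same depth yields a linear equation of the following form, where h_0 and h_inf denote the horizontal-pencil counterparts of v_0 and v_inf; this is a reconstruction consistent with the constant term discussed below, not necessarily the patent's literal formula.

```latex
\bigl(v_0 + \eta_{\sigma(k)}\, v_\infty\bigr) \cdot u_k
  = \bigl(h_0 + \rho_{\rho(k)}\, h_\infty\bigr) \cdot u_k
\quad\Longrightarrow\quad
\eta_{\sigma(k)} (v_\infty \cdot u_k) - \rho_{\rho(k)} (h_\infty \cdot u_k)
  = (h_0 - v_0) \cdot u_k
```

  • In this form, the right-hand side plays the role of the constant term, which in general does not vanish when the axes l_v and l_h are skew.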
  • A mathematical expression similar to equation (7) is also given in Non-Patent Document 2. However, as a significant difference from that document, in this embodiment equation (7) has a constant term, and this term is not eliminated by variable substitution or the like. In Non-Patent Document 2 the constant term vanishes, either directly or after a change of variables, so the solution of the linear equations is not unique; in this embodiment a unique solution is obtained from the linear equations.
  • Here D is a constant that can be determined in advance from the choice of v_0 and v_inf and from the order relationship of η_σ(l) and ρ_ρ(l).
  • The pattern plane of each detected curve can be determined by solving the linear simultaneous equations in η_i and ρ_j obtained from equations (7) and (8).
  • To solve them, an equation-solving method based on LU decomposition may be used.
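  • A minimal numerical sketch of this step, assuming the grid-point equations take the reconstructed form shown after equation (7) above, with one unknown η_i per detected vertical curve and one ρ_j per detected horizontal curve. With more grid points than curves the system is overdetermined, so a least-squares solve stands in here for the LU-based solver mentioned in the text.

```python
import numpy as np

def solve_plane_parameters(grid_points, v0, v_inf, h0, h_inf, n_v, n_h):
    """grid_points: list of (vertical curve id i, horizontal curve id j, direction u_k).
    Returns the pencil parameters eta (one per vertical curve) and rho (one per
    horizontal curve), following the reconstructed grid-point equation."""
    A = np.zeros((len(grid_points), n_v + n_h))
    b = np.zeros(len(grid_points))
    for k, (i, j, u) in enumerate(grid_points):
        A[k, i] = v_inf @ u             # coefficient of eta_i
        A[k, n_v + j] = -(h_inf @ u)    # coefficient of rho_j
        b[k] = (h0 - v0) @ u            # constant term (nonzero for skew axes)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:n_v], x[n_v:]
```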
  • the solution obtained by the above equation represents the position information of the corresponding pattern plane for the observed curve of the vertical pattern or the horizontal pattern.
  • the curve can be reconstructed three-dimensionally by the principle of triangulation.
  • curves observed in both the vertical pattern and the horizontal pattern may be used (step S14).
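  • A sketch of the triangulation itself, again under the assumed representation p · x = -1: a pixel with normalized viewing direction u on a curve whose pattern plane has dual vector p is lifted to 3-D by intersecting its line of sight with that plane.

```python
import numpy as np

def triangulate_point(u, p):
    """Intersect the line of sight x = t*u (camera at the origin) with the plane p.x = -1."""
    denom = p @ u
    if abs(denom) < 1e-12:
        raise ValueError("line of sight is parallel to the pattern plane")
    return (-1.0 / denom) * u    # 3-D point on the object surface

# Placeholder example: u = (s, t, -1) in normalized camera coordinates.
u = np.array([0.10, -0.05, -1.0])
p = np.array([0.02, 0.00, 0.55])   # made-up plane parameter vector
print(triangulate_point(u, p))
```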
  • As the above position information, the solution obtained from the equations may be used as it is; alternatively, since the projected pattern planes are finite in number and their positions are known, the known plane closest to the solution may be taken as the corresponding plane. In that case, if the accuracy of the solution obtained from the equations is high, three-dimensional restoration free of errors other than calibration errors can be performed.
  • In general, the parameter in equation (3) is not proportional to the pattern coordinate (that is, the duals of the pattern planes are not equally spaced). For this reason, in order for η to be linear in the pattern coordinate, the projected pattern would have to be recomputed every time according to the rigid transformation between the projector and the camera. In that case the pattern changes whenever the positional relationship between the projector and the camera changes, and convenience is lost.
  • Suppose instead that the projector is arranged so that the optical center of the camera is included in the focal plane of the projector, and that the dual of the pattern plane containing the optical axis of the projector is taken as v_0.
  • Then a set of vertical pattern planes generated from equally spaced lines on the pattern image becomes a set of equally spaced points in the dual space. This is shown intuitively by the fact that the position of the pattern at infinity on the image plane coincides with the point at infinity in the dual space (the plane including the optical center of the camera).
  • the projector and the camera are arranged in this way, the adjacent relationship of the patterns can be incorporated into the linear equation.
  • Specifically, the front direction of the vertical-pattern projector (first light projecting means) and the direction from the camera toward the optical center of that projector are made orthogonal, and likewise the front direction of the horizontal-pattern projector (second light projecting means) and the direction from the camera toward its optical center are made orthogonal.
  • With other arrangements, the adjacency relationship does not yield a linear equation, but it can still be used as a nonlinear constraint on the solution.
  • Accuracy improvement using the de Bruijn sequence: as described above, the three-dimensional positions of the pattern planes containing the detected curves can be determined uniquely from the linear simultaneous equations. In practice, however, if erroneous lines or grid points are detected due to image-processing errors, the obtained solution may contain errors. Therefore, in the present embodiment, the error of the solution is corrected by using the periodic line IDs attached to the curves by the de Bruijn sequence.
  • The parameters of the pattern planes projected from the projectors and the de Bruijn line ID of each plane are known.
  • Possible solutions are therefore limited to those whose plane positions and line IDs coincide with these known planes.
  • The solution is corrected using this principle. Specifically, in the neighborhood of the solution obtained above for the detected curves, candidate correspondences whose projection-pattern IDs match the detected IDs are examined, and it is checked whether equation (5), the condition at each intersection, holds. This condition states that the distance on the image between the projections of the two planes onto the camera and the detected grid point is zero. Therefore, these distances are computed for each grid point, their sum of squares is calculated, and the solution with the smallest value is selected.
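  • A sketch of this correction step under the same assumptions: for each detected vertical curve, the known planes whose de Bruijn line ID matches the detected ID and whose parameter lies near the estimate are tried, and the choice minimizing the total squared grid-point residual is kept. The algebraic residual of the reconstructed equation (7) stands in here for the image-space distance described in the text, and the neighborhood radius is an assumed value.

```python
import numpy as np

def residual_sq(grid_points, eta, rho, v0, v_inf, h0, h_inf):
    """Sum of squared violations of the grid-point condition (reconstructed eq. (7))."""
    total = 0.0
    for i, j, u in grid_points:
        r = eta[i] * (v_inf @ u) - rho[j] * (h_inf @ u) - (h0 - v0) @ u
        total += r * r
    return total

def correct_with_line_ids(grid_points, eta_est, rho_est, known_eta, known_ids,
                          detected_ids, v0, v_inf, h0, h_inf, radius=0.5):
    """Snap each vertical-curve parameter to a nearby known plane with a matching line ID."""
    eta = np.array(eta_est, dtype=float)
    for i in range(len(eta)):
        candidates = [e for e, lid in zip(known_eta, known_ids)
                      if lid == detected_ids[i] and abs(e - eta_est[i]) <= radius]
        if not candidates:
            continue                      # keep the linear-solution estimate

        def score(e):
            trial = eta.copy()
            trial[i] = e
            return residual_sq(grid_points, trial, rho_est, v0, v_inf, h0, h_inf)

        eta[i] = min(candidates, key=score)
    return eta
```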
  • The solution of equations (7) and (8) may also be obtained by a combinatorial-optimization search.
  • When a correspondence for a certain curve A is adopted as a hypothesis, the position of that curve is determined, and information on the positions of a curve B intersecting curve A and a curve C adjacent to curve A is obtained from equations (7) and (8), which in turn limits the hypotheses for the correspondences of B and C. By propagating the correspondence between pattern planes and curves one after another in this way, the solution can also be obtained by finding, among the limited hypotheses, the one that best satisfies the conditions of equations (7) and (8).
  • one camera and two projectors were arranged and measured.
  • the arrangement of each device was adjusted each time depending on the observation target, and in the measurement of a person, the distance between the camera and the projector was about 1 m, and the relative angle was about 25 degrees.
  • The camera resolution is 1280 × 960 pixels, the frame rate is 15 FPS, and the projector resolution is 1024 × 768 pixels.
  • Method A: uses only the constraints obtained from the grid points.
  • Method B: uses the constraints from the grid points and the adjacency information.
  • Method C: in addition to the constraints from the grid points and the adjacency information, corrects the solution using the line IDs.
  • RMSE denotes the root mean square error.
  • FIG. 7 shows the results of restoring the three-dimensional shape of two faces of a cube with each method.
  • The table in FIG. 9(A) shows the angle between the two planes and the RMS error of the plane fitting.
  • The RMS error when the three-dimensional restoration result was fitted to a plane increased each time additional information was added. This is a natural result, for the following reason. When there is no additional information, the restoration minimizes the distance between the vertical and horizontal lines at each intersection, so the RMS error of the plane fitting becomes small. Conversely, when the adjacency or line-ID constraints are added, the deviation between the vertical and horizontal lines becomes larger, and the error of the plane fitting increases. This is because, in an actual system, it is difficult to calibrate to sub-pixel accuracy over the entire screen.
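  • For reference, the plane-fitting RMS error reported in the table can be computed with a standard least-squares plane fit via SVD; the sample points below are placeholders, not the experimental data.

```python
import numpy as np

def plane_fit_rmse(points):
    """Fit a plane to 3-D points and return the RMS orthogonal distance to it."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The best-fit plane normal is the right singular vector with the smallest
    # singular value of the centered point cloud.
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    d = (pts - centroid) @ normal          # signed orthogonal distances
    return float(np.sqrt(np.mean(d ** 2)))

# Placeholder data: noisy samples of the plane z = 0.1 x + 0.2 y.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))
z = 0.1 * xy[:, 0] + 0.2 * xy[:, 1] + rng.normal(0.0, 0.01, 200)
print(plane_fit_rmse(np.column_stack([xy, z])))   # about 0.01
```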
  • A comparison was also made with one-shot shape restoration by the conventional method using only a single projector (Non-Patent Document 2).
  • In Non-Patent Document 2, a one-dimensional search is necessary to determine the single degree of freedom remaining in the linear solution; moreover, the present embodiment can use a denser pattern than Non-Patent Document 2.
  • The advantage is that fine shapes such as thin fingers can be restored easily.
  • FIG. 10 compares the results of the two methods: the upper row shows the input images, the middle row the grid graphs obtained by line detection, and the lower row the three-dimensional shape restoration results. It can be seen that the conventional method, shown on the left, misses fingers because the pattern is too sparse, whereas the present embodiment, shown on the right, restores the shape without missing the fingers.
  • FIG. 11 shows the shape restoration results for a plurality of poses.
  • Column (A) shows the input images, column (B) shows the result of line detection and line-ID determination (the line ID is represented by the color, here the shading, of each line), and column (C) shows the shape restoration results obtained using the line detection results.
  • In some parts, such as the flank and shoulders, there is a blind spot from one of the projectors, so there are regions where only the vertical or only the horizontal pattern is projected. Even so, it can be seen that the shape is restored by connecting the pattern from regions that are not in the blind spot. This is one of the advantages of the present embodiment, which uses a plurality of projectors.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

Provided are an image processing device, an image processing method, and a program that enable measurement of the shape of a moving object at high density and a high frame rate. A two-dimensional image of an object (30) present in a three-dimensional space is captured using an image processing device (10), a camera (28), a projector (24), and a projector (26), and the three-dimensional shape of the object is restored from the captured two-dimensional image. The projector (24) projects a horizontal pattern onto the object (30), and the projector (26) projects a vertical pattern onto the object (30). The two-dimensional image is acquired by the camera (28) capturing the pattern light produced by the reflection of the patterns from the object (30), and the three-dimensional shape is restored from the two-dimensional image by the image processing device (10).

Description

Image processing apparatus, image processing method, and program
The present invention relates to an image processing device, an image processing method, and a program, and more particularly, to an image processing device, an image processing method, and a program for restoring the three-dimensional shape of an object shown in an input two-dimensional image.
There is a need for three-dimensional measurement of dynamically moving target objects at high density and a high frame rate. For example, shape restoration at high density and a high frame rate is desired for analyzing changes in the muscles of a person in motion or the structure of an object at the moment it bursts.
In such practical three-dimensional measurement, the active measurement method has so far been considered the main solution. Particularly for dynamically changing objects, a method using a projector capable of acquiring a wide range in a short time is suitable. However, there is an essential problem in using an active projector-based method for high-density shape measurement of moving objects.
Two main methods of three-dimensional measurement using a projector are known: the time encoding method and the spatial encoding method. Since the time encoding method projects a plurality of patterns, it is essentially unsuitable for measuring a moving object at a high frame rate. Also, when a plurality of projectors are used to cover a wide range, it is difficult to synchronize them. In the spatial encoding method, on the other hand, the projection pattern of each projector is fixed and only a single shot is required, so it is suitable for measuring moving objects; however, because the pattern itself is complicated, problems such as interference with the surface color and shape of the object occur, and image processing is often difficult.
If the shape could be restored from a single frame onto which a fixed monochromatic line pattern is projected, all of the above problems would be solved. As such a method, a one-shot restoration method (shape from grid pattern: SFG) based on the coplanarity constraint (COP), using a grid pattern composed of vertical and horizontal straight lines, has recently been proposed (Non-Patent Document 1), (Non-Patent Document 2), (Non-Patent Document 3).
In practical three-dimensional measurement, active methods that project light from the sensor side have been widely used. In particular, many methods using a video projector have been proposed for efficiency (Non-Patent Document 4), (Non-Patent Document 5). For active methods using a projector, two types of approaches have been proposed: the time encoding method and the spatial encoding method.
In recent years, research on shape restoration of dynamic scenes using high-speed cameras and DLP projectors has been conducted. Weise et al. proposed a system combining pattern projection based on the phase shift method with stereo vision (Non-Patent Document 6). Narasimhan et al. proposed a method for restoring shape by identifying the high-speed time-series patterns generated by a DLP projector (Non-Patent Document 7). Studies have also been conducted to reduce the number of patterns necessary for shape restoration (Non-Patent Document 8), (Non-Patent Document 5).
In the spatial encoding method, on the other hand, since the pattern is fixed, the shape can be restored from only one frame of video, which makes it suitable for measuring dynamic scenes (Non-Patent Document 9), (Non-Patent Document 10), (Non-Patent Document 11), (Non-Patent Document 12), (Non-Patent Document 13).
Furthermore, one method for dealing with the above-described problems is described in Patent Document 1.
JP 2009-300277 A
However, the techniques described in the above documents have the following problems.
In the SFG described in Non-Patent Document 1 to Non-Patent Document 3, density information of the spacing of the pattern lines is used to obtain a unique solution. This imposes the restriction that the vertical and horizontal patterns cannot both be made sufficiently dense. This not only lowers the measurement density but also causes a problem that, for thin or small shapes, intersections easily become insufficient and shape restoration fails.
In the techniques described in Non-Patent Document 4 and Non-Patent Document 5, when the time encoding method is used, stable, high-precision restoration is possible in static scenes, but because a plurality of different patterns must be projected, it is inherently difficult to use for dynamic scenes.
The methods described in Non-Patent Document 6 to Non-Patent Document 8 can acquire depth information at a high frame rate, but because a time-series code must be recognized, the motion of the observed object in the image must remain below a certain speed. Furthermore, high-precision synchronization of the equipment is required.
The methods described in Non-Patent Document 9 to Non-Patent Document 13 often use complicated patterns, so they are affected by the texture of the observed object, or spatial pattern information cannot be identified at depth edges and the error becomes large. Furthermore, when patterns are projected from a plurality of projectors onto the same target in order to measure a wide range, the patterns interfere with each other and are not easy to separate.
Furthermore, in the technique described in Patent Document 1, the remaining one-dimensional indeterminacy is resolved by matching the projected pattern with the obtained solution, but this matching increases the computation time.
The present invention has been made to solve the various problems described above. Its main object is to provide an image processing apparatus, an image processing method, and a program that measure the shape of a moving object at high density and a high frame rate and appropriately restore the three-dimensional shape of the object shown in the resulting two-dimensional image.
The present invention is an image processing device that restores a three-dimensional shape from a two-dimensional image. The two-dimensional image is acquired by first light projecting means for projecting a first pattern onto an object existing in a three-dimensional space, second light projecting means for projecting onto the object a second pattern that intersects the first pattern on the surface of the object, and photographing means for photographing the first pattern light and the second pattern light reflected by the object to obtain the two-dimensional image. The device comprises: a first calculation unit that detects a first curve, which is the first pattern projected onto the object in the two-dimensional image, and a second curve, which is the second pattern projected onto the object in the two-dimensional image, and calculates intersection coordinates, which are the coordinates of the intersections of the first curve and the second curve; a second calculation unit that determines, from the intersection coordinates and the parameters of the first light projecting means, the second light projecting means, and the photographing means, a first correspondence between the first curve and the first pattern and a second correspondence between the second curve and the second pattern; and a third calculation unit that restores the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portions of the object irradiated with the first pattern light and the second pattern light.
Furthermore, the present invention is an image processing method for restoring a three-dimensional shape from a two-dimensional image acquired as described above. The method comprises: a first step of detecting the first curve and the second curve in the two-dimensional image and calculating the intersection coordinates of the two curves; a second step of determining, from the intersection coordinates and the parameters of the first light projecting means, the second light projecting means, and the photographing means, the first correspondence between the first curve and the first pattern and the second correspondence between the second curve and the second pattern; and a third step of restoring the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portions of the object irradiated with the first pattern light and the second pattern light.
Furthermore, the present invention is a program that causes an image processing apparatus to execute a function of restoring a three-dimensional shape from a two-dimensional image acquired as described above. The program causes the apparatus to execute: a first function of detecting the first curve and the second curve in the two-dimensional image and calculating the intersection coordinates of the two curves; a second function of determining, from the intersection coordinates and the parameters of the first light projecting means, the second light projecting means, and the photographing means, the first correspondence between the first curve and the first pattern and the second correspondence between the second curve and the second pattern; and a third function of restoring the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portions of the object irradiated with the first pattern light and the second pattern light.
The present invention can obtain an unambiguous solution using only information on the intersections of the patterns projected onto the object. Therefore, since pattern-density information need not be used, both the vertical and horizontal patterns can be made dense. In the background art, detection of the connection relations of the grid pattern becomes difficult, particularly when restoring thin areas, and restoration may fail; the present invention greatly reduces this problem.
Furthermore, since the present invention uses a plurality of projectors, it is possible to greatly reduce the portions where the pattern is occluded and the shape cannot be measured.
Furthermore, the present invention adopts a technique that uses line IDs based on a de Bruijn sequence to improve the accuracy of the solution, and a technique that uses adjacency information of the detected lines. In this way, dense one-shot shape measurement using a plurality of projectors and a single camera is realized, a linear solution for one-shot restoration using a grid pattern is realized, and moving objects can be measured at a high frame rate.
FIG. 1(A) shows the state in which a two-dimensional image of an object is acquired using the image processing apparatus of the present invention, and FIG. 1(B) shows the configuration of the image processing apparatus. FIG. 2 is a flowchart showing the image processing method of the present invention. FIG. 3 shows the positional relationship between the camera and the projectors included in the image processing apparatus of the present invention. FIG. 4 shows the pattern positions on the image plane of a projector in the image processing apparatus of the present invention. FIG. 5 shows an experimental scene using the image processing apparatus of the present invention. FIG. 6 shows the setup of the evaluation experiment: (A) an input image, (B) a top view of the projector arrangement, and (C) a side view. FIG. 7 shows the restoration results of three methods displayed in the same coordinate system: (A) the result of method A, which uses only the grid-point constraints; (B) the result of method B, which adds adjacency constraints; and (C) the result of method C, which uses adjacency information and line IDs. FIG. 8 shows an evaluation experiment for the case where the distance between the axes of the two pencils is short: (A) an input image, (B) the projector arrangement, and (C) a side view. FIGS. 9(A) and 9(B) are tables showing the accuracy improvement obtained by using adjacency information and line IDs. FIG. 10 shows the effect of the present invention: the images on the left are from the background art and those on the right from the present embodiment; the top row shows the input images, the middle row the grid graphs obtained by line detection, and the bottom row the three-dimensional shape restoration results. FIG. 11 shows the effect of the present invention: the left column shows the input image, the center column the detected grid pattern, and the right column the three-dimensional shape restoration result.
<First Embodiment: Image Processing Device>
 The configuration of an image processing apparatus 10 according to an embodiment of the present invention will be described with reference to Fig. 1. Fig. 1(A) shows an example of the entire system according to this embodiment, and Fig. 1(B) shows the configuration of the image processing apparatus 10.
Referring to Fig. 1(A), in this embodiment, an image processing apparatus 10, a camera 28 (photographing means), a projector 24 (first light projecting means), and a projector 26 (second light projecting means) are used to photograph a two-dimensional image of an object 30 existing in three-dimensional space, and the three-dimensional shape of the object is restored from the photographed two-dimensional image. Here, the camera 28 and the projectors 24 and 26 have been calibrated; that is, their intrinsic parameters and the rigid transformation parameters between the devices are known. The camera 28, the projectors 24 and 26, and the image processing apparatus 10 may also be regarded together as the image processing apparatus of this embodiment.
The projector 24 has the function of projecting light including a horizontal pattern onto the object 30, which is the subject; a device such as a video projector can be used. Alternatively, line laser projectors may be arrayed or combined, or a laser source may be split into multiple directions with a prism, a beam splitter, or the like.
The projector 26 projects a vertical pattern onto the object 30; its other features are the same as those of the projector 24. Since each projector projects a fixed pattern, no synchronization is needed between the camera 28 and the projectors 24 and 26.
The pattern light projected by the projector 24 and the pattern light projected by the projector 26 intersect on the surface of the object 30; the angle at which they intersect is arbitrary.
Although the two projectors can in principle perform shape reconstruction with monochromatic patterns, this embodiment uses color patterns to improve accuracy and stability. Specifically, stable detection and separation of the vertical and horizontal lines are realized by combining line detection based on belief propagation with a color code based on a de Bruijn sequence, which is a periodic pattern. This method is described, for example, in Non-Patent Documents 10 and 13, and also in Joaquim Salvi, Joan Batlle, and El Mustapha Mouaddib, "A robust-coded pattern projection for dynamic 3D scene measurement," Pattern Recognition, 19(11):1055-1065, 1998.
The camera 28 is means for photographing the object 30; a solid-state image sensor such as a CCD image sensor is employed. Specifically, the camera 28 photographs the pattern light projected from the projectors 24 and 26 and reflected by the object 30, including the intersections where the two pattern lights cross. A two-dimensional image is captured by the camera 28, and data based on this image are processed by the image processing apparatus 10 to restore the three-dimensional shape of the object 30.
Referring to Fig. 1(B), the configuration of the image processing apparatus 10, which restores a three-dimensional shape from a two-dimensional image, will be described.
The image processing apparatus 10 of this embodiment mainly includes an image processing unit 12, a control unit 14, an input unit 16, a storage unit 18, a display unit 20, and an operation unit 22. The general function of the image processing apparatus 10 is to process an input two-dimensional image, restore a three-dimensional shape, and output it. The apparatus may be embodied as a computer, such as a personal computer, on which an application (program) that executes the predetermined functions is installed, or as dedicated image processing equipment configured to execute those functions. The parts constituting the image processing apparatus 10 are electrically connected to one another via a bus.
The image processing unit 12 is the part that performs the main image processing functions and includes a first calculation unit 12A, a second calculation unit 12B, and a third calculation unit 12C.
The first calculation unit 12A has the function of calculating, from the photographed two-dimensional image, the vertical curves of the vertical pattern and the horizontal curves of the horizontal pattern projected onto the object 30, as well as the coordinates of the intersections of the two patterns.
The second calculation unit 12B has the function of determining correspondences from the parameters of each projector, the parameters of the camera, and the above intersection coordinates. Specifically, it determines a first correspondence between the vertical pattern projected from the projector 26 and the vertical curves detected in the image, and a second correspondence between the horizontal pattern projected from the projector 24 and the horizontal curves detected in the image.
The third calculation unit 12C has the function of determining, from the obtained first correspondence, second correspondence, or both, the three-dimensional coordinates of the portions irradiated with both pattern lights.
The details of the parts constituting the image processing unit are described below as the image processing method. The number of calculation units included in the image processing unit 12 is not necessarily limited to the three described above and may be increased or decreased as necessary.
The control unit 14 is the part that controls the operation of the entire image processing apparatus 10 (the image processing unit 12, the input unit 16, the storage unit 18, and the display unit 20).
The input unit 16 is the part through which information is input to the image processing apparatus 10 from the outside. In this embodiment, a moving image or a still image, which is a two-dimensional image, is input.
The storage unit 18 is a fixed storage disk typified by an HDD (Hard Disk Drive), a removable storage disk such as a CD (Compact Disc) or DVD (Digital Versatile Disk), a fixed or removable semiconductor memory, or the like. In this embodiment, the storage unit 18 stores the two-dimensional image before processing, the three-dimensional shape restored from it, intermediate data, and so on.
The storage unit 18 also stores a program for causing the image processing apparatus 10 to execute the image processing method described below. This program is invoked when the user operates the operation unit 22, and executes the functions of the parts described above so as to restore three-dimensional shape data from the input two-dimensional image data.
The display unit 20 is, for example, a liquid crystal display, a CRT (Cathode Ray Tube), or a video projector, and displays the input two-dimensional image and the three-dimensional shape restored from it.
The operation unit 22 is, for example, a keyboard or a mouse; when the user operates it, the image processing apparatus 10 restores a three-dimensional shape from the two-dimensional image.
<Second Embodiment: Image Processing Method>
 A method for restoring the three-dimensional shape of an object shown in a two-dimensional image using the image processing apparatus 10 configured as above is described below.
Referring to Fig. 2, the image processing method of this embodiment includes step S10 of photographing a two-dimensional image of the object, step S11 of extracting vertical pattern curves and horizontal pattern curves from the photographed two-dimensional image, step S12 of detecting the intersections of these pattern curves, step S13 of deriving simultaneous equations for the pattern curves from the intersections, and step S14 of obtaining the three-dimensional shape of the pattern curves from the simultaneous equations. Deriving the simultaneous equations in step S13 is equivalent to determining the correspondence between each pattern and each curve.
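For orientation, the flow of steps S10 to S14 can be summarized in the following schematic sketch (Python). All function names are hypothetical placeholders for the processing detailed in the remainder of this section, not part of the claimed method:

    def reconstruct_one_shot(image, calibration):
        """Schematic outline of steps S10-S14 (hypothetical function names)."""
        v_curves, h_curves = extract_pattern_curves(image)      # step S11
        grid = detect_intersections(v_curves, h_curves)         # step S12
        A, b = build_linear_system(grid, calibration)           # step S13
        mu, rho = solve_least_squares(A, b)                     # step S13
        return triangulate_curves(v_curves, h_curves,
                                  mu, rho, calibration)         # step S14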
The concrete image processing procedure of this embodiment is described below.
First, in step S10, pattern light is projected from the projectors 24 and 26 toward the object 30 using the system shown in Fig. 1. The projector 24 projects horizontal pattern light, extending linearly in the horizontal direction, onto the object 30, and the projector 26 projects vertical pattern light, extending linearly in the vertical direction, onto the object 30. As a result, both pattern lights are projected onto the surface of the object 30, and their intersections lie on that surface. The object 30 in this state is photographed by the camera 28, yielding a two-dimensional image of the object 30 irradiated with each pattern light. Image data based on this two-dimensional image are transmitted to the image processing apparatus 10.
Next, in step S11, the vertical pattern curves and horizontal pattern curves are extracted from the two-dimensional image. To detect the color code stably from the two-dimensional image, it is desirable to reduce the total number of colors. Therefore, the line detection method proposed in Non-Patent Document 2 is used to distinguish the vertical and horizontal patterns while using the same colors for both.
The de Bruijn sequence used in this embodiment determines the line ID given to each line. A de Bruijn sequence with window length n over q symbols is a sequence of length q^n with the property that observing any subsequence of length n uniquely identifies its position in the sequence. When the projection pattern is coded with two or more symbols distinguishable in the image, the correspondence between the projected pattern and the observed pattern is therefore determined uniquely by matching a subsequence of length n. Non-Patent Documents 10 and 13 used large values of q and n to determine globally unique IDs; in this embodiment, as in Non-Patent Document 2, a pattern generated with small values is used.
This is because, in this embodiment, the de Bruijn ID is used only to correct erroneous estimates caused by noise and the like, so local uniqueness suffices. With small q and n, only a few consecutive pattern lines are needed to determine an ID, and the ID can be expressed with a small number of colors, making the code robust to surface irregularities and image processing artifacts. In this embodiment, q = 2 colors and a code length of n = 3 are used; the color pattern thus has a period of 8 lines, and the line ID given to each line ranges from 0 to 7. Erroneous connections of lines occurring at occluding edges and the like can also be eliminated using this de Bruijn ID.
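As an illustration, the following sketch (Python) generates the q = 2, n = 3 de Bruijn sequence with the standard Lyndon-word construction and builds the table that maps an observed window of three symbols to a line ID; the function names are illustrative only:

    def de_bruijn(q, n):
        """Generate a de Bruijn sequence B(q, n) of length q**n
        (standard Lyndon-word concatenation algorithm)."""
        a = [0] * q * n
        seq = []
        def db(t, p):
            if t > n:
                if n % p == 0:
                    seq.extend(a[1:p + 1])
            else:
                a[t] = a[t - p]
                db(t + 1, p)
                for j in range(a[t - p] + 1, q):
                    a[t] = j
                    db(t + 1, t)
        db(1, 1)
        return seq

    ids = de_bruijn(2, 3)              # [0, 0, 0, 1, 0, 1, 1, 1], period 8
    # Each cyclic window of 3 symbols occurs exactly once, so an observed
    # run of three colors identifies the line ID (0..7) of its first line.
    window_to_id = {
        tuple(ids[(k + i) % len(ids)] for i in range(3)): k
        for k in range(len(ids))
    }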
The positions of the intersections of the vertical and horizontal patterns are computed with sub-pixel accuracy (step S12). Since the continuity of the detected lines reveals which intersections are adjacent, line detection yields a grid graph in which the intersections are connected in a lattice. From these intersection coordinates and the grid graph encoding their connectivity, three-dimensional reconstruction is realized by the method described below.
Here, the reconstruction method for the minimal configuration of two projectors and one camera is described. When a set of parallel lines is projected from a projector, the locus of each line in three-dimensional space is a plane, and all of these planes share one straight line; that is, they are elements of a pencil of planes (a set of planes sharing the same line, hereinafter called a pencil) with that line as its axis. When a plane is represented by three parameters and this three-dimensional vector is regarded as a point in a three-dimensional space, the point is called the dual of the plane, and the space is called the dual space. The dual of a plane is a point, and the dual of a pencil is a straight line. Consequently, just as a point on a line can be expressed with one parameter, a plane in a particular pencil can be expressed with one parameter. The intersection positions of the grid then yield linear equations in the parameters of the pattern planes (step S13).
Non-Patent Document 2 previously proposed a method that projects vertical and horizontal lines from a single projector and performs three-dimensional reconstruction. This method is also described in Ryo Furukawa, Hiroshi Kawasaki, Ryusuke Sagawa, and Yasushi Yagi, "Shape from grid pattern based on coplanarity constraints for one-shot scanning," IPSJ Transactions on Computer Vision and Applications, 1:139-157, 2009.
In those references, the intersections of the vertical and horizontal lines are called grid points. There, the axis through which the vertical pattern planes pass and the axis through which the horizontal pattern planes pass intersect at the optical center of the projector. As a result, the constant terms of the linear equations obtained from the grid-point information vanish, and the resulting simultaneous equations are necessarily indeterminate.
In those references, density information of the lines was embedded in the pattern, and the remaining one-dimensional indeterminacy was resolved by matching the projected pattern with the obtained solution. Non-Patent Document 3 resolves this indeterminacy with de Bruijn IDs.
In contrast, this embodiment obtains a unique solution by using a plurality of projectors and projecting the vertical planes and the horizontal planes from different projectors. As shown in Fig. 3, the vertical pattern planes share a straight line l_v passing through the optical center of their projector (that is, they form a pencil with axis l_v), and the horizontal pattern planes likewise share a line l_h. Which projected pattern corresponds to each curve irradiated by the vertical or horizontal pattern light and observed by the camera is unknown; this information is recovered from the equations on the observed curves. If the projectors are arranged so that l_v and l_h are skew, the resulting equations are linear equations with constant terms and, in general, the solution has no indeterminacy; a unique solution can therefore be determined from the linear equations alone. A pattern plane p is represented by

$$ p_1 x + p_2 y + p_3 z + 1 = 0. \qquad (1) $$

Here the three-dimensional vector p = (p_1, p_2, p_3)^T is the parameter vector of the plane, also called the dual vector of the original plane, and the space of such vectors p is called the dual space of the original space. Let v_a be a vertical pattern plane and v_b a different vertical pattern plane. The set of planes containing l_v, the intersection line of v_a and v_b, is then expressed as

$$ v = v_a + \mu (v_b - v_a). \qquad (2) $$

This expression has the same form as the equation of a straight line in three-dimensional space, and shows that the set of vertical pattern planes is contained in the straight line through v_a and v_b in the dual space. Representing this line by an arbitrary point v_0 on it and its direction vector (or normalized point at infinity) v_inf, it can be written as

$$ v = v_0 + \mu v_{\mathrm{inf}}. \qquad (3) $$

Here μ is a parameter that can be defined for each line of the image projected by the projector.
v_0 can be chosen arbitrarily from the set of planes containing l_v. Among the planes containing l_v there is exactly one that passes through the optical center of the camera, and v_inf coincides with the normal vector of that plane. Equation (1) assumes that the plane does not pass through the optical center of the camera; as a plane approaches the camera's optical center, its parameter p approaches the point at infinity in the direction v_inf in the dual space.
Similarly, suppose the set of horizontal pattern planes is represented as

$$ h = h_0 + \rho h_{\mathrm{inf}}. \qquad (4) $$
Suppose the intersection of a vertical pattern v and a horizontal pattern h is observed at (s, t) in normalized camera coordinates. Writing the dual vectors of v and h as v and h, and putting u = (s, t, -1)^T, the intersection point lies on both planes, so

$$ u^{\top} (v - h) = 0. \qquad (5) $$

Substituting equations (3) and (4),

$$ \mu\, u^{\top} v_{\mathrm{inf}} - \rho\, u^{\top} h_{\mathrm{inf}} + u^{\top} (v_0 - h_0) = 0 \qquad (6) $$

is obtained.
Since the above equation is obtained for each grid point, a system of simultaneous linear equations can be built from them. For the i-th vertical plane, write the parameter μ of equation (3) as μ_i; likewise, for the j-th horizontal plane, write the parameter ρ of equation (4) as ρ_j. Suppose K grid points are detected, and the k-th grid point u_k = (s_k, t_k, -1)^T is the intersection of the α(k)-th vertical plane and the β(k)-th horizontal plane. Then

$$ \mu_{\alpha(k)}\, u_k^{\top} v_{\mathrm{inf}} - \rho_{\beta(k)}\, u_k^{\top} h_{\mathrm{inf}} + u_k^{\top} (v_0 - h_0) = 0 \qquad (7) $$

holds for k = 1, ..., K.
An equation similar to equation (7) is also given in Non-Patent Document 2, but a major difference from that reference is that equation (7) here has a constant term that cannot be eliminated by substitution of variables. In Non-Patent Document 2, the constant term could be eliminated directly or by a change of variables, so the solution of the linear equations was not unique; in this embodiment, a unique solution is obtained from the linear equations.
Furthermore, when adjacency information between patterns is available, constraint equations using it can be formed. In particular, when the difference in μ between adjacent lines of the pattern is constant (corresponding to the set of pattern planes being equally spaced in the dual space), the constraints are conveniently linear, as described below. In this embodiment, we therefore assume this linear relationship, in which μ changes by a fixed step between adjacent patterns, and proceed on that basis.
As described later, this assumption can always be satisfied by adjusting the spacing of the projected pattern or by devising the arrangement of the projectors, so it does not sacrifice the generality of the method. Consider L adjacent pairs of patterns. If the l-th pair consists of the γ(l)-th and δ(l)-th vertical planes, then

$$ \mu_{\gamma(l)} - \mu_{\delta(l)} = D. \qquad (8) $$

Here D is a constant that can be determined in advance from the choice of v_0 and v_inf and the order relation of μ_γ(l) and μ_δ(l).
By solving the simultaneous linear equations in μ_i and ρ_j obtained from equations (7) and (8), the pattern plane of each detected curve can be determined.
In the actual computation, a matrix equation Ax = b is formed, where x is the vector obtained by stacking the parameters μ_i and ρ_j of the detected vertical and horizontal patterns. Given sufficient intersections and adjacency information, A has more rows than columns, so the solution is obtained with the pseudo-inverse as x = (A^T A)^{-1} A^T b; (A^T A)^{-1} can be computed by solving the system via LU decomposition.
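For concreteness, a minimal numerical sketch of assembling and solving this system is shown below (Python with numpy). The variables v0, vinf, h0, hinf denote the calibrated quantities defined above; the data layout and function name are hypothetical choices for illustration, not part of the claimed method:

    import numpy as np

    def solve_pattern_parameters(points, alpha, beta, pairs, D,
                                 v0, vinf, h0, hinf, n_v, n_h):
        """Solve equations (7) and (8) in the least-squares sense.

        points : (K, 3) array of grid points u_k = (s_k, t_k, -1)
        alpha, beta : vertical / horizontal plane index at each grid point
        pairs : list of (gamma, delta) indices of adjacent vertical planes
        Returns mu (length n_v) and rho (length n_h)."""
        rows, b = [], []
        for k in range(len(points)):
            u = points[k]
            row = np.zeros(n_v + n_h)
            row[alpha[k]] = u @ vinf           # coefficient of mu_alpha(k)
            row[n_v + beta[k]] = -(u @ hinf)   # coefficient of rho_beta(k)
            rows.append(row)
            b.append(-(u @ (v0 - h0)))         # constant term moved to rhs
        for g, d in pairs:                     # adjacency: mu_g - mu_d = D
            row = np.zeros(n_v + n_h)
            row[g], row[d] = 1.0, -1.0
            rows.append(row)
            b.append(D)
        A = np.asarray(rows)
        x, *_ = np.linalg.lstsq(A, np.asarray(b), rcond=None)
        return x[:n_v], x[n_v:]

Here np.linalg.lstsq computes the least-squares solution, which is numerically equivalent to the pseudo-inverse solution x = (A^T A)^{-1} A^T b described in the text.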
The solution of these equations gives, for each observed curve of the vertical or horizontal pattern, the position of the corresponding pattern plane. Once this information is obtained, the curve can be reconstructed in three dimensions by the principle of triangulation; curves observed in both the vertical and horizontal patterns may be used (step S14).
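Under the plane representation of equation (1), this triangulation reduces to intersecting the viewing ray of each detected pixel with its recovered pattern plane. A minimal sketch, assuming the same sign convention u = (s, t, -1)^T as above:

    import numpy as np

    def triangulate_point(u, p):
        # The ray through the camera center in direction u meets the plane
        # {x : p.x + 1 = 0} at x = -u / (p.u); this is well defined because
        # the plane is assumed not to pass through the optical center.
        return -u / (p @ u)

    # Example: the plane of the i-th vertical pattern is p = v0 + mu[i] * vinf.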
The position information may be taken directly from the solution of the equations; however, since the projected pattern planes are finite in number and their positions are known, the solution may instead be matched against this known set of planes and the nearest plane adopted as the corresponding one. If the solution of the equations is sufficiently accurate, this permits three-dimensional reconstruction free of errors other than calibration error.
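Because the pattern-plane duals are equally spaced in this embodiment, matching a recovered parameter to the known finite set of planes can reduce to rounding. A minimal sketch, under the hypothetical indexing that the known planes sit at mu = 0, D, 2D, ...:

    def snap_to_known_plane(mu, D, n_planes):
        # Round to the nearest projected plane and clamp to the valid range.
        i = min(max(int(round(mu / D)), 0), n_planes - 1)
        return i * D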
Next, the arrangement of the projectors and the camera is described.
In general, even if the line pattern on the projector image is equally spaced, μ in equation (3) is not proportional to the pattern coordinate (that is, the duals of the pattern planes are not equally spaced). For μ to vary linearly with the pattern, the projected pattern would have to be recomputed each time according to the rigid transformation between the projector and the camera; the pattern would then change whenever their relative position changes, which impairs convenience.
An alternative is an arrangement in which, even when an equally spaced pattern is projected, μ is linear in the position on the projector's image plane. Following O. Faugeras, Three-Dimensional Computer Vision: A Geometric Viewpoint, The MIT Press, Cambridge, MA, 1993, the plane passing through the optical center of the projector and parallel to the projector's image plane is called the focal plane of the projector.
As shown in Fig. 4, the optical center of the camera is placed in the focal plane of the projector, and the dual of the pattern plane containing the projector's optical axis is taken as v_0. Then the set of vertical pattern planes generated from an equally spaced grid becomes a set of equally spaced points in the dual space as well. Intuitively, this follows because the position at infinity of the pattern on the image plane coincides with the point at infinity in the dual space (the plane containing the camera's optical center). Assuming the projector and camera are arranged in this way, the adjacency relations of the pattern can be incorporated into the linear equations.
An example of such an arrangement is one in which the front direction of the vertical pattern projector (first light projecting means) is perpendicular to the direction from the camera toward the optical center of the vertical pattern projector, and the front direction of the horizontal pattern projector (second light projecting means) is perpendicular to the direction from the camera toward the optical center of the horizontal pattern projector.
When the above arrangement is not satisfied, the adjacency relations do not become linear equations, but they can still be used as nonlinear constraints on the solution.
 Next, accuracy improvement using the de Bruijn sequence is described. As stated above, the three-dimensional positions of the pattern planes containing the detected curves can be determined uniquely from the simultaneous linear equations. In practice, however, when erroneous lines and grid points are detected owing to image processing errors, errors arise in the obtained solution. In this embodiment, therefore, the periodic line IDs attached to the curves by the de Bruijn sequence are used to correct the solution. Since the camera and the projectors are assumed calibrated, the parameters of the pattern planes projected from the projectors and the de Bruijn line ID of each plane are known, and the possible solutions are limited to those whose plane positions and line IDs agree with these planes. The solution is corrected using this principle: around the solution obtained for the detected curves, candidates whose projected-pattern IDs agree are examined for whether equation (5), the condition at each intersection, holds. This condition means that the image distance between the projection onto the camera of the intersection line of the two planes and the detected grid point is zero. These distances are therefore computed for all grid points, their sum of squares is taken, and the candidate solution that minimizes it is selected.
In the above procedure, searching "around the solution for the curves" would require exploring a high-dimensional space if the variables of all planes were varied. To shorten the computation time, it is efficient to compute, for the coefficient matrix A of the linear equations, the eigenvector corresponding to the smallest eigenvalue of A^T A and to search for the solution along the direction of this vector. This is justified because, in the above equations, the direction in which errors tend to appear is in most cases a one-dimensional space. When the adjacency relations are nonlinear constraints, they may likewise be used as a means of accuracy improvement in the same manner as above.
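A sketch of this one-dimensional search follows (numpy). Here reprojection_error is assumed to evaluate the sum of squared image distances of equation (5) over all grid points for a candidate solution, and the candidate step sizes are a hypothetical discretization:

    import numpy as np

    def correct_solution(A, x_ls, steps, reprojection_error):
        # Direction of largest uncertainty: eigenvector of the smallest
        # eigenvalue of A^T A (np.linalg.eigh returns eigenvalues ascending).
        _, V = np.linalg.eigh(A.T @ A)
        e = V[:, 0]
        # 1-D search along e for the step minimizing the reprojection error.
        best = min(steps, key=lambda s: reprojection_error(x_ls + s * e))
        return x_ls + best * e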
When the adjacency relations between patterns are used in the linear equations, a unique solution is obtained, except in special cases, even if the straight line shared by the vertical patterns (the axis of the first light projecting means) and the straight line shared by the horizontal patterns (the axis of the second light projecting means) intersect. When the adjacency relations are not used in the linear equations, however, the equations become indeterminate if these axes intersect; if adjacency information is unavailable, the projectors must therefore be arranged so that these axes do not intersect.
A solution method different from directly solving the linear equations is also conceivable. As noted above, the projected pattern planes are finite in number and their positions are known, so the candidate solutions (correspondences between observed curves and known projected pattern planes) are finite. A solution satisfying equations (7) and (8) may therefore be sought by combinatorial optimization. In this approach, adopting a hypothesis for the correspondence of some curve A fixes the position of that curve; equations (7) and (8) then yield position information for a curve B intersecting A or a curve C adjacent to A, limiting the correspondence hypotheses for B or C. By propagating the pattern-plane correspondences from curve to curve in this manner and selecting, among the limited hypotheses, the one that best satisfies the conditions of equations (7) and (8), the solution can also be obtained, as sketched below.
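The combinatorial alternative can be sketched as a propagation of hypotheses over the grid graph. In the sketch below, neighbors and infer_label are hypothetical helpers standing for the constraints of equations (7) and (8), which force the plane of a curve once the plane of an intersecting or adjacent curve is fixed:

    from collections import deque

    def propagate_labels(seed_curve, seed_label, neighbors, infer_label):
        """Breadth-first propagation of one plane hypothesis over the grid
        graph. Returns the labeling implied by the seed, or None on a clash;
        the best seed is the one whose labeling best satisfies eq. (5)."""
        labels = {seed_curve: seed_label}
        queue = deque([seed_curve])
        while queue:
            c = queue.popleft()
            for d in neighbors(c):
                forced = infer_label(c, labels[c], d)
                if d in labels:
                    if labels[d] != forced:   # inconsistent hypothesis
                        return None
                else:
                    labels[d] = forced
                    queue.append(d)
        return labels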
<Third Embodiment: Experimental Results>
 This section describes experimental results obtained using the image processing apparatus and image processing method of the embodiments described above.
In the experiments of this embodiment, one camera and two projectors were arranged as shown in Fig. 5. The arrangement of the devices was adjusted for each observation target; for measuring a person, the camera-projector distance was about 1 m and the relative angle about 25 degrees. The camera resolution was 1280 x 960 pixels at a frame rate of 15 FPS, and the projector resolution was 1024 x 768 pixels.
First, the evaluation of the accuracy improvement of the solution using adjacency information and line IDs is described. To assess the effect on shape reconstruction, a cube was observed and the three-dimensional reconstruction results were compared under the following three conditions:
 (Method A): using only the constraints obtained from the grid points;
 (Method B): using the grid-point and adjacency constraints;
 (Method C): using the grid-point and adjacency constraints plus solution correction with the line IDs.
 The reconstructed point data were fitted to two planes, and the root mean square error (RMSE) of the plane fit and the angle between the two planes were evaluated, as sketched below. In the experimental environment, the distance from the camera to the observation target was 1.6 m.
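A minimal sketch of this evaluation (numpy; illustrative only): fitting a plane to reconstructed points by SVD and computing the RMSE and the angle between two fitted planes.

    import numpy as np

    def fit_plane(points):
        """Least-squares plane through a 3D point cloud via SVD.
        Returns the unit normal and the RMS point-to-plane distance."""
        c = points.mean(axis=0)
        _, _, Vt = np.linalg.svd(points - c)
        n = Vt[-1]                 # normal = direction of least variance
        rmse = np.sqrt(np.mean(((points - c) @ n) ** 2))
        return n, rmse

    def angle_between(n1, n2):
        """Angle between two fitted planes, in degrees."""
        cosa = abs(n1 @ n2) / (np.linalg.norm(n1) * np.linalg.norm(n2))
        return np.degrees(np.arccos(np.clip(cosa, -1.0, 1.0)))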
The input image and the projector arrangement are shown in Fig. 6. The result of three-dimensional reconstruction of two faces of the cube is shown in Fig. 7, and the angle between the planes and the RMS error of the plane fitting are listed in the table of Fig. 9(A). These results show that the angle between the two faces approaches 90 degrees when the adjacency information (method B) and the line IDs (method C) are used, so the accuracy of the shape is improved. Compared with the time-encoding method using Gray codes, method C obtained the correct correspondences through the line-ID matching and thus gave the same result as the Gray code method, apart from errors due to line detection.
The RMS error of fitting the reconstruction to a plane increased as more information was added; this is a natural result, for the following reason. Without additional information, reconstructing the vertical and horizontal lines involves an optimization that minimizes the distance between them at the intersections, so the misalignment between vertical and horizontal lines is small and the RMS error of the plane fit is consequently small. Conversely, when the adjacency and line-ID constraints are added, the misalignment between the vertical and horizontal lines grows and the plane-fitting error increases. This is because, in a real system, it is difficult to calibrate with sub-pixel accuracy over the entire image.
Next, using the projector arrangement shown in Fig. 8, the case where the distance between the axes of the two pencils is short was evaluated. A cube was measured in the same way, the three-dimensional reconstruction was fitted to two planes, and the angle between the faces and the RMS fitting error are shown in Fig. 9(B). The distance between the camera and the observation target was 3.7 m. Under this condition, method A failed owing to rank deficiency, whereas methods B and C showed that three-dimensional reconstruction becomes possible by using the adjacency constraints.
Next, shape reconstruction of a small object is described. Specifically, a comparison was made with one-shot shape reconstruction by the conventional method using only one projector (Non-Patent Document 2). The conventional method required a one-dimensional search to determine the one degree of freedom left by its linear solution; this is unnecessary in the present embodiment, so a denser pattern than in Non-Patent Document 2 can be used. The advantage is that fine shapes such as thin fingers become easy to reconstruct.
Fig. 10 compares the results of the two methods. The top row shows the input images, the middle row the grid graphs obtained by line detection, and the bottom row the three-dimensional reconstruction results. With the conventional method, shown on the left, fingers are missing because of the insufficient pattern density, whereas with the present embodiment, shown on the right, the shape is reconstructed without missing fingers.
Finally, to confirm dense shape measurement of the human body, the upper body of a moving person was measured. Fig. 11 shows the reconstruction results for several poses. In this figure, column (A) shows the input images, column (B) the results of line detection and line-ID determination (the line ID is indicated by the color, here the shading, of each line), and column (C) the shape reconstruction results obtained from the line detection. Some regions, such as the flanks and shoulders, are in the blind spot of one projector, so only the vertical or the horizontal pattern is projected there; nevertheless, the shape is reconstructed because those patterns connect to regions that are not in a blind spot. This is one of the advantages of this embodiment, which uses a plurality of projectors.
DESCRIPTION OF SYMBOLS
 10 Image processing apparatus
 12 Image processing unit
 12A First calculation unit
 12B Second calculation unit
 12C Third calculation unit
 14 Control unit
 16 Input unit
 18 Storage unit
 20 Display unit
 22 Operation unit
 24 Projector
 26 Projector
 28 Camera
 30 Object

Claims (12)

1. An image processing device for restoring a three-dimensional shape from a two-dimensional image, the two-dimensional image being acquired with first light projecting means for projecting a first pattern onto an object existing in a three-dimensional space, second light projecting means for projecting onto the object a second pattern that intersects the first pattern on the surface of the object, and photographing means for photographing the first pattern light and the second pattern light reflected by the object to obtain the two-dimensional image, the device comprising: a first calculation unit that detects a first curve, which is the first pattern projected onto the object in the two-dimensional image, and a second curve, which is the second pattern projected onto the object in the two-dimensional image, and calculates intersection coordinates, which are the coordinates of the intersections of the first curve and the second curve; a second calculation unit that determines, from the intersection coordinates, the parameters of the first light projecting means and the second light projecting means, and the parameters of the photographing means, a first correspondence between the first curve and the first pattern and a second correspondence between the second curve and the second pattern; and a third calculation unit that restores the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portion of the object irradiated with the first pattern light and the second pattern light.
2. The image processing device according to claim 1, wherein the first light projecting means and the second light projecting means are devices that project a first pattern image and a second pattern image, each being an image of a plurality of line segments, into the three-dimensional space.
3. The image processing device according to claim 1 or 2, wherein the third calculation unit calculates the three-dimensional shape by solving a first linear equation obtained from the intersection coordinates and the parameters of the first light projecting means and the second light projecting means.
4. The image processing device according to claim 1 or 2, wherein the third calculation unit calculates the three-dimensional shape by solving a combinatorial optimization problem that determines each of the correspondences, with the intersection coordinates detected by the first calculation unit as constraints.
5. The image processing device according to any one of claims 1 to 4, wherein the axis of the first light projecting means and the axis of the second light projecting means do not intersect in the three-dimensional space.
6. The image processing device according to any one of claims 1 to 5, wherein the first light projecting means and the second light projecting means are arranged so as to satisfy a first condition that the set of planes through which the first pattern or the second pattern passes in the three-dimensional space is arranged at equal intervals in a dual space.
7. The image processing device according to claim 6, wherein the first condition is satisfied by acquiring the two-dimensional image in a state where the front direction of the first light projecting means is perpendicular to the direction vector from the photographing means to the first light projecting means, and the front direction of the second light projecting means is perpendicular to the direction vector from the photographing means to the second light projecting means.
8. The image processing device according to any one of claims 1 to 7, wherein the first pattern or the second pattern is given identification information by which the individual patterns can be classified into several classes, and the third calculation unit uses a constraint condition that the identification information and the class of the first pattern or the second pattern coincide.
9. The image processing device according to claim 8, wherein the identification information is provided by giving the patterns included in the first pattern or the second pattern two or more colors.
10. The image processing device according to any one of claims 1 to 9, wherein the second calculation unit obtains the first correspondence, the second correspondence, or both by using relational expressions between adjacent ones of the first patterns or between adjacent ones of the second patterns.
11. An image processing method for restoring a three-dimensional shape from a two-dimensional image, the two-dimensional image being acquired with first light projecting means for projecting a first pattern onto an object existing in a three-dimensional space, second light projecting means for projecting onto the object a second pattern that intersects the first pattern on the surface of the object, and photographing means for photographing the first pattern light and the second pattern light reflected by the object to obtain the two-dimensional image, the method comprising: a first step of detecting a first curve, which is the first pattern projected onto the object in the two-dimensional image, and a second curve, which is the second pattern projected onto the object in the two-dimensional image, and calculating intersection coordinates, which are the coordinates of the intersections of the first curve and the second curve; a second step of determining, from the intersection coordinates, the parameters of the first light projecting means and the second light projecting means, and the parameters of the photographing means, a first correspondence between the first curve and the first pattern and a second correspondence between the second curve and the second pattern; and a third step of restoring the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portion of the object irradiated with the first pattern light and the second pattern light.
12. A program that causes an image processing device to execute a function of restoring a three-dimensional shape from a two-dimensional image, the two-dimensional image being acquired with first light projecting means for projecting a first pattern onto an object existing in a three-dimensional space, second light projecting means for projecting onto the object a second pattern that intersects the first pattern on the surface of the object, and photographing means for photographing the first pattern light and the second pattern light reflected by the object to obtain the two-dimensional image, the program causing the device to execute: a first function of detecting a first curve, which is the first pattern projected onto the object in the two-dimensional image, and a second curve, which is the second pattern projected onto the object in the two-dimensional image, and calculating intersection coordinates, which are the coordinates of the intersections of the first curve and the second curve; a second function of determining, from the intersection coordinates, the parameters of the first light projecting means and the second light projecting means, and the parameters of the photographing means, a first correspondence between the first curve and the first pattern and a second correspondence between the second curve and the second pattern; and a third function of restoring the three-dimensional shape by calculating, from the first correspondence, the second correspondence, or both, the three-dimensional coordinates of the portion of the object irradiated with the first pattern light and the second pattern light.
PCT/JP2011/002561 2010-05-17 2011-05-09 Image processing device, image processing method and program WO2011145285A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010112753A JP2011242183A (en) 2010-05-17 2010-05-17 Image processing device, image processing method, and program
JP2010-112753 2010-05-17

Publications (1)

Publication Number Publication Date
WO2011145285A1 true WO2011145285A1 (en) 2011-11-24

Family

ID=44991408

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2011/002561 WO2011145285A1 (en) 2010-05-17 2011-05-09 Image processing device, image processing method and program

Country Status (2)

Country Link
JP (1) JP2011242183A (en)
WO (1) WO2011145285A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013187203A1 (en) * 2012-06-12 2013-12-19 株式会社島精機製作所 Three-dimensional measurement apparatus, and three-dimensional measurement method
US9441958B2 (en) 2013-08-12 2016-09-13 Ricoh Company, Ltd. Device, method, and non-transitory computer-readable recording medium to calculate a parameter for calibration

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6097903B2 (en) * 2011-07-15 2017-03-22 有限会社テクノドリーム二十一 Three-dimensional shape acquisition apparatus, processing method, and program
JP5547227B2 (en) * 2012-03-29 2014-07-09 株式会社デンソーアイティーラボラトリ Imaging device position / posture estimation system
WO2014020823A1 (en) * 2012-07-30 2014-02-06 独立行政法人産業技術総合研究所 Image processing system, and image processing method
JP6112807B2 (en) * 2012-09-11 2017-04-12 株式会社キーエンス Shape measuring device, shape measuring method, and shape measuring program
KR101479734B1 (en) * 2013-07-26 2015-01-06 전자부품연구원 3 Dimensional shape measuring system based structured light pattern
KR102085705B1 (en) * 2013-08-08 2020-03-06 엘지전자 주식회사 3 demensional camera
JP6717488B2 (en) * 2015-06-26 2020-07-01 国立大学法人 鹿児島大学 Projection system, projection method, pattern generation method and program
EP3561447B1 (en) * 2017-01-25 2023-11-22 National Institute of Advanced Industrial Science and Technology Image processing method
US10839536B2 (en) * 2018-10-02 2020-11-17 Facebook Technologies, Llc Depth sensing using grid light patterns
JP7187330B2 (en) * 2019-01-15 2022-12-12 株式会社小松製作所 Shape measuring device, shape measuring system, and shape measuring method
WO2021157196A1 (en) * 2020-02-04 2021-08-12 ソニーグループ株式会社 Information processing device, information processing method, and computer program

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5317169B2 (en) * 2008-06-13 2013-10-16 洋 川崎 Image processing apparatus, image processing method, and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61175511A (en) * 1985-01-31 1986-08-07 Goro Matsumoto Three-dimensional shape measuring apparatus
JPS63109308A (en) * 1986-10-27 1988-05-14 Sharp Corp Apparatus for inspecting mounting of chip component

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HIROSHI KAWASAKI ET AL.: "3D Reconstruction by Localizing Multiple Laser Planes Using Self-Calibration Method", THE TRANSACTIONS OF THE INSTITUTE OF ELECTRONICS, INFORMATION AND COMMUNICATION ENGINEERS D, vol. J90-D, no. 8, 1 August 2007 (2007-08-01), pages 1848 - 1857 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013187203A1 (en) * 2012-06-12 2013-12-19 株式会社島精機製作所 Three-dimensional measurement apparatus, and three-dimensional measurement method
JPWO2013187203A1 (en) * 2012-06-12 2016-02-04 株式会社島精機製作所 3D measuring device and 3D measuring method
US9441958B2 (en) 2013-08-12 2016-09-13 Ricoh Company, Ltd. Device, method, and non-transitory computer-readable recording medium to calculate a parameter for calibration

Also Published As

Publication number Publication date
JP2011242183A (en) 2011-12-01

Similar Documents

Publication Publication Date Title
WO2011145285A1 (en) Image processing device, image processing method and program
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
JP6426968B2 (en) INFORMATION PROCESSING APPARATUS AND METHOD THEREOF
Zhang Camera calibration with one-dimensional objects
US8452081B2 (en) Forming 3D models using multiple images
US8447099B2 (en) Forming 3D models using two images
JP5029618B2 (en) Three-dimensional shape measuring apparatus, method and program by pattern projection method
US8208029B2 (en) Method and system for calibrating camera with rectification homography of imaged parallelogram
Takimoto et al. 3D reconstruction and multiple point cloud registration using a low precision RGB-D sensor
JP6097903B2 (en) Three-dimensional shape acquisition apparatus, processing method, and program
Furukawa et al. One-shot entire shape acquisition method using multiple projectors and cameras
da Silveira et al. Dense 3D scene reconstruction from multiple spherical images for 3-DoF+ VR applications
WO2018006246A1 (en) Method for matching feature points of planar array of four-phase unit and measurement method on basis thereof
Gadasin et al. Reconstruction of a Three-Dimensional Scene from its Projections in Computer Vision Systems
JP6285686B2 (en) Parallax image generation device
Coudrin et al. An innovative hand-held vision-based digitizing system for 3D modelling
GB2569609A (en) Method and device for digital 3D reconstruction
US20200041262A1 (en) Method for performing calibration by using measured data without assumed calibration model and three-dimensional scanner calibration system for performing same
JP6671589B2 (en) Three-dimensional measurement system, three-dimensional measurement method, and three-dimensional measurement program
Bender et al. A Hand-held Laser Scanner based on Multi-camera Stereo-matching
Yan et al. Calibration of camera intrinsic parameters using a single image
Recker et al. Hybrid Photogrammetry Structure-from-Motion Systems for Scene Measurement and Analysis
Shin et al. Evaluation of close-range stereo matching algorithms using stereoscopic measurements
Alhwarin et al. Optimized KinectFusion Algorithm for 3D Scanning Applications.
Sagawa et al. Linear solution for oneshot active 3d reconstruction using two projectors

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11783224

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 11783224

Country of ref document: EP

Kind code of ref document: A1