WO2012144848A2 - Methods of three-dimensional recognition via the acquisition of a two-dimensional image consisting of multiple beams of transmitted light having a preset magnification angle - Google Patents


Info

Publication number
WO2012144848A2
Authority
WO
WIPO (PCT)
Prior art keywords
projection
acquisition
light
point
line
Prior art date
Application number
PCT/KR2012/003052
Other languages
French (fr)
Korean (ko)
Other versions
WO2012144848A3 (en)
Inventor
이태경 (Yi Tae Kyoung)
정제교 (Jung Je Kyo)
박철우 (Park Chul Woo)
김영준 (Kim Young Jun)
Original Assignee
Yi Tae Kyoung
Jung Je Kyo
Park Chul Woo
Kim Young Jun
Priority date
Filing date
Publication date
Application filed by Yi Tae Kyoung, Jung Je Kyo, Park Chul Woo, Kim Young Jun
Publication of WO2012144848A2 publication Critical patent/WO2012144848A2/en
Publication of WO2012144848A3 publication Critical patent/WO2012144848A3/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/02Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/254Image signal generators using stereoscopic image cameras in combination with electromagnetic radiation sources for illuminating objects

Definitions

  • the present invention projects a plurality of point light sources, each having a preset projection angle, into a space and captures an image of the projection points formed where each projected point light source reflects from the surface of an environment or object (hereinafter, 'object').
  • the present invention relates to a method of recognizing the three-dimensional characteristics of the object from the two-dimensional information obtained through a single light receiving unit that has a predetermined geometric relationship to the projection.
  • three-dimensional recognition is regarded as an essential element in the operation and autonomous functioning of automated devices and equipment.
  • GPS (Global Positioning System)
  • since a laser range finder can only detect relative distances on a two-dimensional plane, it has inherent limitations as a method for obtaining three-dimensional spatial information.
  • an object of the present invention, overcoming the above limitations of the prior art, is to provide a method capable of three-dimensional spatial recognition using only one two-dimensional image acquisition means.
  • a further object of the present invention is to provide a three-dimensional recognition method through two-dimensional image acquisition of multi-projection light that speeds up image processing and allows the image processing processor to be miniaturized, by eliminating the excessive computational burden of stereo vision, the conventional three-dimensional recognition technology.
  • a three-dimensional recognition method by two-dimensional image acquisition of multi-projection light includes: a first step of projecting a plurality of projection lights, each having a predetermined projection angle with respect to a center projection line, onto a three-dimensional object; a second step of acquiring the pattern of projection points formed on the three-dimensional object as a plurality of acquisition points, each having its own coordinate values, through the two-dimensional image detection surface of a light receiving sensor, with the center projection line and the center acquisition line lying on the same plane; a third step of determining a one-to-one correspondence between the plurality of projection lights and the acquisition points; and a fourth step of calculating each acquisition distance from the geometric relationship among the projection angle, the acquisition angle, and the focal length.
  • in the fourth step, the acquisition distance is calculated through Equations (1) and (2) below, where L is the acquisition distance, f is the focal length, α is the projection angle, and β is the acquisition angle.
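  • the equations themselves appear only as figures in the published document; a plausible reconstruction from the geometric definitions given later (projection focus Pf and light receiving focus Cf separated by the focal length f along the common center line, projection angle α, acquisition angle β, and the acquisition distance treated as the distance from the light receiving focus to the projection point) is

        L = f · sin α / sin(α − β)                  (1)
        tan β = |CP − CPc| / |Cf − CPc|             (2)

    where |CP − CPc| is the distance on the detection surface between the acquisition point and the center acquisition point, and |Cf − CPc| is the distance between the light receiving focus and the center acquisition point.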
  • the present invention further includes a step 4-1 of converting the acquisition distance into an origin distance through Equation (3) below.
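  • Equation (3) likewise appears only as a figure in the published document; taking the reference origin O on the center acquisition line at a distance d from the light receiving focus Cf on the object side, the second law of cosines applied to the triangle O–Cf–PP, with the acquisition angle β at Cf, gives the plausible form

        OD² = L² + d² − 2 · L · d · cos β           (3)

    which reduces to OD = L when the light receiving focus Cf itself is taken as the reference origin (d = 0).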
  • the angle formed between the center projection line and the center acquisition line, based on the center projection point, may be 20° or less.
  • the present invention may further include a fifth step of calculating the surface shape of the three-dimensional object by curve-fitting the origin distance information of the plurality of acquisition points.
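As an illustration of such a fifth step, the discrete origin distances can be interpolated into a continuous surface. The sketch below uses simple inverse-distance weighting in Python as a stand-in for the 3D surface fitting algorithm, which the text leaves unspecified; the function name and sample format are hypothetical:

```python
def idw_surface(samples, power=2.0):
    """Build a surface function z(x, y) from discrete distance samples.

    samples: list of (x, y, z) tuples, e.g. acquisition-point coordinates
    paired with their computed origin distances. Inverse-distance weighting
    stands in here for the 3D surface fitting the text refers to.
    """
    def z(x, y):
        num = den = 0.0
        for sx, sy, sz in samples:
            d2 = (x - sx) ** 2 + (y - sy) ** 2
            if d2 == 0.0:
                return sz  # query lies exactly on a sample point
            w = 1.0 / d2 ** (power / 2.0)
            num += w * sz
            den += w
        return num / den
    return z
```

In practice a spline or least-squares fit would likely be used; this only illustrates turning a finite set of projection-point distances into an approximate surface.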
  • the third step may be performed by measuring, at the acquisition points, the gradient characteristic of measurable property values imparted to the projection lights and correlating them with one another.
  • in the first step, the plurality of projection lights may form a projection pattern of equally spaced grid arrays or conformal concentric arrays based on the center projection line.
  • in the third step, the acquisition points may be matched one-to-one according to the arrangement order of the series of projection lights existing on a line extending radially about the center projection line.
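Since the acquisition points on one radial line only shift along that line while keeping their order, this matching reduces to sorting by radial distance. A minimal Python sketch under that assumption (function name and data layout are hypothetical; it handles one radial line of the pattern at a time):

```python
import math

def match_by_radial_order(projection_angles, acquisition_points, center=(0.0, 0.0)):
    """Match projection lights to acquisition points on one radial line.

    projection_angles: projection angles (PA) of the lights on this line,
        listed from innermost to outermost.
    acquisition_points: (x, y) sensor coordinates of the acquisition points
        (CP) detected on the same radial line, in any order.
    center: coordinates of the center acquisition point (CPc).
    Returns (projection_angle, acquisition_point) pairs, relying on the fact
    that the radial arrangement order is preserved.
    """
    cx, cy = center
    # Sort acquisition points by radial distance from the center acquisition point.
    ordered = sorted(acquisition_points,
                     key=lambda p: math.hypot(p[0] - cx, p[1] - cy))
    if len(ordered) != len(projection_angles):
        raise ValueError("acquisition point missing: counts differ")
    return list(zip(projection_angles, ordered))
```

The count check corresponds to step 2-1 of the text: a shortfall of acquisition points signals a lost point before any matching is attempted.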
  • the projection pattern may be made by projecting a single light source onto a diffraction grating having a preset dot pattern.
  • the present invention may further include a step 2-1 of comparing the number of projection light beams with the number of acquisition points to determine whether a missing acquisition point exists.
  • when a missing acquisition point exists, the method may further include a step 2-2 of adjusting the direction of the center projection line such that the number of acquisition points equals the number of projection lights.
  • alternatively, the method may further include a step 2-2′ of repeatedly determining the corresponding projection light for all the projection lights.
  • in step 2-2′, the light blocking unit simultaneously blocks a series of projection lights having the same radial distance from the center projection line among the projection lights of the projection pattern, and the correspondence is determined from the radial arrangement of the blocked projection lights.
  • the light blocking unit may sequentially block the series of projection lights having the largest radial distance or the series of projection lights having the smallest radial distance.
  • the light blocking unit may be a physical shutter, or an optical shutter such as a liquid crystal glass in which selected regions are rendered non-transmissive by an applied current.
  • the physical shutter may have a structure of a camera shutter.
  • the physical shutters may be a pair of shutters having "b" and "a" shapes and moving in a diagonal direction toward the center projection line.
  • the sequential blocking by the light blocking unit is performed in a preset pattern; when the light blocking unit is positioned in a preset positional relationship with respect to the projection focus and blocking is performed sequentially for each projection light, the projection angle of the projection light blocked immediately beforehand can be designated or confirmed at any point in time from the blocking phase with respect to the projection focus.
  • the projection light is preferably a laser light having a wavelength that is distinct from the wavelength of light existing around the three-dimensional object.
  • the present invention having the configuration as described above has the effect that can be three-dimensional recognition by only one two-dimensional image acquisition means.
  • the amount of information to be processed is very small, which has the advantage of speeding up image processing and allowing the image processing processor to be miniaturized.
  • the projection points can be recognized stably regardless of weak ambient illumination, which is one of the inherent problems of image acquisition means.
  • FIG. 1 is a diagram illustrating the basic principle of triangulation used in the present invention.
  • FIG. 2 is a diagram schematically showing a three-dimensional geometric relationship between a projection point and an acquisition point.
  • FIG. 3 is an orthogonal projection of the three-dimensional projection line and the acquisition line shown in FIG. 2 onto the XZ plane with respect to the reference origin.
  • FIG. 4 is a view schematically showing that a plurality of projection light is detected as an acquisition point on the two-dimensional image detection surface of the light receiving sensor after forming the projection point on the three-dimensional object.
  • FIG. 5 is a diagram of a first projection pattern in which a plurality of projection lines form an equally spaced grid array based on a center projection line.
  • FIG. 6 is a view of a second projection pattern in which a plurality of projection lines form a conformal concentric array based on a center projection line.
  • FIG. 7 schematically illustrates a configuration of forming a constant projection pattern using a diffraction grating.
  • FIG. 8 is a diagram illustrating a phenomenon in which a projection point is formed in an undercut portion of a three-dimensional object to lose an acquisition point.
  • FIG. 9 is a view showing a configuration of a light blocking portion for a projection pattern of a conformal concentric array.
  • FIG. 10 is a view for explaining the principle of determining the lost acquisition point through sequential blocking using the light blocking unit.
  • FIG. 11 is a diagram showing a configuration of a light blocking unit for a projection pattern of an equally spaced grid array.
  • FIG. 12 is a flowchart illustrating a three-dimensional recognition method through two-dimensional image acquisition of multi-projection light according to the present invention.
  • the present invention relates to a method of recognizing the three-dimensional surface shape of an object 10 through pattern analysis of the multiple projection lights shown in a two-dimensional image.
  • the two-dimensional image captures, in orthogonal projection, the pattern formed on the surface of the three-dimensional object 10.
  • each of the plurality of projected point light sources is called a projection light P, and the image formed where the point light source reaches the surface of the three-dimensional object 10 is called a 'projection point (PP)'.
  • since the pattern of projection points formed on the surface of the three-dimensional object 10 contains the surface information of the object 10, it can be intuitively understood that analyzing this pattern information makes it possible to inversely infer the three-dimensional surface shape of the object 10.
  • the method of inferring the three-dimensional surface shape of the object 10 from the pattern of projection points formed on its surface is to calculate the distance from the single point at which the two-dimensional pattern of projection points is observed (acquired) to each projection point. Once the distance to each projection point is known, a finite number of discrete coordinates describing the surface shape of the object 10 as seen from the observation point is obtained, and the approximate surface shape can be inferred by 3D surface fitting of these projection-point coordinates with a known algorithm.
  • the first problem concerns calculating the distance from the single point observing the two-dimensional pattern to each projection point PP; among the various difficulties, this can be overcome relatively simply through the principle of triangulation described later.
  • a given projection point may not be obtained at the observation point. If the loss of an acquisition point CP goes undetected, errors can occur in the one-to-one correspondence between the acquisition points CP and the projection lights P. The loss of an acquisition point CP may occur when a projection point is formed in an undercut portion of the object 10.
  • the basic principle of calculating the distance from one point of observation to the two-dimensional pattern of projection points is the principle of triangulation.
  • the principle of triangulation is a classical geometry-based surveying method: if an observer at position A knows the distances A–B and B–C to and between two other points B and C, the unknown distance A–C can be determined.
  • measuring the distances A–B and B–C in order to find the distance between points A and C is not only inefficient and indirect but may also be difficult; it would be more straightforward and easier to measure the distance between A and C directly. It may even be impossible to measure A–B or B–C directly because of the terrain.
  • the present invention also calculates the distance from one point of observing the two-dimensional pattern to each projection point (PP) through the combination of the above-described distance and angle measurement.
  • if we know the focal length between the foci Pf and Cf, the projection angle PA of each of the plurality of projection lights P, and the acquisition angle CA calculated from the coordinates of each acquisition point CP detected on the light receiving sensor 200, we can finally calculate the distance to the projection point PP.
  • the projection angle PA is easily determined because it is fixed by the optical design of the light transmitting part 100 that generates the plurality of projection lights P, but the acquisition angle CA must be found by calculation because it changes with the surface shape of the three-dimensional object 10.
  • the acquisition angle CA can be calculated by solving a trigonometric relation using the two-dimensional coordinates of the acquisition point CP formed on the light receiving sensor 200 and the known geometric relationship between the acquisition point CP and the light receiving focus Cf.
  • FIG. 2 schematically illustrates the projection point PP for the two projection lights P and the acquisition point CP formed on the two-dimensional image detection surface 210 of the light receiving sensor 200.
  • each point and virtual line, length, angle, etc. shown in FIG. 2 will be described as follows, and this definition will be maintained throughout the detailed description as well as in the claims unless otherwise specified.
  • the projection line PL refers to a traveling path through which the projection light P whose projection angle PA is finally adjusted (determined) is projected toward the object 10.
  • the projection angle PA is finally determined only after the intermediate processes of dividing a single light source into a plurality of lights to form the projection pattern and adjusting the projection magnification, because only then is the traveling path of the projection light P toward the object 10 fixed.
  • the center projection line PLc refers to the one projection line PL that serves as the reference among the plurality of projection lines PL. Although it may be chosen arbitrarily, it is convenient to take as the center projection line PLc the projection line PL whose straightness is unchanged even after the single light source is divided into multiple lights and the projection magnification is adjusted. Accordingly, the angle that each projection line (other than the center projection line) forms with the center projection line PLc is defined as the projection angle PA.
  • the projection point PP corresponding to the center projection line PLc is referred to as the center projection point PPc.
  • the projection focus Pf means the point at which all the projection lines PL converge; it does not necessarily coincide with the single light source.
  • although a projection focus Pf could be set for each projection line PL according to the projection pattern and projection magnification, it is obviously simpler and more convenient to merge the projection foci into one.
  • the acquisition line CL is an imaginary extension line connecting the projection point PP formed on the three-dimensional object 10 with the light receiving focus Cf, and the intersection that the acquisition line CL forms as it passes through the two-dimensional image detection surface 210 of the light receiving sensor 200 is called the acquisition point CP. The acquisition point CP therefore has a coordinate value on the two-dimensional image detection surface 210; as described later, this coordinate value is used to calculate the distance to each projection point PP.
  • the light receiving focus Cf means a point at which all the acquisition lines CL converge.
  • the projection focus Pf and the light receiving focus Cf are arranged on a straight line along the center projection line PLc. Accordingly, since all the acquisition lines CL converge at the light receiving focus Cf, the center acquisition line CLc, which returns to the light receiving focus Cf after the center projection line PLc forms its projection point PP, coincides with the center projection line PLc.
  • the acquisition point CP formed by the center acquisition line CLc is defined as the center acquisition point CPc, and the acquisition angle CA, analogously to the projection angle PA, is defined as the angle that each acquisition line CL (other than the center acquisition line) forms with the center acquisition line CLc.
  • the straight-line distance between the projection focus Pf and the light receiving focus Cf is defined as the focal length f.
  • the projection focus Pf and the light reception focal point Cf can be produced by a conventional optical apparatus using a lens.
  • the reference origin O is a three-dimensional coordinate origin existing on the central acquisition line CLc or on an imaginary line extending from the central acquisition point CPc, and refers to a reference point recognized by a device in which the present invention is implemented. Therefore, unless otherwise mentioned below, the distance to the three-dimensional object 10 should be understood as referring to the origin distance OD which is the distance between the reference origin O and the projection point PP.
  • the light receiving focus Cf may be set as the reference origin O; in describing an embodiment of the present invention, the coordinate axis directed from the reference origin O toward the center acquisition point CPc is taken as the Z axis.
  • the initially calculated value is the acquisition distance CD, the distance between the projection point PP and the acquisition point CP, rather than the origin distance OD, the distance between the reference origin O and the projection point PP.
  • the conversion from the acquisition distance CD to the origin distance OD is a simple calculation and can be done easily.
  • a relational formula for calculating the origin distance OD is derived from the geometric relations described below.
  • FIG. 3 shows the three-dimensional projection line PL and acquisition line CL of FIG. 2 projected onto the XZ plane with respect to the reference origin O, with all distances and angles represented by their orthogonal projections onto the XZ plane. Accordingly, each distance and angle should in principle carry the qualifier 'XZ orthogonal projection', but this qualifier is omitted for convenience.
  • in Equation (1), L denotes the acquisition distance, f the focal length, α the projection angle PA, and β the acquisition angle CA.
  • the focal length f and the projection angle PA are known values, and the acquisition angle CA is calculated through Equation (2) below as the tangent relation between the 'distance between the acquisition point CP and the center acquisition point CPc' and the 'distance between the light receiving focus Cf and the center acquisition point CPc'.
  • the distance between the acquisition point CP and the central acquisition point CPc may be calculated by the coordinate value of each acquisition point CP.
  • the acquisition distance CD is obtained from the calculation of a simple trigonometric function.
  • the origin distance OD can be obtained from Equation (3), which applies the second law of cosines.
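Numerically, the chain from sensor coordinates to origin distance can be sketched as follows. Because Equations (1)–(3) appear only as figures in the published document, the formulas below are reconstructions: a law-of-sines form for Equation (1), the tangent relation described for Equation (2), and the law of cosines for Equation (3). Angles are in radians and all names are hypothetical:

```python
import math

def acquisition_angle(cp, cpc, sensor_distance):
    """Equation (2), solved for beta: tan(beta) = |CP - CPc| / |Cf - CPc|.

    cp, cpc: sensor coordinates of the acquisition point and the center
    acquisition point; sensor_distance: distance from the light receiving
    focus Cf to the center acquisition point CPc.
    """
    d = math.hypot(cp[0] - cpc[0], cp[1] - cpc[1])
    return math.atan2(d, sensor_distance)

def acquisition_distance(f, alpha, beta):
    """Equation (1), law-of-sines form: distance from the light receiving
    focus Cf to the projection point PP, from the focal length f, the
    projection angle alpha and the acquisition angle beta (alpha > beta)."""
    return f * math.sin(alpha) / math.sin(alpha - beta)

def origin_distance(L, beta, d_origin):
    """Equation (3), law of cosines: distance from a reference origin O
    lying on the center acquisition line at d_origin from Cf."""
    return math.sqrt(L * L + d_origin * d_origin
                     - 2.0 * L * d_origin * math.cos(beta))
```

As a consistency check: placing Cf at the origin and Pf at z = f = 0.1, a projection point at (x, z) = (0.2, 1.0) gives alpha = atan2(0.2, 0.9) and beta = atan2(0.2, 1.0), and `acquisition_distance` recovers the true distance sqrt(0.2² + 1.0²) from Cf to the point.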
  • FIG. 4 schematically shows the plurality of projection lights P detected as acquisition points CP on the two-dimensional image detection surface 210 of the light receiving sensor 200 after forming projection points PP on the three-dimensional object 10; it clearly shows that by performing the above series of calculations for each acquisition point CP to obtain each acquisition distance CD (convertible to the origin distance), the surface shape of the three-dimensional object 10 can be approximately calculated.
  • the two-dimensional image detection surface 210 of the light receiving sensor 200 is disposed perpendicular to the plane where the center projection line PLc and the center acquisition line CLc are located.
  • an embodiment of the present invention includes a configuration in which the center projection line PLc and the center acquisition line CLc do not coincide but lie on at least one common plane; referring to FIG. 3, the light receiving focus Cf has a height difference from the projection focus Pf along the Y axis perpendicular to the ground.
  • even so, the accuracy is about 94%, that is, the error is about 6%, so a fairly accurate value can be obtained.
  • since the distance to the object is approximately three times the spacing of the light receiving focus Cf, an error level of about 6% can be maintained geometrically. That is, even if the light receiving focus Cf is spaced about 10 cm apart, the surface contour of a three-dimensional object 10 more than 28 cm away can be calculated with considerable precision, which is a practically meaningful result.
  • to overcome the sensitivity to ambient illumination, which is one of the inherent problems of image acquisition means, the projection light P preferably has a specific wavelength distinct from the wavelengths of the light around the three-dimensional object 10.
  • for example, projection light P with a wavelength in the infrared band, distinct from the visible band, can be applied.
  • the calculation of the distance from one point of observing the two-dimensional pattern of the projection points PP, in particular, the reference origin O to each projection point PP can be solved using the principle of triangulation.
  • the next problem to be solved is to determine which projection light P corresponds to each of the plurality of acquisition points CP present in the 2D image.
  • a gradient characteristic is given to physically measurable properties of the plurality of projection lights P, such as wavelength (frequency) or output. That is, each projection light P is given a physical property distinguishing it from the other projection lights P; measuring that property at each acquisition point CP and associating it with the corresponding projection light P determines the one-to-one correspondence between the projection lights P and the acquisition points CP.
  • this method has the disadvantage of structural complexity, since each of the plurality of projection lights P must be given a meaningful gradient characteristic and that characteristic must then be detected again.
  • moreover, the physical properties of the projection light P may be altered while being scattered by, or passing through, heterogeneous media (e.g., water droplets or dust in the air). A simpler and more reliable alternative is therefore needed.
  • FIGS. 5 and 6 show projection patterns of the multiple projection lights P in an embodiment of the present invention.
  • FIG. 5 shows a first projection pattern 112 in which the plurality of projection lines PL form an equally spaced grid array (equal spacing along the X and Y axes) based on the center projection line PLc, and FIG. 6 shows a second projection pattern 114 in which they form a conformal concentric array based on the center projection line PLc.
  • each acquisition point CP moves only along the line connecting its corresponding projection line PL with the center projection line PLc in the radial direction. That is, as shown in FIGS. 5 and 6, the acquisition point CP corresponding to each projection line PL, indicated by a large circle, changes position only along a line extending radially about the center projection point PPc. This occurs because the center projection line PLc and the center acquisition line CLc coincide.
  • the one-to-one correspondence between the plurality of projection lights P and the acquisition point CP is very simple.
  • the series of projection lights P existing on a line extending radially from the center projection line PLc only shift position little by little, and their arrangement order is maintained; therefore, all that is needed is to match the projection lights P and the acquisition points one-to-one along each such line.
  • since the preservation of arrangement order holds only along lines extending radially from the center projection line PLc, the second projection pattern 114, which forms the conformal concentric array of FIG. 6, is more convenient.
  • the first and second projection patterns 112 and 114 may be formed using a diffraction grating 110 as shown in FIG. 7.
  • when a dot pattern is formed on the diffraction grating 110 by etching or laser processing, a plurality of projection lights P with a fixed pattern, such as the first/second projection patterns 112 and 114 defined by the dot pattern, is formed.
  • the projection angle PA of each projection light P has a predetermined value according to the design of the dot pattern.
  • the light source adopts a laser light source having almost no light spread and excellent straightness.
  • the loss of an acquisition point CP may occur when the projection point PP is formed in an undercut portion of the object 10, an example of which is illustrated in FIG. 8. Because the projection focus Pf and the light receiving focus Cf are arranged on a straight line, the projection focus Pf must lie ahead of the light receiving sensor 200 in the projection direction, so the projection angle PA is larger than the acquisition angle CA.
  • whether an acquisition point CP has been lost can be determined by checking whether the number of acquisition points CP falls short of the number of projection lights P.
  • by adjusting the direction of the center projection line, the number of acquisition points CP can be made equal to the number of projection lights P, thereby avoiding the loss of acquisition points CP.
  • alternatively, by determining which projection light P has lost its corresponding acquisition point, the lost acquisition point (loss point) can be taken into account when calculating the origin distance OD.
  • this utilizes the property that the series of projection lights P existing on a line extending radially about the center projection line PLc only shift position little by little while maintaining their arrangement order.
  • when the light blocking unit 120 blocks the outermost projection lights P of the projection pattern (FIG. 9 takes the conformal concentric array of FIG. 6 as an example), that is, the series of projection lights farthest from the center projection line PLc, the acquisition points CP corresponding to the blocked projection lights P disappear from the light receiving sensor 200; by observing the change in the acquisition points CP before and after blocking, it is possible to know which projection light P corresponds to a lost acquisition point CP.
  • the second projection pattern 114 and its acquisition point CP will be described in more detail with reference to FIG. 10.
  • the two-digit subscripts attached to the projection lights P and the acquisition points CP are given to distinguish them: the first digit indexes the radially extending lines in counterclockwise order about the center projection line PLc, and the second digit is numbered along the direction away from the center projection line PLc (outward direction).
  • each projection light P and acquisition point CP is thus labeled 01 … 04, 11 … 14, …, 41 … 44.
  • the light blocking unit 120 sequentially covers, in the circumferential direction, the projection lights P whose second digit is the same.
  • the vanishing-point determination is performed sequentially on the projection lights P of the same circumference, and it does not matter whether the projection lights P already examined remain blocked at the next determination or are unblocked. Likewise, it does not matter whether blocking starts from the outermost projection lights P or from those nearest the center projection line PLc; this is merely the difference between sequential opening and sequential blocking, with no difference in effect.
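The counting logic behind this sequential blocking can be sketched briefly: each circumferential group ("ring") of projection lights is blocked together, and any ring in which fewer acquisition points vanish than lights were blocked must contain a lost point. A hypothetical Python sketch (names and data layout assumed, not taken from the patent):

```python
def rings_with_loss(ring_sizes, vanished_counts):
    """Locate lost acquisition points by ring during sequential blocking.

    ring_sizes: dict mapping ring index -> number of projection lights the
        light blocking unit covers together (same radial distance).
    vanished_counts: dict mapping ring index -> number of acquisition points
        that disappeared from the sensor while that ring was blocked.
    Returns the ring indices in which an acquisition point was lost (e.g. its
    projection point fell into an undercut of the object).
    """
    return [ring for ring, n_lights in ring_sizes.items()
            if vanished_counts.get(ring, 0) < n_lights]
```

Within a flagged ring, the counterclockwise arrangement order of the remaining acquisition points then identifies which individual projection light lost its point, per the order-preservation property described above.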
  • the sequential blocking or opening described above may also be used as an auxiliary means of designating the projection angle PA of a projection line PL. That is, when the light blocking unit 120 is positioned in a preset positional relationship with respect to the projection focus Pf and the projection lights P are blocked or opened sequentially in a preset pattern, then if the blocking phase with respect to the projection focus Pf is known at any moment, the projection angle PA of the projection line PL just opened or blocked is also known. This is possible because the projection pattern of the projection lights P is designed in advance, and it can serve as an auxiliary means (designating or confirming the projection angle) when the disappearance of a projection point PP causes a problem in the one-to-one correspondence of the preset projection angles PA.
  • the light blocking unit 120 may preferably adopt the same structure as a general camera shutter, and the first projection pattern 112 having the equal interval grid array. As shown in FIG. 11, a pair of shutters having "b" and “a” shapes and moving along the diagonal direction toward the center projection line PLc may be used.
  • the light shielding for vanishing point determination may be applied not only to this physical shutter method but also to an optical shutter such as the liquid crystal glass 124.
  • the liquid crystal glass 124 is installed as a light transmission window disposed on the optical path of the projection light P, and controls a current applied to the liquid crystal glass 124 to sequentially convert some regions into non-transmissive light.
  • the function of the light blocking unit 120 may be implemented.
  • Such an optical shutter has advantages in that it is structurally simple, does not generate mechanical vibration, and relatively freely designs a part that does not transmit light, compared to a physical shutter.
  • the foregoing has described how the three-dimensional surface shape of the object 10 is recognized from the two-dimensional image of the projection points PP formed on the surface of the three-dimensional object 10 by the multi-projection light having the preset magnification angle, that is, light in which the projection angle PA of each projection light P is predefined.
  • the present invention can be summarized in the following series of processes, which are shown as a flowchart in FIG. 12.
  • the three-dimensional recognition method through two-dimensional image acquisition of multi-projection light according to the present invention is implemented largely through four steps.
  • the first step is a step of projecting, onto the three-dimensional object 10, a plurality of projection lights P having preset projection angles PA with respect to the center projection line PLc.
  • the plurality of projection lights P may be formed by projecting a single light source onto the diffraction grating 110 having a preset dot pattern so that it is separated into a plurality of lights; in particular, the diffraction grating 110 may be designed so that these form a projection pattern of an equally spaced grid array or an equiangular concentric array with respect to the center projection line PLc.
  • the second step is a step of acquiring the pattern of the projection points PP formed on the three-dimensional object 10, with the center acquisition line CLc positioned on the same plane as the center projection line PLc, as a plurality of acquisition points CP whose coordinate values are known, through the two-dimensional image detection surface 210 of the light receiving sensor 200. Even if the center projection line PLc and the center acquisition line CLc located on the same plane do not coincide, an accuracy of about 94% can be expected if the angle formed between them, measured at the center projection point PPc, is set to 20° or less.
  • the fourth step calculates each acquisition distance CD from the geometric relationship among the projection angle PA, the acquisition angle CA of the acquisition point CP, and the focal length, using Equations (1) and (2) described above; a further step 4-1 of converting the calculated acquisition distance CD into the origin distance OD through Equation (3) may also be performed.
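Equations (1)–(3) are embedded as images in the published text, so their exact forms are not recoverable here. The sketch below assumes the standard active-triangulation (law-of-sines) form consistent with the stated variables — L the acquisition distance, f the focal length (baseline between Pf and Cf), α the projection angle, β the acquisition angle — as an illustrative reconstruction, not the patent's literal equations:

```python
import math

def acquisition_distance(f, alpha, beta):
    """Assumed law-of-sines triangulation: the baseline f between the
    projection focus Pf and the receiving focus Cf subtends the angle
    pi - alpha - beta at the projection point PP."""
    return f * math.sin(alpha) / math.sin(alpha + beta)

def origin_distance(L, beta):
    """Assumed conversion of the acquisition distance L into the
    perpendicular (origin) distance D via the acquisition angle beta."""
    return L * math.sin(beta)
```

For example, with f = 1 and α = β = 45°, L comes out to √2/2 and D to 0.5, matching the geometry of an isosceles right triangle.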
  • the third step may determine the one-to-one correspondence between each projection light P and acquisition point CP by measuring, at the acquisition points CP, the gradient characteristics of measurable physical properties imparted to the projection lights P, and matching them.
  • alternatively, the third step may match the acquisition points CP one-to-one according to the arrangement order of the series of projection lights P lying on a line extending radially from the center projection line PLc.
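A minimal sketch of this radial-order matching (a hypothetical helper: it assumes the image centre coincides with the central acquisition point and that the radial lines are equally spaced in angle, starting at angle zero):

```python
import math

def order_acquisition_points(points, n_lines):
    """Group 2-D acquisition points into n_lines radial lines about the
    image centre and sort each group by distance from the centre, so that
    each slot pairs with the projection light of the same (line, ring)
    index in the designed projection pattern."""
    groups = [[] for _ in range(n_lines)]
    step = 2 * math.pi / n_lines
    for x, y in points:
        ang = math.atan2(y, x) % (2 * math.pi)
        line = int(round(ang / step)) % n_lines  # nearest radial line
        groups[line].append((x, y))
    for g in groups:
        g.sort(key=lambda p: math.hypot(p[0], p[1]))  # inner ring first
    return groups
```

Each sorted group then corresponds, in order, to the series of projection lights on that radial line.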
  • the present invention may further include a step 2-1 of comparing the number of projection lights P with the number of acquisition points CP to determine whether any acquisition point CP has been lost; if it is determined in step 2-1 that a lost acquisition point CP exists, it may further include a step 2-2 of adjusting the direction of the center projection line PLc so that the number of acquisition points CP becomes equal to the number of projection lights P.
  • alternatively, a step 2-2' may be applied, in which part of the plurality of projection lights P is blocked by the light blocking unit 120 and the projection light P corresponding to the lost acquisition point CP is determined by checking whether any acquisition point CP fails to disappear, this process being repeated for all the projection lights P.
  • step 2-2' simultaneously blocks, with the light blocking unit 120, a series of projection lights P having the same radial distance from the center projection line PLc among the projection lights P of the projection pattern, and determines the projection light P corresponding to the lost acquisition point CP by checking whether any acquisition point CP corresponding to the radial arrangement order of the blocked projection lights P fails to disappear, repeating this process for all the projection lights P.
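Step 2-2' can be sketched as the following simulation (hypothetical names throughout: `observe` stands in for a real capture of the currently visible acquisition points, and in this toy model an acquisition point carries the index of its projection light):

```python
def find_lost_projections(rings, observe):
    """Sequentially block each ring (projections sharing one radial
    distance) and report the projections whose acquisition points were
    already missing - e.g. because they landed on an undercut."""
    lost = []
    blocked = set()
    prev_visible = observe(blocked)
    for ring in rings:                        # e.g. outermost ring first
        blocked |= set(ring)
        now_visible = observe(blocked)
        vanished = prev_visible - now_visible
        # Projections in this ring whose point did NOT vanish at this step
        # were never visible to begin with: their points are the lost ones.
        lost.extend(sorted(set(ring) - vanished))
        prev_visible = now_visible
    return lost

# Toy scene: projection "23" hits an undercut and never produces a point.
ALL = {"13", "14", "23", "24"}
UNDERCUT = {"23"}
observe = lambda blocked: ALL - UNDERCUT - blocked
```

Running `find_lost_projections([["14", "24"], ["13", "23"]], observe)` isolates `"23"` as the projection whose acquisition point was lost.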
  • the light blocking unit 120 may start blocking sequentially from the series of projection lights P having the largest radial distance or from the series having the smallest radial distance.
  • the light blocking unit may be a physical shutter, or an optical shutter in the form of the liquid crystal glass 124 in which some regions are converted to be non-transmissive to light by the application of a current.
  • the physical shutter may adopt the structure of a camera shutter; if the projection pattern is an equally spaced grid array, it may be configured as a pair of shutters 122 shaped like "ㄴ" and "ㄱ" that move in a diagonal direction toward the center projection line PLc.
  • the sequential light blocking described above for determining lost acquisition points CP can also be used as an auxiliary means for specifying the projection angle PA.
  • this is because, when the light blocking unit 120 is positioned in a preset positional relationship with respect to the projection focus Pf and its sequential blocking is performed in a preset pattern, the projection angle PA of the projection light P blocked immediately before any given moment can be known from the blocking phase with respect to the projection focus Pf at that moment.
  • the present invention not only calculates the origin distance OD information for the plurality of acquisition points CP to grasp the spatial relationship with the three-dimensional object 10, but may also calculate the surface shape of the three-dimensional object 10 by surface-fitting the origin distance OD information of the plurality of acquisition points CP, which is performed in a separate fifth step.
  • it thus becomes possible to grasp, from the two-dimensional pattern of the acquisition points captured on the image detection surface of the light receiving sensor, the approximate shape of the projection points formed by the plurality of projection lights on the surface of the three-dimensional object.
  • the present invention can be usefully applied in a variety of fields requiring three-dimensional recognition, from toys, cars, and robots used in everyday life to unmanned aerial vehicles, missiles, and other craft that must recognize the three-dimensional characteristics of a search target.

Abstract

The present invention relates to a method which involves: projecting multiple point sources of light having preset projection angles into a space; acquiring, as two-dimensional information, images of the projection points created by the reflection of each point source of light from the ambient environment or from the surface of an object, via a single light-receiving unit having a preset geometric relationship to the unit projecting the point sources of light; and recognizing the three-dimensional characteristics of the ambient environment or of the object based on said two-dimensional information.

Description

3D Recognition Method through 2D Image Acquisition of Multiple Projection Lights with a Preset Magnification Angle
The present invention relates to a method of projecting a plurality of point light sources having preset projection angles into a space, acquiring, as two-dimensional information, images of the projection points created when each projected point light source is reflected from the surrounding environment or from the surface of an object (hereinafter, the object), through a single light receiving unit having a preset geometric relationship to the unit projecting the point light sources, and recognizing the three-dimensional characteristics of the object based on this two-dimensional information.
In general, three-dimensional recognition is regarded as an essential element in the operation of automated devices and in their autonomous functioning. The fields requiring three-dimensional recognition are diverse, ranging from toys, cars, and robots used in everyday life to unmanned aerial vehicles and missiles.
With the development of the Global Positioning System (GPS), technology for determining three-dimensional coordinates at an arbitrary point in an open area has been commercialized; however, no standardized recognition method has been established for spatial recognition in GPS-jamming environments or indoor environments where GPS cannot be used. At present, stereo vision using two cameras and indoor-space recognition using a laser range finder have been introduced.
However, although stereo vision permits three-dimensional recognition and the discrimination of specific objects, the extraction of the feature points needed to register the images acquired by the two cameras is very sensitive to changes in ambient illumination, and because shape-based patterns must be recognized, the computational load is very large compared with algorithms that merely detect colors or edges. In addition, because two cameras are used, the system becomes bulky, the cameras must be aligned very precisely, and the two images must additionally be calibrated in software.
Furthermore, since a laser range finder can determine only relative distances on a two-dimensional plane, it has inherent limitations as a means of obtaining three-dimensional spatial information.
Therefore, there is a need for an efficient method that enables three-dimensional recognition with a simple hardware configuration based on a simple algorithm. If such a method were developed, it could be applied to indoor three-dimensional spatial recognition (for example, cleaning robots or motion-recognition interfaces for game consoles) and is expected to find wide use in the many fields requiring three-dimensional spatial recognition, up to military applications such as small unmanned aerial vehicles capable of performing independent missions unaffected by GPS jamming.
An object of the present invention, made to overcome the above limitations of the prior art, is to provide a method capable of three-dimensional spatial recognition using only a single two-dimensional image acquisition means.
In particular, an object of the present invention is to provide a three-dimensional recognition method through two-dimensional image acquisition of multi-projection light that can speed up image processing and miniaturize the image processing processor by eliminating the excessive computational burden of stereo vision, the conventional three-dimensional recognition technology.
The three-dimensional recognition method through two-dimensional image acquisition of multi-projection light according to the present invention comprises: a first step of projecting, onto a three-dimensional object, a plurality of projection lights whose projection angles with respect to a center projection line are set in advance; a second step of acquiring, with the center acquisition line positioned on the same plane as the center projection line, the pattern of projection points formed on the three-dimensional object as a plurality of acquisition points whose coordinate values are known, through the two-dimensional image detection surface of a light receiving sensor; a third step of determining the one-to-one correspondence between the plurality of projection lights and the acquisition points; and a fourth step of calculating each acquisition distance from the geometric relationship among the projection angle, the acquisition angle of the acquisition point, and the focal length.
Here, in the fourth step, the acquisition distance is calculated through Equations (1) and (2) below.
Figure PCTKR2012003052-appb-I000001
...... Equation (1)
Figure PCTKR2012003052-appb-I000002
...... Equation (2)
where L is the acquisition distance, f is the focal length, α is the projection angle, and β is the acquisition angle.
The present invention further includes a step 4-1 of converting the acquisition distance into the origin distance through Equation (3) below.
Figure PCTKR2012003052-appb-I000003
...... Equation (3)
where D is the origin distance.
When the center projection line and the center acquisition line do not coincide in the second step, the angle formed between them, measured at the center projection point, may be 20° or less.
The present invention may further comprise a fifth step of calculating the surface shape of the three-dimensional object by surface-fitting the origin distance information of the plurality of acquisition points.
Meanwhile, the third step may be performed by measuring, at the acquisition points, the gradient characteristics of measurable physical properties imparted to the projection lights, and matching them to one another.
In the first step, the plurality of projection lights may form a projection pattern of an equally spaced grid array or an equiangular concentric array with respect to the center projection line, and in the third step the acquisition points may be matched one-to-one according to the arrangement order of the series of projection lights lying on a line extending radially from the center projection line.
Here, the projection pattern may be created by projecting a single light source onto a diffraction grating having a preset dot pattern.
Meanwhile, the present invention may further comprise a step 2-1 of comparing the number of projection lights with the number of acquisition points to determine whether any acquisition point has been lost.
If it is determined in step 2-1 that a lost acquisition point exists, the method may further comprise a step 2-2 of adjusting the direction of the center projection line so that the number of acquisition points becomes equal to the number of projection lights.
Alternatively, if it is determined in step 2-1 that a lost acquisition point exists, the method may further comprise a step 2-2' of blocking part of the plurality of projection lights with a light blocking unit and determining the projection light corresponding to the lost acquisition point by checking whether any acquisition point fails to disappear, this process being repeated for all the projection lights.
Here, step 2-2' may simultaneously block, with the light blocking unit, a series of projection lights having the same radial distance from the center projection line among the projection lights of the projection pattern, and determine the projection light corresponding to the lost acquisition point by checking whether any acquisition point corresponding to the radial arrangement order of the blocked projection lights fails to disappear, repeating this process for all the projection lights.
In particular, the light blocking unit may start sequential blocking from the series of projection lights having the largest radial distance or from the series having the smallest radial distance.
The light blocking unit may be a physical shutter, or an optical shutter in the form of a liquid crystal glass in which some regions are converted to be non-transmissive to light by the application of a current.
When the projection pattern is an equiangular concentric array, the physical shutter may have the structure of a camera shutter.
When the projection pattern is an equally spaced grid array, the physical shutter may be a pair of shutters shaped like "ㄴ" and "ㄱ" that move diagonally toward the center projection line.
Meanwhile, the sequential blocking of the light blocking unit is performed in a preset pattern; when the light blocking unit is positioned in a preset positional relationship with respect to the projection focus and sequential blocking is performed on each projection light, the projection angle of the projection light blocked immediately before any given moment can be designated or confirmed from the blocking phase relative to the projection focus at that moment.
In addition, the projection light is preferably laser light having a wavelength distinguishable from the wavelengths of light present around the three-dimensional object.
The present invention configured as above has the effect of enabling three-dimensional recognition with only a single two-dimensional image acquisition means.
In addition, since three-dimensional recognition is performed by identifying the projection points through simple filtering, without complex image processing of the two-dimensional image, the amount of information to be processed is very small, so that image processing can be accelerated and the image processing processor miniaturized.
Furthermore, by giving the projection light source a specific wavelength distinguishable from the wavelengths radiated around the object, the projection points can be recognized stably regardless of the sensitivity to ambient illumination that is one of the inherent problems of image acquisition means.
FIG. 1 illustrates the basic principle of triangulation used in the present invention.
FIG. 2 schematically shows the three-dimensional geometric relationship between a projection point and an acquisition point.
FIG. 3 shows the three-dimensional projection line and acquisition line of FIG. 2 orthogonally projected onto the XZ plane with respect to the reference origin.
FIG. 4 schematically shows a plurality of projection lights forming projection points on a three-dimensional object and then being detected as acquisition points on the two-dimensional image detection surface of the light receiving sensor.
FIG. 5 shows a first projection pattern in which a plurality of projection lines form an equally spaced grid array about the center projection line.
FIG. 6 shows a second projection pattern in which a plurality of projection lines form an equiangular concentric array about the center projection line.
FIG. 7 schematically shows a configuration that forms a fixed projection pattern using a diffraction grating.
FIG. 8 shows the phenomenon in which a projection point formed on an undercut portion of a three-dimensional object causes the acquisition point to be lost.
FIG. 9 shows the configuration of the light blocking unit for a projection pattern of an equiangular concentric array.
FIG. 10 illustrates the principle of determining a lost acquisition point through sequential blocking using the light blocking unit.
FIG. 11 shows the configuration of the light blocking unit for a projection pattern of an equally spaced grid array.
FIG. 12 is a flowchart of the three-dimensional recognition method through two-dimensional image acquisition of multi-projection light according to the present invention.
Hereinafter, an embodiment of the present invention will be described in detail with reference to the accompanying drawings.
In describing an embodiment of the present invention, descriptions of well-known configurations that would be obvious to those skilled in the art are omitted so as not to obscure the subject matter of the present invention.
In addition, when referring to the drawings, it should be borne in mind that the thicknesses of the lines and the relative sizes of the components may be exaggerated for clarity and convenience of understanding.
The present invention relates to a method of acquiring, as a two-dimensional image, the projection image formed on the surface of a three-dimensional object 10 by multi-projection light having a preset magnification angle, and then recognizing the three-dimensional surface shape of the object 10 through pattern analysis of the multi-projection light appearing in this two-dimensional image.
To explain in more detail, when a plurality of point light sources are projected and reach the surface of the three-dimensional object 10 to form images, the two-dimensional orthographic image of these images will differ from the original projection pattern according to the surface shape of the three-dimensional object 10 (hereinafter, the plurality of projected point light sources are referred to as 'projection lights (P)', and the image formed by a point light source reaching the surface of the three-dimensional object 10 as a 'projection point (PP)').
Accordingly, since the pattern of projection points formed on the surface of the three-dimensional object 10 contains the surface information of the object 10, it can be intuitively understood that analyzing this pattern information should conversely make it possible to infer the three-dimensional surface shape of the object 10.
The method of inferring the three-dimensional surface shape of the object 10 from the pattern of projection points formed on its surface is to calculate the distance from the single point at which the two-dimensional pattern of projection points is observed (acquired) to each projection point. Once the distance to each projection point is known, a finite number (the number of projection points) of discrete coordinates describing the surface shape of the object 10 as seen from the observation point is obtained, and the approximate surface shape can be inferred by applying a known 3D surface fitting algorithm to these coordinates.
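The fitting step can be illustrated with the simplest possible case: a least-squares plane z = ax + by + c solved through the normal equations. This is only a sketch of the idea (a real system would use a spline- or mesh-based 3D surface fit over the discrete projection-point coordinates):

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to 3-D points via the
    3x3 normal equations, solved by Gaussian elimination."""
    # Accumulate the normal-equation matrix M and right-hand side v.
    M = [[0.0] * 3 for _ in range(3)]
    v = [0.0] * 3
    for x, y, z in points:
        row = (x, y, 1.0)
        for i in range(3):
            for j in range(3):
                M[i][j] += row[i] * row[j]
            v[i] += row[i] * z
    # Forward elimination with partial pivoting.
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        v[col], v[pivot] = v[pivot], v[col]
        for r in range(col + 1, 3):
            factor = M[r][col] / M[col][col]
            for j in range(col, 3):
                M[r][j] -= factor * M[col][j]
            v[r] -= factor * v[col]
    # Back substitution.
    coeffs = [0.0] * 3
    for r in (2, 1, 0):
        coeffs[r] = (v[r] - sum(M[r][j] * coeffs[j]
                                for j in range(r + 1, 3))) / M[r][r]
    return coeffs  # [a, b, c]
```

Given a handful of recovered point coordinates, the same accumulation-and-solve pattern generalizes to higher-order surface models.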
The above is the most basic outline of the present invention, but implementing it in practice involves several technical difficulties to be solved.
The first concerns the method of calculating the distance from the single point at which the two-dimensional pattern of projection points is observed to each projection point PP. Among the various difficulties, this can be overcome relatively simply by the principle of triangulation described below.
The second is that, when the projection points PP formed by the plurality of projection lights P reaching the surface of the object 10 are acquired as a two-dimensional image at the observation point, it must be determined which projection light P each of the acquisition points CP in the two-dimensional image corresponds to. This is very important because the principle of triangulation itself cannot be applied unless this one-to-one correspondence is determined.
The third is that, depending on the surface shape of the object 10, some projection points may not be acquired at the observation point; if such a loss of acquisition points CP goes undetected, errors can occur in the one-to-one correspondence between the acquisition points CP and the projection lights P. Such loss of an acquisition point CP can occur when a projection point is formed on an undercut portion of the object 10.
In addition, it is necessary to consider the configuration that forms the plurality of point light sources (projection lights) projected onto the object 10, and the configuration that determines or identifies the angle at which each point light source is projected.
Hereinafter, the configuration of the present invention that resolves the above technical difficulties will be described in detail.
투사점들의 2차원 패턴을 관찰하는 일 지점으로부터 각 투사점까지의 거리를 계산하는 기본원리는 삼각측량의 원리이다.The basic principle of calculating the distance from one point of observation to the two-dimensional pattern of projection points is the principle of triangulation.
삼각측량의 원리는 고전적인 기하학에 기초를 둔 측량방법인데, A의 위치를 점유하는 관찰자가 다른 두 지점 B와 C에 대해 A-B 사이의 거리와 B-C 사이의 거리를 안다면, 미지의 A-C 사이의 거리를 알 수 있다는 원리에 기반을 둔 것이다.The principle of triangulation is a classical geometry-based survey method, if the observer occupying the position of A knows the distance between AB and BC for the other two points B and C, then the distance between the unknown AC It is based on the principle of knowing.
실제에 있어서는 A 지점과 C 지점 사이의 거리를 알기 위해 A-B 사이와 B-C 사이의 거리를 모두 측정해야 하는 것은 비효율적이고 간접적일 뿐만 아니라 쉽지 않은 일이다. 오히려 A-C 사이의 거리를 바로 측정하는 것이 직접적이고 쉬울 것이다. 또한 A-B 사이 또는 B-C 사이의 거리를 직접 측정하는 것이 지형의 영향으로 불가능할 수도 있다.In practice, measuring the distance between A-B and B-C to find the distance between point A and point C is not only inefficient, indirect, but also difficult. Rather, it would be straightforward and easy to measure the distance between A and C immediately. It may also be impossible to measure the distance between A-B or B-C directly due to the influence of the terrain.
This can be resolved by a hybrid approach that adds, to the distance measurement, measurement of the angles between the imaginary lines connecting the three points. As shown in Fig. 1, if the surveyor at point A measures only one easily obtainable distance, say the distance to point B, and then measures the angles at which point C, the point actually to be surveyed, is seen from A and from B, the distances A-C and B-C can be determined.
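As a numerical illustration of this distance-plus-angles scheme (this sketch is not part of the patent; the point labels follow the description above, and the function name is ours), A-C and B-C can be recovered from the single measured baseline A-B and the two measured angles by the law of sines:

```python
import math

def triangulate(ab, angle_a_deg, angle_b_deg):
    """Given the baseline A-B and the angles at A and B toward C,
    return the distances A-C and B-C."""
    angle_a = math.radians(angle_a_deg)   # angle CAB, measured at A
    angle_b = math.radians(angle_b_deg)   # angle ABC, measured at B
    angle_c = math.pi - angle_a - angle_b # remaining angle at C
    # Law of sines: AB / sin(C) = AC / sin(B) = BC / sin(A)
    common = ab / math.sin(angle_c)
    ac = common * math.sin(angle_b)
    bc = common * math.sin(angle_a)
    return ac, bc
```

With a 100 m baseline and both measured angles at 60 degrees, the triangle is equilateral, so both unknown distances come out to 100 m.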
The present invention likewise calculates the distance from the point observing the two-dimensional pattern to each projection point PP through this combination of distance and angle measurements. More specifically, if one knows the distance between the projection focus Pf, the focal point at which the multiple projection beams P converge, and the light-receiving focus Cf of the light-receiving sensor 200 on which the image of the projection points PP formed on the three-dimensional object 10 is captured, together with the projection angle PA of each projection beam P and the acquisition angle CA computed from the coordinates of each acquisition point CP detected on the light-receiving sensor 200, then the distance to each projection point PP can be calculated.
The projection angle PA is readily known because it is fixed by the optical design of the light-projecting unit 100 that generates the projection beams P; the acquisition angle CA, however, varies with the surface shape of the three-dimensional object 10 and must be found by calculation. As detailed later, the acquisition angle CA can be computed by applying trigonometry to the two-dimensional coordinates of the acquisition point CP formed on the light-receiving sensor 200 and to the known geometric relationship between the acquisition point CP and the light-receiving focus Cf.
The method of calculating the distance between the acquisition point CP and the projection point PP, outlined conceptually above, is described in detail with reference to Fig. 2.
Fig. 2 schematically shows the projection points PP of two projection beams P and the acquisition points CP formed on the two-dimensional image detection surface 210 of the light-receiving sensor 200. Before the detailed description, the points, imaginary lines, lengths, and angles marked in Fig. 2 are defined as follows; unless stated otherwise, these definitions hold throughout the detailed description and the claims.
The projection line PL is the path along which a projection beam P, whose projection angle PA has been finally adjusted (determined), travels toward the object 10. The projection angle is "finally" determined because, as described later, the light passes through an intermediate stage in which a single light source is split into multiple beams to form a projection pattern and the projection magnification is then adjusted; only after this stage is the path of the projection beam P toward the object 10 fixed.
The central projection line PLc is the one projection line PL taken as the reference among the multiple projection lines PL. It may be chosen arbitrarily, but it is convenient to choose the projection line PL whose straightness is unchanged even after the single light source is split into multiple beams and the projection magnification is adjusted. The angle that each projection line (other than the central projection line) makes with the central projection line PLc is then defined as the projection angle PA, and the projection point PP corresponding to the central projection line PLc is called the central projection point PPc.
The projection focus Pf is the single point at which all projection lines PL converge; it need not coincide with the single light source. Depending on the projection pattern and magnification, each projection line PL could be given its own projection focus Pf, but it is obviously simpler and more convenient to make all projection lines share one projection focus Pf.
The acquisition line CL is the imaginary extension line connecting a projection point PP formed on the three-dimensional object 10 with the light-receiving focus Cf; the intersection formed where the acquisition line CL passes through the two-dimensional image detection surface 210 of the light-receiving sensor 200 is called the acquisition point CP. The acquisition point CP therefore has a coordinate value on the two-dimensional image detection surface 210. As described later, this coordinate value is used to calculate the distance to each projection point PP.
The light-receiving focus Cf is the single point at which all acquisition lines CL converge. In the embodiment of the present invention, the projection focus Pf and the light-receiving focus Cf are arranged on a straight line along the central projection line PLc. Since every acquisition line CL converges on the light-receiving focus Cf, the central acquisition line CLc, which returns to the light-receiving focus Cf after the central projection line PLc has formed its projection point PP, coincides with the central projection line PLc.
The acquisition point CP formed by the central acquisition line CLc is defined as the central acquisition point CPc, and, analogously to the projection angle PA, the acquisition angle CA is defined as the angle between the central acquisition line CLc and each acquisition line CL (other than the central acquisition line). The straight-line distance between the collinear projection focus Pf and light-receiving focus Cf is defined as the focal distance f. The projection focus Pf and the light-receiving focus Cf can, of course, be produced by conventional lens-based optics.
The reference origin O is a three-dimensional coordinate origin lying on the central acquisition line CLc, or on an imaginary line extended from the central acquisition point CPc; it is the reference point recognized by a device implementing the present invention. Unless stated otherwise below, the distance to the three-dimensional object 10 therefore means the origin distance OD, i.e. the distance between the reference origin O and a projection point PP. The light-receiving focus Cf may of course be chosen as the reference origin O. In describing the embodiments, the coordinate axes at the reference origin O are defined with the Z axis pointing from the reference origin O toward the central acquisition point CPc.
However, because the distance to the projection point PP must physically be derived from the coordinates of the acquisition point CP, the intersection of the acquisition line CL with the two-dimensional image detection surface 210 of the light-receiving sensor 200, the initial computed value is the acquisition distance CD, the distance between the projection point PP and the acquisition point CP, rather than the origin distance OD between the reference origin O and the projection point PP. Since the geometric relationship between the two-dimensional image detection surface 210 and the reference origin O is preset, converting the acquisition distance CD into the origin distance OD is a simple, easily performed calculation.
On the basis of the above definitions, a relation for computing the origin distance OD is derived from the geometry formed by the projection focus Pf, the light-receiving focus Cf, the projection line PL, the acquisition line CL, and the two-dimensional image detection surface 210 of the light-receiving sensor 200 shown in Fig. 3. Fig. 3 shows the three-dimensional projection line PL and acquisition line CL of Fig. 2 projected onto the XZ plane through the reference origin O, so all distances and angles are expressed as orthogonal projections onto the XZ plane. Strictly, every distance and angle mentioned should therefore carry the label "XZ orthogonal projection", but for brevity this label is omitted.
First, applying the law of sines yields the following Equation (1):

    L = f · sin(α) / sin(α − β) ......Equation (1)
In Equation (1), L is the acquisition distance, f the focal distance, α the projection angle PA, and β the acquisition angle CA. The focal distance f and the projection angle PA are already known, and the acquisition angle CA is computed by Equation (2) below from the tangent relation of "the distance between the acquisition point CP and the central acquisition point CPc" to "the distance between the light-receiving focus Cf and the central acquisition point CPc". The distance between the acquisition point CP and the central acquisition point CPc can of course be computed from the coordinate values of each acquisition point CP.
    β = tan⁻¹( |CP − CPc| / |Cf − CPc| ) ......Equation (2)
Thus, by Equations (1) and (2), the acquisition distance CD is obtained from the computation of simple trigonometric functions.
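The per-point computation of Equations (1) and (2) can be sketched as follows, assuming the law-of-sines and arctangent forms implied by the surrounding text (the original equations survive only as image references); the function and argument names are ours, not the patent's:

```python
import math

def acquisition_angle(cp_offset, cf_to_cpc):
    # Equation (2): beta from the tangent relation between the
    # CP-to-CPc distance on the image detection surface and the
    # known Cf-to-CPc distance.
    return math.atan(cp_offset / cf_to_cpc)

def acquisition_distance(f, alpha, beta):
    # Equation (1): law of sines in the triangle formed by the
    # projection focus Pf, the light-receiving focus Cf, and the
    # projection point PP (angles in radians, alpha > beta).
    return f * math.sin(alpha) / math.sin(alpha - beta)
```

For example, with f = 1 and a projection point at (sin 45°, 1 + cos 45°) relative to the light-receiving focus, the acquisition angle comes out to 22.5° and the computed distance matches the direct Euclidean distance of about 1.8478.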
Next, the origin distance OD can be obtained from Equation (3), which applies the second law of cosines to the triangle formed by the reference origin O, the acquisition point CP, and the projection point PP:

    OD² = CD² + |O − CP|² − 2 · CD · |O − CP| · cos(γ) ......Equation (3)

where |O − CP| is the preset distance between the reference origin O and the acquisition point CP, and γ is the angle at CP between the acquisition line and the line from CP to O.
Applying Equations (1) to (3) to each projection line PL and acquisition line CL yields the origin distance OD to each projection point PP formed on the surface of the three-dimensional object 10, and performing surface fitting with this plurality of origin distances OD makes it possible to recognize the approximate surface shape of the three-dimensional object 10. Naturally, the central projection line PLc and central acquisition line CLc, whose projection angle PA and acquisition angle CA are 0°, serve only as references; no distance information can be obtained from them.
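The CD-to-OD conversion of Equation (3) can be sketched as below, assuming a law-of-cosines form consistent with the description (the variable names, and the assumption that the distance to O and the angle at CP are preset by the device geometry, are ours):

```python
import math

def origin_distance(cd, cp_to_o, gamma):
    # Second law of cosines in the triangle O-CP-PP: cd is the
    # acquisition distance CD, cp_to_o the preset distance from the
    # acquisition point CP to the reference origin O, and gamma the
    # angle at CP between those two sides (radians).
    return math.sqrt(cd * cd + cp_to_o * cp_to_o
                     - 2.0 * cd * cp_to_o * math.cos(gamma))
```

In the degenerate collinear case (gamma = 0) the expression reduces to a plain difference of lengths, and at gamma = 90° it reduces to the Pythagorean theorem.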
Fig. 4 schematically shows multiple projection beams P forming projection points PP on the three-dimensional object 10 and then being detected as acquisition points CP on the two-dimensional image detection surface 210 of the light-receiving sensor 200. It clearly illustrates that the surface shape of the three-dimensional object 10 can be computed approximately by carrying out the series of calculations above for each acquisition point CP to obtain each acquisition distance CD (convertible to an origin distance). Here the two-dimensional image detection surface 210 of the light-receiving sensor 200 is arranged perpendicular to the plane containing the central projection line PLc and the central acquisition line CLc. Moreover, as the equations above show, the origin-distance calculation for determining the surface shape of the three-dimensional object 10 uses only simple trigonometric functions, so the processing load is extremely small and considerably faster processing is possible than with conventional three-dimensional recognition methods.
The embodiment of the present invention has been described with the projection focus Pf, the light-receiving focus Cf, and the reference origin O all arranged on a straight line. In practice, however, a design in which the central projection line PLc and the central acquisition line CLc do not coincide exactly but diverge slightly about the projection point PP can also be expected, because when the projection focus Pf and the light-receiving focus Cf lie on one straight line, the optics needed to separate and process the overlapping or crossing projection lines PL and acquisition lines CL can be complicated to install. The embodiment of the present invention therefore also covers a configuration in which the central projection line PLc and the central acquisition line CLc, while not coincident, are arranged on at least one common plane. Taking Fig. 3 as an example, this means the light-receiving focus Cf is offset in height from the projection focus Pf along the Y axis perpendicular to the page.
When the projection focus Pf and the light-receiving focus Cf are separated in this way, an error arises to the extent that the central acquisition line CLc must be converted into its orthogonal projection onto the XZ plane. However, this projection error scales with the cosine of the angle, measured at the projection point PP, between the central projection line PLc and the central acquisition line CLc, so if this angle is small the error is negligible from a practical standpoint.
For example, if the angle θ between the central projection line PLc and the central acquisition line CLc is 20°, then cos 20° ≈ 0.940, giving an accuracy of about 94%, i.e. an error level of about 6%, relative to the case in which the central projection line PLc and the central acquisition line CLc coincide, so a fairly accurate value is obtained. Converting this into the ratio of the acquisition distance CD to the distance by which the light-receiving focus Cf is offset from the projection focus Pf along the Y axis gives a value corresponding to cot 20° (≈ 2.75): geometrically, if the distance to the three-dimensional object 10 is roughly three times the offset of the light-receiving focus Cf, an error level of about 6% can be maintained. In other words, even if the light-receiving focus Cf is offset by about 10 cm, the surface contour of a three-dimensional object 10 more than 28 cm away can be computed with considerable precision, which is a practically meaningful result.
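The error figures quoted above can be checked directly (a verification sketch, not part of the patent):

```python
import math

theta = math.radians(20.0)           # angle between PLc and CLc
accuracy = math.cos(theta)           # ~0.940, i.e. ~6 % error level
range_ratio = 1.0 / math.tan(theta)  # cot 20 deg, ~2.75

# A 10 cm focus offset therefore keeps the ~6 % error level for
# objects beyond about 10 cm * 2.75 = 27.5 cm, i.e. roughly 28 cm.
min_range_cm = 10.0 * range_ratio
```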
Meanwhile, it is desirable to give the projection beams P a specific wavelength distinguishable from the wavelengths of the light present around the three-dimensional object 10, thereby eliminating the sensitivity to ambient illumination that is one of the inherent weaknesses of image acquisition means. For example, projection beams P with wavelengths in the infrared band, distinct from the visible band, can be used.
As shown above, the calculation of the distance from the point observing the two-dimensional pattern of projection points PP, in particular from the reference origin O, to each projection point PP can be solved using the principle of triangulation. The next problem to solve is determining which projection beam P each of the many acquisition points CP present in the two-dimensional image corresponds to.
The simplest approach is to impose a gradient on some physically measurable property of the projection beams P, such as wavelength (frequency) or power: give each projection beam P a physical characteristic that distinguishes it from the others, measure that characteristic at each acquisition point CP, and match it to the corresponding projection beam P, thereby establishing the one-to-one correspondence between projection beams P and acquisition points CP.
This method, however, has the drawback of structural complexity, since it requires means to impart a meaningful gradient characteristic to each of the many projection beams P and to detect it again. Moreover, the physical characteristics of a projection beam P may in some cases be altered while passing through or being scattered by a heterogeneous medium (for example, water droplets or dust in the air). A simpler and more reliable alternative is therefore needed.
Figs. 5 and 6 show projection patterns of the multiple projection beams P in embodiments of the present invention. Fig. 5 shows a first projection pattern 112 in which the projection lines PL form an equally spaced grid array about the central projection line PLc (equal spacing along the X and Y axes), and Fig. 6 shows a second projection pattern 114 in which the projection lines PL form an equiangular concentric array about the central projection line PLc.
When the projection beams P are arranged symmetrically about the central projection line PLc in this way, then in the embodiment of the present invention, in which the projection focus Pf and the light-receiving focus Cf lie on a straight line, each acquisition point CP moves only along the line that radially connects its corresponding projection line PL with the central projection line PLc. That is, as shown in Figs. 5 and 6, the acquisition point CP corresponding to each projection line PL marked by a large circle merely shifts position along a line extending radially from the central projection point PPc. This behavior arises because the central projection line PLc and the central acquisition line CLc coincide.
Exploiting this behavior makes the one-to-one correspondence between the projection beams P and the acquisition points CP very simple. The series of projection beams P lying on a line extending radially from the central projection line PLc shift position only slightly while their order along the line is preserved, so the projection beams P and acquisition points CP need only be matched one-to-one in that order. Since the preservation of order holds only along lines extending radially from the central projection line PLc, the second projection pattern 114 of Fig. 6, with its equiangular concentric array, has the advantage of being even more convenient.
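The order-preserving radial property can be turned into a matching routine along these lines (an illustrative sketch for the equiangular concentric pattern; the grouping tolerance and all names are ours, not the patent's):

```python
import math

def match_by_radial_order(points, center, n_lines, tol_deg=5.0):
    """Assign each acquisition point to the nearest radial line of an
    equiangular concentric pattern and sort each line's points by
    distance from the pattern center; the sorted order then maps
    one-to-one onto the known outward order of the projection beams
    on that line."""
    step = 360.0 / n_lines
    lines = {i: [] for i in range(n_lines)}
    for x, y in points:
        dx, dy = x - center[0], y - center[1]
        ang = math.degrees(math.atan2(dy, dx)) % 360.0
        line = int(round(ang / step)) % n_lines
        diff = abs(ang - line * step)
        if min(diff, 360.0 - diff) <= tol_deg:  # tolerate slight wobble
            lines[line].append((math.hypot(dx, dy), (x, y)))
    # keep only the points, sorted outward from the center
    return {i: [p for _, p in sorted(pts)] for i, pts in lines.items()}
```

With four radial lines and the pattern center at the origin, points detected at (1, 0) and (2, 0) fall on line 0 in that order, while (0, 1), (0, 2), (0, 3) fall on line 1 in outward order.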
The first and second projection patterns 112 and 114 can be formed using a diffraction grating 110 as shown in Fig. 7. If a dot pattern is formed on the diffraction grating 110 by etching or laser machining, multiple projection beams P with a fixed pattern, such as the illustrated first and second projection patterns 112 and 114, can be produced from a single light source, and by the design of the dot pattern the projection angle PA of each projection beam P is already a determined value. As the light source, a laser source is adopted for its minimal beam spread and excellent straightness.
The next issue is resolving the loss of acquisition points CP that can occur depending on the surface shape of the object 10. As mentioned above, an acquisition point CP can be lost when a projection point PP is formed on an undercut portion of the object 10; an example is shown in Fig. 8. This happens because, with the projection focus Pf and the light-receiving focus Cf arranged on a straight line, the projection focus Pf must inevitably lie ahead of the light-receiving sensor 200 in the projection direction, so the projection angle PA is formed larger than the acquisition angle CA.
Since a lost acquisition point CP introduces errors into the one-to-one correspondence between projection beams P and acquisition points CP, it is necessary to determine whether a loss has occurred and, if so, which projection beam's acquisition point CP has been lost.
Whether an acquisition point CP has been lost can be determined by checking whether the number of acquisition points CP falls short of the number of projection beams P. In that case, the loss can be avoided by adjusting the directions of the central projection line PLc and the central acquisition line CLc until the number of acquisition points CP equals the number of projection beams P.
However, this avoidance measure cannot always be expected to work, and adjusting the directions of the central projection line PLc and the central acquisition line CLc may itself interfere with grasping the shape of the three-dimensional object 10.
The present invention therefore provides a way to determine, when a loss of acquisition points CP has occurred, which projection beam's acquisition point has been lost, so that this lost acquisition point (vanished point) can be excluded when the origin distance OD is calculated. This again exploits the property that the series of projection beams P lying on a line extending radially from the central projection line PLc shift position only slightly while their order along the line is preserved.
As shown in Fig. 9, a light-blocking unit 120 is arranged to advance sequentially inward from the outermost projection beams P of the projection pattern's array (Fig. 9 takes the equiangular concentric array of Fig. 6 as an example), i.e. from the series of projection beams P farthest from the central projection line PLc. The acquisition points CP corresponding to the projection beams P covered by the light-blocking unit 120 then disappear from the light-receiving sensor 200, and by examining how the acquisition points CP change before and after being covered by the light-blocking unit 120, one can tell which projection beam P a lost acquisition point CP corresponds to.
This is described in more detail with reference to Fig. 10, which shows by way of example the second projection pattern 114 and its acquisition points CP restricted to the first quadrant. The two-digit subscripts attached to the projection beams P and acquisition points CP are assigned to distinguish them: the first digit indexes, in counterclockwise order, the lines extending radially from the central projection line PLc, and the second digit numbers the points along the direction away from the central projection line PLc (the outward direction).
Excluding the central projection point PPc and the central acquisition point CPc, the projection beams P and acquisition points CP are arranged as 01...04, 11...14, ..., 41...44, and the light-blocking unit 120 sequentially covers the circumferential groups of projection beams P whose second digit is the same.
If, when the projection beams P with subscripts 04...44 are blocked, the outermost acquisition point on radial line 3, CP33, still appears on the light-receiving sensor 200, it can be concluded that the projection point PP formed by the projection beam denoted PP34 has been lost (the acquisition points CP drawn as gray circles in Fig. 10 are the outermost acquisition points that disappear when the projection beams P are blocked), and the origin-distance OD calculation is not performed for this beam PP34.
이러한 소실점 판정은 동일한 원주방향의 투사광(P)들에 대해 순차적으로 진행되며, 소실점 판정이 끝난 투사광(P)들이 다음 차례의 소실점 판정시 계속 차단된 상태로 있을 것인지 아니면 차단을 해제할 것인지는 별다른 문제가 되지 않는다. 또한 투사광(P)의 차단을 최외곽으로부터 진행할 것인지 아니면 중심투사선(PLc)에서부터 시작할 것인지 역시 문제되지 않는다. 즉 이는 순차적 개방이냐 순차적 차단이냐의 차이일 뿐 효과상으로는 아무런 차이가 없는 것이다.The vanishing point determination is sequentially performed on the same circumferential projection light beams P, and it is determined whether or not the projection light beams P which have been vanishing point determination remain blocked at the next vanishing point determination or are unblocked. It doesn't matter. Also, it does not matter whether the projection light P is blocked from the outermost or starts from the center projection line PLc. In other words, this is only a difference between sequential opening or sequential blocking, and there is no difference in effect.
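The per-line bookkeeping of this test can be sketched as follows, assuming the sensor reports, for each radial line, how many spots remain at each shutter state. The names `frames`, `rings`, and `find_lost` are illustrative, not from the patent:

```python
def find_lost(frames, rings):
    """Sequential-blocking (vanishing point) test, sketched per radial line.

    frames[k][line]: number of spots detected on a radial line with the
                     k outermost rings blocked (k = 0 .. number of rings).
    rings[k][line]:  id of the projection in ring k on that radial line,
                     rings listed outermost-first.

    A projection is judged lost when blocking it removes no spot from its
    radial line (cf. CP33 still appearing when PP34 is blocked); no origin
    distance is later computed for the ids returned here.
    """
    lost = set()
    for k, ring in enumerate(rings):
        for line, pid in enumerate(ring):
            # Spot count unchanged after blocking => this projection's
            # point was already missing before the shutter closed.
            if frames[k][line] == frames[k + 1][line]:
                lost.add(pid)
    return lost
```

With four radial lines and four rings, simulating the FIG. 10 situation (only the point of PP34 missing) flags exactly that projection.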
The sequential light blocking or opening described above can also serve as an auxiliary means for designating the projection angle (PA) of a projection line (PL). That is, when the light blocking unit 120 is placed in a preset positional relationship with respect to the projection focus (Pf) and the projection lights (P) are blocked or opened sequentially in a preset pattern, then, if the blocking phase with respect to the projection focus (Pf) at any instant is known, the projection angle (PA) of the projection line (PL) opened or blocked immediately beforehand is also known. This is possible because the projection pattern of the projection lights (P) is designed in advance, and it can serve as an auxiliary means (for designating or confirming projection angles) when lost projection points (PP) disrupt the one-to-one correspondence with the preset projection angles (PA).
For the second projection pattern 114, an equiangular concentric array, the light blocking unit 120 may preferably adopt a structure like an ordinary camera shutter; for the first projection pattern 112, an equally spaced grid array, a pair of shutters shaped like "ㄴ" and "ㄱ" and moving diagonally toward the center projection line (PLc) may be used, as shown in FIG. 11.
The light blocking for vanishing point determination is not limited to such physical shutters; an optical shutter such as the liquid crystal glass 124 may also be applied. The liquid crystal glass 124 is installed as a light transmission window placed in the optical path of the projection lights (P), and the function of the light blocking unit 120 is implemented by controlling the current applied to the liquid crystal glass 124 so that selected regions are sequentially switched to being opaque to light. Compared with a physical shutter, such an optical shutter is structurally simpler, generates no mechanical vibration, and allows the opaque regions to be designed relatively freely.
Of course, a shutter that blocks each projection line one by one in sequence could also be used, but it has the drawbacks of structural complexity and lower efficiency in vanishing point determination.
The foregoing has described the principle by which the present invention recognizes the three-dimensional surface shape of the object 10 from a two-dimensional image of the projection points (PP) that multiple projection lights with a preset magnification angle, i.e., with the projection angle (PA) of each projection light (P) fixed in advance, form on the surface of the three-dimensional object 10.
The present invention can be summarized as the following sequence of steps, shown as a flowchart in FIG. 12.
The three-dimensional recognition method via two-dimensional image acquisition of multiple projection lights according to the present invention is implemented in four main steps.
The first step projects a plurality of projection lights (P), whose projection angles (PA) with respect to the center projection line (PLc) are set in advance, onto the three-dimensional object 10. The plurality of projection lights (P) can be produced by projecting a single light source onto a diffraction grating 110 having a preset dot pattern, splitting it into multiple beams; in particular, the diffraction grating 110 can be designed so that the beams form a projection pattern that is an equally spaced grid array or an equiangular concentric array referenced to the center projection line (PLc).
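The grating's projection directions are fixed by design and can be tabulated in advance. A minimal sketch of such an equiangular concentric pattern, assuming a uniform angular step between rings (the patent fixes the actual angles through the diffraction-grating design, so the parametrization here is hypothetical):

```python
def concentric_pattern(n_lines, n_rings, ring_step_deg):
    """Illustrative equiangular concentric projection pattern.

    n_lines radial lines at equal azimuths around the center projection
    line, n_rings beams per line at equal angular steps (ring_step_deg)
    away from it. Returns {(line, ring): (azimuth_deg, polar_deg)}, i.e.
    each beam's direction relative to the center projection line.
    """
    return {
        (line, ring): (line * 360.0 / n_lines, (ring + 1) * ring_step_deg)
        for line in range(n_lines)
        for ring in range(n_rings)
    }
```

Such a table is what makes the later one-to-one matching of spots to preset projection angles possible.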
The second step acquires the pattern of projection points (PP) formed on the three-dimensional object 10, with the center acquisition line (CLc) lying in the same plane as the center projection line (PLc), as a plurality of acquisition points (CP) whose coordinates are each known, via the two-dimensional image detection surface 210 of the light receiving sensor 200. Even if the coplanar center projection line (PLc) and center acquisition line (CLc) do not coincide, an accuracy of about 94% or better can be expected by keeping the angle between them, measured at the center projection point (PPc), at 20° or less.
Next, after a third step that determines the one-to-one correspondence between the plurality of projection lights (P) and the acquisition points (CP), a fourth step calculates each acquisition distance (CD) from the geometric relationship among the projection angle (PA), the acquisition angle (CA) of the acquisition point (CP), and the focal length (f).
The fourth step may further include a step 4-1 that calculates the acquisition distance (CD) through equations (1) and (2) described above and converts the initially calculated acquisition distance (CD) into the origin distance (OD) through equation (3).
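Equations (1) through (3) are embedded as images in the published text and are not reproduced here. As a hedged stand-in, the sketch below shows one standard active-triangulation form consistent with the described geometry, assuming a known baseline between the projection focus and the acquisition focus; it illustrates only the principle that two known angles plus one known length fix the triangle, and is not the patent's exact formulas:

```python
import math

def acquisition_angle(offset, focal_length):
    # Pinhole model: the acquisition angle of a spot follows from its
    # radial offset on the image detection surface and the focal length f.
    return math.atan2(offset, focal_length)

def acquisition_distance(baseline, alpha, beta):
    # The projection focus, the acquisition focus, and the surface point
    # form a triangle with known base angles alpha (projection) and beta
    # (acquisition); the law of sines yields the distance from the
    # acquisition focus to the surface point.
    gamma = math.pi - alpha - beta  # angle at the surface point
    return baseline * math.sin(alpha) / math.sin(gamma)
```

With both angles at 60° the triangle is equilateral, so the returned distance equals the baseline, a quick sanity check on the geometry.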
Meanwhile, the third step can determine the one-to-one correspondence between each projection light (P) and acquisition point (CP) by measuring, at the acquisition points (CP), the gradient characteristics of measurable physical properties imparted to the projection lights (P) and matching them to one another.
Alternatively, if in the first step the plurality of projection lights (P) were made to form an equally spaced grid array or an equiangular concentric projection pattern referenced to the center projection line (PLc), the third step can match the acquisition points (CP) one to one according to the arrangement order of the series of projection lights (P) lying on each line extending radially from the center projection line (PLc).
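For a single radial line, this order-based matching can be sketched as follows (illustrative names; the approach is valid only when no spot on the line has been lost, otherwise the blocking test above applies):

```python
import math

def match_radial_line(center_spot, line_spots, projection_order):
    # On one radial line of the pattern, the detected spots preserve the
    # radial order of the projections, so sorting by distance from the
    # central acquisition point and pairing in order gives the
    # one-to-one correspondence.
    ordered = sorted(
        line_spots,
        key=lambda s: math.hypot(s[0] - center_spot[0], s[1] - center_spot[1]),
    )
    if len(ordered) != len(projection_order):
        raise ValueError("spot lost on this line: use the blocking test instead")
    return dict(zip(projection_order, ordered))
```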
The present invention may further include a step 2-1 that compares the number of projection lights (P) with the number of acquisition points (CP) to determine whether any acquisition point (CP) has been lost, and, if step 2-1 determines that a lost acquisition point (CP) exists, a step 2-2 that adjusts the direction of the center projection line (PLc) until the number of acquisition points (CP) equals the number of projection lights (P).
Instead of adjusting the direction of the center projection line (PLc), a step 2-2' may be applied, in which part of the plurality of projection lights (P) is blocked by the light blocking unit 120 and the projection light (P) corresponding to a lost acquisition point (CP) is determined by checking whether any acquisition point (CP) fails to disappear, repeating this process for all projection lights (P).
Here, step 2-2' may simultaneously block, with the light blocking unit 120, a series of projection lights (P) of the projection pattern that share the same radial distance from the center projection line (PLc), and determine the projection light (P) corresponding to a lost acquisition point (CP) by checking whether any acquisition point (CP) corresponding to the radial arrangement order of the blocked projection lights (P) fails to disappear, repeating this process for all projection lights (P).
In particular, the light blocking unit 120 may begin sequential blocking from the series of projection lights (P) with the largest radial distance or from the series with the smallest radial distance, and may be a physical shutter or an optical shutter, i.e., the liquid crystal glass 124, in which selected regions are switched to being opaque to light by the application of a current.
If the projection pattern is an equiangular concentric array, the physical shutter may adopt the structure of a camera shutter; if the projection pattern is an equally spaced grid array, the physical shutter may consist of a pair of shutters 122 shaped like "ㄴ" and "ㄱ" and moving diagonally toward the center projection line (PLc).
Meanwhile, the light blocking used to determine lost acquisition points (CP) can also serve as an auxiliary means for designating the projection angle (PA): if the light blocking unit 120 is placed in a preset positional relationship with respect to the projection focus (Pf) and its sequential blocking is performed in a preset pattern, the projection angle (PA) of the projection light (P) blocked immediately beforehand can be known from the blocking phase with respect to the projection focus (Pf) at any instant.
The present invention can compute the origin distance (OD) information for the plurality of acquisition points (CP) to establish the spatial relationship with the three-dimensional object 10, and can also calculate the surface shape of the three-dimensional object 10 by surface-fitting the origin distance (OD) information of the plurality of acquisition points (CP); the latter is performed in a separate fifth step.
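A minimal sketch of this fifth step, assuming a least-squares quadric model z = f(x, y) (the patent only says the data are surface-fitted, without naming a particular model, so the basis chosen here is an assumption):

```python
import numpy as np

def fit_quadric_surface(points):
    """Fit z = a + b*x + c*y + d*x^2 + e*x*y + g*y^2 to reconstructed
    3-D points by least squares; points is an (N, 3) array-like of the
    surface points recovered from the origin-distance information.
    Returns the 6 coefficients (a, b, c, d, e, g)."""
    p = np.asarray(points, dtype=float)
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    design = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coeffs, *_ = np.linalg.lstsq(design, z, rcond=None)
    return coeffs
```

Feeding it points sampled from a plane recovers the plane exactly, with the quadratic coefficients at zero.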
According to the embodiments of the present invention configured as above, the approximate shape of a three-dimensional object can be grasped from the two-dimensional pattern of acquisition points, captured on the image detection surface of the light receiving sensor, corresponding to the projection points that the multiple projection lights form on the object's surface. Since the embodiments described above may be modified by those of ordinary skill in the art without departing from the principle or spirit of the present invention, the scope of the present invention shall be defined by the appended claims and their equivalents.
The present invention can be usefully applied in technical fields that require recognition of the three-dimensional characteristics of a target object, ranging from toys, automobiles, and robots operated in everyday life to unmanned aerial vehicles, missiles, and other diverse fields requiring three-dimensional recognition.

Claims (18)

  1. A three-dimensional recognition method via two-dimensional image acquisition of multiple projection lights, comprising:
    a first step of projecting, onto a three-dimensional object, a plurality of projection lights whose projection angles with respect to a center projection line are set in advance;
    a second step of acquiring, with a center acquisition line lying in the same plane as the center projection line, the pattern of projection points formed on the three-dimensional object as a plurality of acquisition points whose coordinates are each known, via a two-dimensional image detection surface of a light receiving sensor;
    a third step of determining a one-to-one correspondence between the plurality of projection lights and the acquisition points; and
    a fourth step of calculating each acquisition distance from the geometric relationship among the projection angle, the acquisition angle of the acquisition point, and a focal length.
  2. The method of claim 1, wherein the fourth step calculates the acquisition distance through equation (1) and equation (2) below:
    Figure PCTKR2012003052-appb-I000007
    ...... Equation (1)
    Figure PCTKR2012003052-appb-I000008
    ...... Equation (2)
    where L is the acquisition distance, f is the focal length, α is the projection angle, and β is the acquisition angle.
  3. The method of claim 2, further comprising a step 4-1 of converting the acquisition distance into an origin distance through equation (3) below:
    Figure PCTKR2012003052-appb-I000009
    ...... Equation (3)
    where D is the origin distance.
  4. The method of claim 1, wherein, when the center projection line and the center acquisition line do not coincide in the second step, the angle between the center projection line and the center acquisition line, measured at the center projection point, is 20° or less.
  5. The method of claim 3, further comprising a fifth step of calculating the surface shape of the three-dimensional object by surface-fitting the origin distance information of the plurality of acquisition points.
  6. The method of claim 1, wherein the third step is performed by measuring, at the acquisition points, gradient characteristics of measurable physical properties imparted to the projection lights and matching them to one another.
  7. The method of claim 1, wherein the first step causes the plurality of projection lights to form a projection pattern that is an equally spaced grid array or an equiangular concentric array referenced to the center projection line, and
    the third step matches the acquisition points one to one according to the arrangement order of the series of projection lights lying on each line extending radially from the center projection line.
  8. The method of claim 7, wherein the projection pattern is produced by projecting a single light source onto a diffraction grating having a preset dot pattern.
  9. The method of claim 1, further comprising a step 2-1 of comparing the number of projection lights with the number of acquisition points to determine whether a lost acquisition point exists.
  10. The method of claim 9, further comprising, when step 2-1 determines that a lost acquisition point exists, a step 2-2 of adjusting the direction of the center projection line until the number of acquisition points equals the number of projection lights.
  11. The method of claim 7, further comprising a step 2-1 of comparing the number of projection lights with the number of acquisition points to determine whether a lost acquisition point exists, and,
    when step 2-1 determines that a lost acquisition point exists, a step 2-2' of determining the projection light corresponding to the lost acquisition point by blocking part of the plurality of projection lights with a light blocking unit and checking whether any acquisition point fails to disappear, repeating this process for all projection lights.
  12. The method of claim 11, wherein step 2-2' simultaneously blocks, with the light blocking unit, a series of projection lights of the projection pattern sharing the same radial distance from the center projection line, and determines the projection light corresponding to the lost acquisition point by checking whether any acquisition point corresponding to the radial arrangement order of the blocked projection lights fails to disappear, repeating this process for all projection lights.
  13. The method of claim 12, wherein the light blocking unit begins sequential blocking from the series of projection lights having the largest radial distance or from the series of projection lights having the smallest radial distance.
  14. The method of any one of claims 11 to 13, wherein the light blocking unit is a physical shutter, or an optical shutter of liquid crystal glass in which selected regions are switched to being opaque to light by the application of a current.
  15. The method of claim 14, wherein, when the projection pattern is an equiangular concentric array, the physical shutter has the structure of a camera shutter.
  16. The method of claim 14, wherein, when the projection pattern is an equally spaced grid array, the physical shutter is a pair of shutters shaped like "ㄴ" and "ㄱ" and moving diagonally toward the center projection line.
  17. The method of claim 11, wherein the sequential blocking by the light blocking unit is performed in a preset pattern with the light blocking unit positioned in a preset positional relationship with respect to the projection focus, and, as each projection light is sequentially blocked, the projection angle of the projection light blocked immediately beforehand is designated or confirmed according to the blocking phase with respect to the projection focus at any instant.
  18. The method of claim 1, wherein the projection light is laser light having a wavelength distinct from the wavelengths of light present around the three-dimensional object.
PCT/KR2012/003052 2011-04-21 2012-04-20 Methods of three-dimensional recognition via the acquisition of a two-dimensional image consisting of multiple beams of transmitted light having a preset magnification angle WO2012144848A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR1020110037084A KR101188357B1 (en) 2011-04-21 2011-04-21 Method for three dimensional recognition by two dimensional detection of multi projected light with predetermined magnification angle
KR10-2011-0037084 2011-04-21

Publications (2)

Publication Number Publication Date
WO2012144848A2 true WO2012144848A2 (en) 2012-10-26
WO2012144848A3 WO2012144848A3 (en) 2013-01-17

Family

ID=47042067

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2012/003052 WO2012144848A2 (en) 2011-04-21 2012-04-20 Methods of three-dimensional recognition via the acquisition of a two-dimensional image consisting of multiple beams of transmitted light having a preset magnification angle

Country Status (2)

Country Link
KR (1) KR101188357B1 (en)
WO (1) WO2012144848A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9806538B2 (en) 2012-12-27 2017-10-31 Murata Manufacturing Co., Ltd. Measurement circuit and measurement apparatus for wireless power transmission system
WO2018229358A1 (en) * 2017-06-14 2018-12-20 Majo Method and device for constructing a three-dimensional image
CN111174788A (en) * 2018-11-13 2020-05-19 北京京东尚科信息技术有限公司 Indoor two-dimensional map building method and device

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101967554B1 (en) * 2017-04-17 2019-04-09 부산대학교 산학협력단 Method for measuring spatial information using index line and apparatus thereof
CN109443288B (en) * 2018-11-09 2023-09-01 广东新怡内衣科技有限公司 Mould cup examines utensil and mould cup
KR102225342B1 (en) * 2019-02-13 2021-03-09 주식회사 브이터치 Method, system and non-transitory computer-readable recording medium for supporting object control

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010053347A (en) * 1998-06-30 2001-06-25 피터 엔. 데트킨 Method and apparatus for capturing stereoscopic images using image sensors
US20030042401A1 (en) * 2000-04-25 2003-03-06 Hansjorg Gartner Combined stereovision, color 3D digitizing and motion capture system
KR20100087083A (en) * 2007-08-28 2010-08-03 아르텍 그룹, 아이엔씨. System and method for three-dimensional measurment of the shape of material object

Also Published As

Publication number Publication date
WO2012144848A3 (en) 2013-01-17
KR101188357B1 (en) 2012-10-08

Similar Documents

Publication Publication Date Title
WO2012144848A2 (en) Methods of three-dimensional recognition via the acquisition of a two-dimensional image consisting of multiple beams of transmitted light having a preset magnification angle
CN110487213B (en) Full-view-angle line laser scanning three-dimensional imaging device and method based on spatial dislocation
CN102788559B (en) Optical vision measuring system with wide-field structure and measuring method thereof
EP3803776A1 (en) Systems and methods for multi-camera placement
CN102081296B (en) Device and method for quickly positioning compound-eye vision imitated moving target and synchronously acquiring panoramagram
Staniek Stereo vision method application to road inspection
CN101901501A (en) Method for generating laser color cloud picture
WO2016206108A1 (en) System and method for measuring a displacement of a mobile platform
CN107339935B (en) Target space intersection measuring method for full-view scanning measuring system
WO2020071619A1 (en) Apparatus and method for updating detailed map
Olivka et al. Calibration of short range 2D laser range finder for 3D SLAM usage
JP2018036769A (en) Image processing apparatus, image processing method, and program for image processing
US20190339071A1 (en) Marker, and Posture Estimation Method and Position and Posture Estimation Method Using Marker
Xie et al. LiTag: localization and posture estimation with passive visible light tags
AU2019353165B2 (en) Optics based multi-dimensional target and multiple object detection and tracking method
CN112415010A (en) Imaging detection method and system
Zalud et al. Calibration and evaluation of parameters in a 3D proximity rotating scanner
WO2018186507A1 (en) Method for performing calibration by using measured data without assumed calibration model and three-dimensional scanner calibration system for performing same
WO2015037797A1 (en) Three-dimensional shape measurement device and method
CN105953820B (en) A kind of optical calibrating device of inertial measurement combination dynamic navigation performance
KR100913165B1 (en) Apparatus and method for detecting a localization of mobility
RU2552123C2 (en) Method of selecting objects on remote background
Shojaeipour et al. Robot path obstacle locator using webcam and laser emitter
WO2023068562A1 (en) Method and device for determining plane for mapping object onto three-dimensional space
RU152656U1 (en) OPTICAL-ELECTRONIC DEVICE FOR DETECTION OF SMALL-SIZED UNMANNED AERIAL VEHICLES

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12774225

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11/03/2014)

122 Ep: pct application non-entry in european phase

Ref document number: 12774225

Country of ref document: EP

Kind code of ref document: A2