WO2012023256A4 - Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program - Google Patents
- Publication number
- WO2012023256A4 (PCT/JP2011/004451)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- edge
- slit
- light
- pattern
- overlap
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2536—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object using several gratings with variable grating pitch, projected on the object with the same angle of incidence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
Definitions
- the present invention relates to three-dimensional measurement apparatuses, methods for three-dimensional measurement, and computer programs.
- Measurement apparatuses for three-dimensional shapes are used in an area generally referred to as robot vision, such as three-dimensional shape inspection of products, measurement of physical dimensions, assembly positioning.
- a light section method using laser slit light, which is a method for measuring three-dimensional shapes, has an advantage in that it can be carried out using compact components such as a scanner, a mirror, and a semiconductor laser.
- the light section method has a disadvantage in that this method requires capturing as many images as there are scan lines and, hence, is unfavorable for high-speed measurement of an object having a three-dimensional shape.
- a space encoding method in which a three-dimensional shape is measured by projecting a slit-shaped light pattern, which is a light and dark pattern, onto an object, for space division.
- the number of images to be captured needs to be only log₂N, where N is the number of divisions of space.
- the space encoding method allows an object with a three-dimensional shape to be measured using fewer captured images.
- the resolution of space division is limited by the number of bits used for quantization of a projected pattern, at the time of space division.
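The image-count advantage stated above can be sketched numerically; `num_patterns` is an illustrative helper name, not from the patent, and rounds log₂N up to a whole pattern count:

```python
import math

def num_patterns(num_divisions: int) -> int:
    """Number of stripe patterns (and hence captured images) needed to
    encode `num_divisions` regions with binary space encoding: ceil(log2 N)."""
    return math.ceil(math.log2(num_divisions))
```

Dividing the scene into 8 regions needs only 3 patterns, versus 8 separate scans with a pure light-section method; even 1024 regions need only 10 patterns.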
- a multi-slit-light space measurement method in which a space encoding method is used up to a certain stage and the three-dimensional shapes of individual spatially divided regions are measured by a light section method at the same time (refer to PTL 1).
- This method allows an object having a three-dimensional shape to be measured with even fewer captured images than in the light section method, and allows an object having a three-dimensional shape to be measured with higher spatial resolution than measurement using only the space encoding method.
- a three-dimensional shape measurement apparatus that measures a shape by, for example, projecting laser slit light or a slit-shaped light pattern has a problem in that, depending on how the reflected light of the projected light is captured, the measurement lines on which measured values are based may not be detected correctly.
- detection of measurement lines may not occur under the ideal condition (diffuse reflection), and specular reflection may be present, depending on the bidirectional reflectance distribution function (BRDF), which expresses the reflection characteristics of light from an object. As a result, for example, a three-dimensional shape measurement apparatus may measure indirect reflections of the measurement lines.
- the reliability of measured distance values may become low. Further, when the reflectance of an object is low, the signal-to-noise ratio (S/N ratio) of the measurement lines may decrease, making the object difficult to observe and, when numerous noise components are observed, lowering the reliability of the measured distance values.
- encoding errors are detected by projecting, onto an object, a light pattern for detecting an encoding error of a code indicating space division of a portion of a slit-shaped light pattern.
- the method disclosed in PTL 1 requires, for example, projection of a new pattern to remove the influence of indirect light. This results in an increase in the number of operations of projecting a slit-shaped light pattern, although the influence of indirect light can be measured.
- the method disclosed in PTL 1 requires capturing images more frequently than in ordinary methods in measurement of three-dimensional shapes.
- the method disclosed in PTL 1 causes an increase in measurement time and makes high-speed measurement of the three-dimensional shape of an object difficult.
- the present invention has been made in consideration of the above situation, and has as its object to perform measurement of the three-dimensional shape of an object at high speed in a space encoding method.
- a three-dimensional measurement apparatus including: an obtaining unit configured to obtain a captured image of an object onto which pattern light having a light and dark stripe pattern is projected; an edge position detecting unit configured to detect an edge position indicating a boundary between light and dark portions of the pattern light in the captured image; an edge overlap degree detecting unit configured to detect a degree of overlap of the edge positions detected by the edge position detecting unit; and a position computing unit configured to compute a position of the object on the basis of the edge positions and the degree of overlap of the edge positions.
- Fig. 1 illustrates a system configuration of a three-dimensional measurement system.
- Fig. 2 illustrates exemplary slit-shaped light patterns projected onto an object.
- Fig. 3 illustrates an exemplary method for three-dimensional measurement performed by a three-dimensional measurement system.
- Fig. 4 illustrates a software configuration of a processing unit of a three-dimensional measurement system.
- Fig. 5 illustrates other examples of slit-shaped light patterns to be projected onto an object.
- Fig. 6 illustrates an actually captured image of an object.
- Fig. 7 illustrates computation results of edge positions obtained by projection of a low-frequency slit-shaped light pattern.
- Fig. 8 illustrates computation results of edge positions obtained by projection of a middle-frequency slit-shaped light pattern.
- Fig. 9 illustrates computation results of edge positions obtained by projection of a high-frequency slit-shaped light pattern.
- Fig. 10A illustrates computation results of edge overlap values.
- Fig. 10B illustrates computation results of edge overlap values.
- Fig. 11 is a flowchart illustrating edge overlap computing processing.
- Fig. 12 illustrates exemplary influence of occlusion.
- Fig. 13 illustrates computation of edge coordinates.
- Fig. 14 illustrates computation of edge overlap.
- Fig. 1 illustrates an exemplary system configuration of a three-dimensional measurement system.
- the three-dimensional measurement system includes a projector 112, a camera 113, and a processing unit 100.
- the projector 112 projects stripe pattern light or multi-slit pattern light onto an object 117.
- the camera 113 captures an image of the object 117 (pattern light or multi-slit pattern light reflected by the object 117) onto which the pattern light or multi-slit pattern light is projected.
- the processing unit 100 instructs the projection and capturing of stripe pattern light or multi-slit pattern light, processes the captured image data, and performs three-dimensional measurement of the object 117.
- the processing unit 100 includes a central processing unit (CPU) 110, a frame buffer 111, an image memory 114, a memory 115, a controller 116, an output unit (not illustrated), and a display unit (not illustrated). These units are connected to one another through a bus.
- the controller 116 is hardware that communicates with the memory 115, the image memory 114, the frame buffer 111, and the like, and performs auxiliary computation functions.
- the frame buffer 111 is formed of RAMs, for example.
- the CPU 110 forms projection patterns using patterns stored in the memory 115 that includes a non-volatile memory, and stores the projection patterns in the frame buffer 111.
- a pattern shape program for stripe pattern light and multi-slit light and a time account program for setting a projection time and the like are stored in the memory 115 that includes a non-volatile memory.
- the controller 116, upon receipt of a projection instruction from the CPU 110, transmits a stripe pattern light shape signal or a multi-slit light shape signal from the memory 115 to the projector 112 via the frame buffer 111.
- the controller 116, on the basis of an instruction from the CPU 110, transmits a time account signal to the projector 112 and the camera 113. Thereby, the projection and image capturing timings of stripe pattern light or multi-slit pattern light are managed.
- a plurality of pieces of image data captured by the camera 113 are temporarily stored in the image memory 114, and are input to the CPU 110 via the controller 116.
- the CPU 110 performs image processing such as binarization on the input image data. This binarization processing is processing for determining the position of a boundary between light and dark portions of stripe pattern light.
- the CPU 110 through the binarization processing, generates black and white binarized image data.
- the binarized image data is stored in the memory 115 and used to compute distance values in subsequent processing.
- the memory 115 is formed so as to include a ROM, which is a non-volatile memory, a RAM, which is a volatile memory, and the like.
- the memory 115 stores apparatus dependent parameters, such as a base line length between the projector 112 and the camera 113, the focal length of the camera 113, the number of pixels of the camera 113 and external parameters, such as distortion obtained by calibration performed in advance and external light luminance.
- the memory 115 also stores a program for three-dimensional measurement based on triangulation.
- the CPU 110 executes processing based on a three-dimensional measurement program stored in the memory 115, with binarized image data and various parameters as input, thereby performing three-dimensional measurement of the object 117.
- the results of three-dimensional measurement of the object 117 performed by the CPU 110 are generated as distance image data.
- the distance image data is converted into an image by the output unit (not illustrated) and displayed as a distance image by the display unit (not illustrated).
- binarized image data in consideration of a sub-pixel accuracy level can be created by assuming, as the size of the image memory 114, a virtual captured image which is larger than the image captured by the camera 113.
- space encoding based on a Gray code is described as an example of the space encoding.
- Fig. 2 illustrates exemplary slit-shaped light patterns projected onto the object 117.
- slit-shaped light patterns 200 to 202 are stripe patterns which have alternately arranged light and dark slit portions.
- the slit-shaped light patterns 200 to 202 based on a space encoding method represent slit and stripe pattern light based on 3-bit space encoding.
- N-bit encoding slit-shaped light patterns are expressed by N patterns.
- the slit-shaped light pattern 201 represents stripe pattern light which has twice the space resolution of the slit-shaped light pattern 200.
- the slit-shaped light pattern 202 represents stripe pattern light which has twice the space resolution of the slit-shaped light pattern 201.
- the slit-shaped light patterns 200 to 202 based on a space encoding method described above are projected by the projector 112 onto the object 117, and the slit-shaped light patterns 200 to 202 projected onto the object 117 are captured by the camera 113.
- the processing unit 100, by dividing the image capturing area into eight regions, can recognize the object 117.
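As an illustration of the Gray-code stripe patterns described above, the following sketch generates the light/dark value of each stripe position for each bit plane (the function names are illustrative, not from the patent):

```python
def gray_code(i: int) -> int:
    """Convert a stripe index to its Gray code (consecutive codes differ in one bit)."""
    return i ^ (i >> 1)

def stripe_patterns(bits: int):
    """Light (1) / dark (0) value of each of the 2**bits stripe positions,
    one row per bit plane, most significant (lowest-frequency) bit first."""
    n = 1 << bits
    return [[(gray_code(x) >> (bits - 1 - b)) & 1 for x in range(n)]
            for b in range(bits)]
```

For 3 bits this reproduces patterns like 200 to 202: the first row is the lowest-frequency pattern (0 0 0 0 1 1 1 1) and each subsequent row doubles the spatial frequency.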
- Fig. 3 conceptually illustrates an exemplary method for three-dimensional measurement performed by a three-dimensional measurement system.
- the slit-shaped light patterns 200 to 202 formed of stripe patterns in which light and dark portions are alternately arranged with predetermined cycles are projected by the projector 112 onto the object 117.
- the stripe patterns (shapes and sizes) of the slit-shaped light patterns 200 to 202 are determined in advance, and are sequentially and individually projected onto the object 117. Every time each pattern is projected, the slit-shaped light pattern projected onto the object 117 is captured by the camera 113 and is obtained by the processing unit 100 as image data.
- it is assumed that the boundary position between light and dark portions on the object 117 is (X, Y, Z), and that the position of the light of the projector 112 when the boundary position (X, Y, Z) and the projector 112 are connected by a straight line is (X1, Y1, Z1). It is further assumed that the position of the light of the camera 113 when the boundary position (X, Y, Z) and the camera 113 are connected by a straight line is (X2, Y2, Z2).
- the position (X2, Y2, Z2) of the light of the camera 113 is determined by the plane coordinates of the pixel of the image sensor (for example, CCD or CMOS) of the camera 113.
- the position (X1, Y1, Z1) of the light of the projector 112 is determined by the plane coordinates of the pixel of the light projecting device (for example, liquid crystal device).
- a length L between the projector 112 and the camera 113 is the base line length.
- the base line length L is determined by the configuration conditions of the apparatuses.
- the boundary position (X, Y, Z) of the object 117 can be determined by these parameters on the basis of the principle of triangulation.
- the three-dimensional shape of the object 117 can be measured by obtaining the boundary positions (X, Y, Z) over the whole surface of the object 117. Note that obtaining of the boundary positions (X, Y, Z) over the whole surface of the object 117 and the measurement of the three-dimensional shape of the object 117 can be realized using publicly known techniques, and the detailed description thereof is omitted here.
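The triangulation principle can be illustrated with a minimal 2D sketch. The angle parameterization below is an assumption for illustration only; the patent leaves the actual computation to publicly known techniques:

```python
import math

def triangulate_depth(baseline: float, cam_angle: float, proj_angle: float) -> float:
    """Minimal 2D triangulation sketch: the camera and projector lie on a
    baseline of length `baseline` (the base line length L); each sights the
    same light/dark boundary point at the given angle (radians, measured
    from the baseline).  The perpendicular distance to the point is
    Z = L / (cot(a) + cot(b))."""
    return baseline / (1.0 / math.tan(cam_angle) + 1.0 / math.tan(proj_angle))
```

For example, with a baseline of 1 and both sighting angles at 60 degrees, the point lies at the apex of an equilateral triangle, at depth √3/2.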
- Fig. 4 illustrates an exemplary software configuration of the processing unit 100 of a three-dimensional measurement system.
- the processing unit 100 is realized by software components denoted by 402 to 412. These software components correspond to programs executed by hardware such as the CPU 110, the frame buffer 111, the image memory 114, the memory 115, and the controller 116 illustrated in Fig. 1. Note that although these components are described as software in the present embodiment, part or all of them may be realized in hardware. In either case, the purpose of all of these components is to output distance image data 413.
- the distance image data 413 is data formed of a two-dimensional array obtained by computing distance values for each of the pixels of an image captured by the camera 113.
- the distance image data 413 is three-dimensional space information which has a distance value from the camera 113 to the object 117 as a value of each pixel.
- although a system management unit 404 is illustrated as controlling only a pattern projection unit 402 and an image importing unit 403, the other blocks are in practice also managed by the system management unit 404.
- the system management unit 404 performs resource and time management of the entirety of the three-dimensional measurement system, such as management of the timings of projecting slit-shaped light patterns and the timings of capturing images, management of the order in which computation is to be performed, and management of output data.
- the pattern projection unit 402 generates slit-shaped light patterns to be projected by the projector 112.
- the generated slit-shaped light patterns are projected by the projector 112.
- the image of the object 117 at this time is captured by the camera 113 and is imported by the image importing unit 403 into the apparatus (processing unit 100).
- This processing for projection and capturing of slit-shaped light patterns is performed by projection of, for example, the slit-shaped light patterns 200 to 202 illustrated in Fig. 2 and capturing of the image.
- for proper operation of the system, the slit-shaped light patterns of the present embodiment may be either the vertical patterns illustrated in Fig. 2 or the horizontal patterns illustrated in Fig. 5. Note that the description below uses, as an example, the horizontal patterns illustrated in Fig. 5 rather than the vertical patterns illustrated in Fig. 2.
- This selection of patterns is determined by the positional relationship between the camera 113 and the projector 112.
- when the camera 113 and the projector 112 are arranged so as to be separated in the vertical direction (the Y-direction in the example illustrated in Fig. 3), horizontal patterns are used as the slit-shaped light patterns.
- when the camera 113 and the projector 112 are separated in the horizontal direction, vertical patterns are used as the slit-shaped light patterns.
- slit-shaped light patterns based on a Gray code are used as the slit-shaped light patterns, and a plurality of low- to high-frequency slit-shaped light patterns in accordance with the number of bits appropriate for dividing space are used.
- a binarization unit 405 divides each captured image of the object 117 into a region onto which the slit-shaped light patterns are projected and a region onto which they are not. For example, this region division can be performed by thresholding the luminance of the captured images with a threshold determined by a method such as Otsu's method.
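A minimal sketch of the thresholding step, assuming Otsu's classic histogram method over 8-bit luminance values (the patent only names the method; this implementation is illustrative):

```python
def otsu_threshold(pixels):
    """Otsu's method: choose the luminance threshold (0-255) that maximizes
    the between-class variance of the histogram, separating the lit and
    unlit regions of the captured image."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var, w0, sum0 = 0, -1.0, 0, 0.0
    for t in range(256):
        w0 += hist[t]              # weight of the dark class
        if w0 == 0:
            continue
        w1 = total - w0            # weight of the light class
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0             # mean of the dark class
        m1 = (sum_all - sum0) / w1 # mean of the light class
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

Pixels with luminance above the returned threshold would be classified as the projected (light) region.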
- the binarized information obtained by the binarization unit 405 is sent to an edge computing unit 407.
- the edge computing unit 407 computes detailed position information (hereinafter called "edge position information", as needed) regarding portions of a slit-shaped light pattern changing between light and dark, on the basis of the binarized information output from the binarization unit 405.
- the edge computing unit 407 sends the edge position information to a sub-pixel computing unit 409.
- the sub-pixel computing unit 409 computes more detailed coordinate data as the information regarding edge positions on the basis of luminance change information regarding pixels near the edges.
- the information regarding edge positions can be made to be coordinate values with sub-pixel accuracy, rather than pixel accuracy in units of the pixels of the camera 113 or units of the projection pixels of the projector 112.
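One common way to obtain such sub-pixel coordinates, sketched here as an assumption since the patent does not give the formula, is to interpolate the point where the positive-pattern and negative-pattern luminance profiles cross:

```python
def subpixel_edge(p_row, n_row):
    """Sub-pixel edge positions along one image row: the edge lies where the
    positive-pattern luminance crosses the negative-pattern luminance.
    Linear interpolation of the sign change of d = p - n yields sub-pixel
    coordinates rather than whole-pixel ones."""
    edges = []
    d = [p - n for p, n in zip(p_row, n_row)]
    for x in range(len(d) - 1):
        if d[x] == 0:
            edges.append(float(x))          # exact crossing on a pixel
        elif d[x] * d[x + 1] < 0:
            edges.append(x + d[x] / (d[x] - d[x + 1]))  # interpolated crossing
    return edges
```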
- the edge computing unit 407 sends the edge position information to an edge overlap computing unit 410.
- the edge overlap computing unit 410 counts the number of overlapping edges of slit-shaped light patterns in each image capturing pixel, for each of the frequencies of the slit-shaped light patterns described above. In the present embodiment, when a plurality of edge positions exist in a pixel of an image sensor, the edge overlap computing unit 410 determines that the edge positions are overlapping.
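The overlap count could be sketched as follows, assuming sub-pixel edge coordinates are binned into camera pixels by truncation (an illustrative choice, not specified in the patent):

```python
def edge_overlap_values(edges_per_frequency, width):
    """Per-pixel edge overlap values for one image row: count how many of
    the projected frequencies produced an edge inside each camera pixel.
    `edges_per_frequency` is a list (one entry per pattern frequency) of
    sub-pixel edge coordinates; int() maps each to a pixel index."""
    overlap = [0] * width
    for edges in edges_per_frequency:
        for e in edges:
            overlap[int(e)] += 1
    return overlap
```

A value of 1 is the expected case; 2 or more flags the pixel for the reliability check described later.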
- the binarization unit 405 sends the binarized information also to a binary value computing unit 406.
- the binary value computing unit 406 integrates the binarized images, at respective frequencies, generated from the images of the object 117 onto which a plurality of low- to high-frequency slit-shaped light patterns are individually projected. Thereby, the binary value computing unit 406 generates an image in which position information (hereinafter, called "disparity map data", as needed) in the projected image made by the projector 112 is encoded.
- disparity map data has been encoded using a code called a Gray code.
- a projection plane position information decoding unit 408 decodes the disparity map data encoded with a Gray code, and computes disparity map data which is not encoded.
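Decoding a Gray code back to a plain binary stripe index is a standard operation and can be sketched as:

```python
def gray_to_binary(g: int) -> int:
    """Decode a Gray code to the plain binary stripe index by XOR-folding
    the code with successively shifted copies of itself."""
    b = g
    mask = g >> 1
    while mask:
        b ^= mask
        mask >>= 1
    return b
```

Applying this to each pixel of the encoded image yields the un-encoded disparity map data.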
- a distance value computing unit 411 receives the Gray code of the disparity map data decoded by the projection plane position information decoding unit 408, information regarding the edge positions obtained by the sub-pixel computing unit 409, and the count of overlapping edges obtained by the edge overlap computing unit 410. The distance value computing unit 411, upon receipt of these computing results, computes distance image data.
- distance image data at each position on the object 117 can be obtained using a triangulation method.
- the sub-pixel computing unit 409 computes the edge positions of slit-shaped light patterns with sub-pixel accuracy which is higher than accuracy in units of image pixels.
- the positions in three-dimensional space of the object 117 can be computed with very high accuracy.
- a method of computing distance image data is realized by a publicly known technique, and the detailed description thereof is omitted.
- the distance image data (computation results of the distances to the object 117) obtained by the distance value computing unit 411 is sent to a high-reliability distance value computing unit 412.
- the high-reliability distance value computing unit 412 identifies portions where the positions of the edges of the slit-shaped light patterns are overlapping on the basis of the count value computed by the edge overlap computing unit 410. Then, the high-reliability distance value computing unit 412 determines that the distance image data at these portions is inaccurate, and clears the corresponding distance values.
- the high-reliability distance value computing unit 412 reconfigures the distance image data 413 using only the distance image data that has not been determined to be inaccurate.
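The clearing step can be sketched as follows (using NaN as the "cleared" marker is an illustrative choice; the patent does not specify the representation):

```python
import math

def filter_distance_image(distances, overlap_values):
    """Clear distance values wherever the edge overlap value is 2 or more:
    an overlap of exactly 1 is the expected, reliable case for a Gray code,
    while 2 or more suggests indirect light, occlusion, or noise."""
    return [d if c < 2 else math.nan
            for d, c in zip(distances, overlap_values)]
```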
- Fig. 5 illustrates other examples of slit-shaped light patterns to be projected onto an object. As described before, Fig. 5 illustrates slit-shaped light patterns having horizontal patterns, as the slit-shaped light patterns.
- Figs. 6 to 10 illustrate examples of the case in which the camera 113 and the projector 112 are arranged so as to be separated in the vertical direction.
- the count of the overlapping edge positions of slit-shaped light patterns is computed by projecting the horizontal slit-shaped light patterns (horizontal patterns) illustrated in Fig. 5 onto an object 601. Note that in the description below, this count value is called an "edge overlap value", as needed.
- in Figs. 6 to 10, similarly to the example illustrated in Fig. 2, the image of the object 601 is captured for each of a plurality of low- to high-frequency slit-shaped light patterns 500 to 505 which are sequentially projected onto the object 601.
- Computation of the edge portions of the slit-shaped light patterns described above is performed as follows, for example. That is, the positions of the edge portions of the slit-shaped light patterns are computed on the basis of luminance gradient information from an image captured using the stripe projection pattern (positive pattern) 500 and an image captured using the stripe projection pattern (negative pattern) 503.
- slit-shaped light patterns are projected in a state in which position information of the projection plane is encoded using a 3-bit Gray code.
- the number of bits used for a high-frequency pattern is determined by the resolution of the projection pixels of the projector 112. For example, when the resolution of the projection pixels of the projector 112 is 1024 x 768 and horizontal patterns are projected, the required number of bits for the high-frequency pattern is 10.
- Fig. 6 illustrates an example of an actually captured image of an object.
- the object 601 whose image is captured by the camera 113 at the time of projecting a slit-shaped light pattern can be divided into a region with relatively high luminance and a region with relatively low luminance. This processing is performed by the binarization unit 405.
- Processing for projecting slit-shaped light patterns and capturing the image of the slit-shaped light patterns is repeated as many times as is required for necessary resolution, and every time the processing is repeated, edge positions obtained from the captured images are stored. This is performed by the edge computing unit 407 described above.
- Fig. 6 illustrates the captured image 600 in a state in which slit-shaped light patterns are projected onto the object 601.
- Exemplary computation of edge portions illustrated in Figs. 7 to 10 was performed for the object 601.
- Fig. 7 illustrates an example of the computation results of edge positions obtained by projection of a low-frequency slit-shaped light pattern.
- Edge computation values 700 illustrate all of the computation results of the edge positions obtained by projection of the low frequency slit-shaped light pattern.
- the edge computation values 700 are a result computed from the stripe projection pattern (positive pattern) and stripe projection pattern (negative pattern) described above.
- the edge computation values 700 illustrate an edge portion 701 corresponding to a luminance change in the slit-shaped light pattern.
- the edge computing unit 407 stores the value (edge position information) of the edge portion 701 in the memory 115.
- the value (edge position information) of the edge portion 701 is used by the edge overlap computing unit 410.
- Fig. 8 illustrates an example of the computation results of edge positions obtained by projection of a middle-frequency (intermediate-frequency) slit-shaped light pattern.
- edge computation values 800 illustrate all of the computation results of the edge positions obtained by projection of the middle-frequency slit-shaped light pattern.
- when an intermediate-frequency slit-shaped light pattern is projected onto the object 601, computation of the edge portions is possible at higher frequencies, as illustrated by edge portions 801, compared with the edge portion 701 illustrated in Fig. 7.
- Fig. 9 illustrates an example of the computation results of edge positions obtained by projection of a high-frequency slit-shaped light pattern.
- Edge computation values 900 illustrate an example of the computation results of edge positions when the space frequency of the pixels of the projector 112 has been encoded using 10 bits.
- when a high-frequency slit-shaped light pattern is projected onto the object 601, computation of the edge portions is possible at higher frequencies still, as illustrated by edge portions 901, compared with the edge portion 701 illustrated in Fig. 7 and the edge portions 801 illustrated in Fig. 8.
- Figs. 10A and 10B illustrate exemplary computation results of edge overlap values.
- the edge computing unit 407 obtains edge position information from captured images at the time when slit-shaped light patterns with respective frequencies are projected onto the object 601.
- the edge overlap computing unit 410 computes an edge overlap value by counting the occurrences of edges at a location corresponding to the same pixel of the camera 113 on the basis of edge position information obtained from the slit-shaped light patterns with respective frequencies.
- Computation results 1000 of edge overlap values illustrated in Fig. 10A illustrate edge overlap values at respective pixel positions of the camera 113.
- Computation results 1001 of edge overlap values illustrated in Fig. 10B illustrate a three-dimensional graph in which the edge overlap values of the respective pixels of the computation results 1000 are represented in the height direction.
- the image positions (area) 1002 illustrated in Fig. 10A and the image positions (area) 1003 illustrated in Fig. 10B indicate the same position, which has an edge overlap value of two or more.
- the image positions (area) 1004 illustrated in Fig. 10A and the image positions (area) 1005 illustrated in Fig. 10B indicate the same position, which has an edge overlap value of one.
- the high-reliability distance value computing unit 412 determines that an area having an edge overlap value of one is an area of the distance image data in which the reliability of distance computation values at the same position is high. On the other hand, the high-reliability distance value computing unit 412 determines that an area having an edge overlap value of two or more is an area of the distance image data in which the reliability of distance computation values at the same position is low. In this manner, by evaluating edge overlap values, the high-reliability distance value computing unit 412 can extract portions of the distance image data, obtained by the distance value computing unit 411, having low reliability in distance values, and thereby make the portions be not output as the distance image data 413.
- a Gray code is characterized by the fact that the edge positions of low-frequency components and the edge positions of higher-frequency components do not overlap. Also in the examples of projecting the Gray code slit-shaped light patterns illustrated in Figs. 2 and 5, the edge positions at which light and dark portions are switched do not overlap between the low-, intermediate-, and high-frequency slit-shaped light patterns.
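This non-overlap property of the Gray code can be checked directly: exactly one bit plane flips at every stripe boundary, so each edge belongs to a single frequency (the function name is illustrative):

```python
def changed_bit_planes(bits: int):
    """For each boundary between stripe x and x+1, list the bit planes of
    the Gray code that flip there.  Because consecutive Gray codes differ
    in exactly one bit, every boundary is owned by exactly one plane, so
    edges of different frequencies can never coincide."""
    n = 1 << bits
    gray = [x ^ (x >> 1) for x in range(n)]
    return [[b for b in range(bits) if (gray[x] ^ gray[x + 1]) >> b & 1]
            for x in range(n - 1)]
```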
- the edge positions observed in the images of an object captured by a camera apparatus do not overlap, except for in the following cases, for example: the case in which there is an influence of indirect light, the case in which there is occlusion, the case in which the inclination of an object surface is too large for a camera apparatus to resolve the edges of slit-shaped light patterns, and the case in which the reflectance of the object is low, thereby causing a decrease in the signal to noise ratio.
- portions of the distance image data having an edge overlap value of one satisfy the conditions of the algorithm of a space encoding method and have high reliability.
- portions of the distance image data having an edge overlap value of two or more do not satisfy the conditions of the algorithm of a space encoding method, and these portions can be determined to have low reliability.
- an area having an edge overlap value of two or more is under the influence of indirect light, occlusion, noise due to a decrease in signal to noise ratio (S/N ratio), or the like, as described above.
- projection of a special slit-shaped light pattern for measuring the influence of reflection of indirect light or the influence of noise caused by a decrease in signal to noise ratio is not required, and three-dimensional shapes can be measured using captured images which are the same as those used by an ordinary space encoding method.
- there is no increase in the number of image capturing operations and the influence of reflection of indirect light or the influence of a decrease in the signal to noise ratio can be eliminated while keeping an image capturing speed similar to that of an ordinary space encoding method.
- Fig. 11 is an exemplary flowchart illustrating edge overlap computing processing.
- In step S1101, the edge overlap computing unit 410 clears the edge overlap values.
- In step S1102, the image importing unit 403 imports a captured image (p-image) taken at the time of projection of the positive pattern of a slit-shaped light pattern and a captured image (n-image) taken at the time of projection of the negative pattern of the slit-shaped light pattern, for each of the low to high frequencies. Note that the captured images are received one by one in step S1103.
- In step S1103, the binarization unit 405 performs binarization processing on the captured images imported in step S1102 and generates binarized data.
- In step S1104, the edge computing unit 407 extracts edge portions on the basis of the binarized data and generates edge position information.
- In step S1104, the edge computation values 700, 800, and 900 illustrated in Figs. 7 to 9 are obtained.
- In step S1105, the edge overlap computing unit 410 adds "1" to the edge overlap values corresponding to the portions (areas) where edges exist.
- the details of addition processing for the edge overlap values have been described with reference to Fig. 10.
- In step S1106, the system management unit 404 determines whether or not the addition processing for the edge overlap values has been performed for all the frequencies of the slit-shaped light patterns. If it has not, the flow goes back to step S1102, where the captured images corresponding to the next frequency are imported.
- Steps S1102 to S1106 are repeated until the addition processing for the edge overlap values has been performed for all the frequencies of the slit-shaped light patterns.
- In this manner, the edge overlap values are computed.
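Steps S1101 to S1106 amount to accumulating a per-pixel edge count over all pattern frequencies. A minimal sketch, assuming the binarization and edge-extraction stages (steps S1103 to S1104) have already produced one boolean edge map per frequency (the function name is hypothetical):

```python
import numpy as np

def edge_overlap_map(edge_maps):
    """Accumulate per-pixel edge counts over all pattern frequencies.

    edge_maps: list of boolean arrays, one per slit-pattern frequency,
    True where an edge was detected (a stand-in for the output of
    binarization and edge extraction in steps S1103-S1104).
    """
    overlap = np.zeros(edge_maps[0].shape, dtype=np.int32)  # step S1101
    for edges in edge_maps:                                 # steps S1102-S1106
        overlap[edges] += 1                                 # step S1105
    return overlap
```

Per the description, a resulting value of one marks a reliable pixel, while two or more marks a pixel suspected of indirect light, occlusion, or low S/N ratio.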
- An area 1305 is a portion onto which projection light is projected but which is not captured by the camera 1302.
- the surface of the object 1301 is captured by the camera 1302, and light patterns are normally projected by the projector 1303.
- An area 1306 is also a floor surface, but this is a portion onto which pattern light is projected and which is captured by the camera 1302.
- portions having an edge overlap value of two or more are considered to be portions under the influence of indirect light, occlusion, noise caused by a decrease in signal to noise ratio (S/N ratio), or the like.
- the edge overlap value map data of the whole image can be considered to be a map, or reliability map data, indicating the reliability of the edges of the slit-shaped light patterns detected from the projected pattern light.
- the method according to the present embodiment does not require projection of special slit-shaped light patterns for measuring the influence of reflection of indirect light, or the influence of noise caused by a decrease in signal to noise ratio (S/N ratio), compared with the technique disclosed in PTL 1.
- this method can be realized by computation using captured images which are the same as those used in an ordinary space encoding method.
- there is no increase in the number of image capturing operations and the influence of reflection of indirect light or the influence of a decrease in the signal to noise ratio can be eliminated while keeping an image capturing speed similar to that of an ordinary space encoding method.
- Fig. 13 illustrates a method of computing edges in the present embodiment.
- Edges can be computed by means of two captured images, one using a positive pattern and the other using a negative pattern.
- Fig. 13 is a graph illustrating the luminance distribution of captured images of projected patterns. Although a captured image is two-dimensional data, Fig. 13 illustrates the data along the vertical direction of the image as one-dimensional data.
- the horizontal axis represents the vertical coordinates of captured images, and the vertical axis represents the strength of luminance of the images.
- a luminance change curve 1401 represents change in the luminance of a captured image corresponding to positive-pattern projection.
- a luminance change curve 1402 represents change in the luminance of a captured image corresponding to negative-pattern projection.
- the change in luminance illustrated in Fig. 13 is a magnified view of a local portion where a change in pattern occurs.
- the luminance change curve 1401 and the luminance change curve 1402 have the same luminance value in a pattern changing portion 1404.
- an edge coordinate position 1403 is determined.
- the edge coordinate positions of patterns are computed for captured images of respective frequency patterns.
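The crossing of the positive- and negative-pattern luminance curves can be located between two samples by linear interpolation. This is a sketch under that assumption (the source does not specify how the sub-pixel computing unit interpolates), with a hypothetical function name:

```python
def subpixel_edge(pos, neg):
    """Locate where the positive- and negative-pattern luminance
    profiles cross, with sub-pixel precision.

    pos, neg: 1-D luminance samples along an image column, taken from
    the p-image and n-image respectively. Returns the crossing
    coordinate, or None if the two curves never cross.
    """
    for i in range(len(pos) - 1):
        d0 = pos[i] - neg[i]
        d1 = pos[i + 1] - neg[i + 1]
        if d0 == 0:
            return float(i)           # curves meet exactly on a sample
        if d0 * d1 < 0:               # sign change: crossing in (i, i+1)
            return i + d0 / (d0 - d1)  # linear interpolation
    return None
```

For example, with `pos = [10, 40, 90]` and `neg = [90, 60, 20]`, the difference changes sign between indices 1 and 2, giving an edge coordinate of about 1.22 rather than an integer pixel position.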
- Fig. 14 is a schematic diagram illustrating processing for computing an edge overlap value.
- Reference numeral 1501 denotes data indicating edge positions computed using the above-described method on the basis of an image obtained by capturing an N-bit projection pattern.
- Reference numeral 1502 denotes data representing edge positions similarly computed on the basis of an image obtained by capturing an (N + 1)-bit projection pattern.
- the data representing edge positions has a two-dimensional size similar to that of the image capturing screen.
- a map 1503 illustrates, for each pixel, data which is the count of overlapping edges at a pixel, for the patterns of respective numbers of bits.
- the data shows "0" if there are no edges (1508), "1" if there is one edge (a state in which there is no edge overlap) (1507), and "2" if there are two edges (a state in which edges overlap) (1506).
- the edge overlap values at respective pixel positions of a captured image can be computed.
- the high-reliability distance value computing unit 412 determines that the value of distance image data is reliable in an area in which the edge overlap value is one, and makes the distance image data in this area valid. On the other hand, when the edge overlap value is two or more, the high-reliability distance value computing unit 412 determines that the value of the distance image data has low reliability and makes the distance image data in this area invalid. In this manner, invalidation processing is performed for the portions where the distance image data has low reliability, and as a result, the highly reliable distance image data 413 can be generated.
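The validation rule described above can be sketched as a simple mask. Here NaN is an assumed sentinel for invalid pixels; the source only says the data is "made invalid", and the function name is hypothetical:

```python
import numpy as np

def validate_distances(distance, overlap):
    """Invalidate distance values whose edge overlap count is not one.

    distance: float array of computed distance image data.
    overlap:  int array of per-pixel edge overlap values.
    Pixels with an overlap value other than one are set to NaN.
    """
    out = distance.copy()
    out[overlap != 1] = np.nan  # overlap of 0 or >= 2: low reliability
    return out
```

The surviving (non-NaN) pixels correspond to the highly reliable distance image data 413.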
- the edge portions of the slit-shaped light patterns are identified on the basis of respective captured images.
- the reliability of the computed distance values of the positions corresponding to the edges is lowered.
- a second embodiment of the present invention will now be described.
- the first embodiment an example of the case in which the distance image data corresponding to an area having an edge overlap value of two or more is invalidated has been described.
- a Gray code is configured using slit-shaped light patterns with numbers of bits up to a value given by (number of bits of the highest-frequency slit-shaped light pattern minus the edge overlap value), thereby forming the distance image data.
- the present embodiment is different from the first embodiment mainly in part of the method of computing (or employing) distance image data.
- distance computation is performed by forming a Gray code using slit-shaped light patterns with numbers of bits up to a value given by (number of bits of the maximum frequency of slit-shaped light patterns minus the edge overlap value).
- a low-frequency slit-shaped light pattern has a greater light amount than a high-frequency slit-shaped light pattern, and as a result, provides a higher signal to noise ratio (S/N ratio) in the computation results than a high-frequency slit-shaped light pattern.
- the distance image data can be computed for more positions than in the first embodiment.
- the distance value computing unit 411 obtains all the distance image data and the high-reliability distance value computing unit 412 determines whether or not the sub-pixel values of overlapping edge positions are the same and recomputes the distance computation values with an edge overlap value of two or more in accordance with the determination results.
- the edge overlap values computed by the edge overlap computing unit 410 and the edge coordinates (sub-pixel value) computed by the sub-pixel computing unit 409 are used in the distance computing process performed by the distance value computing unit 411. Thereby, distance image data based on projection of a high-frequency slit-shaped light pattern can be computed, which is not possible in the first or second embodiment.
- An area (pixel) having an edge overlap value of two or more is thought to be in a state in which the edge position of a low-frequency slit-shaped light pattern and the edge position of a high-frequency slit-shaped light pattern exist in the same pixel of the camera 113.
- an area (pixel) having an edge overlap value of two or more occurs also when the resolution of edge positions is higher than the pixel resolution of the camera 113.
- the edge positions can be computed with sub-pixel resolution by the sub-pixel computing unit 409, by using the luminance of slit-shaped light patterns of respective frequencies. Hence, the edge positions can be computed in a greater level of detail than the units of the pixels of the camera 113.
- the distance value computing unit 411 determines whether or not the sub-pixel values of the overlapping edge positions are the same. When the sub-pixel values of the overlapping edge positions are determined to be not the same, the distance value computing unit 411 computes a distance to an object corresponding to the edge positions. In this manner, the present embodiment realizes higher-accuracy distance computation than the first and second embodiments, enabling high-accuracy measurement of the three-dimensional shape of an object.
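The decision rule of this embodiment — keep an overlapping edge only if its sub-pixel coordinates are genuinely distinct — can be sketched as below. The function name and the comparison tolerance `tol` are assumptions; the source does not specify how "the same" sub-pixel value is tested:

```python
def edge_is_usable(overlap_value, subpixel_positions, tol=1e-3):
    """Decide whether an edge pixel can still yield a distance value.

    overlap_value:      per-pixel edge overlap count.
    subpixel_positions: sub-pixel edge coordinates that fell in this
                        camera pixel, one per pattern frequency.
    """
    if overlap_value <= 1:
        return True  # no overlap: normal space-encoding conditions hold
    # Overlapping edges are usable only if the sub-pixel coordinates
    # differ, i.e. the edges are distinct within the same camera pixel
    # and merely exceed the camera's pixel resolution.
    first = subpixel_positions[0]
    return any(abs(p - first) > tol for p in subpixel_positions[1:])
```

When the sub-pixel values coincide, the overlap is treated as a genuine reliability failure, as in the first embodiment.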
- the high-reliability distance value computing unit 412 may determine whether or not the sub-pixel values of overlapping edge positions are the same and, when they are the same, clear the computed distance values corresponding to the edge positions.
- the projector 112 realizes an exemplary projector
- the camera 113 realizes an exemplary image capturing unit.
- the distance value computing unit 411 realizes an exemplary distance value computing unit
- the edge computing unit 407 realizes an exemplary edge computing unit.
- the edge overlap computing unit 410 realizes an exemplary edge overlap degree detecting unit
- at least one of the distance value computing unit 411 and the high-reliability distance value computing unit 412 realizes an exemplary position detection reliability computing unit.
- an edge overlap value of two is an exemplary reference of the reliability of distance in the position of an edge.
- the distance values are computed without being invalidated, although the accuracy is lower than in the first embodiment.
- When an edge overlap value computed by the edge overlap computing unit 410 is two or more, the corresponding distance value is treated as an unreliable measured distance value.
- distance measurement is performed using reliable low-frequency slit-shaped light patterns, by interpreting the edge overlap value as follows: (1) When the edge overlap value is one, distance computation is performed by forming a Gray code using projection patterns of all the frequencies. (2) When the edge overlap value is two or more, distance computation is performed by forming a Gray code using projection patterns with numbers of bits up to a value given by (number of bits of the maximum frequency of projection patterns minus the edge overlap value).
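The two cases above can be sketched as follows. `usable_bits` is a hypothetical helper applying the stated rule, and `decode_gray` shows the standard Gray-to-binary conversion assumed when forming the stripe index from the kept bit planes:

```python
def usable_bits(max_bits, overlap_value):
    """Number of Gray-code bits used for decoding a pixel."""
    if overlap_value <= 1:
        return max_bits               # case (1): all frequencies
    return max_bits - overlap_value   # case (2): drop the finest patterns

def decode_gray(bits):
    """Decode Gray-code bits (most significant first) to a stripe index."""
    value = 0
    acc = 0
    for b in bits:
        acc ^= b                      # binary bit = XOR of Gray bits so far
        value = (value << 1) | acc
    return value
```

With, say, 10-bit patterns and an edge overlap value of 3, only the 7 coarsest bit planes would be decoded, giving a coarser but still valid stripe index instead of an invalidated pixel.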
- the present embodiment allows computation of distance image data to be performed for more positions than in the first embodiment.
- distance image data which is not computed in other embodiments is computed on the basis of the results of projecting high-frequency slit-shaped light patterns.
- An image pixel having an edge overlap value of two or more is thought to be in a state in which the edge position of a low-frequency slit-shaped light pattern and the edge position of a high-frequency slit-shaped light pattern exist in the same pixel of a camera apparatus. In other words, this also occurs when the resolution of edge positions is higher than the pixel resolution of the camera apparatus.
- the edge positions can be computed with sub-pixel resolution by the sub-pixel computing unit 409, by using the luminance of slit-shaped light patterns of respective frequencies.
- edge positions can be computed with higher resolution than the units of pixels of a camera apparatus.
- by the distance value computing unit 411 computing distances to an object also using the sub-pixel values of edge positions when the edge overlap value is two or more, distances can be computed with higher accuracy than in the other embodiments.
- aspects of the present invention can also be realized by a computer of a system or apparatus (or devices such as a CPU or MPU) that reads out and executes a program recorded on a memory device to perform the functions of the above-described embodiment(s), and by a method, the steps of which are performed by a computer of a system or apparatus by, for example, reading out and executing a program recorded on a memory device to perform the functions of the above-described embodiment(s).
- the program is provided to the computer for example via a network or from a recording medium of various types serving as the memory device (e.g., computer-readable medium).
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Abstract
Description
(1) When the edge overlap value is one, distance computation is performed by forming a Gray code using projection patterns of all the frequencies.
(2) When the edge overlap value is two or more, distance computation is performed by forming a Gray code using projection patterns with numbers of bits up to a value given by (number of bits of the maximum frequency of projection patterns minus the edge overlap value).
Claims (10)
- A three-dimensional measurement apparatus comprising:
an obtaining unit configured to obtain a captured image of an object onto which pattern light having a light and dark stripe pattern is projected;
an edge position detecting unit configured to detect an edge position indicating a boundary between light and dark portions of the pattern light in the captured image;
an edge overlap degree detecting unit configured to detect a degree of overlap of edge positions detected by the edge position detecting unit; and
a position computing unit configured to compute a position of the object on the basis of the edge positions and the degree of overlap of the edge positions.
- The three-dimensional measurement apparatus according to Claim 1, wherein the light and dark stripe pattern is a Gray code.
- The three-dimensional measurement apparatus according to Claim 1, wherein the position computing unit computes the position of the object on the basis of a pixel, in the captured image, in which the number of overlapping edge positions is one.
- The three-dimensional measurement apparatus according to Claim 1, wherein the position computing unit does not use a pixel, in the captured image, in which the number of overlapping edge positions is two or more for computation of the position of the object.
- The three-dimensional measurement apparatus according to Claim 1, wherein the edge position detecting unit, when the number of overlapping edge positions is two or more, detects the edge position of the light and dark stripe pattern through sub-pixel processing.
- The three-dimensional measurement apparatus according to Claim 1, further comprising a reliability computing unit configured to compute reliability of computing the position of the object on the basis of the degree of overlap of the edge positions.
- The three-dimensional measurement apparatus according to Claim 6, wherein the reliability computing unit does not output the position in which the reliability is below a reference.
- The three-dimensional measurement apparatus according to Claim 6, wherein the reliability computing unit determines a frequency of the light and dark stripe pattern on the basis of the reliability.
- A three-dimensional measurement method comprising:
an obtaining step of obtaining a captured image of an object onto which pattern light having a light and dark stripe pattern is projected;
an edge position detecting step of detecting an edge position indicating a boundary between light and dark portions of the pattern light in the captured image;
an edge overlap degree detecting step of detecting a degree of overlap of the edge positions detected in the edge position detecting step; and
a position computing step of computing a position of the object on the basis of the edge positions and the degree of overlap of the edge positions.
- A computer readable memory medium containing a computer program for causing a computer to execute:
an obtaining step of obtaining a captured image of an object onto which pattern light having a light and dark stripe pattern is projected;
an edge position detecting step of detecting an edge position indicating a boundary between light and dark portions of the pattern light in the captured image;
an edge overlap degree detecting step of detecting a degree of overlap of the edge positions detected in the edge position detecting step; and
a position computing step of computing a position of the object on the basis of the edge positions and the degree of overlap of the edge positions.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201180040145.9A CN103069250B (en) | 2010-08-19 | 2011-08-05 | 3-D measuring apparatus, method for three-dimensional measurement |
US13/817,410 US8964189B2 (en) | 2010-08-19 | 2011-08-05 | Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2010184195 | 2010-08-19 | ||
JP2010-184195 | 2010-08-19 |
Publications (3)
Publication Number | Publication Date |
---|---|
WO2012023256A2 WO2012023256A2 (en) | 2012-02-23 |
WO2012023256A3 WO2012023256A3 (en) | 2012-04-26 |
WO2012023256A4 true WO2012023256A4 (en) | 2012-06-14 |
Family
ID=44675769
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2011/004451 WO2012023256A2 (en) | 2010-08-19 | 2011-08-05 | Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program |
Country Status (4)
Country | Link |
---|---|
US (1) | US8964189B2 (en) |
JP (2) | JP5882631B2 (en) |
CN (1) | CN103069250B (en) |
WO (1) | WO2012023256A2 (en) |
Families Citing this family (42)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9772394B2 (en) | 2010-04-21 | 2017-09-26 | Faro Technologies, Inc. | Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker |
WO2012023256A2 (en) * | 2010-08-19 | 2012-02-23 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program |
US9482529B2 (en) | 2011-04-15 | 2016-11-01 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
GB2504890A (en) | 2011-04-15 | 2014-02-12 | Faro Tech Inc | Enhanced position detector in laser tracker |
US9686532B2 (en) | 2011-04-15 | 2017-06-20 | Faro Technologies, Inc. | System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices |
TR201811449T4 (en) * | 2012-11-07 | 2018-09-21 | Artec Europe S A R L | Method for observing linear dimensions of three-dimensional objects. |
JP6166539B2 (en) * | 2013-01-24 | 2017-07-19 | キヤノン株式会社 | Three-dimensional shape measuring apparatus, control method for three-dimensional shape measuring apparatus, and program |
US9041914B2 (en) | 2013-03-15 | 2015-05-26 | Faro Technologies, Inc. | Three-dimensional coordinate scanner and method of operation |
CN115981000A (en) * | 2013-03-15 | 2023-04-18 | 感知技术有限公司 | Enhanced optical and perceptual digital eyewear |
JP6420572B2 (en) * | 2014-06-13 | 2018-11-07 | キヤノン株式会社 | Measuring apparatus and method |
US10032279B2 (en) * | 2015-02-23 | 2018-07-24 | Canon Kabushiki Kaisha | Information processing apparatus, information processing method, and storage medium |
AU2016222716B2 (en) * | 2015-02-25 | 2018-11-29 | Facebook Technologies, Llc | Identifying an object in a volume based on characteristics of light reflected by the object |
US10488192B2 (en) | 2015-05-10 | 2019-11-26 | Magik Eye Inc. | Distance sensor projecting parallel patterns |
CN104897082B (en) * | 2015-06-08 | 2018-02-23 | 华东师范大学 | A kind of high-speed structures light based on intelligent family moving platform produces and processing unit |
KR102368597B1 (en) * | 2015-11-11 | 2022-03-02 | 삼성전자주식회사 | Image photographing apparatus and method of controlling thereof |
JP6735615B2 (en) * | 2016-06-29 | 2020-08-05 | キヤノン株式会社 | Information processing apparatus, information processing apparatus control method, and program |
WO2018106671A2 (en) | 2016-12-07 | 2018-06-14 | Magik Eye Inc. | Distance sensor including adjustable focus imaging sensor |
CN108732066A (en) * | 2017-04-24 | 2018-11-02 | 河北工业大学 | A kind of Contact-angle measurement system |
JP6942567B2 (en) * | 2017-08-30 | 2021-09-29 | キヤノン株式会社 | Information processing equipment, information processing methods and computer programs |
JP6918651B2 (en) * | 2017-09-05 | 2021-08-11 | 東芝テック株式会社 | Luggage management system |
WO2019070806A1 (en) | 2017-10-08 | 2019-04-11 | Magik Eye Inc. | Calibrating a sensor system including multiple movable sensors |
KR20200054326A (en) | 2017-10-08 | 2020-05-19 | 매직 아이 인코포레이티드 | Distance measurement using hardness grid pattern |
US10679076B2 (en) | 2017-10-22 | 2020-06-09 | Magik Eye Inc. | Adjusting the projection system of a distance sensor to optimize a beam layout |
CN111512180A (en) * | 2017-10-22 | 2020-08-07 | 魔眼公司 | Adjusting a projection system of a distance sensor to optimize beam layout |
US11085761B2 (en) * | 2017-10-30 | 2021-08-10 | Hewlett-Packard Development Company, L.P. | Determining surface structures of objects |
JP2019087008A (en) | 2017-11-07 | 2019-06-06 | 東芝テック株式会社 | Image processing system and image processing method |
AT521004B1 (en) * | 2017-11-30 | 2022-10-15 | Henn Gmbh & Co Kg | Procedure for positioning measuring points on a moving object |
JP7354133B2 (en) | 2018-03-20 | 2023-10-02 | マジック アイ インコーポレイテッド | Camera exposure adjustment for 3D depth sensing and 2D imaging |
KR20200123849A (en) | 2018-03-20 | 2020-10-30 | 매직 아이 인코포레이티드 | Distance measurement using a projection pattern of variable densities |
EP3803266A4 (en) | 2018-06-06 | 2022-03-09 | Magik Eye Inc. | Distance measurement using high density projection patterns |
DE102018211371A1 (en) * | 2018-07-10 | 2020-01-16 | Sirona Dental Systems Gmbh | Optical measuring method and optical measuring device |
WO2020033169A1 (en) | 2018-08-07 | 2020-02-13 | Magik Eye Inc. | Baffles for three-dimensional sensors having spherical fields of view |
JP7111586B2 (en) * | 2018-11-09 | 2022-08-02 | 株式会社Soken | object detector |
US11483503B2 (en) | 2019-01-20 | 2022-10-25 | Magik Eye Inc. | Three-dimensional sensor including bandpass filter having multiple passbands |
EP3903061A1 (en) * | 2019-02-22 | 2021-11-03 | Prophesee | Three-dimensional imaging and sensing using a dynamic vision sensor and pattern projection |
US11474209B2 (en) | 2019-03-25 | 2022-10-18 | Magik Eye Inc. | Distance measurement using high density projection patterns |
CN114073075B (en) | 2019-05-12 | 2024-06-18 | 魔眼公司 | Mapping three-dimensional depth map data onto two-dimensional images |
EP4065929A4 (en) | 2019-12-01 | 2023-12-06 | Magik Eye Inc. | Enhancing triangulation-based three-dimensional distance measurements with time of flight information |
JP2023508501A (en) | 2019-12-29 | 2023-03-02 | マジック アイ インコーポレイテッド | Association between 3D coordinates and 2D feature points |
US11688088B2 (en) | 2020-01-05 | 2023-06-27 | Magik Eye Inc. | Transferring the coordinate system of a three-dimensional camera to the incident point of a two-dimensional camera |
CN113865503A (en) * | 2020-06-30 | 2021-12-31 | 得力富企业股份有限公司 | Centroid detection device |
CN113318410B (en) * | 2021-05-31 | 2022-04-01 | 集美大学 | Running training method |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000337829A (en) | 1999-05-26 | 2000-12-08 | Suzuki Motor Corp | Luminance measuring method, storage medium to store program for measuring luminance, and three-dimensional measuring device |
US7065242B2 (en) * | 2000-03-28 | 2006-06-20 | Viewpoint Corporation | System and method of three-dimensional image capture and modeling |
JP2004191198A (en) * | 2002-12-11 | 2004-07-08 | Fuji Xerox Co Ltd | Apparatus and method for measuring three-dimensional geometry |
JP2005293075A (en) * | 2004-03-31 | 2005-10-20 | Brother Ind Ltd | 3-dimensional shape detection device, 3-dimensional shape detection method, 3-dimensional shape detection program |
JP4501551B2 (en) * | 2004-06-23 | 2010-07-14 | 富士ゼロックス株式会社 | Three-dimensional shape measuring apparatus and method |
US20060017720A1 (en) * | 2004-07-15 | 2006-01-26 | Li You F | System and method for 3D measurement and surface reconstruction |
JP2006098252A (en) * | 2004-09-30 | 2006-04-13 | Brother Ind Ltd | Three-dimensional information acquisition method |
JP4874657B2 (en) * | 2006-01-18 | 2012-02-15 | ローランドディー.ジー.株式会社 | Three-dimensional shape measuring method and apparatus |
JP4744610B2 (en) | 2009-01-20 | 2011-08-10 | シーケーディ株式会社 | 3D measuring device |
JP5618569B2 (en) * | 2010-02-25 | 2014-11-05 | キヤノン株式会社 | Position and orientation estimation apparatus and method |
WO2012023256A2 (en) * | 2010-08-19 | 2012-02-23 | Canon Kabushiki Kaisha | Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program |
JP5587137B2 (en) * | 2010-10-29 | 2014-09-10 | キヤノン株式会社 | Measuring apparatus and measuring method |
-
2011
- 2011-08-05 WO PCT/JP2011/004451 patent/WO2012023256A2/en active Application Filing
- 2011-08-05 CN CN201180040145.9A patent/CN103069250B/en active Active
- 2011-08-05 US US13/817,410 patent/US8964189B2/en active Active
- 2011-08-12 JP JP2011176946A patent/JP5882631B2/en not_active Expired - Fee Related
-
2016
- 2016-02-04 JP JP2016019752A patent/JP6109357B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
WO2012023256A3 (en) | 2012-04-26 |
US20130155417A1 (en) | 2013-06-20 |
JP2012063352A (en) | 2012-03-29 |
JP5882631B2 (en) | 2016-03-09 |
CN103069250A (en) | 2013-04-24 |
JP2016105108A (en) | 2016-06-09 |
CN103069250B (en) | 2016-02-24 |
US8964189B2 (en) | 2015-02-24 |
WO2012023256A2 (en) | 2012-02-23 |
JP6109357B2 (en) | 2017-04-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8964189B2 (en) | Three-dimensional measurement apparatus, method for three-dimensional measurement, and computer program | |
US8970853B2 (en) | Three-dimensional measurement apparatus, three-dimensional measurement method, and storage medium | |
US9007602B2 (en) | Three-dimensional measurement apparatus, three-dimensional measurement method, and computer-readable medium storing control program | |
US10288418B2 (en) | Information processing apparatus, information processing method, and storage medium | |
US9046355B2 (en) | Measuring apparatus, measuring method, and program | |
US9546863B2 (en) | Three-dimensional measuring apparatus and control method therefor | |
US10032279B2 (en) | Information processing apparatus, information processing method, and storage medium | |
JP6566768B2 (en) | Information processing apparatus, information processing method, and program | |
JP6115214B2 (en) | Pattern processing apparatus, pattern processing method, and pattern processing program | |
US10078907B2 (en) | Distance measurement apparatus, distance measurement method, and storage medium | |
CN104200456A (en) | Decoding method for linear structure-light three-dimensional measurement | |
JP2017003331A (en) | Measurement device for measuring a shape of a body to be measured, calculation device, and calculation method and program | |
JP2013257244A (en) | Distance measurement device, distance measurement method, and distance measurement program | |
US20230162442A1 (en) | Image processing apparatus, image processing method, and storage medium | |
JP6867766B2 (en) | Information processing device and its control method, program | |
JP5968370B2 (en) | Three-dimensional measuring apparatus, three-dimensional measuring method, and program | |
WO2019188194A1 (en) | Method for determining center of pattern on lens marker, device for same, program for causing computer to execute said determination method, and recording medium for program | |
JP2020027000A (en) | Correction method for lens marker image, correction device, program, and recording medium | |
JP2020046229A (en) | Three-dimensional measuring device and three-dimensional measuring method | |
JP2012098207A (en) | Position measuring device, position measuring method and marker | |
US20240013420A1 (en) | Image processing apparatus, image processing method, and storage medium | |
JP2016075517A (en) | Perspective distortion measuring device and perspective distortion measuring method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201180040145.9; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11760872; Country of ref document: EP; Kind code of ref document: A2 |
| WWE | Wipo information: entry into national phase | Ref document number: 13817410; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 11760872; Country of ref document: EP; Kind code of ref document: A2 |