WO2011070927A1 - Point cloud data processing device, point cloud data processing method, and point cloud data processing program - Google Patents
- Publication number: WO2011070927A1 (application PCT/JP2010/071188)
- Authority: WIPO (PCT)
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
Definitions
- the present invention relates to a point cloud data processing apparatus, and more particularly to a point cloud data processing apparatus that extracts features of point cloud data of a measurement object and generates a three-dimensional shape automatically in a short time.
- a scanning laser device scans a three-dimensional object to generate a point cloud.
- point clouds are divided into groups of edge points and non-edge points based on changes in depth and normals at the scan points. Each group is fitted to a geometric primitive, and the fitted geometric primitives are extended and intersected to generate a three-dimensional shape.
- segments are formed from point cloud data, and edges and planes are extracted based on the continuity between adjacent polygons, the normal direction, or the distance.
- the point group data of each segment is fitted with a plane equation or a curved surface equation by the least squares method according to its planarity or curvature, and grouping is performed to generate a three-dimensional shape.
- a two-dimensional rectangular area is set for three-dimensional point group data, and a combined normal vector of measurement points corresponding to the rectangular area is obtained. All measurement points in the rectangular area are rotationally moved so that the combined normal vector coincides with the Z-axis direction.
- the standard deviation σ of the Z values is determined for the measurement points in the rectangular area, and when the standard deviation σ exceeds a predetermined value, the measurement point corresponding to the central point of the rectangular area is treated as noise.
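- As an illustrative sketch (not the patented method itself), the prior-art noise test described above can be expressed as follows; the PCA-based plane fit, the Rodrigues rotation, and all function and parameter names are assumptions introduced for illustration.

```python
import numpy as np

def flag_noise_by_z_deviation(points, sigma_threshold):
    """Rotate the rectangular area so its combined normal vector coincides
    with the Z axis, then test whether the standard deviation of the
    rotated Z values exceeds the predetermined threshold (in which case
    the centre point of the area would be treated as noise)."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # Combined normal vector: direction of least variance (PCA plane fit).
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    z = np.array([0.0, 0.0, 1.0])
    v = np.cross(normal, z)
    c = float(np.dot(normal, z))
    if np.linalg.norm(v) < 1e-12:            # normal already along +/- Z
        rotated = centered if c > 0 else -centered
    else:
        # Rodrigues' formula: rotation taking `normal` onto the Z axis.
        vx = np.array([[0.0, -v[2], v[1]],
                       [v[2], 0.0, -v[0]],
                       [-v[1], v[0], 0.0]])
        rot = np.eye(3) + vx + vx @ vx / (1.0 + c)
        rotated = centered @ rot.T
    return float(rotated[:, 2].std()) > sigma_threshold
```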
- the present invention aims to provide a technique for extracting features from point cloud data of a measurement object and generating a three-dimensional shape automatically in a short time.
- the invention according to claim 1 comprises a non-surface removing unit for removing points of non-surface regions from the point cloud data of a measurement object, and a surface labeling unit for applying the same label to points on the same surface among the points other than those removed by the non-surface removing unit;
- a point cloud data processing apparatus characterized by comprising these units.
- the characteristics of the measurement object are mainly composed of three-dimensional edges that form a solid, and two-dimensional edges that form a pattern in a plane and a curved surface (hereinafter simply referred to as a plane).
- the three-dimensional edge is a line of intersection between planes having different positions and orientations or the outer edge of each plane, and the two-dimensional edge is a line or point where the color density changes sharply in the same plane.
- the three-dimensional edge constitutes the contour of the measurement object.
- here, the contour means the lines forming the outline of the measurement object, which are necessary for visually grasping its appearance. Specifically, bent portions and portions where the curvature sharply decreases constitute the contour.
- the contour is not limited to the outer edge; edges that characterize convexly protruding portions and edges that characterize concavely recessed portions (for example, groove structures) are also included.
- a so-called line drawing can be obtained from the contour lines, and an image display in which the appearance of the object is easily grasped can be performed.
- since contour lines exist at boundaries between surfaces and at edge portions, and these portions are removed from the point cloud data as non-surface regions in the present invention, the contour lines are obtained, for example, from the intersection lines of the divided surfaces.
- the contour line is obtained by calculating the convex envelope.
- the invention according to claim 2 is characterized in that the invention according to claim 1 further comprises a normal calculation unit that obtains a local plane centered on each point of the point cloud data and calculates the normal of the local plane.
- the points on the same plane can be extracted based on the normal direction at each point of the point cloud data, and the points of the point cloud data can be divided into each plane.
- the invention according to claim 3 is characterized in that the invention according to claim 2 further comprises a local curvature calculation unit that calculates a local curvature by obtaining the standard deviations of the three axis components of the normals and taking the square root of the sum of their squares.
- according to the third aspect of the present invention, points in non-surface regions can be removed based on the variation (local curvature) of the normals at each point of the point cloud data.
- the invention according to a fourth aspect is characterized in that, in the invention according to the third aspect, the non-surface removing portion removes points of the non-surface region based on the local curvature.
- this makes it possible to remove points in non-surface regions, including sharp three-dimensional edges generated by changes in the orientation of surfaces, smooth three-dimensional edges generated by curved surfaces with large local curvature, and noise.
- the invention according to claim 5 is characterized in that, in the invention according to claim 2, the non-surface removing unit removes points in the non-surface area based on the accuracy of the fitting of the local plane.
- this eliminates three-dimensional edges generated by occlusion (a state in which an object behind is hidden by an object in front) and points in non-surface regions containing noise.
- the accuracy of the fitting of the local plane is, for example, the average distance between the local plane and each point used for its calculation. Since a three-dimensional edge generated by occlusion joins points of the object behind and points of the object in front, whose three-dimensional positions differ greatly, the fitting accuracy of the local plane can be used to remove such three-dimensional edges and points in non-surface regions containing noise.
- the invention according to a sixth aspect is characterized in that, in the invention according to the second aspect, the non-surface removing unit removes points in non-surface regions based on the coplanarity of a point of interest and its adjacent points.
- coplanarity between the point of interest and an adjacent point is the condition that each normal is orthogonal to the line segment connecting the two points, that is, that the respective inner products are zero.
- the invention according to claim 7 is characterized in that, in the invention according to claim 1, the surface labeling unit applies the same label to points on the same surface based on the angle difference between the normals at the point of interest and the adjacent point.
- the points of the point cloud data can be divided into planes.
- the invention according to claim 8 is characterized in that the invention according to claim 1 further comprises a noise removal unit that removes noise based on the areas of the surfaces divided by the surface labeling unit.
- the invention according to claim 9 is characterized in that the invention according to claim 1 further comprises a label extension unit that gives the label of the nearest surface to points to which no label was given by the surface labeling unit, thereby extending the labels.
- the invention according to claim 10 is characterized in that in the invention according to claim 1, the point cloud data is data in which three-dimensional coordinates of each point and a two-dimensional image are linked.
- the technique of image processing can be applied to the extraction of two-dimensional edges which are difficult to extract only by three-dimensional coordinates.
- the invention according to claim 11 is characterized in that, in the invention according to claim 10, the two-dimensional edge extraction unit extracts two-dimensional edges from within the areas of the two-dimensional image corresponding to the surfaces divided by the surface labeling unit.
- the invention according to claim 12 is characterized in that, in the invention according to claim 10, the two-dimensional edge extraction unit extracts two-dimensional edges from within the areas of the two-dimensional image corresponding to the surfaces divided by the surface labeling unit, and judges the extracted two-dimensional edges based on the three-dimensional positions of three-dimensional edges extracted in their vicinity.
- the three-dimensional position of the two-dimensional edge extracted from the two-dimensional image can be confirmed based on the three-dimensional position of the three-dimensional edge constituting the outer edge of the plane.
- the invention according to a thirteenth aspect is characterized in that the invention according to the first aspect further comprises a rotary irradiation unit that rotationally irradiates distance measuring light onto the measurement object, a distance measuring unit that measures the distance to a measurement point on the measurement object based on the time of flight of the distance measuring light, an irradiation direction detection unit that detects the irradiation direction of the distance measuring light, and a three-dimensional coordinate calculation unit that calculates the three-dimensional coordinates of the measurement point based on the distance and the irradiation direction.
- point cloud data consisting of three-dimensional coordinates can be acquired.
- the invention according to a fourteenth aspect is characterized in that the invention according to the first aspect further comprises a grid formation unit that, when the distances between the points of the point cloud data are not constant, forms a grid at equal intervals and registers the point closest to each grid intersection.
- the point-to-point distance of the point cloud data can be corrected.
- the invention according to claim 15 is characterized in that the invention according to claim 13 further comprises an imaging unit that images the measurement object to acquire a two-dimensional image, and a link formation unit that forms point cloud data in which the three-dimensional coordinates of the measurement points are linked with the two-dimensional image.
- thereby, the three-dimensional edges constituting the solid shape are extracted based on the three-dimensional coordinates, and the two-dimensional edges constituting in-plane patterns (lines or points where the color density changes sharply) are extracted based on the two-dimensional image.
- the invention according to claim 16 is characterized in that the invention according to claim 1 further comprises photographing units that photograph the measurement object in overlapping photographing regions from different directions, a feature point association unit that associates feature points in the overlapping images obtained by the photographing units, a photographing position and orientation measurement unit that measures the positions and orientations of the photographing units, and a three-dimensional coordinate calculation unit that calculates the three-dimensional coordinates of the feature points based on the positions and orientations of the photographing units and the positions of the feature points in the overlapping images.
- the invention according to claim 17 comprises a non-surface removing procedure for removing points of non-surface regions from the point cloud data of the measurement object, and a procedure for applying the same label to points on the same surface among the points other than those removed by the non-surface removing procedure.
- the invention according to claim 18 comprises a non-surface removing procedure for removing points of non-surface regions from the point cloud data of the measurement object, and a procedure for applying the same label to points on the same surface among the points other than those removed by the non-surface removing procedure.
- according to the eighteenth aspect of the present invention, features can be extracted from the point cloud data of the measurement object and a three-dimensional shape can be generated automatically and in a short time.
- the invention according to claim 19 comprises a non-surface removing unit for removing points of non-surface regions from the point cloud data of the measurement object, and a unit for applying the same label to points on the same surface among the points other than those removed by the non-surface removing unit.
- the invention according to claim 20 comprises a non-surface removing section for removing points of non-surface regions from the point cloud data of the measurement object, and a section for applying the same label to points on the same surface among the points other than those removed by the non-surface removing section;
- a point cloud data processing apparatus characterized by comprising these sections.
- the invention according to claim 21 is a point cloud data processing apparatus comprising a non-surface removing section for removing points of non-surface regions from the point cloud data of the measurement object, a surface extraction unit that applies the same label to points on the same surface among the points other than those removed by the non-surface removing section and extracts a plurality of surfaces constituting the measurement object, and a three-dimensional shape extraction unit that extracts the three-dimensional shape of the measurement object based on the plurality of surfaces extracted by the surface extraction unit.
- according to the present invention, features can be extracted from the point cloud data of the measurement object and a three-dimensional shape can be generated automatically and in a short time.
- Diagram (A) shows a sharp three-dimensional edge formed by straight lines; diagram (B) shows a smooth three-dimensional edge formed by straight lines; diagram (C) shows a sharp three-dimensional edge formed by curves; diagram (D) shows a smooth three-dimensional edge formed by curves. FIG. 5 is a cross-sectional view of a smooth three-dimensional edge, and a conceptual diagram shows the principle of calculating a convex envelope.
- Reference Signs List: 1 point cloud data processing device; 2 point cloud data; 3 data input unit; 4 calculation unit; 5 storage unit; 6 operation unit; 7 display unit; 8 data output unit; 9 grid formation unit; 10 surface extraction unit; 11 three-dimensional edge extraction unit; 12 two-dimensional edge extraction unit; 13 edge integration unit; 14 normal calculation unit; 15 local curvature calculation unit; 16 non-surface removal unit; 17 surface labeling unit; 18 noise removal unit; 19 label extension unit; 20 three-dimensional polyline; 22 alignment unit; 23 rotation mechanism unit; 24 distance measuring unit; 25 imaging unit; 26 control unit; 27 main body unit; 28 rotary irradiation unit; 29 base plate; 30 lower casing; 31 pin; 32 adjustment screw; 33 tension spring; 34 leveling motor; 35 leveling drive gear; 36 leveling driven gear; 37 inclination sensor; 38 horizontal rotation motor; 39 horizontal rotation drive gear; 40 horizontal rotation gear; 41 rotation shaft; 42 rotation base; 43 bearing member; 44 horizontal angle detector; 45 main body casing; 46 lens barrel; 47 optical axis; 48 beam splitter; 49, 50 optical axes; 51 pulsed laser light source; 52 perforated mirror; 53 beam waist changing optical system; 54 distance measuring light receiving unit; 55 rotary mirror for elevation; 56 projection optical axis; 57 condensing lens; 58 image light receiving unit; 59 projection casing; 60 flange; 61 mirror holder plate; 62 rotating shaft; 63 elevation gear; 64 elevation angle detector; 65 elevation drive motor; 66 drive gear; 67 illumination light; 68 external storage device; 69 horizontal drive unit; 70 elevation drive unit; 71 leveling drive unit; 72 distance data processing unit; 73 image data processing unit; 74 three-dimensional coordinate calculation unit; 75 link formation unit; 76, 77 photographing units; 78 feature projection unit; 79 calibration subject; 80 target; 81 photographing position and orientation measurement unit; 82 feature point association unit; 83 background removal unit; 84 feature point extraction unit; 85 corresponding point search unit; 86 three-dimensional coordinate calculation unit; 87 incorrect corresponding point determination unit; 88 disparity determination unit; 89 space determination unit; 90 form determination unit; 91, 92 surfaces; 93 virtual three-dimensional edge; 94 cylinder; 95 true three-dimensional edge; 301 contour line; 302 surface; 303 non-surface region; 304 surfaces.
- FIG. 1 is a block diagram of a point cloud data processing apparatus.
- the point cloud data processing apparatus 1 extracts features of the measurement object based on the point cloud data 2 of the measurement object, and generates a three-dimensional shape based on the features.
- the features of the measurement object are mainly composed of three-dimensional edges constituting a solid, and two-dimensional edges constituting a pattern in a plane and a curved surface (hereinafter simply referred to as a plane).
- the three-dimensional edge is a line of intersection between planes having different positions and orientations or the outer edge of each plane, and the two-dimensional edge is a line or point where the color density changes sharply in the same plane.
- the three-dimensional shape generated by the point cloud data processing device 1 is a schematic shape based on the feature, and is a three-dimensional polyline 20 configured by a three-dimensional edge and a two-dimensional edge.
- the point cloud data processing device 1 includes a data input unit 3, a calculation unit 4, a storage unit 5, an operation unit 6, a display unit 7, and a data output unit 8.
- the data input unit 3 is an external storage means such as a flash memory, a magnetic storage means such as a hard disk, or a connection means to a LAN (Local Area Network) or a WAN (Wide Area Network). The calculation unit 4 is an operation means such as a central processing unit (CPU), an application specific integrated circuit (ASIC), or a programmable logic device (PLD) such as a field programmable gate array (FPGA).
- the storage unit 5 is a main storage unit such as a random access memory (RAM), and the operation unit 6 is an operation unit such as a mouse and a keyboard or a touch panel.
- the display unit 7 is a display unit such as a liquid crystal display, and the data output unit 8 uses the same configuration as that of the data input unit 3.
- the calculation unit 4 is operated by the operation unit 6, and the calculation unit 4 receives point cloud data 2 from the data input unit 3.
- the point cloud data 2 mainly includes three-dimensional coordinates and RGB intensity (two-dimensional image).
- the point cloud data 2 input from the data input unit 3 is stored in the storage unit 5, and the calculation unit 4 extracts the three-dimensional edges and two-dimensional edges of the measurement object based on the point cloud data 2 stored in the storage unit 5.
- the three-dimensional and two-dimensional edges extracted by the calculation unit 4 are displayed on the display unit 7 as a three-dimensional polyline 20.
- the display unit 7 can simultaneously display the three-dimensional polyline 20 in two-dimensional space and three-dimensional space. Further, the three-dimensional polyline 20 can be output to the data output unit 8 as CAD (Computer Aided Design) data.
- the calculation unit 4 includes a surface extraction unit 10, a three-dimensional edge extraction unit 11, a two-dimensional edge extraction unit 12, and an edge integration unit 13. These are configured by programs executed by the CPU or PLDs such as ASICs or FPGAs.
- the surface extraction unit 10 extracts a surface from the point cloud data 2.
- the surface extraction unit 10 includes a normal calculation unit 14, a local curvature calculation unit 15, a non-surface removal unit 16, a surface labeling unit 17, a noise removal unit 18, and a label extension unit 19.
- the normal calculation unit 14 calculates a normal vector in the local plane of each point, and the local curvature calculation unit 15 calculates the variation (local curvature) of the normal vector in the local region.
- the non-surface removal unit 16 removes points in the non-surface region based on (1) a portion with high local curvature, (2) fitting accuracy of the local plane, and (3) coplanarity.
- the non-surface region is a region that is neither a flat surface nor a curved surface, but may include curved surfaces with high curvature depending on the threshold values of (1) to (3).
- the surface labeling unit 17 applies the same label to the points on the same surface of the remaining points based on the continuity of the normal vector.
- the noise removing unit 18 removes a label (face) having a small area as noise, and the label extension unit 19 extends the label by giving the label of the nearest face to the point without the label. As a result, a plane is extracted from the point cloud data 2.
- the calculation unit 4 shown in FIG. 1 further includes a re-labeling unit and a label integration unit subsequent to the label expansion unit 19.
- the three-dimensional edge extraction unit 11 extracts a three-dimensional edge based on at least one of the intersection line between the surfaces extracted by the surface extraction unit 10 and a convex envelope that wraps each surface in a convex shape.
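- The patent does not spell out how the convex envelope is computed; one common realisation, sketched here under that assumption, is to project the points of a labelled surface onto its fitted plane and take the two-dimensional convex hull of the projected coordinates (Andrew's monotone chain algorithm). The function name and this choice of algorithm are assumptions for illustration.

```python
def convex_hull_2d(points):
    """Andrew's monotone chain convex hull of 2D points.

    Returns the hull vertices in counter-clockwise order, starting from
    the lexicographically smallest point; interior points are discarded.
    """
    pts = sorted(set(map(tuple, points)))
    if len(pts) <= 2:
        return list(pts)

    def cross(o, a, b):
        # z-component of (a - o) x (b - o): >0 means a left turn.
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                     # build lower hull left-to-right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):           # build upper hull right-to-left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]    # drop duplicated endpoints
```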
- the two-dimensional edge extraction unit 12 extracts two-dimensional edges from within the surfaces segmented by the surface extraction unit 10, and extracts as two-dimensional edges those whose three-dimensional positions are substantially equal to the three-dimensional positions of nearby three-dimensional edges.
- the edge integration unit 13 integrates the three-dimensional edge extracted by the three-dimensional edge extraction unit 11 and the two-dimensional edge extracted by the two-dimensional edge extraction unit 12 to generate a three-dimensional polyline 20.
- the edge integration unit 13 is an example of means for calculating the three-dimensional contour of the measurement object, and is an example of the contour integration unit of the present invention. Further, the edge integration unit 13 integrates the three-dimensional edges and the two-dimensional edges extracted based on the surfaces extracted by the surface extraction unit 10, and calculates the data necessary for grasping the shape of the three-dimensional measurement object (in this example, a three-dimensional polyline). In this sense, the edge integration unit 13 is also an example of a three-dimensional shape extraction unit that extracts the three-dimensional shape of the measurement object based on the plurality of surfaces extracted by the surface extraction unit 10.
- the processing of the calculation unit 4 is repeated for each block. The calculation unit 4 then transforms the points of the three-dimensional edges and two-dimensional edges extracted in each block into the same coordinate system, and generates a three-dimensional polyline 20 spanning the plurality of blocks.
- FIG. 2 is a flowchart showing the flow of processing of the computing unit.
- a program for executing this flowchart can be provided by a computer readable recording medium such as a CD-ROM.
- the operation unit 4 receives the point cloud data 2 (step S1), and extracts a surface from the point cloud data 2 (step S2). Further, the computing unit 4 calculates a normal vector in the local plane of each point in order to extract a surface (step S3).
- the equation of the local plane is determined from the three-dimensional coordinates of each point (local plane fitting).
- the least squares method is used for fitting of the local plane.
- in the case of Equation 1, the normal vectors (nvx, nvy, nvz) are (a1, b1, −1), (a2, −1, b2), and (−1, a3, b3).
- normalization is performed so that the magnitude of the normal vector is 1 (each component ranges from −1 to 1).
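- The local plane fitting and normal calculation of steps S2 and S3 above can be sketched as follows for a single z = a·x + b·y + c parameterisation; the function name, the use of only this one parameterisation, and the NumPy-based least squares solver are assumptions introduced for illustration.

```python
import numpy as np

def local_plane_normal(points):
    """Fit the local plane z = a*x + b*y + c to the given points by the
    least squares method and return the unit normal (a, b, -1)/|(a, b, -1)|.

    Assumes the local plane is not vertical, so this parameterisation is
    valid; the patent also uses the other two axis parameterisations.
    """
    pts = np.asarray(points, dtype=float)
    # Design matrix [x, y, 1]; solve for (a, b, c) minimising |A @ p - z|^2.
    a_mat = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    (a, b, c), *_ = np.linalg.lstsq(a_mat, pts[:, 2], rcond=None)
    normal = np.array([a, b, -1.0])
    # Normalise so each component ranges from -1 to 1.
    return normal / np.linalg.norm(normal)
```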
- FIG. 3 is a drawing substitute photograph showing an intensity image (nvx) of a normal vector in the x-axis direction
- FIG. 4 is a drawing substitute photograph showing an intensity image (nvy) of a normal vector in the y-axis direction
- FIG. 5 is a drawing substitute photograph showing an intensity image (nvz) of a normal vector in the z-axis direction.
- FIGS. 3 to 5 show the results using a 7 ⁇ 7 local plane.
- the variation (local curvature) of the normal vector in the local region is calculated (step S4).
- the averages (mnvx, mnvy, mnvz) and the standard deviations (stdnvx, stdnvy, stdnvz) of the intensity values (nvx, nvy, nvz) of the three axis components of the normal vectors are determined in a square area of about 3 to 7 pixels centered on the point of interest.
- the square root of the sum of squares of the standard deviation is calculated as a local curvature (crv).
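- The local curvature calculation of step S4 follows directly from the description above; the function name and array-based formulation are assumptions for illustration.

```python
import numpy as np

def local_curvature(normals):
    """Local curvature (crv): the standard deviations (stdnvx, stdnvy,
    stdnvz) of the three axis components of the normal vectors in the
    local square area, combined as the square root of the sum of squares.

    `normals` is an (N, 3) array of unit normals around the attention point.
    """
    stdnv = np.asarray(normals, dtype=float).std(axis=0)
    return float(np.sqrt(np.sum(stdnv ** 2)))
```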
- FIG. 6 is a drawing substitute photograph showing a curvature image (stdnvx) in the x-axis direction
- FIG. 7 is a drawing substitute photograph showing a curvature image (stdnvy) in the y-axis direction
- FIG. 8 is a drawing substitute photograph showing a curvature image (stdnvz) in the z-axis direction.
- FIG. 9 is a drawing substitute photograph showing a local curvature image (crv).
- the non-surface region is a region that is neither a flat surface nor a curved surface, but may include a curved surface with high curvature depending on the threshold values of (1) to (3).
- (1) Portion with high local curvature
- the point with the high local curvature found in step S4 is removed. Since the local curvature represents the variation of the normal vector at the point of interest and its peripheral points, the value is small on a surface (a flat surface and a curved surface with small curvature), and the value is large outside the surface. Therefore, if the local curvature is larger than a predetermined threshold, it is determined that the point of interest is not on the surface. That is, the local curvature image of FIG. 9 is binarized, and the portion larger than the threshold is removed.
- FIG. 10 is a diagram showing the distance between a point used for calculating the local plane and the local plane.
- the local plane L is determined by designating a point A on the surface.
- Point A is an average coordinate of points P1 to P8 used for calculation of the local plane.
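- The fitting-accuracy measure described above, the average distance between the local plane and the points used to calculate it, can be sketched as follows; the function signature and plane representation (unit normal plus the centroid point A) are assumptions for illustration.

```python
import numpy as np

def plane_fit_accuracy(points, normal, point_on_plane):
    """Average distance between each point used for the local-plane fit
    and the fitted plane. `normal` must be a unit vector and
    `point_on_plane` a point on the plane (e.g. the centroid A)."""
    pts = np.asarray(points, dtype=float)
    n = np.asarray(normal, dtype=float)
    # Signed distance of each point along the normal, then absolute value.
    dists = np.abs((pts - np.asarray(point_on_plane, dtype=float)) @ n)
    return float(dists.mean())
```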
- FIG. 11 is an explanatory view for explaining a method of determining coplanarity.
- let the normal vectors of the local plane p1 and the local plane p2 be n1 and n2, and let the vector connecting the points defining the two planes be r12.
- if the two points lie on the same plane, the inner products of the normal vectors n1 and n2 with the vector r12 connecting the two local planes are close to zero, because the vectors are nearly orthogonal. Using this property, if the larger of the two inner product values exceeds a predetermined threshold, it is determined that the point of interest is not on the surface (Equation 3).
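- The coplanarity check corresponding to Equation 3 can be sketched as follows; the function signature, the normalisation of r12, and the exact threshold handling are assumptions for illustration, since the patent only states that the larger inner product is compared with a threshold.

```python
import numpy as np

def coplanar(p1, n1, p2, n2, threshold):
    """Return True when the points p1 and p2 (with unit normals n1, n2)
    are judged coplanar: both normals must be nearly orthogonal to the
    connecting vector r12, i.e. both inner products close to zero."""
    r12 = np.asarray(p2, dtype=float) - np.asarray(p1, dtype=float)
    norm = np.linalg.norm(r12)
    if norm == 0.0:
        return True                     # identical points: trivially coplanar
    r12 /= norm
    worst = max(abs(float(np.dot(n1, r12))), abs(float(np.dot(n2, r12))))
    return worst <= threshold
```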
- non-surface regions including sharp three-dimensional edges generated by changes in the orientation of surfaces and smooth three-dimensional edges generated by curved surfaces with large curvature can be extracted by method (1); non-surface regions including three-dimensional edges generated by occlusion, where the point positions change sharply because the object behind is hidden by the object in front, can be extracted by method (2); and non-surface regions including sharp three-dimensional edges generated by changes in the orientation of surfaces can also be extracted by method (3).
- FIG. 12 is a drawing substitute photograph showing the result of non-surface removal on a two-dimensional space
- FIG. 13 is a drawing substitute photograph showing the result of non-surface removal on a three-dimensional space
- FIG. 14 is a drawing substitute photograph showing the result of non-surface removal viewed from a direction different from FIG. 13.
- the black part in the figure is the pixel removed by this process, and the part between the faces having different orientations is removed.
- inside the circular frame B in FIG. 13 and FIG. 14, there is a portion determined to be non-surface although it is a surface region. This portion corresponds to a person who passed by during acquisition of the point cloud data 2. Non-surface removal can thus also remove large noise such as trees in front of the measurement object and passersby.
- in step S6, surface labeling is performed on the remaining points based on the continuity of the normal vectors. Specifically, if the angle difference between the normal vectors of the point of interest and an adjacent point in the 8-neighborhood is equal to or less than a predetermined threshold, the same label is attached. After surface labeling, it is determined whether each label (surface) is a flat surface or a curved surface with small curvature, using the angle differences of the normal vectors or the standard deviations of their three axis components.
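- Step S6 can be sketched as a region-growing pass over a grid of normal vectors; the breadth-first formulation, the grid representation, and all names are assumptions introduced for illustration.

```python
from collections import deque

import numpy as np

def label_surfaces(normals, angle_threshold_deg):
    """Region-growing surface labelling.

    `normals` is an (H, W, 3) grid of unit normal vectors, one per scan
    point. Starting from each unlabelled point, its label is propagated to
    any of the 8 neighbouring points whose normal differs by at most
    `angle_threshold_deg`. Returns an (H, W) integer label image.
    """
    h, w, _ = normals.shape
    labels = -np.ones((h, w), dtype=int)
    cos_thr = np.cos(np.radians(angle_threshold_deg))
    next_label = 0
    for i in range(h):
        for j in range(w):
            if labels[i, j] != -1:
                continue
            labels[i, j] = next_label
            queue = deque([(i, j)])
            while queue:
                y, x = queue.popleft()
                for dy in (-1, 0, 1):
                    for dx in (-1, 0, 1):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == -1:
                            # Angle difference via dot product of unit normals.
                            if np.dot(normals[y, x], normals[ny, nx]) >= cos_thr:
                                labels[ny, nx] = next_label
                                queue.append((ny, nx))
            next_label += 1
    return labels
```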
- FIG. 15 is a drawing substitute photograph showing the result of surface labeling on a two-dimensional space
- FIG. 16 is a drawing substitute photograph showing the result of surface labeling on a three-dimensional space.
- the faces are divided into 10 colors; however, labels that are not connected in the 8-neighborhood are different faces even when shown in the same color.
- FIG. 17 is a drawing substitute photograph showing the result of noise removal. As shown in FIG. 17, small labels present on the upper and lower ends etc. are removed as noise (blackened in the figure).
- the same label as that of the nearest surface is given to the point where no label has been provided. That is, the already labeled surface is expanded (step S8).
- the equation of the labeled surface is determined, and the distance between the surface and the point without the label is determined. If there are multiple labels (faces) around the point without labels, the label with the shortest distance is selected.
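- The label-expansion rule of step S8 (give the unlabelled point the label of the surface at the shortest distance) can be sketched as follows; the plane representation as a unit normal plus a point on the plane, and all names, are assumptions for illustration.

```python
import numpy as np

def expand_label(point, planes):
    """Give the unlabelled `point` the label of the nearest labelled plane.

    Each plane is a dict with a 'label', a unit 'normal' n, and an
    'origin' point on the plane; the point-to-plane distance is
    |n . (p - origin)|, and the label with the shortest distance wins."""
    p = np.asarray(point, dtype=float)
    best_label, best_dist = None, float("inf")
    for plane in planes:
        d = abs(float(np.dot(plane["normal"],
                             p - np.asarray(plane["origin"], dtype=float))))
        if d < best_dist:
            best_label, best_dist = plane["label"], d
    return best_label
```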
- FIG. 18 is a drawing substitute photograph showing the result of label expansion in a two-dimensional space
- FIG. 19 is a drawing substitute photograph showing the result of label expansion in a three-dimensional space. As shown in FIGS. 18 and 19, the label is expanded in the surface or at the end of the surface.
- in the above manner, surfaces are extracted from the point cloud data 2 (step S2).
- when unlabeled points still remain, relabeling is performed by automatically adjusting the various threshold values used in non-surface removal (step S5), noise removal (step S7), and label expansion (step S8).
- for example, in non-surface removal (step S5), raising the threshold value of the local curvature reduces the number of points extracted as non-surface.
- in label expansion (step S8), increasing the threshold of the distance between an unlabeled point and its nearest surface allows labels to be given to more of the unlabeled points.
- furthermore, labels may be integrated even where the surfaces are not continuous: surfaces having the same position and orientation are given the same label. Specifically, by comparing the positions and orientations of the normal vectors of the surfaces, non-continuous surfaces lying on the same plane are extracted, and their labels are unified to the label of one of them.
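A hedged sketch of such a merge test, assuming each surface is summarized by a plane equation (nx, ny, nz, d) with a unit normal; the tolerance values are illustrative only:

```python
def same_plane(p1, p2, angle_tol=0.05, dist_tol=0.1):
    """p1, p2: (nx, ny, nz, d) plane equations with unit normals.
    Surfaces with (nearly) equal orientation and offset are candidates
    for label integration."""
    n1, d1 = p1[:3], p1[3]
    n2, d2 = p2[:3], p2[3]
    dot = sum(a * b for a, b in zip(n1, n2))
    # allow oppositely signed normals that describe the same plane
    if dot < 0:
        dot, d2 = -dot, -d2
    return (1.0 - dot) < angle_tol and abs(d1 - d2) < dist_tol
```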
- next, lines of intersection of the extracted surfaces and convex envelopes that wrap the surfaces in a convex shape are calculated, and three-dimensional edges are extracted based on them (step S9).
- Two methods will be described below.
- possible forms include using either one of the two methods, using both, using the average of the two, or selecting whichever calculation result is judged more appropriate for the situation.
- (1) Taking Intersection Lines as Three-Dimensional Edges
- first, two adjacent surfaces are taken out, regarded as infinite surfaces, and their line of intersection is extracted as a three-dimensional edge.
- each three-dimensional edge is determined as the intersection of a plane and a plane, a plane and a curved surface, or a curved surface and a curved surface.
- in the case of a plane and a plane, the normal vectors a and b of the two planes are determined, and their cross product a × b gives the direction vector of the intersection line (Equation 4).
- next, one point lying on the intersection line is obtained from the equations of the two surfaces. This determines the line of intersection.
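The two steps above (direction from the cross product, then one point from the two plane equations) can be sketched as follows, assuming planes given as (nx, ny, nz, d) with n.p + d = 0:

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def plane_intersection(p1, p2):
    """p1, p2: plane equations (nx, ny, nz, d) with n.p + d = 0.
    Returns (point, direction) describing the intersection line,
    or None for (near-)parallel planes."""
    n1, d1 = p1[:3], p1[3]
    n2, d2 = p2[:3], p2[3]
    direction = cross(n1, n2)            # line direction = n1 x n2
    norm2 = sum(c * c for c in direction)
    if norm2 < 1e-12:
        return None                      # parallel planes: no single line
    # a point on the line: ((d2*n1 - d1*n2) x direction) / |direction|^2
    w = tuple(d2 * a - d1 * b for a, b in zip(n1, n2))
    point = tuple(c / norm2 for c in cross(w, direction))
    return point, direction
```

For example, the planes z = 1 and x = 2 intersect in the line through (2, 0, 1) with direction (0, 1, 0).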
- FIG. 20 is a diagram showing a section where two adjacent faces meet.
- FIG. 21 is a drawing substitute photograph showing a three-dimensional edge formed by the intersection line of two adjacent faces.
- the portion indicated by the arrow in FIG. 21 is a surface for which point cloud data could not be acquired; since there is no adjacent surface at this portion either, no three-dimensional edge can be extracted there.
- in FIG. 38A, a contour line 301 extending in the Y-axis direction is shown.
- FIG. 38B conceptually shows one of cut surfaces cut by the ZX plane.
- labeled surfaces 302, 304 and a non-surface area 303 between the surfaces 302 and 304 are shown.
- the measured points B1, B2, and B3 are points not included in the labeled surface, and correspond to the points of interest.
- FIG. 38B shows an example in which the measured point B2 lies at a position distant from the actual non-surface area 303 due to a measurement error caused by the reflection state of the laser light or the like.
- the adjacent point A3 is determined.
- the adjacent point A3 is calculated as the intersection of the extension of the surface 302 beyond its edge point A2 and the extension of the other adjacent surface 304.
- the adjacent point A3 is extracted outside the surface 302.
- next, the intersection angle θ0 between the surface 302 and the extension of the surface 304, which contains the adjacent points C1 and C2 belonging to the labeled surface adjacent to the surface 302, is calculated.
- next, for the measured points B1, B2, and B3, the angles θ1, θ2, and θ3 measured counterclockwise from the extension direction of the surface 302 are determined. Among θ1, θ2, and θ3, those equal to or less than θ0 are adopted, and those larger than θ0 are excluded. In the case of FIG. 38B, the points B1 and B3 are adopted, and the point B2 is excluded. An approximate curve connecting the adopted points (here, the points B1 and B3) is then calculated as the convex hull line. FIG. 38B shows the case where there is almost no error and the convex envelope, which is an approximate curve, substantially matches the cross-sectional shape of the non-surface area 303 (that is, the approximate curve connecting the points B1 and B3 is substantially the same as the curve denoted by reference numeral 303).
- next, as shown in FIG. 38C, a position on the obtained convex hull line, such as the midpoint between the two labeled surfaces, or the position where the angle of the normal to the approximate curve becomes θ0/2, is taken as the contour position (contour line passing point).
- FIG. 38C shows the case where the actual contour line 301 and the calculated contour line position substantially coincide with each other.
- the approximate curve serving as the convex envelope can also be obtained as a connection of straight line segments of minute length.
- the above processing is performed at a plurality of positions on the Y axis, and a line connecting a plurality of contour line passing points calculated on the Y axis is calculated as a contour line.
- thus, a convex envelope that wraps the corner portion of the bent surface in a convex shape, as shown in FIG. 38A, is obtained, and a three-dimensional edge approximating the actual contour 301 is calculated based on this convex envelope.
- FIG. 22 is a drawing substitute photograph showing an example of a three-dimensional edge based on a convex envelope.
- as a method of extracting a three-dimensional edge, fitting to a model, or repeatedly applying threshold processing based on the local curvature or the like so that only edge-like points remain, may also be applied.
- next, edges are extracted from within the areas of the two-dimensional image corresponding to the surfaces segmented in step S2, and edges whose three-dimensional position is substantially equal to that of a nearby three-dimensional edge are extracted as two-dimensional edges (step S10).
- this extracts two-dimensional edges constituting in-plane patterns, which are difficult to extract as three-dimensional edges.
- FIG. 23 is a drawing substitute photograph showing a two-dimensional edge to be extracted. For example, the dotted line shown in FIG. 23 is extracted as a two-dimensional edge.
- first, edges are extracted from within the region of the two-dimensional image corresponding to the surface segmented in step S2, using a known edge extraction operator such as Laplacian, Prewitt, Sobel, or Canny.
- next, the height (z value) of the three-dimensional coordinates of the points constituting an extracted edge is compared with that of the points constituting a nearby three-dimensional edge; if the difference is within a predetermined threshold, the edge is extracted as a two-dimensional edge. That is, it is determined whether the points constituting the edge extracted from the two-dimensional image lie on the segmented surface.
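A minimal sketch of this height comparison, with the labeled surface abstracted as a hypothetical function returning its z value at (x, y); the threshold value is illustrative:

```python
def filter_2d_edges(edge_points, surface_z, z_threshold=0.05):
    """edge_points: (x, y, z) points on an edge detected in the 2D image.
    surface_z: hypothetical function (x, y) -> z of the labeled surface
    the image region corresponds to.  Keep only the points whose height
    is within z_threshold of the surface, i.e. points lying on it."""
    return [(x, y, z) for x, y, z in edge_points
            if abs(z - surface_z(x, y)) <= z_threshold]
```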
- the three-dimensional edge extracted at step S9 and the two-dimensional edge extracted at step S10 are integrated to form a three-dimensional polyline 20 (step S11).
- the three-dimensional polyline 20 is displayed on the two-dimensional space and the three-dimensional space (step S12).
- FIG. 24 is a drawing substitute photograph showing three-dimensional edges and two-dimensional edges displayed on two-dimensional space and three-dimensional space.
- the corresponding edge is displayed on the two-dimensional space or the three-dimensional space.
- the three-dimensional polyline 20 is converted into CAD data of a predetermined format, and the data is output (step S13).
- a three-dimensional polyline 20 for grasping the appearance of the object to be measured is calculated based on the labeled surface.
- the three-dimensional polyline 20 makes it possible to display or print a three-dimensional model that displays the appearance of the measurement object.
- a series of processes related to the calculation of the three-dimensional polyline 20 can be grasped as a process of extracting the three-dimensional shape of the measurement object based on the labeled surface.
- the measurement object is treated as an assembly of a plurality of surfaces, and the three-dimensional shape is grasped based on the plurality of extracted surfaces.
- as described above, the points of the point cloud data 2 are segmented into surfaces, and three-dimensional edges are extracted based on at least one of the lines of intersection of the surfaces and the convex hull lines that wrap the surfaces in a convex shape. Two-dimensional edges are extracted from within the segmented surfaces, and the three-dimensional edges and two-dimensional edges are integrated.
- since the point cloud data processing apparatus 1 does not directly extract edges of various shapes, there is little noise in the extracted edges, and three-dimensional edges and two-dimensional edges can be extracted automatically from the point cloud data 2.
- since a surface can be extracted more easily than an edge, edges can be extracted in a short time.
- the appearance shape of the object to be measured is grasped as an assembly of surfaces.
- the contour that divides adjacent surfaces is computed from the surface data as a three-dimensional edge.
- the point cloud data processing device 1 removes points in non-surface areas based on the local curvature. For this reason, it is possible to remove sharp three-dimensional edges generated by a change in the orientation of the surface, smooth three-dimensional edges generated by a curved surface with large local curvature, and points in non-surface areas including noise.
- the accuracy of the fitting of a local plane is, for example, the average distance between the local plane and each point used to calculate it. Since a three-dimensional edge generated by occlusion joins points of the object behind and points of the object in front whose three-dimensional positions differ greatly, the accuracy of the fitting of the local plane can be used to remove such three-dimensional edges and points in non-surface areas including noise.
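As an illustration of this accuracy measure, the sketch below fits a local plane z = a·x + b·y + c by least squares and returns the mean perpendicular distance of the points from it; the non-vertical-plane parameterization is an assumption of the example, not the patent's formulation:

```python
import math

def plane_fit_accuracy(points):
    """Fit z = a*x + b*y + c to the points by least squares and return
    the mean perpendicular distance from the points to the fitted
    plane, used here as the fitting-accuracy measure."""
    n = len(points)
    sx = sum(p[0] for p in points)
    sy = sum(p[1] for p in points)
    sz = sum(p[2] for p in points)
    sxx = sum(p[0] * p[0] for p in points)
    syy = sum(p[1] * p[1] for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sxz = sum(p[0] * p[2] for p in points)
    syz = sum(p[1] * p[2] for p in points)

    def det3(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # normal equations of the least-squares fit, solved by Cramer's rule
    m = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    d = det3(m)
    a, b, c = (det3([[rhs[r] if k == col else m[r][k] for k in range(3)]
                     for r in range(3)]) / d for col in range(3))
    # perpendicular distance to the plane a*x + b*y - z + c = 0
    norm = math.sqrt(a * a + b * b + 1.0)
    return sum(abs(a * x + b * y + c - z) for x, y, z in points) / (n * norm)
```

Points near an occlusion boundary mix samples from the front and rear objects, so the fitted plane has a large residual and the points can be rejected by a threshold on this value.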
- the point cloud data processing apparatus 1 removes points in the non-surface area based on coplanarity at the attention point and the adjacent point.
- coplanarity at the point of interest and an adjacent point is the condition that the line segment connecting the two points is orthogonal to each of their normal vectors, that is, that the corresponding inner products are zero. By this condition, it is possible to remove sharp three-dimensional edges generated by a change in the orientation of the surface, and points in non-surface areas including noise.
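This orthogonality condition can be sketched directly, assuming unit normal vectors at the two points; the tolerance is illustrative:

```python
def coplanar(p1, n1, p2, n2, tol=0.1):
    """True if the local planes at points p1 and p2 (with unit normals
    n1, n2) are coplanar: the segment joining the points must be
    (nearly) orthogonal to both normal vectors."""
    seg = tuple(b - a for a, b in zip(p1, p2))
    length = sum(c * c for c in seg) ** 0.5
    if length == 0.0:
        return True
    seg = tuple(c / length for c in seg)
    return (abs(sum(a * b for a, b in zip(n1, seg))) < tol and
            abs(sum(a * b for a, b in zip(n2, seg))) < tol)
```

Two points on the same flat surface pass the test, while a point pair straddling a sharp corner fails it even when both local normals are individually well defined.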
- the point cloud data processing device 1 applies the label of the nearest surface to unlabeled points, thereby extending the labels. That is, by applying the label attached to the points constituting a surface to the points of the non-surface area closest to that surface, three-dimensional edges can be extracted based on at least one of the lines of intersection of the segmented surfaces and the convex envelopes that wrap the surfaces in a convex shape.
- since the point cloud data 2 is data in which the three-dimensional coordinates of each point are linked with a two-dimensional image, image processing techniques can be applied to the extraction of two-dimensional edges, which are difficult to extract using three-dimensional coordinates alone.
- since the two-dimensional edges are extracted from within the areas of the two-dimensional image corresponding to the surfaces segmented by the surface labeling unit 17, only the two-dimensional edges constituting in-plane patterns can be extracted, excluding the three-dimensional edges that mainly constitute the solid shape.
- the three-dimensional position of a two-dimensional edge extracted from the two-dimensional image can be confirmed based on the three-dimensional position of a nearby three-dimensional edge.
- the point cloud data processing apparatus rotationally irradiates distance measuring light (laser light) onto a measurement object, and measures the distance from its own position to each measurement point on the measurement object based on the flight time of the laser light. Further, the point cloud data processing device detects the irradiation direction (horizontal angle and elevation angle) of the laser light, and calculates the three-dimensional coordinates of the measurement point based on the distance and the irradiation direction. Further, the point cloud data processing apparatus acquires a two-dimensional image (RGB intensity at each measurement point) obtained by imaging the measurement object, and forms point cloud data in which the two-dimensional image and the three-dimensional coordinates are linked. Furthermore, the point cloud data processing device forms a three-dimensional polyline composed of three-dimensional edges and two-dimensional edges from the formed point cloud data.
- the point cloud data processing apparatus 1 includes a leveling unit 22, a rotation mechanism unit 23, a main unit 27, and a rotation irradiation unit 28.
- the main unit 27 includes a distance measuring unit 24, an imaging unit 25, a control unit 26, and the like. Note that FIG. 26 shows a state where only the rotary irradiation unit 28 is viewed from the side with respect to the cross-sectional direction shown in FIG. 25 for the convenience of description.
- the leveling unit 22 has a base 29 and a lower casing 30.
- the lower casing 30 is supported at three points on the base 29 by the pin 31 and the two adjusting screws 32.
- the lower casing 30 tilts with the tip of the pin 31 as a fulcrum.
- a tension spring 33 is provided between the base 29 and the lower casing 30 so that the base 29 and the lower casing 30 do not separate from each other.
- two leveling motors 34 are provided inside the lower casing 30.
- the two leveling motors 34 are driven independently of each other by the control unit 26.
- the adjusting screw 32 is rotated via the leveling drive gear 35 and the leveling driven gear 36, and the amount of downward projection of the adjusting screw 32 is adjusted.
- an inclination sensor 37 (see FIG. 27) is provided inside the lower casing 30.
- the two leveling motors 34 are driven by the detection signal of the tilt sensor 37, whereby the leveling is performed.
- the rotation mechanism unit 23 has a horizontal angle drive motor 38 inside the lower casing 30.
- a horizontal rotation drive gear 39 is fitted on the output shaft of the horizontal angle drive motor 38.
- the horizontal rotation drive gear 39 is meshed with the horizontal rotation gear 40.
- the horizontal rotation gear 40 is provided on the rotation shaft portion 41.
- the rotating shaft portion 41 is provided at the central portion of the rotating base 42.
- the rotary base 42 is provided on the upper portion of the lower casing 30 via a bearing member 43.
- an encoder, for example, is provided on the rotary shaft portion 41 as the horizontal angle detector 44.
- the horizontal angle detector 44 detects the relative rotation angle (horizontal angle) of the rotary shaft portion 41 with respect to the lower casing 30.
- the horizontal angle is input to the control unit 26, and the control unit 26 controls the horizontal angle drive motor 38 based on the detection result.
- the main body 27 has a main body casing 45.
- the main body casing 45 is fixed to the rotary base 42.
- a lens barrel 46 is provided inside the main body portion casing 45.
- the barrel 46 has a center of rotation that is concentric with the center of rotation of the main body casing 45.
- the rotation center of the lens barrel 46 is aligned with the optical axis 47.
- a beam splitter 48 as a light beam separating means is provided inside the lens barrel 46.
- the beam splitter 48 has a function of transmitting visible light and reflecting infrared light.
- Optical axis 47 is split by beam splitter 48 into optical axis 49 and optical axis 50.
- the distance measuring unit 24 is provided on the outer peripheral portion of the lens barrel 46.
- the distance measuring unit 24 has a pulse laser light source 51 as a light emitting unit. Between the pulse laser light source 51 and the beam splitter 48, a perforated mirror 52 and a beam waist changing optical system 53 for changing the beam waist diameter of the laser beam are disposed.
- the distance measurement light source unit includes a pulse laser light source 51, a beam waist changing optical system 53, and a perforated mirror 52.
- the perforated mirror 52 guides pulse laser light from the hole 52a to the beam splitter 48, and has a role of reflecting the reflected laser light reflected back from the object to be measured toward the distance measurement light receiving unit 54.
- the pulse laser light source 51 emits infrared pulse laser light at a predetermined timing under the control of the control unit 26.
- the infrared pulse laser light is reflected by the beam splitter 48 toward the elevation mirror 55.
- the elevation mirror 55 has a role of reflecting infrared pulse laser light toward the object to be measured.
- the elevation mirror 55 converts the optical axis 47, which extends in the vertical direction, into the light projection optical axis 56 in the elevation direction by rotating in the elevation direction.
- a condensing lens 57 is disposed between the beam splitter 48 and the elevation mirror 55 and inside the lens barrel 46.
- the reflected laser light from the object to be measured is guided to the distance measurement light receiving unit 54 through the elevation mirror 55, the condensing lens 57, the beam splitter 48, and the perforated mirror 52. In addition, reference light is guided to the distance measurement light receiving unit 54 through an internal reference light path. The distance from the point cloud data processing apparatus 1 to the measurement object is measured based on the difference between the time until the reflected laser light is received by the distance measurement light receiving unit 54 and the time until the laser light is received by the distance measurement light receiving unit 54 through the internal reference light path.
- the imaging unit 25 includes an image light receiving unit 58.
- the image light receiving unit 58 is provided at the bottom of the lens barrel 46.
- the image light receiving unit 58 is configured of, for example, a CCD (Charge Coupled Device), in which a large number of pixels are collectively arranged in a plane.
- the position of each pixel of the image light receiving unit 58 is specified with respect to the optical axis 50. For example, taking the optical axis 50 as the origin of an XY coordinate system, each pixel is defined as a point in XY coordinates.
- the rotary irradiation unit 28 is housed inside the light projecting casing 59.
- a part of the peripheral wall of the light projecting casing 59 is a light projecting window.
- a pair of mirror holder plates 61 are provided opposite to each other on the flange portion 60 of the lens barrel 46.
- a pivot shaft 62 is stretched around the mirror holder plate 61.
- the elevation mirror 55 is fixed to the pivot shaft 62.
- an elevation gear 63 is fitted to one end of the pivot shaft 62.
- the elevation angle detector 64 is provided on the other end side of the pivot shaft 62. The elevation angle detector 64 detects the rotation angle of the elevation angle turning mirror 55 and outputs the detection result to the control unit 26.
- an elevation angle drive motor 65 is attached to one of the mirror holder plates 61.
- a drive gear 66 is fitted on the output shaft of the elevation angle drive motor 65.
- the drive gear 66 is meshed with the elevation gear 63.
- the elevation angle drive motor 65 is appropriately driven under the control of the control unit 26 based on the detection result of the elevation angle detector 64.
- a sighting device 67 (front and rear sights) is provided on the top of the light projecting casing 59.
- the sighting device 67 is used to collimate (sight) the measurement object.
- the collimation direction using the sighting device 67 is orthogonal to the direction in which the light projection optical axis 56 extends and to the direction in which the pivot shaft 62 extends.
- FIG. 27 is a block diagram of a control unit.
- the control unit 26 receives detection signals from the horizontal angle detector 44, the elevation angle detector 64, and the inclination sensor 37.
- the control unit 26 also receives an operation instruction signal from the operation unit 6.
- the control unit 26 drives and controls the horizontal angle drive motor 38, the elevation angle drive motor 65, and the leveling motor 34, and controls the display unit 7 that displays the work status, measurement results, and the like.
- the control unit 26 is detachably provided with an external storage device 68 such as a memory card, an HDD, or the like.
- the control unit 26 includes an arithmetic unit 4, a storage unit 5, a horizontal drive unit 69, an elevation drive unit 70, a leveling drive unit 71, a distance data processing unit 72, an image data processing unit 73 and the like.
- the storage unit 5 stores various programs, such as a sequence program and calculation programs necessary for distance measurement and for detection of the elevation angle and horizontal angle, a measurement data processing program for processing measurement data, an image processing program for executing image processing, an image display program for displaying data on the display unit 7, and an integrated management program for integrating and managing these various programs, and also stores various data such as measurement data and image data.
- the horizontal drive unit 69 drives and controls the horizontal angle drive motor 38
- the elevation drive unit 70 drives and controls the elevation angle drive motor 65
- the leveling drive unit 71 controls and drives the leveling motor 34.
- the distance data processing unit 72 processes the distance data obtained by the distance measuring unit 24, and the image data processing unit 73 processes the image data obtained by the imaging unit 25.
- FIG. 28 is a block diagram of an arithmetic unit.
- the calculation unit 4 further includes a link formation unit 75 and a grid formation unit 9 in addition to the configuration of FIG. 1 described in the first embodiment.
- the link formation unit 75 inputs distance data from the distance data processing unit 72, and inputs direction data (horizontal angle and elevation angle) from the horizontal angle detector 44 and the elevation angle detector 64.
- the link forming unit 75 calculates the three-dimensional coordinates (orthogonal coordinates) of each measurement point, with the position of the point cloud data processing apparatus 1 as the origin (0, 0, 0), based on the input distance data and direction data.
- the three-dimensional coordinates (x, y, z) of the measurement point are obtained by the following equation 5.
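Equation 5 itself is not reproduced here, but the conversion it performs can be sketched as a standard polar-to-orthogonal transformation; the axis convention below (horizontal angle about the z axis, elevation angle from the horizontal plane) is an assumption of the example:

```python
import math

def polar_to_xyz(distance, horizontal_deg, elevation_deg):
    """Convert a measured distance and beam direction into orthogonal
    coordinates with the scanner position as the origin (0, 0, 0).
    The exact axis convention of Equation 5 is not reproduced here."""
    h = math.radians(horizontal_deg)
    v = math.radians(elevation_deg)
    x = distance * math.cos(v) * math.cos(h)
    y = distance * math.cos(v) * math.sin(h)
    z = distance * math.sin(v)
    return x, y, z
```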
- the link formation unit 75 also receives image data from the image data processing unit 73.
- the link formation unit 75 forms point cloud data 2 in which image data (RGB intensity of each measurement point) and three-dimensional coordinates are linked.
- the point cloud data processing device 1 can acquire point cloud data 2 of the measurement object measured from different directions. Therefore, assuming that one measurement direction is one block, the point cloud data 2 is composed of two-dimensional images of a plurality of blocks and three-dimensional coordinates.
- FIG. 29 is a diagram showing a link structure of a two-dimensional image of point cloud data and three-dimensional coordinates.
- the left side of the figure is a two-dimensional data structure, and the right side of the figure is a three-dimensional data structure.
- the two-dimensional data consists of the number of blocks (blk), the block size (nx, ny), a transformation matrix (RT), a reference matrix (IDX), the number of stored points (pts), the original image (rgb, r, g, b, u), the processed image (nvx, nvy, nvz, fe, mdx, mdy, mdz, crx, cry, crz, crv), the number of two-dimensional edges (eds), the number of points constituting each two-dimensional edge (ne), and a point list of the two-dimensional edges.
- the number of blocks (blk) represents the number of measurement directions
- the block size (nx, ny) represents the image size (the number of pixels in the vertical and horizontal directions) in one block.
- the transformation matrix (RT) represents a 4 ⁇ 4 transformation matrix used when affine transforming point cloud data 2 of the second block and thereafter into the coordinate system of the first block.
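As an illustration, applying such a transformation matrix to a point can be sketched as follows; the row-major layout and the use of only the top three rows of the 4 × 4 matrix are assumptions of this sketch:

```python
def apply_rt(rt, point):
    """Apply a 4x4 transformation matrix (rotation + translation, given
    as a list of rows) to a 3D point, mapping it into the coordinate
    system of the first block."""
    x, y, z = point
    return tuple(rt[r][0] * x + rt[r][1] * y + rt[r][2] * z + rt[r][3]
                 for r in range(3))
```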
- the reference matrix (IDX) is an index used to refer to the three-dimensional coordinates (x, y, z) from the two-dimensional image of a block
- the number of stored points (pts) represents the number of points stored in one block
- the original image is composed of RGB intensity values (rgb, r, g, b) and their brightness (u).
- the processed image is stored in the two-dimensional data by the processing described in the first embodiment.
- the processed image consists of the three-axis intensity values of the normal vectors (nvx, nvy, nvz), the accuracy of local plane fitting (fe), the three-axis average values of the normal vectors in the local region in grayscale (0-255) (mdx, mdy, mdz), the three-axis standard deviations of the normal vectors in the local region in grayscale (0-255) (crx, cry, crz), and the local curvature of the normal vectors in the local region (crv).
- the list of points of the two-dimensional edge can refer to the RGB intensity value and the brightness (rgb, r, g, b, u) of each point.
- the three-dimensional data consists of the total number of points (n), three-dimensional coordinates (x, y, z), the block number (blk), the position (i, j) in the image, processing data (nvx, nvy, nvz, fe, mdx, mdy, mdz, mnvx, mnvy, mnvz, sdnvx, sdnvy, sdnvz, crv), and an internal reference matrix (IDXPT).
- the total number of points (n) is the total over all blocks, and all points are coordinate-transformed into three-dimensional coordinates (x, y, z) in the coordinate system of the first block.
- the RGB intensity value and brightness (rgb, r, g, b, u) of each point can be referred to by the block number (blk) and the position (i, j) in the image.
- the internal reference matrix (IDXPT) is an index for referring to the point identification number (ptid).
- processing data are stored in three-dimensional data by the processing described in the first embodiment.
- the processing data consist of the three-axis intensity values of the normal vectors (nvx, nvy, nvz), the accuracy of local plane fitting (fe), the three-axis average values of the normal vectors in the local region in grayscale (0-255) (mdx, mdy, mdz), the three-axis average values of the normal vectors in the local region (mnvx, mnvy, mnvz), the three-axis standard deviations of the normal vectors in the local region (sdnvx, sdnvy, sdnvz), and the local curvature of the normal vectors in the local region (crv).
- the number of three-dimensional edges (eds), the number of points constituting each three-dimensional edge (ne), and a list of three-dimensional edge points (ELIST) are stored in the three-dimensional data by the three-dimensional edge extraction process described in the first embodiment.
- the three-dimensional coordinates (x, y, z) of each point can be referenced by a list of three-dimensional edge points (ELIST).
- the link forming unit 75 outputs the point cloud data 2 described above to the grid forming unit 9.
- the grid formation unit 9 forms equally spaced grids (meshes) and registers the point closest to the grid intersection point.
- the grid formation unit 9 corrects all points to grid intersection points using a linear interpolation method or a bicubic method.
- the process of the grid formation unit 9 can be omitted.
- FIG. 30 is a view showing point cloud data in which the distance between points is not constant
- FIG. 31 is a view showing the formed grid.
- first, the average horizontal angle of each of the columns 1 to N is determined, the differences ΔHi,j of the average horizontal angles between adjacent columns are calculated, and their average is set as the horizontal interval ΔH of the grid (Equation 6).
- for the vertical direction, the distance ΔVN,H to the vertically adjacent point is calculated in each column, and the average of ΔVN,H over the entire image of size W, H is taken as the vertical interval ΔV (Equation 7).
- a grid of the calculated horizontal interval ⁇ H and vertical interval ⁇ V is formed.
- the point closest to the intersection of the formed grid is registered.
- a predetermined threshold is provided for the distance from the intersection to each point to limit registration.
- the threshold is 1 ⁇ 2 of the horizontal interval ⁇ H and the vertical interval ⁇ V.
- alternatively, all points may be corrected by weighting according to their distances from the grid intersections.
- in this case, however, the corrected points are no longer originally measured points.
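The registration of the nearest point to each grid intersection can be sketched as follows; the (h, v, payload) sample representation is an assumption of the example, and rounding to the nearest intersection keeps every sample within the half-interval threshold per axis described above:

```python
def register_to_grid(points, dH, dV):
    """points: (h, v, payload) scan samples at non-uniform angular
    positions.  For each grid intersection (i*dH, j*dV), keep the
    nearest sample.  Intersections with no sample stay empty (the
    black pixels of FIG. 33)."""
    grid = {}
    for h, v, payload in points:
        i, j = round(h / dH), round(v / dV)
        dist = ((h - i * dH) ** 2 + (v - j * dV) ** 2) ** 0.5
        if (i, j) not in grid or dist < grid[(i, j)][0]:
            grid[(i, j)] = (dist, payload)
    return {cell: payload for cell, (dist, payload) in grid.items()}
```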
- FIG. 32 is a drawing substitute photograph showing a point group registered at grid intersections in a three-dimensional space
- FIG. 33 is a drawing substitute photograph showing point groups registered at grid intersections in a two-dimensional space.
- the black pixels shown in FIG. 33 indicate grid intersections that have no nearby point within the limit distance, or data missing at the time of measurement of the point cloud data 2.
- as described above, the three-dimensional laser scanner can acquire point cloud data consisting of a two-dimensional image and three-dimensional coordinates. Since point cloud data in which the two-dimensional image and the three-dimensional coordinates are linked can be formed, three-dimensional edges constituting the solid shape can be extracted based on the three-dimensional coordinates, and two-dimensional edges constituting in-plane patterns (lines and points whose color density changes sharply) can be extracted based on the two-dimensional image. In this way, two-dimensional edges and three-dimensional edges can be displayed, extracted, and checked simultaneously (FIG. 24).
- the point cloud data processing apparatus captures images of the measurement object from different directions with overlapping imaging regions, associates feature points in the overlapping images, and calculates the three-dimensional coordinates of the feature points based on the predetermined positions and orientations of the imaging units and the positions of the feature points in the overlapping images. Further, the point cloud data processing device forms point cloud data by determining erroneous corresponding points based on the parallax of the feature points in the overlapping images, the measurement space, and a reference shape. In the point cloud data, the two-dimensional image and the three-dimensional coordinates are linked. Furthermore, the point cloud data processing apparatus forms a three-dimensional polyline composed of three-dimensional edges and two-dimensional edges from the point cloud data.
- FIG. 34 is a block diagram showing the configuration of a point cloud data processing apparatus.
- the point cloud data processing apparatus 1 includes imaging units 76 and 77, a feature projection unit 78, an image data processing unit 73, an arithmetic unit 4, a storage unit 5, an operation unit 6, a display unit 7, and a data output unit 8.
- a digital camera, a video camera, a CCD camera (charge coupled device camera) for industrial measurement, a CMOS camera (complementary metal oxide semiconductor camera) or the like is used as the imaging units 76 and 77.
- the photographing units 76 and 77 photograph the object to be measured in overlapping photographing areas from different photographing positions.
- One or more photographing units may be provided, depending on the size and shape of the object to be measured.
- For the feature projection unit 78, a projector, a laser device, or the like is used.
- The feature projection unit 78 projects a pattern, such as a random dot pattern, spot light, or linear slit light, onto the measurement object.
- The projected pattern gives features to portions of the measurement object where features are scarce. It is mainly used for precise measurement of medium to small unpatterned artifacts.
- The feature projection unit 78 can be omitted when a relatively large measurement object, typically outdoors, is measured, when precise measurement is unnecessary, or when the measurement object already has usable features or can be marked with a pattern.
- the image data processing unit 73 converts the overlapping image captured by the imaging units 76 and 77 into image data that can be processed by the calculation unit 4.
- The storage unit 5 stores various programs: a program for measuring the photographing position and orientation; a program for extracting feature points from the overlapping images and associating them; a program for calculating three-dimensional coordinates based on the photographing position and orientation and the positions of the feature points in the overlapping images; a program for determining erroneous corresponding points and forming point cloud data; a program for extracting faces from the point cloud data and extracting three-dimensional edges and two-dimensional edges; and an image display program for displaying the integrated edges on the display unit 7. It also stores various data, such as point cloud data and image data.
- The operation unit 6 outputs an operation instruction signal to the calculation unit 4.
- the display unit 7 displays the processing data of the calculation unit 4, and the data output unit 8 outputs the processing data of the calculation unit 4.
- Arithmetic unit 4 receives image data from image data processing unit 73.
- The calculation unit 4 measures the positions and orientations of the imaging units 76 and 77 based on the photographed images of the calibration subject 79, and extracts and associates feature points from within the overlapping images of the measurement object.
- The positions and orientations of the imaging units 76 and 77 can also be calculated by detecting several (six or more) of the same corresponding points photographed in two or more of the captured images.
- The calculation unit 4 then calculates the three-dimensional coordinates of the measurement object based on the positions of the feature points in the overlapping images and forms the point cloud data 2. Furthermore, the calculation unit 4 extracts planes from the point cloud data 2, extracts and integrates three-dimensional edges and two-dimensional edges, and forms a three-dimensional polyline of the measurement object.
- FIG. 35 is a block diagram of an arithmetic unit.
- The calculation unit 4 further includes a photographing position and orientation measurement unit 81, a feature point associating unit 82, a background removing unit 83, a feature point extracting unit 84, a corresponding point searching unit 85, a three-dimensional coordinate computing unit 86, an erroneous corresponding point determination unit 87, a parallax determination unit 88, a space determination unit 89, and a form determination unit 90.
- the photographing position and orientation measurement unit 81 inputs a photographed image of the calibration subject 79 from the image data processing unit 73.
- Targets 80 (retro targets, code targets, or color code targets) are attached to the calibration subject 79 at predetermined intervals. The shooting position and orientation measurement unit 81 detects the image coordinates of the targets 80 from the captured image of the calibration subject 79 and measures the positions and orientations of the imaging units 76 and 77 using a known relative orientation method, single photo orientation method, DLT (Direct Linear Transformation) method, or bundle adjustment method.
- the relative orientation method, the single photo orientation method or the DLT method, and the bundle adjustment method may be used alone or in combination.
- Alternatively, the shooting position and orientation can be measured by detecting several (six or more) of the same corresponding points photographed in two or more of the captured images; the shooting position and orientation measurement unit 81 then measures the positions and orientations of the imaging units 76 and 77 using a known relative orientation method, single photo orientation method, DLT method, or bundle adjustment method, alone or in combination.
- the feature point associating unit 82 inputs an overlapping image of the measurement object from the image data processing unit 73, extracts feature points of the measurement object from the overlapping image, and associates them.
- the feature point associating unit 82 includes a background removing unit 83, a feature point extracting unit 84, and a corresponding point searching unit 85.
- The background removing unit 83 generates a background-removed image in which only the measurement object appears, by subtracting a background image that does not contain the measurement object from the photographed image that does, by having the operator designate the position to be measured via the operation unit 6, or by automatically extracting the measurement position (automatically detecting a feature-rich part or using a pre-registered model). When background removal is unnecessary, the processing of the background removing unit 83 can be omitted.
- the feature point extraction unit 84 extracts feature points from the background-removed image.
- For feature point extraction, derivative filters such as the Sobel, Laplacian, Prewitt, and Roberts filters are used.
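As a concrete illustration of such a derivative filter, the following sketch applies the 3×3 Sobel kernels to a small grayscale image given as a list of lists. The image, function name, and zero-valued border handling are assumptions for illustration; the patent only names the filters.

```python
# Minimal Sobel gradient-magnitude sketch (illustrative only).

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(img):
    """Return the gradient magnitude of a grayscale image (list of lists).
    Border pixels are left at zero for simplicity."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out

# A vertical step edge: the response peaks on the columns next to the step.
img = [[0, 0, 0, 9, 9, 9] for _ in range(5)]
mag = sobel_magnitude(img)
```

Pixels at which color density changes sharply produce large magnitudes, which is what the two-dimensional edge extraction relies on.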
- the corresponding point search unit 85 searches for a corresponding point corresponding to the feature point extracted in one image in the other image.
- For the corresponding point search, template matching such as the sequential similarity detection algorithm (SSDA), the normalized cross-correlation method, or orientation code matching (OCM) is used.
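The corresponding point search can be illustrated with a minimal one-dimensional SSDA sketch. The patent only names the method, so the function below, including its early-termination rule, is an illustrative assumption.

```python
# Sketch of the sequential similarity detection algorithm (SSDA): slide a
# template over a 1-D search row and accumulate absolute differences,
# abandoning a candidate position as soon as its running sum exceeds the
# best score found so far.

def ssda_match(row, template):
    """Return the offset in `row` where `template` matches best."""
    best_pos, best_err = -1, float("inf")
    for pos in range(len(row) - len(template) + 1):
        err = 0
        for t, r in zip(template, row[pos:pos + len(template)]):
            err += abs(t - r)
            if err >= best_err:      # early termination: cannot beat the best
                break
        if err < best_err:
            best_pos, best_err = pos, err
    return best_pos

row = [5, 7, 9, 40, 42, 41, 8, 6]
template = [40, 42, 41]
offset = ssda_match(row, template)  # exact match at offset 3
```

In the apparatus, the template would be a patch around a feature point in one image, and the search row a band of the other overlapping image.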
- The three-dimensional coordinate calculation unit 86 calculates the three-dimensional coordinates of each feature point based on the positions and orientations of the imaging units 76 and 77 measured by the photographing position and orientation measurement unit 81 and the image coordinates of the feature points associated by the feature point associating unit 82.
- the erroneous corresponding point determination unit 87 determines an erroneous corresponding point based on at least one of the parallax, the measurement space, and the reference form.
- the erroneous corresponding point determination unit 87 includes a parallax determination unit 88, a space determination unit 89, and a form determination unit 90.
- The parallax determination unit 88 creates a histogram of the parallaxes of the associated feature points in the overlapping images and determines feature points whose parallax falls outside a predetermined range around the average parallax to be erroneous corresponding points. For example, the average value ± 1.5 σ (standard deviation) is used as the threshold.
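The mean ± 1.5σ rule above can be sketched as follows. The function name and the use of the population standard deviation are illustrative assumptions.

```python
# Sketch of the parallax-based rejection: feature points whose parallax
# lies outside mean ± k * (standard deviation) are treated as erroneous
# corresponding points, with k = 1.5 as in the example above.
from statistics import mean, pstdev

def reject_by_parallax(parallaxes, k=1.5):
    """Split indices into kept and rejected points by the mean ± k*sigma rule."""
    mu = mean(parallaxes)
    sigma = pstdev(parallaxes)   # population standard deviation
    kept, rejected = [], []
    for i, d in enumerate(parallaxes):
        (kept if abs(d - mu) <= k * sigma else rejected).append(i)
    return kept, rejected

# One grossly inconsistent parallax among otherwise similar values:
kept, rejected = reject_by_parallax([10.1, 10.3, 9.8, 10.0, 25.0])
```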
- The space determination unit 89 defines a space at a predetermined distance from the center of gravity of the calibration subject 79 as the measurement space; if the three-dimensional coordinates of a feature point calculated by the three-dimensional coordinate calculation unit 86 fall outside the measurement space, that feature point is determined to be an erroneous corresponding point.
- The form determination unit 90 forms or inputs a reference form (rough surface) of the measurement object from the three-dimensional coordinates of the feature points calculated by the three-dimensional coordinate calculation unit 86, and determines erroneous corresponding points based on the distance between the reference form and the three-dimensional coordinates of the feature points. For example, a rough surface is formed by building a TIN (Triangulated Irregular Network) from the feature points and deleting triangles with edges longer than a predetermined length. Erroneous corresponding points are then determined based on the distance between the rough surface and each feature point.
- Point cloud data 2 excluding the erroneous corresponding point determined by the erroneous corresponding point determination unit 87 is formed.
- the point cloud data 2 has a direct link structure in which a two-dimensional image and three-dimensional coordinates are linked as described in the second embodiment.
- When the distances between the points of the point cloud data 2 are not constant, the calculation unit 4 provides the grid forming unit 9 between the erroneous corresponding point determination unit 87 and the surface extracting unit 10.
- The grid forming unit 9 forms grids (meshes) at equal intervals and registers the point closest to each grid intersection.
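A minimal sketch of this grid registration, under the assumption of a two-dimensional grid with a nearest-intersection rule and a limit distance (function and parameter names are illustrative):

```python
# Sketch of the grid forming step: lay an equally spaced grid over the
# points and register, at each grid intersection, the nearest point within
# a limit distance. Intersections with no such point stay empty — these
# correspond to the black pixels of FIG. 33.
import math

def register_to_grid(points, spacing, limit):
    """Map each grid intersection (ix, iy) to its nearest point within `limit`."""
    best = {}
    for p in points:
        ix, iy = round(p[0] / spacing), round(p[1] / spacing)
        d = math.hypot(p[0] - ix * spacing, p[1] - iy * spacing)
        if d > limit:
            continue                       # leaves the intersection empty
        if (ix, iy) not in best or d < best[(ix, iy)][0]:
            best[(ix, iy)] = (d, p)
    return {key: p for key, (_, p) in best.items()}

pts = [(0.1, 0.0), (0.9, 1.1), (0.45, 0.45)]
grid = register_to_grid(pts, spacing=1.0, limit=0.3)
```

The point (0.45, 0.45) is farther than the limit distance from every intersection, so no intersection registers it.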
- Surfaces are then extracted from the point cloud data 2, and the three-dimensional edges and two-dimensional edges are extracted and integrated.
- the image measurement apparatus can acquire point cloud data composed of a two-dimensional image and three-dimensional coordinates.
- FIG. 36 shows a sharp straight three-dimensional edge (A), a smooth straight three-dimensional edge (B), a sharp curved three-dimensional edge (C), and a smooth curved three-dimensional edge (D).
- FIG. 37 is a cross-sectional view of a smooth three-dimensional edge.
- the present invention can be used in a technology for generating data of a three-dimensional shape of a measurement object from point cloud data of the measurement object.
Abstract
Description
Hereinafter, an example of a point cloud data processing apparatus will be described with reference to the drawings.
FIG. 1 is a block diagram of a point cloud data processing apparatus. The point cloud data processing apparatus 1 extracts features of a measurement object based on point cloud data 2 of the measurement object and generates a three-dimensional shape based on those features. The features of the measurement object mainly consist of three-dimensional edges that form the solid shape and two-dimensional edges that form patterns within planes and curved surfaces (hereinafter simply referred to as faces). A three-dimensional edge is an intersection line between faces of different positions and orientations, or the outer edge of a face; a two-dimensional edge is a line or point at which color density changes sharply within the same face. The three-dimensional shape generated by the point cloud data processing apparatus 1 is an approximate shape based on these features, namely a three-dimensional polyline 20 composed of three-dimensional edges and two-dimensional edges.
Hereinafter, the face labeling processing performed in the calculation unit 4, and the processing for extracting the three-dimensional shape of the measurement object based on the labeled faces, will be described in detail. FIG. 2 is a flowchart showing the processing flow of the calculation unit. A program for executing this flowchart can be provided on a computer-readable recording medium such as a CD-ROM.
Points in portions with high local curvature, obtained in step S4, are removed. Because the local curvature represents the variation of the normal vectors at a point of interest and its surrounding points, its value is small on faces (planes and curved surfaces of small curvature) and large elsewhere. Therefore, if the local curvature is larger than a predetermined threshold, the point of interest is judged not to lie on a face. That is, the local curvature image of FIG. 9 is binarized, and portions exceeding the threshold are removed.
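The local-curvature test can be sketched as follows, using the definition from the claims: the standard deviation of each axial component of the neighbouring normal vectors is computed, and the local curvature is the square root of the sum of their squares. The threshold value and sample normals are illustrative.

```python
# Sketch of the local-curvature non-face test: high variation of the
# surrounding normal vectors marks a point as lying off any face.
from statistics import pstdev

def local_curvature(normals):
    """normals: list of (nx, ny, nz) unit normals at a point and its neighbours."""
    return sum(pstdev(axis) ** 2 for axis in zip(*normals)) ** 0.5

def is_on_face(normals, threshold=0.05):
    return local_curvature(normals) <= threshold

flat = [(0.0, 0.0, 1.0)] * 8                          # identical normals
bent = [(0.0, 0.0, 1.0)] * 4 + [(0.0, 1.0, 0.0)] * 4  # normals disagree sharply
```

On a flat patch the curvature is zero; across a sharp fold the normals disagree and the point is rejected as a non-face point.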
The distance between each point used to calculate the local plane obtained in step S4 and that local plane is computed; if the average of these distances is larger than a preset threshold, the point of interest is judged not to lie on a face. FIG. 10 is a diagram showing the distances between the points used to calculate the local plane and the local plane. The local plane L is determined by designating a point A on the plane. The point A is the average coordinate of the points P1 to P8 used to calculate the local plane.
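A minimal sketch of this fitting-accuracy test, under the assumption that the local unit normal is supplied externally (for example, by the normal calculating unit) and that the plane passes through the mean point A:

```python
# Sketch of the fitting-accuracy non-face test: the local plane is fixed by
# the point A (the mean of the neighbouring points) and the local normal n;
# a point of interest is rejected when the mean distance of the neighbours
# from that plane exceeds a threshold.

def mean_plane_distance(points, normal):
    """Average absolute distance of `points` from the plane through their
    centroid (point A) with unit normal `normal`."""
    n = len(points)
    a = tuple(sum(c) / n for c in zip(*points))        # point A: mean coordinate
    dists = [abs(sum(nc * (pc - ac) for nc, pc, ac in zip(normal, p, a)))
             for p in points]
    return sum(dists) / n

coplanar = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
noisy = [(0, 0, 0.4), (1, 0, -0.4), (0, 1, 0.4), (1, 1, -0.4)]
flat_err = mean_plane_distance(coplanar, (0.0, 0.0, 1.0))
noisy_err = mean_plane_distance(noisy, (0.0, 0.0, 1.0))
```

A small average distance means the local plane fits its points well; a large one flags the point of interest as lying off any face.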
FIG. 11 is an explanatory diagram of the coplanarity judgment method. Let n1 and n2 be the normal vectors of local planes p1 and p2, and let r12 be the vector connecting the points that define the planes. If the local planes p1 and p2 lie on the same plane, the inner products of the normal vectors n1 and n2 with the vector r12 connecting the two local planes are close to zero, because the vectors are nearly orthogonal. Using this property, if the larger of the inner product values exceeds a predetermined threshold, the point of interest is judged not to lie on a face (Formula 3).
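The coplanarity test of Formula 3 can be sketched as follows; the threshold value and the normalization of r12 are illustrative assumptions.

```python
# Sketch of the coplanarity test: for local planes p1 and p2 with unit
# normals n1 and n2, and r12 the vector between the points that define
# them, both |n1 . r12| and |n2 . r12| are near zero when the two local
# planes lie in the same plane; the larger value is compared with a
# threshold.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    norm = dot(v, v) ** 0.5
    return tuple(c / norm for c in v)

def coplanar(p1, n1, p2, n2, threshold=0.1):
    r12 = normalize(tuple(b - a for a, b in zip(p1, p2)))
    return max(abs(dot(n1, r12)), abs(dot(n2, r12))) <= threshold

# Two patches of the z = 0 plane: r12 lies in the plane, both dots are 0.
same = coplanar((0, 0, 0), (0, 0, 1), (1, 1, 0), (0, 0, 1))
# A patch offset in z: r12 has a component along the normals.
offset = coplanar((0, 0, 0), (0, 0, 1), (1, 1, 1), (0, 0, 1))
```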
In this method, two adjacent faces are taken, regarded as infinite faces, and the intersection line of the two faces is extracted as a three-dimensional edge. Because the face labeling in step S6 has judged whether each label is a plane or a curved surface, a three-dimensional edge is obtained as the intersection line of a plane and a plane, a plane and a curved surface, or a curved surface and a curved surface.
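For the plane-and-plane case, the intersection of two labelled faces treated as infinite planes can be computed in closed form. The sketch below uses the standard cross-product construction for planes written as n · x = d; it is an illustration, not the patent's own procedure.

```python
# Sketch of extracting the 3-D edge of two adjacent labelled planes: the
# line direction is the cross product of the normals, and a point on the
# line comes from the standard closed form.

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def plane_intersection(n1, d1, n2, d2):
    """Intersection line of planes n1.x = d1 and n2.x = d2 as (point, direction)."""
    direction = cross(n1, n2)
    denom = sum(c * c for c in direction)
    if denom == 0:
        return None                      # parallel planes: no single line
    # point = (d1*(n2 x dir) + d2*(dir x n1)) / |dir|^2
    a = cross(n2, direction)
    b = cross(direction, n1)
    point = tuple((d1 * ai + d2 * bi) / denom for ai, bi in zip(a, b))
    return point, direction

# The floor z = 0 and the wall x = 0 meet in the y axis:
line = plane_intersection((0, 0, 1), 0, (1, 0, 0), 0)
```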
In this method, points at the outer portion of each label (face) are extracted and connected to form a three-dimensional edge. Specifically, considering a relatively common cubic object, points at the outer portion of a face are extracted in the two-dimensional image, and adjacent points are connected by lines; if the angle formed by the line segments between a point of interest and its adjacent points is larger than 90°, the point of interest is removed and its adjacent points are connected to each other. By repeating this, a convex hull line that convexly envelops the face is formed.
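The iterative pruning just described produces the convex hull of the boundary points. As a compact stand-in, the sketch below uses the standard monotone-chain construction; this is a substitute for the patent's angle-based pruning, not the patented procedure itself.

```python
# Sketch of forming the convex hull line of a labelled face's boundary
# points (monotone chain, substituted for the iterative 90-degree test).

def convex_hull(points):
    """Return the convex hull of 2-D points in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def crossz(o, a, b):
        return (a[0]-o[0])*(b[1]-o[1]) - (a[1]-o[1])*(b[0]-o[0])
    lower, upper = [], []
    for seq, out in ((pts, lower), (reversed(pts), upper)):
        for p in seq:
            while len(out) >= 2 and crossz(out[-2], out[-1], p) <= 0:
                out.pop()                 # drop points that bend inward
            out.append(p)
    return lower[:-1] + upper[:-1]

# The interior point (1, 1) is pruned; the four corners remain.
hull = convex_hull([(0, 0), (2, 0), (2, 2), (0, 2), (1, 1)])
```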
Hereinafter, advantages of the first embodiment will be described. According to the first embodiment, each point of the point cloud data 2 is segmented into faces, three-dimensional edges are extracted based on at least one of the intersection lines between faces and the convex hull lines that convexly envelop the faces, two-dimensional edges are extracted from within the segmented faces, and the three-dimensional and two-dimensional edges are integrated. Because the point cloud data processing apparatus 1 does not directly extract edges of various shapes, the extracted edges contain little noise, and three-dimensional edges and two-dimensional edges can be extracted automatically from the point cloud data 2. Moreover, because faces can be extracted more easily than edges, the edges can be extracted in a short time.
Hereinafter, a point cloud data processing apparatus provided with a three-dimensional laser scanner will be described. Configurations similar to those of the first embodiment are denoted by the same reference numerals, and their description is omitted.
The point cloud data processing apparatus rotationally emits distance measuring light (laser light) onto the measurement object and measures the distance from its own position to a measurement point on the measurement object based on the time of flight of the laser light. It also detects the emission direction (horizontal angle and elevation angle) of the laser light and calculates the three-dimensional coordinates of the measurement point based on the distance and the emission direction. The apparatus further acquires a two-dimensional image (RGB intensity at each measurement point) of the measurement object and forms point cloud data linking the two-dimensional image with the three-dimensional coordinates. Finally, it forms a three-dimensional polyline composed of three-dimensional edges and two-dimensional edges from the formed point cloud data.
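The distance-plus-direction computation can be sketched as a polar-to-Cartesian conversion. The axis conventions and the function name are assumptions, since the patent states the relationship rather than code.

```python
# Sketch of the coordinate computation: each measurement point is recovered
# from the measured range and the detected emission direction (horizontal
# angle and elevation angle), both in radians here.
# The range itself comes from the time of flight: distance = c * t / 2.
import math

def scan_point_to_xyz(distance, horizontal_angle, elevation_angle):
    """Convert (range, horizontal angle, elevation angle) to x, y, z."""
    x = distance * math.cos(elevation_angle) * math.cos(horizontal_angle)
    y = distance * math.cos(elevation_angle) * math.sin(horizontal_angle)
    z = distance * math.sin(elevation_angle)
    return x, y, z

# A point 10 m away, straight ahead and level, lies on the x axis:
x, y, z = scan_point_to_xyz(10.0, 0.0, 0.0)
```

Pairing each computed (x, y, z) with the RGB intensity sampled in the same direction yields the linked two-dimensional image and three-dimensional coordinates described above.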
Hereinafter, advantages of the second embodiment will be described. According to the second embodiment, point cloud data consisting of a two-dimensional image and three-dimensional coordinates can be acquired with the three-dimensional laser scanner. Because point cloud data linking the two-dimensional image and the three-dimensional coordinates can be formed, three-dimensional edges forming the solid shape can be extracted based on the three-dimensional coordinates, and two-dimensional edges forming in-plane patterns (lines and points at which color density changes sharply) can be extracted based on the two-dimensional image. In this way, two-dimensional edges and three-dimensional edges can be displayed simultaneously and can be extracted and checked together (FIG. 24).
Hereinafter, a point cloud data processing apparatus provided with an image measurement device will be described. Configurations similar to those of the first and second embodiments are denoted by the same reference numerals, and their description is omitted.
The point cloud data processing apparatus photographs the measurement object in overlapping imaging regions from different directions, associates feature points in the overlapping images, and calculates the three-dimensional coordinates of the feature points based on the previously determined positions and orientations of the imaging units and the positions of the feature points in the overlapping images. It also forms point cloud data by judging erroneous corresponding points based on the parallax of the feature points in the overlapping images, the measurement space, and a reference form. In the point cloud data, a two-dimensional image and three-dimensional coordinates are linked. The apparatus further forms a three-dimensional polyline composed of three-dimensional edges and two-dimensional edges from the point cloud data.
Hereinafter, advantages of the third embodiment will be described. According to the third embodiment, point cloud data consisting of a two-dimensional image and three-dimensional coordinates can be acquired with the image measurement device.
Hereinafter, a modification of the three-dimensional edge extraction method, that is, another example of the method for extracting the three-dimensional shape of the measurement object based on the labeled faces, will be described. Configurations similar to those of the first to third embodiments are denoted by the same reference numerals, and their description is omitted.
Hereinafter, advantages of the fourth embodiment will be described. According to the fourth embodiment, three-dimensional edges can be extracted with high accuracy according to the type of three-dimensional edge.
Claims (21)
- A point cloud data processing apparatus comprising: a non-face removing unit that removes points of non-face regions from point cloud data of a measurement object; a face labeling unit that gives the same label to points on the same face, among the points other than the points removed by the non-face removing unit; a three-dimensional edge extracting unit that extracts three-dimensional edges based on at least one of intersection lines between the faces segmented by the face labeling unit and convex hull lines that convexly envelop the faces; a two-dimensional edge extracting unit that extracts two-dimensional edges from within the faces segmented by the face labeling unit; and an edge integrating unit that integrates the three-dimensional edges and the two-dimensional edges.
- The point cloud data processing apparatus according to claim 1, further comprising a normal calculating unit that obtains a local plane centered on each point of the point cloud data and calculates the normal of the local plane.
- The point cloud data processing apparatus according to claim 2, further comprising a local curvature calculating unit that obtains the standard deviations of the three axial components of the normals and calculates a local curvature as the square root of the sum of the squares of the standard deviations.
- The point cloud data processing apparatus according to claim 3, wherein the non-face removing unit removes points of non-face regions based on the local curvature.
- The point cloud data processing apparatus according to claim 2, wherein the non-face removing unit removes points of non-face regions based on the fitting accuracy of the local plane.
- The point cloud data processing apparatus according to claim 2, wherein the non-face removing unit removes points of non-face regions based on the coplanarity of a point of interest and its adjacent points.
- The point cloud data processing apparatus according to claim 2, wherein the face labeling unit gives the same label to points on the same face based on the angle difference between the normals of a point of interest and its adjacent points.
- The point cloud data processing apparatus according to claim 1, further comprising a noise removing unit that removes noise based on the areas of the faces segmented by the face labeling unit.
- The point cloud data processing apparatus according to claim 1, further comprising a label extending unit that gives the label of the nearest face to points to which no label has been given by the face labeling unit, thereby extending the labels.
- The point cloud data processing apparatus according to claim 1, wherein the point cloud data is data in which the three-dimensional coordinates of each point are linked with a two-dimensional image.
- The point cloud data processing apparatus according to claim 10, wherein the two-dimensional edge extracting unit extracts two-dimensional edges from within regions of the two-dimensional image corresponding to the faces segmented by the face labeling unit.
- The point cloud data processing apparatus according to claim 10, wherein the two-dimensional edge extracting unit extracts two-dimensional edges from within regions of the two-dimensional image corresponding to the faces segmented by the face labeling unit, and judges the two-dimensional edges based on the three-dimensional positions of three-dimensional edges in the vicinity of the two-dimensional edges.
- The point cloud data processing apparatus according to claim 1, further comprising: a rotationally emitting unit that rotationally emits distance measuring light onto the measurement object; a distance measuring unit that measures the distance from its own position to a measurement point on the measurement object based on the time of flight of the distance measuring light; an emission direction detecting unit that detects the emission direction of the distance measuring light; and a three-dimensional coordinate calculating unit that calculates the three-dimensional coordinates of the measurement point based on the distance and the emission direction.
- The point cloud data processing apparatus according to claim 1, further comprising a grid forming unit that, when the distances between the points of the point cloud data are not constant, forms an equally spaced grid and registers the points nearest to the grid intersections.
- The point cloud data processing apparatus according to claim 13, further comprising: an imaging unit that images the measurement object and acquires a two-dimensional image; and a link forming unit that forms point cloud data in which the three-dimensional coordinates of the measurement points are linked with the two-dimensional image.
- The point cloud data processing apparatus according to claim 1, further comprising: a photographing unit that photographs the measurement object in overlapping photographing areas from different directions; a feature point associating unit that associates feature points in the overlapping images obtained by the photographing unit; a photographing position and orientation measuring unit that measures the position and orientation of the photographing unit; and a three-dimensional coordinate calculating unit that calculates the three-dimensional coordinates of the feature points based on the position and orientation of the photographing unit and the positions of the feature points in the overlapping images.
- A point cloud data processing program for causing a computer to execute: a non-face removing procedure of removing points of non-face regions from point cloud data of a measurement object; a face labeling procedure of giving the same label to points on the same face, among the points other than the points removed in the non-face removing procedure; a three-dimensional edge extracting procedure of extracting three-dimensional edges based on at least one of intersection lines between the segmented faces and convex hull lines that convexly envelop the faces; a two-dimensional edge extracting procedure of extracting two-dimensional edges from within the segmented faces; and an edge integrating procedure of integrating the three-dimensional edges and the two-dimensional edges.
- A point cloud data processing method comprising: a non-face removing procedure of removing points of non-face regions from point cloud data of a measurement object; a face labeling procedure of giving the same label to points on the same face, among the points other than the points removed in the non-face removing procedure; a three-dimensional edge extracting procedure of extracting three-dimensional edges based on at least one of intersection lines between the segmented faces and convex hull lines that convexly envelop the faces; a two-dimensional edge extracting procedure of extracting two-dimensional edges from within the segmented faces; and an edge integrating procedure of integrating the three-dimensional edges and the two-dimensional edges.
- A point cloud data processing apparatus comprising: a non-face removing unit that removes points of non-face regions from point cloud data of a measurement object; a face labeling unit that gives the same label to points on the same face, among the points other than the points removed by the non-face removing unit; a three-dimensional edge extracting unit that extracts, as three-dimensional edges, at least one of intersection lines between the faces segmented by the face labeling unit and convex hull lines that convexly envelop the faces; a two-dimensional edge extracting unit that extracts two-dimensional edges from within the faces segmented by the face labeling unit; and an edge integrating unit that integrates the three-dimensional edges and the two-dimensional edges.
- A point cloud data processing apparatus comprising: a non-face removing unit that removes points of non-face regions from point cloud data of a measurement object; a face labeling unit that gives the same label to points on the same face, among the points other than the points removed by the non-face removing unit; a three-dimensional contour extracting unit that extracts a three-dimensional contour of the measurement object based on at least one of intersection lines between the faces segmented by the face labeling unit and convex hull lines that convexly envelop the faces; a two-dimensional contour extracting unit that extracts two-dimensional contours from within the faces segmented by the face labeling unit; and a contour integrating unit that integrates the three-dimensional contours and the two-dimensional contours.
- A point cloud data processing apparatus comprising: a non-face removing unit that removes points of non-face regions from point cloud data of a measurement object; a face extracting unit that gives the same label to points on the same face, among the points other than the points removed by the non-face removing unit, and thereby extracts a plurality of faces constituting the measurement object; and a three-dimensional shape extracting unit that extracts the three-dimensional shape of the measurement object based on the plurality of faces extracted by the face extracting unit.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011545172A JP5480914B2 (ja) | 2009-12-11 | 2010-11-19 | 点群データ処理装置、点群データ処理方法、および点群データ処理プログラム |
CN201080056247.5A CN102713671A (zh) | 2009-12-11 | 2010-11-19 | 点群数据处理装置、点群数据处理方法和点群数据处理程序 |
DE112010004767T DE112010004767T5 (de) | 2009-12-11 | 2010-11-19 | Punktwolkedaten-Verarbeitungsvorrichtung, Punktwolkedaten-Verarbeitungsverfahren und Punktwolkedaten-Verarbeitungsprogramm |
US13/507,117 US9207069B2 (en) | 2009-12-11 | 2012-06-05 | Device for generating a three-dimensional model based on point cloud data |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009281883 | 2009-12-11 | ||
JP2009-281883 | 2009-12-11 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/507,117 Continuation US9207069B2 (en) | 2009-12-11 | 2012-06-05 | Device for generating a three-dimensional model based on point cloud data |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2011070927A1 true WO2011070927A1 (ja) | 2011-06-16 |
WO2011070927A8 WO2011070927A8 (ja) | 2012-06-07 |
Family
ID=44145474
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2010/071188 WO2011070927A1 (ja) | 2009-12-11 | 2010-11-19 | 点群データ処理装置、点群データ処理方法、および点群データ処理プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9207069B2 (ja) |
JP (1) | JP5480914B2 (ja) |
CN (1) | CN102713671A (ja) |
DE (1) | DE112010004767T5 (ja) |
WO (1) | WO2011070927A1 (ja) |
Cited By (36)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012141235A1 (ja) * | 2011-04-13 | 2012-10-18 | 株式会社トプコン | 三次元点群位置データ処理装置、三次元点群位置データ処理システム、三次元点群位置データ処理方法およびプログラム |
WO2016132490A1 (ja) * | 2015-02-18 | 2016-08-25 | 株式会社日立製作所 | 図面作成システム及び図面作成方法 |
CN106643578A (zh) * | 2016-09-30 | 2017-05-10 | 信阳师范学院 | 一种基于点云数据的树干横断面轮廓曲线的断面积计算方法 |
CN106931883A (zh) * | 2017-03-30 | 2017-07-07 | 信阳师范学院 | 一种基于激光点云数据的树干材积获取方法 |
EP3370209A1 (en) | 2017-03-02 | 2018-09-05 | Topcon Corporation | Point cloud data processing device, point cloud data processing method, and point cloud data processing program |
KR101907081B1 (ko) * | 2011-08-22 | 2018-10-11 | 삼성전자주식회사 | 3차원 점군의 물체 분리 방법 |
JP2018165726A (ja) * | 2014-11-19 | 2018-10-25 | 首都高技術株式会社 | 点群データ利用システム |
US10127709B2 (en) | 2014-11-28 | 2018-11-13 | Panasonic Intellectual Property Management Co., Ltd. | Modeling device, three-dimensional model generating device, modeling method, and program |
CN108830931A (zh) * | 2018-05-23 | 2018-11-16 | 上海电力学院 | 一种基于动态网格k邻域搜索的激光点云精简方法 |
EP3454008A1 (en) | 2017-09-06 | 2019-03-13 | Topcon Corporation | Survey data processing device, survey data processing method, and survey data processing program |
CN109816664A (zh) * | 2018-12-25 | 2019-05-28 | 西安中科天塔科技股份有限公司 | 一种三维点云分割方法及装置 |
JP2019105654A (ja) * | 2019-03-26 | 2019-06-27 | 株式会社デンソー | ノイズ除去方法および物体認識装置 |
JP2019518957A (ja) * | 2016-06-07 | 2019-07-04 | ディーエスシージー ソルーションズ,インコーポレイテッド | Lidarを使用した運動の推定 |
US10354447B2 (en) | 2016-09-16 | 2019-07-16 | Topcon Corporation | Image processing device, image processing method, and image processing program |
US10382747B2 (en) | 2016-09-27 | 2019-08-13 | Topcon Corporation | Image processing apparatus, image processing method, and image processing program |
WO2019160022A1 (ja) | 2018-02-14 | 2019-08-22 | 株式会社トプコン | 無人航空機の設置台、測量方法、測量装置、測量システムおよびプログラム |
JP2019148522A (ja) * | 2018-02-28 | 2019-09-05 | 富士通株式会社 | 流水位置検出装置、流水位置検出方法および流水位置検出プログラム |
JP2019192170A (ja) * | 2018-04-27 | 2019-10-31 | 清水建設株式会社 | 3次元モデル生成装置及び3次元モデル生成方法 |
CN110473223A (zh) * | 2019-08-15 | 2019-11-19 | 西南交通大学 | 基于接触网腕臂系统三维点云的二维图像辅助分割方法 |
CN110780307A (zh) * | 2019-05-29 | 2020-02-11 | 武汉星源云意科技有限公司 | 基于电瓶车车载式激光点云移动测量系统获取道路横断面的方法 |
CN110796671A (zh) * | 2019-10-31 | 2020-02-14 | 深圳市商汤科技有限公司 | 数据处理方法及相关装置 |
EP3628968A1 (en) | 2018-09-25 | 2020-04-01 | Topcon Corporation | Survey data processing device, survey data processing method, and survey data processing program |
US10614340B1 (en) | 2019-09-23 | 2020-04-07 | Mujin, Inc. | Method and computing system for object identification |
CN112016570A (zh) * | 2019-12-12 | 2020-12-01 | 天目爱视(北京)科技有限公司 | 用于背景板同步旋转采集中的三维模型生成方法 |
WO2020240949A1 (ja) | 2019-05-31 | 2020-12-03 | 公益財団法人かずさDna研究所 | 三次元計測装置、三次元計測方法および三次元計測用プログラム |
EP3771886A1 (en) | 2019-07-30 | 2021-02-03 | Topcon Corporation | Surveying apparatus, surveying method, and surveying program |
JP2021015600A (ja) * | 2019-07-15 | 2021-02-12 | 株式会社Mujin | 画像データに基づく物体検出システム及び方法 |
EP3792591A1 (en) | 2019-08-26 | 2021-03-17 | Topcon Corporation | Surveying data processing device, surveying data processing method, and surveying data processing program |
JP2021051442A (ja) * | 2019-09-24 | 2021-04-01 | 株式会社PocketRD | 特徴抽出装置、特徴抽出方法及びコンテンツ利用管理装置 |
US10969493B2 (en) | 2017-09-19 | 2021-04-06 | Topcon Corporation | Data processing device, data processing method, and data processing program |
CN113066162A (zh) * | 2021-03-12 | 2021-07-02 | 武汉大学 | 一种用于电磁计算的城市环境快速建模方法 |
WO2021152340A1 (ja) * | 2020-01-31 | 2021-08-05 | 日産自動車株式会社 | 物体認識方法及び物体認識装置 |
CN113487180A (zh) * | 2021-07-05 | 2021-10-08 | 河南理工大学 | 一种基于云平台的齿轮齿面评价方法 |
US11150346B2 (en) | 2016-08-17 | 2021-10-19 | Topcon Corporation | Measuring method and laser scanner |
CN113805179A (zh) * | 2021-08-30 | 2021-12-17 | 南京航空航天大学 | 一种机载气象雷达目标的三维建模方法 |
EP4008997A1 (en) | 2020-07-27 | 2022-06-08 | Topcon Corporation | Surveying system, surveying method, and surveying program |
Families Citing this family (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9189862B2 (en) * | 2010-06-10 | 2015-11-17 | Autodesk, Inc. | Outline approximation for point cloud of building |
JP5067450B2 (ja) * | 2010-06-29 | 2012-11-07 | カシオ計算機株式会社 | 撮影装置、撮影装置の制御装置、撮影装置の制御プログラム、及び撮影装置の制御方法 |
KR101901588B1 (ko) * | 2012-01-02 | 2018-09-28 | 삼성전자주식회사 | 물체 인식 방법과, 물체 인식을 위한 기술자 생성 방법, 물체 인식을 위한 기술자 생성 장치 |
CN103368901A (zh) * | 2012-03-27 | 2013-10-23 | 复旦大学 | 基于大规模离散数据的云计算系统 |
US20140123507A1 (en) * | 2012-11-02 | 2014-05-08 | Qualcomm Incorporated | Reference coordinate system determination |
US10079968B2 (en) | 2012-12-01 | 2018-09-18 | Qualcomm Incorporated | Camera having additional functionality based on connectivity with a host device |
CN103903297B (zh) * | 2012-12-27 | 2016-12-28 | 同方威视技术股份有限公司 | 三维数据处理和识别方法 |
JP5921469B2 (ja) * | 2013-03-11 | 2016-05-24 | 株式会社東芝 | 情報処理装置、クラウドプラットフォーム、情報処理方法およびそのプログラム |
US9530225B1 (en) * | 2013-03-11 | 2016-12-27 | Exelis, Inc. | Point cloud data processing for scalable compression |
CN103196392A (zh) * | 2013-04-17 | 2013-07-10 | 武汉武大卓越科技有限责任公司 | 基于cameralink相机的三维断面采集测量系统与方法 |
CN104252153A (zh) * | 2013-06-28 | 2014-12-31 | 鸿富锦精密工业(深圳)有限公司 | Cnc加工程序生成系统及方法 |
DE102013108713B8 (de) * | 2013-08-12 | 2016-10-13 | WebID Solutions GmbH | Verfahren zum Verifizieren der ldentität eines Nutzers |
CN104574282B (zh) * | 2013-10-22 | 2019-06-07 | 鸿富锦精密工业(深圳)有限公司 | 点云噪声点去除系统及方法 |
US9098754B1 (en) * | 2014-04-25 | 2015-08-04 | Google Inc. | Methods and systems for object detection using laser point clouds |
US20200234494A1 (en) * | 2015-09-11 | 2020-07-23 | Japan Science And Technology Agency | Structure estimating apparatus, structure estimating method, and computer program product |
US10268740B2 (en) | 2015-10-14 | 2019-04-23 | Tharmalingam Satkunarajah | 3D analytics actionable solution support system and apparatus |
CN106023096B (zh) * | 2016-05-10 | 2019-02-22 | 上海交通大学 | 消除边缘锯齿化影响的图像匹配方法 |
US10066946B2 (en) | 2016-08-26 | 2018-09-04 | Here Global B.V. | Automatic localization geometry detection |
US10315866B2 (en) * | 2016-10-20 | 2019-06-11 | Intelligrated Headquarters, Llc | 3D-2D vision system for robotic carton unloading |
KR20180065135A (ko) | 2016-12-07 | 2018-06-18 | 삼성전자주식회사 | 셀프 구조 분석을 이용한 구조 잡음 감소 방법 및 장치 |
US10657665B2 (en) * | 2016-12-07 | 2020-05-19 | Electronics And Telecommunications Research Institute | Apparatus and method for generating three-dimensional information |
WO2018151211A1 (ja) * | 2017-02-15 | 2018-08-23 | 株式会社Preferred Networks | 点群データ処理装置、点群データ処理方法、点群データ処理プログラム、車両制御装置及び車両 |
CN211236238U (zh) | 2017-03-29 | 2020-08-11 | 深圳市大疆创新科技有限公司 | 光检测和测距(lidar)系统及无人载运工具 |
EP3602122A4 (en) | 2017-03-29 | 2020-03-18 | SZ DJI Technology Co., Ltd. | LIDAR SENSOR SYSTEM WITH A SMALL SHAPE FACTOR |
CN110383647B (zh) | 2017-03-29 | 2022-10-25 | 深圳市大疆创新科技有限公司 | 中空马达设备及相关系统和方法 |
CN110476037B (zh) * | 2017-04-03 | 2022-01-11 | 富士通株式会社 | 距离信息处理装置、距离信息处理方法以及距离信息处理程序 |
CN110573901A (zh) | 2017-04-28 | 2019-12-13 | 深圳市大疆创新科技有限公司 | 激光传感器和视觉传感器的校准 |
EP3616159A4 (en) | 2017-04-28 | 2020-05-13 | SZ DJI Technology Co., Ltd. | CALIBRATION OF LASER SENSORS |
WO2018195998A1 (en) | 2017-04-28 | 2018-11-01 | SZ DJI Technology Co., Ltd. | Angle calibration in light detection and ranging system |
CN107274422A (zh) * | 2017-05-08 | 2017-10-20 | 燕山大学 | 一种基于法线信息和k邻域搜索结合的点云边缘提取方法 |
KR101858902B1 (ko) * | 2017-06-26 | 2018-05-16 | 한국도로공사 | 컴포넌트를 활용한 점군 데이터의 객체 위치정보 추출 시스템 |
JP6861592B2 (ja) * | 2017-07-14 | 2021-04-21 | 三菱電機株式会社 | データ間引き装置、測量装置、測量システム及びデータ間引き方法 |
CN116359934A (zh) | 2017-07-20 | 2023-06-30 | 深圳市大疆创新科技有限公司 | 用于光学距离测量的系统和方法 |
EP3631508A4 (en) | 2017-07-31 | 2020-06-24 | SZ DJI Technology Co., Ltd. | MOTION-BASED PRECISION CORRECTION IN POINT CLOUDS |
WO2019041269A1 (en) | 2017-08-31 | 2019-03-07 | SZ DJI Technology Co., Ltd. | CALIBRATION OF DELAY TIME OF OPTICAL DISTANCE MEASURING DEVICES AND ASSOCIATED SYSTEMS AND METHODS |
CN109697728B (zh) * | 2017-10-20 | 2023-04-25 | 阿里巴巴集团控股有限公司 | 数据处理方法、装置、系统和存储介质 |
JP6874855B2 (ja) * | 2017-11-06 | 2021-05-19 | 富士通株式会社 | 算出方法、算出プログラムおよび情報処理装置 |
US10989795B2 (en) * | 2017-11-21 | 2021-04-27 | Faro Technologies, Inc. | System for surface analysis and method thereof |
TWI815842B (zh) * | 2018-01-16 | 2023-09-21 | 日商索尼股份有限公司 | 影像處理裝置及方法 |
JP2019128641A (ja) * | 2018-01-22 | 2019-08-01 | キヤノン株式会社 | 画像処理装置、画像処理方法及びプログラム |
US11093759B2 (en) * | 2018-03-06 | 2021-08-17 | Here Global B.V. | Automatic identification of roadside objects for localization |
US10657388B2 (en) * | 2018-03-13 | 2020-05-19 | Honda Motor Co., Ltd. | Robust simultaneous localization and mapping via removal of dynamic traffic participants |
CN109118500B (zh) * | 2018-07-16 | 2022-05-10 | 重庆大学产业技术研究院 | 一种基于图像的三维激光扫描点云数据的分割方法 |
JP7132037B2 (ja) * | 2018-08-29 | 2022-09-06 | フォルシアクラリオン・エレクトロニクス株式会社 | 車載処理装置 |
US11200430B2 (en) * | 2018-11-05 | 2021-12-14 | Tusimple, Inc. | Systems and methods for detecting trailer angle |
WO2020170246A1 (en) * | 2019-02-18 | 2020-08-27 | Rnvtech Ltd | High resolution 3d display |
US11049282B2 (en) * | 2019-02-28 | 2021-06-29 | Intelligrated Headquarters, Llc | Vision calibration system for robotic carton unloading |
CN110009745B (zh) * | 2019-03-08 | 2023-01-06 | 浙江中海达空间信息技术有限公司 | 根据平面基元和模型驱动对点云提取平面的方法 |
DE102019114572B3 (de) * | 2019-05-29 | 2020-10-01 | Technische Universität München | Verfahren und system zur dreidimensionalen erfassung einer szene |
CN110443875B (zh) * | 2019-07-25 | 2022-11-29 | 广州启量信息科技有限公司 | 一种基于建筑点云数据的轴线自动绘制系统 |
JP7313998B2 (ja) * | 2019-09-18 | 2023-07-25 | 株式会社トプコン | 測量データ処理装置、測量データ処理方法および測量データ処理用プログラム |
CN111127312B (zh) * | 2019-12-25 | 2023-08-22 | 武汉理工大学 | 一种复杂物体点云提取圆的方法及扫描装置 |
CN111025331B (zh) * | 2019-12-25 | 2023-05-23 | 湖北省空间规划研究院 | 一种基于旋转结构的激光雷达建图方法及其扫描系统 |
EP4091143A4 (en) * | 2020-01-17 | 2023-09-13 | Hewlett-Packard Development Company, L.P. | MOVEMENT CARDS |
CN113297340B (zh) * | 2020-02-23 | 2023-12-19 | 北京初速度科技有限公司 | 点云地图的矢量化、矢量地图转化点云地图的方法和装置 |
JP7468002B2 (ja) * | 2020-03-10 | 2024-04-16 | 日本電気株式会社 | 異常箇所表示装置、異常箇所表示システム、異常箇所表示方法、及び異常箇所表示プログラム |
US11354547B2 (en) | 2020-03-31 | 2022-06-07 | Toyota Research Institute, Inc. | Systems and methods for clustering using a smart grid |
CN111666137B (zh) * | 2020-04-26 | 2022-04-05 | 广州文远知行科技有限公司 | 数据标注方法、装置、计算机设备和存储介质 |
CN111710023A (zh) * | 2020-06-16 | 2020-09-25 | 武汉称象科技有限公司 | 一种三维点云数据特征点提取方法及应用 |
CN111898684A (zh) * | 2020-07-31 | 2020-11-06 | 陈艳 | 一种基于多维点云数据的生物种属鉴别方法 |
CN112132840B (zh) * | 2020-09-01 | 2023-11-07 | 济南市房产测绘研究院(济南市房屋安全检测鉴定中心) | 一种车载行道树点云分类与特征信息提取方法 |
WO2022203663A1 (en) * | 2021-03-24 | 2022-09-29 | Hewlett-Packard Development Company, L.P. | Triangulation-based anomaly detection in three dimensional printers |
JP2023022517A (ja) * | 2021-08-03 | 2023-02-15 | 株式会社東芝 | 計測システム及び計測プログラム |
CN113420735B (zh) * | 2021-08-23 | 2021-12-21 | 深圳市信润富联数字科技有限公司 | 一种轮廓提取方法、装置、设备及存储介质 |
US11928824B2 (en) * | 2021-09-13 | 2024-03-12 | International Business Machines Corporation | Three-dimensional segmentation annotation |
CN113838114B (zh) * | 2021-09-22 | 2023-08-29 | 中南大学 | 一种基于边缘散焦追踪的高炉料面深度估计方法及系统 |
KR20230069670A (ko) | 2021-11-12 | 2023-05-19 | 고려대학교 산학협력단 | 측정센서의 위치를 추정하는 장치 및 그 방법 |
CN114310872B (zh) * | 2021-11-29 | 2023-08-22 | 杭州电子科技大学 | 一种基于dgg点云分割网络的机械臂自动打菜方法 |
CN114494609B (zh) * | 2022-04-02 | 2022-09-06 | 中国科学技术大学 | 一种3d目标检测模型的构建方法、装置和电子设备 |
CN115239625B (zh) * | 2022-06-21 | 2023-05-09 | 厦门微图软件科技有限公司 | 顶盖焊点云缺陷检测方法、装置、设备及存储介质 |
CN114859374B (zh) * | 2022-07-11 | 2022-09-09 | 中国铁路设计集团有限公司 | 基于无人机激光点云和影像融合的新建铁路交叉测量方法 |
CN115984801A (zh) * | 2023-03-07 | 2023-04-18 | 安徽蔚来智驾科技有限公司 | 点云目标检测方法、计算机设备、存储介质及车辆 |
CN117268498B (zh) * | 2023-11-20 | 2024-01-23 | 中国航空工业集团公司金城南京机电液压工程研究中心 | 一种油量测量方法及系统 |
CN117274995B (zh) * | 2023-11-22 | 2024-02-13 | 北京科技大学 | 基于点云数据的二维泡沫图像标签自动生成方法和装置 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006085333A (ja) * | 2004-09-15 | 2006-03-30 | Armonicos:Kk | 非接触測定点群リバースシステム、非接触測定点群リバースエンジニアリング方法及びそのプログラム |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5988862A (en) | 1996-04-24 | 1999-11-23 | Cyra Technologies, Inc. | Integrated system for quickly and accurately imaging and modeling three dimensional objects |
JP2004272459A (ja) | 2003-03-06 | 2004-09-30 | Cad Center:Kk | 三次元形状の自動生成装置、自動生成方法、そのプログラム、及びそのプログラムを記録した記録媒体 |
JP4427656B2 (ja) | 2003-07-01 | 2010-03-10 | 学校法人東京電機大学 | 測量データの処理方法 |
US20050140670A1 (en) * | 2003-11-20 | 2005-06-30 | Hong Wu | Photogrammetric reconstruction of free-form objects with curvilinear structures |
US7944561B2 (en) * | 2005-04-25 | 2011-05-17 | X-Rite, Inc. | Measuring an appearance property of a surface using a bidirectional reflectance distribution function |
EP2120009B1 (en) * | 2007-02-16 | 2016-09-07 | Mitsubishi Electric Corporation | Measuring device and measuring method |
US8179393B2 (en) * | 2009-02-13 | 2012-05-15 | Harris Corporation | Fusion of a 2D electro-optical image and 3D point cloud data for scene interpretation and registration performance assessment |
-
2010
- 2010-11-19 WO PCT/JP2010/071188 patent/WO2011070927A1/ja active Application Filing
- 2010-11-19 DE DE112010004767T patent/DE112010004767T5/de active Pending
- 2010-11-19 CN CN201080056247.5A patent/CN102713671A/zh active Pending
- 2010-11-19 JP JP2011545172A patent/JP5480914B2/ja active Active
-
2012
- 2012-06-05 US US13/507,117 patent/US9207069B2/en active Active
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006085333A (ja) * | 2004-09-15 | 2006-03-30 | Armonicos:Kk | 非接触測定点群リバースシステム、非接触測定点群リバースエンジニアリング方法及びそのプログラム |
Cited By (61)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPWO2012141235A1 (ja) * | 2011-04-13 | 2014-07-28 | Kabushiki Kaisha Topcon | Three-dimensional point cloud position data processing device, three-dimensional point cloud position data processing system, three-dimensional point cloud position data processing method, and program |
US9053547B2 (en) | 2011-04-13 | 2015-06-09 | Kabushiki Kaisha Topcon | Three-dimensional point cloud position data processing device, three-dimensional point cloud position data processing system, and three-dimensional point cloud position data processing method and program |
JP6030549B2 (ja) * | 2011-04-13 | 2016-11-24 | Kabushiki Kaisha Topcon | Three-dimensional point cloud position data processing device, three-dimensional point cloud position data processing system, three-dimensional point cloud position data processing method, and program |
WO2012141235A1 (ja) * | 2011-04-13 | 2012-10-18 | Kabushiki Kaisha Topcon | Three-dimensional point cloud position data processing device, three-dimensional point cloud position data processing system, three-dimensional point cloud position data processing method, and program |
KR101907081B1 (ko) * | 2011-08-22 | 2018-10-11 | Samsung Electronics Co., Ltd. | Method for separating objects in a three-dimensional point cloud |
JP2018165726A (ja) * | 2014-11-19 | 2018-10-25 | Shutoko Engineering Co., Ltd. | Point cloud data utilization system |
US10127709B2 (en) | 2014-11-28 | 2018-11-13 | Panasonic Intellectual Property Management Co., Ltd. | Modeling device, three-dimensional model generating device, modeling method, and program |
WO2016132490A1 (ja) * | 2015-02-18 | 2016-08-25 | Hitachi, Ltd. | Drawing creation system and drawing creation method |
JPWO2016132490A1 (ja) * | 2015-02-18 | 2017-05-25 | Hitachi, Ltd. | Drawing creation system and drawing creation method |
JP2019518957A (ja) * | 2016-06-07 | 2019-07-04 | DSCG Solutions, Inc. | Motion estimation using LIDAR |
JP7057289B2 (ja) | 2016-06-07 | 2022-04-19 | DSCG Solutions, Inc. | Motion estimation using LIDAR |
US11150346B2 (en) | 2016-08-17 | 2021-10-19 | Topcon Corporation | Measuring method and laser scanner |
US10354447B2 (en) | 2016-09-16 | 2019-07-16 | Topcon Corporation | Image processing device, image processing method, and image processing program |
US10382747B2 (en) | 2016-09-27 | 2019-08-13 | Topcon Corporation | Image processing apparatus, image processing method, and image processing program |
CN106643578B (zh) * | 2016-09-30 | 2018-12-21 | Xinyang Normal University | Method for calculating the cross-sectional area of trunk cross-section contour curves based on point cloud data |
CN106643578A (zh) * | 2016-09-30 | 2017-05-10 | Xinyang Normal University | Method for calculating the cross-sectional area of trunk cross-section contour curves based on point cloud data |
EP3370209A1 (en) | 2017-03-02 | 2018-09-05 | Topcon Corporation | Point cloud data processing device, point cloud data processing method, and point cloud data processing program |
US10810699B2 (en) | 2017-03-02 | 2020-10-20 | Topcon Corporation | Point cloud data processing device, point cloud data processing method, and point cloud data processing program |
CN106931883A (zh) * | 2017-03-30 | 2017-07-07 | Xinyang Normal University | Method for obtaining trunk volume based on laser point cloud data |
EP3454008A1 (en) | 2017-09-06 | 2019-03-13 | Topcon Corporation | Survey data processing device, survey data processing method, and survey data processing program |
US10565730B2 (en) | 2017-09-06 | 2020-02-18 | Topcon Corporation | Survey data processing device, survey data processing method, and survey data processing program |
US10969493B2 (en) | 2017-09-19 | 2021-04-06 | Topcon Corporation | Data processing device, data processing method, and data processing program |
WO2019160022A1 (ja) | 2018-02-14 | 2019-08-22 | Kabushiki Kaisha Topcon | Placement table for unmanned aerial vehicle, surveying method, surveying device, surveying system, and program |
US11221216B2 (en) | 2018-02-14 | 2022-01-11 | Topcon Corporation | Placement table for unmanned aerial vehicle, surveying method, surveying device, surveying system and program |
JP2019148522A (ja) * | 2018-02-28 | 2019-09-05 | Fujitsu Limited | Flowing water position detection device, flowing water position detection method, and flowing water position detection program |
JP2019192170A (ja) * | 2018-04-27 | 2019-10-31 | Shimizu Corporation | Three-dimensional model generation device and three-dimensional model generation method |
JP7133971B2 (ja) | 2018-04-27 | 2022-09-09 | Shimizu Corporation | Three-dimensional model generation device and three-dimensional model generation method |
CN108830931B (zh) * | 2018-05-23 | 2022-07-01 | Shanghai University of Electric Power | Laser point cloud simplification method based on dynamic grid k-neighborhood search |
CN108830931A (zh) * | 2018-05-23 | 2018-11-16 | Shanghai University of Electric Power | Laser point cloud simplification method based on dynamic grid k-neighborhood search |
EP3628968A1 (en) | 2018-09-25 | 2020-04-01 | Topcon Corporation | Survey data processing device, survey data processing method, and survey data processing program |
CN109816664A (zh) * | 2018-12-25 | 2019-05-28 | Xi'an Zhongke Tianta Technology Co., Ltd. | Three-dimensional point cloud segmentation method and device |
JP2019105654A (ja) * | 2019-03-26 | 2019-06-27 | Denso Corporation | Noise removal method and object recognition device |
CN110780307B (zh) * | 2019-05-29 | 2023-03-31 | Wuhan Xingyuan Yunyi Technology Co., Ltd. | Method for acquiring road cross-sections using a battery-powered vehicle-mounted laser point cloud mobile measurement system |
CN110780307A (zh) * | 2019-05-29 | 2020-02-11 | Wuhan Xingyuan Yunyi Technology Co., Ltd. | Method for acquiring road cross-sections using a battery-powered vehicle-mounted laser point cloud mobile measurement system |
WO2020240949A1 (ja) | 2019-05-31 | 2020-12-03 | Kazusa DNA Research Institute | Three-dimensional measurement device, three-dimensional measurement method, and three-dimensional measurement program |
US11941852B2 (en) | 2019-05-31 | 2024-03-26 | Kazusa Dna Research Institute | Three-dimensional measurement device, three-dimensional measurement method, and three-dimensional measurement program |
US11288814B2 (en) | 2019-07-15 | 2022-03-29 | Mujin, Inc. | System and method of object detection based on image data |
JP2021015600A (ja) * | 2019-07-15 | 2021-02-12 | Mujin, Inc. | System and method of object detection based on image data |
EP3771886A1 (en) | 2019-07-30 | 2021-02-03 | Topcon Corporation | Surveying apparatus, surveying method, and surveying program |
US11725938B2 (en) | 2019-07-30 | 2023-08-15 | Topcon Corporation | Surveying apparatus, surveying method, and surveying program |
CN110473223A (zh) * | 2019-08-15 | 2019-11-19 | Southwest Jiaotong University | Two-dimensional image-assisted segmentation method based on three-dimensional point clouds of a catenary cantilever system |
CN110473223B (zh) * | 2019-08-15 | 2023-05-05 | Southwest Jiaotong University | Two-dimensional image-assisted segmentation method based on three-dimensional point clouds of a catenary cantilever system |
US11580696B2 (en) | 2019-08-26 | 2023-02-14 | Topcon Corporation | Surveying data processing device, surveying data processing method, and surveying data processing program |
EP3792591A1 (en) | 2019-08-26 | 2021-03-17 | Topcon Corporation | Surveying data processing device, surveying data processing method, and surveying data processing program |
US11763459B2 (en) | 2019-09-23 | 2023-09-19 | Mujin, Inc. | Method and computing system for object identification |
US10614340B1 (en) | 2019-09-23 | 2020-04-07 | Mujin, Inc. | Method and computing system for object identification |
JP2021051712A (ja) * | 2019-09-23 | 2021-04-01 | Mujin, Inc. | Method and computing system for object identification |
JP7433609B2 (ja) | 2019-09-23 | 2024-02-20 | Mujin, Inc. | Method and computing system for object identification |
JP2021051442A (ja) * | 2019-09-24 | 2021-04-01 | PocketRD, Inc. | Feature extraction device, feature extraction method, and content use management device |
CN110796671A (zh) * | 2019-10-31 | 2020-02-14 | Shenzhen SenseTime Technology Co., Ltd. | Data processing method and related device |
CN112016570B (zh) * | 2019-12-12 | 2023-12-26 | Tianmu Aishi (Beijing) Technology Co., Ltd. | Three-dimensional model generation method for synchronized background-plate rotation capture |
CN112016570A (zh) * | 2019-12-12 | 2020-12-01 | Tianmu Aishi (Beijing) Technology Co., Ltd. | Three-dimensional model generation method for synchronized background-plate rotation capture |
JPWO2021152340A1 (ja) * | 2020-01-31 | | | |
WO2021152340A1 (ja) * | 2020-01-31 | 2021-08-05 | Nissan Motor Co., Ltd. | Object recognition method and object recognition device |
EP4246088A1 (en) | 2020-07-27 | 2023-09-20 | Topcon Corporation | Surveying system, surveying method, and surveying program |
EP4008997A1 (en) | 2020-07-27 | 2022-06-08 | Topcon Corporation | Surveying system, surveying method, and surveying program |
CN113066162B (zh) * | 2021-03-12 | 2022-04-29 | Wuhan University | Rapid urban environment modeling method for electromagnetic computation |
CN113066162A (zh) * | 2021-03-12 | 2021-07-02 | Wuhan University | Rapid urban environment modeling method for electromagnetic computation |
CN113487180A (zh) * | 2021-07-05 | 2021-10-08 | Henan Polytechnic University | Gear tooth surface evaluation method based on a cloud platform |
CN113805179A (zh) * | 2021-08-30 | 2021-12-17 | Nanjing University of Aeronautics and Astronautics | Three-dimensional modeling method for airborne weather radar targets |
CN113805179B (zh) * | 2021-08-30 | 2024-03-08 | Nanjing University of Aeronautics and Astronautics | Three-dimensional modeling method for airborne weather radar targets |
Also Published As
Publication number | Publication date |
---|---|
US9207069B2 (en) | 2015-12-08 |
WO2011070927A8 (ja) | 2012-06-07 |
US20120256916A1 (en) | 2012-10-11 |
DE112010004767T5 (de) | 2013-01-24 |
CN102713671A (zh) | 2012-10-03 |
JPWO2011070927A1 (ja) | 2013-04-22 |
JP5480914B2 (ja) | 2014-04-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011070927A1 (ja) | | Point cloud data processing device, point cloud data processing method, and point cloud data processing program |
JP5343042B2 (ja) | | Point cloud data processing device and point cloud data processing program |
JP5465128B2 (ja) | | Point cloud position data processing device, point cloud position data processing system, point cloud position data processing method, and point cloud position data processing program |
JP5462093B2 (ja) | | Point cloud data processing device, point cloud data processing system, point cloud data processing method, and point cloud data processing program |
KR102296236B1 (ko) | | System and method for improved scoring of 3D poses and spurious point removal in 3D image data |
US7310431B2 (en) | | Optical methods for remotely measuring objects |
JP6030549B2 (ja) | | Three-dimensional point cloud position data processing device, three-dimensional point cloud position data processing system, three-dimensional point cloud position data processing method, and program |
JP5593177B2 (ja) | | Point cloud position data processing device, point cloud position data processing method, point cloud position data processing system, and point cloud position data processing program |
US9972120B2 (en) | | Systems and methods for geometrically mapping two-dimensional images to three-dimensional surfaces |
JP2013101045A (ja) | | Device and method for recognizing the three-dimensional position and orientation of an article |
US20140078519A1 (en) | | Laser Scanner |
WO2013061976A1 (ja) | | Shape inspection method and device therefor |
KR101973917B1 (ko) | | Three-dimensional measuring device and measurement support processing method thereof |
US20170140537A1 (en) | | System and method for scoring clutter for use in 3D point cloud matching in a vision system |
JP7353757B2 (ja) | | Method for measuring artifacts |
EP3975116A1 (en) | | Detecting displacements and/or defects in a point cloud using cluster-based cloud-to-cloud comparison |
Coudrin et al. | | An innovative hand-held vision-based digitizing system for 3D modelling |
JP6425406B2 (ja) | | Information processing device, information processing method, and program |
US20220414925A1 (en) | | Tracking with reference to a world coordinate system |
US11669988B1 (en) | | System and method for three-dimensional box segmentation and measurement |
Uyanik et al. | | A method for determining 3D surface points of objects by a single camera and rotary stage |
Kainz et al. | | Estimation of camera intrinsic matrix parameters and its utilization in the extraction of dimensional units |
Логвиненко | | 3D scanning as a modern way to create three-dimensional virtual model |
WO2023163760A1 (en) | | Tracking with reference to a world coordinate system |
Lang et al. | | Active object modeling with VIRTUE |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| WWE | Wipo information: entry into national phase | Ref document number: 201080056247.5; Country of ref document: CN |
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 10835849; Country of ref document: EP; Kind code of ref document: A1 |
| WWE | Wipo information: entry into national phase | Ref document number: 2011545172; Country of ref document: JP |
| WWE | Wipo information: entry into national phase | Ref document number: 1120100047677; Country of ref document: DE; Ref document number: 112010004767; Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 10835849; Country of ref document: EP; Kind code of ref document: A1 |