WO2016132489A1 - Drawing creation system and drawing creation method - Google Patents

Drawing creation system and drawing creation method

Info

Publication number
WO2016132489A1
Authority
WO
WIPO (PCT)
Prior art keywords
shape
unit
reference surface
dimensional
creation system
Prior art date
Application number
PCT/JP2015/054481
Other languages
French (fr)
Japanese (ja)
Inventor
Keisuke Fujimoto
Takashi Watanabe
Original Assignee
Hitachi, Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi, Ltd.
Priority to JP2016557092A (patent JP6227801B2)
Priority to PCT/JP2015/054481
Publication of WO2016132489A1

Links

Images

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 - Computer-aided design [CAD]

Definitions

  • the present invention relates to a system and method for creating a drawing from shape data measured by a shape measurement sensor.
  • a shape measurement sensor can measure the three-dimensional shape of an object by measuring the distance to surrounding objects.
  • using a shape measurement sensor, the shape of an entire space can be acquired more quickly and with higher accuracy than by hand measurement, and shapes in high or dangerous places that human hands cannot reach can be measured non-destructively from a distance.
  • Patent Document 1 Japanese Patent Laid-Open No. 2005-43248
  • Patent Document 2 Japanese Patent Laid-Open No. 2003-323461
  • Patent Document 3 International Publication WO 2011/070927
  • in Patent Document 1, an asymmetrical three-dimensional marker is attached to a measurement location on the object to be measured, and a partial region of the object including the marker is measured optically in three dimensions with a shape measuring instrument to obtain measurement values; the instrument is then moved, and a partial region at another measurement location including the marker is measured optically in three dimensions to obtain further measurement values.
  • Patent Document 2 (Japanese Patent Laid-Open No. 2003-323461) describes a CAD data creation device that acquires the position and shape of an object as point cloud data and displays the point cloud on a display unit based on that data, and that includes a point cloud designating unit for designating a point cloud representing a member constituting the target object and a point cloud classification unit that separates the point cloud data underlying the designated member from other point cloud data.
  • Arbitrary attribute information can be added to the point cloud data of the members classified by the point cloud classification unit, and CAD data for drawing a 3D-CAD drawing can be created for each attribute.
  • in this way, a 3D-CAD drawing of the entire object can be created.
  • Patent Document 3 (International Publication WO 2011/070927) describes a point cloud data processing apparatus that includes: a non-surface removal unit 16 that removes points in non-surface regions from the point cloud data of the measurement object; a surface labeling unit that gives the same label to points on the same surface among the remaining points; a three-dimensional edge extraction unit that extracts three-dimensional edges based on at least one of the intersection lines of the surfaces divided by the surface labeling unit and the convex hulls wrapping those surfaces; a two-dimensional edge extraction unit that extracts two-dimensional edges from the surfaces divided by the surface labeling unit; and an edge integration unit that integrates the three-dimensional edges and the two-dimensional edges.
  • by such measurement, the surrounding shape can be obtained as a set of measurement points (hereinafter referred to as a point cloud). Then, as described in Patent Document 2, a drawing is created from the measured point cloud data.
  • one approach is to indicate or search for the position of a predetermined object, such as a wall or a pillar, in the measured point cloud and describe it in the drawing. However, indoors there are many shielding objects such as desks, and large areas of the floor cannot be measured; likewise, only part of a wall may be measurable because of shielding objects such as shelves. Furthermore, information about what objects exist in the space cannot be obtained in advance. In other words, to create a drawing from point cloud data, the objects that exist in the measurement target space and their locations must be determined in an unknown environment where only part of the space can be measured.
  • a typical example of the invention disclosed in the present application is as follows: a drawing creation system for creating a shape drawing from the surrounding measurement results obtained by a measurement unit, comprising a reference surface height setting unit that accepts a height setting for extracting a reference surface; a reference surface extraction unit that extracts a reference surface at the height set by the reference surface height setting unit; a reference normal estimation unit that estimates a reference normal, which is the normal of the reference surface; and a shape restoration unit that creates a drawing showing the shape of the reference surface as a two-dimensional shape extended from the position of the reference surface by a set length in the direction of the reference normal.
  • FIG. 1 is a diagram illustrating a schematic configuration of a drawing creation system according to the present embodiment.
  • the drawing creation system stores the shape data 101 measured by the shape measuring unit 100 and the drawing information 116 representing the restored shape in a storage device (memory 12, auxiliary storage device 13).
  • in the present embodiment, a set of points (a point cloud) is handled as an example of the shape data 101.
  • however, any data that represents a shape may be used; specifically, it is sufficient that the shape can be acquired, even if information on edges and vertices is not included.
  • the reference surface height setting unit 102 accepts the height setting by the operator. For example, it is sufficient to set a height (about 2 m) above which there are few artifacts, since such heights are difficult to reach directly by hand. Because there are few artifacts in high places, a shape there (for example, a wall) can be measured without being shielded by artifacts during shape measurement.
  • the height may be set to a lower limit height for extracting the reference plane or a height range for extracting the reference plane.
  • the reference surface extraction unit 103 extracts from the shape data 101 a surface that exists at a height higher than the height set by the reference surface height setting unit 102 (or within the set height range) and serves as a reference for shape restoration. In the present embodiment, the entire shape is restored based on a high surface (for example, a ceiling surface) with less shielding. Note that the reference surface may be a surface other than the ceiling as long as it is in a high place where the shape can be easily measured.
  • specifically, a plane that faces the vertical direction and is higher than the height specified by the operator is extracted by a plane search algorithm such as the Hough transform.
  • the reference normal estimation unit 106 estimates the normal vector of the shape data on the reference surface extracted by the reference surface extraction unit 103.
  • a normal vector can be estimated by fitting a plane to the shape data on the reference surface using the least squares method and taking the normal of the fitted plane. Alternatively, the operator may directly input the normal value.
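As an illustrative sketch only (the patent gives no code; the function name and the use of an SVD in place of an explicit least-squares solve are assumptions), the normal of the reference surface can be estimated from its point cloud like this:

```python
import numpy as np

def estimate_plane_normal(points):
    """Fit a plane to 3-D points and return its unit normal.

    The singular vector associated with the smallest singular value of the
    centered points is the direction of least variance, i.e. the plane normal.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    return normal / np.linalg.norm(normal)
```

For points lying on a horizontal ceiling the estimate is (0, 0, ±1); the sign can be fixed afterwards, for example so that the normal points down into the room.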
  • the shape restoration unit 115 extends the shape of the reference surface in the direction of the estimated normal to convert it into three-dimensional information, restores the entire shape, and stores it as the drawing information 116.
  • FIG. 2 is a diagram showing a detailed configuration of the drawing creation system of the present embodiment.
  • in FIG. 2, the description of the components that are the same as those already described in the schematic configuration (FIG. 1) is omitted.
  • the drawing creation system shown in FIG. 2 includes optional components: a reference surface height setting unit 102, a reference surface extraction unit 103, a shape data display unit 104, a shape data selection unit 105, a reference normal estimation unit 106, an artifact shape extraction unit 107, a two-dimensional projection unit 108, a shape extraction unit 109, an image display unit 111, a shape editing unit 112, a corresponding reference surface extraction unit 113, a corresponding reference surface height setting unit 114, and a shape restoration unit 115.
  • the drawing creation system stores, in a storage device (memory 12, auxiliary storage device 13), the shape data 101 obtained by measuring the surrounding shape with the shape measuring unit 100, the two-dimensional shape data 110 extracted by the shape extraction unit 109, and the drawing information 116 representing the shape.
  • the reference surface height setting unit 102 accepts the height setting by the operator. Then, the reference surface extraction unit 103 extracts a surface which is present at a height higher than the height set by the reference surface height setting unit 102 and serves as a reference for shape restoration from the shape data 101 (see FIG. 7).
  • the shape data display unit 104 displays the shape data 101 on the display device 19.
  • the shape data selection unit 105 receives a range selected by the operator from the shape data displayed on the screen, and selects shape data within the selected range.
  • the reference surface extraction unit 103 may use a surface constituted by the shape data of the selected range as the reference surface (see FIG. 6).
  • the reference normal estimation unit 106 estimates the normal vector (reference normal) of the shape data on the reference surface extracted by the reference surface extraction unit 103.
  • the artifact shape extraction unit 107 extracts partial shape data existing within a range set in the direction of the estimated reference normal from the reference surface extracted by the reference surface extraction unit 103.
  • the set range may be set by an operator as a range in the height direction from which partial shape data is extracted, or may be a preset range in the height direction.
  • the two-dimensional projection unit 108 projects the shape data extracted by the artifact shape extraction unit 107 onto a two-dimensional image plane that is parallel to the reference plane (see FIG. 8).
  • the artifact perpendicular to the reference plane is represented as a line shape on the two-dimensional image. For example, when the reference plane is a ceiling, a vertically installed wall is represented as a straight line in the two-dimensional image.
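As a sketch of this projection step (not code from the patent; `project_to_plane` is a hypothetical helper), 3-D points can be mapped to 2-D coordinates in a plane perpendicular to the reference normal:

```python
import numpy as np

def project_to_plane(points, normal):
    """Project 3-D points onto 2-D coordinates in the plane perpendicular to `normal`."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    # Pick any helper axis not parallel to n, then build an orthonormal basis (u, v).
    helper = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    u = np.cross(n, helper)
    u = u / np.linalg.norm(u)
    v = np.cross(n, u)
    pts = np.asarray(points, dtype=float)
    return np.stack([pts @ u, pts @ v], axis=1)
```

The coordinate along the normal is discarded, so all points of a vertical wall collapse onto the same line in the image, which is exactly why walls appear as straight lines here.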
  • the shape extraction unit 109 extracts line shapes from the two-dimensional image and stores them as the two-dimensional shape data 110 (see FIG. 9). Specifically, lines are automatically recognized from the two-dimensional image using a shape search algorithm such as the Hough transform or an active contour extraction method, and the two-dimensional shape data 110 is extracted.
  • the image display unit 111 displays a two-dimensional image on the screen.
  • the shape editing unit 112 accepts designation by the operator of the two-dimensional shape data of the artifact perpendicular to the reference plane.
  • the shape extraction unit 109 may extract the two-dimensional shape data 110 according to an operator's specification.
  • the corresponding reference plane extraction unit 113 extracts, as a corresponding reference plane, an area that is separated from the extracted reference plane by a predetermined distance in the direction of the estimated reference normal.
  • the corresponding reference surface height setting unit 114 receives the setting of the predetermined distance by the operator (see FIG. 10). When extracting the corresponding reference surface, a surface having a normal in the same direction as the reference surface and existing vertically below the measurement point may be automatically recognized.
  • the shape restoration unit 115 sets the reference surface as the upper surface and the corresponding reference surface as the lower surface, and converts the two-dimensional shape data 110 extracted by the shape extraction unit 109 into three-dimensional information by moving it in the normal direction of the reference surface between the reference surface and the corresponding reference surface (see FIG. 11). In this way, the entire shape can be restored as the drawing information 116 by interpolating between the reference surface and the corresponding reference surface.
  • FIG. 3 is a block diagram showing the physical configuration of the drawing creation system of the present embodiment.
  • the drawing creation system of this embodiment is configured by a computer having a processor (CPU) 11, a memory 12, an auxiliary storage device 13, and a communication interface 14.
  • the processor 11 executes a program stored in the memory 12.
  • the memory 12 includes a ROM that is a nonvolatile storage element and a RAM that is a volatile storage element.
  • the ROM stores an immutable program (for example, a BIOS: basic input/output system).
  • the RAM is a high-speed and volatile storage element such as a DRAM (Dynamic Random Access Memory), and temporarily stores a program executed by the processor 11 and data used when the program is executed.
  • the auxiliary storage device 13 is a large-capacity non-volatile storage device such as a magnetic storage device (HDD) or a flash memory (SSD), for example.
  • the auxiliary storage device 13 stores the program executed by the processor 11 and the data used when the program is executed (for example, map data). That is, the program is read from the auxiliary storage device 13, loaded into the memory 12, and executed by the processor 11.
  • the drawing creation system may have an input interface 15 and an output interface 18.
  • the input interface 15 is an interface to which a keyboard 16 and a mouse 17 are connected and receives an input from an operator.
  • the output interface 18 is an interface to which a display device 19 or a printer is connected, and outputs an execution result of the program in a format that can be visually recognized by the operator.
  • the communication interface 14 is a network interface device that controls communication with other devices according to a predetermined protocol.
  • the drawing creation system may be connected to a terminal (not shown) via the communication interface 14, may operate according to an instruction input from the terminal, and may output a calculation result to the terminal.
  • the program executed by the processor 11 is provided to the drawing creation system via a removable medium (CD-ROM, flash memory, etc.) or a network, and is stored in the auxiliary storage device 13, which is a non-temporary storage medium. For this reason, the drawing creation system may have an interface for reading data from a removable medium.
  • the drawing creation system is a computer system configured on one physical computer or on a plurality of logically or physically configured computers, and it may operate in separate threads on the same computer or on virtual machines constructed on a plurality of physical computer resources.
  • FIG. 4 is a flowchart of processing executed by the drawing creation system of this embodiment.
  • the reference surface height setting unit 102 executes a reference surface height setting process for receiving a height setting by an operator (S102).
  • the reference surface extraction unit 103 executes a reference surface extraction process for extracting from the shape data 101 a surface that exists at a height higher than the height set in the reference surface height setting process S102 and serves as a reference for shape restoration (S103).
  • the shape data display unit 104 executes shape data display processing for displaying the shape data 101 on the display device 19 (S104).
  • the shape data selection unit 105 receives the range selected by the operator from the shape data displayed on the screen, selects the shape data in the selected range, and executes a shape data selection process that outputs the selection to the reference surface extraction unit 103 (S105).
  • the reference surface extraction unit 103 may use a surface constituted by the shape data in the range selected in the shape data selection process S105 as the reference surface.
  • the reference normal estimation unit 106 executes a reference normal estimation process for estimating a normal vector (reference normal) of shape data on the reference plane extracted in the reference plane extraction process S103 (S106).
  • the artifact shape extraction unit 107 extracts, from the reference plane extracted in the reference plane extraction process S103, partial shape data existing within a predetermined range in the direction of the estimated reference normal. An extraction process is executed (S107).
  • the two-dimensional projection unit 108 executes a two-dimensional projection process for projecting the shape data extracted in the artifact shape extraction process S107 onto a two-dimensional image plane parallel to the reference plane (S108).
  • the shape extraction unit 109 extracts a line shape from the two-dimensional image, and executes a shape extraction process for storing the extracted shape as the two-dimensional shape data 110 (S109).
  • the image display unit 111 executes image display processing for displaying a two-dimensional image on the screen (S111).
  • the shape editing unit 112 receives the designation by the operator of the two-dimensional shape data of the artifact perpendicular to the reference surface, and executes a shape editing process whose result is output to the shape extraction unit 109 (S112).
  • the shape extraction unit 109 may extract the two-dimensional shape data 110 according to an operator's specification.
  • the corresponding reference surface extraction unit 113 executes a corresponding reference surface extraction process that extracts, as a corresponding reference surface, a region separated from the extracted reference surface by a predetermined distance in the direction of the estimated reference normal (S113).
  • the corresponding reference surface height setting unit 114 receives the setting of a predetermined distance by the operator and executes a corresponding reference surface height setting process that outputs the distance to the corresponding reference surface extraction unit 113 (S114).
  • the shape restoration unit 115 sets the reference surface as the upper surface and the corresponding reference surface as the lower surface, and executes a shape restoration process that converts the two-dimensional shape data 110 (or the shape of the reference surface) extracted in the shape extraction process S109 into three-dimensional information by moving it in the normal direction of the reference surface between the reference surface and the corresponding reference surface (S115).
  • FIG. 5 is a diagram showing the measurement of the surrounding shape by the shape measuring unit 100.
  • the shape measuring unit 100 irradiates laser light in various directions and measures the time until the reflected light returns, thereby measuring the distance to an object existing in the irradiated direction.
  • by irradiating laser light in all directions and repeatedly measuring the time until the reflected light is received, the shape measuring unit 100 can measure the shapes of all surrounding objects.
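The time-of-flight relation behind such a laser scanner is simple: the light travels out and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (names are illustrative, not from the patent):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_time_s):
    """Distance to the reflecting surface from the measured laser round-trip time."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0
```

For example, a 20 ns round trip corresponds to a target roughly 3 m away.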
  • structures 201 such as a ceiling, walls, and pillars, and objects 202 installed indoors such as desks, chairs, shelves, and boards are illustrated.
  • these can be measured as sets of points 203, 204, and 205 corresponding to the laser light irradiation points.
  • the point cloud 203 shown in FIG. 5 is drawn with the point clouds on the front and right surfaces, which would actually be measured, omitted so that the interior is visible in the figure.
  • rough shapes 204 such as walls are drawn with large points, and minute indoor objects 205 are drawn with small points to indicate fine shapes.
  • in reality, the size of the laser light irradiation points is substantially the same, and the density of the points is also substantially uniform.
  • the measurement density may be partially different (for example, for each region).
  • Measurement of the surroundings alone does not provide information on the relationship between measurement points and objects. Furthermore, regions that the laser beam does not reach directly (for example, shapes hidden behind a shelf or a board) cannot be measured. Since the wall behind large objects such as shelves and boards can be measured only in a small area, it is difficult to determine the presence of a wall there. In addition, areas blocked by objects such as desks and chairs make it difficult to measure the entire floor surface. On the other hand, since the laser beam is less likely to be blocked by shielding objects at high places such as the ceiling, shapes in high places can be measured easily. For this reason, in this embodiment, the entire shape is restored based on the shape of the high place.
  • FIG. 6 is a diagram showing a screen for selecting a range that the shape data selection unit 105 accepts.
  • the shape data display unit 104 displays the shape data 101 on the display device 19.
  • the operator operates an input device such as the mouse 17 to designate a desired range 300.
  • the shape data selection unit 105 receives the range 300 specified by the operator and selects the shape data 301 existing within the specified range. More specifically, the shape data selection unit 105 includes a mode switching unit that switches between a shape selection mode and a shape selection cancellation mode. When the operator designates a range in the shape selection mode, the shape data within the range is selected; when the operator designates a range in the shape selection cancellation mode, the shape data within the range is deselected.
  • the shape data selection unit 105 may automatically extract the object in front and deselect the objects behind it.
  • specifically, a plane extraction algorithm such as the Hough transform is used to divide the shape data included in the selection range into a plurality of planes and extract the planes of the objects. Then, by selecting the plane closest to the front on the display screen from the plurality of extracted planes, only the frontmost object can be extracted. Alternatively, the frontmost plane may be selected by the operator deselecting the point cloud on the back side and selecting a point on the front side.
  • FIG. 7 is a diagram illustrating the extraction of the reference plane by the reference plane extraction unit 103.
  • the reference surface extraction unit 103 extracts a plane existing above the received height using a plane search algorithm such as the Hough transform. Since simply extracting planes in this way may yield a plurality of plane candidates, when the shape data selection unit 105 has accepted a selected shape 400, only the plane 401 including the selected shape 400 is extracted.
  • FIG. 8 is a diagram illustrating the projection of an entity onto a two-dimensional image by the two-dimensional projection unit 108.
  • the reference normal estimation unit 106 estimates the direction of the reference normal 500 of the reference surface 401 extracted by the reference surface extraction unit 103.
  • the normal direction can be estimated by fitting a plane to the shape on the reference surface using the least squares method.
  • a robust estimation method such as RANSAC may be used to make the fit robust to outliers.
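The patent only names RANSAC; as an illustrative sketch of such a robust fit (the function name, threshold, and iteration count are assumptions, not from the patent):

```python
import numpy as np

def ransac_plane(points, n_iter=200, threshold=0.02, seed=0):
    """Minimal RANSAC plane fit: repeatedly fit a plane through 3 random points
    and keep the candidate with the most inliers within `threshold`."""
    rng = np.random.default_rng(seed)
    pts = np.asarray(points, dtype=float)
    best_count, best = -1, None
    for _ in range(n_iter):
        a, b, c = pts[rng.choice(len(pts), size=3, replace=False)]
        n = np.cross(b - a, c - a)
        norm = np.linalg.norm(n)
        if norm < 1e-12:
            continue  # degenerate sample (collinear points)
        n = n / norm
        count = int((np.abs((pts - a) @ n) < threshold).sum())
        if count > best_count:
            best_count, best = count, (a, n)
    return best  # (point on plane, unit normal)
```

Unlike a plain least-squares fit, points hanging below the ceiling (lamps, pipes) simply end up as outliers and do not tilt the estimated plane.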
  • the artifact shape extraction unit 107 extracts shape data 502 existing within a predetermined range 501 in the direction of the estimated reference normal 500 from the reference surface 401.
  • the predetermined range 501 is a region containing only a small number of objects other than walls when the ceiling is the reference surface 401. For example, regions far from the ceiling contain objects other than walls, such as desks and chairs, so extracting the shape in such regions reduces the robustness of shape extraction.
  • in a range extremely close to the ceiling, objects suspended from the ceiling may be detected. To avoid both, a range from 20 cm to 100 cm from the ceiling may be set as the predetermined range 501. More generally, any range containing few objects other than those installed perpendicular to the reference surface 401 may be used.
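Selecting such a slice of the point cloud amounts to filtering by signed distance below the reference surface. A sketch under assumed names (the 20 cm to 100 cm defaults follow the example above):

```python
import numpy as np

def slice_below_reference(points, plane_point, normal, near=0.2, far=1.0):
    """Keep only points whose distance below the reference surface (measured
    along the normal) lies in [near, far], e.g. 20 cm to 100 cm below a ceiling."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    pts = np.asarray(points, dtype=float)
    depth = (np.asarray(plane_point, dtype=float) - pts) @ n  # distance below the plane
    return pts[(depth >= near) & (depth <= far)]
```

Points too close to the ceiling (suspended objects) and too far from it (desks, chairs) both fall outside the band and are discarded.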
  • specifically, the operator inputs a numerical value on the input screen, the points within the range corresponding to the input value are displayed in a predetermined color (for example, red), and the operator confirms the range on the screen and operates the "OK" button.
  • this process may be repeated two or more times to extract a plurality of predetermined ranges 501.
  • in this way, the predetermined ranges 501 can be set so as to avoid obstructing objects.
  • the two-dimensional projection unit 108 projects the shape 503 extracted by the artifact shape extraction unit 107 onto a two-dimensional plane whose normal is the same as the direction of the reference normal 500 of the reference surface 401, generating a two-dimensional image 504. Note that the point cloud shown in FIG. 8(A) is drawn with the front and right walls omitted, as described above; in reality, there are point clouds in those regions as well. For this reason, the two-dimensional image (FIG. 8(B)) also shows point clouds corresponding to the front and right walls.
  • FIG. 9 is a diagram illustrating extraction of the two-dimensional shape data 110 from the two-dimensional image on which the artifact is projected.
  • the shape 503 extracted by the artifact shape extraction unit 107 extends perpendicularly to the reference surface, and therefore appears as a line on the two-dimensional image.
  • for example, a wall appears as a straight line, a cylinder as a circle, and a prism as a rectangle.
  • the shape extraction unit 109 searches for a line shape 600 from the two-dimensional image, and extracts the searched line shape as two-dimensional shape data.
  • a line can be automatically extracted from a two-dimensional image using the Hough transform.
  • the active contour method can be used to extract the contour of an entire wall composed of a plurality of lines.
  • the image display unit 111 displays a two-dimensional image including the extracted two-dimensional shape data on the display device 19.
  • the operator can correct the two-dimensional shape data by operating an input device such as the mouse 17.
  • the shape editing unit 112 accepts correction of two-dimensional shape data by an operator's operation.
  • a line 601 can be added by clicking two points of a start point and an end point.
  • the line 602 can be deleted by clicking an arbitrary point on the line.
  • a cylinder and a prism can be added by designating a shape and a plurality of points.
  • shapes that can be extracted or edited are not limited to those described above, as long as they are represented as lines on the two-dimensional image.
  • FIG. 10 shows a screen for setting the distance 701 from the reference surface 401 to the corresponding reference surface 700, which is received by the corresponding reference surface height setting unit 114.
  • the ceiling is the reference plane 401
  • the floor is the corresponding reference plane 700
  • the corresponding reference plane 700 has the same shape as the reference plane 401.
  • specifically, a surface having the same shape as the reference surface 401 is generated at a position separated from the reference surface 401 by the predetermined distance 701 in the direction of the reference normal 500, and the generated surface is used as the corresponding reference surface 700.
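Generating the corresponding reference surface is just a translation of the reference-surface points along the reference normal. A sketch with an assumed function name:

```python
import numpy as np

def offset_plane(plane_points, normal, distance):
    """Generate the corresponding reference surface by shifting the
    reference-surface points by `distance` along the unit reference normal."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return np.asarray(plane_points, dtype=float) - distance * n
```

With the ceiling at a height of 3 m and a distance of 2.5 m, the generated floor surface has the same footprint as the ceiling, shifted down to 0.5 m.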
  • the corresponding reference surface 700 may be arranged within the range over which the planes measured by the shape measuring unit 100 are distributed.
  • the distance from the reference surface 401 (the ceiling height) is input by the operator.
  • alternatively, an interface may be provided that allows the operator to specify the area corresponding to the corresponding reference surface on the screen while adjusting its height by operating an input device such as the mouse 17 (for example, by dragging and dropping the reference surface 401).
  • alternatively, a plane existing vertically below may be extracted using a plane extraction method such as the Hough transform, and the shape of the reference surface may be applied at the height of the extracted plane.
  • FIG. 11 is a diagram illustrating restoration of the entire shape from the reference plane 401, the corresponding reference plane 700, and the two-dimensional shape data 800 by the shape restoration unit 115.
  • the two-dimensional shape data 800 becomes the two-dimensional shape data 801 in three-dimensional space, oriented according to the direction of the reference normal 500 that the two-dimensional projection unit 108 used when projecting the shape data (C, D).
  • the two-dimensional shape data 801 is moved in the direction of the reference normal 500 between the reference surface 401 and the corresponding reference surface 700, stretching the two-dimensional shape into a three-dimensional shape 803 with height.
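This sweep can be sketched per line segment: each 2-D wall line is extruded into a vertical quad spanning from the reference surface down to the corresponding reference surface (names are illustrative, not from the patent):

```python
import numpy as np

def extrude_segment(p2d, q2d, top_z, bottom_z):
    """Extrude one 2-D wall segment into a vertical quad spanning from the
    reference surface (top_z) down to the corresponding reference surface (bottom_z)."""
    (x0, y0), (x1, y1) = p2d, q2d
    return np.array([
        [x0, y0, top_z],     # top edge of the wall
        [x1, y1, top_z],
        [x1, y1, bottom_z],  # bottom edge of the wall
        [x0, y0, bottom_z],
    ])
```

Applying this to every extracted line and combining the quads with the ceiling and floor surfaces yields the restored room shape.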
  • the room shape 806 can be restored by combining the three-dimensional shape 803 thus created, the reference planes 401 (A, B), and the corresponding reference planes 700 (A, B) (F).
  • surfaces in high places can be measured easily because they are less shielded by artifacts.
  • the present embodiment also exploits the property that artifacts such as walls and pillars extend in the vertical direction. For this reason, according to the present embodiment, shapes can be measured easily even in heavily shielded environments with no prior information about the shape, and drawings can be created robustly from the measured shape data.
  • since the image display unit 111 is combined with the shape editing unit 112, which executes at least one of adding a two-dimensional shape to the image displayed on the display device 19 and correcting the two-dimensional shape extracted by the shape extraction unit 109, the point cloud can be extracted reliably.
  • since the artifact shape extraction unit 107 extracts shape data within two or more set ranges 501 in the direction of the reference normal 500 from the reference surface 401, the two-dimensional shape can be extracted accurately even in an environment where shielding objects are arranged in a complicated manner, and the accuracy of the drawing can be improved.
  • since the shape extraction unit 109 extracts a line shape from the two-dimensional image created by the two-dimensional projection unit 108, a highly accurate drawing can be created with a small amount of calculation.
  • the corresponding reference surface extraction unit 113 extracts shape data located at the set distance 701 from the reference surface 401 as the corresponding reference surface (floor surface) 700, and the shape restoration unit 115 restores the three-dimensional shape between the two surfaces from the two-dimensional shape extracted by the shape extraction unit 109.
  • the corresponding reference surface height setting unit 114 receives a distance set by the operator, and the corresponding reference surface extraction unit 113 extracts a surface existing within a set range from the position away from the reference surface 401 by that distance, so the floor surface can be accurately determined.
  • the system includes a shape data display unit 104 that displays shape data on the screen, and a corresponding reference plane height setting unit 114 that receives selection of a shape included in the region of the shape data displayed on the display device 19 by the shape data display unit 104. Since the corresponding reference surface extraction unit 113 extracts the corresponding reference surface 700 so as to include the selected shape data, the floor surface can be accurately determined.
  • since the reference plane extraction unit 103 extracts the reference plane (ceiling) 401 so as to include the shape data selected by the shape data selection unit 105, the ceiling can be accurately selected.
  • the shape data selection unit 105 selects the entity located frontmost, which simplifies the operator's selection work.
  • FIG. 12 is a diagram showing creation of a histogram according to the second embodiment.
  • the two-dimensional projection unit 108 projects shape data 903 within a range 902 designated in the direction of the reference normal 500 of the reference surface 401 onto a two-dimensional image 904 as shown in FIG.
  • the range 902 does not have to be the entire range from the ceiling to the floor, and may be a range that sufficiently includes information to be included in the drawing.
  • a histogram value corresponding to the number of points in the region corresponding to each pixel of the two-dimensional image 904 is calculated, the pixel color is determined according to the histogram value, and the two-dimensional image 904 is created.
  • where points can be measured at the same position over a wide range from the ceiling to the floor, the histogram value becomes high (905).
  • where only part of that range can be measured, the histogram value is slightly lower (906).
  • for other objects, the number of measurement points depends on the object's height, giving a histogram value according to that number. For example, a position where both the top board and the legs (e.g. of a desk) are measured has a high histogram value (907), and a position where only the top board is measured has a low histogram value (908).
  • the shape extraction unit 109 extracts a two-dimensional shape using the two-dimensional image 904 created by the two-dimensional projection unit 108.
  • a two-dimensional shape is extracted by a Hough transform, an active contour (dynamic contour) method, or the like, using the histogram value of each pixel as a weight.
  • the two-dimensional projection unit 108 creates a histogram according to the number of points extracted by the artifact shape extraction unit 107, and the shape extraction unit 109 extracts a two-dimensional shape using the histogram. Therefore, even in an environment where a shield is placed against the wall, the shape can be extracted more robustly than in the first embodiment described above, and a robust drawing can be created.
  • FIG. 13 is a diagram illustrating extraction of pipes connected to a wall.
  • the reference plane extraction unit 103 extracts the wall 1001 located at a high place as a reference plane.
  • the normal direction 1002 estimated by the reference normal estimation unit 106 is the same as the direction of the pipe 1000 extending perpendicularly from the wall.
  • the pipe projected by the two-dimensional projection unit 108 therefore appears as a circular shape 1004 in the two-dimensional image 1003.
  • the shape extraction unit 109 can extract the two-dimensional shape data 110 of the pipe by extracting the circular shape 1004 with a Hough transform, or the operator can specify it directly using the shape editing unit 112.
  • the shape restoration unit 115 can then convert the pipe into drawing information.
  • piping can be detected particularly in an area where there are few shields such as high places.
  • the present invention is not limited to the above-described embodiments, and includes various modifications and equivalent configurations within the scope of the appended claims.
  • the above-described embodiments have been described in detail for easy understanding of the present invention, and the present invention is not necessarily limited to those having all the configurations described.
  • a part of the configuration of one embodiment may be replaced with the configuration of another embodiment.
  • another configuration may be added, deleted, or replaced.
  • each of the above-described configurations, functions, processing units, processing means, etc. may be realized partly or wholly in hardware, for example by designing them as an integrated circuit, or may be realized in software by a processor interpreting and executing a program that implements each function.
  • Information such as programs, tables, and files that realize each function can be stored in a storage device such as a memory, a hard disk, and an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, and a DVD.
  • control lines and information lines indicate what is considered necessary for the explanation, and do not necessarily indicate all control lines and information lines required for implementation. In practice, almost all components can be considered connected to each other.
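The histogram-based two-dimensional projection described above for the second embodiment can be sketched as follows (a minimal illustration; the pixel size, the assumption of non-negative coordinates, and the function names are not prescribed by the patent):

```python
import numpy as np

def project_histogram(points, pixel_size=0.1):
    """Project 3-D points straight down onto a 2-D grid and count, per
    pixel, how many measured points fall into its vertical column; tall
    structures such as walls accumulate high histogram values.
    (Assumes non-negative x/y coordinates for simplicity.)"""
    pts = np.asarray(points, dtype=float)
    cols = (pts[:, 0] // pixel_size).astype(int)
    rows = (pts[:, 1] // pixel_size).astype(int)
    hist = np.zeros((rows.max() + 1, cols.max() + 1), dtype=int)
    np.add.at(hist, (rows, cols), 1)     # unbuffered per-pixel counting
    return hist

# A wall spans floor to ceiling within one pixel; a table top is one point.
wall = [(0.05, 0.05, z) for z in np.linspace(0.0, 2.5, 26)]
table = [(0.55, 0.05, 0.7)]
hist = project_histogram(np.array(wall + table))
```

The wall pixel receives one count per measured height, so its histogram value is far higher than that of the single table-top point, which is the contrast the shape extraction unit exploits as a weight.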

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Evolutionary Computation (AREA)
  • Geometry (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

This drawing preparation system prepares shape drawings from the results of a measurement unit measuring the surroundings, and is provided with: a reference plane height setting unit which receives a setting of the height at which a reference plane is to be extracted; a reference plane extraction unit which extracts a reference plane in a high place at the height set by the reference plane height setting unit; a reference normal line estimation unit which estimates the reference normal line that is normal to the reference plane; and a shape reconstruction unit which, treating the shape of the reference plane as a two-dimensional shape, prepares a drawing representing a shape that extends from the position of the reference plane in the direction of the reference normal line by a set length.

Description

Drawing creation system and drawing creation method
 The present invention relates to a system and method for creating a drawing from shape data measured by a shape measurement sensor.
 As one means for recording the shape of a space, there is a shape measurement sensor that can measure the three-dimensional shape of surrounding entities by measuring the distance to them. Using a shape measurement sensor, the shape of the entire space can be acquired more quickly and with higher accuracy than by hand measurement, and shapes in high places or dangerous locations that human hands cannot reach can be measured non-destructively from a distance.
 As background art of this technology, there are JP-A-2005-43248 (Patent Document 1), JP-A-2003-323461 (Patent Document 2), and International Publication No. WO 2011/070927 (Patent Document 3). Patent Document 1 (Japanese Patent Laid-Open No. 2005-43248) describes a three-dimensional shape measuring method in which an asymmetric three-dimensional marker is attached to a measurement location on an object to be measured, a partial region of the object is optically measured in three dimensions together with the marker by a shape measuring instrument to obtain measured values, and the shape measuring instrument is then moved to optically measure, in three dimensions, a partial region at another measurement location including the marker. The marker is recognized from the measured values, its position and orientation are calculated, a coordinate conversion coefficient is calculated based on the marker, the measured values are converted into the same coordinate system, and the outer shape of the object is thereby measured.
 Patent Document 2 (Japanese Patent Laid-Open No. 2003-323461) describes a CAD data creation device comprising: a point cloud data acquisition unit that acquires the position and shape of an object as point cloud data and displays the point cloud on a display unit based on the point cloud data; a point cloud designation unit that designates, from the acquired point cloud representing the object, a point cloud representing a member constituting the object; and a point cloud classification unit that separates the point cloud data underlying the designated member from other point cloud data. Arbitrary attribute information can be added to the point cloud data representing the members classified by the point cloud classification unit, and CAD data for drawing a 3D-CAD drawing can be created for each piece of attribute information. By combining the 3D-CAD drawings created for each member or attribute, a 3D-CAD drawing of the entire object can be created.
 Furthermore, Patent Document 3 (International Publication No. WO 2011/070927) describes a point cloud data processing device comprising: a non-surface removal unit that removes points in non-surface regions from the point cloud data of a measurement object; a surface labeling unit that gives the same label to points on the same surface, for the points remaining after removal; a three-dimensional edge extraction unit that extracts three-dimensional edges based on at least one of the intersection lines of the surfaces separated by the surface labeling unit and convex hull lines that convexly wrap the surfaces; a two-dimensional edge extraction unit that extracts two-dimensional edges from within the surfaces separated by the surface labeling unit; and an edge integration unit that integrates the three-dimensional edges and the two-dimensional edges.
Patent Document 1: Japanese Patent Laid-Open No. 2005-43248
Patent Document 2: Japanese Patent Laid-Open No. 2003-323461
Patent Document 3: International Publication No. WO 2011/070927
 According to the above-described prior art, shape measurement using laser light as described in Patent Document 1 can measure the surrounding shape as a set of measurement points (hereinafter referred to as a point cloud). Then, as described in Patent Document 2, a drawing is created from the measured point cloud data.
 In order to create a drawing from a point cloud, there is a method of indicating or searching for the position of a predetermined object, such as a wall or a pillar, on the measured point cloud and describing it in the drawing. Indoors, however, there are many shielding objects such as desks, so many areas of the floor cannot be measured. Likewise, only part of a wall can be measured because of shielding objects such as shelves. Furthermore, information about what objects exist in the space cannot be obtained in advance. In other words, to create a drawing from point cloud data, it is necessary to determine what objects exist in the measured space and where they are located, in an unknown environment where only part of the space can be measured.
 According to the method described in Patent Document 3, a model such as an edge can be recognized from a measured point cloud even in an unknown environment. However, when a target is measured with laser light, no data can be acquired for the region behind a shielding object because the laser light does not reach it. That is, the method described in Patent Document 3 cannot recognize the shape of a region where no data exists, and therefore cannot produce a drawing of regions the laser does not reach.
 A method has also been proposed for searching for objects in an unknown environment, with walls and cylinders as the object types. However, with this method, when there is much shielding, as indoors, and most of the shapes to be described in the drawing cannot be measured, a point cloud sufficient for the search cannot be obtained, and it is difficult to search for objects robustly.
 A typical example of the invention disclosed in the present application is as follows: a drawing creation system that creates a shape drawing from the surrounding measurement results of a measurement unit, comprising: a reference surface height setting unit that accepts a setting of the height at which a reference surface is to be extracted; a reference surface extraction unit that extracts a reference surface in a high place at the height set by the reference surface height setting unit; a reference normal estimation unit that estimates a reference normal, i.e. the normal of the reference surface; and a shape restoration unit that creates a drawing in which the shape of the reference surface, taken as a two-dimensional shape, is extended from the position of the reference surface in the direction of the reference normal by a set length.
 According to one aspect of the present invention, a drawing with few failures, that is, a robust drawing, can be created from point cloud data. Problems, configurations, and effects other than those described above will become apparent from the following description of the embodiments.
FIG. 1 is a diagram showing the schematic configuration of the drawing creation system of the first embodiment.
FIG. 2 is a diagram showing the detailed configuration of the drawing creation system of the first embodiment.
FIG. 3 is a block diagram showing the physical configuration of the drawing creation system of the first embodiment.
FIG. 4 is a flowchart of the processing executed by the drawing creation system of the first embodiment.
FIG. 5 is a diagram showing measurement of the surrounding shape by the shape measurement unit of the first embodiment.
FIG. 6 is a diagram showing a screen for selecting the range accepted by the shape data selection unit of the first embodiment.
FIG. 7 is a diagram showing extraction of the reference plane by the reference plane extraction unit of the first embodiment.
FIG. 8 is a diagram showing projection of entities onto a two-dimensional image by the two-dimensional projection unit of the first embodiment.
FIG. 9 is a diagram showing extraction of the two-dimensional shape data of the first embodiment.
FIG. 10 is a diagram showing a screen for setting the distance accepted by the corresponding reference plane height setting unit of the first embodiment.
FIG. 11 is a diagram showing restoration of the entire shape by the shape restoration unit of the first embodiment.
FIG. 12 is a diagram showing creation of a histogram in the second embodiment.
FIG. 13 is a diagram showing extraction of piping connected to a wall in the third embodiment.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
<First embodiment>
 FIG. 1 is a diagram showing the schematic configuration of the drawing creation system of the present embodiment.
 The drawing creation system shown in FIG. 1 includes, as a minimum configuration, a reference surface height setting unit 102, a reference surface extraction unit 103, a reference normal estimation unit 106, and a shape restoration unit 115. In addition, the drawing creation system stores the shape data 101 obtained by the shape measurement unit 100 measuring the surrounding shape, and the drawing information 116 representing the restored shape, in storage devices (the memory 12 and the auxiliary storage device 13).
 In the present embodiment, a set of points (a point cloud) is handled as an example of the shape data 101, but any data that represents a shape (specifically, data from which the shape can be obtained, without edge or vertex information) may be used.
 First, the reference surface height setting unit 102 accepts a height setting from the operator. For example, a height that human hands rarely reach directly and where there are few artifacts (about 2 m) may be set. Since there are few artifacts in high places, a shape (for example, a wall) can be measured without being shielded by artifacts during shape measurement. Note that the height setting may be the lower-limit height for extracting the reference plane, or a height range within which the reference plane is to be extracted.
 The reference surface extraction unit 103 extracts from the shape data 101 a surface that exists at or above the height set by the reference surface height setting unit 102 (or within the set height range) and that serves as the reference for shape restoration. In the present embodiment, the entire shape is restored based on a high surface with little shielding (for example, the ceiling surface). Note that the reference surface may be a surface other than the ceiling, as long as it is in a high place where the shape can be measured easily. When extracting the reference plane, a plane located in the vertical direction and existing at or above the height specified by the operator is extracted by a plane search algorithm such as the Hough transform.
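As a rough illustration of this reference-plane extraction, the following sketch picks the dominant horizontal surface at or above the set height using a simple height histogram. This is a simplification: the patent names plane-search algorithms such as the Hough transform, and all function names and bin sizes here are assumptions:

```python
import numpy as np

def extract_reference_plane(points, min_height):
    """Pick the dominant horizontal surface at or above min_height by
    taking the most populated 10 cm height bin (a stand-in for the
    plane search performed with e.g. a Hough transform)."""
    pts = np.asarray(points, dtype=float)
    high = pts[pts[:, 2] >= min_height]        # keep only high points
    bins = np.round(high[:, 2], 1)             # 10 cm height bins
    values, counts = np.unique(bins, return_counts=True)
    ceiling_z = values[np.argmax(counts)]      # most populated bin
    plane = high[np.abs(high[:, 2] - ceiling_z) < 0.05]
    return plane, ceiling_z

# Synthetic scene: a ceiling at z = 2.5 m plus a small fixture below it.
rng = np.random.default_rng(1)
ceiling = np.column_stack([rng.uniform(0, 5, (100, 2)), np.full((100, 1), 2.5)])
fixture = np.array([[2.0, 2.0, 2.2], [2.1, 2.0, 2.2]])
plane_pts, ceiling_z = extract_reference_plane(np.vstack([ceiling, fixture]),
                                               min_height=2.0)
```

Because the ceiling dominates the point counts above the set height, the fixture points are ignored and the ceiling plane is selected.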
 The reference normal estimation unit 106 estimates the normal vector of the shape data on the reference surface extracted by the reference surface extraction unit 103. Specifically, the normal vector can be estimated by fitting a plane to the shape data on the reference surface using the least squares method and taking the normal of the fitted plane. Alternatively, the operator may directly input the normal value.
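The least-squares plane fit described here can be sketched as follows: the normal is the smallest principal component of the centered points, computed via SVD. The function name and synthetic data are illustrative assumptions, not part of the patent:

```python
import numpy as np

def estimate_plane_normal(points):
    """Least-squares plane fit: the normal is the singular vector of
    the centered points with the smallest singular value."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    if normal[2] > 0:          # orient downward, as for a ceiling normal
        normal = -normal
    return normal / np.linalg.norm(normal)

# Synthetic ceiling patch at z = 2.5 m with slight measurement noise.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 5, size=(200, 2))
z = 2.5 + rng.normal(0, 0.005, size=200)
normal = estimate_plane_normal(np.column_stack([xy, z]))
```

For a near-horizontal ceiling patch the estimated normal comes out close to (0, 0, -1), i.e. pointing down into the room.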
 The shape restoration unit 115 extends the shape of the reference surface in the direction of the estimated normal, thereby converting it into three-dimensional information, restores the entire shape, and stores it as the drawing information 116.
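A minimal sketch of this restoration step, sweeping the reference-plane outline along the normal to produce wall faces. The vertex layout and function names are assumptions; the patent does not prescribe a data format:

```python
import numpy as np

def extrude_outline(outline_2d, ceiling_height, floor_height,
                    normal=(0.0, 0.0, -1.0)):
    """Sweep a 2-D outline on the reference plane along the reference
    normal, producing one quadrilateral wall face per outline edge."""
    n = np.asarray(normal, dtype=float)
    top = [np.array([x, y, ceiling_height], dtype=float)
           for x, y in outline_2d]
    depth = ceiling_height - floor_height
    bottom = [p + n * depth for p in top]
    faces = []
    for i in range(len(top)):
        j = (i + 1) % len(top)
        faces.append([top[i], top[j], bottom[j], bottom[i]])
    return faces

# A rectangular ceiling outline extruded down to the floor at z = 0.
room = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
walls = extrude_outline(room, ceiling_height=2.5, floor_height=0.0)
```

Each face spans from the ceiling down to the floor, which is exactly the interpolation between reference surface and corresponding reference surface described later for FIG. 11.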
 FIG. 2 is a diagram showing the detailed configuration of the drawing creation system of the present embodiment. In the description of FIG. 2, among the elements described above for the schematic configuration (FIG. 1), the description of parts identical to the detailed configuration (FIG. 2) is omitted.
 The drawing creation system shown in FIG. 2, including optional components, has a reference surface height setting unit 102, a reference surface extraction unit 103, a shape data display unit 104, a shape data selection unit 105, a reference normal estimation unit 106, an artifact shape extraction unit 107, a two-dimensional projection unit 108, a shape extraction unit 109, two-dimensional shape data 110, an image display unit 111, a shape editing unit 112, a corresponding reference surface extraction unit 113, a corresponding reference surface height setting unit 114, and a shape restoration unit 115. In addition, the drawing creation system stores, in storage devices (the memory 12 and the auxiliary storage device 13), the shape data 101 obtained by the shape measurement unit 100 measuring the surrounding shape, the two-dimensional shape data 110 extracted by the shape extraction unit 109, and the drawing information 116 representing the shape.
 First, the reference surface height setting unit 102 accepts a height setting from the operator. Then, the reference surface extraction unit 103 extracts from the shape data 101 a surface that exists at or above the height set by the reference surface height setting unit 102 and that serves as the reference for shape restoration (see FIG. 7).
 The shape data display unit 104 displays the shape data 101 on the display device 19. The shape data selection unit 105 accepts a range selected by the operator from the shape data displayed on the screen and selects the shape data within the selected range. The reference surface extraction unit 103 may use a surface constituted by the shape data in the selected range as the reference surface (see FIG. 6).
 The reference normal estimation unit 106 estimates the normal vector (reference normal) of the shape data on the reference surface extracted by the reference surface extraction unit 103.
 The artifact shape extraction unit 107 extracts partial shape data existing within a set range in the direction of the estimated reference normal from the reference surface extracted by the reference surface extraction unit 103. This range may be set by the operator as the height range from which partial shape data is extracted, or it may be a preset height range.
 The two-dimensional projection unit 108 projects the shape data extracted by the artifact shape extraction unit 107 onto a two-dimensional image plane parallel to the reference plane (see FIG. 8). An artifact perpendicular to the reference plane is represented as a line shape on the two-dimensional image. For example, when the reference plane is the ceiling, a vertically installed wall is represented as a straight line in the two-dimensional image.
 The shape extraction unit 109 extracts line shapes from the two-dimensional image and stores the extracted shapes as the two-dimensional shape data 110 (see FIG. 9). Specifically, lines are automatically recognized from the two-dimensional image information using a plane search algorithm such as the Hough transform or an active contour extraction method, and the two-dimensional shape data 110 is extracted.
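The line extraction by Hough transform mentioned here can be sketched with a minimal accumulator. This is an illustrative implementation, not the patent's; a production system would typically use a library routine, and the coarse angle count in the example is chosen only to keep the toy case unambiguous:

```python
import numpy as np

def hough_lines(image, n_theta=180, top_k=1):
    """Minimal Hough transform: each nonzero pixel votes for all
    (rho, theta) lines passing through it; return the strongest."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    diag = int(np.ceil(np.hypot(*image.shape)))
    acc = np.zeros((2 * diag, n_theta), dtype=int)
    ys, xs = np.nonzero(image)
    for x, y in zip(xs, ys):
        # rho = x*cos(theta) + y*sin(theta), quantized to integer bins
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    peaks = np.argsort(acc.ravel())[::-1][:top_k]
    return [(int(r) - diag, float(thetas[t]))
            for r, t in (np.unravel_index(p, acc.shape) for p in peaks)]

# A wall projected into the 2-D image appears as the vertical line x = 7.
img = np.zeros((20, 20), dtype=np.uint8)
img[:, 7] = 1
(rho, theta), = hough_lines(img, n_theta=4)  # angles 0, 45, 90, 135 deg
```

All pixels of the wall vote for the same (rho, theta) bin, so the accumulator peak recovers the line x = 7 at angle 0.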
 The image display unit 111 displays the two-dimensional image on the screen. The shape editing unit 112 accepts the operator's designation of the two-dimensional shape data of artifacts perpendicular to the reference plane. The shape extraction unit 109 may extract the two-dimensional shape data 110 according to the operator's designation.
 The corresponding reference surface extraction unit 113 extracts, as the corresponding reference surface, a region separated from the extracted reference surface by a predetermined distance in the direction of the estimated reference normal. The corresponding reference surface height setting unit 114 accepts the operator's setting of this predetermined distance (see FIG. 10). Note that when extracting the corresponding reference surface, a surface that has a normal in the same direction as the reference surface and that lies vertically below the measurement point may be recognized automatically.
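A sketch of selecting the corresponding reference surface (for example, the floor) as the points lying at the set distance from the reference surface along the reference normal. The tolerance value and function names are assumptions:

```python
import numpy as np

def extract_corresponding_plane(points, ref_point, normal, distance,
                                tol=0.05):
    """Select points whose signed offset from the reference plane,
    measured along the reference normal, is close to `distance`."""
    pts = np.asarray(points, dtype=float)
    offsets = (pts - np.asarray(ref_point, dtype=float)) \
        @ np.asarray(normal, dtype=float)
    return pts[np.abs(offsets - distance) < tol]

ceiling_point = (0.0, 0.0, 2.5)          # a point on the reference plane
downward = (0.0, 0.0, -1.0)              # reference normal
cloud = np.array([[1.0, 1.0, 2.5],       # on the ceiling
                  [2.0, 1.0, 0.01],      # on the floor
                  [2.0, 2.0, 1.2]])      # mid-height clutter (e.g. a desk)
floor = extract_corresponding_plane(cloud, ceiling_point, downward,
                                    distance=2.5)
```

Only the point about 2.5 m below the ceiling survives the filter, so the floor is isolated from both the ceiling and mid-height clutter.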
 The shape restoration unit 115 converts the two-dimensional shape data 110 extracted by the shape extraction unit 109 into three-dimensional information by sweeping it along the normal direction of the reference surface between the reference surface, as the upper surface, and the corresponding reference surface, as the lower surface (see FIG. 11). By interpolating between the reference surface and the corresponding reference surface in this way, the entire shape can be restored as the drawing information 116.
 FIG. 3 is a block diagram showing the physical configuration of the drawing creation system of the present embodiment.
 The drawing creation system of the present embodiment is configured as a computer having a processor (CPU) 11, a memory 12, an auxiliary storage device 13, and a communication interface 14.
 The processor 11 executes programs stored in the memory 12. The memory 12 includes a ROM, which is a nonvolatile storage element, and a RAM, which is a volatile storage element. The ROM stores immutable programs (for example, a BIOS). The RAM is a fast, volatile storage element such as a DRAM (Dynamic Random Access Memory) and temporarily stores the programs executed by the processor 11 and the data used during their execution.
 The auxiliary storage device 13 is a large-capacity, nonvolatile storage device such as a magnetic storage device (HDD) or flash memory (SSD), and stores the programs executed by the processor 11 and the data used during their execution (for example, map data). That is, a program is read from the auxiliary storage device 13, loaded into the memory 12, and executed by the processor 11.
 The drawing creation system may have an input interface 15 and an output interface 18. The input interface 15 is an interface to which a keyboard 16, a mouse 17, and the like are connected and which receives input from the operator. The output interface 18 is an interface to which a display device 19, a printer, or the like is connected and which outputs program execution results in a format that the operator can view.
 The communication interface 14 is a network interface device that controls communication with other devices according to a predetermined protocol. The drawing creation system may be connected to a terminal (not shown) via the communication interface 14, operate according to instructions input from the terminal, and output computation results to the terminal.
 プロセッサ11が実行するプログラムは、リムーバブルメディア(CD-ROM、フラッシュメモリなど)又はネットワークを介して図面作成システムに提供され、非一時的記憶媒体である不揮発性の記憶装置13に格納される。このため、図面作成システムは、リムーバブルメディアからデータを読み込むインターフェースを有するとよい。 The program executed by the processor 11 is provided to the drawing creation system via a removable medium (CD-ROM, flash memory, etc.) or a network, and is stored in a nonvolatile storage device 13 which is a non-temporary storage medium. For this reason, the drawing creation system may have an interface for reading data from a removable medium.
 図面作成システムは、物理的に一つの計算機上で、又は、論理的又は物理的に構成された複数の計算機上で構成される計算機システムであり、同一の計算機上で別個のスレッドで動作してもよく、複数の物理的計算機資源上に構築された仮想計算機上で動作してもよい。 A drawing creation system is a computer system that is configured on one computer or a plurality of computers that are logically or physically configured, and operates on separate threads on the same computer. Alternatively, it may operate on a virtual machine constructed on a plurality of physical computer resources.
 FIG. 4 is a flowchart of the processing executed by the drawing creation system of this embodiment.
 First, the reference surface height setting unit 102 executes a reference surface height setting process that receives a height setting from the operator (S102).
 Next, the reference surface extraction unit 103 executes a reference surface extraction process that extracts, from the shape data 101, a surface that lies at or above the height set in the reference surface height setting process S102 and serves as the reference for shape restoration (S103).
 Meanwhile, the shape data display unit 104 executes a shape data display process that displays the shape data 101 on the display device 19 (S104). The shape data selection unit 105 executes a shape data selection process that receives the range the operator selected from the shape data displayed on the screen, selects the shape data within the selected range, and outputs it to the reference surface extraction unit 103 (S105). The reference surface extraction unit 103 may use the surface formed by the shape data in the range selected in the shape data selection process S105 as the reference surface.
 The reference normal estimation unit 106 executes a reference normal estimation process that estimates the normal vector (reference normal) of the shape data on the reference surface extracted in the reference surface extraction process S103 (S106).
 Then, the artifact shape extraction unit 107 executes an artifact shape extraction process that extracts partial shape data lying within a predetermined range, in the direction of the estimated reference normal, from the reference surface extracted in the reference surface extraction process S103 (S107).
 Then, the two-dimensional projection unit 108 executes a two-dimensional projection process that projects the shape data extracted in the artifact shape extraction process S107 onto a two-dimensional image plane parallel to the reference surface (S108).
 Then, the shape extraction unit 109 executes a shape extraction process that extracts line shapes from the two-dimensional image and stores the extracted shapes as the two-dimensional shape data 110 (S109).
 In addition, the image display unit 111 executes a process that displays the two-dimensional image on the screen (S111). The shape editing unit 112 executes a shape editing process that receives the operator's designation of the two-dimensional shape data of artifacts perpendicular to the reference surface and outputs it to the shape extraction unit 109 (S112). The shape extraction unit 109 may extract the two-dimensional shape data 110 according to the operator's designation.
 Meanwhile, the corresponding reference surface extraction unit 113 executes a corresponding reference surface extraction process that extracts, as the corresponding reference surface, a region separated from the extracted reference surface by a predetermined distance in the direction of the estimated reference normal (S113). The corresponding reference surface height setting unit 114 executes a corresponding reference surface height setting process that receives the setting of the predetermined distance from the operator and outputs it to the corresponding reference surface extraction unit 113 (S114).
 Then, the shape restoration unit 115 executes a shape restoration process that takes the reference surface as the top surface and the corresponding reference surface as the bottom surface, and converts the two-dimensional shape data 110 extracted in the shape extraction process S109 (or the shape of the reference surface) into three-dimensional information by sweeping it between the reference surface and the corresponding reference surface along the normal direction of the reference surface (S115).
 FIG. 5 is a diagram showing the measurement of surrounding shapes by the shape measurement unit 100.
 As shown in FIG. 5(A), the shape measurement unit 100 emits laser light in various directions and measures the time until the reflected light returns, thereby measuring the distance to an object in the direction of emission. By emitting laser light in all surrounding directions and repeatedly measuring the time until the reflected light is received, the shape measurement unit 100 can measure the shapes of all objects in its surroundings.
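The time-of-flight ranging described above can be sketched in a few lines: the distance is half the pulse round-trip time multiplied by the speed of light, and one measurement plus the beam direction yields one 3-D point. This is a minimal illustration, not part of the patent; the function names are my own.

```python
# Time-of-flight ranging sketch (illustrative; names are not from the patent).
C = 299_792_458.0  # speed of light [m/s]

def tof_distance(round_trip_time_s: float) -> float:
    """Distance to the reflecting object: half the round trip at light speed."""
    return C * round_trip_time_s / 2.0

def scan_point(origin, direction, round_trip_time_s):
    """Convert one ranging measurement into a 3-D point.

    `direction` is assumed to be a unit vector along the emitted beam.
    """
    d = tof_distance(round_trip_time_s)
    return [o + d * u for o, u in zip(origin, direction)]
```

Repeating `scan_point` over all beam directions produces the point cloud that the rest of the pipeline consumes.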
 When the shape measurement unit 100 is installed indoors and measures the surrounding shapes, entities 201 such as the ceiling, walls, and pillars, and entities 202 installed indoors such as desks, chairs, shelves, and boards are measured as sets of points 203, 204, and 205 corresponding to the laser irradiation points, as shown in FIG. 5(B).
 Note that the point cloud 203 shown in FIG. 5 is drawn with the point clouds of the near and right surfaces, which would actually be measured, omitted so that the interior entities are visible in the figure. Also, for ease of understanding, rough shapes 204 such as walls are drawn with large points, and small indoor entities 205 are drawn with small points to show their fine shapes. In reality, the size of the laser irradiation points is nearly the same over the entire region, and the point density is also nearly the same. In this embodiment, however, the measurement density may differ locally (for example, from region to region).
 Merely measuring the surroundings yields no information on the relationship between measurement points and objects. Furthermore, regions the laser light cannot reach directly (for example, shapes hidden behind shelves or boards) cannot be measured. Since only a small area of the wall behind a large entity such as a shelf or board can be measured, it is difficult to determine that a wall exists in that area. In addition, some regions are blocked by entities standing on the floor, such as desks and chairs, so measuring the entire floor surface is difficult. By contrast, at high places such as the ceiling, the laser light is rarely blocked by obstructions, so the shape of high places can be measured easily. For this reason, this embodiment restores the overall shape based on the shape of high places.
 FIG. 6 is a diagram showing a screen for selecting the range that the shape data selection unit 105 receives.
 When the operator manually identifies the reference surface (ceiling) for the drawing creation system, the shape data display unit 104 displays the shape data 101 on the display device 19. The operator operates an input device such as the mouse 17 to designate a desired range 300. The shape data selection unit 105 receives the range 300 designated by the operator and selects the shape data 301 within the designated range. More specifically, the shape data selection unit 105 has a mode switching unit that switches between a shape selection mode and a shape deselection mode. When the operator designates a range in the shape selection mode, the shape data within the range becomes selected. Conversely, when the operator designates a range in the shape deselection mode, the shape data within the range becomes deselected.
 Alternatively, when the range designated by the operator contains a plurality of objects, the shape data selection unit 105 may automatically extract the entity in front and deselect the entities behind it. To extract an entity, a plane extraction algorithm such as the Hough transform divides the shape data in the selected range into a plurality of surfaces and extracts the surface of each entity. Then, by choosing, from the extracted surfaces, the one closest to the viewer on the display screen, only the frontmost entity can be extracted. The frontmost surface may also be chosen by the operator deselecting the rear point cloud and selecting the frontmost points.
 FIG. 7 is a diagram illustrating the extraction of the reference surface by the reference surface extraction unit 103.
 After the reference surface height setting unit 102 receives the height setting from the operator, the reference surface extraction unit 103 uses a plane search algorithm such as the Hough transform to extract planes that lie at or above the received height. At this point, simply extracting planes with a plane extraction method such as the Hough transform may yield multiple plane candidates. Therefore, when the shape data selection unit 105 has received a selected shape 400, only the plane 401 containing the selected shape 400 is extracted.
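The candidate-disambiguation step above can be sketched as follows: given several candidate planes, keep the one best supported by the operator-selected points. This is an illustrative numpy sketch under the assumption that each candidate is represented by a point on the plane and a unit normal; the function name and `tol` threshold are my own, not from the patent.

```python
import numpy as np

def pick_reference_plane(candidate_planes, selected_points, tol=0.05):
    """Return the candidate plane best supported by the selected points.

    Each candidate is a (point_on_plane, unit_normal) pair; support counts
    the selected points within `tol` metres of the plane.
    """
    def support(plane):
        p0, n = plane
        dist = np.abs((selected_points - p0) @ n)  # point-to-plane distances
        return np.count_nonzero(dist < tol)
    return max(candidate_planes, key=support)
```

With two horizontal candidates at different heights, the one containing the operator-selected ceiling points wins the support vote.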
 FIG. 8 is a diagram illustrating the projection of entities onto a two-dimensional image by the two-dimensional projection unit 108.
 First, the reference normal estimation unit 106 estimates the direction of the reference normal 500 of the reference surface 401 extracted by the reference surface extraction unit 103. For example, the normal direction can be estimated by fitting a plane to the shape on the reference surface using the least squares method. To make the fit more robust, a robust estimation method such as RANSAC may be used.
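A common way to realize the least-squares plane fit mentioned above is via SVD (equivalently, PCA): the normal is the direction of least variance of the centered points. This is an illustrative sketch of that standard technique, not the patent's specific implementation.

```python
import numpy as np

def estimate_plane_normal(points: np.ndarray) -> np.ndarray:
    """Least-squares plane normal of an (N, 3) point array via SVD.

    The right singular vector for the smallest singular value of the
    centered points is the direction of least variance, i.e. the normal.
    """
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]          # singular values are sorted descending
    return normal / np.linalg.norm(normal)
```

For a RANSAC variant, the same fit would simply be repeated on random subsets and scored by inlier count.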
 Next, as shown in FIG. 8(A), the artifact shape extraction unit 107 extracts the shape data 502 lying within a predetermined range 501 from the reference surface 401 in the direction of the estimated reference normal 500. Here, when the ceiling is the reference surface 401, the predetermined range 501 is a region in which few objects other than walls exist. For example, regions far from the ceiling contain entities other than walls, such as desks and chairs, so extracting shapes in such regions reduces the robustness of shape extraction. Conversely, if a range extremely close to the ceiling is selected, objects suspended from the ceiling may be detected. To avoid these objects, a range of 20 cm to 100 cm from the ceiling is a suitable choice for the predetermined range 501. Any range containing few entities other than those standing perpendicular to the reference surface 401 will do.
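The band extraction just described reduces to a signed-distance filter along the reference normal. Below is a minimal numpy sketch, assuming the normal points from the ceiling into the room; the function name and defaults (0.2 m to 1.0 m, matching the 20 cm to 100 cm example above) are illustrative.

```python
import numpy as np

def band_below_ceiling(points, ceiling_point, into_room_normal,
                       near=0.2, far=1.0):
    """Keep points whose distance from the reference surface, measured
    along the normal pointing into the room, lies within [near, far]."""
    d = (points - ceiling_point) @ into_room_normal  # signed distances
    return points[(d >= near) & (d <= far)]
```

Calling this twice with different `[near, far]` bands corresponds to extracting a plurality of predetermined ranges, as described below.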
 An interface for setting the predetermined range 501 may be provided in which the operator enters a numeric value on an input screen, the points within the range corresponding to the entered value are displayed in a predetermined color (for example, red), and the operator confirms the range on the screen and then presses an "OK" button.
 Furthermore, this process may be repeated two or more times to extract a plurality of predetermined ranges 501. By extracting multiple ranges, even when objects other than walls and pillars exist in some range below the ceiling, the predetermined ranges 501 can be set to avoid those objects.
 As shown in FIG. 8(B), the two-dimensional projection unit 108 projects the shape 503 extracted by the artifact shape extraction unit 107 onto a two-dimensional plane whose normal coincides with the direction of the reference normal 500 of the reference surface 401, generating a two-dimensional image 504. Note that the point cloud shown in FIG. 8(A) is drawn with the near and right walls omitted, as described above; in reality, point clouds exist in those regions as well. For this reason, the two-dimensional image (FIG. 8(B)) also shows the point clouds corresponding to the near wall and the right wall.
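Projecting onto a plane with the given normal amounts to building an orthonormal basis spanning that plane and taking dot products, discarding the coordinate along the normal. This is a generic sketch of that operation, not the patent's specific code; the helper-vector trick for choosing the basis is a standard idiom.

```python
import numpy as np

def project_to_plane(points, normal):
    """Project (N, 3) points onto 2-D coordinates of a plane with the
    given normal; the offset along the normal is simply discarded."""
    normal = normal / np.linalg.norm(normal)
    # Any vector not parallel to the normal spans the plane with it.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(normal @ helper) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    u = np.cross(normal, helper)
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)
    return np.stack([points @ u, points @ v], axis=1)
```

Points that differ only along the normal (for example, a wall sampled at several heights) collapse onto the same 2-D coordinate, which is exactly why vertical artifacts become lines in the image.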
 FIG. 9 is a diagram illustrating the extraction of the two-dimensional shape data 110 from the two-dimensional image onto which the artifacts have been projected.
 Since the shape 503 extracted by the artifact shape extraction unit 107 extends perpendicular to the reference surface, it appears as lines in the two-dimensional image. Specifically, a wall becomes a straight line, a cylinder a circle, and a prism a rectangle. The shape extraction unit 109 searches the two-dimensional image for line shapes 600 and extracts the found line shapes as two-dimensional shape data. Specifically, lines can be extracted automatically from the two-dimensional image using the Hough transform. To extract the contour of an entire wall composed of multiple lines, the active contour method can be used.
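The Hough line detection named above can be illustrated with a minimal accumulator: each foreground pixel votes for every line through it, parameterized as rho = x·cos(theta) + y·sin(theta), and the accumulator peak is the best-supported line. This toy sketch (my own, not the patent's implementation) returns only the single strongest line.

```python
import numpy as np

def strongest_hough_line(binary_image, n_theta=180):
    """Vote each foreground pixel into a (rho, theta) accumulator and
    return (rho, theta) of the best-supported straight line."""
    h, w = binary_image.shape
    diag = int(np.ceil(np.hypot(h, w)))           # max possible |rho|
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    ys, xs = np.nonzero(binary_image)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1  # one vote per theta
    r_idx, t_idx = np.unravel_index(np.argmax(acc), acc.shape)
    return r_idx - diag, thetas[t_idx]
```

A production system would instead threshold the accumulator to return several lines, which is the multi-wall case handled by the active contour method in the text.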
 The image display unit 111 displays the two-dimensional image, including the extracted two-dimensional shape data, on the display device 19. The operator can correct the two-dimensional shape data by operating an input device such as the mouse 17. For example, when an object has been falsely detected due to noise or the like, or when an object has been missed, the shape editing unit 112 accepts corrections of the two-dimensional shape data through the operator's actions. Specifically, to add a wall, a line 601 can be added by clicking its start and end points. To delete a wall, the line 602 can be deleted by clicking any point on it. Cylinders and prisms can also be added by designating a shape and multiple points.
 Furthermore, the shapes that can be extracted and edited are not limited to the objects described above, as long as they appear as lines in the two-dimensional image.
 FIG. 10 shows a screen, handled by the corresponding reference surface height setting unit 114, for setting the distance 701 from the reference surface 401 to the corresponding reference surface 700.
 As described above, because multiple obstructions are placed on an indoor floor, the floor surface often cannot be measured directly. Therefore, with the ceiling as the reference surface 401 and the floor as the corresponding reference surface 700, the corresponding reference surface 700 is determined to have the same shape as the reference surface 401. A surface with the same shape as the reference surface 401 is generated at a position separated from it by a predetermined distance 701 in the direction of the reference normal 500, and the generated surface becomes the corresponding reference surface 700. The corresponding reference surface 700 is preferably placed within the range over which the surfaces measured by the shape measurement unit 100 are distributed.
 For the distance from the reference surface 401, the operator enters a value (the ceiling height). Alternatively, an interface may be provided in which the operator, by operating an input device such as the mouse 17 (for example, dragging and dropping the reference surface 401), designates the region corresponding to the corresponding reference surface on the screen while adjusting its height. Also, if the floor has few obstructions and can be recognized as a plane, a plane lying vertically below may be extracted using a plane extraction method such as the Hough transform, and the shape of the reference surface may be fitted at the height of the extracted plane.
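Generating the corresponding reference surface is a rigid shift of the ceiling shape along the reference normal by the room height. A one-function numpy sketch, assuming the normal points from ceiling toward floor (names are illustrative, not the patent's):

```python
import numpy as np

def corresponding_reference_surface(ceiling_points, reference_normal,
                                    ceiling_height):
    """Floor ('corresponding reference surface'): a copy of the ceiling
    shape shifted by the ceiling height along the reference normal."""
    n = np.asarray(reference_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return np.asarray(ceiling_points, dtype=float) + ceiling_height * n
```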
 FIG. 11 is a diagram illustrating the restoration of the overall shape from the reference surface 401, the corresponding reference surface 700, and the two-dimensional shape data 800 by the shape restoration unit 115.
 The two-dimensional shape data 800 becomes the two-dimensional shape data 801 in three-dimensional space according to the direction of the reference normal 500 used when the two-dimensional projection unit 108 projected the shape data (C, D). By sweeping this two-dimensional shape data 801 between the reference surface 401 and the corresponding reference surface 700 in the direction of the reference normal 500, the two-dimensional shape is extruded into a three-dimensional shape 803 with volume (E). By combining the three-dimensional shape 803 thus created with the reference surface 401 (A, B) and the corresponding reference surface 700 (A, B), the shape 806 of the room can be restored (F).
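For the simple case of a horizontal ceiling, the extrusion step above turns each 2-D wall line into a 3-D rectangle spanning ceiling and floor. A minimal sketch under that assumption (axis-aligned normal; names are mine, not the patent's):

```python
def extrude_wall_segment(p0, p1, ceiling_z, floor_z):
    """Extrude a 2-D wall line from the drawing into a 3-D rectangle
    between ceiling and floor, returned as four corner vertices."""
    (x0, y0), (x1, y1) = p0, p1
    return [
        (x0, y0, ceiling_z), (x1, y1, ceiling_z),
        (x1, y1, floor_z), (x0, y0, floor_z),
    ]
```

Applying this to every extracted line, then adding the ceiling and floor polygons, yields the closed room shape 806.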
 As explained above, surfaces at high places can be measured easily because they are rarely occluded by artifacts, and artifacts such as walls and pillars have the property of extending vertically, which this method exploits. Therefore, according to this embodiment, shapes can be measured easily even in heavily occluded environments with no prior information about the shape, and robust drawings can be created from the measured shape data.
 Also, because a drawing is created that represents the shape obtained by extruding the two-dimensional shape extracted by the shape extraction unit 109 from the position of the reference surface 401 along the direction of the reference normal 500 by the set length 701, the contour of the measurement target becomes clear and the accuracy of the drawing can be improved.
 Also, because the system has the shape editing unit 112, which performs at least one of adding a two-dimensional shape to the image displayed on the display device 19 by the image display unit 111 and correcting the two-dimensional shape extracted by the shape extraction unit 109, point clouds can be extracted reliably.
 Also, because the artifact shape extraction unit 107 extracts shape data within two or more set ranges 501 along the direction of the reference normal 500 from the reference surface 401, two-dimensional shapes can be extracted accurately and the accuracy of the drawing can be improved even in environments where obstructions are arranged in complicated ways.
 Also, because the shape extraction unit 109 extracts line shapes from the two-dimensional image created by the two-dimensional projection unit 108, highly accurate drawings can be created with a small amount of computation.
 Also, the corresponding reference surface extraction unit 113 extracts, as the corresponding reference surface (floor) 700, the shape data located at the set distance 701 from the reference surface 401, and the shape restoration unit 115 creates a drawing that represents the shape obtained by extruding the two-dimensional shape extracted by the shape extraction unit 109 from the position of the reference surface 401 to the position of the corresponding reference surface 700 in the direction of the reference normal 500, together with the region enclosed by the reference surface 401 and the corresponding reference surface 700, so the floor surface can be determined accurately.
 Because the corresponding reference surface height setting unit 114 receives the distance set by the operator, and the corresponding reference surface extraction unit 113 extracts a surface existing within a set range from the position separated from the reference surface 401 by the distance set by the corresponding reference surface height setting unit 114, the floor surface can be determined accurately.
 Also, the system has the shape data display unit 104, which displays the shape data on the screen, and the corresponding reference surface height setting unit 114, which receives a selection of shapes contained in a region of the shape data displayed on the display device 19 by the shape data display unit 104; because the corresponding reference surface extraction unit 113 extracts the corresponding reference surface 700 so as to include the selected shape data, the floor surface can be determined accurately.
 Also, because the reference surface extraction unit 103 extracts the reference surface (ceiling) 401 so as to include the shape data selected by the shape data selection unit 105, the ceiling can be selected accurately.
 Also, when there are multiple entities within the selected range, the shape data selection unit 105 selects the frontmost entity, which simplifies the operator's selection work.
 <Second embodiment>
 In the second embodiment, a method is described in which the two-dimensional projection unit 108 creates a histogram according to the number of points extracted by the artifact shape extraction unit 107, and the shape extraction unit 109 extracts the two-dimensional shape data 110 from it. In the second embodiment, only the differences from the first embodiment are described; descriptions of identical configurations and functions are omitted.
 FIG. 12 is a diagram showing the creation of the histogram in the second embodiment.
 As shown in FIG. 12(A), the two-dimensional projection unit 108 projects the shape data 903 within a range 902 designated along the direction of the reference normal 500 of the reference surface 401 onto a two-dimensional image 904. The range 902 need not span the entire distance from ceiling to floor; it is sufficient that it contains enough of the information to be included in the drawing. In the second embodiment, a histogram value is computed according to the number of points in the region corresponding to each pixel of the two-dimensional image 904, and the color of each pixel is determined according to its histogram value to create the two-dimensional image 904.
 As shown in FIG. 12(B), in regions where a wall is measured, points are measured at the same position over the wide span from ceiling to floor, so the histogram value is high (905). In contrast, a wall located behind an entity placed indoors yields fewer measurement points than a widely measured wall, so its histogram value is somewhat lower (906). Furthermore, for each entity, the number of measurement points is determined by its height, and the histogram value follows the number of measurement points. For example, a position where both a board and its legs are measured has a high histogram value (907), while a position where only the board is measured has a low histogram value (908).
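The per-pixel counting just described is a 2-D histogram of the projected points: a cell crossed by a full-height wall collects many points, a cell covered only by a low object collects few. An illustrative numpy sketch (the cell size and extent are arbitrary assumptions of mine):

```python
import numpy as np

def projection_histogram(points_xy, cell=0.1, size=5.0):
    """Per-cell point counts of projected (N, 2) shape data.

    Tall vertical structures accumulate many points in one cell, so the
    counts serve as weights for the subsequent shape extraction.
    """
    bins = np.arange(0.0, size + cell, cell)  # shared x/y bin edges
    hist, _, _ = np.histogram2d(points_xy[:, 0], points_xy[:, 1],
                                bins=[bins, bins])
    return hist
```

The resulting array is exactly the weight map used by the histogram-weighted Hough transform or active contour method described next.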
 The shape extraction unit 109 then extracts the two-dimensional shape using the two-dimensional image 904 created by the two-dimensional projection unit 108. In doing so, the two-dimensional shape is extracted by a Hough transform or active contour method weighted by the histogram value of each pixel.
 In the second embodiment, the two-dimensional projection unit 108 creates a histogram according to the number of points extracted by the artifact shape extraction unit 107, and the shape extraction unit 109 extracts the two-dimensional shape using the histogram. As a result, even in environments where obstructions stand against the walls, shapes can be extracted more robustly than in the first embodiment, and robust drawings can be created.
 <Third embodiment>
 In the third embodiment, an example is described in which the drawing creation system of the preceding embodiments is used to extract a pipe 1000 connected to a wall. In the third embodiment, only the differences from the preceding embodiments are described; descriptions of identical configurations and functions are omitted.
 FIG. 13 is a diagram illustrating the extraction of a pipe connected to a wall.
 The reference surface extraction unit 103 extracts a wall 1001 located at a high place as the reference surface. The normal direction 1002 estimated by the reference normal estimation unit 106 is the same as the direction of the pipe 1000 extending perpendicularly from the wall. Using the extracted reference surface 1001 and normal direction 1002, the pipe projected by the two-dimensional projection unit 108 appears as a circular shape 1004 in the two-dimensional image 1003. The shape extraction unit 109 can extract the two-dimensional shape data 110 of the pipe by extracting the circular shape 1004 with a Hough transform, or by the operator designating it directly in the shape editing unit 112.
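Besides the circle Hough transform named above, the projected pipe cross-section can also be recovered with a simple algebraic least-squares circle fit (the Kåsa method). This is an alternative illustrative technique, not the patent's stated method: it solves x² + y² = 2·cx·x + 2·cy·y + c, with radius r = √(c + cx² + cy²).

```python
import numpy as np

def fit_circle(points_xy):
    """Kasa least-squares circle fit of (N, 2) points: returns (cx, cy, r)."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return cx, cy, np.sqrt(c + cx ** 2 + cy ** 2)
```

The fitted center and radius give the pipe's cross-section in the drawing, which the shape restoration unit then extends along the wall normal.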
By extending the extracted two-dimensional shape data of the pipe by a length specified by the operator, the shape restoration unit 115 can convert the pipe into drawing information.
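A toy sketch of this extrusion step, under the simplifying assumption that the reference surface is parallel to the x-y plane so the 2D profile coordinates map directly to in-plane x and y (the system itself handles arbitrary plane orientations via the estimated normal):

```python
def extrude(profile_2d, origin, normal, length):
    """Sweep a 2D profile along the reference surface's unit normal,
    producing the start and end cross-sections of the extruded solid."""
    nx, ny, nz = normal
    start = [(origin[0] + x, origin[1] + y, origin[2]) for x, y in profile_2d]
    end = [(px + nx * length, py + ny * length, pz + nz * length)
           for px, py, pz in start]
    return start, end

# A unit-square profile on a plane at z = 3, extruded 2 units along +z.
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
start, end = extrude(square, origin=(5.0, 5.0, 3.0),
                     normal=(0.0, 0.0, 1.0), length=2.0)
```

For the pipe case, the profile would be the extracted circle and the length either the operator-specified value (this embodiment) or the distance to a corresponding reference surface (claim 7).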
In the third embodiment, pipes can be detected particularly well in regions with few obstructions, such as high places.
The present invention is not limited to the embodiments described above and includes various modifications and equivalent configurations within the spirit of the appended claims. For example, the embodiments have been described in detail for ease of understanding, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment may be replaced with that of another embodiment, the configuration of one embodiment may be added to that of another, and other elements may be added to, deleted from, or substituted for part of the configuration of each embodiment.
Each of the configurations, functions, processing units, processing means, and the like described above may be realized partly or entirely in hardware, for example by designing them as integrated circuits, or in software, by a processor interpreting and executing programs that realize the respective functions.
Information such as the programs, tables, and files that realize each function can be stored in a storage device such as a memory, a hard disk, or an SSD (Solid State Drive), or on a recording medium such as an IC card, an SD card, or a DVD.
The control lines and information lines shown are those considered necessary for explanation; not all control lines and information lines required in an implementation are necessarily shown. In practice, almost all components may be considered to be interconnected.

Claims (15)

1. A drawing creation system for creating a shape drawing from results of measurement of surroundings by a measurement unit, comprising:
a reference surface height setting unit that receives a setting of a height at which a reference surface is to be extracted;
a reference surface extraction unit that extracts the reference surface at the high place of the height set by the reference surface height setting unit;
a reference normal estimation unit that estimates a reference normal, which is a normal of the reference surface; and
a shape restoration unit that creates a drawing representing a shape obtained by extending the shape of the reference surface, as a two-dimensional shape, from the position of the reference surface in the direction of the reference normal to a set length.
2. The drawing creation system according to claim 1, further comprising:
an artifact shape extraction unit that extracts shape data existing within a set range in the direction of the reference normal from the reference surface;
a two-dimensional projection unit that creates a two-dimensional image by projecting the shape data extracted by the artifact shape extraction unit onto a plane parallel to the reference surface; and
a shape extraction unit that extracts a two-dimensional shape from the two-dimensional image created by the two-dimensional projection unit,
wherein the shape restoration unit creates a drawing representing a shape obtained by extending the two-dimensional shape extracted by the shape extraction unit, instead of the shape of the reference surface, from the position of the reference surface in the direction of the reference normal to the set length.
3. The drawing creation system according to claim 2, further comprising:
an image display unit that displays the image created by the two-dimensional projection unit on a screen; and
a shape editing unit that executes at least one of adding a two-dimensional shape to the image displayed by the image display unit and correcting the two-dimensional shape extracted by the shape extraction unit.
4. The drawing creation system according to claim 2, wherein the artifact shape extraction unit extracts shape data within two or more of the set ranges in the direction of the reference normal from the reference surface.
5. The drawing creation system according to claim 2, wherein the two-dimensional projection unit creates a histogram according to the number of shape data extracted by the artifact shape extraction unit, and the shape extraction unit extracts a two-dimensional shape using the histogram.
6. The drawing creation system according to claim 2, wherein the shape extraction unit extracts a line shape from the two-dimensional image created by the two-dimensional projection unit.
7. The drawing creation system according to claim 2, further comprising a corresponding reference surface extraction unit that extracts, as a corresponding reference surface, shape data at a position separated from the reference surface by a set distance,
wherein the shape restoration unit creates a drawing representing a shape obtained by extending the two-dimensional shape extracted by the shape extraction unit from the position of the reference surface in the direction of the reference normal to the position of the corresponding reference surface, together with the region enclosed by the reference surface and the corresponding reference surface.
8. The drawing creation system according to claim 7, further comprising a corresponding reference surface height setting unit that receives a distance set by an operator,
wherein the corresponding reference surface extraction unit extracts a surface within a set range from the position separated from the reference surface by the distance set by the corresponding reference surface height setting unit.
9. The drawing creation system according to claim 7, further comprising:
a shape data display unit that displays shape data on a screen; and
a corresponding reference surface height setting unit that receives a selection of the shape data displayed by the shape data display unit,
wherein the corresponding reference surface extraction unit extracts the corresponding reference surface so as to include the selected shape data.
10. The drawing creation system according to claim 7, wherein the corresponding reference surface extraction unit extracts a floor surface.
11. The drawing creation system according to any one of claims 1 to 10, further comprising:
a shape data display unit that displays shape data on a screen; and
a shape data selection unit that receives a selection of the shape data displayed on the shape data display unit,
wherein the reference surface extraction unit extracts the reference surface so as to include the selected shape data.
12. The drawing creation system according to claim 11, wherein, when a plurality of entities are identified from the shape data within a selected range, the shape data selection unit selects the entity located nearest to the front.
13. The drawing creation system according to any one of claims 1 to 10, wherein the reference surface extraction unit extracts a ceiling.
14. A method of creating a shape drawing from results of measurement of surroundings by a measurement unit using a computer, the computer having a processor that executes a program and a memory that stores the program, the method comprising:
a reference surface height setting step in which the processor receives a setting of a height at which a reference surface is to be extracted;
a reference surface extraction step in which the processor extracts the reference surface at the high place of the height set in the reference surface height setting step;
a reference normal estimation step in which the processor estimates a reference normal, which is a normal of the reference surface; and
a shape restoration step in which the processor creates a drawing representing a shape obtained by extending the shape of the reference surface, as a two-dimensional shape, from the position of the reference surface in the direction of the reference normal to a set length.
15. The drawing creation method according to claim 14, further comprising:
an artifact shape extraction step in which the processor extracts shape data existing within a set range in the direction of the reference normal from the reference surface;
a two-dimensional projection step in which the processor creates a two-dimensional image by projecting the shape data extracted in the artifact shape extraction step onto a plane parallel to the reference surface; and
a shape extraction step in which the processor extracts a two-dimensional shape from the two-dimensional image created in the two-dimensional projection step,
wherein, in the shape restoration step, a drawing is created representing a shape obtained by extending the two-dimensional shape extracted in the shape extraction step, instead of the shape of the reference surface, from the position of the reference surface in the direction of the reference normal to the set length.
PCT/JP2015/054481 2015-02-18 2015-02-18 Drawing creation system and drawing creation method WO2016132489A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2016557092A JP6227801B2 (en) 2015-02-18 2015-02-18 Drawing creation system and drawing creation method
PCT/JP2015/054481 WO2016132489A1 (en) 2015-02-18 2015-02-18 Drawing creation system and drawing creation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2015/054481 WO2016132489A1 (en) 2015-02-18 2015-02-18 Drawing creation system and drawing creation method

Publications (1)

Publication Number Publication Date
WO2016132489A1 true WO2016132489A1 (en) 2016-08-25

Family

ID=56692496

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2015/054481 WO2016132489A1 (en) 2015-02-18 2015-02-18 Drawing creation system and drawing creation method

Country Status (2)

Country Link
JP (1) JP6227801B2 (en)
WO (1) WO2016132489A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10111882A (en) * 1996-10-04 1998-04-28 Honda Motor Co Ltd Three-dimensional cad system and method for conversion from two-dimensional cad drawing to three-dimensional cad drawing
JP2013058106A (en) * 2011-09-08 2013-03-28 Toshiba Plant Systems & Services Corp Three-dimensional cad data creation system and three-dimensional cad data creation method
JP2014186565A (en) * 2013-03-25 2014-10-02 Geo Technical Laboratory Co Ltd Analysis method of three-dimensional point group


Also Published As

Publication number Publication date
JP6227801B2 (en) 2017-11-08
JPWO2016132489A1 (en) 2017-04-27


Legal Events

Date Code Title Description
ENP Entry into the national phase (Ref document number: 2016557092; Country of ref document: JP; Kind code of ref document: A)
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 15882592; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 15882592; Country of ref document: EP; Kind code of ref document: A1)