WO2014192316A1 - Modeling device, three-dimensional model generation device, modeling method, program, and layout simulator - Google Patents
- Publication number
- WO2014192316A1 (PCT/JP2014/002895)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- plane
- measurement
- modeling
- unit
- extraction unit
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F30/00—Computer-aided design [CAD]
- G06F30/10—Geometric CAD
- G06F30/13—Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/04—Architectural design, interior design
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
Definitions
- the present invention relates to a modeling apparatus that generates a model of a three-dimensional object using a measurement result obtained by performing three-dimensional measurement of the three-dimensional object. Furthermore, the present invention relates to a three-dimensional model generation device that generates a model of a three-dimensional object in real space, a modeling method that generates a model of a three-dimensional object, a program that realizes the modeling device, and a layout simulator that uses the modeling device.
- Reference 1 classifies plane adjacencies, based on the positions and inclinations of the planes, into creases, step boundaries, and in-plane boundaries, and calculates the plane boundaries and intersection lines to obtain the indoor structural lines. That is, the configuration described in Reference 1 extracts creases, step boundaries, and in-plane boundaries in order to extract the indoor structural lines, including objects already arranged in the target space.
- Reference 1 describes that, by extracting structural lines in this way, a structural line that is occluded by a small obstacle and cannot be detected from an image can be restored.
- Reference 1 also describes that the structural lines can be stably restored even when part of a region is missing due to noise or a small obstacle.
- However, the technique described in Reference 1 treats a large object such as a desk as a constituent element of the three-dimensional model. For purposes such as re-papering wall cloth or installing heat-insulating material, it cannot generate a model of the three-dimensional object from which objects placed indoors have been removed.
- The present invention provides a modeling apparatus capable of generating a model of a three-dimensional object even when part of the three-dimensional object is occluded by a relatively large object and there is a region that the measurement apparatus cannot measure.
- Another object of the present invention is to provide a three-dimensional model generation apparatus that generates a model of a three-dimensional object in real space, a modeling method for generating a model of a three-dimensional object, a program that realizes the modeling apparatus, and a layout simulator that uses the modeling apparatus.
- The modeling apparatus according to the present invention includes: an acquisition unit that acquires, as first data, three-dimensional coordinate values of a plurality of measurement points belonging to a three-dimensional object from a measurement apparatus that performs three-dimensional measurement of the three-dimensional object having a plurality of surfaces; a surface extraction unit that, for each surface, generates a plane expression representing the surface using the first data of the measurement points belonging to that surface; a vertex extraction unit that, by calculating points that simultaneously satisfy the plane expressions representing a plurality of adjacent surfaces, extracts those points as vertices shared by the plurality of surfaces; and a model generation unit that generates information on a model representing the three-dimensional object using the plane expressions and the vertices.
- The surface extraction unit takes as a candidate surface the plane defined by a plane equation obtained using the coordinate values of three measurement points belonging to one surface, and determines the plane expression by performing a recalculation using a plurality of measurement points that exist within a predetermined distance range from the candidate surface.
- a three-dimensional model generation device includes a modeling device and the measurement device.
- In the modeling method according to the present invention, an acquisition unit acquires, as first data, three-dimensional coordinate values of a plurality of measurement points belonging to a three-dimensional object from a measurement device that performs three-dimensional measurement of the three-dimensional object having a plurality of surfaces. Then, for each surface, a surface extraction unit generates a plane expression representing the surface using the first data of the measurement points belonging to that surface. Next, by calculating points that simultaneously satisfy the plane expressions representing a plurality of adjacent surfaces, a vertex extraction unit extracts those points as vertices shared by the plurality of surfaces.
- A model generation unit then generates information on a model representing the three-dimensional object using the plane expressions and the vertices. The surface extraction unit takes as a candidate surface the plane defined by a plane equation obtained using the coordinate values of three measurement points belonging to one surface, and determines the plane expression by performing a recalculation using a plurality of measurement points within a predetermined distance of the candidate surface.
- The program according to the present invention causes a computer to function as a modeling device including: an acquisition unit that acquires, as first data, three-dimensional coordinate values of a plurality of measurement points belonging to a three-dimensional object from a measuring device that performs three-dimensional measurement of the three-dimensional object having a plurality of surfaces; a surface extraction unit that, for each surface, generates a plane expression representing the surface using the first data of the measurement points belonging to that surface; a vertex extraction unit that extracts vertices shared by a plurality of adjacent surfaces; and a model generation unit that generates information on a model representing the three-dimensional object.
- In this program, the surface extraction unit takes as a candidate surface the plane defined by a plane equation obtained using the coordinate values of three measurement points belonging to one surface, and determines the plane expression by performing a recalculation using a plurality of measurement points that exist within a predetermined distance range from the candidate surface.
- The layout simulator according to the present invention includes: a display control unit that displays a virtual space, which is computer graphics of the three-dimensional object, on the screen of a monitor device using the model information generated by the modeling device described above; an article placement unit that places an article having three-dimensional information at a desired position in the virtual space; a texture expression unit that pastes texture information onto a surface surrounded by boundary lines in the virtual space; and an attribute adjustment unit that adjusts attributes, including position, of the article and the texture.
- According to the present invention, three-dimensional coordinate values at a plurality of measurement points are obtained for a three-dimensional object having a plurality of surfaces, and a plane expression representing each surface is generated using those coordinate values.
- The vertices shared by a plurality of adjacent surfaces are extracted by calculation using the plane expressions, and the model is generated as a set of boundary lines connecting the vertices. Therefore, even if there is a region that the measuring device cannot measure because part of the three-dimensional object is occluded by a relatively large object, the positions of the vertices can be estimated by calculation from the plane expressions.
- Moreover, because the plane equation is determined by a recalculation using the measurement points that exist within a predetermined distance range of the candidate surface, the shape of the three-dimensional object can be reproduced accurately by the model.
- The three-dimensional model generation device 1 described below includes a measuring device 20 that performs three-dimensional measurement of a three-dimensional object 30 (see FIG. 2) having a plurality of surfaces 3 (see FIG. 2), and a modeling device 10 that generates model information of the three-dimensional object 30.
- This embodiment assumes a room provided inside a building as a three-dimensional object 30 having a plurality of surfaces 3 as shown in FIG. That is, this embodiment pays attention to the inner surface of the room.
- the three-dimensional object 30 may be either inside or outside the building, and the technology described below can be applied to the three-dimensional object 30 other than the building.
- a measuring device 20 that performs three-dimensional measurement is a so-called 3D laser scanner.
- the 3D laser scanner is configured to three-dimensionally scan a beam-shaped pulse laser and output a three-dimensional coordinate value of a portion irradiated with the pulse laser.
- the measuring device 20 projects intensity-modulated light whose intensity periodically changes with time, and receives reflected light from the space where the intensity-modulated light is projected.
- the measuring device 20 is configured to detect the phase difference of the intensity-modulated light between the time of light projection and the time of light reception, and obtain the flight time of the intensity-modulated light from this phase difference.
- the pulse laser is simply referred to as a laser.
- the 3D laser scanner used in this embodiment includes a measurement unit (not shown) that rotates in a plane parallel to the installed plane.
- The measurement unit is configured to scan the laser in a plane orthogonal to the installation plane at each rotational position.
- When the 3D laser scanner is installed on the floor, it irradiates the entire room except part of the floor while the measurement unit rotates in a plane parallel to the floor surface. The laser is therefore emitted in various directions around the measurement unit; that is, the beam-shaped laser is scanned three-dimensionally. However, the directions in which the laser is emitted exclude the direction toward the member that supports the measurement unit, so the measurement unit does not irradiate the area of the floor around the spot where the 3D laser scanner is installed.
- the 3D laser scanner measures the phase difference until the irradiated laser is reflected by the three-dimensional object 30 and returns to the measuring device 20.
- the 3D laser scanner converts the measured phase difference into a distance to the part of the three-dimensional object 30 that reflects the laser.
- the 3D laser scanner specifies the position where the laser beam is irradiated on the three-dimensional object 30 based on the direction in which the laser beam is irradiated and the distance to the portion where the laser beam is reflected.
- the position specified by the direction of irradiation with the laser and the distance to the part that reflects the laser is represented by a coordinate value in a polar coordinate (ie, spherical coordinate) system.
- the measuring device 20 outputs a coordinate value for each position of the three-dimensional object 30 where the laser is irradiated. That is, the coordinate values regarding the three-dimensional object 30 are obtained discretely.
- the position where the coordinate value is obtained is referred to as “measurement point”.
- data relating to a three-dimensional coordinate value output from the measuring device 20 is referred to as first data.
- the measurement device 20 projects a pulse laser, and outputs a three-dimensional coordinate value of the measurement point with a portion of the three-dimensional object 30 or the like reflecting the pulse laser as a measurement point.
- Because the measurement points are represented by coordinate values in the polar coordinate system, their density is high at short distances and low at long distances.
- If the lattice points of a square lattice are made to correspond to the measurement points, regions at short range, where the measurement points are dense, appear widened in the resulting image, while regions at long range appear narrowed.
- The measuring device 20 has not only the three-dimensional measurement function but also a function of imaging the three-dimensional object 30. To this end, the measuring device 20 includes a solid-state imaging device, such as a CCD or CMOS image sensor, and a wide-angle optical system disposed in front of it. For example, the measurement unit makes two rounds in a plane parallel to the installation surface: in the first round, the measuring device 20 obtains three-dimensional coordinate values of the three-dimensional object 30 using the laser, and in the second round it captures an image of the three-dimensional object 30.
- Because the measuring device 20 uses a wide-angle optical system, the depth of field is deep, so a focused image can be obtained without adjusting the focus of the optical system even when the distance to the three-dimensional object 30 changes.
- The measurement device 20 captures a plurality of images (for example, about 100) during the period in which the measurement unit moves, and joins the captured images together.
- the captured image may be a monochrome grayscale image, but in the following, it is assumed that the image is a color image and, for example, data of luminance of each color of red, green, and blue is output.
- color image data is referred to as color information.
- the image data output from the measuring device 20 is referred to as second data.
- As a measuring device having such functions, Kinect (registered trademark) is known in addition to the 3D laser scanner described above.
- the measuring device 20 has a function of associating the second data, which is the pixel value in the color image, with the first data.
- the measuring device 20 has a function of associating color information with each measurement point.
- The measurement device 20 extracts the pixel corresponding to each measurement point in the color image, using the direction in which the laser was emitted from the measurement unit and the field of view at the time the color image was captured, and associates the color information of that pixel with the coordinate values of the measurement point.
- the image obtained in this way is, for example, a distorted image as shown in FIG.
- The distortion arises because the image is obtained by unfolding the measurement points onto a two-dimensional plane: regions at short range, where the measurement points are dense, appear widened, while regions at long range appear narrowed.
- the measuring device 20 generates the first data related to the three-dimensional shape of the three-dimensional object 30 and the second data related to the color information of the three-dimensional object 30. Further, as described above, the second data is associated with the three-dimensional coordinate value of the measurement point that is the first data. That is, the pixels constituting the two-dimensional image shown in FIG. 4 have a one-to-one correspondence with the measurement points, and have three-dimensional coordinate value information and color information. In other words, selecting a specific pixel of the color image is equivalent to selecting a measurement point, and first data corresponding to the pixel is extracted.
- the measuring apparatus 20 is not limited to the configuration that irradiates the beam-shaped pulse laser, and may employ a configuration that projects a pattern such as a line shape, a stripe shape, or a lattice shape.
- the measurement device 20 may be a distance image sensor that receives reflected light of intensity-modulated light with an area image sensor and generates a distance image in which the pixel value is a distance value from the output of the area image sensor.
- Alternatively, the measuring device 20 may measure the time of flight from light projection to light reception without using intensity-modulated light, or a configuration that performs measurement by a stereo image method may be employed.
- When the measuring device 20 is configured to measure the first data by the stereo image method, the pixel values of the grayscale or color images used to calculate the first data can be used as the second data.
- When the measuring device 20 performs measurement using intensity-modulated light, a grayscale image corresponding to the received intensity of the reflected light can be used as the second data. In this configuration, the received intensity of the reflected light is integrated over one or more periods of the intensity-modulated light, so that temporal variation in the intensity of the reflected light is removed.
- the measuring device 20 may be configured to output only the first data. Further, the measuring device 20 may have a function of converting the coordinate value of the polar coordinate system into the coordinate value of the orthogonal coordinate system defined in the measuring device 20.
- The coordinate system of the measuring device 20 is set independently of the arrangement of the three-dimensional object 30. For example, the z-axis may be set in the vertical direction with the reference point at an elevation of 0 m, or the plane on which the measuring device 20 is installed may be defined as the xy-plane.
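The conversion from the scanner's polar (spherical) coordinate values to orthogonal coordinate values, which either the measuring device 20 or the modeling device 10 may perform, can be sketched as follows. The angle conventions (azimuth about the vertical axis, elevation from the horizontal plane) are an assumption for illustration, since scanners differ:

```python
import math

def spherical_to_cartesian(r, azimuth, elevation):
    """Convert one scanner reading to orthogonal coordinates.

    r: measured distance; azimuth: horizontal angle (rad);
    elevation: angle above the horizontal plane (rad).
    """
    x = r * math.cos(elevation) * math.cos(azimuth)
    y = r * math.cos(elevation) * math.sin(azimuth)
    z = r * math.sin(elevation)
    return (x, y, z)

# A measurement point 2 m away, straight ahead at scanner height:
print(spherical_to_cartesian(2.0, 0.0, 0.0))  # → (2.0, 0.0, 0.0)
```

Each first-data coordinate value output by the scanner would pass through a conversion of this kind before the plane calculations described below.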
- When the measuring device 20 is configured to output not only the coordinate values of the measurement points obtained from the time of flight of the laser but also the received intensity of the laser, the coordinate values of the measurement points can be used as the first data and the received laser intensity as the second data.
- The received intensity of the laser depends on the absorptance, diffusivity, distance, and so on of the part irradiated with the laser, so an image whose pixel values are the received laser intensities resembles a grayscale image. That is, when the measuring device 20 outputs information on the received laser intensity, this information can be used as the second data.
- the process of associating the first data and the second data may be performed by the modeling device 10 described later, not the measurement device 20. Further, the modeling apparatus 10 may have a function of converting the coordinate value of the polar coordinate system into the coordinate value of the orthogonal coordinate system. In short, the process of associating the first data and the second data and the process of coordinate conversion from the polar coordinate system to the orthogonal coordinate system may be performed by either the measurement apparatus 20 or the modeling apparatus 10. Below, regarding the three-dimensional object 30, a process after the coordinate value of the orthogonal coordinate system is obtained will be described.
- When the measuring device 20 outputs information on the received laser intensity and this information is used as the second data, the intensity information is acquired together with the coordinate values. It is therefore unnecessary to acquire the second data separately or to associate the first data with the second data.
- Since the three-dimensional object 30 is a room, it includes a floor surface 31, a ceiling surface 32, and wall surfaces 33, as shown in FIG. 2. The measuring device 20 therefore outputs the first data and second data for the inner space of the three-dimensional object 30.
- The first data are represented by coordinate values in an orthogonal coordinate system, and the second data are associated with the measurement points of the first data.
- Although the ceiling surface 32 need not be parallel to the floor surface 31, it is assumed here that it is.
- the field of view (measurement range) of the measuring device 20 is relatively wide, but the entire inner space of the three-dimensional object 30 cannot be measured only by performing the measurement once.
- Depending on the purpose of the measurement, only part of the inner space of the three-dimensional object 30 may be of interest, in which case a single measurement may suffice.
- This embodiment, however, assumes that the entire inner space of the three-dimensional object 30 is measured, so the measuring device 20 performs measurement a plurality of times while changing its position and orientation. Although not described in detail here, the measuring device 20 has a function of joining the first and second data obtained by the multiple measurements, without overlap, based on the attributes of the measurement points. Therefore, if the measuring device 20 is arranged appropriately for the multiple measurements, the first and second data it outputs include information on the entire surface of the three-dimensional object 30.
- the modeling device 10 generates a model of the three-dimensional object 30 using the data acquired from the measurement device 20.
- a wire frame model is used as the model of the three-dimensional object 30.
- the wire frame model is a data structure model in which points on the surface of the three-dimensional object 30 are connected by line segments in order to represent the surface shape of the three-dimensional object 30.
- the model may be a surface model.
- the modeling apparatus 10 may have a function of extracting various information related to the three-dimensional object 30 using a model.
- When the modeling apparatus 10 is given additional functions, such as changing the surface attributes of the surfaces 3 in the virtual space, which is computer graphics of the three-dimensional object 30, or arranging articles 34 in the virtual space, the modeling apparatus 10 can also function as a layout simulator. These configurations are described later.
- the modeling apparatus 10 is configured using a computer that operates according to a program.
- the computer preferably includes a keyboard and a pointing device as the input device 18 and a display device as the output device 17.
- the computer may be a tablet terminal or a smartphone integrated with a display device in which a touch panel as the input device 18 is the output device 17.
- The computer may also be one designed exclusively for this purpose.
- The computer may be a computer server or a cloud computing system, in which case the user accesses the functions described below through a terminal device capable of communicating with the server or cloud computing system.
- the program is provided by a computer-readable recording medium or through a telecommunication line such as the Internet.
- This program causes a computer to function as the modeling apparatus 10 having the following functions.
- the modeling device 10 includes an acquisition unit 11 that acquires first data and second data from the measurement device 20 as shown in FIG.
- the acquisition unit 11 desirably acquires the second data, but may be configured to acquire only the first data.
- the modeling apparatus 10 includes a modeling unit 14 that generates a model of the three-dimensional object 30 using the first data regarding the three-dimensional object 30 acquired by the acquisition unit 11.
- the modeling apparatus 10 estimates the entire shape of the surface 3 using information obtained by measuring a part of the surface 3 and knowledge about the surface 3 (that is, rules or laws).
- the overall shape of the three-dimensional object 30 is estimated from the overall shape of the surface 3.
- the knowledge about the surface 3 includes knowledge about the shape of the surface 3 and knowledge about the arrangement of the different surfaces 3.
- As knowledge about the shape of the surfaces 3, the knowledge that "a room of a building is surrounded by a set of planes" is used.
- When a surface 3 is not flat, a relatively simple curved surface (a U-shaped cross-section, a hemisphere, etc.) could be used, but in the present embodiment every surface 3 is assumed to be flat.
- As knowledge about the arrangement of the surfaces 3, the knowledge that "the boundary line between adjacent surfaces 3 is contained in their line of intersection" is used.
- Since each surface 3 is a plane, the knowledge that "the vertex at one end of a boundary line is shared by three surfaces 3" is also used.
- the model of the three-dimensional object 30 is represented by vertices corresponding to the corners of each plane constituting the three-dimensional object 30 and a line segment connecting the two vertices.
- In order to generate a model of the three-dimensional object 30, the surfaces 3 must be identified using the first data or second data output from the measuring device 20. The modeling device 10 therefore includes a surface extraction unit 12 that identifies each surface 3 and generates a mathematical expression representing it using the first data of a plurality of (three or more) measurement points belonging to the identified surface 3.
- Here, each surface 3 is assumed to be a plane, and the mathematical expression representing a surface 3 is a plane equation, as described later.
- the modeling apparatus 10 includes a vertex extraction unit 13 that extracts vertices shared by the three surfaces 3 using the mathematical formula generated by the surface extraction unit 12.
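The vertex extraction can be sketched as follows: the vertex shared by three surfaces is the point that simultaneously satisfies their three plane expressions. This sketch assumes each surface is stored as plane parameters (a, b, c, d) of a*x + b*y + c*z + d = 0, an illustrative representation not taken from the patent:

```python
def cross(u, v):
    """Cross product of two 3-vectors."""
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def vertex_from_planes(p1, p2, p3):
    """Point simultaneously satisfying three plane equations
    a*x + b*y + c*z + d = 0, via the closed form
    x = -(d1*(n2 x n3) + d2*(n3 x n1) + d3*(n1 x n2)) / (n1 . (n2 x n3))."""
    n1, d1 = p1[:3], p1[3]
    n2, d2 = p2[:3], p2[3]
    n3, d3 = p3[:3], p3[3]
    c23, c31, c12 = cross(n2, n3), cross(n3, n1), cross(n1, n2)
    det = n1[0] * c23[0] + n1[1] * c23[1] + n1[2] * c23[2]
    if abs(det) < 1e-9:
        return None  # normals (nearly) coplanar: no unique shared vertex
    return tuple(-(d1 * c23[i] + d2 * c31[i] + d3 * c12[i]) / det
                 for i in range(3))

# A ceiling z = 2.4 and two walls x = 2 and y = 3 share one corner:
ceiling = (0.0, 0.0, 1.0, -2.4)
wall_a = (1.0, 0.0, 0.0, -2.0)
wall_b = (0.0, 1.0, 0.0, -3.0)
print(vertex_from_planes(ceiling, wall_a, wall_b))  # → (2.0, 3.0, 2.4)
```

Even when the corner itself is occluded and was never measured, this calculation recovers its position from the three plane expressions, which is the effect the patent describes.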
- the modeling unit 14 provided in the modeling device 10 generates model information representing the three-dimensional object 30 using the mathematical formula generated by the surface extraction unit 12 and the vertex extracted by the vertex extraction unit 13.
- the modeling unit 14 extracts an intersection line shared by two adjacent surfaces 3 as a boundary line between the two surfaces 3 using the mathematical formula generated by the surface extraction unit 12. That is, the modeling unit 14 sets two vertices on the intersection line shared by the two surfaces 3 as end points, and sets a line segment between the two end points as a boundary line between the two surfaces 3.
- the modeling unit 14 generates model information represented by a set of boundary lines. That is, the modeling unit 14 stores a wire frame model in which coordinate values of each vertex of the model and a boundary line having each vertex as an end point are associated with each other. The modeling unit 14 generates a wireframe model represented by vertices and boundary lines, and also represents a model of the three-dimensional object 30 using mathematical expressions representing the surfaces 3 and vertices included in each surface 3. It may be.
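A wire-frame model of the kind described, vertex coordinate values plus boundary lines stored as pairs of vertex ids, could be represented as follows; the room dimensions and names are hypothetical, for illustration only:

```python
import math

# Hypothetical 2 m x 3 m x 2.4 m rectangular room.
W, D, H = 2.0, 3.0, 2.4
vertices = {
    0: (0.0, 0.0, 0.0), 1: (W, 0.0, 0.0), 2: (W, D, 0.0), 3: (0.0, D, 0.0),  # floor corners
    4: (0.0, 0.0, H),   5: (W, 0.0, H),   6: (W, D, H),   7: (0.0, D, H),    # ceiling corners
}
boundaries = [(0, 1), (1, 2), (2, 3), (3, 0),   # floor boundary lines
              (4, 5), (5, 6), (6, 7), (7, 4),   # ceiling boundary lines
              (0, 4), (1, 5), (2, 6), (3, 7)]   # vertical wall boundary lines

def length(edge):
    """Length of one boundary line of the model."""
    return math.dist(vertices[edge[0]], vertices[edge[1]])

print(len(vertices), len(boundaries))  # → 8 12
print(length((0, 4)))                  # height of one vertical edge
```

Because each boundary line only references vertex ids, moving a vertex (for example after a plane is re-estimated) updates every boundary line that shares it.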
- the model of the three-dimensional object 30 is drawn on the screen of the display device. Since the model is formed using three-dimensional coordinate values, a virtual space of the three-dimensional object 30 by computer graphics is formed.
- The surface extraction unit 12 employs at least one of two configurations: one in which the user specifies all of the surfaces 3 using the input device 18, and one in which at least some surfaces 3, such as the floor surface 31 or the ceiling surface 32, are specified automatically by paying attention to attributes of the first data.
- The configuration in which the user specifies a surface 3 is realized by a function that displays the color image formed by the second data on the display device serving as the output device 17, and a function that recognizes the region designated with the input device 18 in the color image as the region where the surface 3 exists.
- the color image displayed on the screen of the output device 17 is an image with relatively large distortion as described above (see FIG. 4).
- In such an image, the boundary lines of the surfaces 3 are inclined or distorted, so it is not easy for an unskilled user to distinguish the surfaces 3.
- However, since a color image is displayed on the screen of the output device 17, the user can use the color information of a surface 3 as a clue for identifying its type, and can distinguish the surfaces 3 relatively easily.
- With a color image showing the entire three-dimensional object 30 displayed on the screen of the output device 17, the user can designate a desired region in the image using the input device 18.
- Specifically, a selection frame of an appropriate shape is displayed on the screen of the output device 17, and the user moves the selection frame so that it falls within the range of the surface 3 to be designated.
- the selection frame may have various shapes, but it is desirable to use a simple shape such as a quadrangle, a triangle, or an ellipse.
- In other words, the extraction range of the measurement points belonging to each surface 3 is input interactively, by means of a selection frame, with respect to the image displayed on the output device 17. Looking at the image on the screen, the user operates the input device 18 and sets a selection frame for every surface 3 surrounding the room (the floor surface 31, the ceiling surface 32, and the individual wall surfaces 33).
- The surface extraction unit 12 extracts the first data of three measurement points within the selection frame whose positions relative to the outline of the frame are predetermined. Since the first data are three-dimensional coordinate values, obtaining three coordinate values contained in the same plane uniquely determines the equation of the plane passing through the three measurement points set by the selection frame. In short, a mathematical expression representing the plane containing the three measurement points is obtained.
- the selection frame does not necessarily have to be set when the three measurement points used by the surface extraction unit 12 are determined.
- When a pointing device such as a mouse or a stylus is used as the input device 18, three or more measurement points may instead be selected one by one with the input device 18.
- if the coordinate values of the three measurement points A, B, and C extracted by the surface extraction unit 12 to define one surface 3 are (Ax, Ay, Az), (Bx, By, Bz), and (Cx, Cy, Cz), the above-described plane parameters a, b, c, and d are determined as follows:
- a = (By − Ay)(Cz − Az) − (Cy − Ay)(Bz − Az)
- b = (Bz − Az)(Cx − Ax) − (Cz − Az)(Bx − Ax)
- c = (Bx − Ax)(Cy − Ay) − (Cx − Ax)(By − Ay)
- d = −(aAx + bAy + cAz)
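As an illustrative sketch (not part of the patent text; NumPy is assumed and the function name is hypothetical), the parameters above are simply the components of the cross product of the vectors B − A and C − A, with d fixed by requiring that point A lie on the plane:

```python
import numpy as np

def plane_from_points(A, B, C):
    """Return plane parameters (a, b, c, d) of a*x + b*y + c*z + d = 0 for
    the plane passing through the three measurement points A, B, and C."""
    A, B, C = map(np.asarray, (A, B, C))
    # (a, b, c) is the normal vector: the cross product of (B - A) and (C - A).
    a, b, c = np.cross(B - A, C - A)
    # d follows from requiring that point A lie on the plane.
    d = -(a * A[0] + b * A[1] + c * A[2])
    return float(a), float(b), float(c), float(d)

# A horizontal plane 2 m above the origin, e.g. a ceiling surface
print(plane_from_points((0, 0, 2), (1, 0, 2), (0, 1, 2)))  # → (0.0, 0.0, 1.0, -2.0)
```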
- since the coordinate values of the three measurement points A, B, and C extracted by the surface extraction unit 12 are used, a plane equation is obtained by the above-described calculation; however, the coordinate values may include measurement errors. Therefore, instead of using the coordinate values of the three extracted measurement points A, B, and C directly, it is desirable to obtain, for each of the measurement points A, B, and C, the coordinate values of the measurement points included in a predetermined surrounding range, and to use the median or average value of the obtained coordinate values as the coordinate values for determining the equation.
- in this way, the equation of the surface 3 can be determined more accurately.
- the existence range of the measurement points used for determining the equation of the surface 3 is set, for example, within a range where the distance from the candidate surface is ±10 mm or less, and preferably smaller.
- the measurement points used to determine the equation of the surface 3 are drawn from a population consisting of all the measurement points acquired by the acquisition unit 11; of this population, the measurement points existing within the predetermined distance from the candidate surface are used. However, if all the measurement points acquired by the acquisition unit 11 are taken as the population, measurement points that cause errors are likely to be included, so only the measurement points existing inside the selection frame described above may be used as the population instead.
- the surface extraction unit 12 obtains the planar parameters a, b, c, and d applicable to the measurement points extracted from the population by robust estimation.
- robust estimation is employed to suppress distortion of the result caused by the influence of abnormal values having large residuals when determining the equation by the least squares method.
- Tukey's biweight method is adopted as the robust estimation method.
- in this method, a weighting factor larger than that of the measurement points whose residual exceeds a predetermined value is applied to the measurement points whose residual is less than the predetermined value, and the plane equation is then determined by the least squares method.
- here, the weighting factor is set to 2 (that is, twice the weighting factor of the other measurement points).
- the surface extraction unit 12 may obtain a plane equation by another method instead of the robust estimation.
- for example, the surface extraction unit 12 may obtain a plurality of plane equations by repeating, for various sets of measurement points, the process of obtaining a plane equation from three measurement points, and adopt the average of the plurality of plane equations as the final equation. That is, the surface extraction unit 12 may obtain the parameters a, b, c, and d for each of the plural plane equations and adopt the plane equation whose parameters are the average values of the respective parameters.
- the surface extraction unit 12 can determine a plane equation by a least square method that does not use a weighting factor.
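A minimal sketch of the weighting scheme described above, assuming NumPy; the helper name, the use of a total-least-squares (SVD) fit, the threshold, and the iteration count are illustrative assumptions rather than details from the text. Points whose residual is below the threshold receive twice the weight of the others, and the weighted fit is repeated:

```python
import numpy as np

def fit_plane_weighted(points, threshold=0.01, iterations=5):
    """Fit unit-normal plane parameters (a, b, c, d) to an N x 3 array of
    measurement points.  Points whose residual is below `threshold` are
    given twice the weight of the others, and the weighted fit is repeated."""
    pts = np.asarray(points, dtype=float)
    w = np.ones(len(pts))
    for _ in range(iterations):
        centroid = np.average(pts, axis=0, weights=w)
        # The smallest right singular vector of the weighted, centered
        # cloud is the plane normal (weighted total least squares).
        _, _, vt = np.linalg.svd(np.sqrt(w)[:, None] * (pts - centroid))
        normal = vt[-1]
        d = -normal @ centroid
        residuals = np.abs(pts @ normal + d)
        w = np.where(residuals < threshold, 2.0, 1.0)  # inliers weighted x2
    return normal[0], normal[1], normal[2], d
```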
- in the example described above, the user interactively inputs the extraction range of the measurement points included in a plane by using the output device 17 and the input device 18.
- however, even if the user does not specify the extraction range of the measurement points, the measurement points included in the floor surface 31 and the ceiling surface 32 can be extracted automatically using the first data.
- here, the z-axis direction in the three-dimensional orthogonal coordinate system is set to the vertical direction or to the direction orthogonal to the surface on which the measuring device 20 is installed. Except where it forms a slope, the floor surface 31 is generally constructed so as to be orthogonal to the vertical direction; therefore, when the measuring device 20 is installed on the floor surface 31, the two directions are substantially equivalent. In addition, in the present embodiment, it is assumed that the ceiling surface 32 is parallel to the floor surface 31.
- under these assumptions, the measurement points corresponding to the floor surface 31 and those corresponding to the ceiling surface 32 each have the same coordinate value in the z-axis direction. That is, when the frequencies of the measurement points having the same z coordinate value are obtained, the frequencies of the measurement points corresponding to the floor surface 31 and the ceiling surface 32 are expected, as shown in the figure, to be significantly larger than the frequencies of the other measurement points.
- therefore, the surface extraction unit 12 sets a plurality of sections divided along the z axis, obtains the frequency of measurement points for each section, and judges that a section corresponds to the floor surface 31 or the ceiling surface 32 when its frequency exceeds a predetermined reference value. That is, by setting the reference value appropriately, only two sections will have a frequency exceeding the reference value; one section can be regarded as representing the range of the z coordinate of the floor surface 31, and the other as representing the range of the z coordinate of the ceiling surface 32. In practice, of the two sections, the one located lower in real space corresponds to the floor surface 31, and the one located higher corresponds to the ceiling surface 32.
- depending on conditions, the frequency for the floor surface 31 or the ceiling surface 32 may decrease, and the frequency may exceed the reference value in three or more sections. Accordingly, the surface extraction unit 12 determines that, among the sections whose frequency exceeds the reference value, the lowermost section in the z-axis direction satisfies the condition of the floor surface 31, and that the first data included in this section belong to the floor surface 31.
- similarly, when a section satisfies the condition of the ceiling surface 32, the surface extraction unit 12 judges that the first data included in that section belong to the ceiling surface 32.
- the condition of the ceiling surface 32 for example, a condition that the distance to the section satisfying the condition of the floor surface 31 is within an allowable range, a condition that the frequency is the maximum among the candidates for the ceiling surface 32, and the like are used.
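The section-frequency test described above can be sketched as follows, assuming NumPy; the bin size, reference value, and the simplification of taking the lowest and highest qualifying sections as the floor and ceiling are illustrative assumptions:

```python
import numpy as np

def find_floor_and_ceiling(z_values, bin_size=0.05, reference_count=100):
    """Divide the z axis into sections of `bin_size` and return the section
    centres of the lowest and highest sections whose measurement-point
    frequency exceeds `reference_count`, taken as floor and ceiling heights."""
    z = np.asarray(z_values, dtype=float)
    edges = np.arange(z.min(), z.max() + 2 * bin_size, bin_size)
    counts, edges = np.histogram(z, bins=edges)
    peaks = np.nonzero(counts > reference_count)[0]
    if len(peaks) == 0:
        return None  # no section exceeds the reference value
    centers = (edges[:-1] + edges[1:]) / 2
    # lowest qualifying section = floor, highest = ceiling
    return centers[peaks[0]], centers[peaks[-1]]
```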
- after the first data belonging to the floor surface 31 and the ceiling surface 32 have been determined, the remaining first data are treated as candidates for the first data belonging to the wall surfaces 33.
- so that the wall surfaces 33 can be distinguished from other articles, the frequency of the first data is obtained for each plane, and the distance from the measuring device 20 is used as supplementary information.
- the floor surface 31 may carry a lid of an underfloor storage, a lid of an underfloor inspection port, a rug, furniture, a houseplant, and the like; the ceiling surface 32 may have lighting fixtures; and the wall surface 33 may have a curtain, a window, or the like. That is, objects unnecessary for extracting the surface 3 may be present on each surface 3, and members to be noted when renovating may also be disposed there. Since the surface extraction unit 12 cannot use measurement points included in these objects or members as measurement points belonging to a specific surface 3, it is desirable to exclude them from the measurement point extraction range.
- the modeling apparatus 10 includes a color extraction unit 123 that extracts second data having color information in a specified range and displays the second data on the screen of the output device 17.
- the color extraction unit 123 excludes, from the second data acquired by the acquisition unit 11, the specific color information corresponding to the colors of the objects or members described above, and sets the range of color information to be extracted so that the floor surface 31, the ceiling surface 32, and the wall surface 33 are extracted.
- the color extraction unit 123 may determine the range of color information so that the floor surface 31, the ceiling surface 32, and the wall surface 33 are individually displayed on the output device 17.
- the range of color information to be extracted by the color extraction unit 123 is preferably set in advance, but may be configured to extract color information of a portion where the cursor is located.
- it is desirable that the color extraction unit 123 be able to extract the color information of a plurality of measurement points belonging to different surfaces 3, and to extract the measurement points whose color difference from that color information is within a predetermined range. If the color extraction unit 123 is configured in this way, measurement points having color information corresponding to the floor surface 31, the ceiling surface 32, and the wall surface 33 are extracted collectively, and measurement points having other color information can be excluded.
- the color extraction unit 123 described above is configured to specify color information of measurement points to be extracted, but may be configured to specify color information of measurement points to be excluded.
- the color extraction unit 123 can roughly identify measurement points related to objects or members other than the floor surface 31, the ceiling surface 32, and the wall surface 33 by using the color information obtained from the second data. For example, if an object such as furniture or a houseplant has a color different from those of the floor surface 31, the ceiling surface 32, and the wall surface 33, the color extraction unit 123 can exclude the measurement points corresponding to these objects or members on the basis of the color information.
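A minimal sketch of such color-based selection, assuming NumPy and RGB color information; the function name, the Euclidean color distance, and the threshold are illustrative assumptions:

```python
import numpy as np

def filter_points_by_color(points, colors, reference_rgb, max_diff=30):
    """Keep only the measurement points whose RGB colour lies within
    `max_diff` (Euclidean distance in RGB space) of a reference colour,
    e.g. a colour picked for the floor, ceiling, or wall surface."""
    colors = np.asarray(colors, dtype=float)
    diff = np.linalg.norm(colors - np.asarray(reference_rgb, dtype=float), axis=1)
    mask = diff <= max_diff
    return np.asarray(points)[mask], mask
```

Measurement points whose colour matches furniture or houseplants rather than a surface can be excluded by inverting the mask.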
- the position of the selection frame can be easily determined when the user designates the measurement point extraction range using the input device 18.
- since places to be noted when renovating a room can be confirmed immediately on the screen of the output device 17, attention can be drawn to them when members for the renovation are procured.
- when an opening such as a window or an entrance is added to the model, the input device 18 is operated so that a frame surrounding the opening is displayed on the screen of the output device 17, and the position and size of the frame are adjusted so that the frame matches the opening. The process of matching the frame with the opening can also be automated, with only the initial setting of the frame performed manually.
- the operation of matching the frame with the opening may be performed in the same manner as general drawing graphic software. For example, in a state where a rectangular frame is displayed on the screen and the frame is selected by clicking the mouse, the frame can be moved to the position of the opening on the screen by dragging the entire frame.
- the size of the frame can be adjusted by extending or contracting the frame vertically and horizontally by dragging a part of the frame so that the frame matches the opening.
- the vertex extraction unit 13 extracts, for example, a vertex shared by the floor surface 31 and two wall surfaces 33, and a vertex shared by the ceiling surface 32 and two wall surfaces 33.
- the vertex extraction unit 13 extracts a vertex shared by the floor surface 31, the ceiling surface 32, and the wall surface 33. If there is a step on the floor surface 31, the vertex shared by the two surfaces forming the step and the wall surface 33 is extracted. That is, the vertex extraction unit 13 extracts vertices shared by three planes 3 for the three-dimensional object 30.
- the vertex extraction unit 13 extracts the vertex in a state where an image based on the second data is displayed on the output device 17 constituting the surface extraction unit 12.
- the target surfaces 3 are input interactively from the input device 18. In this case, for example, on the image displayed on the screen of the output device 17, the user moves the cursor to each of the three surfaces 3 for which the coordinate value of the vertex is to be obtained, and selects each surface 3 (by clicking the mouse, pressing the return key, or the like).
- if a rule is established for the order of the surfaces 3 when the measurement points are designated, the surface 3 from which a vertex is to be extracted is selected by that rule, and the process of inputting it from the input device 18 can be omitted. That is, when such a rule is established, the work of designating the surface 3 from which a vertex is extracted can be performed simultaneously with the designation of the measurement points.
- a rule may be established in which measurement points for obtaining a plane equation are specified in the order of the floor surface 31, the ceiling surface 32, and the wall surface 33, and the measurement points are specified clockwise for the wall surface 33.
- the vertex extraction unit 13 treats three mathematical expressions each representing the selected three surfaces 3 as simultaneous equations, and calculates a solution of the equations.
- the obtained solution becomes a three-dimensional coordinate value representing a vertex common to the three planes 3.
- the vertex extraction unit 13 forms every combination of three surfaces 3 from all the surfaces 3 obtained by the surface extraction unit 12, treats the three corresponding mathematical expressions as simultaneous equations, and calculates the solution of these equations.
- the solution obtained here is a three-dimensional coordinate value related to the vertex of the three-dimensional object 30 when the three-dimensional object 30 has a simple shape such as a rectangular parallelepiped.
- a simple shape means a shape in which a combination that provides a solution matches the number of vertices.
- for example, a rectangular parallelepiped has 20 combinations of three planes 3 each, of which 8 combinations yield a solution; since the number of vertices is also 8, it can be said to be a simple shape.
- the number of solutions may not match the number of vertices.
- for example, in a shape having twelve vertices, there may be 56 combinations of three planes 3 each, of which 18 combinations yield solutions. That is, six of the obtained solutions represent coordinate values of vertices that do not exist.
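The computation of a candidate vertex from three plane equations can be sketched as follows, assuming NumPy; the function name and the determinant test are illustrative. Combinations of planes that yield no solution (for example, two parallel walls) correspond to a singular coefficient matrix:

```python
import numpy as np

def intersect_three_planes(p1, p2, p3, eps=1e-9):
    """Treat three plane equations a*x + b*y + c*z + d = 0, each given as a
    tuple (a, b, c, d), as simultaneous equations and return the shared
    vertex, or None when the planes have no unique intersection point."""
    M = np.array([p[:3] for p in (p1, p2, p3)], dtype=float)
    rhs = -np.array([p[3] for p in (p1, p2, p3)], dtype=float)
    if abs(np.linalg.det(M)) < eps:  # e.g. two parallel walls: no vertex
        return None
    return np.linalg.solve(M, rhs)

# Corner where the floor (z = 0) meets walls at x = 4 and y = 3
corner = intersect_three_planes((0, 0, 1, 0), (1, 0, 0, -4), (0, 1, 0, -3))
```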
- the verification unit 131 verifies whether or not the vertex mechanically extracted by the vertex extraction unit 13 is an actual vertex, and performs processing for excluding the nonexistent vertex.
- the verification process is a process for determining whether or not a measurement point exists in the three surfaces 3 sharing each vertex for each of the vertices extracted by the vertex extraction unit 13, and at least one surface If the measurement point does not exist in 3, the vertex is excluded.
- an allowable range is set for determining whether or not the measurement point is included in the surface 3.
- in practice, the verification unit 131 determines a distance range corresponding to the allowable range, and judges that a measurement point is included in the surface 3 if the measurement point exists within that distance range from the surface 3.
- the solutions obtained from the mathematical expressions representing the wall surfaces 33 include, as shown in the figure, the virtual vertices 36 and 37 in addition to the actual vertices. The vertex 36 is obtained by calculation by extending the wall surfaces 33 to the outside of the room, and the vertex 37 is obtained by calculation by extending the wall surfaces 33 to the inside of the room.
- the verification unit 131 determines whether or not there are measurement points for the three planes 3 sharing the vertices for all the vertices 35, 36, and 37 obtained in the calculation.
- the surface sharing the virtual vertex 36 outside the room is only the virtual surface 38 formed outside the room, and there is no measurement point in the surface 38, so the vertex 36 is excluded.
- similarly, the surfaces sharing the virtual vertex 37 inside the room are the actual wall surfaces 33 and the virtual surface 39 formed inside the room; since there are no measurement points in the surface 39, the vertex 37 is excluded.
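A sketch of this verification step, assuming NumPy; the function name, tolerances, and in particular the neighbourhood radius around the vertex are illustrative assumptions, not values from the text:

```python
import numpy as np

def vertex_is_real(vertex, planes, points, plane_tol=0.02, radius=0.3):
    """Keep a candidate vertex only if, for each of the three planes
    (a, b, c, d) sharing it, at least one measurement point lies close to
    that plane (within `plane_tol`) and near the vertex (within `radius`).
    Virtual vertices obtained by extending the walls beyond the room have
    no nearby measurement points on at least one plane and are rejected."""
    pts = np.asarray(points, dtype=float)
    near_vertex = np.linalg.norm(pts - np.asarray(vertex, dtype=float), axis=1) <= radius
    for a, b, c, d in planes:
        normal = np.array([a, b, c], dtype=float)
        on_plane = np.abs(pts @ normal + d) / np.linalg.norm(normal) <= plane_tol
        if not np.any(on_plane & near_vertex):
            return False
    return True
```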
- the modeling unit 14 defines a straight line connecting the vertices obtained by the vertex extraction unit 13 along the intersection of two adjacent surfaces as a boundary line, and represents the three-dimensional object 30 as a set of boundary lines. Generate the model. In addition, the modeling unit 14 stores, as model information, three-dimensional coordinate values for the vertices and information on a set of vertices connected by the boundary line.
- when the three-dimensional object 30 is displayed on the screen of the monitor device 41 (see FIG. 8) by computer graphics technology using the model information stored in the modeling unit 14, a virtual space having three-dimensional information is displayed on the monitor device 41.
- the monitor device 41 is preferably used also as the output device 17, but the output device 17 and the monitor device 41 may be provided separately.
- since the modeling unit 14 stores model information, the distance between vertices on a boundary line in the model can be obtained. That is, since the three-dimensional coordinate values of the vertices on the boundary lines are obtained using the first data, which is the result of the three-dimensional measurement performed on the three-dimensional object 30, the distance between the vertices in real space can easily be obtained from the vertex coordinate values. Accordingly, it is desirable that the modeling apparatus 10 include a dimension calculation unit 15 that obtains the dimension of each boundary line using the coordinate values of the vertices. Furthermore, the modeling apparatus 10 preferably includes a drawing generation unit 16 that generates information describing a drawing of the three-dimensional object 30 to which the dimensions obtained by the dimension calculation unit 15 are applied. The numerical values of the dimensions calculated by the dimension calculation unit 15 are reflected in the drawing generated by the drawing generation unit 16.
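The dimension calculation described here reduces to Euclidean distances between vertex coordinates; a minimal sketch, assuming NumPy, with hypothetical names:

```python
import numpy as np

def boundary_dimensions(vertices, edges):
    """Given vertex coordinates and the pairs of vertex indices joined by a
    boundary line, return the real-space length of each boundary line."""
    v = np.asarray(vertices, dtype=float)
    return {e: float(np.linalg.norm(v[e[0]] - v[e[1]])) for e in edges}

# Floor rectangle of a 4 m x 3 m room
verts = [(0, 0, 0), (4, 0, 0), (4, 3, 0), (0, 3, 0)]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(boundary_dimensions(verts, edges))
# → {(0, 1): 4.0, (1, 2): 3.0, (2, 3): 4.0, (3, 0): 3.0}
```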
- the drawing generation unit 16 generates, for example, a dimensional diagram in which the three-dimensional object 30 is developed and the dimensions obtained by the dimension calculation unit 15 are entered. Also, if the three-dimensional object 30 is a room to be renovated by applying wallpaper, heat-insulating sheets, soundproof sheets, or the like to the inner surfaces of its walls, an allocation diagram describing the dimensions of the members to be installed may be created. In the allocation diagram, the positions where the members are attached are also described, taking into consideration the thickness of the members to be installed. Further, in order to create an allocation diagram, attributes such as the texture and color tone of the members described later may be given to the surfaces 3.
- a mathematical expression representing the surface 3 is generated using measurement points that are certain to be included in the surface 3, and a model of the three-dimensional object 30 is generated by calculation using that expression. Therefore, the model generated by the modeling unit 14 does not include objects other than the three-dimensional object 30 or openings formed in the three-dimensional object 30. Since this information is also required in the allocation diagram, it is desirable that the drawing generation unit 16 allow the user to process the drawing using an operation device (which may also serve as the input device 18).
- in the example described above, the ceiling surface 32 and the floor surface 31 are assumed to be parallel; however, they need not be parallel. For example, the ceiling surface 32 may be inclined with respect to the floor surface 31.
- in addition, three or more vertices may be formed between the floor surface 31 and one wall surface 33.
- even in such cases, the vertex extraction unit 13 can extract the vertices shared by three adjacent surfaces among the floor surface 31, the ceiling surface 32, and the wall surfaces 33, so the modeling unit 14 can generate model information representing the three-dimensional object 30 using the plane equations and the vertices. That is, since the modeling unit 14 can generate a model even for an inclined ceiling surface 32, the drawing generation unit 16 can create an allocation diagram for installing a heat-insulating material on this type of ceiling surface 32.
- using the model information generated by the modeling apparatus 10 described above, various processes can be performed on the computer-graphics virtual space formed from the three-dimensional object 30. That is, as shown in FIG. 8, by providing a layout simulator 40 that uses 3D graphics technology and having the layout simulator 40 assign various attributes to the model, the virtual space can be processed.
- the layout simulator 40 may have a function of simulating illumination, a function of changing the position of the viewpoint, and the like.
- the layout simulator 40 preferably includes an article placement unit 43 that places an article such as furniture in the room.
- the article placement unit 43 places the article 34 in the virtual space using known three-dimensional data for the article 34 (see FIG. 9) such as furniture.
- the layout simulator 40 includes a display control unit 42 that displays the virtual space on the screen of the monitor device 41, and immediately reflects the arrangement of the article 34 in the virtual space on the display of the monitor device 41.
- the layout simulator 40 includes a texture expression unit 44 that pastes texture information on the surface 3 of the model, and an attribute adjustment unit 45 that adjusts attributes including the position of the article 34 and the texture.
- the attribute adjusting unit 45 preferably has a function of adjusting the color tone of the article 34 and the surface 3.
- the layout simulator 40 can simulate not only the texture or color of the floor surface 31, the ceiling surface 32, and the wall surface 33, but also the type and arrangement of the article 34. Therefore, when a room renovation is planned, it is possible to imagine the state after the remodeling by simulation, and the persuasive power to the customer can be enhanced.
- the layout simulator 40 can be configured by reusing the computer that constitutes the modeling apparatus 10, or by using a different computer. Further, similarly to the modeling apparatus 10, the function of the layout simulator 40 may be realized by a computer server or a cloud computing system.
- the modeling apparatus 10 described above is an example in which the second data are used; however, the second data are not always necessary, and, as is clear from the technique described above, a model can be generated using only the first data.
- the modeling apparatus 10 includes the acquisition unit 11, the surface extraction unit 12, the vertex extraction unit 13, and the modeling unit 14.
- the acquisition unit 11 acquires three-dimensional coordinate values related to a plurality of measurement points belonging to the three-dimensional object 30 from the measurement device 20 as first data.
- the surface extraction unit 12 generates a plane expression that represents the surface 3 using the first data regarding the measurement points belonging to the surface 3.
- the vertex extraction unit 13 extracts the points as vertices shared by the plurality of surfaces by calculating points that simultaneously satisfy the planar expressions representing the plurality of adjacent surfaces of the surfaces 3.
- the modeling unit 14 generates information on a model representing the three-dimensional object 30 using the planar formula and the vertex.
- the surface extraction unit 12 takes the plane defined by a plane equation obtained using the coordinate values of three measurement points belonging to one surface as a candidate surface, and determines the final plane equation by recalculating using the plural measurement points existing within a predetermined distance range from the candidate surface.
- with this configuration, even when a part of the three-dimensional object 30 is hidden by another, shielding object 30A, the boundary of the surface 3 hidden by the shielding object 30A can be estimated, and a model excluding the shielding object 30A can be generated.
- moreover, since the surface extraction unit determines the plane equation by first determining the candidate surface and then recalculating using the measurement points existing within a predetermined distance range from the candidate surface, the shape of the three-dimensional object can be reproduced accurately in the model.
- in addition, when the surface extraction unit performs robust estimation, the plane equation can be determined with good accuracy.
- the three-dimensional object 30 is a room surrounded by a floor surface 31, a ceiling surface 32, and a plurality of wall surfaces 33. It is desirable that the surface extraction unit 12 generates a plane expression that represents the floor surface 31, the ceiling surface 32, and the wall surface 33. Further, it is desirable that the vertex extraction unit 13 extracts vertices shared by three adjacent surfaces of the floor surface 31, the ceiling surface 32, and the wall surface 33, respectively.
- the modeling unit 14 desirably determines a boundary line shared by two adjacent surfaces of the floor surface 31, the ceiling surface 32, and the wall surface 33.
- a model is formed by the vertices formed by the floor surface 31, the ceiling surface 32, and the wall surface 33 and the line segments connecting the vertices. That is, a model similar in shape to the room that is the three-dimensional object 30 is formed.
- the measuring device 20 desirably has a function of outputting second data in which pixel values of an image obtained by capturing the three-dimensional object 30 are associated with each measurement point.
- the surface extraction unit 12 preferably includes an output unit 121 and an input unit 122.
- the output unit 121 displays the second data on the screen of the output device 17.
- the input unit 122 interactively inputs information for designating measurement points belonging to the surface 3 from the input device 18 to the image displayed on the output device 17.
- with this configuration, a selection frame as described above can be set.
- the second data includes color information.
- in this case, it is desirable that the acquisition unit 11 have a function of acquiring the second data from the measurement device 20, and that the modeling apparatus further include a color extraction unit 123 that extracts, from the second data acquired by the acquisition unit 11, the second data having color information within a set range and displays them on the screen of the output device 17.
- the user can determine the measurement point extraction range for obtaining the plane expression of the surface 3 based on the color of the image displayed on the screen of the output device 17. In other words, measurement points that should be removed when obtaining the plane expression of the surface 3 can be excluded using the color information.
- it is desirable that the vertex extraction unit 13 interactively receive, from the input device 18, the designation of a plurality of surfaces sharing a vertex in a state where an image is displayed on the output device 17.
- the vertex extraction unit 13 calculates a three-dimensional coordinate value related to the vertex common to the surface input from the input device 18 by obtaining a solution of simultaneous equations composed of planar equations.
- compared with a configuration in which the vertex extraction unit 13 automatically obtains vertices by combining the surfaces extracted from the three-dimensional object 30, this reduces the processing load of the vertex extraction unit 13. That is, the coordinate values of the vertices necessary for generating the model can be determined easily.
- alternatively, the vertex extraction unit 13 may employ a configuration in which the surfaces 3 are combined three at a time and the three-dimensional coordinate values of the vertices shared by the surfaces are calculated by solving the simultaneous equations formed from the plane equations.
- in this case, it is desirable to provide a verification unit 131 that excludes a vertex when no measurement point exists on at least one of the surfaces 3 sharing that vertex.
- with this configuration, when the vertex extraction unit 13 automatically combines the surfaces 3 and a solution of the simultaneous equations can be obtained, the corresponding solution is obtained automatically as a vertex. Further, since the verification unit 131 verifies whether each vertex automatically extracted by the vertex extraction unit 13 actually exists, the burden on the user is small.
- the first data is a coordinate value of an orthogonal coordinate system in which a direction orthogonal to the floor surface 31 is one coordinate axis.
- it is desirable that the surface extraction unit 12 obtain the frequency at which measurement points appear for each section divided along this coordinate axis, and judge that a section in which the frequency exceeds a predetermined reference value is a section satisfying the condition of the floor surface.
- the ceiling surface 32 and the wall surface 33 are easily extracted with the floor surface 31 as a reference.
- when the room has a structure in which the floor surface 31 and the ceiling surface 32 are parallel, it is desirable that the surface extraction unit 12 determine that the measurement points included in the section satisfying the condition of the ceiling surface 32 belong to the ceiling surface 32.
- the floor surface 31 and the ceiling surface 32 of the surfaces 3 constituting the room are automatically determined. Therefore, by treating the remaining plane as the wall surface 33, a room model can be easily generated.
- the modeling apparatus 10 includes a dimension calculation unit 15 that calculates a distance between vertices in a model, and a drawing generation unit 16 that generates information that describes a drawing to which the distance calculated by the dimension calculation unit 15 is applied to the three-dimensional object 30. It is desirable to further comprise.
- the modeling device 10 having this configuration can automatically generate a drawing of the three-dimensional object 30 and can add dimensions to the drawing.
- This configuration has an advantage of facilitating procurement of materials for renovation, for example, because it is possible to automatically create a room drawing with dimensions when renovating the room.
- the modeling method of the present embodiment means a procedure for generating a model based on the first data obtained by the measurement device 20 performing three-dimensional measurement on the three-dimensional object 30 including the plurality of surfaces 3. In this modeling method, first, the acquisition unit 11 acquires, as the first data, three-dimensional coordinate values related to a plurality of measurement points belonging to the three-dimensional object 30 from the measurement device 20, which performs three-dimensional measurement of the three-dimensional object 30 including the plurality of surfaces 3. Next, in this modeling method, for each surface 3, the surface extraction unit 12 generates a plane equation representing the surface 3 using the first data regarding the measurement points belonging to that surface 3.
- the vertex extraction unit 13 then extracts, as vertices shared by a plurality of surfaces, the points that simultaneously satisfy the plane equations representing a plurality of adjacent surfaces. Further, the modeling unit 14 generates a model representing the three-dimensional object 30 using the plane equations and the vertices.
- The surface extraction unit 12 takes as a candidate plane the plane defined by a plane equation obtained from the coordinate values of three measurement points belonging to one surface, and then determines the final plane equation by recalculation using the plurality of measurement points that lie within a predetermined distance range of the candidate plane.
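A minimal sketch of this two-stage plane extraction follows. It is illustrative only: the threshold value, the function names, and the refit form `z = p*x + q*y + r` (which fails for vertical walls) are assumptions, not the patent's method.

```python
def plane_from_three_points(p1, p2, p3):
    # Candidate plane: unit normal from the cross product of two edge vectors,
    # returned as (a, b, c, d) with a*x + b*y + c*z + d = 0.
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    a = u[1] * v[2] - u[2] * v[1]
    b = u[2] * v[0] - u[0] * v[2]
    c = u[0] * v[1] - u[1] * v[0]
    n = (a * a + b * b + c * c) ** 0.5
    a, b, c = a / n, b / n, c / n
    return (a, b, c, -(a * p1[0] + b * p1[1] + c * p1[2]))

def refine_plane(points, candidate, threshold):
    # Keep only measurement points within `threshold` of the candidate plane,
    # then refit z = p*x + q*y + r to them by least squares (Cramer's rule
    # on the 3x3 normal equations).
    a, b, c, d = candidate
    inliers = [p for p in points
               if abs(a * p[0] + b * p[1] + c * p[2] + d) <= threshold]
    sxx = sum(x * x for x, y, z in inliers)
    sxy = sum(x * y for x, y, z in inliers)
    syy = sum(y * y for x, y, z in inliers)
    sx = sum(x for x, y, z in inliers)
    sy = sum(y for x, y, z in inliers)
    sxz = sum(x * z for x, y, z in inliers)
    syz = sum(y * z for x, y, z in inliers)
    sz = sum(z for x, y, z in inliers)
    n = len(inliers)
    det = (sxx * (syy * n - sy * sy) - sxy * (sxy * n - sy * sx)
           + sx * (sxy * sy - syy * sx))
    p = (sxz * (syy * n - sy * sy) - sxy * (syz * n - sy * sz)
         + sx * (syz * sy - syy * sz)) / det
    q = (sxx * (syz * n - sz * sy) - sxz * (sxy * n - sy * sx)
         + sx * (sxy * sz - syz * sx)) / det
    r = (sxx * (syy * sz - sy * syz) - sxy * (sxy * sz - sx * syz)
         + sxz * (sxy * sy - syy * sx)) / det
    return p, q, r  # refined plane: z = p*x + q*y + r

# Demo: nine points on the plane z = 1 plus one outlier at z = 5;
# the outlier is rejected by the distance test before the refit.
cand = plane_from_three_points((0, 0, 1), (1, 0, 1), (0, 1, 1))
pts = [(x, y, 1.0) for x in range(3) for y in range(3)] + [(0.0, 0.0, 5.0)]
print(refine_plane(pts, cand, 0.1))  # → (0.0, 0.0, 1.0)
```

The recalculation over near-plane points plays the same role as the inlier refit in common RANSAC-style plane segmentation.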
- the layout simulator 40 of the present embodiment described above includes a display control unit 42, an article placement unit 43, a texture expression unit 44, and an attribute adjustment unit 45, as shown in FIG.
- The display control unit 42 displays a virtual space, which is computer graphics of the three-dimensional object 30, on the screen of the monitor device 41, using the model information generated by the modeling apparatus 10.
- The article placement unit 43 places a three-dimensional article 34 (see FIG. 9) having three-dimensional information at a desired position in the virtual space, and the texture expression unit 44 pastes texture information onto surfaces bounded by the boundary lines in the virtual space.
- The attribute adjustment unit 45 adjusts attributes including the positions of the article 34 and of the texture.
- Because the layout simulator 40 of this configuration pastes texture information onto the surfaces of the three-dimensional object 30 represented by the model, the appearance of the three-dimensional object 30 can be changed in the virtual space. This makes it possible to simulate, in the virtual space, the outcome of a planned room renovation.
Claims (14)
- A modeling apparatus comprising: an acquisition unit that acquires, as first data, three-dimensional coordinate values of a plurality of measurement points belonging to a three-dimensional object having a plurality of surfaces, from a measurement device that performs three-dimensional measurement of the three-dimensional object; a surface extraction unit that generates, for each of the surfaces, a plane equation representing the surface using the first data on the measurement points belonging to the surface; a vertex extraction unit that calculates points simultaneously satisfying the plane equations of adjacent ones of the surfaces, thereby extracting each such point as a vertex shared by those surfaces; and a modeling unit that generates information on a model representing the three-dimensional object using the plane equations and the vertices, wherein the surface extraction unit takes as a candidate plane the plane defined by a plane equation obtained from the coordinate values of three measurement points belonging to one surface, and determines the plane equation by recalculation using a plurality of measurement points existing within a predetermined distance range of the candidate plane.
- The modeling apparatus according to claim 1, wherein the three-dimensional object is a room surrounded by a floor surface, a ceiling surface, and a plurality of wall surfaces; the surface extraction unit generates plane equations respectively representing the floor surface, the ceiling surface, and the wall surfaces; the vertex extraction unit extracts vertices each shared by three adjacent surfaces among the floor surface, the ceiling surface, and the wall surfaces; and the modeling unit determines boundary lines each shared by two adjacent surfaces among the floor surface, the ceiling surface, and the wall surfaces.
- The modeling apparatus according to claim 1 or 2, wherein the measurement device has a function of outputting second data in which pixel values of an image of the three-dimensional object are associated with the respective measurement points, and the surface extraction unit comprises: an output unit that displays the second data on a screen of an output device; and an input unit that allows information designating the measurement points belonging to a surface to be input interactively from an input device with respect to the image displayed on the output device.
- The modeling apparatus according to claim 3, wherein the second data includes color information, the acquisition unit has a function of acquiring the second data from the measurement device, and the apparatus further comprises a color extraction unit that extracts, from the second data acquired by the acquisition unit, second data having color information within a set range and displays the extracted data on the screen of the output device.
- The modeling apparatus according to claim 3 or 4, wherein the vertex extraction unit allows a plurality of the surfaces sharing a vertex to be input interactively from the input device while the image is displayed on the output device, and calculates the three-dimensional coordinate values of the vertex common to the surfaces input from the input device by solving the simultaneous equations formed by their plane equations.
- The modeling apparatus according to any one of claims 1 to 4, wherein the vertex extraction unit calculates the three-dimensional coordinate values of vertices common to the surfaces by combining the surfaces three at a time and solving the simultaneous equations formed by their plane equations.
- The modeling apparatus according to claim 6, further comprising a verification unit that, for each vertex extracted by the vertex extraction unit, excludes the vertex when no measurement point exists on at least one of the surfaces sharing the vertex.
- The modeling apparatus according to claim 2, wherein the first data are coordinate values in an orthogonal coordinate system having, as one coordinate axis, the direction orthogonal to the floor surface, and the surface extraction unit obtains the frequency with which the measurement points appear in each of sections divided along the coordinate axis, and determines that the measurement points contained in a section belong to the floor surface when the frequency of the section exceeds a predetermined reference value and the section satisfies the conditions of the floor surface.
- The modeling apparatus according to claim 8, wherein the room has a structure in which the floor surface and the ceiling surface are parallel, and the surface extraction unit determines that the measurement points contained in a section belong to the ceiling surface when the frequency of the section exceeds the predetermined reference value, and the section lies above the section satisfying the conditions of the floor surface and satisfies the conditions of the ceiling surface.
- The modeling apparatus according to any one of claims 1 to 9, further comprising: a dimension calculation unit that obtains distances between vertices in the model; and a drawing generation unit that generates information describing a drawing of the three-dimensional object to which the distances obtained by the dimension calculation unit are applied.
- A three-dimensional model generation device comprising: the modeling apparatus according to any one of claims 1 to 10; and the measurement device.
- A modeling method in which: an acquisition unit acquires, as first data, three-dimensional coordinate values of a plurality of measurement points belonging to a three-dimensional object having a plurality of surfaces, from a measurement device that performs three-dimensional measurement of the three-dimensional object; next, for each of the surfaces, a surface extraction unit generates a plane equation representing the surface using the first data on the measurement points belonging to the surface; thereafter, a vertex extraction unit calculates points simultaneously satisfying the plane equations of adjacent ones of the surfaces, thereby extracting each such point as a vertex shared by those surfaces; and a modeling unit generates information on a model representing the three-dimensional object using the plane equations and the vertices, wherein the surface extraction unit takes as a candidate plane the plane defined by a plane equation obtained from the coordinate values of three measurement points belonging to one surface, and determines the plane equation by recalculation using a plurality of measurement points existing within a predetermined distance range of the candidate plane.
- A program causing a computer to function as a modeling apparatus comprising: an acquisition unit that acquires, as first data, three-dimensional coordinate values of a plurality of measurement points belonging to a three-dimensional object having a plurality of surfaces, from a measurement device that performs three-dimensional measurement of the three-dimensional object; a surface extraction unit that generates, for each of the surfaces, a plane equation representing the surface using the first data on the measurement points belonging to the surface; a vertex extraction unit that calculates points simultaneously satisfying the plane equations of adjacent ones of the surfaces, thereby extracting each such point as a vertex shared by those surfaces; and a modeling unit that generates information on a model representing the three-dimensional object using the plane equations and the vertices, wherein the surface extraction unit takes as a candidate plane the plane defined by a plane equation obtained from the coordinate values of three measurement points belonging to one surface, and determines the plane equation by recalculation using a plurality of measurement points existing within a predetermined distance range of the candidate plane.
- A layout simulator comprising: a display control unit that displays, on a screen of a monitor device, a virtual space that is computer graphics of the three-dimensional object, using the information on the model generated by the modeling apparatus according to any one of claims 1 to 10; an article placement unit that places a three-dimensional article having three-dimensional information at a desired position in the virtual space; a texture expression unit that pastes texture information onto the surfaces in the virtual space; and an attribute adjustment unit that adjusts attributes including the positions of the article and the texture.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014553557A JP5821012B2 (ja) | 2013-05-31 | 2014-05-30 | モデリング装置、3次元モデル生成装置、モデリング方法、プログラム、レイアウトシミュレータ |
US14/893,199 US9984177B2 (en) | 2013-05-31 | 2014-05-30 | Modeling device, three-dimensional model generation device, modeling method, program and layout simulator |
EP14804644.4A EP3007129A4 (en) | 2013-05-31 | 2014-05-30 | MODELING DEVICE, THREE-DIMENSIONAL MODEL GENERATION DEVICE, MODELING METHOD, PROGRAM AND LAYOUT SIMULATOR |
CN201480031260.3A CN105264566B (zh) | 2013-05-31 | 2014-05-30 | 建模装置、三维模型生成装置、建模方法和布局模拟器 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-115400 | 2013-05-31 | ||
JP2013115400 | 2013-05-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014192316A1 true WO2014192316A1 (ja) | 2014-12-04 |
Family
ID=51988366
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/002895 WO2014192316A1 (ja) | 2013-05-31 | 2014-05-30 | モデリング装置、3次元モデル生成装置、モデリング方法、プログラム、レイアウトシミュレータ |
Country Status (5)
Country | Link |
---|---|
US (1) | US9984177B2 (ja) |
EP (1) | EP3007129A4 (ja) |
JP (2) | JP5821012B2 (ja) |
CN (1) | CN105264566B (ja) |
WO (1) | WO2014192316A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105572725A (zh) * | 2016-02-17 | 2016-05-11 | 西南石油大学 | 一种地面微地震监测台站分布设计方法 |
KR20160073025A (ko) * | 2014-12-16 | 2016-06-24 | 이주성 | 실측을 통한 증강현실기반의 객체 생성장치 및 그 방법 |
JP2016212086A (ja) * | 2015-04-28 | 2016-12-15 | 三菱電機株式会社 | シーン内の寸法を求める方法 |
JP2018124985A (ja) * | 2017-01-31 | 2018-08-09 | 三菱電機株式会社 | 平面セグメントを用いて点群を完成させる方法およびシステム |
JP2019105876A (ja) * | 2017-12-08 | 2019-06-27 | 株式会社Lifull | 情報処理装置、情報処理方法、及び情報処理用プログラム |
JP2020017276A (ja) * | 2018-07-23 | 2020-01-30 | 3アイ インコーポレイテッド | 適応的三次元空間生成方法及びそのシステム |
Families Citing this family (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101679741B1 (ko) * | 2015-05-06 | 2016-11-28 | 고려대학교 산학협력단 | 외곽 공간 특징 정보 추출 방법 |
CN108510433B (zh) * | 2017-02-28 | 2020-03-24 | 贝壳找房(北京)科技有限公司 | 空间展示方法、装置及终端 |
CN108736993B (zh) * | 2017-04-17 | 2022-01-25 | 中兴通讯股份有限公司 | 一种可见面判定方法、反向射线追踪方法及装置 |
JP6969157B2 (ja) * | 2017-05-24 | 2021-11-24 | 富士フイルムビジネスイノベーション株式会社 | 三次元形状データの編集装置、及び三次元形状データの編集プログラム |
JP7133971B2 (ja) * | 2018-04-27 | 2022-09-09 | 清水建設株式会社 | 3次元モデル生成装置及び3次元モデル生成方法 |
CN111246197B (zh) | 2018-05-06 | 2022-03-22 | Oppo广东移动通信有限公司 | 三维视频通信方法及系统、电子装置、服务器和可读存储介质 |
US10679372B2 (en) | 2018-05-24 | 2020-06-09 | Lowe's Companies, Inc. | Spatial construction using guided surface detection |
CN108961379B (zh) * | 2018-06-14 | 2023-03-31 | 广东韩丽家居集团股份有限公司 | 一种基于自动测量而定制家具的方法 |
CN108961395B (zh) * | 2018-07-03 | 2019-07-30 | 上海亦我信息技术有限公司 | 一种基于拍照重建三维空间场景的方法 |
JP7187234B2 (ja) * | 2018-09-28 | 2022-12-12 | エーティーラボ株式会社 | 三次元形状作成装置、三次元形状作成方法および三次元形状作成プログラム |
WO2020217651A1 (ja) * | 2019-04-25 | 2020-10-29 | パナソニックIpマネジメント株式会社 | 寸法測定装置及び荷物発送用ロッカー |
CN110704896A (zh) * | 2019-09-06 | 2020-01-17 | 久瓴(上海)智能科技有限公司 | 龙骨立柱模型和墙龙骨模型之间连接节点放置方法和产品 |
CN110704897A (zh) * | 2019-09-06 | 2020-01-17 | 久瓴(上海)智能科技有限公司 | 墙龙骨模型和底导梁模型之间的连接节点放置方法和产品 |
CN110704900B (zh) * | 2019-09-06 | 2023-11-21 | 久瓴(江苏)数字智能科技有限公司 | 龙骨立柱模型和墙龙骨模型之间连接节点放置方法和产品 |
CN111126450B (zh) * | 2019-11-29 | 2024-03-19 | 上海宇航系统工程研究所 | 一种基于九线构型的长方体空间飞行器的建模方法及装置 |
US11763478B1 (en) | 2020-01-17 | 2023-09-19 | Apple Inc. | Scan-based measurements |
US11551422B2 (en) * | 2020-01-17 | 2023-01-10 | Apple Inc. | Floorplan generation based on room scanning |
JP7433121B2 (ja) | 2020-04-07 | 2024-02-19 | 株式会社キーエンス | 変位測定装置 |
CN112231787B (zh) * | 2020-10-16 | 2024-04-19 | 深圳金装科技装饰工程有限公司 | 一种应用于家装系统中墙体辅助绘制方法、装置 |
CN113935097B (zh) * | 2021-10-26 | 2022-12-06 | 山东同圆数字科技有限公司 | 一种基于bim引擎数据的建筑空间分析方法及系统 |
CN114332428B (zh) * | 2021-12-30 | 2022-08-26 | 北京发现角科技有限公司 | 虚拟房屋房间分割效果的实现方法和装置 |
CN116561995B (zh) * | 2023-04-25 | 2024-03-08 | 国网黑龙江省电力有限公司经济技术研究院 | 一种基于仿真建模的共享杆塔安全使用检测方法 |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04133184A (ja) | 1990-09-26 | 1992-05-07 | Secom Co Ltd | 室内3次元モデル作成方法とその装置 |
Family Cites Families (25)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61175778A (ja) | 1985-01-30 | 1986-08-07 | Fujitsu Ltd | 形状モデル作成方法 |
JPH07152810A (ja) * | 1993-11-26 | 1995-06-16 | Toshiba Corp | 環境モデル作成装置 |
JP3030485B2 (ja) * | 1994-03-17 | 2000-04-10 | 富士通株式会社 | 3次元形状抽出方法及び装置 |
JP4025442B2 (ja) * | 1998-12-01 | 2007-12-19 | 富士通株式会社 | 三次元モデル変換装置及び方法 |
JP2000306111A (ja) | 1999-04-16 | 2000-11-02 | Armonicos:Kk | 輪郭線抽出装置、輪郭線抽出方法、輪郭線抽出プログラムを記録したコンピュータ読み取り可能な記録媒体及び輪郭線データを記録したコンピュータ読み取り可能な記録媒体 |
JP2003006680A (ja) * | 2001-06-20 | 2003-01-10 | Zenrin Co Ltd | 3次元電子地図データの生成方法 |
JP2003156333A (ja) | 2001-11-21 | 2003-05-30 | Nikon Corp | 測距装置、並びに、これを用いた室内計測装置及びトータルステーション |
JP2003323461A (ja) * | 2002-05-01 | 2003-11-14 | Mitsubishi Heavy Ind Ltd | Cadデータ作成装置および情報加工方法 |
AU2003248269A1 (en) * | 2002-07-12 | 2004-02-02 | Iwane Laboratories, Ltd. | Road and other flat object video plan-view developing image processing method, reverse developing image conversion processing method, plan-view developing image processing device, and reverse developing image conversion processing device |
JP3965686B2 (ja) * | 2002-12-19 | 2007-08-29 | 株式会社日立製作所 | 視覚情報処理装置および適用システム |
JP2004206551A (ja) * | 2002-12-26 | 2004-07-22 | Toppan Printing Co Ltd | 三次元内装空間シミュレーションシステム |
JP2006003263A (ja) * | 2004-06-18 | 2006-01-05 | Hitachi Ltd | 視覚情報処理装置および適用システム |
US7728833B2 (en) | 2004-08-18 | 2010-06-01 | Sarnoff Corporation | Method for generating a three-dimensional model of a roof structure |
JP2006058244A (ja) | 2004-08-23 | 2006-03-02 | 3D Media Co Ltd | 画像処理装置及びコンピュータプログラム |
US8805894B2 (en) * | 2004-11-05 | 2014-08-12 | Michael Valdiserri | Interactive 3-dimensional object-oriented database information storage/retrieval system |
US8248403B2 (en) * | 2005-12-27 | 2012-08-21 | Nec Corporation | Data compression method and apparatus, data restoration method and apparatus, and program therefor |
CN1945213B (zh) * | 2006-11-02 | 2010-12-22 | 武汉大学 | 基于可量测实景图像的可视化位置服务的实现方法 |
CN101034208A (zh) * | 2007-04-04 | 2007-09-12 | 大连东锐软件有限公司 | 三维仿真沙盘系统 |
US8107735B2 (en) * | 2007-04-10 | 2012-01-31 | Denso Corporation | Three dimensional shape reconstitution device and estimation device |
JP4418841B2 (ja) * | 2008-01-24 | 2010-02-24 | キヤノン株式会社 | 作業装置及びその校正方法 |
US8368712B2 (en) * | 2008-08-28 | 2013-02-05 | Pixar | Mesh transfer in n-D space |
CN101551916B (zh) * | 2009-04-16 | 2012-02-29 | 浙江大学 | 一种基于本体技术的三维场景建模方法及系统 |
JP5343042B2 (ja) * | 2010-06-25 | 2013-11-13 | 株式会社トプコン | 点群データ処理装置および点群データ処理プログラム |
JP5462093B2 (ja) | 2010-07-05 | 2014-04-02 | 株式会社トプコン | 点群データ処理装置、点群データ処理システム、点群データ処理方法、および点群データ処理プログラム |
CN101887597B (zh) * | 2010-07-06 | 2012-07-04 | 中国科学院深圳先进技术研究院 | 建筑物三维模型构建方法及系统 |
-
2014
- 2014-05-30 US US14/893,199 patent/US9984177B2/en active Active
- 2014-05-30 JP JP2014553557A patent/JP5821012B2/ja not_active Expired - Fee Related
- 2014-05-30 EP EP14804644.4A patent/EP3007129A4/en not_active Ceased
- 2014-05-30 WO PCT/JP2014/002895 patent/WO2014192316A1/ja active Application Filing
- 2014-05-30 CN CN201480031260.3A patent/CN105264566B/zh not_active Expired - Fee Related
-
2015
- 2015-04-28 JP JP2015092023A patent/JP6286805B2/ja not_active Expired - Fee Related
Patent Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH04133184A (ja) | 1990-09-26 | 1992-05-07 | Secom Co Ltd | 室内3次元モデル作成方法とその装置 |
Non-Patent Citations (3)
Title |
---|
See also references of EP3007129A4 |
WANG CAIHUA: "Extraction of Polyhedral Description from Panoramic Range Data in Polar Coordinate System", IPSJ SIG NOTES, 22 January 2002 (2002-01-22) - 25 January 2002 (2002-01-25), pages 1 - 7, XP055254028 * |
WANG CAIHUA: "Scene Rikai no Tameno Range Data kara no Tamentai Kijutsu no Chushutsu [Extraction of Polyhedral Descriptions from Range Data for Scene Understanding]", THE INSTITUTE OF ELECTRICAL ENGINEERS OF JAPAN KENKYUKAI SHIRYO, vol. IIS-00-2, 11 August 2000 (2000-08-11), pages 13 - 18, XP008179218 *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20160073025A (ko) * | 2014-12-16 | 2016-06-24 | 이주성 | 실측을 통한 증강현실기반의 객체 생성장치 및 그 방법 |
KR101665399B1 (ko) * | 2014-12-16 | 2016-10-13 | 이주성 | 실측을 통한 증강현실기반의 객체 생성장치 및 그 방법 |
JP2016212086A (ja) * | 2015-04-28 | 2016-12-15 | 三菱電機株式会社 | シーン内の寸法を求める方法 |
CN105572725A (zh) * | 2016-02-17 | 2016-05-11 | 西南石油大学 | 一种地面微地震监测台站分布设计方法 |
JP2018124985A (ja) * | 2017-01-31 | 2018-08-09 | 三菱電機株式会社 | 平面セグメントを用いて点群を完成させる方法およびシステム |
JP2019105876A (ja) * | 2017-12-08 | 2019-06-27 | 株式会社Lifull | 情報処理装置、情報処理方法、及び情報処理用プログラム |
JP2020017276A (ja) * | 2018-07-23 | 2020-01-30 | 3アイ インコーポレイテッド | 適応的三次元空間生成方法及びそのシステム |
Also Published As
Publication number | Publication date |
---|---|
JP6286805B2 (ja) | 2018-03-07 |
US20160092608A1 (en) | 2016-03-31 |
JP2015165420A (ja) | 2015-09-17 |
EP3007129A1 (en) | 2016-04-13 |
EP3007129A4 (en) | 2016-07-27 |
US9984177B2 (en) | 2018-05-29 |
CN105264566A (zh) | 2016-01-20 |
CN105264566B (zh) | 2018-06-12 |
JPWO2014192316A1 (ja) | 2017-02-23 |
JP5821012B2 (ja) | 2015-11-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6286805B2 (ja) | モデリング装置、3次元モデル生成装置、モデリング方法、プログラム、レイアウトシミュレータ | |
Hong et al. | Semi-automated approach to indoor mapping for 3D as-built building information modeling | |
JP6238183B2 (ja) | モデリング装置、3次元モデル生成装置、モデリング方法、プログラム | |
JP7294396B2 (ja) | 画像処理装置、画像処理方法、及びプログラム | |
JP6489552B2 (ja) | シーン内の寸法を求める方法 | |
JP5799273B2 (ja) | 寸法計測装置、寸法計測方法、寸法計測システム、プログラム | |
US20150302636A1 (en) | 3d modeling and rendering from 2d images | |
CN111033536A (zh) | 用于在施工现场生成自适应投射现实的方法和系统 | |
Hołowko et al. | Application of multi-resolution 3D techniques in crime scene documentation with bloodstain pattern analysis | |
JP2015201850A (ja) | 撮像装置のパラメータ推定方法 | |
KR20140102108A (ko) | 3차원 실내공간 정보 구축을 위한 라이다 데이터 모델링 방법 및 시스템 | |
JP6185385B2 (ja) | 空間構造推定装置、空間構造推定方法及び空間構造推定プログラム | |
WO2020075252A1 (ja) | 情報処理装置、プログラム及び情報処理方法 | |
Hübner et al. | Evaluation of the microsoft hololens for the mapping of indoor building environments | |
JP6132246B2 (ja) | 寸法計測方法 | |
KR101189167B1 (ko) | 메타정보 없는 단일 영상에서 3차원 개체정보 추출방법 | |
JP2021064267A (ja) | 画像処理装置、及び画像処理方法 | |
Kaiser et al. | Automated Alignment of Local Point Clouds in Digital Building Models | |
Sgherri et al. | The Fortress of Riolo Terme, near Ravenna: digital survey and 3D printing for cultural dissemination | |
Hirose | Simple room shape modeling with sparse 3D point information using photogrammetry and application software | |
Bosché et al. | LASER SCANNING FOR BIM. | |
Yonezawa et al. | Real-time 3D data reduction and reproduction of spatial model using line detection in RGB image | |
Shashkov et al. | Semi-autonomous digitization of real-world environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480031260.3 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2014553557 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14804644 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014804644 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14893199 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |