JP4284644B2 - 3D model construction system and 3D model construction program - Google Patents

3D model construction system and 3D model construction program

Info

Publication number
JP4284644B2
Authority
JP
Japan
Prior art keywords
data
point
dimensional
image data
object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
JP2003146531A
Other languages
Japanese (ja)
Other versions
JP2004348575A (en)
Inventor
卉菁 趙 (Huijing Zhao)
亮介 柴崎 (Ryosuke Shibasaki)
Original Assignee
Foundation for the Promotion of Industrial Science (財団法人生産技術研究奨励会)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Foundation for the Promotion of Industrial Science (財団法人生産技術研究奨励会)
Priority to JP2003146531A
Publication of JP2004348575A
Application granted
Publication of JP4284644B2
Application status: Expired - Fee Related
Anticipated expiration

Description

[0001]
BACKGROUND OF THE INVENTION
The present invention relates to a three-dimensional model construction system and a three-dimensional model construction program for constructing a three-dimensional model of an object from three-dimensional point cloud data and image data acquired by a laser scanner or the like.
[0002]
[Prior art]
The real world consists of diverse and complex objects, as typified by urban spaces composed of buildings, roads, signs, and various facilities. Many fields, including computer graphics, measure such objects and use them as three-dimensional models. Because the cost that can be spent on a three-dimensional model is limited in most of these fields, a method for efficiently measuring and modeling three-dimensional shapes is indispensable. For cities, many practical methods have been established for acquiring 3D data from the air using techniques such as aerial photogrammetry, and these are excellent at covering wide areas efficiently. However, future uses of 3D spatial data are expected to demand detailed 3D representations from the viewpoint of pedestrians and drivers on the ground with increasing frequency. It is therefore necessary to model complex landmark objects such as traffic lights more accurately.
[0003]
On the other hand, systems have been developed that acquire three-dimensional information from the ground using techniques such as stereo photography and moving images and extract signboards, signs, utility poles, and road white lines as landmarks (e.g., Patent Documents 1 and 2). In recent years, laser range scanners using eye-safe lasers have fallen in price and are being applied to three-dimensional measurement in urban areas, large indoor environments, and historic ruins. Because a laser range scanner acquires the shape of an object directly and with high accuracy as point cloud data (three-dimensional point cloud data), it has been applied to the modeling of relatively large, simply shaped objects (e.g., Patent Document 3).
[0004]
[Patent Document 1]
JP 2002-081941 A
[Patent Document 2]
JP 09-319896 A
[Patent Document 3]
JP 2003-021507 A
[0005]
[Problems to be solved by the invention]
However, because point cloud data has limited spatial resolution, it is not always easy to automate the process of generating surfaces from a point cloud and building a three-dimensional model. Moreover, it is very difficult to automatically recognize objects from images and point cloud data. Model creation techniques designed around full automation are therefore very effective for targets that are easy to automate, but in most scenes simple and complex objects are mixed, so the scope of application is limited to a small portion of the scene. The many remaining objects, and those for which automation fails, must be modeled manually, so in most cases the efficiency gains of automation can hardly be realized. For this reason, a method is needed that exploits humans' superior ability to identify and classify objects while simplifying and streamlining the human work itself.
[0006]
The present invention has been made in view of such circumstances, and its purpose is to provide a three-dimensional model construction system and a three-dimensional model construction program capable of easily constructing a three-dimensional model of an object from three-dimensional point cloud data and image data.
[0007]
[Means for Solving the Problems]
The present invention is a three-dimensional model construction system in which image data and three-dimensional point cloud data are stored in association with each other and a three-dimensional model is constructed from these data, comprising: a data file for storing three-dimensional model data; display means for displaying the image data and the three-dimensional point cloud data on a screen; designation means for designating a polygonal area on the screen on which the image data is displayed; extraction means for extracting the three-dimensional point cloud data included in the polygonal area; and plane object generation means for specifying a plane object by obtaining three-dimensional coordinate values of the polygonal area based on the extracted three-dimensional point cloud data and writing it to the data file.
[0008]
The present invention is the above three-dimensional model construction system, further comprising: means for designating an end point or bending point of a line segment on the screen on which the image data is displayed; and line object generation means for specifying a line object by obtaining three-dimensional coordinate values of the line segment based on the three-dimensional point cloud data and writing it to the data file.
[0009]
The present invention is the above three-dimensional model construction system, further comprising: means for designating a feature point on the screen on which the image data is displayed; and point object generation means for specifying a point object by obtaining three-dimensional coordinate values of the feature point based on the three-dimensional point cloud data and writing it to the data file.
[0010]
The present invention is characterized in that the plane object generation means extracts the image data in the polygonal area and writes the image data to the data file as a texture.
[0011]
The present invention is characterized in that the three-dimensional model construction system further comprises data complementing means for complementing, based on the image data, three-dimensional polygonal plane data generated by other measuring means.
[0012]
The present invention is a three-dimensional model construction program in which image data and three-dimensional point cloud data are stored in association with each other, a three-dimensional model is constructed from these data, and the obtained three-dimensional model data is written to a data file, the program causing a computer to perform: a display process for displaying the image data and the three-dimensional point cloud data on a screen; a designation process for designating a polygonal area on the screen on which the image data is displayed; an extraction process for extracting the three-dimensional point cloud data included in the polygonal area; and a plane object generation process for specifying a plane object by obtaining three-dimensional coordinate values of the polygonal area based on the extracted three-dimensional point cloud data and writing it to the data file.
[0013]
The present invention is the above three-dimensional model construction program, further causing the computer to perform: a process for designating an end point or bending point of a line segment on the screen on which the image data is displayed; and a line object generation process for specifying a line object by obtaining three-dimensional coordinate values of the line segment based on the three-dimensional point cloud data and writing it to the data file.
[0014]
The present invention is the above three-dimensional model construction program, further causing the computer to perform: a process for designating a feature point on the screen on which the image data is displayed; and a point object generation process for specifying a point object by obtaining three-dimensional coordinate values of the feature point based on the three-dimensional point cloud data and writing it to the data file.
[0015]
The present invention is characterized in that the plane object generation process extracts the image data in the polygonal area and writes the image data to the data file as a texture.
[0016]
The present invention is characterized in that the three-dimensional model construction program further causes the computer to perform a data complementing process for complementing, based on the image data, three-dimensional polygonal plane data generated by other measuring means.
[0017]
DETAILED DESCRIPTION OF THE INVENTION
Hereinafter, a three-dimensional model construction system according to an embodiment of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing the configuration of the embodiment. In this figure, reference numeral 1 denotes an image data file storing image data of the modeling target object obtained with line CCD cameras. Reference numeral 2 denotes a distance measurement data file storing distance measurement data of the modeling target object acquired with laser range scanners. Reference numeral 3 denotes a position/orientation data file storing the positions and orientations of the line CCD cameras and laser range scanners at the time the image data and distance measurement data were acquired. Reference numeral 4 denotes a data integration unit that converts the distance measurement data into coordinate values in the earth coordinate system based on the position and orientation data, and integrates the acquired data by associating the image data with those earth-coordinate values. Reference numeral 5 denotes a three-dimensional point cloud data file storing the measurement point group of the modeling target object converted into the earth coordinate system by the data integration unit 4 (referred to herein as three-dimensional point cloud data). Reference numeral 6 denotes an integrated image data file storing image data that the data integration unit 4 has corrected based on the position/orientation data stored in the position/orientation data file 3 while associating the image data stored in the image data file 1 with the distance measurement data stored in the distance measurement data file 2 (referred to herein as integrated image data).
[0018]
Reference numeral 7 denotes a model generation unit that generates three-dimensional model data using the three-dimensional point cloud data and the integrated image data. Reference numeral 8 denotes a three-dimensional model data file that stores the generated three-dimensional model data. Reference numeral 9 denotes an input unit comprising a keyboard and a mouse. Reference numeral 10 denotes a screen processing unit that performs screen display and processes position designations on the screen. Reference numeral 11 denotes a display unit capable of displaying images. Reference numeral 12 denotes a plane object generation unit that generates the plane objects constituting the three-dimensional model. Reference numeral 13 denotes a line object generation unit that generates the line objects constituting the three-dimensional model. Reference numeral 14 denotes a point object generation unit that generates the point objects constituting the three-dimensional model. Reference numeral 15 denotes a data complementing unit that complements data based on the integrated image data. Reference numeral 16 denotes a data reading unit that reads the integrated image data and the three-dimensional point cloud data from the integrated image data file 6 and the three-dimensional point cloud data file 5. Reference numeral 17 denotes a logo database in which data such as signboards serving as landmarks are stored in advance. Reference numeral 18 denotes a three-dimensional polygonal plane data file in which three-dimensional polygonal plane data generated by other measuring means is stored.
[0019]
Next, the method of acquiring the data stored in the image data file 1, the distance measurement data file 2, and the position/orientation data file 3 will be briefly described. Data are acquired by mounting three laser range scanners, six line CCD cameras, and a navigation device (GPS/INS/Odometer) on the roof of a measurement vehicle and collecting measurements while the vehicle travels. The laser range scanners used here are of the single-scan type with a measurement rate of 20 Hz (20 rotations per second); each rotation measures 480 laser range points over a 300-degree measurement range within a distance of 70 m. Each line CCD camera is fitted with an 8 mm F4 fisheye lens and has a measurement rate of about 80 Hz (80 lines per second); each line image captures 2048 RGB pixels over 180 degrees. The measurement planes of the laser range scanners and line CCD cameras are arranged vertically at 45, 90, and 135 degrees to the traveling direction of the vehicle, so that cross sections of roadside objects are measured from three directions while the vehicle runs. The laser range data and line CCD images measured at each point are then integrated into the earth coordinate system using the position, orientation, and time data output from the navigation device and the relative orientation elements between the sensors. FIG. 7 shows the layout of the three laser range scanners, six line CCD cameras, and navigation device (GPS/INS/Odometer) on the roof of the measurement vehicle. The data acquisition method described in Japanese Patent Laid-Open No. 2002-031528 is applied to acquire the data stored in the image data file 1, the distance measurement data file 2, and the position/orientation data file 3 shown in FIG. 1, so a detailed description of its operation is omitted.
[0020]
Next, the operation of generating the three-dimensional point cloud data file 5 and the integrated image data file 6 from the data stored in the image data file 1, the distance measurement data file 2, and the position/orientation data file 3 will be described. First, the data integration unit 4 refers to the position and orientation data and corrects the distortion of the image data. Image data measured by a line CCD mounted on the measurement vehicle forms a two-dimensional image as the vehicle travels; however, the roll, pitch, and bounce (vertical vibration) of the vehicle and changes in its traveling speed distort the image. This distortion can be corrected by applying a correction based on the position and orientation data to each pixel. Subsequently, the data integration unit 4 refers to the position and orientation data and converts all the distance measurement data of the points measured by the three laser range scanners into coordinate values in a single earth coordinate system, yielding the three-dimensional point cloud data.
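The conversion into the earth coordinate system described above amounts to a rigid-body transform of each scanner-frame point by the position and attitude reported by the navigation device. The following is a minimal sketch of that idea, not the patent's implementation; the function names and the Z-Y-X (yaw-pitch-roll) Euler convention are assumptions.

import numpy as np

def rotation_matrix(roll: float, pitch: float, yaw: float) -> np.ndarray:
    # Attitude matrix from the navigation device's angles,
    # assuming (for illustration) a Z-Y-X Euler convention.
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def scanner_to_earth(points_scanner: np.ndarray,
                     sensor_position: np.ndarray,
                     roll: float, pitch: float, yaw: float) -> np.ndarray:
    # Transform an (N, 3) array of scanner-frame points into the earth
    # coordinate system: rotate by the attitude, then translate by the
    # sensor position obtained from GPS/INS.
    R = rotation_matrix(roll, pitch, yaw)
    return points_scanner @ R.T + sensor_position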
[0021]
Next, the data integration unit 4 projects each distance measurement datum (three-dimensional point cloud data) onto the image data photographed on the measurement plane in the same direction. For example, the distance measurement data (three-dimensional point cloud data) measured by the second laser range scanner are projected onto the second and fifth line CCD images. For each laser range scan line i (one rotation of data), the image data j2 and j5 whose shooting positions most closely coincide are searched for among the image data of the second and fifth line CCDs. Then, for each laser range point s in scan line i, the pixel t that most closely matches its measurement direction is searched for in j2 and j5, and the laser range point s is projected onto the pixel t. The results are stored in the integrated image data file 6 and the three-dimensional point cloud data file 5.
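The search for the pixel "that most closely matches the measurement direction" can be realized by comparing unit direction vectors. Below is a minimal sketch under that assumption; the function name and array layout are illustrative, not from the patent.

import numpy as np

def best_matching_pixel(point_dir: np.ndarray, pixel_dirs: np.ndarray) -> int:
    # point_dir  : unit 3-vector from the sensor toward the laser point.
    # pixel_dirs : (M, 3) unit viewing directions of the candidate pixels
    #              in line CCD images such as j2 and j5.
    # For unit vectors, the largest dot product is the smallest angle,
    # so the argmax picks the pixel closest to the measurement direction.
    return int(np.argmax(pixel_dirs @ point_dir))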
[0022]
Through this data integration processing, the coordinate values of the measurement points in the earth coordinate system are stored in the three-dimensional point cloud data file 5, and the measurement points and the image data are stored in association with each other in the integrated image data file 6.
The method of generating the three-dimensional point cloud data file 5 and the integrated image data file 6 is not limited to the method described above; data generated by other methods may be used.
When the three-dimensional polygonal plane data file 18 generated by other measuring means is to be complemented based on the CCD images, the data integration unit 4 projects the three-dimensional polygonal plane data onto the CCD image data. First, for the center point p of a three-dimensional polygonal plane f, the pixel t that most closely matches the shooting direction is searched for in each line CCD image, and it is verified whether the distance measurement datum s associated with pixel t matches the distance d from the shooting center of pixel t to p. Among the line CCD images that pass this projection verification, the image j with the shortest distance d and the largest incident angle with respect to the plane f is associated with the polygonal plane f. Next, for each vertex of the polygonal plane f, the pixel with the best-matching shooting direction is found in image j and associated with that vertex. The result is stored in the integrated image data file 6.
Note that the method of integrating the three-dimensional polygonal plane data file 18 generated by other measuring means with the image data file 1, and of generating the integrated image data file 6, is not limited to the method described above; data generated by other methods may be used.
[0023]
Here, the three-dimensional model generated by the method of the present invention will be described. The three-dimensional model is composed of a combination of plane objects expressed by polygonal planes, line objects expressed by line segments, and point objects expressed by feature points. The image data are used as textures to achieve a more realistic appearance.
[0024]
Next, the operation in which the model generation unit 7 shown in FIG. 1 generates a three-dimensional model from the data stored in the three-dimensional point cloud data file 5 and the integrated image data file 6 will be described, with reference to FIGS. 2 to 6 to simplify the explanation of the operating principle. FIG. 2 shows an example of image data stored in the integrated image data file 6. FIG. 3 plots, in the earth coordinate system, the laser range data obtained by measuring the building imaged in FIG. 2 with a laser range scanner installed at the measurement position shown in FIG. 2; in this example the data are displayed on the display unit 11 after a coordinate conversion that raises the viewpoint above the measurement position. The image data in FIG. 2 are likewise shown from a viewpoint higher than that of the actual image data to make the state of the building easier to understand. The image obtained by capturing the measurement range scanned by the laser range scanner with a line CCD camera installed at the measurement position is stored in the integrated image data file 6.
[0025]
Next, the operation of generating each object will be described. First, when the operator inputs an instruction to generate a three-dimensional model from the input unit 9, the model generation unit 7 instructs the data reading unit 16 to read the image data and the three-dimensional point cloud data. In response, the data reading unit 16 reads the image data shown in FIG. 2 and the three-dimensional point cloud data shown in FIG. 3 and passes them to the model generation unit 7, which outputs the received data to the screen processing unit 10. The screen processing unit 10 applies an arbitrary display position conversion to the three-dimensional point cloud data received from the model generation unit 7 and then outputs the data to the display unit 11 for display. As a result, the images shown in FIGS. 2 and 3 are displayed simultaneously on the display unit 11.
[0026]
Next, the operator designates generation of a plane object from the input unit 9. In response, the model generation unit 7 displays a message on the display unit 11 prompting the operator to designate the vertices of a polygon. The operator then uses the mouse of the input unit 9 to designate, on the screen showing the image data (FIG. 2), the vertex positions of the polygon indicating the extent of the plane object; here it is assumed that the positions of symbols A, B, C, and D shown in FIG. 4 are designated. The screen processing unit 10 reads the coordinate values on the screen and notifies the model generation unit 7, which passes the four coordinate values to the plane object generation unit 12. In response, the plane object generation unit 12 extracts the laser range points projected within the polygonal area specified by the vertex coordinate values and fits a spatial plane to them. FIG. 5 shows an example of the screen display when the laser range points projected within the designated polygonal area have been extracted and a spatial plane has been fitted.
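Fitting a spatial plane to the extracted laser range points is a classical least-squares problem. One standard solution is sketched below, under the assumption that an SVD-based total least-squares fit is acceptable; the patent names the least squares method but does not specify this particular formulation.

import numpy as np

def fit_plane(points: np.ndarray):
    # Total least-squares plane fit to an (N, 3) array of laser range
    # points. The plane passes through the centroid; its normal is the
    # right singular vector for the smallest singular value, i.e. the
    # direction of least variance of the point cloud.
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]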
[0027]
Next, the plane object generation unit 12 projects the pixels of the image data within the polygonal area onto the spatial plane along their projection direction lines and cuts out the texture of that plane from the image data. It then projects the pixels corresponding to the vertices of the polygonal area onto the plane along their projection direction lines to obtain the vertices of the spatial plane polygon, and transfers them to the model generation unit 7.
[0028]
Next, the model generation unit 7 matches the clipped texture image against the data stored in the existing logo database 17, determines a type such as shop signboard or road traffic sign, and assigns the attribute to the plane object. If the matching fails, the model generation unit 7 assigns an attribute entered from the input unit 9. The model generation unit 7 then writes the resulting plane object to the three-dimensional model data file 8. When the three-dimensional model data is displayed, the plane object indicated by reference symbol P1 in FIG. 6 appears; this plane object is not merely a polygon but also uses the image data as a texture.
[0029]
Since most planar objects in urban space fall into two types, nearly vertical and nearly horizontal, plane fitting may be performed as follows, which is particularly robust where clutter such as pedestrians, passing vehicles, trees, utility poles, and electric wires is present. For a nearly vertical plane object, the extracted laser range points are first projected onto a grid in the horizontal plane to create an image called a Z-image; the value of each pixel in the Z-image is the number of laser range points projected onto the corresponding grid cell. A line segment in the Z-image represents a vertical plane in space. In this process, vertical planes, which accumulate many points along the vertical direction, are strongly emphasized, while other objects are weakened. The sharpest and longest line segment is then extracted from the Z-image, and the plane parameters are obtained by the least squares method from the laser range points projected around that line segment. For a nearly horizontal plane object, a histogram of the elevation values of the extracted laser range points is created to find the elevation of the highest peak; the laser range points whose elevations lie within an allowable range centered on that value are extracted, and the plane parameters are obtained by the least squares method. A sketch of the Z-image accumulation step follows the note below.
Note that although only planes have been described here, curved objects such as cylinders and spheres may be generated by applying similar processing.
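As an illustration of the Z-image described above, the accumulation step can be written as a 2D histogram over a horizontal grid. This is a sketch only; the cell size and function name are assumptions.

import numpy as np

def z_image(points: np.ndarray, cell: float = 0.1) -> np.ndarray:
    # Project earth-frame points onto a horizontal grid and count the
    # points per cell. Columns of points on a vertical wall pile up into
    # bright line-like features, while pedestrians, trees, and other
    # clutter remain faint -- which is what lets the vertical plane be
    # found as the sharpest, longest line segment in the image.
    x, y = points[:, 0], points[:, 1]
    nx = max(1, int(np.ceil((x.max() - x.min()) / cell)))
    ny = max(1, int(np.ceil((y.max() - y.min()) / cell)))
    img, _, _ = np.histogram2d(x, y, bins=(nx, ny))
    return img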
[0030]
Next, the operator designates generation of a line object from the input unit 9. In response, the model generation unit 7 displays a message on the display unit 11 prompting the operator to designate the bending points or end points of the line object. The operator then designates the bending point or end point positions with the mouse of the input unit 9 on the screen showing the image data (FIG. 2); here it is assumed that the positions of symbols J and K in FIG. 4 (an outdoor light) are designated. The screen processing unit 10 reads the coordinate values on the screen and notifies the model generation unit 7, which passes the designated coordinate values to the line object generation unit 13. In response, the line object generation unit 13 takes the center point of the laser range points (three-dimensional point cloud data) projected within a predetermined allowable range around the pixel corresponding to each bending point or end point, and uses it as the bending point or end point of the line object in space. In such processing, however, a wrong point may be extracted because of pedestrians, passing vehicles, trees, errors introduced during data integration, or measurement errors caused by direct sunlight. The following auxiliary procedure therefore ensures reliable point extraction even where miscellaneous data are mixed in.
[0031]
The line object generation unit 13 extracts the laser range points (three-dimensional point cloud data) projected within a predetermined allowable range around the plurality of designated line segments, obtains a spatial plane by the least squares method, and uses it as a projection plane. The projection plane is then divided into a grid, and the extracted laser range points are projected onto it to generate an image called a hybrid Z-image; the value of each pixel in the hybrid Z-image is thus the number of laser range points projected onto the corresponding grid cell. This processing allows the shape of multiple line segments in three-dimensional space to be expressed, as far as possible, as a two-dimensional image. The previously determined bending points or end points are then projected onto the hybrid Z-image, and it is checked whether they match the line segment shape represented by the laser range points. If they do not match, the operator corrects the polyline projected on the hybrid Z-image. For each corrected bending point or end point, the center point of the laser range points projected onto the corresponding part of the hybrid Z-image is obtained and used as the bending point or end point in space. The set of line segments specified by these bending points and end points constitutes the line object.
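The basic step of turning a clicked image position into a spatial end point, taking the center of the laser range points projected within the allowable range, might look as follows. This is a sketch; the pixel tolerance and the use of the centroid as the "center point" are assumptions.

import numpy as np

def endpoint_from_click(click_uv: np.ndarray,
                        proj_uv: np.ndarray,
                        points_xyz: np.ndarray,
                        tol_px: float = 5.0):
    # click_uv   : (2,) clicked image coordinates.
    # proj_uv    : (N, 2) image coordinates of the projected laser points.
    # points_xyz : (N, 3) corresponding earth-frame coordinates.
    # Returns the centroid of the points projected within tol_px of the
    # click, or None when no laser point falls inside the tolerance
    # (e.g. the click landed on clutter or an occluded region).
    near = np.linalg.norm(proj_uv - click_uv, axis=1) <= tol_px
    if not near.any():
        return None
    return points_xyz[near].mean(axis=0)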
[0032]
Next, the line object generation unit 13 transfers the line object data to the model generation unit 7. The model generation unit 7 matches the line object against the data stored in the logo database 17 to classify it as a street light, sidewalk, or the like, and assigns the attribute to the line object; when matching fails, the attribute entered from the input unit 9 is assigned. The model generation unit 7 then writes the resulting line object to the three-dimensional model data file 8. When the three-dimensional model data is displayed, the line object indicated by reference symbol L1 in FIG. 6 appears. Similarly, when symbols G, H, and I and symbols E and F in FIG. 4 are designated, the line objects indicated by reference symbols L2 and L3 in FIG. 6 are generated, respectively.
Although straight lines have been described here as an example, curved objects such as arcs may be generated as collections of short line objects.
[0033]
Next, the operator designates generation of a point object from the input unit 9. In response, the model generation unit 7 displays a message on the display unit 11 prompting the operator to designate the position of a feature point. The operator then designates the feature point position with the mouse of the input unit 9 on the screen showing the image data (FIG. 2). The screen processing unit 10 reads the coordinate value on the screen and notifies the model generation unit 7, which passes the designated coordinate value to the point object generation unit 14. In response, the point object generation unit 14 takes the center point of the laser range points projected within a predetermined allowable range around the pixel corresponding to the coordinate value, and uses it as the spatial coordinate value of the point object. The point object generation unit 14 passes this coordinate value to the model generation unit 7, which writes it to the three-dimensional model data file 8.
[0034]
Next, the operation of complementing the three-dimensional polygonal plane data stored in the three-dimensional polygonal plane data file 18, generated by other measuring means, will be described. Because of measurement errors in the distance measurement data and limitations of existing three-dimensional model production methods, the three-dimensional polygonal plane data stored in this file may contain errors, and since it carries no texture data it can lack a sense of reality. The data is therefore complemented based on the line CCD image data. First, upon receiving a data complement instruction from the input unit 9, the model generation unit 7 issues a data complement instruction to the data complementing unit 15. In response, the data complementing unit 15 projects the three-dimensional model data generated by the other measuring means onto the image data and checks whether the object outline drawn in this way coincides with the object captured in the image data.
[0035]
Next, for polygon vertices whose projections onto the image data do not match, the operator designates the correct projected location on the screen with the mouse. A new spatial coordinate value is then given to each such vertex by finding the intersection of the shooting direction line of the corresponding image pixel with the spatial plane of the polygon to which the vertex belongs. The pixels of the image data enclosed by the projected polygon are then projected onto the polygon's spatial plane along their shooting direction lines, and the polygon's texture image is generated by resampling.
In this way, even when the errors in the three-dimensional polygonal plane data stored in the three-dimensional polygonal plane data file 18 generated by other measuring means are large, the data can be complemented based on the image data. Generating texture data also yields a more realistic appearance.
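The vertex correction described in paragraph [0035] reduces to intersecting the pixel's shooting direction line with the polygon's spatial plane. A minimal sketch follows; the function name and the parallelism tolerance are assumptions.

import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    # Intersect a pixel's shooting direction line (origin + t * direction)
    # with a polygon's spatial plane given by a point and a normal.
    # Returns the intersection point, which becomes the vertex's new
    # spatial coordinate, or None when the line is numerically parallel
    # to the plane and no stable intersection exists.
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    denom = plane_normal @ direction
    if abs(denom) < 1e-9:
        return None
    t = plane_normal @ (plane_point - origin) / denom
    return origin + t * direction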
[0036]
By repeating the above operations, three-dimensional model data is generated as a collection of objects composed of surfaces, lines, and points. FIGS. 8 and 9 show examples in which a measurement vehicle was actually driven through an urban area, three-dimensional model data was generated from the acquired data, and the model was displayed on screen. According to the present invention, as shown in FIG. 8, realistic three-dimensional model data can be created because the CCD images are used directly as textures on building surfaces. Moreover, because the positional relationships of objects such as planes and lines are obtained from the three-dimensional point cloud data, the three-dimensional model data is accurate. As shown in FIG. 9, CCD images can also be used directly for signboards and the like in urban areas, and signs, signals, pedestrian crossings, and the like can be rendered realistically, making it possible to generate data ideally suited for display in, for example, navigation systems.
[0037]
Conventionally, there have been many attempts to construct three-dimensional models automatically by measuring various objects three-dimensionally using range images from laser scanners and the like together with multi-view stereo images from CCD cameras and the like. When the shape and structure are complex, however, fully automatic construction is extremely difficult, and much manual work remains; in the end, even when an automated system is introduced, the expected labor savings are often not achieved.
[0038]
In the present invention, by contrast, a superimposition of three-dimensional point cloud data and CCD images is prepared, and a three-dimensional model of an object can be constructed simply by tracing its outline on a CCD image, which is easy for humans to read, so the efficiency of manual work can be greatly improved. In particular, the three-dimensional point cloud data corresponding to the input contour lines is extracted, and a three-dimensional model is constructed by assigning surfaces, line segments, points, and so on to it as appropriate. Thus, not only the buildings and roads that appear in outdoor images, but also a wide range of three-dimensional spatial models including complex shapes such as store signboards, traffic lights, road traffic signs, signals, sidewalks, and benches can be constructed very efficiently. This can be used for three-dimensional models of real objects in computer graphics, numerical maps, surveying, and CAD, and for creating display data for navigation systems.
[0039]
The three-dimensional model data generation processing may be performed by recording a program that realizes the functions of the processing units in FIG. 1 on a computer-readable recording medium and having a computer system read and execute the program recorded on that medium. Here, the "computer system" includes an OS and hardware such as peripheral devices, and also includes a WWW system having a homepage providing environment (or display environment). The "computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into the computer system. It further includes media that hold the program for a certain period of time, such as the volatile memory (RAM) inside a computer system serving as a server or client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
[0040]
The program may be transmitted from a computer system that stores it in a storage device or the like to another computer system via a transmission medium, or by transmission waves within a transmission medium. Here, the "transmission medium" that transmits the program refers to a medium having the function of transmitting information, such as a network (communication network) like the Internet or a communication line (communication channel) like a telephone line. The program may realize only part of the functions described above, or may realize them in combination with a program already recorded in the computer system, as a so-called difference file (difference program).
[0041]
【The invention's effect】
As described above, according to the present invention, three-dimensional point cloud data converted from laser range data having high-precision three-dimensional information is projected onto image data having texture information, the image data screen is used as a work screen on which the area of the measurement object is input manually, and the three-dimensional shape of the object is obtained from the three-dimensional point cloud data integrated within that area. This has the effect that a wide range of three-dimensional spatial data, covering not only buildings and roads but also shop signboards, traffic lights, road traffic signs, sidewalks, and the like, can be generated efficiently and easily.
[Brief description of the drawings]
FIG. 1 is a block diagram showing a configuration of an embodiment of the present invention.
FIG. 2 is an explanatory diagram showing an example of a display screen for explaining the operation of the apparatus shown in FIG. 1.
FIG. 3 is an explanatory diagram showing an example of a display screen for explaining the operation of the apparatus shown in FIG. 1.
FIG. 4 is an explanatory diagram showing an example of a display screen for explaining the operation of the apparatus shown in FIG. 1.
FIG. 5 is an explanatory diagram showing an example of a display screen for explaining the operation of the apparatus shown in FIG. 1.
FIG. 6 is an explanatory diagram showing an example of a display screen for explaining the operation of the apparatus shown in FIG. 1.
FIG. 7 is a layout view when three laser range scanners, six line CCD cameras, and a navigation device are attached to the roof of a measurement vehicle.
FIG. 8 is an explanatory diagram showing an example of generated three-dimensional model data.
FIG. 9 is an explanatory diagram showing an example of generated three-dimensional model data.
[Explanation of symbols]
1 ... Image data file
2 ... Distance measurement data file
3 ... Position/orientation data file
4 ... Data integration unit
5 ... 3D point cloud data file
6 ... Integrated image data file
7 ... Model generation unit
8 ... 3D model data file
9 ... Input unit
10 ... Screen processing unit
11 ... Display unit
12 ... Plane object generation unit
13 ... Line object generation unit
14 ... Point object generation unit
15 ... Data complementing unit
16 ... Data reading unit
17 ... Logo database
18 ... 3D polygonal plane data file

Claims (8)

  1. A three-dimensional model construction system in which image data and three-dimensional point cloud data are stored in association with each other and a three-dimensional model is constructed from these data, comprising:
    a data file for storing three-dimensional model data;
    display means for displaying the image data and the three-dimensional point cloud data on a screen;
    designation means for designating a polygonal area on the screen on which the image data is displayed;
    extraction means for extracting the three-dimensional point cloud data included in the polygonal area;
    plane object generation means for specifying a plane object by obtaining three-dimensional coordinate values of the polygonal area based on the extracted three-dimensional point cloud data and writing it to the data file;
    line segment position designation means for designating the position of an end point or bending point of a line segment to be generated on the screen on which the image data is displayed; and
    line object generation means for obtaining the coordinate value on the screen of the designated end point or bending point, selecting, from among the point data in the three-dimensional point cloud data projected within a predetermined allowable range around that coordinate value, the point data at the center of the predetermined allowable range to obtain the three-dimensional coordinate value of the end point or bending point, thereby specifying a line object, and writing it to the data file.
  2. The three-dimensional model construction system according to claim 1, further comprising:
    feature point position designation means for designating the position of a feature point to be generated on the screen on which the image data is displayed; and
    point object generation means for obtaining the coordinate value on the screen of the designated feature point, selecting, from among the point data in the three-dimensional point cloud data projected within the predetermined allowable range around that coordinate value, the point data at the center of the predetermined allowable range to obtain the three-dimensional coordinate value of the feature point, thereby specifying a point object, and writing it to the data file.
  3. The three-dimensional model construction system according to claim 1 or 2, wherein the plane object generation means extracts the image data in the polygonal area and writes the image data to the data file as a texture.
  4. The three-dimensional model construction system according to any one of claims 1 to 3, further comprising data complementing means for complementing, based on the image data, three-dimensional polygonal plane data generated by other measuring means.
  5. A three-dimensional model construction program for storing image data and three-dimensional point cloud data in association with each other, constructing a three-dimensional model from these data, and writing the obtained three-dimensional model data to a data file, the program causing a computer to perform:
    a display process for displaying the image data and the three-dimensional point cloud data on a screen;
    a designation process for designating a polygonal area on the screen on which the image data is displayed;
    an extraction process for extracting the three-dimensional point cloud data included in the polygonal area;
    a plane object generation process for specifying a plane object by obtaining three-dimensional coordinate values of the polygonal area based on the extracted three-dimensional point cloud data and writing it to the data file;
    a line segment position designation process for designating the position of an end point or bending point of a line segment to be generated on the screen on which the image data is displayed; and
    a line object generation process for obtaining the coordinate value on the screen of the designated end point or bending point, selecting, from among the point data in the three-dimensional point cloud data projected within a predetermined allowable range around that coordinate value, the point data at the center of the predetermined allowable range to obtain the three-dimensional coordinate value of the end point or bending point, thereby specifying a line object, and writing it to the data file.
  6. The three-dimensional model construction program according to claim 5, further causing the computer to perform:
    a feature point position designation process for designating the position of a feature point to be generated on the screen on which the image data is displayed; and
    a point object generation process for obtaining the coordinate value on the screen of the designated feature point, selecting, from among the point data in the three-dimensional point cloud data projected within the predetermined allowable range around that coordinate value, the point data at the center of the predetermined allowable range to obtain the three-dimensional coordinate value of the feature point, thereby specifying a point object, and writing it to the data file.
  7. The three-dimensional model construction program according to claim 5 or 6, wherein the plane object generation process extracts the image data in the polygonal area and writes the image data to the data file as a texture.
  8. The three-dimensional model construction program according to any one of claims 5 to 7, further causing the computer to perform a data complementing process for complementing, based on the image data, three-dimensional polygonal plane data generated by other measuring means.
JP2003146531A 2003-05-23 2003-05-23 3D model construction system and 3D model construction program Expired - Fee Related JP4284644B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003146531A JP4284644B2 (en) 2003-05-23 2003-05-23 3D model construction system and 3D model construction program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003146531A JP4284644B2 (en) 2003-05-23 2003-05-23 3D model construction system and 3D model construction program

Publications (2)

Publication Number Publication Date
JP2004348575A JP2004348575A (en) 2004-12-09
JP4284644B2 true JP4284644B2 (en) 2009-06-24

Family

ID=33533357

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003146531A Expired - Fee Related JP4284644B2 (en) 2003-05-23 2003-05-23 3D model construction system and 3D model construction program

Country Status (1)

Country Link
JP (1) JP4284644B2 (en)

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006031580A1 (en) 2006-07-03 2008-01-17 Faro Technologies, Inc., Lake Mary Method and device for the three-dimensional detection of a spatial area
JP2010507127A (en) * 2006-10-20 2010-03-04 Tele Atlas B.V. Computer apparatus and method for matching position data of different sources
CN101617197B (en) 2007-02-16 2011-06-22 三菱电机株式会社 Feature identification apparatus, measurement apparatus and measuring method
JP5538667B2 (en) * 2007-04-26 2014-07-02 キヤノン株式会社 Position / orientation measuring apparatus and control method thereof
JP5116555B2 (en) * 2008-04-25 2013-01-09 三菱電機株式会社 Location device, location system, location server device, and location method
JP4978615B2 (en) * 2008-11-27 2012-07-18 三菱電機株式会社 Target identification device
DE102009010465B3 (en) 2009-02-13 2010-05-27 Faro Technologies, Inc., Lake Mary Laser scanner
JP5339953B2 (en) * 2009-02-17 2013-11-13 三菱電機株式会社 3D map correction apparatus and 3D map correction program
DE102009015920B4 (en) 2009-03-25 2014-11-20 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009015921A1 (en) * 2009-03-25 2010-09-30 Faro Technologies, Inc., Lake Mary Method for optically scanning and measuring an environment
JP5464915B2 (en) * 2009-06-09 2014-04-09 三菱電機株式会社 Object detection apparatus and object detection method
DE102009035337A1 (en) 2009-07-22 2011-01-27 Faro Technologies, Inc., Lake Mary Method for optically scanning and measuring an object
DE102009055988B3 (en) 2009-11-20 2011-03-17 Faro Technologies, Inc., Lake Mary Device, particularly laser scanner, for optical scanning and measuring surrounding area, has light transmitter that transmits transmission light ray by rotor mirror
DE102009055989B4 (en) 2009-11-20 2017-02-16 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
DE102009057101A1 (en) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Device for optically scanning and measuring an environment
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
TWI391874B (en) * 2009-11-24 2013-04-01 Ind Tech Res Inst Method and device of mapping and localization method using the same
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
DE112011100302T5 (en) 2010-01-20 2012-10-25 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with multiple communication channels
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
DE102010020925B4 (en) 2010-05-10 2014-02-27 Faro Technologies, Inc. Method for optically scanning and measuring an environment
DE102010032723B3 (en) 2010-07-26 2011-11-24 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102010032726B3 (en) 2010-07-26 2011-11-24 Faro Technologies, Inc. Device for optically scanning and measuring an environment
DE102010032725B4 (en) 2010-07-26 2012-04-26 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
WO2012077238A1 (en) * 2010-12-10 2012-06-14 富士通株式会社 3d moving image creation device, 3d moving image creation method, and 3d moving image creation program
JP5566353B2 (en) * 2011-09-02 2014-08-06 株式会社パスコ Data analysis apparatus, data analysis method, and program
KR101740259B1 (en) * 2011-10-07 2017-05-29 한국전자통신연구원 Auto segmentation method of 3d point clouds
JP5808656B2 (en) * 2011-11-29 2015-11-10 株式会社アスコ Three-dimensional laser measurement system and road profile profile creation method
DE102012100609A1 (en) 2012-01-25 2013-07-25 Faro Technologies, Inc. Device for optically scanning and measuring an environment
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
DE102012107544B3 (en) 2012-08-17 2013-05-23 Faro Technologies, Inc. Optical scanning device i.e. laser scanner, for evaluating environment, has planetary gears driven by motor over vertical motor shaft and rotating measuring head relative to foot, where motor shaft is arranged coaxial to vertical axle
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
DE102012109481A1 (en) 2012-10-05 2014-04-10 Faro Technologies, Inc. Device for optically scanning and measuring an environment
CN104315974A (en) * 2014-10-22 2015-01-28 合肥斯科尔智能科技有限公司 Three dimension scan data processing method
CN105069842A (en) * 2015-08-03 2015-11-18 百度在线网络技术(北京)有限公司 Modeling method and device for three-dimensional model of road
KR101693811B1 (en) * 2015-12-08 2017-01-06 한국기술교육대학교 산학협력단 Valve modeling method and apparatus
DE102015122844A1 (en) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D measuring device with battery pack
JP2018105804A (en) * 2016-12-28 2018-07-05 首都高Etcメンテナンス株式会社 Measurement information acquisition method and work vehicle for electric field intensity measurement

Also Published As

Publication number Publication date
JP2004348575A (en) 2004-12-09

Similar Documents

Publication Publication Date Title
El-Hakim et al. Detailed 3D reconstruction of large-scale heritage sites with integrated techniques
Hu et al. Approaches to large-scale urban modeling
KR101159379B1 (en) System, computer program and method for 3d object measurement, modeling and mapping from single imagery
CN1275206C (en) Three-dimensional electronic map data creation method
US6456288B1 (en) Method and apparatus for building a real time graphic scene database having increased resolution and improved rendering speed
JP4185052B2 (en) Enhanced virtual environment
Arayici An approach for real world data modelling with the 3D terrestrial laser scanner for built environment
US7986825B2 (en) Model forming apparatus, model forming method, photographing apparatus and photographing method
US8818076B2 (en) System and method for cost-effective, high-fidelity 3D-modeling of large-scale urban environments
KR100912715B1 (en) Method and apparatus of digital photogrammetry by integrated modeling for different types of sensors
EP1242966B1 (en) Spherical rectification of image pairs
US9520000B2 (en) Systems and methods for rapid three-dimensional modeling with real facade texture
CA2678156C (en) Measurement apparatus, measurement method, and feature identification apparatus
Zhao et al. Reconstructing a textured CAD model of an urban environment using vehicle-borne laser range scanners and line cameras
US7509241B2 (en) Method and apparatus for automatically generating a site model
US8665263B2 (en) Aerial image generating apparatus, aerial image generating method, and storage medium having aerial image generating program stored therein
Behzadan et al. Georeferenced registration of construction graphics in mobile outdoor augmented reality
Paparoditis et al. Stereopolis II: A multi-purpose and multi-sensor 3D mobile mapping system for street visualisation and 3D metrology
US8649610B2 (en) Methods and apparatus for auditing signage
US8693806B2 (en) Method and apparatus of taking aerial surveys
US4970666A (en) Computerized video imaging system for creating a realistic depiction of a simulated object in an actual environment
EP2208021B1 (en) Method of and arrangement for mapping range sensor data on image sensor data
Haala et al. 3D urban GIS from laser altimeter and 2D map data
Gross et al. Extraction of lines from laser point clouds
US8958980B2 (en) Method of generating a geodetic reference database product

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060420

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20081023

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20081104

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20081225

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20090224

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20090312

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120403

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130403

Year of fee payment: 4

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140403

Year of fee payment: 5

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250

LAPS Cancellation because of no payment of annual fees