CN113066112B - Indoor and outdoor fusion method and device based on three-dimensional model data - Google Patents


Info

Publication number
CN113066112B
CN113066112B (application CN202110320705.1A)
Authority
CN
China
Prior art keywords
indoor
outdoor
data
image data
model
Prior art date
Legal status (an assumption, not a legal conclusion)
Active
Application number
CN202110320705.1A
Other languages
Chinese (zh)
Other versions
CN113066112A (en
Inventor
刘俊伟
王娟
邬丽娟
Current Assignee
Terry Digital Technology Beijing Co ltd
Original Assignee
Terra It Technology Beijing Co ltd
Priority date
Filing date
Publication date
Application filed by Terra It Technology Beijing Co ltd filed Critical Terra It Technology Beijing Co ltd
Priority to CN202110320705.1A
Publication of CN113066112A
Application granted
Publication of CN113066112B
Status: Active


Classifications

    • G06T7/344: Determination of transform parameters for the alignment of images (image registration) using feature-based methods involving models
    • G06T17/05: Geographic models
    • G06T5/50: Image enhancement or restoration by the use of more than one image
    • G06T7/75: Determining position or orientation of objects or cameras using feature-based methods involving models
    • G06T2207/20221: Image fusion; image merging

Abstract

The invention relates to an indoor-outdoor fusion method and device based on three-dimensional model data. The method establishes an outdoor coordinate system for the outdoor model, using calibrated numbers, from an outdoor three-dimensional CityGML model, and establishes an indoor coordinate system for the indoor model, likewise using calibrated numbers, from an indoor three-dimensional CityGML model constructed from CAD drawings. Matrix matching, graph comparison and screening are carried out with the number of each datum; the indoor and outdoor data are registered to their correct positions; and the indoor and outdoor data are fused again with unified matching numbers. According to the number relationship between the outdoor and indoor coordinate systems of the model, the indoor data are visually presented as data graphs according to the outdoor numbers during outdoor display, and the outdoor data are dynamically and graphically visualized according to the indoor numbers during indoor display. This eliminates the problem of data incompatibility caused by different data sources and enables dynamic, accurate display of indoor and outdoor three-dimensional model data with any reference.

Description

Indoor and outdoor fusion method and device based on three-dimensional model data
Technical Field
The invention relates to the technical field of three-dimensional geographic information, in particular to an indoor and outdoor fusion method of three-dimensional model data.
Background
The digital earth and the smart city have been research hotspots in recent years; three-dimensional models form their basic framework, and large-scale, regularized, rapid three-dimensional visual expression is a research focus. With the construction and application of the digital earth and smart cities, indoor positioning and mapping, location-based services, virtual reality, augmented reality and the like have become increasingly practical, so the visual expression of the three-dimensional world is gradually developing from large-scale outdoor roaming to integrated indoor-outdoor roaming. It is therefore necessary to organize indoor and outdoor three-dimensional scene data in an integrated way, to meet the demand for overall data scheduling during indoor-outdoor visualization and to achieve efficient real-time three-dimensional visualization. Because indoor and outdoor scene data are organized for visualization in different ways, effective continuity is lacking when roaming switches between indoor and outdoor scenes: the three-dimensional rendering scene often has to be reconstructed, including rendering-data updating and coordinate-system conversion, which severely affects three-dimensional visualization efficiency. CN110595458A fuses positioning data collected by navigation units using different indoor and outdoor technologies and judges whether a positioning tag is indoors or outdoors by comparing its actual position with the boundary position of the building; the positioning cost is high, and the accuracy depends entirely on the resolution of the indoor and outdoor positioning technologies.
The City Geography Markup Language (CityGML) is an open international standard, issued by the Open Geospatial Consortium (OGC), for the storage and exchange of virtual three-dimensional city models, and a general semantic information model for expressing three-dimensional city objects. For integrated indoor-outdoor modeling, a laser three-dimensional scanner is currently the usual means of collecting the modeling data, with three-dimensional modeling then carried out on the collected data; the accuracy of the resulting model is often not high, and the absolute position coordinates of a building cannot be known. CN109903382A transforms indoor and outdoor point-cloud data into a target coordinate system for data fusion, unifying indoors and outdoors under one coordinate system; but because point-cloud data are used, the resolution still cannot be improved and indoors and outdoors cannot be distinguished accurately. CN106646562A uses outdoor GPS positioning, places GPS receivers at several indoor reference points to construct platform coordinates, and uses ultra-wideband indoor positioning, establishing a conversion relation between the platform coordinates and the coordinates of the ultra-wideband indoor positioning system itself; the precision is only at decimeter level.
The integrated organization of indoor and outdoor scene data is not yet a mature technology: data fusion is performed on different indoor and outdoor data sources, the collection methods are varied and the data references diverse, so the accuracy requirements of visual dynamic scheduling of massive indoor and outdoor three-dimensional scene data cannot be met. The invention therefore provides an indoor-outdoor fusion method based on three-dimensional model data to overcome these defects of the prior art.
Disclosure of Invention
In view of these defects, the invention provides an indoor-outdoor fusion method based on three-dimensional data. Indoor and outdoor coordinate systems are established from the automatically calibrated numbering data of the three-dimensional models; indoor and outdoor three-dimensional model data, rather than point-cloud data, are registered to their correct positions according to the number relationship between the coordinate systems; and the indoor and outdoor data are fused again with unified matching numbers, eliminating the data-mismatch problem caused by different data sources and achieving integrated, accurate display of the indoor and outdoor three-dimensional models.
The invention adopts the following technical scheme: a method of indoor-outdoor fusion of three-dimensional model data, the method comprising:
Step 1: an outdoor three-dimensional model coordinate system and numbers for outdoor non-building geographic blocks are established: an outdoor three-dimensional CityGML model is constructed; the south and west edges of the base of an arbitrarily selected standard building are taken as the X axis and the Y axis respectively, and their right-angle vertex as the origin, the outdoor reference point O, establishing the outdoor three-dimensional model coordinate system E; the four vertex coordinates of the rectangular face of each outdoor non-building geographic block are acquired, the rectangular face being formed by the four points of the block's plane that is parallel to the XOY coordinate plane, has the largest area and lies closest to the XOY coordinate plane; the point N of the four vertices farthest from the point O serves as the number point of each non-building geographic block under E, and the number of each non-building geographic block is automatically generated in the outdoor three-dimensional CityGML model, each number corresponding to a coordinate under E and expressed by the abscissa of the numbered point N; a number library K is established;
Step 2: an indoor three-dimensional model coordinate system and indoor building numbers are established: for the indoor three-dimensional CityGML model constructed from indoor software drawings, an indoor three-dimensional model coordinate system I corresponding to each building is constructed from the base of each building model in the same way as for the standard building base; the four vertex coordinates of the rectangular face of each indoor building model base are acquired, the rectangular face being formed by the four points of the base's plane that is parallel to the xoy coordinate plane of coordinate system I, has the largest area and lies closest to the xoy coordinate plane; the point M' of the four vertices farthest from the point o serves as the number point of each building under I, and the number of each building is automatically generated in the indoor three-dimensional CityGML model, each number corresponding to a coordinate under I and expressed by the abscissa of the numbered point M' under I; a number library Q is established;
Step 3: the indoor and outdoor three-dimensional models are fused: a transformation formula between the coordinate systems E and I is established; matrix modeling or matrix matching, graph comparison and screening are carried out using the numbers of each number library; the indoor and outdoor data are registered to their correct positions; and the indoor and outdoor data are fused again with unified matching numbers;
Step 4: the indoor and outdoor three-dimensional models are presented in an integrated manner: according to the transformation formula between the outdoor and indoor coordinate systems of the model, the indoor data are visually presented according to the numbers under E during outdoor display, and the outdoor data are dynamically and graphically presented according to the numbers under I during indoor display, eliminating the problem of data incompatibility caused by different data sources.
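The transformation formula between E and I named in steps 3 and 4 can be sketched as a planar rigid transform. This is a minimal illustrative sketch, not the patent's implementation: the rotation angle `theta` and offset `t` are hypothetical parameters standing in for whatever the registered number points determine.

```python
import numpy as np

def make_transform(theta: float, t: np.ndarray):
    """Return a pair of functions mapping I -> E and E -> I for planar points.

    theta: rotation from the indoor frame I to the outdoor frame E (radians).
    t:     translation of I's origin expressed in E (illustrative values).
    """
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])

    def i_to_e(p):
        # rotate, then shift into the outdoor frame
        return R @ np.asarray(p, dtype=float) + t

    def e_to_i(p):
        # inverse: shift back, then rotate with R transposed
        return R.T @ (np.asarray(p, dtype=float) - t)

    return i_to_e, e_to_i

# Axis-aligned example (theta = 0): the transform reduces to a translation.
i_to_e, e_to_i = make_transform(0.0, np.array([120.0, 45.0]))
corner_in_I = np.array([3.0, 7.0])
corner_in_E = i_to_e(corner_in_I)   # the same corner expressed under E
round_trip  = e_to_i(corner_in_E)   # back under I
```

Because the transform is rigid, converting a number under I to the number under E (and back, as step 4 requires) is lossless: the round trip reproduces the original coordinate.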
Further, step 1 includes acquiring outdoor image data.
Furthermore, in step 1, outdoor image data is acquired by using equipment such as satellites and aerial cameras.
Further, step 1 further comprises data processing of the collected outdoor image data, including color equalization, defogging, correction, bad-line and stripe removal, and noise removal of the outdoor image data.
Further, the step 1 also comprises the steps of constructing an outdoor three-dimensional CityGML model, and constructing the outdoor three-dimensional CityGML model based on E according to the processed outdoor image data. Preferably, an outdoor three-dimensional CityGML model is constructed by adopting an outdoor software drawing.
Further, step 2 further comprises acquiring indoor image data.
Furthermore, in step 2, various cameras, monitoring equipment, camera equipment or laser three-dimensional scanners and other equipment are used for collecting indoor image data.
Further, the step 2 further comprises data processing of the collected indoor image data, including removing image data with low definition from the indoor image data, preprocessing drawing data, and cleaning drawing data.
Further, the step 2 also comprises the steps of constructing an indoor three-dimensional CityGML model, and constructing the indoor three-dimensional CityGML model based on I through a software drawing according to the processed indoor image data.
Further, in step 3, the numbers of the data are used for matrix modeling, graph comparison and screening; the indoor and outdoor data are registered to their correct positions; and the indoor and outdoor data are fused again with unified matching numbers, comprising the following steps:
s1, dividing buildings or building groups in the area E to form a plurality of sub-areas L, dividing indoor software drawings of each sub-area L into training set drawings and verification set drawings, and arranging and constructing number matrixes M =on the buildings of each sub-area in the sequence from west to east and from north to south
Figure 205542DEST_PATH_IMAGE001
Wherein each matrix element aijEach number N is included; each matrix element aijConstructing a graph matrix p =by taking a corresponding indoor software drawing as a matrix element
Figure 937875DEST_PATH_IMAGE002
Wherein, the place where no building exists is represented by a zero matrix element;
the S2 matrix between M and p is trained by generating a confrontation network (GAN) and establishes a recognition model GAN (M) = p (1) by using a verification set drawing,
s3, establishing an identification model GAN1(M1) = p (2) in an L area and a surrounding adjacent L area, establishing an identification model GAN2(M2) = p (3) in the L area and the surrounding adjacent L area and a secondary adjacent area according to the same principle of S1-S2, and sequentially expanding the area range until the area is expanded to the whole area under E, wherein the q-th identification model GANq (Mq) = p (q +1), and matrix elements in a numbering matrix M are at least two different sub-areas;
s4 similarly, according to steps S1-S3, q +1 outdoor recognition models GAN '(M') = p (1 '), GAN' 1(M '1) = p (2'). GAN 'q (M' q) = p (q +1 '), where matrix elements in M' are non-architectural geographical areas, including roads, rivers, mountains, forests, fields, lakes, etc., but not including any non-architectural geographical areas inside sub-areas, such as rivers, roads, ponds, artificial landscapes, etc., inside the sub-areas;
s5, selecting at least one number in the number library Q to be substituted into (1), (2).. Q, respectively obtaining map results pR1 and pR2.. pRq, comparing the indoor image data or the software drawing with pR1 and pR2.. pRq, screening indoor image data or software drawing pi1 with the highest similarity (corresponding to a plurality of pi1 if a plurality of numbers are selected at one time), and converting to E according to a relational expression to perform correct position registration;
s6, selecting at least one number in the number library K to be substituted into (1 '), (2 '). multidot. (q '), respectively obtaining map results pR1 ', pR2 '. multidot.. pRq ', screening outdoor image data or software drawings pi2 with highest similarity (corresponding to a plurality of pi2 if a plurality of numbers are selected at one time) by comparing the outdoor image data or software drawings with pR1 ', pR2 '. multidot.. pRq ', and converting to the corresponding I of the pi1 with the position registration in an optional step S5 according to a relational expression to carry out correct position registration;
s7 repeats S5 and S6 until registration is completed for all numbered building locations;
S8, for indoor image data or software drawings under E, the numbers under I are converted into numbers under E through the conversion relation; for outdoor image data or software drawings under I, the numbers under E are converted into numbers under I through the conversion relation.
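The control flow of S5-S7 can be sketched as a selection loop: substitute a number into a recognition model, compare the predicted map result against candidate drawings, and register the best match. The `recognition_model` dictionary and the character-level `similarity` metric below are toy stand-ins for the trained GAN and the graph comparison, assumed purely for illustration.

```python
def similarity(a: str, b: str) -> float:
    """Toy similarity: fraction of positions where two strings agree.
    A stand-in for the patent's image/graph comparison metric."""
    if not a or not b:
        return 0.0
    matches = sum(x == y for x, y in zip(a, b))
    return matches / max(len(a), len(b))

def register(number_library, recognition_model, candidate_drawings):
    """For each selected number, obtain the model's map result (pR) and
    screen out the candidate drawing (pi) with the highest similarity,
    repeating until every number is registered (S5-S7)."""
    registrations = {}
    for number in number_library:
        predicted = recognition_model[number]   # pR for this number
        best = max(candidate_drawings,
                   key=lambda d: similarity(predicted, d))
        registrations[number] = best            # pi with highest similarity
    return registrations

# Hypothetical model outputs and candidate drawings:
model = {"N1": "blockA", "N2": "blockB"}
drawings = ["blockA", "blockB", "blockC"]
result = register(["N1", "N2"], model, drawings)
```

In the patent the registered `pi` is then converted into E (or I) by the relational expression; here the loop only shows the screen-and-select step.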
Converting into E according to the relational expression for correct position registration comprises: according to the relational expression, the number point under I corresponding to the pi1 with the highest similarity is made to coincide with the number point of the corresponding building model in the image data or software drawing under E, and pi1 is aligned with the abscissa axis of the building model for position registration. Converting, according to the relational expression, into the I corresponding to a position-registered pi1 of optional step S5 for correct position registration comprises: the number point under E corresponding to the pi2 with the highest similarity is converted, according to the relational expression, into the I of the position-registered pi1, and pi2 is made to coincide with the abscissa axis of the corresponding non-building geographic zone under E.
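The alignment described above can be sketched as shifting a drawing so that its number point under I lands on the matching building's number point under E, with the abscissa axes assumed already parallel. All coordinates below are illustrative, not values from the patent.

```python
def align_by_number_point(points_in_I, number_point_I, number_point_E):
    """Translate every point of a drawing by the offset that maps the
    number point under I onto the number point under E (axes assumed
    parallel, so a pure translation suffices)."""
    dx = number_point_E[0] - number_point_I[0]
    dy = number_point_E[1] - number_point_I[1]
    return [(x + dx, y + dy) for x, y in points_in_I]

# A building base in its own indoor frame I (hypothetical values):
base_I = [(0.0, 0.0), (10.0, 0.0), (10.0, 6.0), (0.0, 6.0)]
M_prime = (10.0, 6.0)     # the vertex farthest from o: the number point under I
N_point = (210.0, 86.0)   # the same building's number point under E
base_E = align_by_number_point(base_I, M_prime, N_point)
```

After the shift, the number point of the translated base coincides exactly with the number point under E, which is the registration condition the paragraph states.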
In one embodiment, the numbers of the data are used for matrix matching, graph comparison and screening, and the indoor and outdoor data are registered to their correct positions. In steps 1 and 2, the four vertex coordinates are each numbered by their clockwise horizontal coordinates; the indoor number matrix M of step S1 and the outdoor number matrix M' of step S2 are then established, each matrix element containing four numbers. The method further comprises step S3': an indoor number is selected, the corresponding number matrix M is matched against the number matrices in the image data or software drawings under E, the image data or software drawings under E are compared with the corresponding indoor image data or software drawings, and the registration position where the number-matrix match and the graph comparison agree best is screened out, giving the registration position of the selected indoor number under E and establishing the corresponding I coordinate system. In step S4', an outdoor number is selected, the corresponding number matrix M' is matched against the number matrices in the image data or software drawings under the corresponding I at the registered position, the image data or software drawings under that I are compared with the corresponding outdoor image data or software drawings, and the best-agreeing registration position is screened out again, giving the registration position of the selected outdoor number under the corresponding I. Step S5' repeats S3' and S4' until every indoor and outdoor number has been selected and registered; the whole registration procedure is completed by the algorithm and confirmed by the image-similarity percentage: a manual check is carried out when the percentage exceeds 95%, otherwise the comparison is repeated. Step S6' converts, for indoor image data or software drawings under E, the numbers under I into numbers under E through the conversion relation, and, for outdoor image data or software drawings under I, the numbers under E into numbers under I.
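The 95% acceptance gate above can be sketched as follows. `pixel_similarity` is an assumed stand-in metric over equally sized binary rasters, not the patent's comparison method; the grids are illustrative.

```python
def pixel_similarity(img_a, img_b) -> float:
    """Percentage of identical cells between two equally sized 2-D grids."""
    total = len(img_a) * len(img_a[0])
    same = sum(a == b
               for row_a, row_b in zip(img_a, img_b)
               for a, b in zip(row_a, row_b))
    return 100.0 * same / total

def gate(img_a, img_b, threshold=95.0):
    """Accept a registration for manual check only above the threshold;
    otherwise signal that the comparison must be repeated."""
    score = pixel_similarity(img_a, img_b)
    return ("manual-check" if score > threshold else "re-compare", score)

a = [[1, 1, 0, 0]] * 5
b = [[1, 1, 0, 0]] * 4 + [[1, 1, 0, 1]]   # one differing cell out of 20
status, score = gate(a, b)
```

With exactly one differing cell in twenty, the score sits at the 95% boundary and the gate asks for re-comparison, matching the "more than 95%" wording of the procedure.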
It can be understood that, in either technical solution (matrix modeling or matrix matching), position registration of the outdoor non-building geographic zones under one or more I's of the registered positions may also be carried out after all indoor building positions under E have been registered, completing the entire indoor-outdoor three-dimensional model fusion. Whichever position-registration method is used, the selected I may be an I close to the registered outdoor non-building geographic zone, i.e., the I whose origin lies closest to that zone.
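The "closest I" rule can be sketched as picking, among the registered indoor frames, the one whose origin is nearest the outdoor zone's number point. The frame names and coordinates are illustrative assumptions.

```python
import math

def closest_frame(zone_point, frame_origins):
    """Return the key of the indoor frame whose origin has the minimal
    Euclidean distance to the given outdoor zone point."""
    return min(frame_origins,
               key=lambda k: math.dist(zone_point, frame_origins[k]))

# Hypothetical origins of three registered I frames, expressed under E:
origins = {"I1": (0.0, 0.0), "I2": (50.0, 10.0), "I3": (200.0, 5.0)}
zone = (55.0, 12.0)   # number point of the outdoor non-building zone
best = closest_frame(zone, origins)
```

Registering the zone under the nearest I keeps the translation offsets small, which is presumably why the nearest frame is preferred.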
The coordinate systems E and I are rectangular coordinate systems or non-Euclidean coordinate systems, and the software drawings include CAD drawings or 3dMAX drawings. The reference point O of the standard building arbitrarily selected in step 1 is also the origin o of that building's own I coordinate system.
The invention also provides a device for implementing the above method, comprising a processor, a display and an operation panel, and further comprising:
an outdoor three-dimensional model coordinate system E and an outdoor non-building geographic block number establishing module;
an indoor three-dimensional model coordinate system I and an indoor building number establishing module;
the indoor and outdoor three-dimensional model fusion module is used for establishing a transformation formula between coordinate systems E and I, carrying out matrix modeling or matrix matching, graph comparison and screening by adopting the serial numbers of all data, carrying out correct position registration on the indoor and outdoor data, and carrying out fusion of the indoor and outdoor data again to unify matching serial numbers;
the indoor and outdoor three-dimensional model integrated presentation module: according to the transformation formula between the outdoor coordinate system and the indoor coordinate system of the model, when outdoor display is carried out, indoor data is visually presented according to the serial number under E, and when indoor display is carried out, outdoor data is dynamically and graphically visually presented according to the serial number under I.
The apparatus includes a computer-readable non-transitory storage medium having stored therein a program executable by the processor to implement the above-described method for indoor and outdoor fusion based on three-dimensional model data.
The invention has the following beneficial effects: when indoor and outdoor semantic CityGML three-dimensional model data are fused, the dynamic visual model-fusion display method solves the problem that indoor and outdoor scenes cannot be fused because of different data sources and different coordinate precisions; complete data matching is achieved without correcting indoor and outdoor data coordinates, greatly improving operator efficiency. Moreover, the semantic CityGML three-dimensional model is built from software drawings, and its resolution is far higher than that of image data based on imaging techniques.
Drawings
FIG. 1 is a flow chart of a method for indoor and outdoor fusion of three-dimensional model data according to the present invention.
Detailed Description
The technical solutions of the present invention will be described clearly and completely with reference to the accompanying drawings, and it should be understood that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example 1
A method of indoor-outdoor fusion of three-dimensional model data, the method comprising:
step 1: and establishing an outdoor three-dimensional model coordinate system and an outdoor non-building geographic block number.
According to information related to the map scene, outdoor image data are acquired using equipment such as satellites and aerial photography. For example, a photographic device is mounted on an unmanned aerial vehicle; the device may be an ordinary digital camera, a panoramic digital camera, a single-lens oblique photography camera, a double-lens oblique photography camera, a five-lens oblique photography camera and the like, and images are acquired from different angles, vertical and oblique, to obtain complete and accurate information on outdoor ground objects. The collected outdoor image data are sent to a client through a wired or wireless network; the client receives the outdoor image data and processes them, the data processing comprising image color equalization, image defogging, image correction, bad-line and stripe removal, and noise removal; an outdoor three-dimensional CityGML model is then constructed from the processed outdoor image data.
A municipal building is selected as the standard building: the south and west edges of its base are taken as the X axis and the Y axis respectively, and their right-angle vertex as the origin, the outdoor reference point O, establishing the outdoor three-dimensional model coordinate system E. The four vertex coordinates of the rectangular face of each outdoor non-building geographic block are acquired, the rectangular face being formed by the four points of the plane that is parallel to the XOY coordinate plane, has the largest area and lies closest to the XOY coordinate plane; the point N of the four vertices farthest from the point O serves as the number of each non-building geographic block under E, and the number of each non-building geographic block is automatically generated in the outdoor three-dimensional CityGML model, each number corresponding to a coordinate under E and expressed by the abscissa of the numbered point N. A number library K is established.
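The numbering rule in this step (and the analogous indoor rule of step 2) can be sketched as picking, among the four vertices of a block's rectangular face, the one farthest from the origin O. The vertex coordinates and the key format of the number library are illustrative assumptions.

```python
import math

def number_point(vertices, origin=(0.0, 0.0)):
    """Return the vertex with the greatest Euclidean distance from `origin`;
    this vertex becomes the block's number point N under E."""
    return max(vertices, key=lambda v: math.dist(v, origin))

# Four corners of one non-building block's rectangular face (hypothetical):
block = [(12.0, 30.0), (40.0, 30.0), (40.0, 55.0), (12.0, 55.0)]
N = number_point(block)   # the corner farthest from O

# A number library K can then map each generated number to its E coordinate
# (the "N40-55" naming scheme is purely illustrative):
K = {f"N{N[0]:.0f}-{N[1]:.0f}": N}
```

Because the axes point east and north from O, the farthest vertex of an axis-aligned face is its north-east corner, so the rule yields one unambiguous number point per block.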
Step 2: and establishing an indoor three-dimensional model coordinate system and an indoor building number.
According to map-scene-related information, indoor image data are collected using equipment such as ultra-wideband devices, various sensors, monitoring equipment or laser three-dimensional scanners. The collected indoor image data are sent to a client through a wired or wireless network; the client receives the indoor image data and processes them, the data processing comprising: removing low-definition image data from the indoor images, preprocessing the drawing data and cleaning the drawing data; an indoor three-dimensional CityGML model is then constructed through CAD drawings from the processed indoor image data.
For the indoor three-dimensional CityGML model constructed through CAD drawings, an indoor three-dimensional model coordinate system I corresponding to each building is constructed from the base of each building model in the same way as for the standard building base; the four vertex coordinates of the rectangular face of each indoor building model base are acquired, the rectangular face being formed by the four points of the base's plane that has the largest area, is parallel to the xoy coordinate plane of coordinate system I and lies closest to it; the point M' of the four vertices farthest from the point o serves as the number of each building under I, the number of each building is automatically generated in the indoor three-dimensional CityGML model, and each number corresponds to a coordinate under I, expressed by the abscissa of the numbered point M' under I. A number library Q is established.
Step 3: a Cartesian coordinate-system transformation formula between the coordinate systems E and I is established. S1 divides the buildings or building groups in the area under E to form a plurality of sub-areas L; the indoor software drawings of the sub-area L of a residential cell are divided into training-set drawings and verification-set drawings, and the buildings of the cell are arranged, from west to east and from north to south, into a number matrix M = [a_ij], wherein each matrix element a_ij contains a number N; taking the indoor software drawing corresponding to each matrix element a_ij as a matrix element, a graph matrix p = [p_ij] is constructed. S2: between the matrices M and p, a recognition model GAN(M) = p (1) is established by generative-adversarial-network (GAN) training and the verification-set drawings. S3 builds a number matrix M1 over L and its surrounding adjacent L areas and establishes a recognition model GAN1(M1) = p (2), then builds a number matrix M2 over L, its surrounding adjacent areas and its secondarily adjacent areas and establishes a recognition model GAN2(M2) = p (3); the area range is expanded in turn until it covers all areas under E, the q-th recognition model being GANq(Mq) = p (q+1). At least two matrix elements in a number matrix M come from different sub-areas; in this example, the first column of zero elements represents a river channel, and the other zero matrix elements represent roads within the cell.
s4 similarly establishes q +1 recognition models GAN '(M') = p (1 ') (corresponding M' = p) outdoors according to steps S1-S3
Figure 464671DEST_PATH_IMAGE007
Representing a motor vehicle lane with M north, the four vertex matrix elements representing the four vertices of the lane surface), GAN ' 1(M ' 1) = p (2 ') (corresponding to M ' 1 =: (M ' 1)
Figure 783657DEST_PATH_IMAGE008
f 11-f 22Representing the four vertices of the lane surface east of M2,g 11 f 22 g 21 g 22GAN ' q ' (M ' q ') = p (q '); the matrix elements in M' are non-architectural geographical areas, including motorways, non-motorways, pedestrian roads, green belts and lawns outside the cell, but not including riverways, roads, ponds and artificial landscapes inside the cell inside any sub-area.
S5: selecting at least one number in the number library Q and substituting it into (1), (2), ..., (q) to obtain image results pR1, pR2, ..., pRq respectively; comparing the indoor image data or software drawings with pR1, pR2, ..., pRq, screening out the indoor image data or software drawing pi1 with the highest similarity, and converting it under E according to the rectangular-coordinate transformation formula for correct position registration, until the numbers in the number library Q have all been selected and position registration is finished;
S6: selecting at least one number in the number library K and substituting it into (1'), (2'), ..., (q') to obtain map results pR1', pR2', ..., pRq' respectively; comparing the outdoor image data or software drawings with pR1', pR2', ..., pRq', screening out the outdoor image data or software drawing pi2 with the highest similarity, and, where it corresponds to a pi1 whose position was registered in step S5, converting it under the I corresponding to that pi1 according to the rectangular-coordinate transformation formula for correct position registration, until the numbers in the number library K have all been selected and position registration is finished.
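The screening in S5/S6 amounts to picking, among the candidate drawings, the one most similar to a model output pR. The patent does not fix a similarity measure, so the score below (negative total absolute gray difference) and all names and values are illustrative assumptions only.

```python
# Sketch of the S5/S6 screening step: given a map result pR produced by a
# recognition model, select the candidate drawing with the highest similarity.
# The similarity score is a hypothetical stand-in (the patent does not fix one).

def similarity(drawing, pr_result):
    # Negative total absolute gray difference: larger is more similar.
    return -sum(abs(a - b) for a, b in zip(drawing, pr_result))

# Hypothetical flattened gray-level rows for two candidate drawings.
drawings = {"pi_a": [10, 10, 10], "pi_b": [10, 11, 10]}
pr = [10, 11, 10]

best_name = max(drawings, key=lambda k: similarity(drawings[k], pr))
print(best_name)  # pi_b
```

The drawing selected this way is then converted under the target coordinate system for position registration, as the step above describes.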
Step 4: integrated presentation of the indoor and outdoor three-dimensional models: according to the transformation formula between the outdoor and indoor coordinate systems of the model, during outdoor display the indoor data are graphically visualized according to the numbers under E, and during indoor display the outdoor data are dynamically and graphically visualized according to the numbers under I, thereby solving the problem of data incompatibility caused by different data sources.
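The transformation formula between E and a building's indoor system I is only asserted to exist in the text. A minimal sketch, assuming each I differs from E by a translation (the building origin o expressed in E) and an in-plane rotation of its base edges, might look like this; the origin and angle values are hypothetical:

```python
import math

# Sketch of a rectangular-coordinate transformation between the outdoor
# system E and one building's indoor system I, assuming a pure planar
# rigid motion (translation + rotation). Values are illustrative only.

def indoor_to_outdoor(p_i, origin_e, theta):
    # Rotate an indoor point by theta, then translate by the origin of I under E.
    x, y = p_i
    c, s = math.cos(theta), math.sin(theta)
    return (origin_e[0] + c * x - s * y,
            origin_e[1] + s * x + c * y)

def outdoor_to_indoor(p_e, origin_e, theta):
    # Inverse motion: translate back, then rotate by -theta.
    dx, dy = p_e[0] - origin_e[0], p_e[1] - origin_e[1]
    c, s = math.cos(theta), math.sin(theta)
    return (c * dx + s * dy, -s * dx + c * dy)

p = indoor_to_outdoor((1.0, 0.0), (100.0, 200.0), math.pi / 2)
print(p)  # approximately (100.0, 201.0)
```

Such a pair of functions is all the integrated presentation needs: outdoor display maps indoor data under E, indoor display maps outdoor data under I, without rewriting either data set's stored coordinates.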
Example 2
Embodiment 2 differs from embodiment 1 in that, in step 3, the numbers of the data are used for matrix matching, graph comparison, and screening, and the indoor and outdoor data are registered to their correct positions; the four vertex coordinates in steps 1 and 2 are numbered clockwise by abscissa, and the indoor number matrix M established in step S1 and the outdoor number matrix M' established in step S2 each have matrix elements containing four numbers;
The method further comprises step S3': by selecting an indoor number, matching the number matrix against the image data or CAD drawings under E, and comparing the image data or CAD drawings under E with the graph of the corresponding indoor image data or CAD drawings, the registration position where the number-matrix matching and the graph comparison agree best is screened out, yielding the I coordinate system corresponding to that registration position. That is, the difference between the numbers matched in the number matrix is exactly the distance between the origins of E and I, with the error within a preset threshold; the graph comparison is performed on the difference between the grayed image-data front view and the grayed CAD front view, and the comparison is deemed successful when the total gray level in the resulting gray-difference map is smaller than a preset threshold.
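The gray-difference comparison just described can be sketched directly: gray both views, subtract pixel-wise, and accept the match when the summed difference is under the threshold. The pixel values and threshold below are hypothetical.

```python
# Sketch of the graph comparison in S3': sum the absolute gray-level
# differences between the grayed image-data front view and the grayed
# CAD front view, and call the match successful below a preset threshold.

def grayscale_diff_match(img_a, img_b, threshold):
    # img_a, img_b: equally sized 2D lists of gray levels (0-255).
    total = sum(abs(a - b)
                for row_a, row_b in zip(img_a, img_b)
                for a, b in zip(row_a, row_b))
    return total < threshold, total

front_view = [[10, 20], [30, 40]]   # grayed image-data front view (hypothetical)
cad_view   = [[12, 18], [30, 41]]   # grayed CAD front view (hypothetical)
ok, total = grayscale_diff_match(front_view, cad_view, threshold=10)
print(ok, total)  # True 5
```

With real drawings one would first rasterize both views at the same resolution and alignment; that preprocessing is outside this sketch.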
S4': by selecting an outdoor number, matching the number matrix against the image data or CAD drawings under the I corresponding to the registration position, and comparing them with the graph of the corresponding outdoor image data or CAD drawings, the registration position where the number-matrix matching and the graph comparison agree best is selected again. S5': the above steps S3' and S4' are repeated until the indoor and outdoor numbers are all selected and registration is complete.
S6': for the indoor image data or software drawings under E, the numbers under I are converted to numbers under E through the transformation relation; for the outdoor image data or software drawings under I, the numbers under E are converted to numbers under I through the transformation relation.
According to the method, when indoor and outdoor CityGML three-dimensional model data are fused, the dynamic visual model-fusion display solves the problem that indoor and outdoor scenes cannot be merged because of different data sources and different coordinate precisions; complete matching of the data is achieved without correcting the indoor and outdoor data coordinates, and compared with the prior art the working efficiency of operators is greatly improved.
The foregoing illustrates and describes the principles, general features, and advantages of the present invention. It will be understood by those skilled in the art that the present invention is not limited to the embodiments described above, which merely illustrate its principles; various changes and modifications may be made without departing from the spirit and scope of the invention, all of which fall within the scope of the claims. The scope of the invention is defined by the appended claims.

Claims (6)

1. An indoor and outdoor fusion method based on three-dimensional model data comprises the following steps:
step 1: establishing an outdoor three-dimensional model coordinate system E and an outdoor non-building geographical block number;
step 2: establishing an indoor three-dimensional model coordinate system I and an indoor building number;
step 3: fusing the indoor and outdoor three-dimensional models;
step 4: presenting the indoor and outdoor three-dimensional models in an integrated manner: according to a transformation formula between the outdoor coordinate system and the indoor coordinate system of the model, during outdoor display the indoor data are graphically visualized according to the numbers under E, and during indoor display the outdoor data are dynamically and graphically visualized according to the numbers under I;
the step 1 specifically comprises: constructing an outdoor three-dimensional CityGML model; taking the south and west edges of the base of a randomly selected standard building as the X axis and the Y axis respectively, and taking the right-angle vertex of the standard building as the origin and outdoor reference point O, so as to establish the outdoor three-dimensional model coordinate system E; acquiring the coordinates of the four vertices of the rectangular surface of each outdoor non-building geographic block, the rectangular surface being formed by the four points of the plane that is parallel to the XOY coordinate plane, has the largest area, and is closest to the XOY coordinate plane; taking the point N farthest from the point O among the four vertices as the number of each building under E, the number of each building being automatically generated in the outdoor three-dimensional CityGML model, each number corresponding to a coordinate under E and being expressed by the abscissa of the point N; and establishing a number library K;
the step 2 specifically comprises: constructing an indoor three-dimensional CityGML model from indoor software drawings; constructing, for each building, an indoor three-dimensional model coordinate system I from the base of the building model in the same way as for the standard building base; acquiring the coordinates of the four vertices of the rectangular surface of each indoor building-model base, the rectangular surface being formed by the four points of the plane of the base that is parallel to the xoy coordinate plane of the coordinate system I, has the largest area, and is closest to the xoy coordinate plane; taking the point M' farthest from the point o among the four vertices as the number of each building under I, the number of each building being automatically generated in the indoor three-dimensional CityGML model, each number corresponding to a coordinate under I and being expressed by the abscissa of the point M' under I; and establishing a number library Q;
the step 3 specifically comprises the following steps: establishing a transformation between coordinate systems E and I, and then carrying out the following steps:
S1: dividing the buildings or building groups in the area under E into a plurality of sub-areas L, dividing the indoor software drawings of each sub-area L into training-set drawings and validation-set drawings, and arranging the buildings of each sub-area from west to east and from north to south to construct the number matrix M = [matrix image], wherein each element aij includes a number N; taking the indoor software drawing corresponding to each matrix element aij as a matrix element to construct the graph matrix p = [matrix image], wherein a place with no building is represented by a zero matrix element;
S2: between the matrices M and p, a recognition model GAN(M) = p (1) is established by generative adversarial network (GAN) training and verified with the validation-set drawings;
S3: following the same principle as S1-S2, a recognition model GAN1(M1) = p (2) is established in L and the surrounding adjacent L regions, and a recognition model GAN2(M2) = p (3) is established in L, the surrounding adjacent L, and the next-adjacent region; the region range is expanded in turn until it covers the whole area under E, the q-th recognition model being GANq(Mq) = p (q+1), with the number matrices M1, M2, ..., Mq corresponding to the successively expanded regions;
S4: similarly establishing q+1 outdoor recognition models GAN'(M') = p (1'), GAN'1(M'1) = p (2'), ..., GAN'q(M'q) = p (q+1'); wherein the matrix elements in M' are non-building geographical areas, but do not include the non-building geographical areas inside any sub-area;
S5: selecting at least one number in the number library Q and substituting it into (1), (2), ..., (q) to obtain map results pR1, pR2, ..., pRq respectively; comparing the indoor image data or software drawings with pR1, pR2, ..., pRq, screening out the indoor image data or software drawing pi1 with the highest similarity, and converting it under E according to the relational expression for correct position registration;
S6: selecting at least one number in the number library K and substituting it into (1'), (2'), ..., (q') to obtain map results pR1', pR2', ..., pRq' respectively; comparing the outdoor image data or software drawings with pR1', pR2', ..., pRq', screening out the outdoor image data or software drawing pi2 with the highest similarity, and converting it under the I corresponding to the pi1 whose position was registered in step S5 according to the relational expression for correct position registration;
S7: repeating S5 and S6 until registration is completed for all numbered building locations;
S8: for the indoor image data or software drawings under E, converting the numbers under I to numbers under E through the transformation relation, and for the outdoor image data or software drawings under I, converting the numbers under E to numbers under I through the transformation relation;
wherein the conversion under E according to the relational expression for correct position registration comprises aligning, according to the relational expression, the number point under I corresponding to the pi1 with the highest similarity with the number point of the corresponding building model in the image data or software drawings under E, and making pi1 coincide with the abscissa axis of that building model for position registration; and the conversion under the I corresponding to the pi1 position-registered in step S5 according to the relational expression comprises converting, according to the relational expression, the number point under E corresponding to the pi2 with the highest similarity into the I corresponding to the position-registered pi1, with pi2 coinciding with the abscissa axis of the corresponding non-building geographic area under E.
2. The method according to claim 1, wherein the four vertex coordinates in step 1 and step 2 are numbered clockwise by abscissa, the indoor number matrix M established in step S1 and the outdoor number matrix M' established in step S2 each have matrix elements including four numbers, the numbers of the data are used in step 3 for matrix matching, graph comparison, and screening, the indoor and outdoor data are registered to their correct positions, and the fusion of the indoor and outdoor data is performed again with the matching numbers unified, specifically comprising:
step S3': by selecting an indoor number, matching the number matrix of the corresponding number matrix M in the image data or software drawings under E, and comparing the image data or software drawings under E with the graph of the corresponding indoor image data or software drawings, screening out the registration position where the number-matrix matching and the graph comparison agree best, to obtain the I coordinate system corresponding to that registration position;
S4': by selecting an outdoor number, matching the number matrix against the image data or software drawings corresponding to the number matrix M' at the registration position under the corresponding I, and comparing the image data or software drawings under the corresponding I with the graph of the corresponding outdoor image data or software drawings, selecting again the registration position where the number-matrix matching and the graph comparison agree best;
S5': repeating the steps S3' and S4' until the indoor and outdoor numbers are all selected and registration is complete, obtaining the registration position of the selected outdoor number under the I corresponding to the registration position;
S6': for the indoor image data or software drawings under E, converting the numbers under I to numbers under E through the transformation relation, and for the outdoor image data or software drawings under I, converting the numbers under E to numbers under I through the transformation relation.
3. The method according to claim 1 or 2, characterized in that constructing the outdoor three-dimensional CityGML model in step 1 comprises: acquiring outdoor image data by satellite or aerial camera equipment; performing data processing on the acquired outdoor image data, including color balancing, defogging, correction, removal of bad lines and strips, and noise removal; and constructing the outdoor three-dimensional CityGML model from the processed outdoor image data; or
constructing the outdoor three-dimensional CityGML model from outdoor software drawings, the software drawings comprising CAD drawings or 3dMAX drawings.
4. The method according to claim 1 or 2, characterized in that constructing the indoor three-dimensional CityGML model from indoor software drawings in step 2 comprises: acquiring indoor image data by a camera, monitoring equipment, camera equipment, or laser three-dimensional scanner equipment; performing data processing on the acquired indoor image data, including removing image data of low definition, drawing-data preprocessing, and drawing-data cleaning; and constructing the indoor three-dimensional CityGML model through software drawings from the processed indoor image data; the coordinate systems E and I are rectangular coordinate systems or non-Euclidean geometric coordinate systems, and the software drawings comprise CAD drawings or 3dMAX drawings.
5. An indoor and outdoor fusion apparatus based on three-dimensional model data for implementing the method of any one of claims 1-4, comprising a processor, a display, and an operation panel, the apparatus further comprising:
an outdoor three-dimensional model coordinate system E and an outdoor building number establishing module;
an indoor three-dimensional model coordinate system I and an indoor building number establishing module;
the indoor and outdoor three-dimensional model fusion module is used for establishing a transformation formula between coordinate systems E and I, carrying out matrix modeling or matrix matching, graph comparison and screening by adopting the serial numbers of all data, carrying out correct position registration on the indoor and outdoor data, and carrying out fusion of the indoor and outdoor data again to unify matching serial numbers;
the indoor and outdoor three-dimensional model integrated presentation module: according to the transformation formula between the outdoor coordinate system and the indoor coordinate system of the model, when outdoor display is carried out, indoor data is visually presented according to the serial number under E, and when indoor display is carried out, outdoor data is dynamically and graphically visually presented according to the serial number under I.
6. The apparatus according to claim 5, further comprising a computer-readable non-transitory storage medium storing a program executable by the processor to implement the indoor and outdoor fusion method based on three-dimensional model data according to any one of claims 1-4.
CN202110320705.1A 2021-03-25 2021-03-25 Indoor and outdoor fusion method and device based on three-dimensional model data Active CN113066112B (en)

Publications (2)

Publication Number Publication Date
CN113066112A CN113066112A (en) 2021-07-02
CN113066112B (en) 2021-10-22




