CN115393887B - Model bounding box method for calculating building CAD (computer aided design) block identifiers - Google Patents


Info

Publication number: CN115393887B
Authority: CN (China)
Prior art keywords: bounding box, layer, geometry, processing, cad
Legal status: Active
Application number: CN202210794826.4A
Other languages: Chinese (zh)
Other versions: CN115393887A (en)
Inventors: 刘俊伟, 李同高
Current Assignee: Terry Digital Technology Beijing Co ltd
Original Assignee: Terry Digital Technology Beijing Co ltd
Application filed by Terry Digital Technology Beijing Co ltd
Priority claimed from application CN202210794826.4A
Publication of CN115393887A; application granted; publication of CN115393887B

Classifications

    • G — PHYSICS
      • G06 — COMPUTING; CALCULATING OR COUNTING
        • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
          • G06V 30/00 — Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
            • G06V 30/422 — Technical drawings; Geographical maps
            • G06V 30/19147 — Obtaining sets of training patterns; Bootstrap methods, e.g. bagging or boosting
            • G06V 30/19153 — Design or setup of recognition systems using rules for classification or partitioning the feature space
            • G06V 30/302 — Images containing characters for discriminating human versus automated computer access
        • G06F — ELECTRIC DIGITAL DATA PROCESSING
          • G06F 30/00 — Computer-aided design [CAD]
            • G06F 30/13 — Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
        • G06N — COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
          • G06N 3/00 — Computing arrangements based on biological models
            • G06N 3/084 — Backpropagation, e.g. using gradient descent


Abstract

The invention provides a model bounding box method for calculating building CAD block identifiers, comprising the following steps. Step 101: acquire the CAD file data of a building plan. Step 102: apply layering-algorithm processing to the file data of step 101. Step 103: generate a plurality of layers from the processing of step 102, each layer meeting the data requirements of further processing by the block algorithm; the layers include a block data layer and a wall layer, and the block data layer includes a layer of door frame and window frame identifiers. Step 104: apply block-algorithm processing to the block data layer generated in step 103. Step 105: use the generated final model bounding box data in the subsequent primitive-construction entity algorithm. On the basis of the multiple sets of model bounding box data obtained in step 105, a GAN network automatically produces the bounding-box-processed CAD drawing from an input CAD drawing, so that bounding box processing of building CAD is realized efficiently and accurately.

Description

Model bounding box method for calculating building CAD (computer aided design) block identifiers
Technical Field
The invention relates to a method for batch processing of identifiers in CAD drawings, in particular to a model bounding box method for calculating building CAD block identifiers, and belongs to the field of building-drawing processing algorithms.
Background
Building drawings, especially common CAD drawings, contain block data that must be manually annotated with a model, and these block data serve only to display a mark. Door frames and window frames, for example, identify not only their own extent but also, through a quarter arc, the range through which the door or window swings open. Modeling from a CAD drawing requires a door to be placed in each door frame, but manual processing is slow and its quality is not guaranteed. A computer algorithm is therefore needed to realize batch processing, saving labor and efficiently producing a building CAD processing drawing that meets the standard.
Disclosure of Invention
To solve the above technical problems, the invention provides a block data representation algorithm based on computer batch cyclic processing, namely an automatic bounding box processing method, which automatically generates the required model bounding box data from the data to be processed and thus frees personnel from the manual identification work of the prior art. The step numbers in the present invention merely designate individual steps and need not be numerically sequential. To this end, the present invention provides a model bounding box method for calculating a building CAD block identifier, comprising the following steps:
step 101: acquiring CAD file data of a building plan;
step 102: carrying out hierarchical algorithm processing on the file data in the step 101;
step 103: generating a plurality of layers through the processing of the step 102, so that each layer meets the data requirement of the further processing of the block algorithm, wherein the plurality of layers comprise a block data layer and a wall layer, and the block data layer comprises a layer of a door frame and a window frame identifier;
step 104: performing block algorithm processing on the block data layer generated in the step 103;
step 105: the final model bounding box data generated in step 104 is used in subsequent primitive construction entity algorithms.
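The layering of steps 101-103 can be sketched as follows. This is a minimal illustration, not the patented implementation: the entity representation and layer names are assumptions, and a real system would parse actual DXF/DWG data.

```python
def split_into_layers(entities):
    """Steps 102-103 (sketch): group raw CAD entities into layers.

    Block identifiers such as door and window frames go into the block
    data layer; walls go into the wall layer (names are illustrative).
    """
    BLOCK_KINDS = {"door_frame", "window_frame"}
    layers = {"block_data": [], "wall": [], "other": []}
    for e in entities:
        if e["kind"] in BLOCK_KINDS:
            layers["block_data"].append(e)
        elif e["kind"] == "wall":
            layers["wall"].append(e)
        else:
            layers["other"].append(e)
    return layers

# Step 101 would supply these entities from the building-plan CAD file.
entities = [
    {"kind": "door_frame", "id": 1},
    {"kind": "wall", "id": 2},
    {"kind": "window_frame", "id": 3},
]
layers = split_into_layers(entities)
```

The block data layer produced here is what step 104 processes further.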
The step 104 specifically includes the following steps:
step 401: inputting a group of input parameters of the associated block data layer;
step 402: according to the previous step 401, an input layer derived from the list parameters, i.e. a block data layer, is analyzed for model bounding box processing; preferably, the block data layer comprises any one of a persistence layer, a memory layer, a context layer and a database layer.
Step 403: according to step 401, a referenced layer derived from the list parameters is analyzed;
step 404: according to step 401, the basic bounding box types derived from the list parameters are analyzed;
step 405: according to step 401, a Buffer from the list parameter, i.e. a Buffer parameter, is analyzed;
it should be emphasized that, since each bounding box algorithm has its own advantages and disadvantages and its own adapted scenario, the user can be given the calculation method of the underlying bounding box by means of this input parameter. The input parameters in step 401 and the parameter Buffer of step 405 are used in combination. The input parameters are associated with a list, which contains the associated block data layer, the referenced layer, the base bounding box type, and the Buffer.
Step 407: explosion processing is performed on each primitive in the input layer of step 402 and in the multiple layers of the reference layer of step 403. Each primitive here is a primitive with a fixed area formed in a specified CAD drawing enlarged state. Each primitive is processed into non-repartitionable line segments (including both straight and curved segments), i.e. each segment has only a start point and an end point; a spatial index is then established on the exploded geometry.
It will be appreciated that decomposing each primitive into non-repartitionable line segments defines its spatial coordinate position, so that the subsequent primitive loop can identify each primitive and run the bounding box algorithm on it. For example, a door decomposes into two parallel line segments, and once the coordinates of the two end points of each segment are fixed, the position and orientation of the door in the CAD drawing are completely determined. Since each primitive of the reference layer is fixed, and the magnification of an imported input layer may not match the prescribed CAD enlarged state, the input layer is enlarged to the prescribed CAD drawing enlarged state before the matching calculation is performed.
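Explosion and spatial indexing, as in step 407, can be sketched like this. A uniform grid is only one possible spatial index (the patent does not name one), and the polyline representation is an assumption.

```python
from collections import defaultdict

def explode(primitive):
    """Step 407 (sketch): split a polyline primitive into segments,
    each having only a start point and an end point."""
    pts = primitive["points"]
    return [(pts[i], pts[i + 1]) for i in range(len(pts) - 1)]

def build_grid_index(segments, cell=10.0):
    """A simple uniform-grid spatial index over segment end points."""
    index = defaultdict(list)
    for seg in segments:
        for (x, y) in seg:
            index[(int(x // cell), int(y // cell))].append(seg)
    return index

# A door frame as a two-segment polyline (illustrative coordinates).
door = {"points": [(0, 0), (0, 8), (2, 8)]}
segs = explode(door)            # two non-repartitionable segments
index = build_grid_index(segs)  # queried later in step 413
```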
Step 410: a primitive looping step, wherein each primitive of the input layer of step 402 is looped, the looping step comprising the steps of:
step 411: classifying the shape of the primitive, which classification is used in the subsequent steps 416, 417; wherein the shape classification includes a door, a window;
step 412: generating a basic bounding box of the primitive according to the basic bounding box type specified by the user in step 404, and applying buffer processing according to the Buffer parameter of step 405 to form the basic bounding box geometry; the geometry of an input-layer primitive before the basic bounding box is formed is called the primitive geometry in the input layer.
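For the axis-aligned case, the basic bounding box of step 412 plus the Buffer of step 405 amounts to the following sketch (the patent leaves the box type to the user; axis-aligned is assumed here for illustration).

```python
def base_bounding_box(segments, buffer=0.0):
    """Step 412 (sketch): axis-aligned bounding box over a primitive's
    exploded segments, expanded on every side by the Buffer distance."""
    xs = [p[0] for seg in segments for p in seg]
    ys = [p[1] for seg in segments for p in seg]
    return (min(xs) - buffer, min(ys) - buffer,
            max(xs) + buffer, max(ys) + buffer)

# Same door segments as above, with a Buffer of 0.5 drawing units.
box = base_bounding_box([((0, 0), (0, 8)), ((0, 8), (2, 8))], buffer=0.5)
```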
Step 413: inquiring in the spatial index by utilizing the exploded primitives in the reference layer of the step 407 according to the basic bounding box geometry processed in the previous step, wherein the inquiring result is called reference geometry;
step 414: pairing the basic bounding box geometry and the reference geometry in pairs;
the matching values of the two mating geometries are calculated in a subsequent step thereafter.
Step 415: calculate the location type between the basic bounding box geometry corresponding to the primitive geometry in the input layer (i.e. the block data layer) of loop step 410 and the reference geometry paired with it. In practice there are more than ten location types, and subsequent processing treats different location types differently.
The judgment is made according to the position and orientation of at least one line segment in the basic bounding box geometry of the primitive forming the input layer. For example, a door is in a horizontal-type position before processing and in a vertical type after processing.
Step 416: according to the shape classification of step 411 and the position type of step 415, the additionally required matching parameters are calculated. These comprise the coordinates of at least one specific point in the basic bounding box geometry formed by each primitive in the input layer and of at least one contrast point in the reference primitive geometry after explosion processing; the area occupied by the primitive in the CAD drawing is calculated on the basis of these coordinates.
Step 417: the matching value is calculated for the pairing geometry of step 414 based on the shape classification of step 411, the location type of step 415, the additional matching parameters of step 416.
Specifically, the matching value calculation process includes:
417-1: amplifying an input image layer to a specified CAD graph amplifying state, calculating the overlapping area between the basic bounding box geometry corresponding to the shape type and the reference geometry corresponding to the position type, and if the difference value between the overlapping area and the preset overlapping area is not more than a preset range, recording that the type pairing is successful; 417-2: calculating the position relation between the coordinate of a specific point and a contrast point in the reference primitive geometry after explosion treatment, such as the distance between the specific point and the contrast point;
417-3: and taking the area difference and the position relationship as matching values.
It will be appreciated that the match value is in fact a set of data that reflects whether the match is good or not during the matching process, i.e. the degree of agreement is characterized by the match value.
Preferably, the specific matching includes overlapping at least one contrast point with a specific point, then rotating the reference primitive about the overlapped point in a prescribed rotation direction with a given angular step, calculating the area difference at every step during rotation. If two consecutive calculations show the difference between the overlapping area and the preset overlapping area growing, the rotation reverses direction with a reduced step; if the difference then starts to grow again, the step is reduced once more and the direction reversed again, and this cycle continues until the difference falls within the preset range.
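The adaptive rotation search just described can be sketched as follows. The scalar `mismatch` function stands in for the overlap-area difference of the real algorithm, which would require full polygon intersection; everything else (step size, tolerance) is illustrative.

```python
def rotate_to_match(mismatch, start=0.0, step=4.0, tol=0.01, max_iter=200):
    """Rotate by `step` degrees while the mismatch shrinks; when it grows,
    reverse direction and halve the step, until mismatch <= tol."""
    angle, prev = start, mismatch(start)
    for _ in range(max_iter):
        if prev <= tol:
            break
        nxt = mismatch(angle + step)
        if nxt > prev:            # mismatch grew: reverse and halve the step
            step = -step / 2.0
        else:                     # mismatch shrank (or held): accept the step
            angle += step
            prev = nxt
    return angle

# Toy objective: the mismatch vanishes when the door is rotated to 90 degrees
# (fully open), mimicking the overlap-area difference reaching its minimum.
best = rotate_to_match(lambda a: abs(a - 90.0) / 90.0, step=4.0)
```

The halving guarantees the search narrows in on the angle where the area difference enters the preset range instead of oscillating past it.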
Alternatively, the matching value calculation process includes:
417-1': amplifying an input image layer to a specified CAD image amplifying state, calculating the position relation between the coordinate of a first specific point and a first contrast point in the reference image element geometry after explosion treatment, and calculating the position relation between a second specific point and a second contrast point in the reference image element geometry after explosion treatment, for example, corresponding to the distance between the specific point and the contrast point;
417-2': and taking the position relation as a matching value. The matching value is then in fact a set of data comprising two distance values
Preferably, the specific matching comprises if the first distance between the first specific point and the first contrast point and the second distance between the second specific point and the second contrast point both meet a preset range, then the matching is successful, resulting in an optimal matching correct geometry.
Step 418: according to the matching value of step 417, calculate the best-matching correct geometry. Specifically, this includes checking during matching whether the matching value agrees with the preset matching value, so as to obtain the best-matching correct geometry with the highest degree of agreement, for example the pairing for which both the area difference and the distance are minimal.
Step 419: apply bounding box algorithm processing to the successfully matched geometric data, which includes deleting the part belonging to the basic bounding box from the successfully matched data and establishing the bounding box belonging to the reference-primitive part.
Step 420: repeat steps 411-419 to complete bounding box algorithm processing for all primitives of the input layer, and persist the spatial geometry data of all input-layer primitives according to the preselected file standard for subsequent processing.
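Selecting the best-matching geometry in step 418 reduces to minimizing the matching value; a sketch, with a hypothetical tuple encoding (geometry id, area difference, point distance):

```python
def best_match(pairings):
    """Step 418 (sketch): pick the pairing whose combined matching value
    (area difference + point distance) is smallest."""
    return min(pairings, key=lambda p: p[1] + p[2])

# Illustrative candidates: g2 has both the smallest area difference
# and the smallest point distance, so it wins.
pairs = [("g1", 0.20, 1.5), ("g2", 0.01, 0.1), ("g3", 0.05, 0.4)]
winner = best_match(pairs)
```

A real implementation would also compare the value against the preset matching value before accepting the winner.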
In one embodiment, to achieve bounding box matching faster, there is also provided a model bounding box method for computing building CAD tile identifiers based on generating a countermeasure network GAN, comprising the steps of:
step 430: forming multiple sets of bounding box data according to steps 101-105, the sets comprising CAD images before and after bounding box processing; the bounding box data and the CAD images are divided into a training set and a verification set in the ratio 5-2:1:1-2;
step 440: inputting the CAD images before bounding box processing from a selected group of the training set, together with given noise, into the generator G to form a set of first pseudo-images;
preferably, the given noise is a uniform distribution function or a normal distribution function.
Step 450: inputting a group of first pseudo-images and a plurality of groups of real processed CAD images into a discriminator D for identification and classification, repeating steps 440 and 450, wherein each time the CAD images before the processing of the bounding boxes in a selected group of training sets are different, each time a loss function value is calculated, and back propagation sequentially and finely adjusts the parameters of the D network until a corresponding group of first pseudo-images which are originally input are identified in the plurality of groups of real processed CAD images with a certain accuracy, so as to complete the training of the discriminator D to form a discriminator D', and the basis of the discriminator D is that the pseudo-images and the real processed CAD images are differentiated and input into the FC for output to a softmax function for two classification;
step 460: inputting the CAD images before bounding box processing from a selected group of the training set, together with given noise, into the generator G to form a set of second pseudo-images;
step 470: input the set of second pseudo-images together with multiple sets of real processed CAD images into discriminator D' for identification and classification. Repeat steps 460 and 470, selecting different pre-processing CAD images from the training set each time; compute a loss function value on each pass and back-propagate it to fine-tune the G network parameters in turn, until the corresponding originally input set of second pseudo-images is identified among the multiple sets of real processed CAD images with a certain accuracy, completing the training of generator G to form generator G' as the bounding box processing model.
The generator G and the discriminator D are both convolutional neural network models. The specific training process of discriminator D in step 450 is as follows: the multiple sets of real processed CAD images and the sets of first pseudo-images are each fed through a CNN model, whose feature extraction results form respective sets of feature maps; the corresponding feature maps are differenced to form a difference map, which is input to a fully connected layer FC and classified by a softmax function; the accuracy is verified on the verification set, and the loss function value is computed and back-propagated to adjust the CNN network until the accuracy reaches a threshold and the loss function value is stable.
It will be appreciated that as the parameters of the G network are adjusted, the sets of pseudograms each get closer to the CAD after processing of their bounding boxes, so that the processed CAD can be obtained as long as the pre-processed image is input into the G.
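The discriminator update of step 450 can be illustrated in miniature. This is not the patented CNN-based discriminator: a plain logistic model on flattened image vectors stands in for it, and the stand-in "real" and "fake" image distributions are fabricated, purely to show a loss value falling as back-propagation adjusts the parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64                              # flattened image size (illustrative)
w, b = np.zeros(d), 0.0             # discriminator parameters

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def d_step(real, fake, lr=0.1):
    """One discriminator update: real images labeled 1, pseudo-images 0;
    parameters follow the binary cross-entropy gradient."""
    global w, b
    x = np.vstack([real, fake])
    y = np.concatenate([np.ones(len(real)), np.zeros(len(fake))])
    p = sigmoid(x @ w + b)
    w -= lr * (x.T @ (p - y)) / len(y)    # gradient w.r.t. weights
    b -= lr * np.mean(p - y)              # gradient w.r.t. bias
    return -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))

real = rng.normal(1.0, 0.1, size=(32, d))   # stand-in "processed CAD" images
fake = rng.normal(0.0, 0.1, size=(32, d))   # stand-in generator output
losses = [d_step(real, fake) for _ in range(50)]
```

The same gradient machinery, applied to the generator's parameters with the discriminator held fixed, corresponds to the G-network fine-tuning of step 470.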
Advantageous effects
(1) Adopting the ideas of bounding boxes and graphic primitives, the basic bounding boxes formed from the input are optimally matched with the reference primitives, and bounding box processing yields the processed dynamic display effect of doors and windows;
(2) A GAN network autonomously learns the bounding box processing, so the characteristics of that processing can be learned and the correctly bounding-box-processed image can be output by the trained generator G'.
Drawings
Figure 1 a general flow chart of a model bounding box method for computing building CAD tile identifiers of embodiment 1 of the invention,
figure 2 is a detailed flowchart of a bounding box method including loop processing in embodiment 1 of the present invention,
fig. 3a shows the states of the reference gate and the base bounding box geometry before matching, respectively the contrast point and the specific point, fig. 3b shows the state after matching is successful,
figure 4 is a schematic diagram of a matching process,
figure 5 a GAN modeling process diagram of embodiment 2 of the invention,
figures 6 and 7 are the enclosure states of the door office department before and after the process according to embodiment 1,
fig. 8 and 9 are divided into the states of the CAD drawing of the building before and after the G' model is input in example 2.
Detailed Description
Example 1
The model bounding box method for calculating building CAD tile identifiers of the present invention as shown in fig. 1 comprises the steps of:
step 101: acquiring CAD file data of a building plan;
step 102: carrying out hierarchical algorithm processing on the file data in the step 101;
step 103: generating a plurality of layers through the processing of the step 102, so that each layer meets the data requirement of the further processing of the block algorithm, wherein the plurality of layers comprise a block data layer and a wall layer, and the block data layer comprises a layer of a door frame and a window frame identifier;
step 104: performing block algorithm processing on the block data layer generated in the step 103;
step 105: the final model bounding box data generated in step 104 is used in subsequent primitive construction entity algorithms.
As shown in fig. 2, step 104 specifically includes the following steps:
step 401: inputting a group of input parameters of the associated block data layer;
step 402: according to the previous step 401, an input layer derived from the list parameters, i.e. a block data layer, is analyzed for model bounding box processing; preferably, the block data layer comprises any one of a persistence layer, a memory layer, a context layer and a database layer.
Step 403: according to step 401, a referenced layer derived from the list parameters is analyzed;
step 404: according to step 401, the basic bounding box types derived from the list parameters are analyzed;
step 405: according to step 401, a Buffer from the list parameter, i.e. a Buffer parameter, is analyzed;
wherein input parameters are associated with a list (not shown by way of example) comprising the associated block data layer, the referenced layer, the base bounding box type, and the Buffer.
Step 407: explosion processing is performed on each primitive in the input layer of step 402 and in the multiple layers of the reference layer of step 403. Each primitive is processed into non-repartitionable line segments (both straight and curved), each segment having only a start point and an end point, and a spatial index is then established on the geometry after explosion processing; each primitive in the multiple layers of the reference layer is a primitive with a fixed area formed in the specified CAD drawing enlarged state.
Next, step 410 is entered: a primitive looping step, wherein each primitive of the input layer of step 402 is looped, the looping step comprising the steps of:
step 411: classifying the shape of the primitive, which classification is used in the subsequent steps 416, 417;
step 412: generating a basic bounding box of the primitive according to the basic bounding box type designated by the user in step 404, and performing buffer processing according to the parameter buffer in step 405 to form a basic bounding box geometry;
step 413: inquiring in the spatial index by utilizing the exploded primitives in the reference layer of the step 407 according to the basic bounding box geometry processed in the previous step, wherein the inquiring result is called reference geometry;
step 414: pairing the basic bounding box geometry and the reference door pairwise (fig. 3b);
the matching values of the two mating geometries are calculated in a subsequent step thereafter.
Step 415: calculate the location type between the primitive geometry in the input layer (i.e. block data layer) of loop step 410 and the geometry paired with it. In practice there are more than ten location types, and subsequent processing treats different location types differently.
As shown in fig. 3a, taking a door lying in the horizontal direction as an example: before matching, the judgment is made from the position and orientation of at least one line segment in the basic bounding box geometry constituting the door in the input layer; the door is in a horizontal-type position before processing and in a vertical type after processing, which shows an openable-and-closable open state. The dashed line in the basic bounding box geometry represents the boundary of the basic bounding box, which the computer recognizes where no actual straight segment of the door exists; this boundary, together with the selected segments of the door, encloses the basic bounding box geometry that serves as the basis for the overlapping-area calculation below.
Step 416: additional required matching parameters are calculated based on the shape classification of step 411, the location type of step 415.
As shown in fig. 3a, the matching parameters include the coordinates of specific point 1 and specific point 2 in the basic bounding box geometry formed by each door in the input layer, the coordinates of contrast point 1 and contrast point 2 in the reference primitive geometry after explosion processing, and, based on these coordinates, the area occupied by the door in the CAD drawing in the specified enlarged state.
Step 417: the matching value is calculated for the pairing geometry of step 414 based on the shape classification of step 411, the location type of step 415, the additional matching parameters of step 416.
Specifically, the matching value calculation process includes:
417-1: amplifying an input image layer to a specified CAD graph amplifying state, calculating the overlapping area between the basic bounding box geometry corresponding to the shape type and the reference geometry corresponding to the position type, and if the difference between the overlapping area and the preset overlapping area is not more than a preset range (< 1%), recording that the type pairing is successful;
417-2: calculating the position relation between the coordinates of the specific point 1 and one contrast point 1 in the reference door geometry after explosion treatment, namely whether the coordinates coincide;
417-3: and taking whether the difference value sum is coincident or not as a matching value.
The specific matching comprises, as shown in fig. 4, overlapping the comparison point 1 with the specific point 2, rotating the reference door by the overlapping point by a step distance of 2-5 degrees clockwise, continuously calculating the difference value under each step distance in the rotating process, if two continuous calculated difference values are gradually increased (upper graph), reducing the step distance anticlockwise to the original half rotation (lower graph), if the difference value is found to start to be increased again, continuing to reduce the half step distance clockwise and continuing to rotate, and circulating until the difference value is within a preset range (< 1%).
Alternatively, as shown in fig. 3b, the matching value calculation process includes:
417-1': calculate the positional relation between the coordinates of specific point 2 and contrast point 1 in the reference door geometry after explosion processing, i.e. whether they coincide, and whether the distance L2 between specific point 1 and contrast point 2 equals the preset value L = half the geometric width of the basic bounding box;
417-2': take the coincidence result and whether L2 equals the preset value as the matching value.
The specific matching is: if specific point 2 coincides with contrast point 1, and the second distance between specific point 1 and contrast point 2 equals the preset value L, the matching succeeds, forming the best-matching correct geometry.
It will be appreciated that once specific point 2 coincides with contrast point 1, the second distance between specific point 1 and contrast point 2 can only be L. Since the width of a door is standard, once the magnification of the input layer has been matched to the prescribed CAD magnification, the length of the reference door and the length of the basic bounding box agree when the reference door is closed.
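The point-coincidence and preset-distance check of 417-1'/417-2' amounts to the following sketch; the point coordinates and the width are fabricated for illustration.

```python
import math

def points_match(spec2, contrast1, spec1, contrast2, L, tol=1e-6):
    """Alternative matching check (sketch): specific point 2 must coincide
    with contrast point 1, and specific point 1 must sit at the preset
    distance L (half the basic bounding box width) from contrast point 2."""
    coincide = math.dist(spec2, contrast1) <= tol
    at_preset = abs(math.dist(spec1, contrast2) - L) <= tol
    return coincide and at_preset

# A door of geometric width 2, so L = 1.0; the points satisfy both criteria.
ok = points_match((0, 0), (0, 0), (2, 0), (1, 0), L=1.0)
```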
Step 418: according to the matching value in the step 417, the best matching correct geometry is calculated, which specifically includes calculating whether the matching value is matched with the preset matching value in the matching process, so as to obtain the best matching correct geometry with the highest matching degree;
step 419: carrying out bounding box algorithm processing on the successfully matched geometric data, wherein the processing specifically comprises deleting a part belonging to a basic bounding box in the successfully matched geometric data, and establishing a bounding box belonging to a part of a reference primitive;
step 420: repeating steps 411-419 to complete the bounding box algorithm processing for all primitives of the input layer, and persisting the spatial geometry data of all primitives of the input layer according to the preselected file criteria for subsequent processing.
Example 2
As shown in fig. 5, a model bounding box method for calculating building CAD block identifiers based on a generative adversarial network (GAN) comprises the steps of:
step 430: forming multiple groups of bounding box data according to steps 101-105, wherein the bounding box data comprise CAD images before and after bounding box processing; the bounding box data are divided into a training set and a verification set in a ratio of 3:1;
step 440: inputting the CAD images before bounding box processing of a selected group of the training set, together with a normal distribution function, into a generator G to form a group of first pseudo-graphs;
step 450: inputting the group of first pseudo-graphs together with multiple groups of real processed CAD images into a discriminator D for identification and classification; steps 440 and 450 are repeated, with a different selected group of pre-processing training-set CAD images each time; a loss function value a is calculated each time, and back propagation successively fine-tunes the D network parameters, until the initially input group of first pseudo-graphs is identified among the multiple groups of real processed CAD images with a certain accuracy acc1 (> 90%), completing the training of the discriminator D to form the discriminator D'.
The specific training process is as follows: the multiple groups of real processed CAD images and the first pseudo-graphs are input into a CNN model to obtain respective groups of feature maps; the feature extraction results are differenced to form a differential graph; the differential graph is input into a fully-connected layer FC and classified by a softmax function; the accuracy is verified on the verification set, and the loss function value a is calculated for back propagation to adjust the CNN network, until the accuracy reaches the threshold value (more than 90%) and the loss function value stabilizes.
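The discriminator head described above (feature difference, then a fully-connected layer, then softmax for two-class output) can be sketched in plain Python. The flattened feature vectors, weight matrix, and biases below are hypothetical stand-ins for the CNN feature maps and FC parameters:

```python
import math

def softmax(logits):
    """Numerically stable softmax over a list of logits."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def discriminate(real_features, fake_features, weights, biases):
    """Sketch of the discriminator head: subtract the feature vectors of
    the real processed CAD image and the pseudo-graph (the 'differential
    graph'), feed the difference through one fully-connected layer FC,
    and apply softmax for the real/fake two-class probabilities."""
    diff = [r - f for r, f in zip(real_features, fake_features)]
    logits = [sum(w * d for w, d in zip(row, diff)) + b
              for row, b in zip(weights, biases)]
    return softmax(logits)
```

When the two feature vectors coincide, the differential graph is zero and the head outputs an uninformative 50/50 split, which is the equilibrium a well-trained generator drives the discriminator toward.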
Step 460: inputting the selected training-set CAD images before bounding box processing, together with the normal distribution function, into the generator G to form a group of second pseudo-graphs;
step 470: inputting the group of second pseudo-graphs together with multiple groups of real processed CAD images into the discriminator D' for identification and classification; steps 460 and 470 are repeated, with a different selected group of pre-processing training-set CAD images each time; a loss function value b is calculated each time, and back propagation successively fine-tunes the G network parameters, until the initially input group of second pseudo-graphs is identified among the multiple groups of real processed CAD images with a certain accuracy acc2 (> 95%), completing the training of the generator G to form the generator G' as the bounding box processing model.
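The two-phase training of steps 440-470 — fine-tune the discriminator D until its accuracy exceeds acc1, then fine-tune the generator G against the resulting D' until accuracy acc2 — can be sketched as a skeleton. All callbacks (`train_d_epoch`, `train_g_epoch`, `d_acc`, `g_acc`) are hypothetical hooks standing in for the actual loss computation and back propagation:

```python
def train_gan(train_d_epoch, train_g_epoch, d_acc, g_acc,
              acc1=0.90, acc2=0.95, max_epochs=1000):
    """Skeleton of steps 440-470: first phase fine-tunes D (loss a) until
    accuracy acc1 (> 90%), forming D'; second phase fine-tunes G (loss b)
    against D' until accuracy acc2 (> 95%), forming G', which serves as
    the bounding box processing model."""
    for _ in range(max_epochs):        # steps 440-450: train discriminator
        train_d_epoch()                # compute loss a, back-propagate into D
        if d_acc() > acc1:
            break                      # D' is ready
    for _ in range(max_epochs):        # steps 460-470: train generator
        train_g_epoch()                # compute loss b, back-propagate into G
        if g_acc() > acc2:
            break                      # G' is the bounding box processing model
```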
Fig. 6 and fig. 7 show the bounding box states of a local door region before and after the processing of embodiment 1, respectively. Fig. 8 and fig. 9 show the states of a building CAD drawing before and after being input into the G' model of embodiment 2, respectively.

Claims (5)

1. A model bounding box method for calculating building CAD block identifiers, comprising the steps of:
step 101: acquiring CAD file data of a building plan;
step 102: carrying out hierarchical algorithm processing on the file data in the step 101;
step 103: generating a plurality of layers through the processing of step 102, so that each layer meets the data requirements for further processing by the block algorithm, wherein the plurality of layers comprise a block data layer and a wall layer, and the block data layer comprises a layer of door frame and window frame identifiers;
step 104: performing block algorithm processing on the block data layer generated in the step 103;
step 105: the final model bounding box data generated in the step 104 are used for the subsequent primitive construction entity algorithm;
step 104 specifically includes the following steps:
step 401: inputting a group of input parameters of the associated block data layer; the input parameters are associated with a list, wherein the list comprises the associated block data layers, the referenced layers, the basic bounding box types and Buffer;
step 402: according to the previous step 401, an input layer derived from the list parameters, i.e. a block data layer, is analyzed for model bounding box processing; the block data layer comprises any one of a persistence layer, a memory layer, a context layer and a database layer;
step 403: according to step 401, a referenced layer derived from the list parameters is analyzed;
step 404: according to step 401, the basic bounding box types derived from the list parameters are analyzed;
step 405: according to step 401, a Buffer from the list parameter, i.e. a Buffer parameter, is analyzed;
step 407: performing explosion processing on each primitive in the input layer of step 402 and in the plurality of layers of the referenced layers of step 403, processing each primitive into line segments that cannot be further divided, that is, each line segment has only a start point and an end point, and then establishing a spatial index on the geometry after the explosion processing, wherein each primitive in the plurality of layers of the referenced layers is a primitive with a fixed area formed in the specified CAD drawing enlarged state;
step 410: a primitive looping step, namely looping over each primitive of the input layer of step 402;
the cyclic process comprises the steps of:
step 411: performing shape classification on each primitive of an input layer, wherein the shape classification comprises a door and a window;
step 412: generating a basic bounding box of the primitive according to the basic bounding box type designated by the user in step 404, and performing buffer processing according to the parameter buffer in step 405 to form a basic bounding box geometry;
step 413: querying, according to the basic bounding box, the spatial index built on the primitives of the exploded referenced layers of step 407, the query result being called the reference geometry;
step 414: pairing the basic bounding box geometry and the reference geometry in pairs;
step 415: calculating the position type between the basic bounding box geometry corresponding to the primitive geometry of the input layer in the loop of step 410 and the reference geometry paired with it, wherein there are more than 10 and fewer than 20 position types, judged specifically according to the position and orientation of at least one line segment in the basic bounding box geometry formed for the primitive of the input layer;
step 416: calculating additional required matching parameters according to the shape classification of the step 411 and the position type of the step 415, wherein the matching parameters comprise the coordinates of at least one specific point in the basic bounding box geometry formed by each primitive in the input layer, the coordinates of a contrast point in the reference primitive geometry after explosion treatment, and the area occupied by each primitive in the CAD graph is calculated based on the coordinates of the specific point and the coordinates of the contrast point of each primitive;
step 417: calculating a matching value for the pairing geometry of step 414 based on the shape classification of step 411, the location type of step 415, the additional matching parameters of step 416;
step 418: calculating the best matching correct geometry according to the matching value of step 417, which specifically includes checking, during matching, whether the matching value matches the preset matching value, so as to obtain the best matching correct geometry with the highest matching degree;
step 419: performing bounding box algorithm processing on the successfully matched geometric data, which specifically includes deleting the part belonging to the basic bounding box in the successfully matched geometric data, and establishing a bounding box for the part belonging to the reference primitive;
step 420: repeating steps 411-419 to complete the bounding box algorithm processing for all primitives of the input layer, and persisting the spatial geometry data of all primitives of the input layer according to the preselected file criteria for subsequent processing.
2. The method according to claim 1, characterized in that the matching value calculation process comprises:
step 417-1: enlarging the input layer to the specified CAD drawing enlarged state, calculating the overlapping area between the basic bounding box geometry corresponding to the shape classification and the reference geometry corresponding to the position type, and recording the type pairing as successful if the difference between the overlapping area and a preset overlapping area does not exceed a preset range;
step 417-2: calculating the positional relation between the coordinates of a specific point and a contrast point in the reference primitive geometry after explosion processing;
step 417-3: taking the area difference and the positional relation as the matching value; or,
the matching value calculation process comprises:
step 417-1': enlarging the input layer to the specified CAD drawing enlarged state, calculating the positional relation between the coordinates of a first specific point and a first contrast point in the reference primitive geometry after explosion processing, and calculating the positional relation between a second specific point and a second contrast point in the reference primitive geometry after explosion processing;
step 417-2': taking the positional relation as the matching value;
the degree of coincidence is characterized by the matching value.
3. The method according to claim 2, wherein the positional relation is the distance between a specific point and a contrast point; the specific matching of steps 417-1 to 417-3 comprises overlapping at least one contrast point with the specific point, rotating the reference primitive about the overlapped point in a prescribed rotation direction with a prescribed angle step, continuously calculating the area difference value at each step during rotation, reversing the rotation direction and halving the step if two consecutive calculated differences between the overlapping area and the preset overlapping area increase, reversing the direction and halving the step again if the difference value is found to start increasing again, and repeating the above until the difference value is within the preset range;
the matching of steps 417-1 '-417-2' includes successful matching if the first distance between the first specific point and the first comparison point and the second distance between the second specific point and the second comparison point both meet a preset range, resulting in an optimal matching correct geometry.
4. A method according to any one of claims 1-3, characterized in that it further comprises the following steps after step 420:
step 430: forming multiple groups of bounding box data according to steps 101-105, wherein the bounding box data comprise CAD images before and after bounding box processing; the bounding box data are divided into a training set and a verification set in a ratio of 3:1;
step 440: inputting the CAD images before bounding box processing of a selected group of the training set, together with given noise, into a generator G to form a group of first pseudo-graphs;
step 450: inputting the group of first pseudo-graphs together with multiple groups of real processed CAD images into a discriminator D for identification and classification; steps 440 and 450 are repeated, with a different selected group of pre-processing training-set CAD images each time; a loss function value is calculated each time, and back propagation successively fine-tunes the D network parameters, until the initially input group of first pseudo-graphs is identified among the multiple groups of real processed CAD images with a certain accuracy, completing the training of the discriminator D to form a discriminator D'; the basis of the discriminator D is that the pseudo-graphs and the real processed CAD images are differenced and input into the FC, whose output is passed to a softmax function for two-class classification;
step 460: inputting the CAD images before bounding box processing of the selected training set, together with given noise, into the generator G to form a group of second pseudo-graphs;
step 470: inputting the group of second pseudo-graphs together with multiple groups of real processed CAD images into the discriminator D' for identification and classification; steps 460 and 470 are repeated, with a different selected group of pre-processing training-set CAD images each time; a loss function value is calculated each time, and back propagation successively fine-tunes the G network parameters, until the initially input group of second pseudo-graphs is identified among the multiple groups of real processed CAD images with a certain accuracy, completing the training of the generator G to form a generator G' as the bounding box processing model.
5. The method according to claim 4, wherein the given noise is a uniform distribution function or a normal distribution function, and the generator G and the discriminator D are both convolutional neural network models; the specific training process of the discriminator D in step 450 is as follows: the multiple groups of real processed CAD images and the multiple groups of first pseudo-graphs are input into a CNN model to obtain respective groups of feature maps; the feature maps are differenced to form a differential graph; the differential graph is input into the fully-connected layer FC and classified by a softmax function; the accuracy is verified on the verification set, and a loss function value is calculated for back-propagation adjustment of the CNN network, until the accuracy reaches a threshold and the loss function value stabilizes.
CN202210794826.4A 2022-07-07 2022-07-07 Model bounding box method for calculating building CAD (computer aided design) block identifiers Active CN115393887B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210794826.4A CN115393887B (en) 2022-07-07 2022-07-07 Model bounding box method for calculating building CAD (computer aided design) block identifiers


Publications (2)

Publication Number Publication Date
CN115393887A CN115393887A (en) 2022-11-25
CN115393887B true CN115393887B (en) 2023-09-29

Family

ID=84117479


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109993827A (en) * 2019-03-29 2019-07-09 宁波睿峰信息科技有限公司 A kind of elevation recognition methods that architectural drawing is converted to three-dimensional BIM model
WO2019153445A1 (en) * 2018-02-08 2019-08-15 真玫智能科技(深圳)有限公司 Method and device for cloth-human collision
CN110827385A (en) * 2018-08-10 2020-02-21 辉达公司 Query-specific behavior modification of tree traversal
CN113191311A (en) * 2021-05-19 2021-07-30 广联达科技股份有限公司 Filling boundary identification method, device and equipment of vector PDF drawing and storage medium
CN114065344A (en) * 2021-11-11 2022-02-18 北京神州安信科技股份有限公司 Automatic modeling method based on OSG and CAD building plan


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Junfang Zhu et al. A New Reconstruction Method for 3D Buildings from 2D Vector Floor Plan. Computer-Aided Design and Applications. 2014, 704-714. *
Cao Shuanghui. Research and Implementation of a Comparison System between Industrial CT Images and CAD Models Based on Primitive Features. China Masters' Theses Full-text Database, Information Science and Technology. 2015, I140-616. *



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant