CN114429638B - Construction drawing examination management system - Google Patents
- Publication number: CN114429638B
- Application number: CN202210352817.XA
- Authority: CN (China)
- Prior art keywords: feature extraction, convolution, layer, pooling layer, construction drawing
- Prior art date: 2022-04-06
- Legal status: Expired - Fee Related (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06F18/253 — Pattern recognition; analysing; fusion techniques of extracted features
- G06N3/045 — Computing arrangements based on biological models; neural networks; architecture; combinations of networks
- G06N3/049 — Computing arrangements based on biological models; neural networks; architecture; temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
- G06N3/08 — Computing arrangements based on biological models; neural networks; learning methods
Abstract
The invention discloses a construction drawing examination management system comprising: a construction drawing uploading subsystem, a construction drawing formal review subsystem, a construction drawing classification subsystem, a database subsystem and an examination distribution subsystem. The construction drawing uploading subsystem is used for uploading construction drawings; the construction drawing formal review subsystem is used for performing formal review on the uploaded construction drawings; the construction drawing classification subsystem is used for classifying the construction drawings that pass formal review and storing the classified construction drawings in the database subsystem; the examination distribution subsystem is used for distributing the construction drawings stored in the database subsystem. The invention solves the problems that the conventional construction drawing joint examination system lacks a formal review link for construction drawings and does not store the construction drawings in a classified manner.
Description
Technical Field
The invention relates to construction drawing examination, in particular to a construction drawing examination management system.
Background
At present, most construction drawing examination still uses the traditional paper blueprint review mode, which is not only costly but also harmful to the environment. Review by multiple departments and repeated review lead to long cycles: enterprises print the drawings and deliver them to the departments one by one, and after each modification the drawings have to be re-folded and delivered again, so enterprises make repeated trips, the progress of engineering construction is affected, and opinions and complaints abound. Meanwhile, under the traditional paper review mode, review traces are difficult to retain, and lax review or a lack of fairness and impartiality caused by conflicts of interest can easily occur in every link, which is not conducive to the healthy development of the industry.
Although the conventional construction drawing joint examination system allows various construction drawings to be uploaded, stored and distributed to each examination department or examiner, it lacks a formal review link for the construction drawings and does not classify and store them, so the construction drawings are easily distributed to the wrong place.
Disclosure of Invention
Aiming at the above defects in the prior art, the construction drawing examination management system provided by the invention solves the problems that the conventional construction drawing joint examination system lacks a formal review link for construction drawings and does not store the construction drawings in a classified manner.
In order to achieve the purpose of the invention, the invention adopts the following technical scheme: a construction drawing examination management system comprising: a construction drawing uploading subsystem, a construction drawing formal review subsystem, a construction drawing classification subsystem, a database subsystem and an examination distribution subsystem;
the construction drawing uploading subsystem is used for uploading construction drawings; the construction drawing formal review subsystem is used for performing formal review on the uploaded construction drawing; the construction drawing classification subsystem is used for classifying the construction drawings which pass formal examination and storing the classified construction drawings into the database subsystem; the examination distribution subsystem is used for distributing the construction drawings stored in the database subsystem.
Further, the construction drawing formal review subsystem comprises: a file format examining unit, a title examining unit, a line type examining unit and a color examining unit;
the file format examining unit is used for examining the extension of the construction drawing file according to the file extension examining format; the title examination unit is used for examining the title according to the title examination format; the line type examination unit is used for extracting the drawing line data of the construction drawing and examining the line type according to the line type examination format; the color checking unit is used for extracting the color data of the construction drawing and checking the color of the construction drawing according to the color checking format of the construction drawing.
The beneficial effects of the above further scheme are: formal review of the construction drawings ensures that they meet the most basic formal requirements, reduces the workload of the subsequent examination and approval links, and standardizes the construction drawings.
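As an illustration of how the four review units could work together, the following sketch checks a drawing against the four review formats. The concrete rule values (allowed extensions, title pattern, line types and colour indices) are assumptions for illustration only; the patent does not fix them.

```python
# Hedged sketch of the formal review checks; all rule values below are assumed.
import os
import re

ALLOWED_EXTENSIONS = {".dwg", ".dxf", ".pdf"}             # assumed file-extension review format
TITLE_PATTERN = re.compile(r"^[\w\u4e00-\u9fff]+-\d{3}$")  # assumed title review format
ALLOWED_LINE_TYPES = {"continuous", "dashed", "center"}    # assumed line-type review format
ALLOWED_COLORS = {1, 2, 3, 7}                              # assumed colour indices

def formal_review(path, title, line_types, colors):
    """Return (passed, reasons) for the four formal checks."""
    reasons = []
    if os.path.splitext(path)[1].lower() not in ALLOWED_EXTENSIONS:
        reasons.append("file extension not allowed")
    if not TITLE_PATTERN.match(title):
        reasons.append("title does not match the review format")
    if not set(line_types) <= ALLOWED_LINE_TYPES:
        reasons.append("unexpected line type")
    if not set(colors) <= ALLOWED_COLORS:
        reasons.append("unexpected colour")
    return (not reasons, reasons)

passed, reasons = formal_review("plan-001.dwg", "结构平面图-001", ["continuous"], [1, 7])
```

Drawings that fail any check would be returned for correction before entering the classification subsystem.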
Further, the construction drawing classification subsystem comprises: a convolutional neural network feature extraction module, an LSTM feature extraction module, a fusion module and a full-connection module;
The input end of the convolutional neural network feature extraction module and the input end of the LSTM feature extraction module are used as the input ends of the construction drawing classification subsystem; the output end of the convolutional neural network feature extraction module and the output end of the LSTM feature extraction module are both connected with the input end of the fusion module; the output end of the fusion module is connected with the input end of the full-connection module; and the output end of the full-connection module is used as the output end of the construction drawing classification subsystem.
The beneficial effects of the above further scheme are: the invention designs a convolutional neural network feature extraction module and an LSTM feature extraction module to extract multiple features from a construction drawing; the multiple features are fused, and classification is carried out on the fused features, which improves the classification accuracy.
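A minimal PyTorch sketch of this two-branch design is given below, assuming a single-channel drawing image and a tokenized text sequence as inputs; the layer sizes, vocabulary size and number of classes are illustrative assumptions, and the CNN branch is simplified to a single scale.

```python
# Sketch of the classification subsystem: CNN branch + LSTM branch -> fusion -> fully-connected.
import torch
import torch.nn as nn

class DrawingClassifier(nn.Module):
    def __init__(self, num_classes=10, vocab_size=5000, embed_dim=64):
        super().__init__()
        # Convolutional neural network feature extraction branch (single scale here)
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten())            # -> (B, 16)
        # LSTM feature extraction branch over the text found on the drawing
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, 32, batch_first=True)  # -> (B, 32)
        # Fusion module (concatenation) followed by the full-connection module
        self.fc = nn.Linear(16 + 32, num_classes)

    def forward(self, image, text_ids):
        img_feat = self.cnn(image)                    # image: (B, 1, H, W)
        _, (h_n, _) = self.lstm(self.embed(text_ids)) # text_ids: (B, T) token indices
        fused = torch.cat([img_feat, h_n[-1]], dim=1)
        return self.fc(fused)                         # class scores per drawing type
```

The predicted class then determines under which drawing category the file is stored in the database subsystem.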
Further, the convolutional neural network feature extraction module comprises: a first convolution feature extraction network, a second convolution feature extraction network and a third convolution feature extraction network.
Furthermore, the convolution kernels of the convolution layers adopted in the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network are different in size.
Further, the convolution kernel size of the convolution layer adopted in the first convolution feature extraction network is 3 × 3; the convolution kernel size of the convolution layer adopted in the second convolution feature extraction network is 5 × 5; the convolution kernel size of the convolution layer adopted in the third convolution feature extraction network is 7 × 7.
The beneficial effects of the above further scheme are: the convolutional neural network feature extraction module is designed with convolution layers of three different scales; a smaller convolution kernel extracts less feature information but processes it quickly, whereas a larger convolution kernel extracts more feature information at a lower processing speed. The invention therefore combines convolution layers of three different scales, weighing processing speed against feature information, and finally fuses the feature data output by the three networks with the output data of the LSTM feature extraction module.
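Under the assumption of a single input channel and 16 feature channels per layer (values not specified in the patent), the three branches could be built so that only the kernel size differs, with padding of half the kernel size keeping their outputs spatially aligned for later fusion:

```python
# Sketch of the three convolution feature extraction networks, differing only in kernel size.
import torch.nn as nn

def conv_branch(kernel_size, channels=16):
    pad = kernel_size // 2
    return nn.Sequential(
        nn.Conv2d(1, channels, kernel_size, padding=pad), nn.ReLU(),
        nn.MaxPool2d(2),
        nn.Conv2d(channels, channels, kernel_size, padding=pad), nn.ReLU(),
        nn.MaxPool2d(2))

branch_3x3 = conv_branch(3)  # fastest, smallest receptive field
branch_5x5 = conv_branch(5)  # intermediate receptive field
branch_7x7 = conv_branch(7)  # slowest, largest receptive field
```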
Further, the training process of the convolutional neural network feature extraction module is as follows:
A1, acquiring the text data on the construction drawings to obtain a training data set;
A2, respectively and independently training the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network with the training data set to obtain the trained first, second and third convolution feature extraction networks.
The beneficial effects of the above further scheme are: the three networks are trained independently, which prevents parameter oscillation from propagating between them during training; such oscillation would lengthen the training time and make the networks hard to converge.
Further, the training method in step A2 is:
B1, initializing the weights and biases of the convolution feature extraction network;
B2, inputting the training data set into the convolution feature extraction network to obtain the output of the convolution feature extraction network;
B3, calculating the loss function according to the output of the convolution feature extraction network;
B4, judging whether the loss function is smaller than a preset value; if so, the training of the convolution feature extraction network is finished, and if not, jumping to step B5;
B5, updating the weights and biases of the convolution feature extraction network according to the loss function, and jumping to step B2.
Further, the formula for updating the weights and biases of the convolution feature extraction network in step B5 is:
wherein W_{t+1} is the weight of the convolution feature extraction network at the (t+1)-th iteration, b_{t+1} is the bias of the convolution feature extraction network at the (t+1)-th iteration, α and β are adjustment factors, J is the loss function, ∂J/∂W is the partial derivative of the loss function with respect to the weights, and ∂J/∂b is the partial derivative of the loss function with respect to the biases.
The beneficial effects of the above further scheme are: the weights and biases change greatly in the early stage of training, and their changes become smaller as the number of training iterations increases; the weights and biases are therefore adjusted adaptively with the number of iterations, which avoids large disturbances to the weights and biases in the later stage of iteration.
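Steps B1–B5 amount to gradient descent with early stopping on the loss. The sketch below assumes a hypothetical `network` object exposing `init_params`, `forward`, `loss` and `grad`, and models the adaptive adjustment as a step size that decays with the iteration count t; the patent's exact update formula appears only as an image and is not reproduced here.

```python
# Sketch of training steps B1-B5 for one convolution feature extraction network.
def train(network, data, labels, alpha=0.1, beta=0.1, tol=1e-3, max_iter=1000):
    W, b = network.init_params()                   # B1: initialise weights and biases
    for t in range(max_iter):
        pred = network.forward(data, W, b)         # B2: run the training set through the network
        loss = network.loss(pred, labels)          # B3: compute the loss function J
        if loss < tol:                             # B4: stop once J falls below the preset value
            break
        dW, db = network.grad(data, labels, W, b)  # B5: update W and b from the loss gradients;
        W = W - alpha / (t + 1) * dW               #     the 1/(t+1) decay models the adaptive
        b = b - beta / (t + 1) * db                #     adjustment and is an assumption
        # control returns to B2 on the next loop iteration
    return W, b
```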
In conclusion, the beneficial effects of the invention are as follows: after the construction drawings are uploaded to the system, they first undergo formal review; the construction drawings that pass the formal review are classified, and during classification features are extracted by both the convolutional neural network feature extraction module and the LSTM feature extraction module, which guarantees the classification accuracy; after classification, the different types of construction drawings are distributed to the corresponding examination departments or examiners, which facilitates their examination.
Drawings
FIG. 1 is a system block diagram of a construction drawing review management system;
FIG. 2 is a system block diagram of a construction drawing classification subsystem;
FIG. 3 is a schematic structural diagram of a convolutional neural network feature extraction module;
FIG. 4 is a schematic structural diagram of the neuron cell in the LSTM unit.
Detailed Description
The following description of the embodiments of the present invention is provided to facilitate understanding by those skilled in the art, but it should be understood that the present invention is not limited to the scope of the embodiments; to those skilled in the art, various changes may be made without departing from the spirit and scope of the invention as defined by the appended claims, and all changes that make use of the inventive concept are intended to be protected.
As shown in FIG. 1, a construction drawing examination management system includes: a construction drawing uploading subsystem, a construction drawing formal review subsystem, a construction drawing classification subsystem, a database subsystem and an examination distribution subsystem;
the construction drawing uploading subsystem is used for uploading construction drawings; the construction drawing formal review subsystem is used for performing formal review on the uploaded construction drawing; the construction drawing classification subsystem is used for classifying the construction drawings which pass formal examination and storing the classified construction drawings into the database subsystem; the examination distribution subsystem is used for distributing the construction drawings stored in the database subsystem.
The construction drawing formal review subsystem comprises: a file format examining unit, a title examining unit, a line type examining unit and a color examining unit;
The file format examining unit is used for examining the extension of the construction drawing file according to the file extension examining format; the title examination unit is used for examining the title according to the title examination format; the line type examination unit is used for extracting the drawing line data of the construction drawing and examining the line type according to the line type examination format; the color examining unit is used for extracting the color data of the construction drawing and examining the color of the construction drawing according to the construction drawing color examining format.
As shown in FIG. 2, the construction drawing classification subsystem includes: a convolutional neural network feature extraction module, an LSTM feature extraction module, a fusion module and a full-connection module;
The input end of the convolutional neural network feature extraction module and the input end of the LSTM feature extraction module are used as the input ends of the construction drawing classification subsystem; the output end of the convolutional neural network feature extraction module and the output end of the LSTM feature extraction module are both connected with the input end of the fusion module; the output end of the fusion module is connected with the input end of the full-connection module; and the output end of the full-connection module is used as the output end of the construction drawing classification subsystem.
As shown in FIG. 3, the convolutional neural network feature extraction module includes: a first convolution feature extraction network, a second convolution feature extraction network and a third convolution feature extraction network.
The convolution kernels of the convolution layers adopted in the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network are different in size.
The convolution kernel size of the convolution layer adopted in the first convolution feature extraction network is 3 × 3; the convolution kernel size of the convolution layer adopted in the second convolution feature extraction network is 5 × 5; the convolution kernel size of the convolution layer adopted in the third convolution feature extraction network is 7 × 7.
The training process of the convolutional neural network feature extraction module is as follows:
A1, acquiring the text data on the construction drawings to obtain a training data set;
A2, respectively and independently training the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network with the training data set to obtain the trained first, second and third convolution feature extraction networks.
The training method in step A2 is as follows:
B1, initializing the weights and biases of the convolution feature extraction network;
B2, inputting the training data set into the convolution feature extraction network to obtain the output of the convolution feature extraction network;
B3, calculating the loss function according to the output of the convolution feature extraction network;
B4, judging whether the loss function is smaller than a preset value; if so, the training of the convolution feature extraction network is finished, and if not, jumping to step B5;
B5, updating the weights and biases of the convolution feature extraction network according to the loss function, and jumping to step B2.
The formula for updating the weights and biases of the convolution feature extraction network in step B5 is:
wherein W_{t+1} is the weight of the convolution feature extraction network at the (t+1)-th iteration, b_{t+1} is the bias of the convolution feature extraction network at the (t+1)-th iteration, α and β are adjustment factors, J is the loss function, ∂J/∂W is the partial derivative of the loss function with respect to the weights, and ∂J/∂b is the partial derivative of the loss function with respect to the biases.
The following is a specific implementation of the convolutional neural network feature extraction module:
As shown in FIG. 3, the first convolution feature extraction network includes: convolutional layer 1-1a, pooling layer 1-1, convolutional layer 1-2a, pooling layer 1-2, convolutional layer 1-3, overall weighted average pooling layer 1-31 and overall significance aggregation weighted pooling layer 1-32;
the second convolution feature extraction network includes: convolutional layer 2-1a, pooling layer 2-1, convolutional layer 2-2a, pooling layer 2-2, convolutional layer 2-3, overall weighted average pooling layer 2-31 and overall significance aggregation weighted pooling layer 2-32;
the third convolution feature extraction network includes: convolutional layer 3-1a, pooling layer 3-1, convolutional layer 3-2a, pooling layer 3-2, convolutional layer 3-3, overall weighted average pooling layer 3-31 and overall significance aggregation weighted pooling layer 3-32;
The input end of the convolutional layer 1-1a is respectively connected with the input end of the convolutional layer 2-1a and the input end of the convolutional layer 3-1a, and the output end of the convolutional layer 1-1a is connected with the input end of the pooling layer 1-1; the output end of the pooling layer 1-1 is respectively connected with the input end of the convolutional layer 1-2a, the output end of the pooling layer 2-1, the input end of the convolutional layer 2-2a, the output end of the pooling layer 3-1 and the input end of the convolutional layer 3-2a; the output end of the convolutional layer 2-1a is connected with the input end of the pooling layer 2-1; the output end of the convolutional layer 3-1a is connected with the input end of the pooling layer 3-1; the output end of the convolutional layer 1-2a is connected with the input end of the pooling layer 1-2; the output end of the convolutional layer 2-2a is connected with the input end of the pooling layer 2-2; the output end of the convolutional layer 3-2a is connected with the input end of the pooling layer 3-2; the output end of the pooling layer 3-2 is respectively connected with the output end of the pooling layer 2-2, the output end of the pooling layer 1-2, the input end of the convolutional layer 1-3, the input end of the convolutional layer 2-3 and the input end of the convolutional layer 3-3; the output end of the convolutional layer 1-3 is respectively connected with the input end of the overall weighted average pooling layer 1-31 and the input end of the overall significance aggregation weighted pooling layer 1-32; the output end of the convolutional layer 2-3 is respectively connected with the input end of the overall weighted average pooling layer 2-31 and the input end of the overall significance aggregation weighted pooling layer 2-32; the output end of the convolutional layer 3-3 is respectively connected with the input end of the overall weighted average pooling layer 3-31 and the input end of the overall significance aggregation weighted pooling layer 3-32;
The output end of the overall weighted average pooling layer 1-31, the output end of the overall significance aggregation weighted pooling layer 1-32, the output end of the overall weighted average pooling layer 2-31, the output end of the overall significance aggregation weighted pooling layer 2-32, the output end of the overall weighted average pooling layer 3-31 and the output end of the overall significance aggregation weighted pooling layer 3-32 are used as the output end of the convolutional neural network feature extraction module.
The feature data output by the pooling layer 1-1 can be shared by the second convolution feature extraction network and the third convolution feature extraction network, the convolution layer 2-2a can carry out convolution processing on the feature data output by the pooling layer 2-1 and the feature data output by the pooling layer 1-1, and the convolution layer 3-2a can carry out convolution processing on the feature data output by the pooling layer 1-1, the feature data output by the pooling layer 2-1 and the feature data output by the pooling layer 3-1.
The feature data output by the pooling layer 3-2 can be shared by the first convolution feature extraction network and the second convolution feature extraction network, the convolution layer 2-3 can carry out convolution processing on the feature data output by the pooling layer 3-2 and the feature data output by the pooling layer 2-2, and the convolution layer 1-3 can carry out convolution processing on the feature data output by the pooling layer 1-2, the feature data output by the pooling layer 2-2 and the feature data output by the pooling layer 3-2.
Through the cross sharing of the feature data of the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network, more feature data are reserved.
An overall weighted average pooling layer and an overall significance aggregation weighted pooling layer are added to each of the first, second and third convolution feature extraction networks; the overall feature data are retained by the overall weighted average pooling layer, and the salient features are retained by the overall significance aggregation weighted pooling layer.
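The patent does not give formulas for these two pooling layers, so the sketch below is only one plausible reading: the overall weighted average pooling layer is modelled as a learnable, spatially weighted mean (retaining overall features), and the overall significance aggregation weighted pooling layer as a softmax-weighted sum that emphasises the strongest activations (retaining salient features).

```python
# Assumed implementations of the two pooling heads attached to each branch.
import torch
import torch.nn as nn

class OverallWeightedAvgPool(nn.Module):
    def __init__(self, height, width):
        super().__init__()
        self.weights = nn.Parameter(torch.ones(1, 1, height, width))  # learnable spatial weights

    def forward(self, x):                                  # x: (B, C, H, W)
        w = torch.softmax(self.weights.flatten(2), dim=-1).view_as(self.weights)
        return (x * w).sum(dim=(2, 3))                     # -> (B, C), overall features

class OverallSaliencyAggPool(nn.Module):
    def forward(self, x):                                  # x: (B, C, H, W)
        flat = x.flatten(2)                                # (B, C, H*W)
        attn = torch.softmax(flat, dim=-1)                 # stronger activations get larger weights
        return (flat * attn).sum(dim=-1)                   # -> (B, C), salient features
```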
The sizes of the convolution kernels of the convolution layers 1-1a, 1-2a and 1-3 are all 3 × 3; the sizes of the convolution kernels of the convolution layers 2-1a, 2-2a and 2-3 are all 5 × 5; the sizes of the convolution kernels of the convolution layers 3-1a, 3-2a and 3-3 are all 7 × 7.
The formulas of the neuron cell in the LSTM unit of the LSTM feature extraction module are given below; FIG. 4 is a schematic structural diagram of the neuron cell.

f_t = σ[W_f·(h_{t-1}, x_t, C_{t-1}) + b_f]

i_t = σ[W_i·(h_{t-1}, x_t, C_{t-1}) + b_i]

wherein f_t is the activation vector value of the forget gate at time t, b_f is the bias term of the forget gate, W_f is the weight term of the forget gate, x_t is the text vector on the construction drawing at time t, h_{t-1} is the output of the neuron cell at time t-1, C_{t-1} is the state of the neuron cell at time t-1, σ is the activation function, i_t is the activation vector value of the input gate at time t, W_i is the weight term of the input gate, b_i is the bias term of the input gate, c̃_t is the activation vector value of the update state gate at time t, W_c is the weight term of the update state gate, b_c is the bias term of the update state gate, tanh is the hyperbolic tangent activation function, C_t is the state of the neuron cell at time t, O_t is the activation vector value output by the neuron cell at time t, W_o is the weight term of the output gate, and b_o is the bias term of the output gate.
The forgetting gate inputs three variablesThe input gate inputs three variablesThe update status gate takes into account five variables、、、Andthe output gate takes six variables into account、、、、Andsufficiently for the input character vectorNeuronal cell statusAnd output of neuronal cellsThe utilization is carried out, and the characteristic data brought by the utilization is considered, so that the LSTM unit can extract effective characteristic data as much as possible.
Claims (2)
1. A construction drawing examination management system, characterized by comprising: a construction drawing uploading subsystem, a construction drawing formal review subsystem, a construction drawing classification subsystem, a database subsystem and an examination distribution subsystem;
the construction drawing uploading subsystem is used for uploading construction drawings; the construction drawing formal review subsystem is used for performing formal review on the uploaded construction drawing; the construction drawing classification subsystem is used for classifying the construction drawings which pass formal examination and storing the classified construction drawings into the database subsystem; the examination distribution subsystem is used for distributing the construction drawings stored in the database subsystem;
the construction drawing classification subsystem comprises: a convolutional neural network feature extraction module, an LSTM feature extraction module, a fusion module and a full-connection module;
the input end of the convolutional neural network feature extraction module and the input end of the LSTM feature extraction module are used as the input ends of the construction drawing classification subsystem; the output end of the convolutional neural network feature extraction module and the output end of the LSTM feature extraction module are both connected with the input end of the fusion module; the output end of the fusion module is connected with the input end of the full-connection module; the output end of the full-connection module is used as the output end of the construction drawing classification subsystem;
the convolutional neural network feature extraction module comprises: a first convolution feature extraction network, a second convolution feature extraction network and a third convolution feature extraction network;
the convolution kernels of the convolution layers adopted in the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network are different in size;
the convolution kernel size of the convolution layer adopted in the first convolution feature extraction network is 3 × 3; the convolution kernel size of the convolution layer adopted in the second convolution feature extraction network is 5 × 5; the convolution kernel size of the convolution layer adopted in the third convolution feature extraction network is 7 × 7;
the first convolution feature extraction network includes: convolutional layer 1-1a, pooling layer 1-1, convolutional layer 1-2a, pooling layer 1-2, convolutional layer 1-3, overall weighted average pooling layer 1-31 and overall significance aggregation weighted pooling layer 1-32;
the second convolution feature extraction network includes: convolutional layer 2-1a, pooling layer 2-1, convolutional layer 2-2a, pooling layer 2-2, convolutional layer 2-3, overall weighted average pooling layer 2-31 and overall significance aggregation weighted pooling layer 2-32;
the third convolution feature extraction network includes: convolutional layer 3-1a, pooling layer 3-1, convolutional layer 3-2a, pooling layer 3-2, convolutional layer 3-3, overall weighted average pooling layer 3-31 and overall significance aggregation weighted pooling layer 3-32;
the input end of the convolutional layer 1-1a is respectively connected with the input end of the convolutional layer 2-1a and the input end of the convolutional layer 3-1a, and the output end of the convolutional layer is connected with the input end of the pooling layer 1-1; the output end of the pooling layer 1-1 is respectively connected with the input end of the convolutional layer 1-2a, the output end of the pooling layer 2-1, the input end of the convolutional layer 2-2a, the output end of the pooling layer 3-1 and the input end of the convolutional layer 3-2 a; the output end of the convolutional layer 2-1a is connected with the input end of the pooling layer 2-1; the output end of the convolutional layer 3-1a is connected with the input end of the pooling layer 3-1; the output end of the convolutional layer 1-2a is connected with the input end of the pooling layer 1-2; the output end of the convolution layer 2-2a is connected with the input end of the pooling layer 2-2; the output end of the convolution layer 3-2a is connected with the input end of the pooling layer 3-2; the output end of the pooling layer 3-2 is respectively connected with the output end of the pooling layer 2-2, the output end of the pooling layer 1-2, the input end of the convolution layer 1-3, the input end of the convolution layer 2-3 and the input end of the convolution layer 3-3; the output end of the convolutional layer 1-3 is respectively connected with the input end of the overall weighted average pooling layer 1-31 and the input end of the overall significance aggregation weighted pooling layer 1-32; the output end of the convolutional layer 2-3 is respectively connected with the input end of the overall weighted average pooling layer 2-31 and the input end of the overall significance aggregation weighted pooling layer 2-32; the output end of the convolutional layer 3-3 is respectively connected with the input end of the overall weighted average pooling layer 3-31 and the input end of the overall significance aggregation weighted pooling layer 3-32;
the output end of the overall weighted average pooling layer 1-31, the output end of the overall significance aggregation weighted pooling layer 1-32, the output end of the overall weighted average pooling layer 2-31, the output end of the overall significance aggregation weighted pooling layer 2-32, the output end of the overall weighted average pooling layer 3-31 and the output end of the overall significance aggregation weighted pooling layer 3-32 are used as the output end of the convolutional neural network feature extraction module;
the training process of the convolutional neural network feature extraction module is as follows:
A1, acquiring the text data on a construction drawing to obtain a training data set;
A2, respectively and independently training the first convolution feature extraction network, the second convolution feature extraction network and the third convolution feature extraction network with the training data set to obtain the trained first, second and third convolution feature extraction networks;
the training method in step A2 is as follows:
B1, initializing the weights and biases of the convolution feature extraction network;
B2, inputting the training data set into the convolution feature extraction network to obtain the output of the convolution feature extraction network;
B3, calculating the loss function according to the output of the convolution feature extraction network;
B4, judging whether the loss function is smaller than a preset value; if so, the training of the convolution feature extraction network is finished, and if not, jumping to step B5;
B5, updating the weights and biases of the convolution feature extraction network according to the loss function, and jumping to step B2;
the formula for updating the weights and biases of the convolution feature extraction network in step B5 is:
wherein W_{t+1} is the weight of the convolution feature extraction network at the (t+1)-th iteration, b_{t+1} is the bias of the convolution feature extraction network at the (t+1)-th iteration, α and β are adjustment factors, J is the loss function, ∂J/∂W is the partial derivative of the loss function with respect to the weights, and ∂J/∂b is the partial derivative of the loss function with respect to the biases;
the formulas of the neuron cell in the LSTM unit of the LSTM feature extraction module are:

f_t = σ[W_f·(h_{t-1}, x_t, C_{t-1}) + b_f]

i_t = σ[W_i·(h_{t-1}, x_t, C_{t-1}) + b_i]

wherein f_t is the activation vector value of the forget gate at time t, b_f is the bias term of the forget gate, W_f is the weight term of the forget gate, x_t is the text vector on the construction drawing at time t, h_{t-1} is the output of the neuron cell at time t-1, C_{t-1} is the state of the neuron cell at time t-1, σ is the activation function, i_t is the activation vector value of the input gate at time t, W_i is the weight term of the input gate, b_i is the bias term of the input gate, c̃_t is the activation vector value of the update state gate at time t, W_c is the weight term of the update state gate, b_c is the bias term of the update state gate, tanh is the hyperbolic tangent activation function, C_t is the state of the neuron cell at time t, O_t is the activation vector value output by the neuron cell at time t, W_o is the weight term of the output gate, and b_o is the bias term of the output gate.
2. The construction drawing review management system according to claim 1, wherein the construction drawing form review subsystem includes: a file format examining unit, a title examining unit, a line type examining unit and a color examining unit;
the file format examining unit is used for examining the extension of the construction drawing file according to the file extension examining format; the title examination unit is used for examining the title according to the title examination format; the line type examination unit is used for extracting the drawing line data of the construction drawing and examining the line type according to the line type examination format; the color examining unit is used for extracting the color data of the construction drawing and examining the color of the construction drawing according to the color examining format of the construction drawing.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210352817.XA | 2022-04-06 | 2022-04-06 | Construction drawing examination management system |

Applications Claiming Priority (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210352817.XA | 2022-04-06 | 2022-04-06 | Construction drawing examination management system |
Publications (2)

| Publication Number | Publication Date |
|---|---|
| CN114429638A | 2022-05-03 |
| CN114429638B | 2022-07-08 |
Family
ID=81314296

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210352817.XA | Construction drawing examination management system | 2022-04-06 | 2022-04-06 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN114429638B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109117890A (en) * | 2018-08-24 | 2019-01-01 | 腾讯科技(深圳)有限公司 | A kind of image classification method, device and storage medium |
CN109409381A (en) * | 2018-09-18 | 2019-03-01 | 北京居然之家云地汇新零售连锁有限公司 | The classification method and system of furniture top view based on artificial intelligence |
CN109558938A (en) * | 2018-09-27 | 2019-04-02 | 天津大学 | Convolutional neural networks based on the transmitting of multiple semantic feature |
CN109614869A (en) * | 2018-11-10 | 2019-04-12 | 天津大学 | A kind of pathological image classification method based on multi-scale compress rewards and punishments network |
CN111210104A (en) * | 2019-12-11 | 2020-05-29 | 中兵勘察设计研究院有限公司 | Construction drawing digital combined collaborative inspection system and method based on cloud platform |
CN114118842A (en) * | 2021-12-01 | 2022-03-01 | 悉地(苏州)勘察设计顾问有限公司 | Sponge city construction engineering design examination system |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6805984B2 (en) * | 2017-07-06 | 2020-12-23 | 株式会社デンソー | Convolutional neural network |
CN107644315B (en) * | 2017-09-08 | 2022-07-15 | 泰州市抗震办公室(泰州市建设工程施工图设计审查中心) | Construction drawing joint examination system |
US11034357B2 (en) * | 2018-09-14 | 2021-06-15 | Honda Motor Co., Ltd. | Scene classification prediction |
CN109325547A (en) * | 2018-10-23 | 2019-02-12 | 苏州科达科技股份有限公司 | Non-motor vehicle image multi-tag classification method, system, equipment and storage medium |
CN109461495B (en) * | 2018-11-01 | 2023-04-14 | 腾讯科技(深圳)有限公司 | Medical image recognition method, model training method and server |
CN110347851A (en) * | 2019-05-30 | 2019-10-18 | 中国地质大学(武汉) | Image search method and system based on convolutional neural networks |
CN110390691B (en) * | 2019-06-12 | 2021-10-08 | 合肥合工安驰智能科技有限公司 | Ore dimension measuring method based on deep learning and application system |
US11947061B2 (en) * | 2019-10-18 | 2024-04-02 | Korea University Research And Business Foundation | Earthquake event classification method using attention-based convolutional neural network, recording medium and device for performing the method |
CN111126256B (en) * | 2019-12-23 | 2022-02-15 | 武汉大学 | Hyperspectral image classification method based on self-adaptive space-spectrum multi-scale network |
CN112419208A (en) * | 2020-11-23 | 2021-02-26 | 泰兴市建设工程施工图审查服务中心 | Construction drawing review-based vector drawing compiling method and system |
CN114120033B (en) * | 2021-11-12 | 2024-10-25 | 武汉大学 | Hyperspectral image classification method, hyperspectral image classification device, hyperspectral image classification equipment and storage medium |
Non-Patent Citations (5)

| Title |
|---|
| Hongdou Yao et al. "Parallel Structure Deep Neural Network Using CNN and RNN with an Attention Mechanism for Breast Cancer Histology Image Classification." Cancers, 2019. |
| 刘慧婷. "Research on Action Recognition Based on the Fusion of CNN and LSTM Networks." 中国优秀博硕士学位论文全文数据库(硕士)信息科技辑 (China Masters' Theses Full-text Database, Information Science and Technology), 2022, No. 03. |
| 刘立 et al. "Technical Research on Recognition and Classification of Engineering Drawing Documents." 电子设计工程 (Electronic Design Engineering), 2020, Vol. 28, No. 12. |
| 江姣. "Research on Digitalization of the Construction Drawing Review Process for Common Foundation Types." 中国优秀博硕士学位论文全文数据库(硕士)工程科技Ⅱ辑 (China Masters' Theses Full-text Database, Engineering Science and Technology II), 2015, No. 04. |
| 高亚琪 et al. "Exploration of Image Semantic Features and Their Influence on Classification." 情报科学 (Information Science), 2021, Vol. 39, No. 10. |
Also Published As
Publication number | Publication date |
---|---|
CN114429638A (en) | 2022-05-03 |
Legal Events

| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20220708 |