WO2024058556A1 - Method and device for automatically calculating window set information using an artificial neural network - Google Patents


Info

Publication number
WO2024058556A1
Authority
WO
WIPO (PCT)
Prior art keywords
feature map
information
neural network
window set
artificial neural
Prior art date
Application number
PCT/KR2023/013760
Other languages
English (en)
Korean (ko)
Inventor
김학성
김상일
이세민
김규원
한영구
Original Assignee
한양대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 한양대학교 산학협력단
Publication of WO2024058556A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes

Definitions

  • the present invention relates to a method and device for automatically calculating window set information using an artificial neural network. More specifically, the present invention relates to technology that classifies the window set area in an input image using a pre-trained artificial neural network and, at the same time, automatically calculates window set information for the classified window set area.
  • the indoor space of a building is, in terms of architectural environmental facilities, a space that maintains occupants' comfort against changes in the indoor and outdoor environments. Occupants' comfort can change in real time due to a combination of various thermal factors, the most representative of which is temperature. Therefore, it is necessary to evaluate the temperature distribution to create a comfortable indoor environment for occupants.
  • to do so, the light energy characteristics of a building should be analyzed using the design drawings of the existing building, and the building should be designed based on this analysis. In such a design, the various factors that affect the building's light energy characteristics must be considered. Among these factors, the element that most closely affects the building is the window set: the window set determines the building's solar radiation gain and daylight area, and has the greatest impact on the light energy flowing into the building.
  • the method and device for automatically calculating window set information using an artificial neural network are inventions designed to solve the problems described above. Their purpose is to provide an automatic calculation method for window set information that can automatically classify window set areas in input images and automatically calculate window set information.
  • a further purpose is to provide a method for managing buildings more efficiently by calculating information on the solar radiation gain and daylight area that affect the light characteristics of the building, based on the calculated window set information.
  • an automatic calculation device for window set information using an artificial neural network includes a window set information calculating artificial neural network module that has been pre-trained with first input information including image information as input information, and with window set information obtained by extracting window set characteristics from the image information as output information; the window set information calculating artificial neural network module includes a feature extraction unit that extracts a 1-1 feature map and a 1-2 feature map having different scale values for the image information.
  • the window set information calculation artificial neural network module generates a 3-1 feature map and a 3-2 feature map by applying convolution to the 1-1 feature map and the 1-2 feature map, respectively; it then determines one of the 3-1 feature map and the 3-2 feature map as the final feature map according to a preset criterion, and can calculate the window set information based on the selected final feature map.
  • the window set information calculation artificial neural network module may calculate reliability information for the 3-1 feature map and the 3-2 feature map and then, according to the preset criterion, determine the feature map with the higher reliability value as the final feature map.
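The reliability-based selection described above reduces to an argmax over per-map confidence scores. A minimal sketch follows; the map names and score values are hypothetical, not from the patent:

```python
def select_final_feature_map(candidates):
    """Pick the candidate feature map with the highest reliability score.

    candidates: dict mapping a feature-map name to its reliability value.
    This models the 'preset criterion' of keeping the most reliable map.
    """
    return max(candidates, key=candidates.get)

# Illustrative reliability values for the 3-1 and 3-2 feature maps
scores = {"map_3_1": 0.62, "map_3_2": 0.87}
print(select_final_feature_map(scores))  # map_3_2 (0.87 > 0.62)
```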
  • the window set information calculation artificial neural network module may include a pre-trained first artificial neural network that uses the first input information as input information and outputs a 1-1 feature map corresponding to the image information as output information.
  • the feature extraction unit may extract a 1-2 feature map having a scale value different from the scale value of the 1-1 feature map from the first artificial neural network.
  • the window set information calculation artificial neural network module may include a pre-trained second artificial neural network that uses the 1-1 feature map as input information and the 1-2 feature map as intermediate input information, and outputs, as output information, a 2-1 feature map corresponding to the 1-1 feature map and a 2-2 feature map corresponding to the 1-2 feature map.
  • the window set information calculation artificial neural network module may include a pre-trained third artificial neural network that uses the 2-1 feature map and the 2-2 feature map as input information and outputs, as output information, a 3-1 feature map corresponding to the 2-1 feature map and a 3-2 feature map corresponding to the 2-2 feature map.
  • the window set information calculation artificial neural network module may further include an FC-layer that determines one of the 3-1 feature map and the 3-2 feature map as the final feature map according to a preset criterion.
  • the automatic window set information calculation device using the artificial neural network may further include a light characteristic information calculation unit that calculates, based on the final feature map, window set information including area information and length information of the windows and doors corresponding to the image information and area information and length information of the frames.
  • the light characteristic information calculation unit may calculate solar radiation gain and daylight area based on the window set information.
  • in the window set information calculation artificial neural network module, the feature extraction unit may extract, for the image information, a 1-3 feature map having a scale value different from those of the 1-1 feature map and the 1-2 feature map.
  • the window set information calculation artificial neural network module may calculate reliability information for the 3-1 feature map, the 3-2 feature map, and the 3-3 feature map, and then determine the feature map with the highest reliability value as the final feature map according to the preset criterion.
  • a method for automatically calculating window set information using an artificial neural network may include a first feature map generation step of using first input information including image information as input information and outputting a 1-1 feature map corresponding to the image information as output information, a third feature map generation step of generating a 3-1 feature map and a 3-2 feature map, a final feature map determination step of determining one of the 3-1 feature map and the 3-2 feature map as the final feature map according to a preset criterion, and a window set information calculation step of calculating window set information based on the selected final feature map.
  • the method of automatically calculating window set information using an artificial neural network may further include a window set information calculation step of calculating, based on the final feature map, window set information including area information and length information of the windows and doors corresponding to the image information and area information and length information of the frames.
  • an automatic calculation device for window set information using an artificial neural network includes a window set information calculating artificial neural network module that uses first input information including image information as input information and window set information extracted from the image information as output information; the window set information calculating artificial neural network module includes a feature extraction unit for extracting a 1-1 feature map and a 1-2 feature map having different scale values for the image information.
  • the window set information calculation artificial neural network module generates a 3-1 feature map and a 3-2 feature map by applying convolution to the 1-1 feature map and the 1-2 feature map, respectively.
  • the window set information calculating artificial neural network module can perform learning using labeling information in which the window set information included in the image information is labeled.
  • the window set information calculating artificial neural network module may perform learning by using second input information including the labeling information together with the first input information as input information of the window set information calculating artificial neural network module.
  • the method and device for automatically calculating window set information using an artificial neural network label the shape of the window in a building's window set image and train the artificial neural network module's algorithm on this, so that the window set shape information of the building can be extracted within a short time using only a photographic image; there is also the advantage of being able to detect the shape information of the window set and extract the window set information even when an unlearned window set image is input.
  • the method and device for automatically calculating window set information using an artificial neural network have the effect of continuously improving accuracy by automatically performing additional learning by auto-labeling new images.
  • the method and device for automatically calculating window set information using an artificial neural network make it easy to calculate information on the optical characteristics of the window set, such as solar radiation gain and daylight area, by using the extracted window set information.
  • Figure 1 is a diagram illustrating some components of a method and device for automatically calculating window set information using an artificial neural network according to an embodiment of the present invention.
  • Figure 2 is a diagram showing the components of a window set to be measured in the present invention.
  • Figure 3 is a diagram illustrating input information and output information when inferred by an artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • Figure 4 is a diagram illustrating input information and output information when the window set information calculating artificial neural network module learns according to an embodiment of the present invention.
  • Figure 5 is a diagram specifically illustrating the configuration of an artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • Figure 6 is a diagram briefly illustrating the process of an artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • Figure 7 is a diagram illustrating in detail the process of the artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • Figure 8 is a diagram for explaining the process of a first artificial neural network according to an embodiment of the present invention.
  • Figure 9 is a diagram for explaining the process of a second artificial neural network according to an embodiment of the present invention.
  • Figure 10 is a diagram showing the final output information of the artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • Figure 11 is a diagram for explaining a learning method of an artificial neural network module for calculating window set information, according to an embodiment of the present invention.
  • Figure 12 is a diagram illustrating a method of labeling learning data according to an embodiment of the present invention.
  • Figure 13 is a diagram illustrating a method for calculating solar heat gain by a window set information calculation device according to an embodiment of the present invention.
  • Figure 14 is a graph showing the solar radiation gain by month/year of a window set calculated according to the method shown in Figure 13.
  • Figure 15 is a diagram illustrating a method for calculating a daylight area by a window set information calculation device according to an embodiment of the present invention.
  • Figure 16 is a diagram comparing the value of the daylight area derived by the window set information calculation device and the value derived by the finite element analysis result according to an embodiment of the present invention.
  • hereinafter, the 'device for automatically calculating window set information using an artificial neural network' will be referred to as the 'window set information calculation device', and the 'window set information automatic calculation method using an artificial neural network' will be referred to as the 'window set information calculation method'.
  • Figure 1 is a diagram showing some components of an automatic window set information calculation device using an artificial neural network according to an embodiment of the present invention.
  • the window set information calculation device 1 using an artificial neural network may include a labeling information generator 100, a processor 200, an optical characteristic information generator 400, and a memory module 500.
  • the components of the window set information calculation device 1 according to the present invention are not limited to those configured in FIG. 1, and may be implemented with fewer components than those shown, or by additionally combining other components.
  • the labeling information generation unit 100 refers to a generation unit that generates data necessary for the window set information calculation artificial neural network module 300 (see FIG. 3) according to the present invention to perform learning.
  • the labeling information generator 100 is a component that generates data for the window set information generating artificial neural network module 300 to learn or infer.
  • the data generated by the labeling information generator 100 contains specific numerical information and area information regarding the window set corresponding to the image input to the window set information calculating artificial neural network module 300, and can serve as learning data for the window set information calculating artificial neural network module 300 to perform learning.
  • the labeling data generated by the labeling information generation unit 100 may be input to the window set information calculation device 1 as input information together with the image information 10, or may be used as reference information corresponding to the inferred information output later. A detailed explanation of this will be provided later.
  • the processor 200 may serve to control the overall operation of the window set information calculation device 1 according to the present invention.
  • the processor 200 includes a window set information calculation artificial neural network module 300, and can calculate area information and numerical information about the window set from the image input to the processor 200.
  • the specific process of the window set information calculation artificial neural network module 300 will be described in detail with reference to FIGS. 3 to 9.
  • the light characteristic information generator 400 can calculate the solar heat gain and daylight area, which correspond to the light characteristic values of the window set for the light flowing into the building, based on the area information and numerical information about the window set calculated by the processor 200. A detailed explanation of this will be provided through FIGS. 13 to 16.
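As a rough illustration of the solar-heat-gain side of this calculation, the standard glazing relation (gain = glass area × solar heat gain coefficient × incident irradiance) can be sketched as below. This is a generic formula with assumed inputs, not the patent's exact method:

```python
def solar_heat_gain(glass_area_m2, shgc, irradiance_w_m2):
    """Instantaneous solar heat gain through glazing, in watts.

    glass_area_m2   -- glass area taken from the calculated window set info
    shgc            -- solar heat gain coefficient of the glazing (0..1)
    irradiance_w_m2 -- incident solar radiation on the facade (W/m^2)
    """
    return glass_area_m2 * shgc * irradiance_w_m2

# 4.2 m^2 of fixed-window glass, SHGC 0.6, 500 W/m^2 incident radiation
print(round(solar_heat_gain(4.2, 0.6, 500.0), 1))  # 1260.0
```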
  • the memory module 500 may store the input image information 10, the labeling data generated by the labeling information generator 100, the window set information generated by the processor 200, the light characteristic information generated by the optical characteristic information generator 400, and so on, as well as various information about the building being analyzed.
  • the memory module 500 and the processor 200 are shown as separate components, but the embodiment of the present invention is not limited thereto, and the memory module 500 and the processor 200 may be implemented combined as one component.
  • Figure 2 is a diagram showing the components of a window set to be measured in the present invention.
  • Window sets installed in buildings include various components such as windows, frames, and panels.
  • information on where these components are located and the area and length they occupy must be known.
  • the window set includes a mullion corresponding to a vertical frame, a window corresponding to a fixed window, a transom corresponding to a horizontal frame, a T/T window corresponding to an opening/closing window, and panels corresponding to spandrels, and their areas and values can be defined in various ways.
  • the window set information includes, as shown in (B) of Figure 2, the total width, total height, and total area of the window set, the mullion/transom frame area, the fixed window glass area, the T/T window glass area, the T/T window frame area, the panel area, the panel-mullion/transom length, the glass-mullion/transom length, the frame-mullion/transom length, the glass-frame length, and so on.
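The quantities listed above can be gathered into a simple record type. The field names below are illustrative stand-ins for the Figure 2 (B) items, not identifiers from the patent:

```python
from dataclasses import dataclass

@dataclass
class WindowSetInfo:
    """Per-window-set quantities mirroring the Figure 2 (B) list (names assumed)."""
    total_width_m: float
    total_height_m: float
    mullion_transom_frame_area_m2: float
    fixed_window_glass_area_m2: float
    tt_window_glass_area_m2: float
    tt_window_frame_area_m2: float
    panel_area_m2: float

    @property
    def total_area_m2(self) -> float:
        # Overall window set area from its overall width and height
        return self.total_width_m * self.total_height_m

info = WindowSetInfo(3.0, 2.0, 0.8, 2.4, 1.2, 0.4, 1.2)
print(info.total_area_m2)  # 6.0
```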
  • when an image is input as input information as shown in (A) of Figure 2, the window set information calculation device 1 according to the present invention aims to automatically calculate division information dividing the components in the image and numerical information for each component, as shown in (B) of Figure 2. Let us learn more about the specific process of the present invention through the drawings below.
  • Figure 3 is a diagram showing the input information and output information when inference is performed by the window set information calculating artificial neural network module according to an embodiment of the present invention.
  • Figure 4 is a diagram showing the input information and output information when the window set information calculating artificial neural network module learns, according to an embodiment of the present invention.
  • the window set information calculating artificial neural network module 300 refers to a pre-trained artificial neural network that uses first input information including the image information 10 as input information, and whose output information includes window set information 40 obtained by extracting area information and numerical information about the window set from the input image information 10.
  • the window set information calculating artificial neural network module 300 may include a learning session that performs learning based on the first input information 10 including image information, the output information 40 including window set information, and reference information (ground truth, 50) corresponding to the output information 40, and an inference session that infers the first output information 40 based on the first input information 10. Although the drawing shows the learning session and the inference session separately, the embodiment of the present invention is not limited to this, and the learning session and the inference session may be implemented together.
  • when the window set information calculating artificial neural network module 300 performs learning, learning may be performed based on the first output information 40 and the reference information 50 as shown in the figure, and learning may also be performed by inputting numerical information corresponding to the image information into the window set information calculation artificial neural network module 300.
  • since the input information input to the window set information calculation artificial neural network module 300 includes the first input information 10 including image information and the second input information 20 labeled with numerical information about the window set included in the image information, the window set information calculation artificial neural network module 300 can perform learning based on these two pieces of input information.
  • here, performing learning means adjusting the parameters of the artificial neural network, and the second input information 20 input to the window set information calculation artificial neural network module 300 may refer to labeling information corresponding to the image information, generated by the labeling information generator 100. This will be explained later with reference to Figure 12.
  • Figures 5 to 9 are diagrams for explaining the learning method of the window set information calculating artificial neural network module 300, and Figure 5 is a diagram specifically showing the configuration of the window set information calculating artificial neural network module according to an embodiment of the present invention.
  • the window set information calculation artificial neural network module 300 may include a first artificial neural network 310, a second artificial neural network 320, a third artificial neural network 330, and an FC-layer 340.
  • the window set information calculating artificial neural network module 300 is shown divided into three artificial neural networks and one FC-layer, but this is for convenience of explanation and the embodiment of the present invention is not limited thereto. Therefore, all three artificial neural networks can be combined to form one artificial neural network, or all three artificial neural networks and one FC-layer can be combined to form one artificial neural network.
  • the first artificial neural network 310 refers to a pre-trained artificial neural network that uses as input information the first input information 10 including image information and the second input information 20 including at least one of numerical information and area information about the window set included in the image information, applies convolution to the image information, and outputs the 1-1 feature map 11 as output information.
  • since the convolution operation is performed multiple times, a plurality of feature maps with different scales may be generated inside the first artificial neural network 310.
  • the feature extraction unit (not shown) extracts the 1-2 feature map 12 and the 1-3 feature map 13 from the first artificial neural network 310 and outputs them as intermediate output information of the first artificial neural network 310, and the intermediate output information output in this way can be input in parallel as intermediate input information of the second artificial neural network 320.
  • the intermediate output information of the first artificial neural network 310 is shown as two feature maps (the 1-2 feature map and the 1-3 feature map), but the embodiment of the present invention is not limited to this.
  • the feature maps corresponding to the intermediate output information output by the first artificial neural network 310 refer to feature maps having a scale value different from that of the 1-1 feature map 11, Depending on the purpose of the invention, the number of feature maps corresponding to intermediate output information can be freely modified.
  • two, four, or five feature maps may be implemented with a scale value different from that of the 1-1 feature map 11, and in this case, each feature map may have a different scale value.
  • for convenience, the description will be limited to the two feature maps that are the intermediate output information of the first artificial neural network 310, namely the 1-2 feature map 12 and the 1-3 feature map 13.
  • the second artificial neural network 320 performs a convolution operation based on the 1-1 feature map 11 output by the first artificial neural network 310 and the 1-2 feature map 12 and 1-3 feature map 13, which are intermediate output information of the first artificial neural network 310, and can output as output information the 2-1 feature map 21, which strengthens the features of the 1-1 feature map 11, the 2-2 feature map 22, which strengthens the features of the 1-2 feature map 12, and the 2-3 feature map 23, which strengthens the features of the 1-3 feature map 13.
  • the number of pieces of output information output by the second artificial neural network 320 follows the amount of intermediate output information of the first artificial neural network 310: when there is one piece of intermediate output information, a total of two pieces (the 2-1 feature map and the 2-2 feature map) are output, and when there are three pieces of intermediate output information, a total of four pieces of output information can be output by the second artificial neural network.
  • the third artificial neural network 330 takes as input the 2-1 feature map 21, the 2-2 feature map 22, and the 2-3 feature map 23 output from the second artificial neural network 320, independently performs a convolution operation on each input feature map, and can output as output information the 3-1 feature map 31, the 3-2 feature map 32, and the 3-3 feature map 33, each including the anchor box image.
  • the FC-layer (340, Fully-connected Layer) can individually analyze the 3-1 feature map 31, the 3-2 feature map 32, and the 3-3 feature map 33 output by the third artificial neural network 330, and output the map that satisfies a preset criterion (the map with the most reliable information) as the final output information.
  • Figure 6 is a diagram briefly illustrating the process of an artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • the purpose of the artificial neural network module 300 for calculating window set information according to the present invention is to extract area information about the window set from the input image information 10 and to calculate numerical information from the extracted area information; to achieve this, window set information is calculated using multiple artificial neural networks.
  • as shown in Figure 6, the window set information calculation artificial neural network module 300 extracts a first feature map containing the image features of the input image information 10 using the first artificial neural network 310, extracts a second feature map through a process of enhancing the image features of the extracted first feature map, and extracts a third feature map through a process of extracting final object detection information from the extracted second feature map. Let us learn more about the process of each artificial neural network through the drawings below.
  • FIG. 7 is a diagram illustrating in detail the process of the artificial neural network module for calculating window set information according to an embodiment of the present invention.
  • FIG. 8 is a diagram for explaining the process of a first artificial neural network according to an embodiment of the present invention
  • FIG. 9 is a diagram for explaining the process of a second artificial neural network according to an embodiment of the present invention.
  • the first artificial neural network 310 is an artificial neural network that extracts features from the input image information 10, and can extract the 1-1 feature map 11 by applying a convolution-based network.
  • the first artificial neural network 310 can serve to convert the input image information from three-dimensional data to one-dimensional data using a convolution-based BottleNeck-CSP (Cross Stage Partial) network.
  • the first process of the first artificial neural network 310 is the focus step, in which several images are unified into a certain format (for example, 3 x 1024 x 1024) so that model calculation can be easily performed.
  • the advantage of doing this is that learning is accelerated.
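The format-unification idea of the focus step can be sketched with a toy crop-or-pad routine on nested lists. The target size is shrunk from 3 x 1024 x 1024 to keep the example small, and the function name is hypothetical:

```python
def unify_format(image, target_h=8, target_w=8, pad_value=0.0):
    """Crop or zero-pad each channel of `image` (channels x H x W nested
    lists) to a fixed target size, so every input reaches the model in one
    common format -- a toy stand-in for the 3 x 1024 x 1024 unification."""
    def fix_row(row):
        row = row[:target_w]                       # crop if too wide
        return row + [pad_value] * (target_w - len(row))  # pad if too narrow
    def fix_channel(ch):
        ch = [fix_row(r) for r in ch[:target_h]]   # crop/pad each row
        while len(ch) < target_h:                  # pad missing rows
            ch.append([pad_value] * target_w)
        return ch
    return [fix_channel(ch) for ch in image]

img = [[[1.0] * 5 for _ in range(5)]]              # 1 channel, 5 x 5 input
out = unify_format(img)
print(len(out), len(out[0]), len(out[0][0]))  # 1 8 8
```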
  • BottleneckCSP is a structure that compresses the channel of the input value, extracts features, and increases the channel again.
  • after the convolution layer in the first artificial neural network 310, a batch normalization process is applied and the flow proceeds to BottleneckCSP. In this step, four convolutional layers are created: two of them are connected to each other through a short-connection to derive the calculated value, and the calculated values from the other convolutional layers are combined and passed through the remaining convolutional layer; the first artificial neural network 310 was implemented in this way. Afterwards, the calculation was performed by applying batch normalization only to the first and fourth layers.
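The BottleneckCSP channel flow described above (compress the channels, transform with a short-connection, then restore the channel count by concatenation) can be sketched structurally. Real convolutions are replaced by a trivial elementwise transform, so only the split/shortcut/concat wiring is shown; this is a structural sketch, not the patent's implementation:

```python
def bottleneck_csp(channels, n=1):
    """Structural sketch of BottleneckCSP on a feature map given as a list
    of C channels (each channel a flat list of floats)."""
    c = len(channels)
    half = c // 2
    main, skip = channels[:half], channels[half:]   # compress: split channels
    for _ in range(n):                              # bottleneck block(s)
        transformed = [[v * 0.5 for v in ch] for ch in main]
        # short-connection: residual add of input and transformed values
        main = [[a + b for a, b in zip(ch_t, ch_m)]
                for ch_t, ch_m in zip(transformed, main)]
    merged = main + skip                            # concat: channel count restored
    assert len(merged) == c
    return merged

fmap = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]  # 4 channels
out = bottleneck_csp(fmap)
print(len(out), out[0])  # 4 [1.5, 3.0]
```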
  • the extracted feature map is arranged in a fixed one-dimensional form through the SPP (Spatial Pyramid Pooling) process, and the feature map to which the SPP process is applied is then input to an FC-layer (Fully-connected layer).
  • SPP Spatial Pyramid Pooling Layer
  • FC-layer FC-layer
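A classic SPP-net style pooling step, which produces the fixed one-dimensional arrangement mentioned above, might be sketched as follows (the bin sizes (1, 2, 4) and the single-channel input are illustrative assumptions):

```python
def spatial_pyramid_pool(fmap, bins=(1, 2, 4)):
    """Max-pool a single-channel H x W feature map into a fixed-length vector.

    For each pyramid level `b`, the map is split into b x b cells and the
    max of each cell is kept, so the output length is sum(b*b for b in bins)
    regardless of H and W — which is what lets an FC-layer follow.
    """
    h, w = len(fmap), len(fmap[0])
    pooled = []
    for b in bins:
        for i in range(b):
            for j in range(b):
                rows = range(i * h // b, (i + 1) * h // b)
                cols = range(j * w // b, (j + 1) * w // b)
                pooled.append(max(fmap[r][c] for r in rows for c in cols))
    return pooled

vec = spatial_pyramid_pool([[r * 4 + c for c in range(4)] for r in range(4)])
print(len(vec))  # 21 (= 1 + 4 + 16)
```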
  • the first artificial neural network 310 can output the 1-2 feature map 12 and the 1-3 feature map 13 as intermediate output information, where the 1-2 feature map 12 and the 1-3 feature map 13 refer to feature maps having scale values different from that of the 1-1 feature map 11.
  • the 1-3 feature map 13 is a feature map with a smaller number of channels than the 1-2 feature map 12, and the 1-2 feature map 12 is a feature map with a smaller number of channels than the 1-1 feature map 11.
  • the second artificial neural network 320 is an artificial neural network composed of a PANet (Path Aggregation Network) stage that refines the features of the feature maps that have passed through the first artificial neural network 310, and it corresponds to an artificial neural network whose aim is to strengthen the information flow.
  • the second artificial neural network 320 is an artificial neural network that collects and processes the feature maps of various sizes (with various scale values) produced by the first artificial neural network 310. Because the low-level feature maps acquired in the earlier stages are not properly delivered to the later convolution layers, it has a feature that improves localization accuracy by strengthening the entire feature hierarchy through bottom-up path augmentation.
  • a shortcut connecting the bottom layer to the top layer and a convolution layer enable a smoother information flow than in the existing backbone model.
  • the Concat layer 321 of the second artificial neural network serves to combine the upsampling layer 325, which is the immediately preceding layer in the second artificial neural network 320, with the third CSP layer 313 of the first artificial neural network 310.
  • the second artificial neural network 320 uses a total of four Concat layers 321, 322, 323, and 324. The first Concat layer 321 is connected to the third CSP layer 313 of the first artificial neural network 310; the second Concat layer 322, which combines the CSP layer 313 of the first artificial neural network 310 with the upsampling layer, detects small-scale objects; and the remaining two Concat layers 323 and 324 combine the CSP and Conv layers of the second artificial neural network 320 to detect medium-scale and large-scale objects.
  • the 2-1 feature map 21 that passed through the second Concat layer 322, the 2-2 feature map 22 that passed through the third Concat layer 323, and the 2-3 feature map 23 that passed through the fourth Concat layer 324 are output as feature maps with different scales (the scale of the 2-3 feature map is the largest, that of the 2-2 feature map is intermediate, and that of the 2-1 feature map is the smallest), and the feature maps output in this way are each provided as input information to the third artificial neural network 330.
  • the third artificial neural network 330 performs convolution on the three input images at their respective scales and finally derives the resulting values as the 3-1 feature map 31, the 3-2 feature map 32, and the 3-3 feature map 33.
  • the resulting values use multi-scale feature maps to generate anchor boxes quickly, with a fast inference speed.
  • the three feature maps output from the third artificial neural network 330 are finally input to the FC-layer 340, and in the FC-layer, window set information (area information and numerical information) calculated based on the most reliable of the three feature maps is output as the final output information of the window set information calculation artificial neural network module 300.
  • Figure 10 is a diagram showing the final output information of the window set information calculating artificial neural network module 300 according to an embodiment of the present invention.
  • the final output information can be output as an anchor box image that divides the window set into sections, and it can be seen that the image is derived with anchor boxes accurately created only on the portions of the image corresponding to glass.
  • the discrimination results are good because obstacles blocking the front of the window set and images reflected on the windows were also reflected in the deep learning training.
  • the final information may include, as text file information, the window number and the corresponding X-coordinate, Y-coordinate, width, height, and the like.
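A sketch of such a text-file record is shown below (the whitespace-delimited field order is an assumption; the description only states that the window number, X/Y coordinates, width, and height are stored):

```python
def format_window_line(window_id, x, y, width, height):
    """Serialize one detected window as a whitespace-delimited text line."""
    return f"{window_id} {x} {y} {width} {height}"

def parse_window_line(line):
    """Parse a line back into (id, x, y, width, height)."""
    wid, x, y, w, h = line.split()
    return int(wid), float(x), float(y), float(w), float(h)

# Hypothetical detection: window 0 at (120.5, 80.0), 64 x 96 pixels.
line = format_window_line(0, 120.5, 80.0, 64.0, 96.0)
print(line)  # 0 120.5 80.0 64.0 96.0
print(parse_window_line(line))
```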
  • FIG. 11 is a diagram illustrating a learning method of the window set information calculation artificial neural network module according to an embodiment of the present invention
  • FIG. 12 is a diagram illustrating a method of labeling learning data according to an embodiment of the present invention.
  • the window set information calculation artificial neural network module 300 receives a new image as input data. Object detection is then performed on the input image, and if the uncertainty result for the object detection is lower than the target uncertainty, object data is manually created and additional learning is performed.
  • the method of manually generating learning data includes a process of labeling the data. As shown in FIG. 12, a part corresponding to the glass area is designated in an image of the exterior wall of a building, and then a data value is labeled for the designated part and the data is saved. For example, in this step, data augmentation was performed by regenerating 20 images with the contrast and brightness of the image adjusted.
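The contrast/brightness augmentation described above might be sketched as follows (the linear pixel transform and the parameter ranges are assumptions; the description only states that 20 adjusted copies were regenerated):

```python
import random

def adjust(pixels, contrast, brightness):
    """Apply p' = clamp(contrast * p + brightness, 0, 255) to a flat pixel list."""
    return [min(255, max(0, round(contrast * p + brightness))) for p in pixels]

def augment(pixels, n=20, seed=0):
    """Regenerate `n` randomly adjusted copies of one image.

    The uniform ranges below are illustrative, not values from the patent.
    """
    rng = random.Random(seed)
    return [adjust(pixels, rng.uniform(0.8, 1.2), rng.uniform(-30, 30))
            for _ in range(n)]

copies = augment([0, 64, 128, 192, 255])
print(len(copies))  # 20
```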
  • object detection is performed on the input image, and if the uncertainty result for the object detection is higher than the target uncertainty, the object data was augmented and learned while the detected object data was simultaneously extracted.
  • an auto-labeling method was used in this process: in the case of the present invention, when an untrained image is input to the artificial neural network model, the resulting labeling data is used for training the artificial neural network, which had the effect of improving the accuracy of the artificial neural network model and dramatically shortening the work time.
  • FIG. 13 is a diagram illustrating a method by which the window set information calculation device calculates solar heat gain according to an embodiment of the present invention
  • FIG. 14 is a graph showing the monthly/annual solar radiation gain of the window set calculated according to the method shown in FIG. 13.
  • the optical characteristic information generator 400 may calculate solar heat gain information based on the output information output by the window set information calculation artificial neural network module 300.
  • the optical characteristics information generator 400 calculates the window area (A) based on the window set information output by the window set information calculation artificial neural network module 300, and can then calculate the solar heat gain (Q_S,tr) by multiplying the window area (A) by the average monthly solar radiation (I_s), the total effective energy transmittance (g_eff), which reflects the presence or absence of a shading device in the window set and the characteristics of single or double windows, and the glass area ratio (F_F).
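Treating the relation above as a simple product, the solar heat gain computation can be sketched as follows (the numeric inputs below are illustrative assumptions, not values from the patent):

```python
def solar_heat_gain(area_m2, i_s, g_eff, f_f):
    """Q_S,tr = A * I_s * g_eff * F_F.

    area_m2 : window area A derived from the detected window set
    i_s     : average monthly solar radiation
    g_eff   : total effective energy transmittance (shading, single/double glazing)
    f_f     : glass area ratio of the window set
    """
    return area_m2 * i_s * g_eff * f_f

# Illustrative numbers only.
q = solar_heat_gain(area_m2=2.1, i_s=100.0, g_eff=0.6, f_f=0.7)
print(round(q, 2))  # 88.2
```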
  • Figure 15 is a diagram illustrating a method for calculating a daylight area by a window set information calculation device according to an embodiment of the present invention.
  • the optical characteristics information generator 400 derives the maximum depth of the daylight zone through the window vertical dimension derived by the window set information calculation artificial neural network module 300. Afterwards, the width of the daylight zone is derived using the derived horizontal dimension of the window and the maximum depth of the daylight zone, and the daylight area can then be derived using the product of the width and depth of the daylight zone, as in the equations below.
  • In Equation 2, a_D,max refers to the maximum depth of the daylight zone, h_Li refers to the height to the top of the window, and h_Ta refers to the height to the bottom of the window.
  • the optical characteristic information generator 400 can derive the maximum depth of the daylight zone through Equation 2 and the vertical dimension of the window calculated by the window set information calculation artificial neural network module 300.
  • In Equation 3, b_D means the width of the daylight zone, and b_wi means the width of the window.
  • using Equation 3, the horizontal dimension of the window calculated by the window set information calculation artificial neural network module 300, and the derived maximum depth of the daylight zone, the optical characteristics information generation unit 400 can derive the width of the daylight zone and then obtain the daylight area as in Equation 4 below.
  • Equation 4: A_D = a_D x b_D [m²]
  • A_D means the daylight area, a_D means the depth of the daylight zone, and b_D means the width of the daylight zone.
  • the optical characteristic information generator 400 can finally derive information about the daylight area by multiplying the width and depth of the daylight zone as in Equation 4 above.
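Putting the equations above together, a daylight-area computation might be sketched as follows (the 2.5 depth factor relating the window head and sill heights to the maximum zone depth, and taking the zone width equal to the window width, are assumptions in the style of DIN V 18599; the patent's exact Equations 2 and 3 are not reproduced here):

```python
def daylight_area(h_li, h_ta, b_wi, depth_factor=2.5):
    """Return (a_D_max, b_D, A_D) for one detected window.

    h_li : height to the top of the window [m]
    h_ta : height to the bottom of the window [m]
    b_wi : window width [m]

    a_D_max = depth_factor * (h_li - h_ta)   # assumed form of Equation 2
    b_D     = b_wi                           # assumed form of Equation 3
    A_D     = a_D_max * b_D                  # Equation 4
    """
    a_d_max = depth_factor * (h_li - h_ta)
    b_d = b_wi
    return a_d_max, b_d, a_d_max * b_d

# Hypothetical window: head at 2.0 m, sill at 0.9 m, 1.2 m wide.
a_d, b_d, area = daylight_area(h_li=2.0, h_ta=0.9, b_wi=1.2)
print(round(area, 3))  # 3.3
```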
  • Figure 16 is a diagram comparing the value for the daylight area derived by the window set information calculation device and the value derived from the finite element analysis results according to an embodiment of the present invention.
  • the daylight area value (2.1100625) calculated by the present invention and the value (2.1) derived by finite element analysis differ by only 0.48%, showing that the accuracy of the present invention is very high.
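That 0.48% figure follows directly from the two area values:

```python
def relative_error_pct(calculated, reference):
    """Percentage deviation of `calculated` from `reference`."""
    return abs(calculated - reference) / reference * 100.0

err = relative_error_pct(2.1100625, 2.1)
print(round(err, 2))  # 0.48
```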
  • the method and device for automatically calculating window set information using an artificial neural network label the shape of the window in a window set image of a building and train the algorithm of the artificial neural network module with it, so that the window set shape information of the building can be extracted within a short time using only a photographic image; there is also the advantage of being able to detect the shape information of the window set and extract the window set information even when an unlearned window set image is input.
  • the method and device for automatically calculating window set information using an artificial neural network have the effect of continuously improving accuracy by automatically performing additional learning through auto-labeling of new images.
  • the method and device for automatically calculating window set information using an artificial neural network facilitate obtaining information on the optical characteristics of the window set, such as solar radiation gain and daylight area, by using the extracted window set information.
  • devices and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • a processing device may run an operating system (OS) and one or more software applications executed on the operating system. Additionally, a processing device may access, store, manipulate, process, and generate information in response to the execution of software.
  • for convenience of understanding, a single processing device may be described as being used; however, those skilled in the art will understand that a processing device may include multiple processing elements and/or multiple types of processing elements.
  • a processing device may include multiple processors or one processor and one controller. Additionally, other processing configurations, such as parallel processors, are possible.
  • Software may include a computer program, code, instructions, or a combination of one or more of these, and may configure a processing device to operate as desired or command the processing device independently or collectively.
  • Software and/or information may be embodied in any type of machine, component, physical device, virtual equipment, or computer storage medium or device in order to be interpreted by a processing device or to provide instructions or information to a processing device.
  • Software may be distributed over networked computer systems and stored or executed in a distributed manner. Software and information may be stored on one or more computer-readable recording media.
  • Methods according to embodiments may be implemented in the form of program instructions that can be executed through various computer means and recorded on computer-readable media.
  • Examples of computer-readable recording media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROM and DVD; magneto-optical media such as floptical disks; and hardware devices specially configured to store and execute program instructions, such as ROM, RAM, and flash memory.
  • Examples of program instructions include not only machine language code, such as that produced by a compiler, but also high-level language code that can be executed by a computer using an interpreter or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Image Analysis (AREA)

Abstract

A device for automatically calculating window set information using an artificial neural network according to an embodiment comprises: a pre-trained window set information calculation artificial neural network module that uses first input information including image information as input information and uses, as output information, window set information obtained by extracting window set features from the image information; and a feature extraction unit for extracting a 1-1 feature map and a 1-2 feature map having different scale values for the image information of the window set information calculation artificial neural network module, wherein the window set information calculation artificial neural network module can generate a 3-1 feature map and a 3-2 feature map by applying convolution to the 1-1 feature map and the 1-2 feature map, respectively, determine one of the 3-1 feature map and the 3-2 feature map as a final feature map according to a predefined standard, and then calculate the window set information based on the selected final feature map.
PCT/KR2023/013760 2022-09-15 2023-09-13 Method and device for automatically calculating window set information using an artificial neural network WO2024058556A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0116577 2022-09-15
KR1020220116577A KR20240037716A (ko) 2022-09-15 2022-09-15 Method and device for automatically calculating window set information using an artificial neural network

Publications (1)

Publication Number Publication Date
WO2024058556A1 true WO2024058556A1 (fr) 2024-03-21

Family

ID=90275512

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/013760 WO2024058556A1 (fr) 2022-09-15 2023-09-13 Method and device for automatically calculating window set information using an artificial neural network

Country Status (2)

Country Link
KR (1) KR20240037716A (fr)
WO (1) WO2024058556A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102351710B1 * 2021-11-01 2022-01-17 성한 주식회사 Air-conditioning control method for thermal comfort according to the solar radiation load of a building
KR20220012788A * 2020-07-23 2022-02-04 주식회사 어반베이스 Apparatus and method for analyzing symbols included in facility floor plans
KR102365554B1 * 2021-10-18 2022-02-18 김창원 Method and apparatus for providing a window replacement platform using an artificial intelligence algorithm
KR102380014B1 * 2021-10-28 2022-03-29 한국건설기술연구원 Method for detecting and determining areas of degraded thermal performance in windows based on regional temperature differences
KR20220074715A * 2020-11-27 2022-06-03 삼성전자주식회사 Image processing method and apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101714400B1 호서대학교 산학협력단 Energy building information modeling analysis system and method for automatic real-time calculation of type-specific energy performance statements
KR101933876B1 김활석 Building sunlight characteristic simulation system and control method thereof


Also Published As

Publication number Publication date
KR20240037716A (ko) 2024-03-22

Similar Documents

Publication Publication Date Title
WO2020256418A2 - Computing system for implementing a virtual sensor using a digital twin, and real-time data collection method using the same
WO2019107614A1 - Method and system for machine-vision-based quality inspection using deep learning in a manufacturing process
WO2021201422A1 - Semantic segmentation method and system applicable to AR
WO2017164478A1 - Method and apparatus for recognizing micro-expressions through deep-learning analysis of micro-facial dynamics
CN111401293B - A gesture recognition method based on a lightweight-head Mask Scoring R-CNN
WO2020196985A1 - Apparatus and method for video action recognition and action section detection
WO2021153861A1 - Multiple-object detection method and apparatus therefor
CN113569882A - A fast pedestrian detection method based on knowledge distillation
WO2020231226A1 - Method for performing, by an electronic device, a convolution operation at a given layer in a neural network, and electronic device therefor
WO2020246655A1 - Situation recognition method and device for implementing the same
CN111931719B - Method and device for detecting objects thrown from height
WO2021235682A1 - Method and device for performing behavior prediction using explainable self-focused attention
CN112541403B - An indoor human fall detection method using an infrared camera
CN108846852A - Surveillance video abnormal event detection method based on multiple instances and time series
WO2021010671A2 - Disease diagnosis system and method for performing segmentation using a neural network and a non-local block
WO2021132829A1 - Roof edge image extraction method for installing a solar panel using machine learning
CN113052103A - A neural-network-based method and device for detecting defects in electrical equipment
CN113139502A - Unsupervised video segmentation method
WO2024058556A1 - Method and device for automatically calculating window set information using an artificial neural network
CN117409481A - An action detection method based on 2D CNN and 3D CNN
WO2020050456A1 - Method for evaluating the degree of abnormality of equipment data
CN113076808B - A method for accurately obtaining bidirectional pedestrian flow through an image algorithm
WO2020209487A1 - Artificial-neural-network-based place recognition device and learning device therefor
CN116310293B - A target detection method generating high-quality candidate boxes based on weakly supervised learning
Greiffenhagen et al. The systematic design and analysis cycle of a vision system: A case study in video surveillance

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23865852

Country of ref document: EP

Kind code of ref document: A1