CN116819991A - Intelligent building monitoring system and monitoring method thereof - Google Patents


Info

Publication number
CN116819991A
CN116819991A
Authority
CN
China
Prior art keywords
brightness, training, feature, building, vector
Prior art date
Legal status
Pending
Application number
CN202310142757.3A
Other languages
Chinese (zh)
Inventor
张高锋
刘慧慧
曲胜
庞乃杰
黄云
朱一铭
Current Assignee
Cecep Green Building Environmental Protection Technology Co ltd
Original Assignee
Cecep Green Building Environmental Protection Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Cecep Green Building Environmental Protection Technology Co ltd filed Critical Cecep Green Building Environmental Protection Technology Co ltd
Priority to CN202310142757.3A
Publication of CN116819991A


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B 20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B 20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Image Analysis (AREA)

Abstract

An intelligent building monitoring system and a monitoring method thereof are disclosed. They take into account that the indoor brightness and the outdoor brightness of a building are correlated, and that uneven brightness across different parts of the building can affect the judgment of the building's overall lighting effect. In the technical scheme of the application, therefore, the lighting effect of the whole building is detected based on the relative temporal variation of the indoor and outdoor brightness information together with the image features of the building, so as to judge whether the lighting effect needs to be adjusted.

Description

Intelligent building monitoring system and monitoring method thereof
Technical Field
The application relates to the technical field of building monitoring, in particular to an intelligent building monitoring system and a monitoring method thereof.
Background
With the development of cities, the number of buildings has gradually increased. A large amount of public space exists inside these buildings; such space often needs continuous 24-hour illumination, and the whole building needs to be kept under a good lighting effect. In practice, however, part of the public space needs no illumination when daylight is sufficient, and part needs only weak illumination, so the general illumination methods currently adopted waste a great deal of resources.
Therefore, an optimized intelligent building monitoring system is desired that can monitor the illumination condition of a building in real time, so that when the lighting effect of the building is poor, the various lighting devices of the building can be intelligently regulated, thereby improving the energy efficiency of illumination and achieving the goal of energy saving.
Disclosure of Invention
The present application has been made to solve the above technical problems. The embodiments of the application provide an intelligent building monitoring system and a monitoring method thereof. They take into account that the indoor brightness and the outdoor brightness of a building are correlated, and that uneven brightness across different parts of the building can affect the judgment of the building's overall lighting effect. In the technical scheme of the application, therefore, the lighting effect of the whole building is detected based on the relative temporal variation of the indoor and outdoor brightness information together with the image features of the building, so as to judge whether the lighting effect needs to be adjusted.
According to one aspect of the present application, there is provided an intelligent building monitoring system, comprising: a building monitoring unit for acquiring outdoor brightness values and indoor brightness values at a plurality of predetermined time points within a predetermined time period, collected by a luminance meter, and a building illumination image at the current time point, collected by a camera; a brightness structuring unit, configured to arrange the outdoor brightness values and the indoor brightness values of the plurality of predetermined time points into an outdoor brightness input vector and an indoor brightness input vector, respectively, according to the time dimension; a relative brightness transfer unit for calculating a transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector; a relative brightness feature extraction unit for passing the transfer brightness matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector; an illumination image feature extraction unit for passing the building illumination image through a second convolutional neural network model including a depth feature fusion module to obtain a building illumination feature matrix; an illumination relative brightness transfer unit for calculating a transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as a classification feature vector; and a monitoring result generating unit for passing the classification feature vector through a classifier to obtain a classification result, the classification result being used to indicate whether the overall lighting effect of the building at the current time point needs to be adjusted.
In the above intelligent building monitoring system, the relative brightness transfer unit is further configured to calculate the transfer luminance matrix of the indoor luminance input vector relative to the outdoor luminance input vector with the following formula:

V₁ = M × V₂

wherein V₁ represents the indoor luminance input vector, V₂ represents the outdoor luminance input vector, M represents the transfer luminance matrix, and × represents matrix multiplication.
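As an illustrative sketch (not part of the original disclosure), the transfer relation between two vectors admits a minimal-norm closed form M = (V₁V₂ᵀ)/⟨V₂, V₂⟩; the function name and the sample luminance readings below are assumptions:

```python
import numpy as np

def transfer_luminance_matrix(v_indoor, v_outdoor):
    """Minimal-norm matrix M satisfying v_indoor = M @ v_outdoor.

    One plausible closed form: M = outer(v_indoor, v_outdoor) / <v_outdoor, v_outdoor>.
    """
    v_indoor = np.asarray(v_indoor, dtype=float)
    v_outdoor = np.asarray(v_outdoor, dtype=float)
    return np.outer(v_indoor, v_outdoor) / np.dot(v_outdoor, v_outdoor)

# Toy indoor/outdoor luminance readings at 4 predetermined time points
v_in = np.array([120.0, 95.0, 80.0, 60.0])
v_out = np.array([800.0, 600.0, 400.0, 200.0])
M = transfer_luminance_matrix(v_in, v_out)
assert M.shape == (4, 4)
assert np.allclose(M @ v_out, v_in)  # M transfers outdoor readings into indoor readings
```

The minimal-norm solution is one of several matrices satisfying the relation; a learned or regularized variant could equally serve as the transfer representation.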
In the above intelligent building monitoring system, the relative brightness feature extraction unit is further configured to perform, in the forward pass of each layer of the first convolutional neural network model, respectively: convolution processing on the input data to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the output of the last layer of the first convolutional neural network model is the relative brightness feature vector, and the input of the first layer of the first convolutional neural network model is the transfer brightness matrix.
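A minimal NumPy sketch of one layer's forward pass (convolution, local mean pooling, nonlinear activation) might look as follows; the 3x3 kernel, 2x2 pooling window, and ReLU activation are assumptions, since the patent does not fix them:

```python
import numpy as np

def conv2d(x, kernel):
    """Valid-mode 2D convolution (cross-correlation, as in CNN frameworks)."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def mean_pool(x, size=2):
    """Mean pooling over non-overlapping size x size local patches."""
    h, w = x.shape
    x = x[:h - h % size, :w - w % size]          # trim to a multiple of the window
    return x.reshape(h // size, size, w // size, size).mean(axis=(1, 3))

def relu(x):
    return np.maximum(x, 0.0)

def layer_forward(x, kernel):
    """One layer of the first CNN: convolution -> local mean pooling -> activation."""
    return relu(mean_pool(conv2d(x, kernel)))

x = np.random.default_rng(0).standard_normal((9, 9))  # toy transfer luminance matrix
k = np.random.default_rng(1).standard_normal((3, 3))  # toy 3x3 filter
y = layer_forward(x, k)
assert y.shape == (3, 3)   # 9x9 conv 3x3 -> 7x7, trimmed to 6x6, pooled -> 3x3
assert np.all(y >= 0.0)    # ReLU output is non-negative
```

In a real deployment these layers would be stacked and their kernels learned; the final layer's output would be flattened into the relative brightness feature vector.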
In the above intelligent building monitoring system, the illumination image feature extraction unit includes: a shallow feature extraction subunit, configured to extract a shallow feature map from an M-th layer of the second convolutional neural network model, where M is greater than or equal to 1 and less than or equal to 6; a deep feature extraction subunit, configured to extract a deep feature map from an N-th layer of the second convolutional neural network model, where N/M is greater than or equal to 5 and less than or equal to 10; a fusion subunit for fusing the shallow feature map and the deep feature map using the depth feature fusion module of the second convolutional neural network model to obtain a fusion feature map; and a pooling subunit, configured to globally pool the fusion feature map along the channel dimension to obtain the building illumination feature matrix.
In the above intelligent building monitoring system, the fusion subunit is further configured to fuse the shallow feature map and the deep feature map with the following formula to obtain the fusion feature map:

F = α·F₁ ⊕ β·F₂

wherein F is the fusion feature map, F₁ is the shallow feature map, F₂ is the deep feature map, ⊕ denotes addition of the elements at corresponding positions of the shallow feature map and the deep feature map, and α and β are weighting parameters for controlling the balance between the shallow feature map and the deep feature map in the fusion feature map.
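The fusion step reduces to a weighted position-wise addition; a minimal sketch, with illustrative α and β values that the patent does not specify:

```python
import numpy as np

def fuse_features(f_shallow, f_deep, alpha=0.5, beta=0.5):
    """F = alpha * F_shallow (+) beta * F_deep, position-wise addition.

    alpha/beta weight the balance between shallow and deep features; the
    default 0.5/0.5 split is illustrative only.
    """
    assert f_shallow.shape == f_deep.shape, "maps must share a shape before fusion"
    return alpha * f_shallow + beta * f_deep

f1 = np.ones((4, 4))        # stand-in shallow feature map
f2 = np.full((4, 4), 3.0)   # stand-in deep feature map
fused = fuse_features(f1, f2, alpha=0.25, beta=0.75)
assert fused.shape == (4, 4)
assert np.allclose(fused, 2.5)  # 0.25 * 1.0 + 0.75 * 3.0
```

In practice the two maps come from layers of different resolution, so the shallow map would first be resampled to the deep map's spatial size.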
In the above intelligent building monitoring system, the illumination relative brightness transfer unit is further configured to calculate the transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as the classification feature vector with the following formula:

V_c = V_r × M_f

wherein V_r represents the relative brightness feature vector, M_f represents the building illumination feature matrix, V_c represents the classification feature vector, and × represents matrix multiplication.
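The transfer-vector step is a row-vector-by-matrix product; a minimal sketch with toy shapes (the actual feature dimensions are not specified in the patent):

```python
import numpy as np

def classification_feature_vector(v_rel, m_illum):
    """V_c = V_r x M_f: project the relative brightness feature vector
    through the building illumination feature matrix (row vector x matrix)."""
    return np.asarray(v_rel) @ np.asarray(m_illum)

v_r = np.array([0.2, -0.1, 0.4])                 # toy relative brightness feature vector
m_f = np.arange(12, dtype=float).reshape(3, 4)   # toy illumination feature matrix (3 x 4)
v_c = classification_feature_vector(v_r, m_f)
assert v_c.shape == (4,)
assert np.allclose(v_c, [2.8, 3.3, 3.8, 4.3])
```

The product maps the temporal relative-brightness features into the image feature domain, which is what lets the classifier weigh the two modalities jointly.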
In the above intelligent building monitoring system, the monitoring result generating unit includes: a full-connection coding subunit for performing full-connection coding on the classification feature vector using the fully connected layer of the classifier to obtain a coded classification feature vector; and a classification result generation subunit, configured to input the coded classification feature vector into the Softmax classification function of the classifier to obtain the classification result.
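The classifier head (full-connection coding followed by Softmax over the two classes "adjust" vs. "keep") can be sketched as follows; the weights shown are random stand-ins, not learned parameters:

```python
import numpy as np

def softmax(z):
    z = z - z.max()            # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(v, W, b):
    """Fully connected encoding of the classification feature vector,
    followed by Softmax over the two classes."""
    logits = W @ v + b
    return softmax(logits)

rng = np.random.default_rng(42)
v = rng.standard_normal(8)         # toy classification feature vector
W = rng.standard_normal((2, 8))    # toy weight matrix for the two classes
b = np.zeros(2)
p = classify(v, W, b)
assert p.shape == (2,)
assert np.isclose(p.sum(), 1.0)    # a probability distribution over the two classes
assert np.all(p > 0)
```

The class with the larger probability then indicates whether the overall lighting effect at the current time point needs adjustment.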
The above intelligent building monitoring system further includes a training module for training the first convolutional neural network model serving as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier. The training module includes: a training data acquisition unit for acquiring training data, the training data including outdoor training brightness values and indoor training brightness values at a plurality of predetermined time points within a predetermined time period, a building training illumination image at the current time point, and a true value of whether the overall lighting effect of the building at the current time point needs to be adjusted; a training brightness structuring unit for arranging the outdoor training brightness values and the indoor training brightness values of the plurality of predetermined time points into an outdoor training brightness input vector and an indoor training brightness input vector, respectively, according to the time dimension; a training relative brightness transfer unit for calculating a training transfer brightness matrix of the indoor training brightness input vector relative to the outdoor training brightness input vector; a training relative brightness feature extraction unit for passing the training transfer brightness matrix through the first convolutional neural network model serving as a filter to obtain a training relative brightness feature vector; a training illumination image feature extraction unit for passing the building training illumination image through the second convolutional neural network model including the depth feature fusion module to obtain a training building illumination feature matrix; a training illumination relative brightness transfer unit for calculating a transfer vector of the training relative brightness feature vector relative to the training building illumination feature matrix as a training classification feature vector; a classification loss unit for passing the training classification feature vector through the classifier to obtain a classification loss function value; and a training unit for training the first convolutional neural network model serving as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier based on the classification loss function value and by descending along the gradient direction, wherein in each round of the training, the training classification feature vector is iterated based on a vector-normed Hilbert probability spatialization.
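A minimal sketch of one gradient-descent round on a softmax cross-entropy classification loss, for a single training classification feature vector; the learning rate and toy dimensions are assumptions:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_step(W, b, v, label, lr=0.1):
    """One gradient-descent step on the cross-entropy classification loss
    for one training classification feature vector (illustrative only)."""
    p = softmax(W @ v + b)
    loss = -np.log(p[label])
    grad_logits = p.copy()
    grad_logits[label] -= 1.0           # d(loss)/d(logits) for softmax + NLL
    W -= lr * np.outer(grad_logits, v)  # descend along the gradient direction
    b -= lr * grad_logits
    return loss

rng = np.random.default_rng(0)
v = rng.standard_normal(6)              # toy training classification feature vector
W = np.zeros((2, 6))                    # classifier weights, updated in place
b = np.zeros(2)
losses = [train_step(W, b, v, label=1) for _ in range(20)]
assert losses[-1] < losses[0]           # loss decreases on the training sample
```

In the patent's scheme, the gradients would also flow back through both convolutional models, and the feature vector itself would be remapped each round by the probability spatialization described next.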
In the above intelligent building monitoring system, in each round of the training, the training classification feature vector is iterated based on the vector-normed Hilbert probability spatialization with the following formula:

v_i′ = (v_i / ‖V‖₂) · exp(−v_i² / ⟨V, V⟩)

wherein V is the training classification feature vector, ‖V‖₂ represents the two-norm of the training classification feature vector, ⟨V, V⟩ represents the inner product of the training classification feature vector with itself (i.e., the square of the two-norm), v_i is the i-th feature value of the training classification feature vector V, exp(·) represents the exponential function based on the natural constant e, and v_i′ is the i-th feature value of the optimized training classification feature vector V′.
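The per-element remapping can be sketched as follows; note that the closed form here is reconstructed from the symbols the text names (two-norm, self inner product, exponential), since the original formula was an image and may differ:

```python
import numpy as np

def hilbert_probability_spatialization(v):
    """Reconstructed per-element remapping of the training classification
    feature vector: v_i' = (v_i / ||V||_2) * exp(-v_i**2 / <V, V>).
    This closed form is an assumption inferred from the symbol descriptions."""
    v = np.asarray(v, dtype=float)
    norm = np.linalg.norm(v)        # ||V||_2, the two-norm
    inner = float(np.dot(v, v))     # <V, V> = ||V||_2 ** 2
    return (v / norm) * np.exp(-(v ** 2) / inner)

v = np.array([3.0, -4.0])           # ||v||_2 = 5, <v, v> = 25
v_opt = hilbert_probability_spatialization(v)
assert v_opt.shape == v.shape
# the exponential factor is at most 1, so each normalized entry shrinks
assert np.all(np.abs(v_opt) <= np.abs(v) / np.linalg.norm(v) + 1e-12)
```

Whatever the exact closed form, the effect described is a norm-based rescaling that damps large individual feature values relative to the vector's overall energy.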
According to another aspect of the present application, there is also provided an intelligent building monitoring method, including: acquiring outdoor brightness values and indoor brightness values at a plurality of predetermined time points within a predetermined time period, collected by a luminance meter, and a building illumination image at the current time point, collected by a camera; arranging the outdoor brightness values and the indoor brightness values of the plurality of predetermined time points into an outdoor brightness input vector and an indoor brightness input vector, respectively, according to the time dimension; calculating a transfer luminance matrix of the indoor brightness input vector relative to the outdoor brightness input vector; passing the transfer luminance matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector; passing the building illumination image through a second convolutional neural network model including a depth feature fusion module to obtain a building illumination feature matrix; calculating a transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as a classification feature vector; and passing the classification feature vector through a classifier to obtain a classification result, the classification result being used to indicate whether the overall lighting effect of the building at the current time point needs to be adjusted.
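The steps above can be chained into a schematic end-to-end pass; the fixed random projections below stand in for the two learned CNN models purely to show how the tensors flow, and all names and dimensions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

T = 4                                  # number of predetermined time points
P1 = rng.standard_normal((3, T * T))   # stand-in for the first CNN (filter)
M_illum = rng.standard_normal((3, 5))  # stand-in building illumination feature matrix
W = rng.standard_normal((2, 5))        # stand-in classifier weights
b = np.zeros(2)

def monitor(outdoor, indoor):
    """Schematic end-to-end pass of the monitoring method."""
    v_out = np.asarray(outdoor, dtype=float)   # arranged by time dimension
    v_in = np.asarray(indoor, dtype=float)
    # transfer luminance matrix of indoor relative to outdoor (minimal-norm form)
    M = np.outer(v_in, v_out) / np.dot(v_out, v_out)
    v_rel = P1 @ M.ravel()                     # relative brightness feature vector
    v_cls = v_rel @ M_illum                    # transfer vector -> classification features
    p = np.exp(W @ v_cls + b)
    p /= p.sum()                               # Softmax over {keep, adjust}
    return int(p.argmax())

decision = monitor([800, 600, 400, 200], [120, 95, 80, 60])
assert decision in (0, 1)                      # e.g. 1 could mean "adjust the lighting"
```

The real system would replace both projections with the trained convolutional models and feed the second one from the camera image rather than a fixed matrix.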
In the above intelligent building monitoring method, calculating a transfer luminance matrix of the indoor luminance input vector relative to the outdoor luminance input vector includes calculating the transfer luminance matrix with the following formula:

V₁ = M × V₂

wherein V₁ represents the indoor luminance input vector, V₂ represents the outdoor luminance input vector, M represents the transfer luminance matrix, and × represents matrix multiplication.
In the above intelligent building monitoring method, passing the transfer luminance matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector includes performing, in the forward pass of each layer of the first convolutional neural network model, respectively: convolution processing on the input data to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activation feature map; wherein the output of the last layer of the first convolutional neural network model is the relative brightness feature vector, and the input of the first layer of the first convolutional neural network model is the transfer luminance matrix.
In the above intelligent building monitoring method, passing the building illumination image through a second convolutional neural network model including a depth feature fusion module to obtain a building illumination feature matrix includes: extracting a shallow feature map from an M-th layer of the second convolutional neural network model, wherein M is greater than or equal to 1 and less than or equal to 6; extracting a deep feature map from an N-th layer of the second convolutional neural network model, wherein N/M is greater than or equal to 5 and less than or equal to 10; fusing the shallow feature map and the deep feature map using the depth feature fusion module of the second convolutional neural network model to obtain a fusion feature map; and globally pooling the fusion feature map along the channel dimension to obtain the building illumination feature matrix.
In the above intelligent building monitoring method, fusing the shallow feature map and the deep feature map using the depth feature fusion module of the second convolutional neural network model to obtain a fusion feature map includes fusing them with the following formula:

F = α·F₁ ⊕ β·F₂

wherein F is the fusion feature map, F₁ is the shallow feature map, F₂ is the deep feature map, ⊕ denotes addition of the elements at corresponding positions of the shallow feature map and the deep feature map, and α and β are weighting parameters for controlling the balance between the shallow feature map and the deep feature map in the fusion feature map.
In the above intelligent building monitoring method, calculating a transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as a classification feature vector includes calculating the transfer vector with the following formula:

V_c = V_r × M_f

wherein V_r represents the relative brightness feature vector, M_f represents the building illumination feature matrix, V_c represents the classification feature vector, and × represents matrix multiplication.
In the above intelligent building monitoring method, passing the classification feature vector through a classifier to obtain a classification result, the classification result being used to indicate whether the overall lighting effect of the building at the current time point needs to be adjusted, includes: performing full-connection coding on the classification feature vector using the fully connected layer of the classifier to obtain a coded classification feature vector; and inputting the coded classification feature vector into the Softmax classification function of the classifier to obtain the classification result.
The above intelligent building monitoring method further includes training the first convolutional neural network model serving as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier. The training includes: acquiring training data, the training data including outdoor training brightness values and indoor training brightness values at a plurality of predetermined time points within a predetermined time period, a building training illumination image at the current time point, and a true value of whether the overall lighting effect of the building at the current time point needs to be adjusted; arranging the outdoor training brightness values and the indoor training brightness values of the plurality of predetermined time points into an outdoor training brightness input vector and an indoor training brightness input vector, respectively, according to the time dimension; calculating a training transfer luminance matrix of the indoor training brightness input vector relative to the outdoor training brightness input vector; passing the training transfer luminance matrix through the first convolutional neural network model serving as a filter to obtain a training relative brightness feature vector; passing the building training illumination image through the second convolutional neural network model including the depth feature fusion module to obtain a training building illumination feature matrix; calculating a transfer vector of the training relative brightness feature vector relative to the training building illumination feature matrix as a training classification feature vector; passing the training classification feature vector through the classifier to obtain a classification loss function value; and training the first convolutional neural network model serving as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier based on the classification loss function value and by descending along the gradient direction, wherein in each round of the training, the training classification feature vector is iterated based on a vector-normed Hilbert probability spatialization.
Compared with the prior art, the intelligent building monitoring system and monitoring method provided by the application take into account that the indoor brightness and the outdoor brightness of a building are correlated, and that uneven brightness across different parts of the building can affect the judgment of the building's overall lighting effect. Therefore, in the technical scheme of the application, the lighting effect of the whole building is detected based on the relative temporal variation of the indoor and outdoor brightness information together with the image features of the building, so as to judge whether the lighting effect needs to be adjusted.
Drawings
The above and other objects, features and advantages of the present application will become more apparent from the following detailed description of embodiments of the present application with reference to the accompanying drawings. The accompanying drawings are included to provide a further understanding of the embodiments of the application and are incorporated in and constitute a part of this specification; they illustrate the application together with its embodiments and do not constitute a limitation of the application. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 is a schematic view of a scenario of an intelligent building monitoring system according to an embodiment of the present application.
Fig. 2 is a block diagram of an intelligent building monitoring system according to an embodiment of the present application.
Fig. 3 is a block diagram of an intelligent building monitoring system according to an embodiment of the present application.
Fig. 4 is a block diagram of an illumination image feature extraction unit in an intelligent building monitoring system according to an embodiment of the present application.
Fig. 5 is a block diagram of a training module in an intelligent building monitoring system in accordance with an embodiment of the present application.
Fig. 6 is a flowchart of an intelligent building monitoring method according to an embodiment of the present application.
Fig. 7 is a flowchart of training the first convolutional neural network model as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier in the intelligent building monitoring method according to an embodiment of the present application.
Detailed Description
Hereinafter, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some embodiments of the present application and not all embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein.
Summary of the application: as described in the background art, with the development of cities, the number of buildings has gradually increased. A large amount of public space exists inside these buildings; such space often needs continuous 24-hour illumination, and the whole building needs to be kept under a good lighting effect. In practice, however, part of the public space needs no illumination when daylight is sufficient, and part needs only weak illumination, so the general illumination methods currently adopted waste a great deal of resources. Therefore, an optimized intelligent building monitoring system is desired that can monitor the illumination condition of a building in real time, so that when the lighting effect of the building is poor, the various lighting devices of the building can be intelligently regulated, thereby improving the energy efficiency of illumination and achieving the goal of energy saving.
At present, deep learning and neural networks have been widely used in the fields of computer vision, natural language processing, speech signal processing, and the like. In addition, deep learning and neural networks have also shown levels approaching and even exceeding humans in the fields of image classification, object detection, semantic segmentation, text translation, and the like.
In recent years, deep learning and development of neural networks provide new solutions and schemes for illumination monitoring of intelligent buildings.
Accordingly, it is considered that the brightness inside a building changes with the difference between day and night and with the weather; that is, the indoor brightness and the outdoor brightness are correlated. If the lighting effect were judged from the absolute brightness of the building alone, interference from the brightness of the external environment would make the judgment inaccurate. Further, uneven brightness across different parts of the building also affects the judgment of the lighting effect of the whole building. Therefore, in the technical scheme of the application, the lighting effect of the whole building is detected based on the relative temporal variation of the indoor and outdoor brightness information together with the image features of the building, so as to judge whether the lighting effect needs to be adjusted, thereby achieving the goal of energy saving.
Specifically, in the technical scheme of the application, outdoor brightness values and indoor brightness values at a plurality of predetermined time points within a predetermined time period are first collected by a luminance meter, and a building illumination image at the current time point is collected by a camera. It should be understood that, since the technical scheme of the application adjusts the lighting effect of the whole building based on the relative brightness characteristics of the outdoor and indoor brightness values in the time dimension, the outdoor brightness values and indoor brightness values at the plurality of predetermined time points are arranged into an outdoor brightness input vector and an indoor brightness input vector, respectively, according to the time dimension, so as to integrate the brightness information along time. A transfer luminance matrix of the indoor brightness input vector relative to the outdoor brightness input vector is then calculated to represent the differential relative brightness information between the outdoor brightness and the indoor brightness.
Then, feature mining is performed on the transfer luminance matrix using a first convolutional neural network model serving as a filter, which has excellent performance in implicit feature extraction, to extract the differential relative brightness features of the outdoor brightness and the indoor brightness, i.e., the lighting effect features of the building, thereby obtaining a relative brightness feature vector.
Further, in actual lighting, the lighting effects within the building may differ for reasons specific to each resident or merchant; for example, the lighting effect may be better in one part of the building and worse in another, which makes the overall lighting effect of the building difficult to detect. Therefore, in the technical scheme of the application, the overall lighting effect of the building is further characterized by the image features of the building's illumination. In particular, in order to focus on the lighting situation of each resident or merchant in the building illumination image, the image is passed through a second convolutional neural network model including a depth feature fusion module to obtain a building illumination feature matrix. That is, shallow features such as the shape and outline of the building in the illumination image are significant for lighting effect detection, yet as the depth of a convolutional neural network increases during encoding, such shallow features become blurred and are even submerged by noise. Compared with a standard convolutional neural network model, the second convolutional neural network model including the depth feature fusion module can retain both shallow and deep features, so that the feature information is richer and features at different depths are preserved, thereby improving the accuracy of the lighting effect detection of the building.
A transfer vector of the relative brightness feature vector relative to the building illumination feature matrix is then calculated to represent the correlated feature information between the indoor-outdoor relative lighting variation features of the building and its overall illumination features. This transfer vector is used as a classification feature vector for classification in a classifier, so as to obtain a classification result indicating whether the overall lighting effect of the building at the current time point needs to be adjusted. In this way, the illumination condition of the building can be monitored in real time, so that the lighting devices of the building can be intelligently regulated when a poor lighting effect is detected.
In particular, in the technical scheme of the application, the transfer vector of the relative brightness feature vector relative to the building illumination feature matrix is calculated as the training classification feature vector. Although this vector expresses the transfer distribution of the relative brightness features relative to the image semantic features of the overall building lighting effect, that distribution lies in the transfer domain between the feature domains of the relative brightness feature vector and the building illumination feature matrix. Its dependence on a single classification result is therefore weak, which affects both the training speed of the classifier and the accuracy of the classification results of the training classification feature vector.
Therefore, in order to solve this problem, the applicant of the present application optimizes the training process of the model by applying a vector-normed Hilbert probability spatialization to the training classification feature vector. Specifically, during the classification of the classification feature vector by the classifier, in each iteration of the weight matrix of the classifier, the input of the classifier is optimized according to the following formula:
V'ᵢ = (Vᵢ / ‖V‖₂) · exp(Vᵢ / ‖V‖₂²)
wherein V is the training classification feature vector, ‖V‖₂ represents its two-norm, ‖V‖₂² its square (i.e., the inner product of the training classification feature vector with itself), Vᵢ is the i-th feature value of the training classification feature vector, and V'ᵢ is the i-th feature value of the optimized training classification feature vector.
Here, the vector-normed Hilbert probability spatialization uses the self-assignment of the training classification feature vector V to perform a probabilistic interpretation of V in the Hilbert space defined by the vector inner product. It reduces the hidden perturbation that the cascaded, locally distributed class expressions of V impose on the class expression of the overall Hilbert-space topology, thereby increasing the robustness with which V converges to a single predetermined classification result and strengthening its long-range, cross-classifier dependence on that result. The optimized training classification feature vector therefore depends more reliably on a single classification result during classification, which improves both the training speed of the classifier and the accuracy of the classification results. In this way, the illumination condition of the building can be monitored in real time, and when a poor lighting effect is detected, all lighting equipment of the building can be intelligently regulated, improving lighting energy efficiency and achieving the goal of energy saving.
Based on this, the application proposes an intelligent building monitoring system comprising: the building monitoring unit is used for acquiring outdoor brightness values and indoor brightness values of a plurality of preset time points in a preset time period acquired by the brightness meter and building illumination images of the current time point acquired by the camera; a brightness structuring unit, configured to arrange the outdoor brightness values and the indoor brightness values of the plurality of predetermined time points into an outdoor brightness input vector and an indoor brightness input vector according to a time dimension, respectively; a relative brightness transfer unit for calculating a transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector; the relative brightness characteristic extraction unit is used for passing the transfer brightness matrix through a first convolution neural network model serving as a filter to obtain a relative brightness characteristic vector; the illumination image feature extraction unit is used for enabling the building illumination image to pass through a second convolution neural network model comprising a depth feature fusion module to obtain a building illumination feature matrix; the illumination relative brightness transfer unit is used for calculating transfer vectors of the relative brightness feature vectors relative to the building illumination feature matrix as classification feature vectors; and the monitoring result generating unit is used for passing the classification feature vector through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the overall lighting effect of the building at the current time point needs to be adjusted or not.
Fig. 1 is a schematic view of a scenario of an intelligent building monitoring system according to an embodiment of the present application. As shown in fig. 1, in this application scenario, first, outdoor luminance values and indoor luminance values at a plurality of predetermined time points within a predetermined period of time acquired by a luminance meter (e.g., se as illustrated in fig. 1) and an illumination image of a building (e.g., B as illustrated in fig. 1) at a current time point acquired by a camera (e.g., C as illustrated in fig. 1) are acquired. Further, the outdoor luminance values and the indoor luminance values at a plurality of predetermined time points in the predetermined time period and the building illumination image at the current time point are input to a server (e.g., S as illustrated in fig. 1) in which an intelligent building monitoring algorithm is deployed, wherein the server is capable of processing the outdoor luminance values and the indoor luminance values at a plurality of predetermined time points in the predetermined time period and the building illumination image at the current time point based on the intelligent building monitoring algorithm to obtain a classification result for indicating whether the overall illumination effect of the building at the current time point needs to be adjusted.
Having described the basic principles of the present application, various non-limiting embodiments of the present application will now be described in detail with reference to the accompanying drawings.
Exemplary System: fig. 2 is a block diagram of an intelligent building monitoring system according to an embodiment of the present application. As shown in fig. 2, an intelligent building monitoring system 100 according to an embodiment of the present application includes: a building monitoring unit 110 for acquiring outdoor brightness values and indoor brightness values of a plurality of predetermined time points within a predetermined period of time acquired by a brightness meter and a building illumination image of a current time point acquired by a camera; a brightness structuring unit 120, configured to arrange the outdoor brightness values and the indoor brightness values of the plurality of predetermined time points into an outdoor brightness input vector and an indoor brightness input vector according to a time dimension, respectively; a relative brightness transfer unit 130 for calculating a transfer brightness matrix of the indoor brightness input vector with respect to the outdoor brightness input vector; a relative brightness feature extraction unit 140, configured to pass the transferred brightness matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector; the illumination image feature extraction unit 150 is configured to obtain a building illumination feature matrix by passing the building illumination image through a second convolutional neural network model including a depth feature fusion module; a lighting relative brightness transfer unit 160 for calculating a transfer vector of the relative brightness feature vector with respect to the building lighting feature matrix as a classification feature vector; and a monitoring result generating unit 170, configured to pass the classification feature vector through a classifier to obtain a classification result, where the classification result is used to indicate whether the overall lighting effect of the building at the current time 
point needs to be adjusted.
Fig. 3 is a block diagram of an intelligent building monitoring system according to an embodiment of the present application. As shown in fig. 3, in the architecture of the intelligent building monitoring system, first, outdoor luminance values and indoor luminance values at a plurality of predetermined time points within a predetermined period of time acquired by a luminance meter and a building illumination image at a current time point acquired by a camera are acquired. Then, the outdoor luminance values and the indoor luminance values at the plurality of predetermined time points are arranged as an outdoor luminance input vector and an indoor luminance input vector, respectively, in a time dimension. Then, a transition luminance matrix of the indoor luminance input vector relative to the outdoor luminance input vector is calculated. Further, the transfer luminance matrix is passed through a first convolutional neural network model as a filter to obtain a relative luminance feature vector. And then, the building illumination image is passed through a second convolution neural network model comprising a depth feature fusion module to obtain a building illumination feature matrix. Then, a transfer vector of the relative brightness feature vector with respect to the building lighting feature matrix is calculated as a classification feature vector. And then, the classification feature vector is passed through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the overall lighting effect of the building at the current time point needs to be adjusted.
In the intelligent building monitoring system 100, the building monitoring unit 110 is configured to obtain outdoor brightness values and indoor brightness values at a plurality of predetermined time points within a predetermined time period, acquired by a luminance meter, and a building illumination image at the current time point, acquired by a camera. As described in the background art, with the development of cities, the number of buildings gradually increases. A large amount of public space exists inside these buildings; such space often needs continuous 24-hour illumination, and the whole building needs to be kept under a good lighting effect. In practice, however, part of the public space needs no illumination at all when daylight is sufficient, and part of it needs only weak illumination, so the blanket illumination method currently adopted wastes a great deal of resources. An optimized intelligent building monitoring system is therefore desired: one that monitors the lighting condition of the building in real time and intelligently regulates the building's lighting devices when the lighting effect is poor, thereby improving lighting energy efficiency and achieving the goal of energy saving.
At present, deep learning and neural networks have been widely used in the fields of computer vision, natural language processing, speech signal processing, and the like. In addition, deep learning and neural networks have also shown levels approaching and even exceeding humans in the fields of image classification, object detection, semantic segmentation, text translation, and the like. In recent years, deep learning and development of neural networks provide new solutions and schemes for illumination monitoring of intelligent buildings.
Accordingly, indoor brightness in a building changes with the difference between day and night and with the weather; that is, indoor brightness and outdoor brightness are correlated. If the lighting effect were judged from the absolute brightness of the building alone, interference from the brightness of the external environment would make the judgment inaccurate. Further, the brightness at different places in the building is not uniform, which also affects the judgment of the lighting effect of the whole building. In the technical scheme of the application, therefore, the overall building lighting effect is detected based on the relative temporal change features of the indoor and outdoor brightness information together with the image features of the building, so as to judge whether the lighting effect needs to be adjusted and thus achieve the goal of energy saving. Specifically, outdoor brightness values and indoor brightness values at a plurality of predetermined time points within a predetermined time period are first acquired by a luminance meter, and a building illumination image at the current time point is acquired by a camera.
In the intelligent building monitoring system 100, the luminance structuring unit 120 is configured to arrange the outdoor luminance values and the indoor luminance values at the plurality of predetermined time points into an outdoor luminance input vector and an indoor luminance input vector according to a time dimension, respectively. It should be understood that, since the technical solution of the present application is to adjust the lighting effect of the whole building based on the relative brightness characteristic between the outdoor brightness value and the indoor brightness value in the time dimension, the outdoor brightness value and the indoor brightness value at the plurality of predetermined time points are further arranged as an outdoor brightness input vector and an indoor brightness input vector according to the time dimension, respectively, to integrate the respective brightness information in the time dimension.
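As a concrete illustration of this structuring step, the following pure-Python sketch arranges paired luminance-meter readings into the two time-ordered input vectors; the sample times and brightness values are invented for illustration and are not from the application:

```python
# Hypothetical readings from the luminance meter; all values and times are invented.
readings = [
    {"t": "08:00", "outdoor": 12000.0, "indoor": 420.0},
    {"t": "12:00", "outdoor": 55000.0, "indoor": 480.0},
    {"t": "16:00", "outdoor": 18000.0, "indoor": 450.0},
    {"t": "20:00", "outdoor": 50.0, "indoor": 510.0},
]

# Sort by acquisition time so both vectors share one temporal ordering,
# then split into the outdoor and indoor brightness input vectors.
readings.sort(key=lambda r: r["t"])
outdoor_vec = [r["outdoor"] for r in readings]
indoor_vec = [r["indoor"] for r in readings]
```

Each vector then holds one entry per predetermined time point, ready for the transfer-matrix computation of the next unit.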
In the intelligent building monitoring system 100, the relative brightness transfer unit 130 is configured to calculate a transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector. After the indoor and outdoor brightness information is integrated in the time dimension, the transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector is calculated to represent the differential relative brightness information between the outdoor brightness and the indoor brightness.
Specifically, in the embodiment of the present application, the relative brightness transfer unit 130 is further configured to calculate the transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector according to the following formula: V₁ = M ⊗ V₂, wherein V₁ represents the indoor brightness input vector, V₂ represents the outdoor brightness input vector, M represents the transfer brightness matrix, and ⊗ represents matrix multiplication.
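Reading the symbol glossary as V_indoor = M ⊗ V_outdoor (an assumption, since the formula image itself is not preserved in this text), the relation does not fix M uniquely; one minimal, hypothetical realization is a scaled outer product M = V₁V₂ᵀ/⟨V₂, V₂⟩, sketched below in pure Python with invented function names:

```python
def transfer_matrix(v_in, v_out):
    """One minimal M satisfying matvec(M, v_out) == v_in: a scaled outer product."""
    denom = sum(x * x for x in v_out)  # inner product <v_out, v_out>
    return [[vi * vo / denom for vo in v_out] for vi in v_in]

def matvec(m, v):
    """Plain matrix-vector multiplication."""
    return [sum(mij * vj for mij, vj in zip(row, v)) for row in m]
```

For example, `transfer_matrix([2.0, 4.0], [1.0, 3.0])` yields a matrix that maps `[1.0, 3.0]` back to (approximately) `[2.0, 4.0]` under `matvec`.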
In the intelligent building monitoring system 100, the relative brightness feature extraction unit 140 is configured to pass the transferred brightness matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector. That is, feature mining is performed on the transition luminance matrix using a first convolutional neural network model as a filter having excellent performance in implicit feature extraction to extract the differential relative luminance features of the outdoor luminance and the indoor luminance, i.e., the lighting effect features of the building, thereby obtaining a relative luminance feature vector.
Specifically, in the embodiment of the present application, the relative brightness feature extraction unit 140 is configured such that each layer of the first convolutional neural network model performs, in the forward pass of that layer: convolution processing of the input data to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation of the pooled feature map to obtain an activated feature map. The output of the last layer of the first convolutional neural network model is the relative brightness feature vector, and the input of the first layer of the first convolutional neural network model is the transfer brightness matrix.
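The three per-layer operations just listed can be sketched in pure Python for a single-channel 2-D input; the kernel size, 2×2 pooling window, and ReLU-style activation are illustrative assumptions, not specifics stated in the application:

```python
def conv2d(x, k):
    """'Valid' 2-D convolution (cross-correlation, as CNN libraries compute it)."""
    n, m, kn, km = len(x), len(x[0]), len(k), len(k[0])
    return [[sum(x[i + a][j + b] * k[a][b] for a in range(kn) for b in range(km))
             for j in range(m - km + 1)] for i in range(n - kn + 1)]

def mean_pool2(x):
    """Mean pooling over non-overlapping 2x2 local feature matrices."""
    return [[(x[i][j] + x[i][j + 1] + x[i + 1][j] + x[i + 1][j + 1]) / 4.0
             for j in range(0, len(x[0]) - 1, 2)]
            for i in range(0, len(x) - 1, 2)]

def relu(x):
    """Element-wise nonlinear activation."""
    return [[max(0.0, v) for v in row] for row in x]
```

Chaining `relu(mean_pool2(conv2d(...)))` layer by layer mirrors one forward pass of the filter model.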
In the intelligent building monitoring system 100, the illumination image feature extraction unit 150 is configured to pass the building illumination image through a second convolutional neural network model comprising a depth feature fusion module to obtain a building illumination feature matrix. Further, in actual building illumination, the lighting effect may vary from one resident or merchant to another: some parts of the building may be well lit while others are poorly lit, which makes the overall lighting effect of the building difficult to assess. In the technical scheme of the application, therefore, the overall lighting effect of the building is further characterized by the features of the building illumination image. In particular, in order to focus on the lighting situation of each resident or merchant in the building illumination image, the building illumination image is passed through the second convolutional neural network model comprising the depth feature fusion module to obtain the building illumination feature matrix.
That is, shallow features such as the shape and outline of the building in the illumination image are significant for lighting effect detection, yet during encoding they become blurred and even submerged by noise as the convolutional neural network deepens. In the technical scheme of the application, the building illumination image is therefore processed with a second convolutional neural network model comprising a deep-and-shallow feature fusion module. Compared with a standard convolutional neural network model, this model retains both the shallow features and the deep features, so that the feature information is richer and features at different depths are preserved, thereby improving the accuracy of the lighting effect detection of the building.
In one specific example, the second convolutional neural network model includes a plurality of cascaded neural network layers, each comprising a convolutional layer, a pooling layer, and an activation layer. During encoding, each layer of the second convolutional neural network model, in the forward pass of that layer, uses the convolutional layer to perform kernel-based convolution on the input data, uses the pooling layer to pool the convolution feature map output by the convolutional layer, and uses the activation layer to activate the pooled feature map output by the pooling layer, where the input data of the first layer is the building illumination image. Each layer of the second convolutional neural network model can output a feature map. In the technical scheme of the application, a shallow feature map is extracted from a shallow layer (e.g., layer M) of the second convolutional neural network model, and a deep feature map is extracted from a deep layer (e.g., layer N). It should be noted that M ranges over layers 2 to 6, and N is greater than M. That is, the second convolutional neural network model extracts shallow features such as the shape and outline of the building at layers 2 to 6, and extracts deep, substantive features such as the overall lighting effect of the building at its deeper layers. In this way, the shallow and deep features of the building illumination image can be extracted separately, so that the different feature information of the building can be better exploited for an accurate judgment.
Fig. 4 is a block diagram of an illumination image feature extraction unit in an intelligent building monitoring system according to an embodiment of the present application. As shown in fig. 4, the illumination image feature extraction unit 150 includes: a shallow feature extraction subunit 151, configured to extract a shallow feature map from an mth layer of the second convolutional neural network model, where M is greater than or equal to 1 and less than or equal to 6; a deep feature extraction subunit 152, configured to extract a deep feature map from an nth layer of the second convolutional neural network model, where N/M is greater than or equal to 5 and less than or equal to 10; a fusion subunit 153, configured to fuse the shallow feature map and the deep feature map by using a deep-shallow feature fusion module of the second convolutional neural network model to obtain a fused feature map; and a pooling subunit 154 configured to globally pool the fused feature map along a channel dimension to obtain the building lighting feature matrix.
Specifically, in the embodiment of the present application, the fusion subunit 153 is further configured to fuse the shallow feature map and the deep feature map according to the following formula to obtain the fused feature map: F = α·Fs ⊕ Fd, wherein F is the fused feature map, Fs is the shallow feature map, Fd is the deep feature map, "⊕" denotes addition of the elements at the corresponding positions of the shallow feature map and the deep feature map, and α is a weighting parameter for controlling the balance between the shallow feature map and the deep feature map in the fused feature map.
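A pure-Python sketch of the weighted element-wise fusion performed by subunit 153 and the channel-wise global pooling performed by subunit 154; the default weighting α = 0.5 and the list-of-lists shapes are illustrative assumptions:

```python
def fuse(shallow, deep, alpha=0.5):
    """Element-wise weighted fusion: alpha * shallow (+) deep, per position."""
    return [[alpha * s + d for s, d in zip(srow, drow)]
            for srow, drow in zip(shallow, deep)]

def global_avg_over_channels(feat):
    """Average a [C][H][W] feature map along the channel dimension -> [H][W]."""
    c = len(feat)
    return [[sum(feat[k][i][j] for k in range(c)) / c
             for j in range(len(feat[0][0]))]
            for i in range(len(feat[0]))]
```

Applied per channel, `fuse` keeps both depth levels in one map, and `global_avg_over_channels` then collapses the channel axis into the building illumination feature matrix.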
In the above intelligent building monitoring system 100, the lighting relative brightness shifting unit 160 is configured to calculate a shifting vector of the relative brightness feature vector with respect to the building lighting feature matrix as a classification feature vector. That is, the transfer vector of the relative luminance feature vector with respect to the building illumination feature matrix is calculated to represent the correlation feature information of the indoor and outdoor relative illumination change feature of the building and the overall illumination feature of the building, and is taken as the classification feature vector.
Specifically, in the embodiment of the present application, the illumination relative brightness transfer unit 160 is further configured to calculate the transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as the classification feature vector according to the following formula: Vc = Mf ⊗ Vb, wherein Vb represents the relative brightness feature vector, Mf represents the building illumination feature matrix, Vc represents the classification feature vector, and ⊗ represents matrix multiplication.
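The transfer-vector computation of unit 160 is an ordinary matrix-vector product; a minimal pure-Python sketch (the function name is invented for illustration):

```python
def transfer_vector(feat_matrix, rel_vec):
    """Project the relative-brightness feature vector through the building
    illumination feature matrix, yielding the classification feature vector."""
    return [sum(mij * vj for mij, vj in zip(row, rel_vec)) for row in feat_matrix]
```

For instance, `transfer_vector([[1.0, 0.0], [0.0, 2.0]], [3.0, 4.0])` returns a vector whose components mix the relative-brightness features according to the image features.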
In the intelligent building monitoring system 100, the monitoring result generating unit 170 is configured to pass the classification feature vector through a classifier to obtain a classification result, where the classification result is used to indicate whether the overall lighting effect of the building at the current time point needs to be adjusted. Therefore, the illumination condition of the building can be monitored in real time, so that all illumination equipment of the building can be intelligently regulated and controlled when the poor illumination effect of the building is monitored.
Specifically, in the embodiment of the present application, the monitoring result generating unit 170 includes: the full-connection coding subunit is used for carrying out full-connection coding on the classification characteristic vectors by using a full-connection layer of the classifier so as to obtain coded classification characteristic vectors; and a classification result generation subunit, configured to input the encoded classification feature vector into a Softmax classification function of the classifier to obtain the classification result.
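The classifier head just described (full-connection encoding followed by a Softmax classification function) can be sketched in pure Python; the weights, biases, and two-class reading are illustrative assumptions:

```python
import math

def dense(w, b, v):
    """Fully-connected encoding: one affine layer of the classifier."""
    return [sum(wi * vi for wi, vi in zip(row, v)) + bi for row, bi in zip(w, b)]

def softmax(z):
    """Softmax classification function; the max is subtracted for numerical stability."""
    m = max(z)
    e = [math.exp(x - m) for x in z]
    s = sum(e)
    return [x / s for x in e]
```

Here class 0 might read "adjust the lighting" and class 1 "keep it as is", with the larger probability giving the classification result.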
In the intelligent building monitoring system 100, the system further includes a training module 200 for training the first convolutional neural network model as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier.
Fig. 5 is a block diagram of a training module in an intelligent building monitoring system in accordance with an embodiment of the present application. As shown in fig. 5, the training module 200 includes: a training data obtaining unit 210, configured to obtain training data, where the training data includes outdoor training brightness values, indoor training brightness values, building training illumination images at a current time point, and a real value of whether the overall building illumination effect at the current time point needs to be adjusted; a training luminance structuring unit 220, configured to arrange the outdoor training luminance values and the indoor training luminance values at the plurality of predetermined time points into an outdoor training luminance input vector and an indoor training luminance input vector according to a time dimension, respectively; a training relative brightness transfer unit 230 for calculating a training transfer brightness matrix of the indoor training brightness input vector relative to the outdoor training brightness input vector; a training relative brightness feature extraction unit 240, configured to pass the training transfer brightness matrix through the first convolutional neural network model serving as a filter to obtain a training relative brightness feature vector; the training illumination image feature extraction unit 250 is configured to obtain a training building illumination feature matrix by passing the training building illumination image through the second convolutional neural network model including the depth feature fusion module; a training illumination relative brightness transfer unit 260, configured to calculate a transfer vector of the training relative brightness feature vector relative to the training building illumination feature matrix as a training classification feature vector; a classification loss unit 270, configured to pass the training classification feature vector through the 
classifier to obtain a classification loss function value; and a training unit 280 for training the first convolutional neural network model as a filter, the second convolutional neural network model comprising the depth feature fusion module, and the classifier based on the classification loss function value and by gradient descent, wherein in each round of training, the training classification feature vector is iterated based on the vector-normed Hilbert probability spatialization.
In particular, in the technical scheme of the application, the transfer vector of the relative brightness feature vector relative to the building illumination feature matrix is calculated as the training classification feature vector. Although this vector expresses the transfer distribution of the relative brightness features relative to the image semantic features of the overall building lighting effect, that distribution lies in the transfer domain between the feature domains of the relative brightness feature vector and the building illumination feature matrix. Its dependence on a single classification result is therefore weak, which affects both the training speed of the classifier and the accuracy of the classification results of the training classification feature vector. To solve this problem, the applicant of the present application optimizes the training process of the model by applying a vector-normed Hilbert probability spatialization to the training classification feature vector.
Specifically, in the embodiment of the present application, in each iteration of the training, the training classification feature vector is iterated, based on the vector-normed Hilbert probability spatialization, according to the following formula:
V'ᵢ = (Vᵢ / ‖V‖₂) · exp(Vᵢ / ‖V‖₂²)
wherein V is the training classification feature vector, ‖V‖₂ represents its two-norm, ‖V‖₂² its square (i.e., the inner product of the training classification feature vector with itself), Vᵢ is the i-th feature value of the training classification feature vector, exp(·) denotes the exponential function with the natural constant e as its base, and V'ᵢ is the i-th feature value of the optimized training classification feature vector.
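The iteration can be sketched in pure Python as below. Because the application's formula image is lost from this text and only its symbol glossary survives, the exact exponential form used here is a hedged reconstruction and may differ from the original:

```python
import math

def hilbert_prob_spatialize(v):
    """Reconstructed vector-normed Hilbert probability spatialization:
    v_i' = (v_i / ||V||_2) * exp(v_i / ||V||_2^2).
    The exponent form is an assumption recovered from the symbol glossary,
    not the application's verbatim formula."""
    norm = math.sqrt(sum(x * x for x in v))  # ||V||_2, the two-norm
    sq = norm * norm                         # <V, V>, its square
    return [(x / norm) * math.exp(x / sq) for x in v]
```

The norm scaling bounds each component relative to the whole vector, and the exponential factor re-weights components so the vector's class expression concentrates toward a single result, matching the stated intent of the optimization.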
Here, the vector-normed Hilbert probability spatialization uses the self-assignment of the training classification feature vector V to perform a probabilistic interpretation of V in the Hilbert space defined by the vector inner product. It reduces the hidden perturbation that the cascaded, locally distributed class expressions of V impose on the class expression of the overall Hilbert-space topology, thereby increasing the robustness with which V converges to a single predetermined classification result and strengthening its long-range, cross-classifier dependence on that result. The optimized training classification feature vector therefore depends more reliably on a single classification result during classification, which improves both the training speed of the classifier and the accuracy of its classification results. In this way, the illumination condition of the building can be monitored in real time, so that when a poor lighting effect is detected, all lighting equipment of the building is intelligently regulated, further improving lighting energy efficiency and achieving the goal of energy saving.
In summary, the intelligent building monitoring system 100 according to the embodiment of the present application is illustrated, which considers that the indoor brightness and the outdoor brightness of the building have a correlation, and the uneven brightness of each place of the building affects the overall lighting effect judgment of the building. Therefore, in the technical scheme of the application, the detection of the lighting effect of the whole building is performed based on the relative change characteristics of the indoor and outdoor brightness information in time and the image characteristics of the building, so as to judge whether the lighting effect needs to be adjusted.
As described above, the intelligent building monitoring system 100 according to the embodiment of the present application may be implemented in various terminal devices, for example, a server having an intelligent building monitoring function, etc. In one example, the intelligent building monitoring system 100 according to an embodiment of the present application may be integrated into the terminal device as a software module and/or hardware module. For example, the intelligent building monitoring system 100 may be a software module in the operating system of the terminal device, or may be an application developed for the terminal device; of course, the intelligent building monitoring system 100 could equally be one of a number of hardware modules of the terminal device.
Alternatively, in another example, the intelligent building monitoring system 100 and the terminal device may be separate devices, and the intelligent building monitoring system 100 may be connected to the terminal device through a wired and/or wireless network and transmit interactive information in an agreed data format.
Exemplary method: Fig. 6 is a flowchart of an intelligent building monitoring method according to an embodiment of the present application. As shown in Fig. 6, the intelligent building monitoring method according to the embodiment of the application includes the steps of: S110, acquiring outdoor brightness values and indoor brightness values at a plurality of preset time points in a preset time period, acquired by a brightness meter, and a building illumination image at the current time point, acquired by a camera; S120, arranging the outdoor brightness values and the indoor brightness values of the plurality of preset time points into an outdoor brightness input vector and an indoor brightness input vector, respectively, according to the time dimension; S130, calculating a transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector; S140, passing the transfer brightness matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector; S150, passing the building illumination image through a second convolutional neural network model including a depth feature fusion module to obtain a building illumination feature matrix; S160, calculating a transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as a classification feature vector; and S170, passing the classification feature vector through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the overall lighting effect of the building at the current time point needs to be adjusted.
In one example, in the intelligent building monitoring method, calculating the transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector includes: calculating the transfer brightness matrix with the following formula:

V1 = M ⊗ V2

where V1 represents the indoor brightness input vector, V2 represents the outdoor brightness input vector, M represents the transfer brightness matrix, and ⊗ represents matrix multiplication.
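The relation above only fixes the product form V1 = M ⊗ V2; with a single pair of brightness vectors, one concrete way to obtain such a matrix is the minimum-norm least-squares solution. The sketch below illustrates that choice (the function name and sample lux readings are hypothetical, not the patent's prescribed solver):

```python
import numpy as np

def transfer_luminance_matrix(v_indoor, v_outdoor):
    """Solve V_in = M @ V_out for M in the least-squares sense.

    The patent only states the product form; with single vectors the
    minimum-norm solution is the outer product with the pseudoinverse
    of V_out (an assumption, since M is not uniquely determined).
    """
    v_in = np.asarray(v_indoor, dtype=float).reshape(-1, 1)
    v_out = np.asarray(v_outdoor, dtype=float).reshape(-1, 1)
    # Minimum-norm M such that M @ v_out == v_in: M = v_in @ pinv(v_out)
    return v_in @ np.linalg.pinv(v_out)

v_out = np.array([120.0, 150.0, 180.0, 90.0])   # outdoor lux readings (hypothetical)
v_in = np.array([300.0, 310.0, 305.0, 295.0])   # indoor lux readings (hypothetical)
M = transfer_luminance_matrix(v_in, v_out)
```

Because the pseudoinverse of a nonzero column vector satisfies pinv(v) @ v = 1, this M reproduces the indoor vector exactly from the outdoor vector.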
In one example, in the intelligent building monitoring method, passing the transfer brightness matrix through a first convolutional neural network model serving as a filter to obtain a relative brightness feature vector includes: performing, in the forward pass of each layer of the first convolutional neural network model, respectively: convolution processing on the input data to obtain a convolution feature map; mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; and nonlinear activation on the pooled feature map to obtain an activated feature map; wherein the output of the last layer of the first convolutional neural network model is the relative brightness feature vector, and the input of the first layer of the first convolutional neural network model is the transfer brightness matrix.
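A minimal single-layer sketch of the per-layer forward pass described above (convolution, local mean pooling, nonlinear activation), using a naive NumPy implementation with a hypothetical 3×3 averaging kernel standing in for a learned filter:

```python
import numpy as np

def conv2d_valid(x, kernel):
    """Naive 'valid' 2-D convolution (cross-correlation) for the sketch."""
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def mean_pool(x, size=2):
    """Mean pooling over non-overlapping local windows."""
    h, w = x.shape[0] // size, x.shape[1] // size
    return x[:h * size, :w * size].reshape(h, size, w, size).mean(axis=(1, 3))

def layer_forward(x, kernel):
    """One layer: convolution -> local mean pooling -> ReLU activation."""
    return np.maximum(mean_pool(conv2d_valid(x, kernel)), 0.0)

rng = np.random.default_rng(0)
luminance_matrix = rng.normal(size=(8, 8))       # stand-in transfer brightness matrix
feat = layer_forward(luminance_matrix, np.ones((3, 3)) / 9.0)
```

A real filter network would stack several such layers with learned kernels and flatten the final map into the relative brightness feature vector.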
In an example, in the intelligent building monitoring method, passing the building illumination image through a second convolutional neural network model including a depth feature fusion module to obtain a building illumination feature matrix includes: extracting a shallow feature map from an M-th layer of the second convolutional neural network model, wherein M is greater than or equal to 1 and less than or equal to 6; extracting a deep feature map from an N-th layer of the second convolutional neural network model, wherein N/M is greater than or equal to 5 and less than or equal to 10; fusing the shallow feature map and the deep feature map by the depth feature fusion module of the second convolutional neural network model to obtain a fused feature map; and globally pooling the fused feature map along the channel dimension to obtain the building illumination feature matrix.
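The shallow/deep extraction can be sketched as tapping intermediate activations of a layer stack; the `forward_with_taps` helper and the toy tanh layers below are hypothetical stand-ins for the second convolutional neural network model:

```python
import numpy as np

def forward_with_taps(x, layers, m, n):
    """Run a layer stack, returning the activations after layer m
    (shallow feature map) and layer n (deep feature map).

    Layers are plain callables here; the patent's model is a CNN with
    1 <= m <= 6 and 5 <= n/m <= 10.
    """
    shallow = deep = None
    for i, layer in enumerate(layers, start=1):
        x = layer(x)
        if i == m:
            shallow = x
        if i == n:
            deep = x
    return shallow, deep

# Ten toy shape-preserving "layers" (hypothetical stand-ins).
layers = [lambda t, k=k: np.tanh(t + k) for k in range(10)]
shallow, deep = forward_with_taps(np.zeros((4, 4)), layers, m=2, n=10)
```

Here m=2 and n=10 satisfy the stated constraints (n/m = 5); fusion and channel pooling would follow.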
In an example, in the above intelligent building monitoring method, fusing the shallow feature map and the deep feature map by the depth feature fusion module to obtain a fused feature map includes: fusing the shallow feature map and the deep feature map with the following formula:

F = α·Fs ⊕ β·Fd

where F is the fused feature map, Fs is the shallow feature map, Fd is the deep feature map, "⊕" means that elements at corresponding positions of the shallow feature map and the deep feature map are added, and α and β are weighting parameters for controlling the balance between the shallow feature map and the deep feature map in the fused feature map.
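The fusion step is a weighted position-wise addition and can be transcribed directly; the weighting parameters α and β below are hypothetical values:

```python
import numpy as np

def fuse(shallow, deep, alpha=0.5, beta=0.5):
    """F = alpha * F_shallow (+) beta * F_deep, where (+) is
    position-wise addition; alpha/beta balance the two maps."""
    assert shallow.shape == deep.shape  # maps must align position-wise
    return alpha * shallow + beta * deep

f_s = np.full((2, 3), 2.0)   # toy shallow feature map
f_d = np.full((2, 3), 4.0)   # toy deep feature map
f = fuse(f_s, f_d, alpha=0.25, beta=0.75)
```

With these toy inputs every fused element is 0.25·2 + 0.75·4 = 3.5.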
In one example, in the above intelligent building monitoring method, calculating the transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as the classification feature vector includes: calculating the transfer vector with the following formula:

V1 = M ⊗ Vc

where V1 represents the relative brightness feature vector, M represents the building illumination feature matrix, Vc represents the classification feature vector, and ⊗ represents matrix multiplication.
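As with the brightness transfer matrix, the formula states only the product form; one way to recover the classification feature vector is a least-squares solve. The shapes and random data below are hypothetical:

```python
import numpy as np

def transfer_vector(v_rel, m_building):
    """Solve V_rel = M @ V_c for the classification vector V_c in the
    least-squares sense (an assumption; the patent gives only the
    product relation)."""
    return np.linalg.lstsq(m_building, v_rel, rcond=None)[0]

rng = np.random.default_rng(2)
M = rng.normal(size=(6, 4))        # toy building illumination feature matrix
v_c_true = rng.normal(size=4)      # ground-truth classification vector
v_rel = M @ v_c_true               # relative brightness feature vector
v_c = transfer_vector(v_rel, M)    # recovered classification vector
```

For a full-column-rank M the least-squares solve recovers the vector exactly.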
In an example, in the above intelligent building monitoring method, the step of passing the classification feature vector through a classifier to obtain a classification result, where the classification result is used to indicate whether the overall lighting effect of the building at the current time point needs to be adjusted, includes: performing full-connection coding on the classification feature vectors by using a full-connection layer of the classifier to obtain coded classification feature vectors; and inputting the coding classification feature vector into a Softmax classification function of the classifier to obtain the classification result.
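The full-connection coding plus Softmax step can be sketched as follows; the weight matrix, bias, and two-class layout (adjust / do not adjust) are hypothetical:

```python
import numpy as np

def classify(v, W, b):
    """Fully connected encoding followed by Softmax; returns class
    probabilities (e.g. index 1 meaning 'adjust the lighting')."""
    logits = W @ v + b                  # full-connection coding
    z = np.exp(logits - logits.max())   # numerically stable softmax
    return z / z.sum()

rng = np.random.default_rng(3)
W = rng.normal(size=(2, 8))   # 2 classes x 8 feature dims (hypothetical)
b = np.zeros(2)
probs = classify(rng.normal(size=8), W, b)
```

The classification result is then the arg-max class of `probs`.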
In one example, in the intelligent building monitoring method, training is further performed on the first convolutional neural network model serving as a filter, the second convolutional neural network model containing the depth feature fusion module, and the classifier.
Fig. 7 is a flowchart of training the first convolutional neural network model as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier in the intelligent building monitoring method according to an embodiment of the present application. As shown in fig. 7, the training the first convolutional neural network model as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier includes the steps of: s210, acquiring training data, wherein the training data comprises outdoor training brightness values, indoor training brightness values, building training illumination images at the current time point and a true value of whether the overall building illumination effect at the current time point needs to be adjusted or not at a plurality of preset time points in a preset time period; s220, arranging the outdoor training brightness values and the indoor training brightness values of the plurality of preset time points into an outdoor training brightness input vector and an indoor training brightness input vector according to the time dimension respectively; s230, calculating a training transfer brightness matrix of the indoor training brightness input vector relative to the outdoor training brightness input vector; s240, passing the training transfer brightness matrix through the first convolution neural network model serving as a filter to obtain a training relative brightness characteristic vector; s250, passing the training building illumination image through the second convolution neural network model comprising the depth feature fusion module to obtain a training building illumination feature matrix; s260, calculating a transfer vector of the training relative brightness feature vector relative to the training building illumination feature matrix as a training classification feature vector; s270, passing the training classification feature vector 
through the classifier to obtain a classification loss function value; and S280, training the first convolutional neural network model serving as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier based on the classification loss function value and by back-propagating in the direction of gradient descent, wherein in each round of iteration of the training, the training classification feature vector is iterated based on a vector-generalized Hilbert probability spatialization.
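The gradient-descent training round of S280 can be illustrated with a toy two-class softmax classifier trained on the cross-entropy classification loss; the `train_step` helper, learning rate, and data are hypothetical:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_step(v, label, W, lr=0.1):
    """One round: classify the training feature vector, compute the
    cross-entropy classification loss, and descend its gradient."""
    p = softmax(W @ v)
    loss = -np.log(p[label])
    grad_W = np.outer(p - np.eye(len(p))[label], v)  # d(loss)/dW
    return W - lr * grad_W, loss

rng = np.random.default_rng(4)
W = rng.normal(size=(2, 5)) * 0.1   # toy classifier weights
v = rng.normal(size=5)              # toy training classification feature vector
losses = []
for _ in range(50):                 # 50 training rounds
    W, loss = train_step(v, 1, W)
    losses.append(loss)
```

In the patent's scheme the feature vector itself would additionally be rectified each round by the vector-generalized Hilbert probability spatialization before classification.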
In one example, in the intelligent building monitoring method, in each round of iteration of the training, the training classification feature vector is iterated based on the vector-generalized Hilbert probability spatialization with the following formula:

vi' = exp(vi / ||V||2) / ⟨V, V⟩

where V is the training classification feature vector, ||V||2 represents the two-norm of the training classification feature vector, ⟨V, V⟩ represents the inner product of the training classification feature vector with itself, vi is the i-th feature value of the training classification feature vector V, exp(·) represents the exponential function with the natural constant e as base, and vi' is the i-th feature value of the optimized training classification feature vector.
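A sketch of the per-iteration rectification, transcribing the symbols the text lists (two-norm, self inner product, base-e exponential). The closed form used here is one plausible reading and should be treated as an assumption:

```python
import numpy as np

def hilbert_prob_spatialize(v):
    """One iteration of the vector-generalized Hilbert probability
    spatialization (assumed form): v_i' = exp(v_i / ||V||_2) / <V, V>."""
    norm2 = np.linalg.norm(v)   # ||V||_2, the two-norm of the vector
    inner = float(v @ v)        # <V, V>, inner product of V with itself
    return np.exp(v / norm2) / inner

v = np.array([1.0, -2.0, 0.5])          # toy training classification vector
v_opt = hilbert_prob_spatialize(v)      # optimized vector for the next round
```

Scaling by the two-norm before exponentiation keeps the rectified values bounded regardless of the vector's magnitude.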
In summary, the intelligent building monitoring method according to the embodiment of the application has been explained. It considers that the indoor brightness and the outdoor brightness of a building are correlated, and that uneven brightness across different places of the building affects the judgment of the building's overall lighting effect. Therefore, in the technical scheme of the application, the lighting effect of the whole building is detected based on the relative temporal change characteristics of the indoor and outdoor brightness information and the image characteristics of the building, so as to judge whether the lighting effect needs to be adjusted.

Claims (10)

1. An intelligent building monitoring system, comprising: the building monitoring unit is used for acquiring outdoor brightness values and indoor brightness values of a plurality of preset time points in a preset time period acquired by the brightness meter and building illumination images of the current time point acquired by the camera; a brightness structuring unit, configured to arrange the outdoor brightness values and the indoor brightness values of the plurality of predetermined time points into an outdoor brightness input vector and an indoor brightness input vector according to a time dimension, respectively; a relative brightness transfer unit for calculating a transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector; the relative brightness characteristic extraction unit is used for passing the transfer brightness matrix through a first convolution neural network model serving as a filter to obtain a relative brightness characteristic vector; the illumination image feature extraction unit is used for enabling the building illumination image to pass through a second convolution neural network model comprising a depth feature fusion module to obtain a building illumination feature matrix; the illumination relative brightness transfer unit is used for calculating transfer vectors of the relative brightness feature vectors relative to the building illumination feature matrix as classification feature vectors; and the monitoring result generating unit is used for passing the classification feature vector through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the overall lighting effect of the building at the current time point needs to be adjusted or not.
2. The intelligent building monitoring system of claim 1, wherein the relative brightness transfer unit is further configured to calculate the transfer brightness matrix of the indoor brightness input vector relative to the outdoor brightness input vector with the following formula: V1 = M ⊗ V2, where V1 represents the indoor brightness input vector, V2 represents the outdoor brightness input vector, M represents the transfer brightness matrix, and ⊗ represents matrix multiplication.
3. The intelligent building monitoring system of claim 2, wherein the relative brightness feature extraction unit comprises: each layer of the first convolutional neural network model is respectively carried out in forward transfer of the layer: carrying out convolution processing on input data to obtain a convolution characteristic diagram; carrying out mean pooling based on a local feature matrix on the convolution feature map to obtain a pooled feature map; non-linear activation is carried out on the pooled feature map so as to obtain an activated feature map; the output of the last layer of the first convolutional neural network model is the relative brightness characteristic vector, and the input of the first layer of the first convolutional neural network model is the transfer brightness matrix.
4. An intelligent building monitoring system according to claim 3, wherein the illumination image feature extraction unit comprises: a shallow feature extraction subunit, configured to extract a shallow feature map from an mth layer of the second convolutional neural network model, where M is greater than or equal to 1 and less than or equal to 6; a deep feature extraction subunit, configured to extract a deep feature map from an nth layer of the second convolutional neural network model, where N/M is greater than or equal to 5 and less than or equal to 10; the fusion subunit is used for fusing the shallow feature map and the deep feature map by using a deep and shallow feature fusion module of the second convolutional neural network model so as to obtain a fusion feature map; and a pooling subunit, configured to globally pooling the fusion feature map along a channel dimension to obtain the building illumination feature matrix.
5. The intelligent building monitoring system of claim 4, wherein the fusion subunit is further configured to fuse the shallow feature map and the deep feature map with the following formula: F = α·Fs ⊕ β·Fd, where F is the fused feature map, Fs is the shallow feature map, Fd is the deep feature map, "⊕" means that the elements at the corresponding positions of the shallow feature map and the deep feature map are added, and α and β are weighting parameters for controlling the balance between the shallow feature map and the deep feature map in the fused feature map.
6. The intelligent building monitoring system of claim 5, wherein the lighting relative brightness transfer unit is further configured to calculate the transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as the classification feature vector with the following formula: V1 = M ⊗ Vc, where V1 represents the relative brightness feature vector, M represents the building illumination feature matrix, Vc represents the classification feature vector, and ⊗ represents matrix multiplication.
7. The intelligent building monitoring system according to claim 6, wherein the monitoring result generating unit includes: the full-connection coding subunit is used for carrying out full-connection coding on the classification characteristic vectors by using a full-connection layer of the classifier so as to obtain coded classification characteristic vectors; and a classification result generation subunit, configured to input the encoded classification feature vector into a Softmax classification function of the classifier to obtain the classification result.
8. The intelligent building monitoring system of claim 7, further comprising a training module for training the first convolutional neural network model as a filter, the second convolutional neural network model comprising a deep-shallow feature fusion module, and the classifier; wherein, training module includes: the training data acquisition unit is used for acquiring training data, wherein the training data comprises outdoor training brightness values, indoor training brightness values and building training illumination images at the current time point in a preset time period and a true value of whether the overall illumination effect of the building at the current time point needs to be adjusted or not; the training brightness structuring unit is used for respectively arranging the outdoor training brightness values and the indoor training brightness values of the plurality of preset time points into an outdoor training brightness input vector and an indoor training brightness input vector according to the time dimension; the training relative brightness transfer unit is used for calculating a training transfer brightness matrix of the indoor training brightness input vector relative to the outdoor training brightness input vector; the training relative brightness characteristic extraction unit is used for passing the training transfer brightness matrix through the first convolution neural network model serving as a filter to obtain a training relative brightness characteristic vector; the training illumination image feature extraction unit is used for enabling the training building illumination image to pass through the second convolution neural network model comprising the depth feature fusion module to obtain a training building illumination feature matrix; the training illumination relative brightness transfer unit is used for calculating transfer vectors of the training relative brightness feature vectors relative to the training building illumination 
feature matrix, to serve as a training classification feature vector; a classification loss unit for passing the training classification feature vector through the classifier to obtain a classification loss function value; and a training unit for training the first convolutional neural network model as a filter, the second convolutional neural network model including the depth feature fusion module, and the classifier based on the classification loss function value and by back-propagating in the direction of gradient descent, wherein in each round of iteration of the training, the training classification feature vector is iterated based on a vector-generalized Hilbert probability spatialization.
9. The intelligent building monitoring system of claim 8, wherein in each round of iteration of the training, the training classification feature vector is iterated based on the vector-generalized Hilbert probability spatialization with the following formula: vi' = exp(vi / ||V||2) / ⟨V, V⟩, where V is the training classification feature vector, ||V||2 represents the two-norm of the training classification feature vector, ⟨V, V⟩ represents the inner product of the training classification feature vector with itself, vi is the i-th feature value of the training classification feature vector V, exp(·) represents the exponential function with the natural constant e as base, and vi' is the i-th feature value of the optimized training classification feature vector.
10. An intelligent building monitoring method is characterized by comprising the following steps: acquiring outdoor brightness values and indoor brightness values of a plurality of preset time points in a preset time period acquired by a brightness meter and building illumination images of the current time point acquired by a camera; arranging the outdoor brightness values and the indoor brightness values of the plurality of preset time points into an outdoor brightness input vector and an indoor brightness input vector according to a time dimension respectively; calculating a transition luminance matrix of the indoor luminance input vector relative to the outdoor luminance input vector; the transfer brightness matrix is passed through a first convolution neural network model serving as a filter to obtain a relative brightness characteristic vector; the building illumination image is passed through a second convolution neural network model comprising a depth feature fusion module to obtain a building illumination feature matrix; calculating a transfer vector of the relative brightness feature vector relative to the building illumination feature matrix as a classification feature vector; and the classification feature vector is passed through a classifier to obtain a classification result, wherein the classification result is used for indicating whether the overall lighting effect of the building at the current time point needs to be adjusted.
CN202310142757.3A 2023-02-21 2023-02-21 Intelligent building monitoring system and monitoring method thereof Pending CN116819991A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310142757.3A CN116819991A (en) 2023-02-21 2023-02-21 Intelligent building monitoring system and monitoring method thereof


Publications (1)

Publication Number Publication Date
CN116819991A true CN116819991A (en) 2023-09-29

Family

ID=88126413

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310142757.3A Pending CN116819991A (en) 2023-02-21 2023-02-21 Intelligent building monitoring system and monitoring method thereof

Country Status (1)

Country Link
CN (1) CN116819991A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160255701A1 (en) * 2014-09-10 2016-09-01 Boe Technology Group Co., Ltd. Method and device for adjusting indoor brightness and smart home control system
CN109874197A (en) * 2018-11-02 2019-06-11 中国计量大学 Commercial hotel guest room illumination control method based on scene automatic identification
US20200412932A1 (en) * 2018-03-05 2020-12-31 Omron Corporation Method, device, system and computer-program product for setting lighting condition and storage medium
CN113325762A (en) * 2021-05-25 2021-08-31 西安交通大学 Intelligent building personalized energy utilization control method, system, device and equipment
CN115376075A (en) * 2022-10-25 2022-11-22 中节能绿建环保科技有限公司 Fresh air energy-saving system of intelligent building and control method thereof


Similar Documents

Publication Publication Date Title
CN108764298B (en) Electric power image environment influence identification method based on single classifier
CN112149547B (en) Remote sensing image water body identification method based on image pyramid guidance and pixel pair matching
CN105787458A (en) Infrared behavior identification method based on adaptive fusion of artificial design feature and depth learning feature
CN111709410B (en) Behavior identification method for strong dynamic video
CN107316035A (en) Object identifying method and device based on deep learning neutral net
CN111461038A (en) Pedestrian re-identification method based on layered multi-mode attention mechanism
CN114863097B (en) Infrared dim target detection method based on attention mechanism convolutional neural network
CN111611861B (en) Image change detection method based on multi-scale feature association
CN114943963A (en) Remote sensing image cloud and cloud shadow segmentation method based on double-branch fusion network
CN110853057A (en) Aerial image segmentation method based on global and multi-scale full-convolution network
CN113361710B (en) Student model training method, picture processing device and electronic equipment
CN111178284A (en) Pedestrian re-identification method and system based on spatio-temporal union model of map data
CN109086803A (en) A kind of haze visibility detection system and method based on deep learning and the personalized factor
CN115853173A (en) Building curtain wall for construction and installation
CN112923523A (en) Intelligent fresh air system regulation and control method based on data link of Internet of things
CN103218829B (en) A kind of foreground extracting method being adapted to dynamic background
CN117237559A (en) Digital twin city-oriented three-dimensional model data intelligent analysis method and system
CN115527134A (en) Urban garden landscape lighting monitoring system and method based on big data
CN109685288B (en) Distributed traffic flow prediction method and system
CN109685823A (en) A kind of method for tracking target based on depth forest
CN111291663B (en) Method for quickly segmenting video target object by using space-time information
CN116819991A (en) Intelligent building monitoring system and monitoring method thereof
CN115497006B (en) Urban remote sensing image change depth monitoring method and system based on dynamic mixing strategy
CN115937662A (en) Intelligent household system control method and device, server and storage medium
CN114724245A (en) CSI-based incremental learning human body action identification method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination