CN117911907B - Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle - Google Patents


Info

Publication number
CN117911907B
CN117911907B (application CN202410311592.2A)
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
image
opening
curtain wall
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410311592.2A
Other languages
Chinese (zh)
Other versions
CN117911907A (en)
Inventor
李嘉琪
岳清瑞
张素梅
杨新聪
金楠
施钟淇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Urban Safety Development Science And Technology Research Institute Shenzhen
Shenzhen Graduate School Harbin Institute of Technology
Original Assignee
Urban Safety Development Science And Technology Research Institute Shenzhen
Shenzhen Graduate School Harbin Institute of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Urban Safety Development Science And Technology Research Institute Shenzhen, Shenzhen Graduate School Harbin Institute of Technology filed Critical Urban Safety Development Science And Technology Research Institute Shenzhen
Priority to CN202410311592.2A priority Critical patent/CN117911907B/en
Publication of CN117911907A publication Critical patent/CN117911907A/en
Application granted granted Critical
Publication of CN117911907B publication Critical patent/CN117911907B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Image Processing (AREA)

Abstract

The invention discloses a curtain wall opening and closing state detection method, an inspection system and a medium based on an unmanned aerial vehicle. The method comprises the following steps: determining a suitable inspection distance, flight parameters and data acquisition parameters for the unmanned aerial vehicle according to the outer elevation shape of the building to be detected and the surrounding flight environment information, and generating an unmanned aerial vehicle inspection route containing the data acquisition positions; when the unmanned aerial vehicle reaches a preset detection position, synchronously acquiring thermal infrared and visible light images according to the data acquisition parameters; processing the image data acquired by the unmanned aerial vehicle and determining the detection result of the opening and closing states of the opening fans of the building to be detected by using a fusion detection method; and determining the spatial position on the building curtain wall corresponding to each un-closed opening fan according to the data acquisition positions of the unmanned aerial vehicle and the opening and closing state detection results, so as to form the final detection result. The invention improves inspection efficiency and accuracy when the opening fans of curtain wall buildings are inspected before severe weather.

Description

Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle
Technical Field
The invention relates to the field of data processing, and in particular to a curtain wall opening and closing state detection method, an inspection system and a medium based on an unmanned aerial vehicle.
Background
The glass curtain wall is a common design element of modern high-rise buildings and is mainly used for exterior wall decoration and enclosure. The outer facade of a modern high-rise building is directly exposed to the natural environment, and under extreme weather conditions such as typhoons and rainstorms, the opening fans in the glass curtain wall can become a weak link of the overall structure and pose a major potential safety hazard.
When an opening fan of the glass curtain wall is not closed, strong wind or severe weather can seriously damage the connecting pieces of the glass curtain wall under wind load, and the opening fan may even fall off; besides property losses to the curtain wall and the interior of the building, this also threatens the safety of the surrounding environment and pedestrians.
In the related art, manual inspection is generally used to ensure that the opening fans of the curtain wall have been closed before bad weather arrives. However, for taller and larger-scale curtain wall buildings or building clusters, the manual inspection mode suffers from low inspection efficiency.
The foregoing is provided merely for the purpose of facilitating understanding of the technical solutions of the present invention and is not intended to represent an admission that the foregoing is prior art.
Disclosure of Invention
The invention mainly aims to provide a curtain wall opening and closing state detection method, an inspection system and a storage medium based on an unmanned aerial vehicle, which solve the problem in the prior art of low inspection efficiency when a glass curtain wall is inspected manually.
In order to achieve the above purpose, the invention provides a curtain wall opening and closing state detection method based on an unmanned aerial vehicle, which comprises the following steps:
Determining the inspection distance, the flight parameters and the data acquisition parameters of the unmanned aerial vehicle according to the shape and the size parameters of the outer vertical surface of the building to be detected and the information of the flight surrounding environment of the unmanned aerial vehicle, and generating an unmanned aerial vehicle inspection route containing the data acquisition position;
Controlling the unmanned aerial vehicle to fly along the unmanned aerial vehicle inspection route, and executing a data acquisition instruction by the unmanned aerial vehicle when the unmanned aerial vehicle reaches a preset data acquisition position of the inspection route, wherein the unmanned aerial vehicle synchronously acquires thermal infrared and visible light images according to data acquisition parameters of each camera;
Receiving and processing image detection data acquired by the unmanned aerial vehicle, and determining an opening and closing state detection result of the building to be detected according to the image detection data;
And determining the spatial position of the building curtain wall corresponding to the un-closed opening fan according to each data acquisition position of the unmanned aerial vehicle and the opening and closing state detection result, and forming a final detection result.
Optionally, the step of receiving and processing the image detection data collected by the unmanned aerial vehicle, and determining the on-off state detection result of the building to be detected according to the image detection data includes:
After receiving the image detection data acquired by the unmanned aerial vehicle, correcting and registering the image detection data, and based on the registered image, utilizing a pre-training detection model to identify the opening fan which is not closed, so as to obtain an infrared image detection result and a visible light image detection result corresponding to the image detection data;
And carrying out superposition fusion processing on the infrared detection result and the visible light detection result, and determining the opening and closing state detection result of the building to be detected from the fusion result after the superposition fusion processing.
Optionally, the step of performing superposition fusion processing on the infrared detection result and the visible light detection result, and determining the on-off state detection result of the building to be detected by using the fusion result after superposition fusion processing includes:
determining an infrared marking frame corresponding to the infrared detection result and a visible light marking frame corresponding to the visible light detection result;
Overlapping and fusing the infrared marking frame and the visible light marking frame to obtain a fused target marking frame, wherein the target marking frame is the fusion result;
and taking the target mark frame as a detection result of the opening and closing state of the building to be detected.
Optionally, after receiving the image detection data collected by the unmanned aerial vehicle, correcting and registering the image detection data, and based on the registered image, performing recognition of the non-closed open fan by using a pre-training detection model, so as to obtain an infrared image detection result and a visible light image detection result corresponding to the image detection data, where the steps include:
after receiving the image detection data acquired by the unmanned aerial vehicle, correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image;
After receiving the image detection data acquired by the unmanned aerial vehicle, correcting an infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters to obtain a corrected infrared image;
And identifying the corrected infrared image according to a pre-training infrared image detection model to obtain the infrared detection result, and identifying the corrected visible image according to a pre-training visible image detection model to obtain the visible light detection result.
Optionally, after the image detection data collected by the unmanned aerial vehicle is received, correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters, and before the step of obtaining the corrected visible light image, further includes:
calibrating a visible light camera to obtain the camera parameters;
The step of correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image comprises the following steps:
And recalculating the RGB channel numerical tensor corresponding to the visible light image according to the camera parameters to obtain a corrected image tensor, and storing the corrected image tensor as the corrected visible light image.
Optionally, after the image detection data collected by the unmanned aerial vehicle is received, according to a preset infrared image correction algorithm and infrared camera parameters, the correction processing is performed on the infrared image in the image detection data, and before the step of obtaining the corrected infrared image, the method further includes:
Calibrating an infrared camera or solving based on the corresponding relation between the characteristic points in the corrected visible light image and the corresponding characteristic points in the infrared image to obtain the infrared camera parameters;
After receiving the image detection data acquired by the unmanned aerial vehicle, correcting the infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters, wherein the step of obtaining the corrected infrared image comprises the following steps of:
And according to the infrared camera parameters, recalculating the temperature channel or the pseudo-color RGB channel numerical tensor of the infrared image to obtain a corrected second image tensor, and storing the corrected second image tensor as the corrected infrared image.
Optionally, before determining the spatial position of the building curtain wall corresponding to the un-closed opening fan according to each data acquisition position of the unmanned aerial vehicle and the opening and closing state detection result and forming the final detection result, the method further includes:
When the image detection data are received, the space position and the camera cradle head angle of the unmanned aerial vehicle are read from the image attribute information of the image detection data;
The step of determining the spatial position of the building curtain wall corresponding to the un-closed opening fan according to the data acquisition positions of the unmanned aerial vehicle and the opening and closing state detection results and forming a final detection result comprises the following steps:
and projecting the opening and closing state detection result of the opening and closing fan to the spatial position of the building curtain wall from the same coordinate plane of the registered image according to the spatial position of the building curtain wall, the camera pan-tilt angle and the opening and closing state detection result of the opening and closing fan, so as to obtain the final detection result.
Optionally, the size parameter includes a building height and a building width of the building to be tested, and the step of determining the inspection distance, the flight parameter and the data acquisition parameter of the unmanned aerial vehicle according to the outer elevation shape and the size parameter of the building to be tested and the information of the flight surrounding environment of the unmanned aerial vehicle, and generating the unmanned aerial vehicle inspection route including the data acquisition position includes:
determining a patrol flight area and a patrol distance of the unmanned aerial vehicle according to the building height, the building width, the outer elevation shape, the peripheral obstacles corresponding to the peripheral environment information and the shielding objects affecting flight;
Determining the data acquisition parameters according to the illumination condition, the camera view angle, the preset image acquisition distance, the camera zoom multiple and the adjacent image overlapping rate on the flight route when the unmanned aerial vehicle acquires the images in the inspection process;
And generating an unmanned aerial vehicle inspection route containing a data acquisition position according to the inspection flight area, the inspection distance, the data acquisition parameters and the flight parameters of the unmanned aerial vehicle.
In addition, in order to achieve the above purpose, the invention also provides a patrol system, which comprises a memory, a processor and a curtain wall opening and closing state detection program stored on the memory and capable of running on the processor, wherein the curtain wall opening and closing state detection program is executed by the processor to realize the steps of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle.
In addition, in order to achieve the above object, the present invention further provides a computer readable storage medium, on which a curtain wall opening and closing state detection program is stored, which when executed by a processor, implements the steps of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle as described above.
The embodiment of the invention provides a curtain wall opening and closing state detection method, an inspection system and a storage medium based on an unmanned aerial vehicle. The inspection system determines the inspection distance, flight parameters and data acquisition parameters of the unmanned aerial vehicle according to the outer elevation shape and size parameters of the building to be detected, the surrounding flight environment information of the unmanned aerial vehicle and the like, and then generates an unmanned aerial vehicle inspection route containing data acquisition positions. The unmanned aerial vehicle is then controlled to fly along this route, and when it reaches a preset data acquisition position on the inspection course it executes data acquisition, synchronously acquiring thermal infrared and visible light images according to the data acquisition parameters of each camera. After acquisition, the image data collected by the unmanned aerial vehicle are received and processed, the opening and closing state detection result of the building to be detected is determined from the image data, and the spatial position on the building curtain wall corresponding to each un-closed opening fan is determined from the acquisition positions and the opening and closing state detection results, forming the final detection result. On this basis, inspection is carried out by the unmanned aerial vehicle and the opening and closing states of the glass curtain wall of the building to be detected are determined from the captured infrared and visible light images, without manual inspection, which improves the inspection efficiency for the opening and closing states of the opening fans of the glass curtain wall.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention. In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments will be briefly described below, and it will be obvious to those skilled in the art that other drawings can be obtained from these drawings without inventive effort.
FIG. 1 is a schematic view of a patrol flow of a curtain wall opening and closing state detection method based on an unmanned aerial vehicle;
FIG. 2 is a schematic flow chart of a first embodiment of a method for detecting the opening and closing state of a curtain wall based on an unmanned aerial vehicle;
FIG. 3 is a schematic view of an alternative inspection route of the method for detecting the opening and closing states of the curtain wall based on the unmanned aerial vehicle;
FIG. 4 is a schematic view of another alternative inspection route of the method for detecting the opening and closing states of the curtain wall based on the unmanned aerial vehicle;
FIG. 5 is a schematic flow chart of a second embodiment of a method for detecting an open/close state of a curtain wall based on an unmanned aerial vehicle;
FIG. 6 is a schematic diagram of a thermal infrared image and visible light image correction registration result of a curtain wall opening and closing state detection method based on an unmanned aerial vehicle;
FIG. 7 is a schematic diagram of a recognition result of a visible light image detection model of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle;
FIG. 8 is a schematic diagram of an infrared image detection model recognition result of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle;
Fig. 9 is a schematic diagram of a terminal hardware structure of each embodiment of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In the related art, it is generally ensured by manual inspection that the opening fans of the glass curtain wall are closed before severe weather arrives. However, for taller and larger-scale curtain wall buildings, the inspection efficiency of the manual inspection mode is low.
In order to solve the above-mentioned defect, the embodiment of the invention provides a curtain wall opening and closing state detection method based on unmanned aerial vehicle, which mainly comprises the following steps:
Determining the inspection distance, the flight parameters and the data acquisition parameters of the unmanned aerial vehicle according to the shape and the size parameters of the outer vertical surface of the building to be detected and the information of the flight surrounding environment of the unmanned aerial vehicle, and generating an unmanned aerial vehicle inspection route containing the data acquisition position;
Controlling the unmanned aerial vehicle to fly along the unmanned aerial vehicle inspection route, and executing a data acquisition instruction by the unmanned aerial vehicle when the unmanned aerial vehicle reaches a preset data acquisition position of the inspection route, wherein the unmanned aerial vehicle synchronously acquires thermal infrared and visible light images according to data acquisition parameters of each camera;
Receiving and processing image detection data acquired by the unmanned aerial vehicle, and determining an opening and closing state detection result of the building to be detected according to the image detection data;
And determining the spatial position of the building curtain wall corresponding to the un-closed opening fan according to each data acquisition position of the unmanned aerial vehicle and the opening and closing state detection result, and forming a final detection result.
According to the invention, the inspection system controls the unmanned aerial vehicle to capture images, the opening and closing states of the glass curtain wall of the building to be detected are determined by combining the captured infrared and visible light images, and the inspection efficiency for the opening fans of the glass curtain wall is improved.
In order to better understand the above technical solution, exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Example 1
In this embodiment, the glass curtain wall serves as the exterior wall decoration of a building. If an opening fan is left unclosed, the connecting pieces of the glass curtain wall may be damaged when strong wind or bad weather is encountered, and the opening fan may even fall off, threatening the safety of pedestrians and the surrounding environment below the building. In this embodiment, the inspection system controls the unmanned aerial vehicle to carry out the inspection, and the images collected by the unmanned aerial vehicle are analyzed and processed to obtain the opening and closing state detection result of the building to be detected.
The opening and closing state detection flow for the opening fans can be as shown in fig. 1. First, visible light and infrared images of the building curtain wall are collected by the unmanned aerial vehicle. The images are then corrected and registered to obtain registered visible light and infrared images. Next, the corrected and registered visible light and infrared images are read separately and analyzed by a pre-trained visible light image detection model and a pre-trained infrared image detection model, respectively, yielding an opening-fan opening and closing state detection result based on the visible light image and one based on the infrared image. Finally, the two detection results are fused to obtain an accurate detection result for the opening fans together with the corresponding position information.
The following is a further explanation in connection with the implementation steps shown in fig. 2:
Step S10, determining the inspection distance, the flight parameters and the data acquisition parameters of the unmanned aerial vehicle according to the shape and the size parameters of the outer vertical surface of the building to be detected and the information of the flight surrounding environment of the unmanned aerial vehicle, and generating an unmanned aerial vehicle inspection route containing the data acquisition position;
In this embodiment, the shape and size parameters of the facade include the facade direction of the building to be tested, the height and width of the facade of the building, etc. The information of the surrounding environment of the unmanned aerial vehicle comprises the current flyable area, the flyable height, obstacles needing to be avoided in the flying process and the like. The information of the building to be tested can be uploaded to a database, and query processing is carried out based on the building information stored in the database, so that the outer facade shape and the dimension parameters of the building to be tested are obtained.
During image acquisition, if the inspection distance of the unmanned aerial vehicle is short, a single inspection task takes a long time to complete; if the inspection distance is long, the opening-fan targets to be identified in the captured images are small and the detection accuracy of the image detection models decreases. Therefore, the inspection system needs to determine the flight area, the inspection shooting distance and each data acquisition parameter of the unmanned aerial vehicle according to the outer facade shape and size parameters of the building to be detected and the surrounding flight environment, and then generate an unmanned aerial vehicle inspection route that contains each data acquisition position and covers the area to be inspected, according to the required inspection area, the inspection distance, the data acquisition parameters and so on. A suitable inspection route avoids omissions or excessive overlap in the images acquired by the unmanned aerial vehicle.
Specifically, in an optional implementation, when determining the inspection route, the size parameters include the building height and building width of the building to be detected. The inspection flight area and the inspection distance of the unmanned aerial vehicle can be determined according to the building height, the building width, the outer elevation shape, the surrounding obstacles corresponding to the surrounding environment information and the occluding objects affecting flight; the data acquisition parameters are then determined according to the illumination conditions, the camera field of view, the preset image acquisition distance, the camera zoom factor and the overlap rate of adjacent images on the flight route during image acquisition; finally, the unmanned aerial vehicle inspection route containing the data acquisition positions is generated according to the inspection flight area, the area to be inspected, the inspection distance, the data acquisition parameters and the flight parameters of the unmanned aerial vehicle. On this basis, the inspection efficiency of the unmanned aerial vehicle is improved.
For example, referring to fig. 3, the square mark at the lower left corner of the building to be detected indicates the image footprint corresponding to the shooting area of the unmanned aerial vehicle at the chosen inspection distance. When shooting with this footprint and the corresponding flight speed, the unmanned aerial vehicle inspection route shown in fig. 3, which covers the glass curtain wall area of the whole building to be detected, needs to be generated. When the inspection distance is smaller, that is, the captured area is smaller, the unmanned aerial vehicle inspection route shown in fig. 4 can be generated. It should be noted that each data acquisition position is marked in the inspection routes shown in fig. 3 and fig. 4, so that the inspection system can accurately issue flight control instructions to the unmanned aerial vehicle, improving the inspection accuracy of the unmanned aerial vehicle.
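To illustrate how such a route could be laid out, the following Python sketch builds a serpentine grid of data acquisition positions covering a rectangular facade from the inspection (stand-off) distance, the camera field of view and the adjacent-image overlap rate. It only illustrates the idea described above; the function name, parameters and the serpentine ordering are assumptions, not the patent's algorithm.

```python
import math

def plan_waypoints(facade_w, facade_h, stand_off, h_fov_deg, v_fov_deg, overlap=0.2):
    # Image footprint on the facade at the chosen inspection distance.
    foot_w = 2 * stand_off * math.tan(math.radians(h_fov_deg) / 2)
    foot_h = 2 * stand_off * math.tan(math.radians(v_fov_deg) / 2)
    # Spacing between acquisition positions so that adjacent images overlap.
    step_x = foot_w * (1 - overlap)
    step_y = foot_h * (1 - overlap)

    def centres(length, foot, step):
        c, out = foot / 2, []
        while c < length - foot / 2:
            out.append(c)
            c += step
        out.append(length - foot / 2)    # last position flush with the facade edge
        return out

    cols = centres(facade_w, foot_w, step_x)
    rows = centres(facade_h, foot_h, step_y)

    waypoints = []
    for i, y in enumerate(rows):          # serpentine ("lawn-mower") ordering
        line = cols if i % 2 == 0 else list(reversed(cols))
        waypoints.extend((x, y, stand_off) for x in line)
    return waypoints

# e.g. a 40 m wide, 60 m tall facade, 15 m stand-off, 60 x 45 degree camera FOV:
print(len(plan_waypoints(40, 60, 15, 60, 45)))
```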
Step S20, controlling the unmanned aerial vehicle to fly along the unmanned aerial vehicle inspection route, and executing a data acquisition instruction by the unmanned aerial vehicle when the unmanned aerial vehicle reaches a preset data acquisition position of the inspection route, wherein the unmanned aerial vehicle synchronously acquires thermal infrared and visible light images according to data acquisition parameters of each camera;
In this embodiment, after the inspection route of the unmanned aerial vehicle is determined, the inspection system can issue corresponding flight control instructions to the unmanned aerial vehicle based on each data acquisition position on the route, so as to control it to fly along the inspection route. When the current position fed back by the unmanned aerial vehicle coincides with the preset detection position shown in fig. 2, the unmanned aerial vehicle has reached the preset data acquisition position and starts to execute the data acquisition instruction to photograph the glass curtain wall of the building to be detected. During shooting, the unmanned aerial vehicle controls the infrared camera and the optical camera to acquire data synchronously at a preset working frequency, obtaining the corresponding infrared and visible light images of the building to be detected, so that the inspection system can detect the opening and closing states of the glass curtain wall from both the infrared and the visible light images, improving detection efficiency and detection accuracy at the same time.
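As a minimal sketch of this acquisition step, the loop below flies to each data acquisition position and triggers the thermal-infrared and visible-light cameras together once the position feedback matches the waypoint. The `drone` object, its methods and the tolerance value are hypothetical placeholders, not a real flight-control SDK.

```python
import math
import time

WAYPOINT_TOLERANCE_M = 0.5            # assumed positioning tolerance

def run_mission(drone, waypoints):
    """Hypothetical mission loop: fly to each data acquisition position and
    trigger the thermal-infrared and visible-light cameras at the same instant."""
    for wp in waypoints:
        drone.goto(wp)                                    # placeholder flight command
        while math.dist(drone.position(), wp) > WAYPOINT_TOLERANCE_M:
            time.sleep(0.2)                               # wait for arrival feedback
        drone.trigger_cameras(["thermal", "visible"])     # synchronous acquisition
```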
In an optional implementation of controlling the cameras for synchronous data acquisition, besides acquiring data synchronously with both the infrared camera and the optical camera, if the unmanned aerial vehicle performing the inspection task carries only one of the two cameras, it can acquire data with that single camera. This reduces the requirements on the hardware of the unmanned aerial vehicle, so that most lower-cost unmanned aerial vehicles can be used for inspection, further reducing the inspection cost. On this basis, at least one additional unmanned aerial vehicle carrying a single type of camera can be added to shoot synchronously; at a similar cost, acquiring data with several single-camera unmanned aerial vehicles improves the accuracy of the opening and closing state inspection.
In another alternative implementation manner for controlling the cameras to perform synchronous data acquisition, at least one other unmanned aerial vehicle with the same performance and hardware configuration can be controlled to perform secondary data acquisition, so that the inspection system can analyze and compare results of data acquired by a plurality of unmanned aerial vehicles, and further accuracy of detection of an on-off state is improved.
Step S30, receiving and processing image detection data acquired by the unmanned aerial vehicle, and determining an opening and closing state detection result of the building to be detected according to the image detection data;
In this embodiment, the inspection system receives and processes the infrared image and the visible light image of the building to be detected acquired by the unmanned aerial vehicle, and analyses and compares the two different images to obtain a more accurate opening and closing state identification result, improving the inspection accuracy. The processing includes correcting and registering the infrared image, obtaining the relevant labeling information, and fusing the infrared image with the visible light image.
As an optional implementation, the detection data collected by the unmanned aerial vehicle include an infrared image from the infrared camera and a visible light image from the optical camera. During data acquisition the infrared camera and the visible light camera shoot synchronously, but because the camera performance and component parameters of the optical camera and the infrared camera differ, the pixel dimensions of the two pictures and the positions of the picture content do not coincide, and the two types of images must be corrected and registered before comparison. During the comparison, the detection system identifies the positions of the opening fans in the open state in the infrared image and in the visible light image respectively, and then compares the detection results of the two pictures; when the two detection results are the same, the positions of the open opening fans in the infrared image are directly taken as the opening and closing state detection result for the current data acquisition position. On this basis, shooting with the unmanned aerial vehicle and two different types of cameras improves both the inspection efficiency and the inspection accuracy for the glass curtain wall of the building to be detected.
In another optional implementation of determining the opening and closing state detection result, if the detection results of the two pictures at the same acquisition position differ, the difference set of the two results is determined first. If the difference set is a subset of the infrared image detection result, the positions of the open opening fans in the infrared image are directly taken as the opening and closing state detection result for the current data acquisition position. Otherwise, an inspection instruction must be issued to the unmanned aerial vehicle again to perform a second data acquisition at the problematic waypoints, avoiding errors in the detection result. In determining the opening and closing state detection result from the detection data, infrared detection is based on the temperature differences between areas in the picture: a window in the open state has a lower or higher temperature than an unopened window, and in most cases the result identified through temperature sensing is more accurate than the result obtained by analysing the visible light image. Therefore, when an opening fan position detected in the visible light image is not identified in the infrared image, the current detection result may be erroneous and the position needs to be detected again.
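The comparison rule described above can be sketched as a small set operation. Here each detection result is represented as a set of facade cells in which an un-closed opening fan was reported; this representation and the function name are illustrative assumptions.

```python
def decide(ir_hits, vis_hits):
    """Illustration of the decision rule: ir_hits / vis_hits are sets of facade
    positions (e.g. (row, column) curtain-wall cells) where each model reports
    an un-closed opening fan."""
    if ir_hits == vis_hits:
        return "accept", ir_hits      # both models agree: keep the result
    diff = ir_hits ^ vis_hits         # symmetric difference of the two results
    if diff <= ir_hits:               # disagreements only where infrared detected more
        return "accept", ir_hits      # trust the thermal result
    return "recollect", None          # visible-only detections: re-fly this waypoint

# Infrared finds an extra open sash the visible model missed -> accept infrared result.
print(decide({(3, 2), (5, 1)}, {(3, 2)}))        # ('accept', {(3, 2), (5, 1)})
# Visible model finds a sash infrared did not -> trigger a second acquisition.
print(decide({(3, 2)}, {(3, 2), (7, 4)}))        # ('recollect', None)
```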
And S40, determining the spatial position of the building curtain wall corresponding to the un-closed opening fan according to the data acquisition positions of the unmanned aerial vehicle and the opening and closing state detection result, and forming a final detection result.
In this embodiment, the on-off state detection result of each data acquisition position needs to be calculated separately, and after the on-off state detection results of all data acquisition positions are calculated, the on-off state detection results can be summarized to obtain the spatial positions of all the un-closed opening fans in the building to be detected.
As an optional implementation, when the image detection data are received, the spatial position and camera pan-tilt angle of the unmanned aerial vehicle are first acquired by reading them from the image attribute information of the image detection data. Then, according to the spatial position of the unmanned aerial vehicle, the camera pan-tilt angle and the opening and closing state detection result of the opening fans, the detection result is projected from the common coordinate plane of the registered images onto the spatial position of the opening fans on the building curtain wall, so as to obtain the final detection result.
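As a simplified illustration of this projection step, the sketch below maps the centre of a marking frame from the registered image plane onto a vertical facade using the pinhole model, assuming the camera faces the facade head-on (gimbal pitch and yaw of zero). A real implementation would also account for the actual gimbal angles; all names and numbers here are assumptions.

```python
import math

def pixel_to_facade(u, v, img_w, img_h, h_fov_deg, stand_off, drone_x, drone_z):
    """Map a pixel (u, v) of the registered image to a position on the facade.
    drone_x is the drone's horizontal position along the facade, drone_z its height,
    stand_off its distance from the facade; camera assumed perpendicular to the wall."""
    # Focal length in pixels derived from the horizontal field of view.
    f_px = (img_w / 2) / math.tan(math.radians(h_fov_deg) / 2)
    # Offset of the box centre from the image centre, converted to metres on the facade.
    dx = (u - img_w / 2) * stand_off / f_px
    dz = (img_h / 2 - v) * stand_off / f_px    # image v axis grows downwards
    return drone_x + dx, drone_z + dz          # (horizontal position, height) on the facade

# Box centred at pixel (2400, 900) in a 4000 x 3000 image, drone at x = 12 m, 25 m up,
# 15 m from the facade, 60 degree horizontal FOV:
print(pixel_to_facade(2400, 900, 4000, 3000, 60, 15, 12, 25))
```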
It should be noted that the final detection result may be expressed as coordinates, or as floor-related information such as the n-th position from the left on the N-th floor; in either form, the detection result specifically represents the position information of the opening fans in the open state.
In the technical scheme disclosed in this embodiment, the inspection system determines the inspection distance, flight parameters and data acquisition parameters for the unmanned aerial vehicle inspection based on the outer elevation shape and size parameters of the building to be detected, the surrounding environment information during flight and so on, and generates the inspection route based on these parameters. During inspection, after the unmanned aerial vehicle is detected to have reached a preset detection position, the infrared and visible light image detection data acquired by the unmanned aerial vehicle are received, and the two different pictures are analysed to obtain a more accurate opening and closing state detection result for the building to be detected. Finally, based on all the data acquisition positions and the opening and closing state detection results, the spatial position information of the building curtain wall corresponding to each un-closed opening fan is obtained as the final detection result. Inspecting the glass curtain wall with an unmanned aerial vehicle that combines visible light detection and infrared detection improves both the inspection accuracy and the inspection efficiency.
Referring to fig. 5, in the second embodiment, based on the first embodiment, step S30 specifically includes:
step S31, after receiving the detection data acquired by the unmanned aerial vehicle, carrying out correction and registration processing on the image detection data, and carrying out recognition of the non-closed opening fan by utilizing a pre-training detection model based on the registered image to obtain an infrared image detection result and a visible light image detection result corresponding to the image detection data;
In this embodiment, both the visible light detection result and the infrared detection result can be obtained through corresponding image detection models. Because the imaging principles, camera performance and component parameters of the infrared camera and the ordinary optical camera differ, the positions in the infrared image and in the visible light image of the detection data deviate from each other. To prevent this deviation from making it impossible to associate detections of the same opening fan when the two different pictures are analysed and compared, which would lower the accuracy of the inspection result, the infrared image in the detection data needs to be corrected. After the infrared image is corrected, the corrected infrared image and the visible light image can be identified by the image detection models, yielding the infrared detection result and the visible light detection result.
Specifically, the infrared detection result is obtained by the inspection system marking, in the corresponding infrared image data, the positions of the opening fans in the open state, and the visible light detection result is obtained in the same way from the visible light image. The infrared detection result contains infrared marking frames, and the visible light detection result contains visible light marking frames. Specifically, after the detection data are received, the two images are corrected and registered by a preset correction algorithm so that the positions of the building to be detected in the visible light image correspond one-to-one with those in the infrared image; the corrected and registered images are then input into the detection models, and after the models mark the positions, the corresponding infrared detection result and visible light image detection result are obtained. The preset correction algorithm may include a camera-calibration correction method based on computer vision, a deep learning method and the like; the specific correction algorithm is not limited here.
For example, referring to the upper half of fig. 6, in the visible light image and the infrared image acquired by the two cameras of the unmanned aerial vehicle, the positions of the buildings to be detected in the images are not in a superposition state, and the proportions of the buildings to be detected in the two images are different, so that the two images need to be corrected by a preset correction algorithm, and two corrected and registered images shown in the lower half of fig. 6 are obtained. Based on the corrected and registered images, the inspection system can accurately process the two images, so that the accuracy in the inspection process is improved.
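One possible way to carry out the correction and registration shown in fig. 6 is to estimate a homography from point correspondences between the two modalities and warp the infrared image onto the visible-light image plane, for example with OpenCV. This is a generic registration sketch under the assumption that corresponding points are already available (picked manually or by a cross-modal matcher), not necessarily the correction algorithm used in the patent.

```python
import cv2
import numpy as np

def register_infrared_to_visible(ir_img, vis_img, ir_pts, vis_pts):
    """Warp the infrared image into the visible image's coordinate plane using a
    homography fitted to corresponding points (ir_pts[i] matches vis_pts[i])."""
    ir_pts = np.asarray(ir_pts, dtype=np.float32).reshape(-1, 1, 2)
    vis_pts = np.asarray(vis_pts, dtype=np.float32).reshape(-1, 1, 2)
    # Homography mapping infrared pixel coordinates onto the visible image plane.
    H, _ = cv2.findHomography(ir_pts, vis_pts, cv2.RANSAC, 3.0)
    h, w = vis_img.shape[:2]
    # Warp so that both images share one coordinate plane.
    ir_registered = cv2.warpPerspective(ir_img, H, (w, h))
    return ir_registered, H
```

After this warp, a given pixel position refers to the same point of the facade in both images, which is the precondition for the marking-frame fusion described below.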
And S32, carrying out superposition fusion processing on the infrared detection result and the visible light detection result, and determining the opening and closing state detection result of the building to be detected by the fusion result after superposition fusion processing.
As an optional implementation, in the fusion process the infrared marking frames corresponding to the infrared detection result and the visible light marking frames corresponding to the visible light detection result are determined first; the infrared and visible light marking frames are then superposed and fused to obtain the fused target marking frames, which constitute the fusion result; finally, the target marking frames are taken as the opening and closing state detection result of the building to be detected. It can be understood that the infrared detection result and the visible light detection result are obtained from the corrected and registered images, and in these images each opening fan position coincides, so the marks from the two images can be superposed directly and the combined set of marks taken as the opening and closing state detection result. Superposing the two sets of marking information allows detection with different cameras and improves the accuracy of the calculated opening and closing state detection result.
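A minimal sketch of this superposition fusion on the registered coordinate plane is given below: marking frames from the two results that overlap strongly are treated as the same opening fan and merged, and the remaining frames are kept. The IoU criterion and the threshold are assumptions introduced for illustration.

```python
def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = (a[2] - a[0]) * (a[3] - a[1]) + (b[2] - b[0]) * (b[3] - b[1]) - inter
    return inter / union if union else 0.0

def fuse_boxes(ir_boxes, vis_boxes, iou_thresh=0.5):
    """Fuse infrared and visible marking frames that lie on one registered plane."""
    fused = list(ir_boxes)
    for vb in vis_boxes:
        match = next((i for i, fb in enumerate(fused) if iou(fb, vb) >= iou_thresh), None)
        if match is None:
            fused.append(vb)                       # detection seen only in one modality
        else:
            fb = fused[match]                      # same opening fan: merge into one box
            fused[match] = (min(fb[0], vb[0]), min(fb[1], vb[1]),
                            max(fb[2], vb[2]), max(fb[3], vb[3]))
    return fused

# Two detections of the same sash plus one infrared-only detection:
print(fuse_boxes([(100, 200, 160, 260), (400, 80, 460, 140)],
                 [(104, 196, 158, 262)]))
```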
In another optional implementation, when fusing the infrared detection result and the visible light detection result, an infrared detection coordinate set corresponding to the infrared detection result on the building to be detected and a visible light detection coordinate set corresponding to the visible light detection result on the building to be detected may be obtained, where the two coordinate sets lie in the same coordinate plane; the union of the infrared detection coordinate set and the visible light detection coordinate set is then taken as the target detection coordinate set, which gives the opening fans of the building to be detected and their positions. Obtaining the coordinate information of the detection results improves the accuracy of the calculated opening and closing state detection result.
When the non-overlapping area between the two kinds of detection results exceeds a set threshold, the waypoint information needs to be recorded and sent to a worker's review interface for manual calibration.
In the technical scheme disclosed in the embodiment, in the process of determining the opening and closing state detection result of the building to be detected according to the detection data, after the infrared image and the visible light image in the detection data are corrected and matched, the infrared image and the visible light image of the detection data are identified through the image detection model to obtain the infrared detection result and the visible light detection result, then image fusion processing is performed according to the marking information or the coordinate information corresponding to the infrared detection result and the visible light detection result, finally the opening and closing state detection result is obtained, and the inspection accuracy is improved based on the opening and closing state detection result.
Based on the second embodiment, in the third embodiment, after receiving the image detection data collected by the unmanned aerial vehicle, step S31 performs correction and registration processing on the image detection data, and based on the registered image, performs recognition of the non-closed open fan by using a pre-training detection model, so as to obtain an infrared image detection result and a visible light image detection result corresponding to the image detection data, where the method specifically includes:
step S311, after receiving the image detection data acquired by the unmanned aerial vehicle, correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image;
Step S312, after receiving the image detection data collected by the unmanned aerial vehicle, correcting the infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters, so as to obtain a corrected infrared image.
In this embodiment, as an optional implementation manner, in the process of performing the correction registration of the images, two images need to be processed separately, and parameters of the corresponding cameras are required in addition to the correction algorithm. Therefore, before correcting and registering the visible light images, the visible light camera is also required to be calibrated to obtain corresponding camera parameters.
Specifically, the RGB channel numerical tensor corresponding to the visible light image may be recalculated according to the camera parameters to obtain a corrected image tensor, and the corrected image tensor is stored as the corrected visible light image.
Similarly, before the infrared image is corrected, the infrared camera needs to be calibrated to obtain the infrared camera parameters; alternatively, they can be solved from the correspondence between feature points in the corrected visible light image and the corresponding feature points in the infrared image. After the infrared camera parameters are obtained, the temperature channel or pseudo-colour RGB channel numerical tensor of the infrared image can be recalculated to obtain a corrected second image tensor, which is stored as the corrected infrared image. It should be noted that the position and size of the building to be detected are the same in the two corrected images, so at each acquisition position the two corrected images are the corrected and registered images.
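For the correction itself, recalculating the channel numerical tensor from calibrated camera parameters can be done, for example, with a standard lens-undistortion step. The sketch below uses OpenCV with made-up intrinsic parameters purely for illustration, and applies equally to the visible RGB tensor and to the infrared pseudo-colour or temperature-channel tensor.

```python
import cv2
import numpy as np

def undistort_image(img, camera_matrix, dist_coeffs):
    """Recalculate the image tensor with calibrated camera parameters by removing
    lens distortion; one possible realisation of the correction described above."""
    h, w = img.shape[:2]
    # Refined camera matrix that keeps all source pixels in the corrected frame.
    new_K, _ = cv2.getOptimalNewCameraMatrix(camera_matrix, dist_coeffs, (w, h), 1)
    return cv2.undistort(img, camera_matrix, dist_coeffs, None, new_K)

# Example with made-up intrinsics for a 4000 x 3000 visible-light frame.
K = np.array([[3500.0, 0.0, 2000.0],
              [0.0, 3500.0, 1500.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])        # k1, k2, p1, p2, k3
frame = np.zeros((3000, 4000, 3), dtype=np.uint8)     # stand-in for a captured image
corrected = undistort_image(frame, K, dist)
```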
Step S313, identifying the corrected infrared image according to a pre-training infrared image detection model to obtain the infrared detection result, and identifying the corrected visible image according to a pre-training visible image detection model to obtain the visible light detection result.
In this embodiment, before the two corrected images are identified by the image detection models, the corresponding models must first be generated by training. Therefore, before the corrected infrared image is identified by the pre-trained infrared image detection model and the corrected visible light image by the pre-trained visible light image detection model, a preset visible light image set and a preset infrared image set are acquired; an image detection model is trained on the preset visible light image set to obtain the pre-trained visible light image detection model, and an image detection model is trained on the preset infrared image set to obtain the pre-trained infrared image detection model. Building pre-trained image detection models to analyse the images in the detection data improves the accuracy of the opening and closing state inspection.
After the pre-trained model is obtained, the visible light image and the infrared image after correction registration are respectively analyzed and processed based on the pre-trained model, so that a detection result for calculating an on-off state detection result is obtained, and the accuracy of the on-off state detection result is improved.
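The patent does not name a particular detection architecture, so the following sketch uses an off-the-shelf two-class Faster R-CNN from torchvision as an illustrative stand-in for either pre-trained detection model; the weights file name and score threshold are assumptions. In practice one such model would be fine-tuned on the visible light image set and another on the infrared image set.

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn
from torchvision.transforms.functional import to_tensor

# Illustrative stand-in detector: class 0 = background, class 1 = un-closed opening fan.
model = fasterrcnn_resnet50_fpn(num_classes=2)
# model.load_state_dict(torch.load("open_fan_visible.pt"))  # hypothetical fine-tuned weights
model.eval()

def detect_open_fans(image, score_thresh=0.5):
    """Return marking frames (x1, y1, x2, y2) for opening fans judged not closed."""
    with torch.no_grad():
        pred = model([to_tensor(image)])[0]
    keep = pred["scores"] >= score_thresh
    return pred["boxes"][keep].tolist()

# boxes = detect_open_fans(registered_visible_image)   # frames in registered pixel coordinates
```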
For example, referring to fig. 7 and fig. 8, when the pre-trained detection models identify the visible light image and the infrared image, the opening fans in the open state can be marked; the marking information in fig. 7 is the visible light marking information, and the marking information in fig. 8 is the infrared marking information.
In the technical scheme disclosed in the embodiment, the RGB channel numerical tensors of the visible light image and the infrared image are recalculated through the camera parameters and the correction algorithm, so that the correction processing of the image is realized, the corrected visible light image and infrared image are identified through the trained image detection model, the accuracy of the detection result of the on-off state is improved, the data processing capacity is reduced, and the detection efficiency is improved.
Referring to fig. 9, fig. 9 is a schematic diagram of a terminal structure of a hardware running environment according to an embodiment of the present invention.
As shown in fig. 9, the terminal may include: a processor 1001, such as a central processing unit (CPU), a communication bus 1002, a network interface 1003 and a memory 1004, wherein the communication bus 1002 is used to enable connected communication between these components. The network interface 1003 may optionally include a standard wired interface or a wireless interface (e.g., a Wireless Fidelity (Wi-Fi) interface). The memory 1004 may be a high-speed Random Access Memory (RAM) or a stable Non-Volatile Memory (NVM), such as a disk memory. The memory 1004 may also optionally be a storage device separate from the processor 1001 described above.
It will be appreciated by those skilled in the art that the terminal structure shown in fig. 9 is not limiting of the terminal and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components.
As shown in fig. 9, the memory 1004, which is a computer storage medium, may include an operating system, a data storage module, a network communication module, and a curtain wall opening/closing state detection program.
In the terminal shown in fig. 9, the network interface 1003 is mainly used for connecting to a background server, and performing data communication with the background server; the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and perform the following operations:
Determining the inspection distance, the flight parameters and the data acquisition parameters of the unmanned aerial vehicle according to the shape and the size parameters of the outer vertical surface of the building to be detected and the information of the flight surrounding environment of the unmanned aerial vehicle, and generating an unmanned aerial vehicle inspection route containing the data acquisition position;
Controlling the unmanned aerial vehicle to fly along the unmanned aerial vehicle inspection route, and executing a data acquisition instruction by the unmanned aerial vehicle when the unmanned aerial vehicle reaches a preset data acquisition position of the inspection route, wherein the unmanned aerial vehicle synchronously acquires thermal infrared and visible light images according to data acquisition parameters of each camera;
Receiving and processing image detection data acquired by the unmanned aerial vehicle, and determining an opening and closing state detection result of the building to be detected according to the image detection data;
And determining the spatial position of the building curtain wall corresponding to the un-closed opening fan according to each data acquisition position of the unmanned aerial vehicle and the opening and closing state detection result, and forming a final detection result.
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
After receiving the image detection data acquired by the unmanned aerial vehicle, correcting and registering the image detection data, and based on the registered image, utilizing a pre-training detection model to identify the opening fan which is not closed, so as to obtain an infrared image detection result and a visible light image detection result corresponding to the image detection data;
And carrying out superposition fusion processing on the infrared detection result and the visible light detection result, and determining the opening and closing state detection result of the building to be detected from the fusion result after the superposition fusion processing.
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
determining an infrared marking frame corresponding to the infrared detection result and a visible light marking frame corresponding to the visible light detection result;
Overlapping and fusing the infrared marking frame and the visible light marking frame to obtain a fused target marking frame, wherein the target marking frame is the fusion result;
and taking the target mark frame as a detection result of the opening and closing state of the building to be detected.
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
after receiving the image detection data acquired by the unmanned aerial vehicle, correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image;
After receiving the image detection data acquired by the unmanned aerial vehicle, correcting an infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters to obtain a corrected infrared image;
And identifying the corrected infrared image according to a pre-trained infrared image detection model to obtain the infrared detection result, and identifying the corrected visible light image according to a pre-trained visible light image detection model to obtain the visible light detection result.
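The description does not name the network architecture behind the two pre-trained detection models. Assuming, purely for illustration, an Ultralytics-YOLO-style detector trained separately on corrected infrared and visible-light images of opening fans, inference could look like the Python sketch below; the weight file names, image file names and confidence threshold are hypothetical.

from ultralytics import YOLO  # assumed detector family; the patent does not specify one

# Hypothetical weight files for the two pre-trained detection models.
ir_model = YOLO("open_fan_infrared.pt")
vis_model = YOLO("open_fan_visible.pt")

def detect(model, image_path, conf=0.25):
    """Run one detector and return unclosed-opening-fan boxes as (x1, y1, x2, y2) tuples."""
    result = model(image_path, conf=conf)[0]
    return [tuple(map(float, box)) for box in result.boxes.xyxy.tolist()]

ir_boxes = detect(ir_model, "corrected_infrared.jpg")
vis_boxes = detect(vis_model, "corrected_visible.jpg")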
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
calibrating a visible light camera to obtain the camera parameters;
The step of correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image comprises the following steps:
And recalculating the RGB channel numerical tensor corresponding to the visible light image according to the camera parameters to obtain a corrected image tensor, and storing the corrected image tensor as the corrected visible light image.
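Recomputing the RGB channel tensor from the calibrated camera parameters amounts, in the usual pinhole formulation, to undistorting the visible-light image. A minimal Python/OpenCV sketch is given below; the intrinsic matrix and distortion coefficients are illustrative values standing in for the result of a checkerboard calibration, and the file names are placeholders.

import cv2
import numpy as np

# Illustrative intrinsics and distortion coefficients of the visible-light camera
# (in practice obtained with cv2.calibrateCamera on checkerboard images).
K = np.array([[2900.0, 0.0, 1920.0],
              [0.0, 2900.0, 1080.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.12, 0.05, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

rgb = cv2.imread("visible.jpg")                            # RGB channel numerical tensor
new_K, _ = cv2.getOptimalNewCameraMatrix(K, dist, rgb.shape[1::-1], 0)
corrected = cv2.undistort(rgb, K, dist, None, new_K)       # recomputed image tensor
cv2.imwrite("corrected_visible.jpg", corrected)            # stored as the corrected image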
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
Calibrating the infrared camera, or solving based on the correspondence between feature points in the corrected visible light image and the corresponding feature points in the infrared image, to obtain the infrared camera parameters;
The step of correcting the infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters after receiving the image detection data acquired by the unmanned aerial vehicle, to obtain the corrected infrared image, comprises the following steps:
And according to the infrared camera parameters, recalculating the temperature channel or the pseudo-color RGB channel numerical tensor of the infrared image to obtain a corrected second image tensor, and storing the corrected second image tensor as the corrected infrared image.
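When the infrared camera parameters are solved from feature-point correspondences rather than from a direct calibration, one simple realisation, shown only as a sketch, is to estimate a plane-to-plane homography from a few matched facade points and warp the pseudo-colour channels onto the visible-image plane. The matched coordinates below are illustrative, and the homography assumption holds for a roughly planar facade; this is not claimed to be the exact solver of the method.

import cv2
import numpy as np

# Pixel coordinates of the same facade corners in the infrared image and in the
# corrected visible-light image (illustrative, pre-matched points).
ir_pts  = np.array([[102, 88], [610, 95], [598, 470], [110, 462]], dtype=np.float32)
vis_pts = np.array([[400, 350], [2430, 380], [2380, 1880], [430, 1840]], dtype=np.float32)

H, _ = cv2.findHomography(ir_pts, vis_pts)               # with many matches, cv2.RANSAC helps

ir = cv2.imread("infrared_pseudocolor.png")              # pseudo-colour RGB channel tensor
corrected_ir = cv2.warpPerspective(ir, H, (3840, 2160))  # recomputed second image tensor
cv2.imwrite("corrected_infrared.png", corrected_ir)      # stored as the corrected infrared image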
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
When the image detection data is received, the spatial position of the unmanned aerial vehicle and the camera pan-tilt angle are read from the image attribute information of the image detection data;
The step of determining the spatial position of the building curtain wall corresponding to the unclosed opening fan according to the data acquisition positions of the unmanned aerial vehicle and the opening and closing state detection results and forming a final detection result comprises the following steps:
and projecting the opening and closing state detection result of the opening fan from the same coordinate plane of the registered image to the spatial position of the building curtain wall according to the spatial position of the unmanned aerial vehicle, the camera pan-tilt angle and the opening and closing state detection result of the opening fan, so as to obtain the final detection result.
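Geometrically, this projection turns a pixel detection in the registered image into a position on the facade using the drone position, the pan-tilt angle and the camera model. The Python sketch below shows a deliberately simplified case, assuming a vertical facade facing a level gimbal at a known stand-off distance and a pinhole field-of-view model; it illustrates the idea of the mapping only and is not the full projection used by the method.

import math

def position_on_facade(u, v, img_w, img_h, fov_h_deg, fov_v_deg,
                       standoff, drone_alt, drone_offset):
    """Map the centre pixel (u, v) of a detection to a facade position in metres.

    Assumes the gimbal points horizontally at a vertical facade `standoff` metres
    away, zero roll, and a pinhole camera; returns (offset along facade, height).
    """
    ang_h = math.radians(fov_h_deg) * (u / img_w - 0.5)   # ray angle, horizontal
    ang_v = math.radians(fov_v_deg) * (0.5 - v / img_h)   # ray angle, vertical
    x = drone_offset + standoff * math.tan(ang_h)         # along the curtain wall
    z = drone_alt + standoff * math.tan(ang_v)            # height above ground
    return x, z

# Detection centred at pixel (2100, 900) of a 3840x2160 image, drone 20 m from the
# facade, 45 m above ground, 30 m from the facade's left edge (illustrative values).
print(position_on_facade(2100, 900, 3840, 2160, 60.0, 40.0, 20.0, 45.0, 30.0))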
Further, the processor 1001 may call the curtain wall opening and closing state detection program stored in the memory 1004, and further perform the following operations:
Determining an inspection flight area and the inspection distance of the unmanned aerial vehicle according to the building height, the building width, the outer elevation shape, the surrounding obstacles indicated by the surrounding environment information and the shielding objects affecting flight;
Determining the data acquisition parameters according to the illumination conditions when the unmanned aerial vehicle acquires images during inspection, the camera view angle, the preset image acquisition distance, the camera zoom multiple and the overlap rate of adjacent images on the flight route;
And generating an unmanned aerial vehicle inspection route containing a data acquisition position according to the inspection flight area, the inspection distance, the data acquisition parameters and the flight parameters of the unmanned aerial vehicle.
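For a flat rectangular facade, the data acquisition positions can be derived directly from the image footprint at the chosen inspection distance and the required overlap between adjacent images. The Python sketch below builds a simple serpentine grid of acquisition positions under those assumptions; the geometry is illustrative only, and obstacle clearance, gimbal scheduling and non-planar elevations are ignored.

import math

def acquisition_grid(building_w, building_h, standoff, fov_h_deg, fov_v_deg, overlap=0.3):
    """Serpentine grid of data acquisition positions in facade coordinates.

    Each image covers 2*standoff*tan(fov/2) metres; adjacent images keep the given
    overlap ratio. Returns (offset along facade, height, stand-off distance) triples.
    """
    step_x = 2 * standoff * math.tan(math.radians(fov_h_deg) / 2) * (1 - overlap)
    step_z = 2 * standoff * math.tan(math.radians(fov_v_deg) / 2) * (1 - overlap)
    xs = [min(i * step_x, building_w) for i in range(int(math.ceil(building_w / step_x)) + 1)]
    zs = [min(j * step_z, building_h) for j in range(int(math.ceil(building_h / step_z)) + 1)]
    waypoints = []
    for j, z in enumerate(zs):                 # climb row by row
        row = xs if j % 2 == 0 else xs[::-1]   # alternate direction to shorten the route
        waypoints.extend((x, z, standoff) for x in row)
    return waypoints

# 60 m wide, 100 m high facade, 15 m inspection distance, 60 x 40 degree field of view.
print(len(acquisition_grid(60.0, 100.0, 15.0, 60.0, 40.0)))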
Furthermore, it will be appreciated by those of ordinary skill in the art that implementing all or part of the processes in the methods of the above embodiments may be accomplished by a computer program instructing the related hardware. The computer program comprises program instructions and may be stored in a storage medium, which is a computer-readable storage medium. The program instructions are executed by at least one processor in the control terminal to carry out the steps of the embodiments of the method described above.
Therefore, the present invention also provides a computer readable storage medium, where a curtain wall opening and closing state detection program is stored, where the curtain wall opening and closing state detection program, when executed by a processor, implements each step of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle described in the above embodiment.
It should be noted that, because the storage medium provided in the embodiments of the present application is used for implementing the method of the embodiments of the present application, a person skilled in the art can understand its specific structure and modifications based on the method described herein, so the description is not repeated. All storage media adopted by the method of the embodiments of the present application fall within the scope of protection of the present application.
It will be appreciated by those skilled in the art that embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each flowchart and/or block of the flowchart illustrations and/or block diagrams, and combinations of flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that in the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etcetera does not indicate any ordering; these words may be interpreted as names.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the invention; any equivalent structural or process transformation made using the contents of this specification and the drawings, whether applied directly or indirectly in other related technical fields, likewise falls within the scope of protection of the invention.

Claims (7)

1. A curtain wall opening and closing state detection method based on an unmanned aerial vehicle, characterized in that the method is applied to an inspection system and comprises the following steps:
Determining the inspection distance, flight parameters and data acquisition parameters of the unmanned aerial vehicle according to the outer elevation shape and dimension parameters of the building to be detected and the surrounding flight environment information of the unmanned aerial vehicle, and generating an unmanned aerial vehicle inspection route containing data acquisition positions;
Controlling the unmanned aerial vehicle to fly along the unmanned aerial vehicle inspection route; when the unmanned aerial vehicle reaches a preset data acquisition position on the inspection route, the unmanned aerial vehicle executes a data acquisition instruction and synchronously acquires thermal infrared and visible light images according to the data acquisition parameters of each camera;
after receiving the image detection data acquired by the unmanned aerial vehicle, correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image;
After receiving the image detection data acquired by the unmanned aerial vehicle, correcting an infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters to obtain a corrected infrared image;
Registering the corrected infrared image and the corrected visible light image;
Identifying the corrected infrared image according to a pre-trained infrared image detection model to obtain an infrared image detection result, and identifying the corrected visible light image according to a pre-trained visible light image detection model to obtain a visible light image detection result;
determining an infrared marking frame corresponding to the infrared image detection result and a visible light marking frame corresponding to the visible light image detection result;
Overlapping and fusing the infrared marking frame and the visible light marking frame to obtain a fused target marking frame, wherein the target marking frame is the fusion result;
taking the target marking frame as the detection result of the opening and closing state of the building to be detected;
And determining the spatial position of the building curtain wall corresponding to the unclosed opening fan according to each data acquisition position of the unmanned aerial vehicle and the opening and closing state detection result, and forming a final detection result.
2. The method for detecting the opening and closing state of the curtain wall based on the unmanned aerial vehicle according to claim 1, wherein after the image detection data acquired by the unmanned aerial vehicle is received, the method further comprises the following steps:
calibrating a visible light camera to obtain the camera parameters;
The step of correcting the visible light image in the image detection data according to a preset visible light image correction algorithm and camera parameters to obtain a corrected visible light image comprises the following steps:
And recalculating the RGB channel numerical tensor corresponding to the visible light image according to the camera parameters to obtain a corrected image tensor, and storing the corrected image tensor as the corrected visible light image.
3. The method for detecting the opening and closing state of the curtain wall based on the unmanned aerial vehicle according to claim 1, wherein after the image detection data acquired by the unmanned aerial vehicle is received, the method further comprises the following steps:
Calibrating the infrared camera, or solving based on the correspondence between feature points in the corrected visible light image and the corresponding feature points in the infrared image, to obtain the infrared camera parameters;
The step of correcting the infrared image in the image detection data according to a preset infrared image correction algorithm and infrared camera parameters after receiving the image detection data acquired by the unmanned aerial vehicle, to obtain the corrected infrared image, comprises the following steps:
And according to the infrared camera parameters, recalculating the temperature channel or the pseudo-color RGB channel numerical tensor of the infrared image to obtain a corrected second image tensor, and storing the corrected second image tensor as the corrected infrared image.
4. The method for detecting the opening and closing state of the curtain wall based on the unmanned aerial vehicle according to claim 1, wherein before determining the spatial position of the building curtain wall corresponding to the unclosed opening fan according to the data acquisition positions of the unmanned aerial vehicle and the opening and closing state detection results and forming the final detection result, the method further comprises:
Acquiring the spatial position of the unmanned aerial vehicle and the camera pan-tilt angle when the image detection data is received, wherein the spatial position of the unmanned aerial vehicle and the camera pan-tilt angle are read from the image attribute information of the image detection data;
The step of determining the spatial position of the building curtain wall corresponding to the unclosed opening fan according to the data acquisition positions of the unmanned aerial vehicle and the opening and closing state detection results and forming a final detection result comprises the following steps:
and projecting the opening and closing state detection result of the opening fan from the same coordinate plane of the registered image to the spatial position of the building curtain wall according to the spatial position of the unmanned aerial vehicle, the camera pan-tilt angle and the opening and closing state detection result of the opening fan, so as to obtain the final detection result.
5. The method for detecting the opening and closing state of the curtain wall based on the unmanned aerial vehicle according to claim 1, wherein the dimension parameters comprise the building height and the building width of the building to be detected, and the step of determining the inspection distance, the flight parameters and the data acquisition parameters of the unmanned aerial vehicle according to the outer elevation shape and the dimension parameters of the building to be detected and the surrounding flight environment information of the unmanned aerial vehicle and generating the unmanned aerial vehicle inspection route containing the data acquisition positions comprises the following steps:
Determining an inspection flight area and the inspection distance of the unmanned aerial vehicle according to the building height, the building width, the outer elevation shape, the surrounding obstacles indicated by the surrounding environment information and the shielding objects affecting flight;
Determining the data acquisition parameters according to the illumination conditions when the unmanned aerial vehicle acquires images during inspection, the camera view angle, the preset image acquisition distance, the camera zoom multiple and the overlap rate of adjacent images on the flight route;
And generating the unmanned aerial vehicle inspection route containing the data acquisition position according to the inspection flight area, the inspection distance, the data acquisition parameters and the flight parameters of the unmanned aerial vehicle.
6. An inspection system, characterized in that the inspection system comprises: a memory, a processor, and a curtain wall opening and closing state detection program stored in the memory and executable on the processor, wherein the curtain wall opening and closing state detection program, when executed by the processor, implements the steps of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle according to any one of claims 1 to 5.
7. A computer-readable storage medium, wherein a curtain wall opening and closing state detection program is stored on the computer-readable storage medium, and when the curtain wall opening and closing state detection program is executed by a processor, the steps of the curtain wall opening and closing state detection method based on the unmanned aerial vehicle according to any one of claims 1 to 5 are realized.
CN202410311592.2A 2024-03-19 2024-03-19 Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle Active CN117911907B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410311592.2A CN117911907B (en) 2024-03-19 2024-03-19 Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410311592.2A CN117911907B (en) 2024-03-19 2024-03-19 Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN117911907A CN117911907A (en) 2024-04-19
CN117911907B true CN117911907B (en) 2024-05-17

Family

ID=90689422

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410311592.2A Active CN117911907B (en) 2024-03-19 2024-03-19 Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN117911907B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006039973A (en) * 2004-07-28 2006-02-09 Toshiba Corp Intrusion detection method and intrusion detection device
CN206258943U (en) * 2016-12-12 2017-06-16 上海知鲤振动科技有限公司 One kind is based on infrared imaging and unmanned plane Super High glass curtain wall connecting node cruising inspection system
CN111144324A (en) * 2019-12-28 2020-05-12 西安因诺航空科技有限公司 System and method for analyzing and managing abnormal target of photovoltaic panel during unmanned aerial vehicle inspection
CN114326794A (en) * 2021-12-13 2022-04-12 广东省建设工程质量安全检测总站有限公司 Curtain wall defect identification method, control terminal, server and readable storage medium
WO2023273219A1 (en) * 2021-06-28 2023-01-05 中冶建筑研究总院(深圳)有限公司 Glass curtain wall open window open state detection method and apparatus, device, and medium
CN116297488A (en) * 2023-03-10 2023-06-23 深圳市城市公共安全技术研究院有限公司 Curtain wall monitoring method, device, system and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006039973A (en) * 2004-07-28 2006-02-09 Toshiba Corp Intrusion detection method and intrusion detection device
CN206258943U (en) * 2016-12-12 2017-06-16 上海知鲤振动科技有限公司 One kind is based on infrared imaging and unmanned plane Super High glass curtain wall connecting node cruising inspection system
CN111144324A (en) * 2019-12-28 2020-05-12 西安因诺航空科技有限公司 System and method for analyzing and managing abnormal target of photovoltaic panel during unmanned aerial vehicle inspection
WO2023273219A1 (en) * 2021-06-28 2023-01-05 中冶建筑研究总院(深圳)有限公司 Glass curtain wall open window open state detection method and apparatus, device, and medium
CN114326794A (en) * 2021-12-13 2022-04-12 广东省建设工程质量安全检测总站有限公司 Curtain wall defect identification method, control terminal, server and readable storage medium
CN116297488A (en) * 2023-03-10 2023-06-23 深圳市城市公共安全技术研究院有限公司 Curtain wall monitoring method, device, system and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Application and Prospects of Space-Air-Ground Integrated Monitoring Technology in Safety Management and Control of Existing Buildings; Zhou Yanbing et al.; Building Structure; 2022-12-31; Vol. 52 (No. S2); pp. 1893-1897 *

Also Published As

Publication number Publication date
CN117911907A (en) 2024-04-19

Similar Documents

Publication Publication Date Title
CN109977813B (en) Inspection robot target positioning method based on deep learning framework
CN109785337B (en) In-column mammal counting method based on example segmentation algorithm
CN108037770B (en) Unmanned aerial vehicle power transmission line inspection system and method based on artificial intelligence
CN108898047B (en) Pedestrian detection method and system based on blocking and shielding perception
Rudol et al. Human body detection and geolocalization for UAV search and rescue missions using color and thermal imagery
CN110142785A (en) A kind of crusing robot visual servo method based on target detection
CN111931565A (en) Photovoltaic power station UAV-based autonomous inspection and hot spot identification method and system
CN111582234B (en) Large-scale oil tea tree forest fruit intelligent detection and counting method based on UAV and deep learning
CN110490936B (en) Calibration method, device and equipment of vehicle camera and readable storage medium
CN110910341B (en) Method and device for detecting defects of rusted areas of power transmission line
Yang et al. A robotic system towards concrete structure spalling and crack database
CN110255318B (en) Method for detecting idle articles in elevator car based on image semantic segmentation
CN110287907A (en) A kind of method for checking object and device
CN114089330B (en) Indoor mobile robot glass detection and map updating method based on depth image restoration
CN110390261A (en) Object detection method, device, computer readable storage medium and electronic equipment
CN115937746A (en) Smoke and fire event monitoring method and device and storage medium
CN113378754B (en) Bare soil monitoring method for construction site
CN117911907B (en) Curtain wall opening and closing state detection method, inspection system and medium based on unmanned aerial vehicle
CN107886544A (en) IMAQ control method and device for vehicle calibration
CN112950543A (en) Bridge maintenance method and system, storage medium and intelligent terminal
CN108225735A (en) A kind of precision approach indicator flight check method of view-based access control model
CN112699748A (en) Human-vehicle distance estimation method based on YOLO and RGB image
CN115373416B (en) Intelligent inspection method for railway electric power through line
CN115019216B (en) Real-time ground object detection and positioning counting method, system and computer
CN116203976A (en) Indoor inspection method and device for transformer substation, unmanned aerial vehicle and storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant