CN115515077A - UAV-based dynamic generation method and system for WSN data acquisition track - Google Patents


Info

Publication number
CN115515077A
Authority
CN
China
Prior art keywords: monitored, area, data, monitoring, information
Prior art date
Legal status: Granted
Application number
CN202211465106.XA
Other languages
Chinese (zh)
Other versions
CN115515077B (en)
Inventor
王骥
陈宇歌
杨玉强
谢再秘
莫春梅
李依潼
刘雯景
Current Assignee
Guangdong Ocean University
Original Assignee
Guangdong Ocean University
Priority date
Filing date
Publication date
Application filed by Guangdong Ocean University
Priority to CN202211465106.XA
Publication of CN115515077A
Application granted
Publication of CN115515077B
Active legal status
Anticipated expiration of legal status

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02Services making use of location information
    • H04W4/029Location-based management or tracking services
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D21/00Measuring or testing not otherwise provided for
    • G01D21/02Measuring two or more variables by means not covered by a single other subclass
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04QSELECTING
    • H04Q9/00Arrangements in telecontrol or telemetry systems for selectively calling a substation from a main station, in which substation desired apparatus is selected for applying a control signal thereto or for obtaining measured values therefrom
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/38Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30Services specially adapted for particular environments, situations or purposes
    • H04W4/40Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W84/00Network topologies
    • H04W84/18Self-organising networks, e.g. ad-hoc networks or sensor networks
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00Reducing energy consumption in communication networks
    • Y02D30/70Reducing energy consumption in communication networks in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Alarm Systems (AREA)
  • Emergency Alarm Devices (AREA)

Abstract

The invention discloses a UAV-based method and system for dynamically generating WSN data acquisition trajectories. By mounting a variety of monitoring devices on an unmanned aerial vehicle carrier, the UAV identifies the areas to be monitored that lie in regional blind spots and constructs building information, security information and environmental data. This overcomes the problem, in conventional smart-community applications, that installing monitoring equipment on site cannot achieve blind-spot-free monitoring of the whole area. Using the UAV to collect data across the entire area and present it through dynamic monitoring, the scheme can perform omnidirectional, blind-spot-free intelligent monitoring of the whole target area, realize dynamic display of the target area's real-time data, provide omnidirectional, blind-spot-free real-time monitoring data for staff in the target area, and further provide a solution for smart communities and smart homes.

Description

UAV-based dynamic generation method and system for WSN data acquisition track
Technical Field
The invention relates to the technical field of big data processing, in particular to a WSN data acquisition track dynamic generation method and system based on UAVs.
Background
A UAV (unmanned aerial vehicle), commonly called a "drone", is an unmanned aircraft operated by radio remote control equipment and its own program control devices, or operated autonomously, either fully or intermittently, by an on-board computer. Unmanned aircraft are often better suited than manned aircraft to tasks that are "dull, dirty, or dangerous". UAVs can be classified into military and civilian applications. On the military side, UAVs are divided into reconnaissance aircraft and target drones; on the civilian side, "UAV + industry application" is where UAVs are genuinely needed. UAVs are applied in aerial photography, agricultural plant protection, miniature selfies, express delivery, disaster relief, wildlife observation, infectious disease monitoring, surveying and mapping, news reporting, power-line inspection, film and television shooting, romance creation and other fields, which has greatly expanded their use, and developed countries are also actively expanding industrial applications and developing UAV technology.
A WSN (Wireless Sensor Network) is a distributed sensing network whose end nodes are sensors that can perceive and inspect the outside world. Sensors in a WSN communicate wirelessly, so the network can be deployed flexibly, device positions can be changed at any time, and wired or wireless connections to the Internet are possible; it is a multi-hop ad hoc network formed through wireless communication.
With the development of technology, WSN applications have become increasingly widespread, particularly in artificial intelligence fields such as smart communities and smart homes, where cameras, infrared sensors, radar sensors and various other environment monitoring sensors are deployed in a community for security and environmental monitoring, realizing a series of functional systems and forming a wireless sensor network. In practical applications, however, because of factors such as sensor placement, the effective range of data transmission, and obstruction of data acquisition by objects in the field of view, current WSN applications in smart communities are severely limited: a wireless sensor network can only be formed over a local area, and omnidirectional, blind-spot-free intelligent monitoring of the whole area cannot be realized.
In recent years, with the development of UAV technology, unmanned aerial vehicles have been widely applied in many fields and are expected to break this deadlock. On the basis of UAV technology, this application explores the practical application of UAVs in smart communities and smart homes and develops a UAV-based strategy for dynamically generating WSN data acquisition trajectories, which can perform omnidirectional, blind-spot-free intelligent monitoring of the whole target area, realize dynamic display of the target area's real-time data, provide omnidirectional, blind-spot-free real-time monitoring data for staff in the target area, and further provide a solution for smart communities and smart homes.
Disclosure of Invention
The invention provides a UAV-based method and system for dynamically generating WSN data acquisition trajectories, which can perform omnidirectional, blind-spot-free intelligent monitoring of the whole target area, realize dynamic display of the target area's real-time data, provide omnidirectional, blind-spot-free real-time monitoring data for staff in the target area, and further provide a solution for smart communities and smart homes.
In order to solve the technical problem, an embodiment of the present invention provides a method for dynamically generating a WSN data acquisition trajectory based on a UAV, which is applied to an unmanned aerial vehicle carrier, wherein the unmanned aerial vehicle carrier is provided with a plurality of monitoring devices, and each monitoring device includes a camera, an infrared sensor, a radar sensor and an environment monitoring sensor; the method comprises the following steps:
acquiring an overall layout on a target area, and carrying out gridding processing on the overall layout to obtain a grid layout; the overall layout diagram is marked with a place for setting the monitoring equipment on site and a corresponding monitoring range;
dividing a monitored area and an area to be monitored in the grid layout according to a place where monitoring equipment is arranged on site and a corresponding monitoring range; the monitored area refers to a monitoring range area corresponding to the field-set monitoring equipment, and the area to be monitored refers to an area monitored by the unmanned aerial vehicle carrier;
receiving monitoring data of the unmanned aerial vehicle carrier in the area to be monitored, wherein the monitoring data comprises image data collected by a camera, infrared data collected by an infrared sensor, radar data collected by a radar sensor and environmental data collected by an environmental monitoring sensor;
generating building information of an area to be monitored as first dimension information according to the image data and the radar data;
generating security information of an area to be monitored as second dimension information according to the infrared data and the image data;
and generating dynamic monitoring information of the area to be monitored in real time according to the environment data, the first dimension information and the second dimension information.
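The six steps above can be read as a small processing pipeline. The following Python sketch is purely illustrative; every function name and data shape is a hypothetical stand-in, since the patent does not prescribe an implementation:

```python
def gridify(layout_size, cell):
    """Step 1: grid the overall layout into cell coordinates."""
    w, h = layout_size
    return [(x, y) for x in range(0, w, cell) for y in range(0, h, cell)]

def divide_areas(grid, covered):
    """Step 2: cells covered by on-site devices are 'monitored';
    the remainder are 'to be monitored' by the UAV carrier."""
    monitored = [c for c in grid if c in covered]
    to_monitor = [c for c in grid if c not in covered]
    return monitored, to_monitor

def building_info(image_data, radar_data):
    """Step 4: first dimension (building information)."""
    return {"plan": image_data, "space": radar_data}

def security_info(infrared_data, image_data):
    """Step 5: second dimension (security information)."""
    return {"motion": infrared_data, "scene": image_data}

def dynamic_info(env_data, first_dim, second_dim):
    """Step 6: fuse environment data with both dimensions."""
    return {"environment": env_data, "building": first_dim,
            "security": second_dim}

# Usage: a 2x2-cell toy layout with one cell covered by a fixed camera.
grid = gridify((20, 20), 10)
monitored, to_monitor = divide_areas(grid, covered={(0, 0)})
info = dynamic_info({"temp": 21.5},
                    building_info("img", "radar"),
                    security_info("ir", "img"))
print(len(grid), len(monitored), len(to_monitor))  # 4 1 3
```

Step 3 (receiving the four sensor streams) is represented here only by the arguments passed into steps 4-6.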
As a preferred scheme, the step of generating building information of an area to be monitored as first dimension information according to the image data and the radar data specifically includes:
identifying the building features in the image data to obtain building plane information in the area to be monitored;
extracting radar data corresponding to the building features to obtain building space information corresponding to each building feature;
constructing a space coordinate system in the area to be monitored, and generating three-dimensional information of building characteristics in the area to be monitored according to the building plane information and the building space information;
and adjusting the three-dimensional information of the building characteristics in the area to be monitored by taking the monitored area as a reference according to the monitored area and the area to be monitored which are divided in the grid layout chart to obtain the building information of the area to be monitored as first dimension information.
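As a sketch of the first-dimension construction above: plan footprints recognized from the image data are paired with heights extracted from the radar data to form three-dimensional building records. The feature extraction itself (computer vision, radar processing) is out of scope here, and all names and data shapes are hypothetical:

```python
def building_3d(plane_features, radar_heights):
    """Combine per-building plan footprints (from image data) with
    per-building heights (from radar data) into 3-D building records."""
    records = []
    for name, footprint in plane_features.items():
        records.append({
            "name": name,
            # footprint: (x, y) vertices in the space coordinate system
            "footprint": footprint,
            # default 0.0 when no radar return matched the feature
            "height": radar_heights.get(name, 0.0),
        })
    return records

plane = {"block_A": [(0, 0), (10, 0), (10, 6), (0, 6)]}
radar = {"block_A": 18.0}
print(building_3d(plane, radar)[0]["height"])  # 18.0
```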
As a preferred scheme, the step of adjusting three-dimensional information of building features in the area to be monitored by taking the monitored area as a reference according to the monitored area and the area to be monitored divided in the grid layout drawing specifically includes:
determining a region boundary between the monitored region and a region to be monitored according to a monitoring range region corresponding to monitoring equipment arranged in the monitored region;
identifying the region boundary in the image data, and determining a monitoring blind spot on the region boundary according to a preset rule;
calculating the data proportion of the monitoring blind point to the three-dimensional information corresponding to the characteristics of the buildings in the area to be monitored, and taking the data proportion as a first data relation value;
selecting a plurality of reference points in the monitored area, and determining the data proportion of the three-dimensional information corresponding to the monitoring blind points and the reference points as a second data relation value;
and correspondingly adjusting the three-dimensional information of the building characteristics in the area to be monitored according to the proportion between the first data relation value and the second data relation value.
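The adjustment above can be read as a calibration step: the monitored area, covered by calibrated on-site devices, supplies reference measurements, and the ratio between the two data relation values gives a correction factor applied to the UAV-derived three-dimensional information. The arithmetic below is one plausible interpretation, not the patent's exact computation:

```python
def correction_factor(first_relation, second_relation):
    """Ratio between the two data relation values (interpretation:
    reference-anchored proportion over building-anchored proportion)."""
    return second_relation / first_relation

def adjust_buildings(buildings, factor):
    """Scale each building record's dimensions by the correction factor."""
    return [{**b,
             "height": b["height"] * factor,
             "footprint": [(x * factor, y * factor)
                           for x, y in b["footprint"]]}
            for b in buildings]

buildings = [{"height": 18.0, "footprint": [(0.0, 0.0), (10.0, 6.0)]}]
f = correction_factor(first_relation=0.5, second_relation=0.55)
adjusted = adjust_buildings(buildings, f)
print(round(adjusted[0]["height"], 3))  # 19.8
```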
As a preferred scheme, the step of generating security information of an area to be monitored as second dimension information according to the infrared data and the image data specifically includes:
the method comprises the steps that the movement condition of a living being in a region to be monitored is judged in real time through infrared data, and when the fact that the infrared data corresponding to the same living being change in two continuous time units is determined, the target position of the living being moving currently in a grid layout is calculated according to the infrared data changing in the two continuous time units;
acquiring the target position, extracting image data at the current time, performing feature recognition on the extracted image data, and determining a biological type and a behavior type corresponding to the currently moving organism;
generating a security risk value according to the biological type and the behavior type, and generating a corresponding security early warning level according to the security risk value;
and according to the biological type, the behavior type and the security early warning level, correlating in the target position of the grid layout drawing to generate security information as second dimension information.
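A minimal sketch of the trigger and position steps above: infrared readings for the same organism at two consecutive time units are compared and, if they differ, the target position is estimated from the two associated positions. Constant-velocity extrapolation is an assumption; the patent only states that the position is calculated from the changed infrared data:

```python
def has_moved(ir_t0, ir_t1, eps=1e-6):
    """True when infrared readings for the same organism changed
    over two consecutive time units."""
    return abs(ir_t1 - ir_t0) > eps

def target_position(pos_t0, pos_t1):
    """Extrapolate one grid step ahead, assuming constant velocity."""
    dx = pos_t1[0] - pos_t0[0]
    dy = pos_t1[1] - pos_t0[1]
    return (pos_t1[0] + dx, pos_t1[1] + dy)

if has_moved(36.4, 36.9):
    print(target_position((2, 3), (3, 4)))  # (4, 5)
```

The extracted image data at the target position would then feed the (out-of-scope) feature recognition that yields the biological type and behavior type.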
As a preferred scheme, the step of generating a security risk value according to the biological type and the behavior type specifically includes:
predicting the next execution action of the currently moving creature according to the behavior type of the currently moving creature to obtain a predicted behavior;
determining the visual direction of the currently moving living being according to the extracted feature recognition result of the image data, and determining the target object of the currently moving living being according to the visual direction and the prediction behavior;
calculating a distance value between the target object and a currently moving living being according to the position of the target object in the grid layout;
and according to the distance value, the predicted behavior and the type of the living beings, judging the risk condition of the currently moving living beings on the target object in the next execution action, and generating a security risk value.
As a preferred scheme, the calculation formula for generating the security risk value is as follows:
[Formula image not reproduced in the source; symbols below are assigned for readability.]
wherein R is the security risk value; c_j is a constant defined differently for different biological types, j being the biological type; k_i is a constant defined according to the different predicted behaviors, i being the predicted behavior; d_min is the minimum distance, defined according to preset rules, at which a given predicted behavior affects the target object, and d_max is the correspondingly defined maximum distance; α and β are preset weight values, which are constants; and d is the distance value.
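Because the formula image itself is unavailable, only its functional form can be sketched. The version below is a hypothetical reconstruction consistent with the variable definitions: full effect at or below the minimum distance, no effect at or beyond the maximum distance, linear decay in between, weighted by the type and behavior constants. It is not the patent's actual formula:

```python
def security_risk(c_type, k_behavior, d, d_min, d_max, alpha=0.6, beta=0.4):
    """Hypothetical risk value R for one organism/behavior pair.

    c_type:      constant for the biological type j
    k_behavior:  constant for the predicted behavior i
    d:           distance between the organism and the target object
    d_min/d_max: distances bounding the behavior's influence
    alpha/beta:  preset constant weights
    """
    if d >= d_max:
        return 0.0                 # too far: no influence
    if d <= d_min:
        proximity = 1.0            # inside minimum distance: full influence
    else:
        proximity = (d_max - d) / (d_max - d_min)  # linear decay
    return (alpha * c_type + beta * k_behavior) * proximity

print(security_risk(1.0, 1.0, d=6.0, d_min=1.0, d_max=5.0))  # 0.0
```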
As a preferred scheme, the step of generating the dynamic monitoring information of the area to be monitored in real time according to the environmental data, the first dimension information, and the second dimension information specifically includes:
establishing a three-dimensional model, and constructing a building frame in the three-dimensional model by taking the first dimension information as bottom data;
selecting a plurality of grid points in the area to be monitored on the grid layout drawing, and adjusting the position of the building frame according to the actual positions of the grid points in the three-dimensional model to ensure that the positions on the grid layout drawing are consistent with the actual positions;
according to the position of the second dimension information in the building frame, associating the second dimension information to a corresponding node of the building frame, and associating the second dimension information with corresponding first dimension information;
according to the position corresponding to the environment data, the environment data are correlated to the corresponding node of the building frame, and the environment data are correlated with the corresponding first dimension information;
according to a position instruction input by a user, determining first dimension information corresponding to the position instruction in the three-dimensional model, and dynamically displaying environment data and second dimension information associated with the first dimension information.
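One way to sketch the association logic above: the three-dimensional model is keyed by building-frame nodes, security and environment payloads are attached to their node, and a position instruction returns the joined view. Dictionary-keyed nodes are an implementation assumption:

```python
def build_model(first_dim_records):
    """Build the 3-D model skeleton: one entry per building-frame node."""
    return {rec["node"]: {"building": rec, "security": [], "environment": []}
            for rec in first_dim_records}

def attach(model, node, kind, payload):
    """Associate second-dimension or environment data with its node."""
    model[node][kind].append(payload)

def query(model, node):
    """Position instruction: return everything associated with a node."""
    return model.get(node)

model = build_model([{"node": (1, 2), "height": 18.0}])
attach(model, (1, 2), "security", {"level": "low"})
attach(model, (1, 2), "environment", {"temp": 21.5})
print(query(model, (1, 2))["environment"][0]["temp"])  # 21.5
```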
Preferably, the environment monitoring sensors include: a temperature sensor, an air temperature and humidity sensor, an evaporation sensor, a rainfall sensor, an illumination sensor, and a wind speed and wind direction sensor.
Correspondingly, the invention also provides a WSN data acquisition track dynamic generation system based on the UAV, which is applied to the UAV carrier, wherein the UAV carrier is provided with a plurality of monitoring devices, and each monitoring device comprises a camera, an infrared sensor, a radar sensor and an environment monitoring sensor; the system comprises: the system comprises a grid processing module, a region division module, a data acquisition module, a first dimension module, a second dimension module and a dynamic monitoring module;
the grid processing module is used for acquiring an overall layout on a target area and carrying out gridding processing on the overall layout to obtain a grid layout; the overall layout diagram is marked with a place for setting the monitoring equipment on site and a corresponding monitoring range;
the area division module is used for dividing a monitored area and an area to be monitored in the grid layout according to a place where monitoring equipment is arranged on site and a corresponding monitoring range; the monitored area refers to a monitoring range area corresponding to the field-set monitoring equipment, and the area to be monitored refers to an area monitored by the unmanned aerial vehicle carrier;
the data acquisition module is used for receiving monitoring data of the unmanned aerial vehicle carrier in the area to be monitored, and the monitoring data comprises image data acquired by a camera, infrared data acquired by an infrared sensor, radar data acquired by a radar sensor and environmental data acquired by an environmental monitoring sensor;
the first dimension module is used for generating building information of an area to be monitored as first dimension information according to the image data and the radar data;
the second dimension module is used for generating security information of an area to be monitored as second dimension information according to the infrared data and the image data;
and the dynamic monitoring module is used for generating dynamic monitoring information of the area to be monitored in real time according to the environment data, the first dimension information and the second dimension information.
As a preferred scheme, the first dimension module is specifically configured to: identifying the building features in the image data to obtain building plane information in an area to be monitored; extracting radar data corresponding to the building features to obtain building space information corresponding to each building feature; constructing a space coordinate system in the area to be monitored, and generating three-dimensional information of building characteristics in the area to be monitored according to the building plane information and the building space information; and adjusting the three-dimensional information of the building characteristics in the area to be monitored by taking the monitored area as a reference according to the monitored area and the area to be monitored which are divided in the grid layout chart to obtain the building information of the area to be monitored as first dimension information.
As a preferred scheme, the first dimension module is configured to adjust, according to a monitored area and an area to be monitored divided in the grid layout diagram, three-dimensional information of building features in the area to be monitored with the monitored area as a reference, and specifically includes: determining a region boundary between the monitored region and a region to be monitored according to a monitoring range region corresponding to monitoring equipment arranged in the monitored region; identifying the region boundary in the image data, and determining a monitoring blind spot on the region boundary according to a preset rule; calculating the data proportion of the three-dimensional information corresponding to the monitoring blind point and the characteristics of the buildings in the area to be monitored as a first data relation value; selecting a plurality of reference points in the monitored area, and determining the data proportion of the three-dimensional information corresponding to the monitoring blind points and the reference points as a second data relation value; and correspondingly adjusting the three-dimensional information of the building characteristics in the area to be monitored according to the proportion between the first data relation value and the second data relation value.
As a preferred solution, the second dimension module is specifically configured to: judge the moving state of organisms in the area to be monitored in real time from the infrared data, and, when the infrared data corresponding to the same organism are determined to have changed over two consecutive time units, calculate from that changed infrared data the target position in the grid layout toward which the organism is currently moving; acquire the target position, extract the image data at the current time, perform feature recognition on the extracted image data, and determine the biological type and behavior type corresponding to the currently moving organism; generate a security risk value according to the biological type and the behavior type, and generate a corresponding security early warning level according to the security risk value; and associate the biological type, the behavior type and the security early warning level at the target position of the grid layout to generate security information as second dimension information.
As a preferred scheme, the second dimension module is configured to generate a security risk value according to the biological type and the behavior type, and specifically includes: predicting the next execution action of the currently moving creature according to the behavior type of the currently moving creature to obtain a predicted behavior; determining the visual direction of the currently moving living being according to the feature recognition result of the extracted image data, and determining the target object of the currently moving living being according to the visual direction and the predicted behavior; calculating a distance value between the target object and a currently moving living being according to the position of the target object in the grid layout; and according to the distance value, the predicted behavior and the type of the living beings, judging the risk condition brought to the target object by the currently moving living beings in the next execution action, and generating a security risk value.
As a preferred scheme, the calculation formula for generating the security risk value is as follows:
[Formula image not reproduced in the source; symbols below are assigned for readability.]
wherein R is the security risk value; c_j is a constant defined differently for different biological types, j being the biological type; k_i is a constant defined according to the different predicted behaviors, i being the predicted behavior; d_min is the minimum distance, defined according to preset rules, at which a given predicted behavior affects the target object, and d_max is the correspondingly defined maximum distance; α and β are preset weight values, which are constants; and d is the distance value.
As a preferred scheme, the dynamic monitoring module is specifically configured to: establishing a three-dimensional model, and constructing a building frame in the three-dimensional model by using the first dimension information as bottom data; selecting a plurality of grid points in the area to be monitored on the grid layout chart, and adjusting the position of the building frame according to the actual positions of the grid points in the three-dimensional model to ensure that the positions on the grid layout chart are consistent with the actual positions; according to the position of the second dimension information in the building frame, associating the second dimension information to a corresponding node of the building frame, and associating the second dimension information with corresponding first dimension information; according to the position corresponding to the environment data, the environment data are correlated to the corresponding node of the building frame, and the environment data are correlated with the corresponding first dimension information; according to a position instruction input by a user, determining first dimension information corresponding to the position instruction in the three-dimensional model, and dynamically displaying environment data and second dimension information associated with the first dimension information.
Preferably, the environment monitoring sensors include: a temperature sensor, an air temperature and humidity sensor, an evaporation sensor, a rainfall sensor, an illumination sensor, and a wind speed and wind direction sensor.
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program; wherein the computer program, when running, controls an apparatus on which the computer-readable storage medium is located to perform any one of the above-mentioned methods for dynamically generating a trajectory for UAV-based WSN data acquisition.
An embodiment of the present invention further provides a terminal device, which includes a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, where the processor, when executing the computer program, implements the UAV-based WSN data acquisition trajectory dynamic generation method according to any one of the above items.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
according to the technical scheme, various monitoring devices are arranged on an unmanned aerial vehicle carrier, unmanned aerial vehicle identification is carried out on the area to be monitored of the area blind spot, and building information, security information and environmental data are constructed; the problem of can't realize not having the dead angle control to whole region through installing monitoring facilities at the scene in traditional intelligent residential district is used is overcome, utilize unmanned aerial vehicle to carry out the data acquisition of whole region and carry out the dynamic monitoring show, can carry out the intelligent monitoring of all-round, no dead angle to whole target area, realize the real-time data dynamic show of target area, provide the real-time supervision data of all-round, no dead angle for the regional staff of target, further provide solution for intelligent residential district, wisdom family.
Drawings
FIG. 1: a flowchart of the steps of the UAV-based WSN data acquisition trajectory dynamic generation method provided by an embodiment of the invention;
FIG. 2: a structural schematic diagram of the UAV-based WSN data acquisition trajectory dynamic generation system provided by an embodiment of the invention;
FIG. 3: a structural diagram of an embodiment of the terminal device provided by an embodiment of the invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Example one
Referring to fig. 1, a flowchart of steps of a method for dynamically generating a WSN data acquisition trajectory based on a UAV according to an embodiment of the present invention is shown. The method is applied to an unmanned aerial vehicle carrier, wherein various monitoring devices are arranged on the unmanned aerial vehicle carrier, and each monitoring device comprises a camera, an infrared sensor, a radar sensor and an environment monitoring sensor; in this embodiment, the environment monitoring sensor includes: the device comprises a temperature sensor, an air temperature and humidity sensor, an evaporation sensor, a rainfall sensor, an illumination sensor and an air speed and wind direction sensor.
The method comprises steps 101 to 106, and the steps are as follows:
step 101, acquiring an overall layout on a target area, and carrying out gridding processing on the overall layout to obtain a grid layout; and the overall layout diagram is marked with a place for setting the monitoring equipment on site and a corresponding monitoring range.
Specifically, the overall layout of the smart community (the target area) can be obtained from a residential construction planning map or similar sources. It should be understood that the overall layout map describes the overall arrangement of the target area rather than details such as individual buildings, and may be obtained from planning maps or remote-sensing images; this step is not limited to a particular source. The overall layout is then meshed so that regions can be divided more conveniently in subsequent steps, and nodes are marked in the grid cells according to their different characteristics to complete the position marking.
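The gridding of step 101 can be sketched as follows; the cell size, metric coordinates and function names are illustrative assumptions, not taken from the patent:

```python
import math

def make_grid(width_m: float, height_m: float, cell_size: float):
    """Mesh an overall layout map (given as a width x height bounding box
    in metres) into a grid of cell_size x cell_size cells."""
    cols = math.ceil(width_m / cell_size)
    rows = math.ceil(height_m / cell_size)
    return cols, rows

def cell_of(x_m: float, y_m: float, cell_size: float):
    """Map a position in metres to its (col, row) grid index, so that
    later steps can mark nodes (devices, boundaries) per cell."""
    return int(x_m // cell_size), int(y_m // cell_size)
```

A 100 m x 60 m layout with 10 m cells, for example, yields a 10 x 6 grid, and a device at (25 m, 14 m) falls in cell (2, 1).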
Step 102, dividing a monitored area and an area to be monitored in the grid layout according to a place where monitoring equipment is arranged on site and a corresponding monitoring range; the monitored area refers to a monitoring range area corresponding to the field-set monitoring equipment, and the area to be monitored refers to an area monitored through the unmanned aerial vehicle carrier.
Specifically, the smart community is divided into two types of areas. The first type is the monitored area, where monitoring equipment such as cameras and sensors can be installed on site and data can be acquired by the field equipment. When delimiting the monitored area, the monitoring coverage can be judged from the function and model of each monitoring device, thereby determining the monitoring range corresponding to the monitored area. The second type is the area to be monitored, where data cannot be acquired by field-installed monitoring equipment; this is the blind area in which, in the present research, the UAV must collect the data. Such areas typically occur around special buildings, on terrain unsuitable for installing equipment, or in dark spaces where environmental factors prevent equipment from being deployed. Using the UAV to collect data over these blind areas overcomes the difficulty that equipment cannot be erected on site.
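The division of step 102 can be sketched as follows; modelling each field device as a position with a circular coverage radius is an assumption (the patent judges coverage from the device's function and model):

```python
import math

def divide_regions(cols, rows, cell_size, devices):
    """devices: list of (x, y, radius) tuples for field-installed equipment,
    in metres. Returns the set of monitored grid cells; every other cell
    forms the blind area to be monitored by the UAV."""
    monitored = set()
    for cx in range(cols):
        for cy in range(rows):
            # test the cell centre against every device's coverage circle
            px, py = (cx + 0.5) * cell_size, (cy + 0.5) * cell_size
            if any(math.hypot(px - x, py - y) <= r for x, y, r in devices):
                monitored.add((cx, cy))
    return monitored
```

On a 2 x 2 grid of 10 m cells with one camera at (5, 5) covering 8 m, only cell (0, 0) is monitored; the remaining three cells would be assigned to the UAV.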
Step 103, receiving monitoring data of the unmanned aerial vehicle carrier in the area to be monitored, wherein the monitoring data comprises image data collected by a camera, infrared data collected by an infrared sensor, radar data collected by a radar sensor and environmental data collected by an environmental monitoring sensor.
Specifically, to realize multi-data display of the smart community, several kinds of data are required: image data of the surroundings from the camera, infrared sensing data of living bodies from the infrared sensor, radar returns from the radar sensor for detecting shaped objects such as buildings, and environmental data of the surrounding environmental factors detected by the various environment monitoring sensors.
Step 104, generating building information of an area to be monitored as first dimension information according to the image data and the radar data.
In this embodiment, the step 104 specifically includes: step 1041, identifying the building features in the image data to obtain building plane information in the area to be monitored; step 1042, extracting radar data corresponding to the building features to obtain building space information corresponding to each building feature; step 1043, constructing a space coordinate system in the area to be monitored, and generating three-dimensional information of building characteristics in the area to be monitored according to the building plane information and the building space information; step 1044 of adjusting three-dimensional information of building features in the area to be monitored by taking the monitored area as a reference according to the monitored area and the area to be monitored divided in the grid layout drawing to obtain building information of the area to be monitored as first dimension information.
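Steps 1042–1043 can be sketched as follows; the geometry (a radar return giving slant range and elevation to a rooftop, from which height is estimated and the image-derived footprint lifted into 3D) and all names are illustrative assumptions:

```python
import math

def building_height(slant_range_m, elevation_deg, uav_altitude_m):
    """Estimate a rooftop height from one radar return: the UAV at a known
    altitude measures slant range and elevation angle to the rooftop
    (negative elevation = below the UAV)."""
    return uav_altitude_m + slant_range_m * math.sin(math.radians(elevation_deg))

def to_3d(footprint_xy, height_m):
    """Lift a 2D building footprint (plane information recognised from the
    image data) into 3D corner points at the estimated roof height."""
    return [(x, y, height_m) for x, y in footprint_xy]
```

For instance, a return at 100 m slant range and -30 degrees elevation from a UAV at 80 m altitude puts the rooftop at 30 m.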
Wherein the step 1044 specifically includes: determining a region boundary between the monitored region and a region to be monitored according to a monitoring range region corresponding to monitoring equipment arranged in the monitored region; identifying the region boundary in the image data, and determining a monitoring blind spot on the region boundary according to a preset rule; calculating the data proportion of the three-dimensional information corresponding to the monitoring blind point and the characteristics of the buildings in the area to be monitored as a first data relation value; selecting a plurality of reference points in the monitored area, and determining the data proportion of the three-dimensional information corresponding to the monitoring blind points and the reference points as a second data relation value; and correspondingly adjusting the three-dimensional information of the building characteristics in the area to be monitored according to the proportion between the first data relation value and the second data relation value.
Specifically, to overcome the blind-area problem of the area to be monitored, the image data and radar data collected by the UAV are used together to compute the building information in the blind area. The buildings in the area to be monitored are first identified from the image data (step 1041), yielding the frame information, i.e. the plane information, of the buildings. The dimensional details that are difficult to distinguish from images are then obtained from the radar data: the radar transmits electromagnetic waves at the target and receives the echo, from which the distance of the target to the transmitting point, the range rate (radial velocity), the azimuth and the height are derived, so that objects can be found and their spatial positions determined. After the positions of the buildings in the blind area are obtained, the spatial coordinates of the two areas (the monitored area and the area to be monitored) must be unified. For this, the boundary between the two areas is used together with three types of reference points: (1) reference points in the monitored area; (2) reference points on the area boundary; (3) reference points in the area to be monitored. The positions of these reference points in the different area types are adjusted in proportion, thereby unifying the spatial coordinates of the monitored area and the area to be monitored.
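The reference-point adjustment of step 1044 can be sketched as follows. Using the plain ratio of the two data-relation values as a uniform scale factor is an assumption on our part — the patent names the two values but does not give the exact adjustment rule:

```python
def unify_coordinates(blind_area_points, first_rel, second_rel):
    """Scale the 3D building points of the blind area by the ratio of the
    second data-relation value (blind point vs. monitored-area reference
    points) to the first (blind point vs. blind-area building features),
    so both regions share one spatial coordinate system."""
    k = second_rel / first_rel  # assumed correction factor
    return [(x * k, y * k, z * k) for x, y, z in blind_area_points]
```

With first_rel = 2.0 and second_rel = 1.0, every blind-area coordinate is halved to match the monitored area's frame of reference.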
Step 105, generating security information of the area to be monitored as second dimension information according to the infrared data and the image data.
In this embodiment, the step 105 specifically includes: step 1051, judge the biological movement situation in the area to be monitored in real time through the infrared data, when confirming that the infrared data corresponding to the same biological change on two consecutive time units, calculate the target position of the biological moving at present in the grid layout according to the infrared data changed on two consecutive time units; step 1052, acquiring the target position, extracting image data at the current time, performing feature recognition on the extracted image data, and determining a biological type and a behavior type corresponding to the currently moving biological; step 1053, generating a security risk value according to the biological type and the behavior type and generating a corresponding security early warning level according to the security risk value; and 1054, correlating in the target position of the grid layout according to the biological type, the behavior type and the security early warning level to generate security information as second dimension information.
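The position calculation of step 1051 — locating a moving organism from its infrared capture points in two consecutive time units — can be sketched as follows; grid coordinates and the linear extrapolation of the next position are illustrative assumptions:

```python
def ir_target_position(p_prev, p_curr):
    """Given the grid positions of the same organism's infrared capture
    point in two consecutive time units, return its current position, its
    per-time-unit velocity, and a linearly extrapolated next position."""
    vx, vy = p_curr[0] - p_prev[0], p_curr[1] - p_prev[1]
    predicted = (p_curr[0] + vx, p_curr[1] + vy)
    return p_curr, (vx, vy), predicted
```

An organism seen at (3, 4) and then (5, 7) is currently at (5, 7), moving by (2, 3) per time unit, and expected next at (7, 10) — the cell where image data would then be extracted for feature recognition (step 1052).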
In this embodiment, step 1053 specifically includes: predicting the next action of the currently moving organism according to its behavior type, to obtain a predicted behavior; determining the visual direction of the currently moving organism from the feature recognition result of the extracted image data, and determining its target object according to the visual direction and the predicted behavior; calculating the distance value between the target object and the currently moving organism according to the position of the target object in the grid layout; and judging, from the distance value, the predicted behavior and the organism type, the risk that the next action of the currently moving organism poses to the target object, to generate a security risk value. The security risk value is calculated as follows:
R = α_j · β_i · ( w1 · (d_max − d) / (d_max − d_min) + w2 )

wherein R is the security risk value; α_j is a constant defined for each biological type, j being the biological type; β_i is a constant defined for each predicted behavior, i being the predicted behavior; d_min is the minimum distance, defined according to a preset rule, at which the different predicted behaviors affect the target object, and d_max is the maximum distance, defined according to a preset rule, at which the different predicted behaviors affect the target object; w1 and w2 are preset weight values, which are constants; and d is the distance value.
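The risk-value computation can be sketched as follows. It combines the terms the patent defines — a biological-type constant alpha_j, a predicted-behavior constant beta_i, the distance bounds d_min and d_max, and the preset weights w1 and w2 — in one plausible form; the linear distance decay and the clamping outside the bounds are our assumptions:

```python
def security_risk(alpha_j, beta_i, d, d_min, d_max, w1, w2):
    """Risk scales with the type and behavior constants and decays as the
    distance d to the target object moves from d_min toward d_max."""
    if d <= d_min:
        decay = 1.0            # closer than the minimum influence distance
    elif d >= d_max:
        decay = 0.0            # beyond the maximum influence distance
    else:
        decay = (d_max - d) / (d_max - d_min)
    return alpha_j * beta_i * (w1 * decay + w2)
```

With alpha_j = 2, beta_i = 3, bounds 0–10 and weights (1, 0), an organism 5 units away scores 3.0, one 15 units away scores 0, and one at the minimum distance scores the full 6.0; thresholds on this value would then map to the security early-warning levels of step 1053.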
Specifically, to overcome the blind-area problem of the area to be monitored, the image data and infrared data collected by the UAV are used to compute the security information in the blind area. First, the infrared data are used to judge in advance the risk conditions that organisms in the current area to be monitored may present (step 1051). An infrared sensor measures using the physical properties of infrared radiation: infrared light exhibits reflection, refraction, scattering, interference and absorption, and any substance above absolute zero radiates it. Because the sensor does not touch the measured object during measurement, there is no friction, and it offers high sensitivity and fast response. The infrared sensor can therefore quickly and accurately identify the organisms in the monitored area. Each organism serves as an infrared capture point according to its movement; its target position is determined by computing the infrared data that change over two consecutive time units (the positional relation of the infrared signal in the image). Feature recognition on the image data then establishes precisely which kind of organism it is and what behavior it is performing — for example, a person holding a knife, or a snarling dog. Corresponding risk values are generated for the different types and associated with the target position. Through experimental comparison, an accurate algorithm model was finally selected in this research for computing the security risk value, as shown in the formula.
Step 106, generating dynamic monitoring information of the area to be monitored in real time according to the environment data, the first dimension information and the second dimension information.
In this embodiment, the step 106 specifically includes: step 1061, establishing a three-dimensional model, and constructing a building frame in the three-dimensional model by using the first dimension information as bottom data; step 1062, selecting a plurality of grid points in the area to be monitored on the grid layout, and adjusting the position of the building frame according to the actual positions of the grid points in the three-dimensional model, so that the positions on the grid layout are consistent with the actual positions; step 1063, according to the position of the second dimension information in the building frame, associating the second dimension information to a corresponding node of the building frame, and associating the second dimension information with corresponding first dimension information; step 1064, associating the environmental data with corresponding nodes of the building frame according to the corresponding positions of the environmental data, and associating the environmental data with corresponding first dimension information; step 1065, according to a position instruction input by a user, determining first dimension information corresponding to the position instruction in the three-dimensional model, and dynamically displaying environment data and second dimension information associated with the first dimension information.
Specifically, to realize multi-dimensional dynamic display of the monitoring data, after the environmental data, the first dimension information and the second dimension information are obtained through the above steps, the data must be displayed in a multi-dimensional manner. In the multi-dimensional display step, the data are organized by constructing a building frame; however, the problem of alignment between the different data sets still has to be solved. For this, after investigation, multi-dimensional data alignment is realized by aligning grid-point positions, so that data of different dimensions remain consistent on the same three-dimensional space model and genuine dynamic monitoring display is achieved.
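The node association of steps 1063–1065 can be sketched as a minimal model; the class, field names and node identifiers are illustrative assumptions:

```python
class MonitorModel:
    """Align multi-dimensional data on one building frame by attaching the
    second-dimension (security) and environmental data to the frame node
    they belong to, then answering position queries (step 1065)."""

    def __init__(self):
        # node id -> building info plus the data associated with that node
        self.nodes = {}

    def _node(self, node):
        return self.nodes.setdefault(
            node, {'building': None, 'security': [], 'env': []})

    def add_building(self, node, info):
        self._node(node)['building'] = info   # first dimension information

    def attach(self, node, kind, data):
        self._node(node)[kind].append(data)   # 'security' or 'env'

    def display(self, node):
        # return everything associated with the queried position, or None
        return self.nodes.get(node)
```

For example, attaching an environment reading to node 'A1' and querying that node returns its building frame together with all associated data, while an unknown node yields nothing.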
According to the technical scheme, various monitoring devices are mounted on an unmanned aerial vehicle (UAV) carrier, the UAV surveys the blind-spot areas to be monitored, and building information, security information and environmental data are constructed. This overcomes the difficulty in conventional smart-community applications that field-installed monitoring equipment cannot achieve blind-spot-free coverage of the whole area. Using the UAV to collect data over the entire area and dynamically display the monitoring results enables all-round, blind-spot-free intelligent monitoring of the whole target area, realizes real-time dynamic display of target-area data, provides comprehensive real-time monitoring data for target-area staff, and further offers a solution for smart communities and smart homes.
Example two
Referring to fig. 2, a schematic structural diagram of a system for dynamically generating a WSN data acquisition trajectory based on a UAV according to another embodiment of the present invention is shown. The system is applied to an unmanned aerial vehicle carrier, a plurality of monitoring devices are arranged on the unmanned aerial vehicle carrier, and each monitoring device comprises a camera, an infrared sensor, a radar sensor and an environment monitoring sensor; in this embodiment, the environment monitoring sensor includes: the device comprises a temperature sensor, an air temperature and humidity sensor, an evaporation sensor, a rainfall sensor, an illumination sensor and an air speed and wind direction sensor.
The system comprises: the device comprises a grid processing module, a region division module, a data acquisition module, a first dimension module, a second dimension module and a dynamic monitoring module.
The grid processing module is used for acquiring an overall layout on a target area and carrying out gridding processing on the overall layout to obtain a grid layout; and the overall layout is marked with a place for arranging the monitoring equipment on site and a corresponding monitoring range.
The area division module is used for dividing a monitored area and an area to be monitored in the grid layout according to the place where the monitoring equipment is arranged on site and the corresponding monitoring range; the monitored area refers to a monitoring range area corresponding to the field-set monitoring equipment, and the area to be monitored refers to an area monitored through the unmanned aerial vehicle carrier.
The data acquisition module is used for receiving the monitoring data of the area to be monitored, including the image data collected by the camera, the infrared data collected by the infrared sensor, the radar data collected by the radar sensor, and the environmental data collected by the environmental monitoring sensor.
And the first dimension module is used for generating building information of an area to be monitored as first dimension information according to the image data and the radar data.
In this embodiment, the first dimension module is specifically configured to: identifying the building features in the image data to obtain building plane information in an area to be monitored; extracting radar data corresponding to the building features to obtain building space information corresponding to each building feature; constructing a space coordinate system in the area to be monitored, and generating three-dimensional information of the building characteristics in the area to be monitored according to the building plane information and the building space information; and adjusting the three-dimensional information of the building characteristics in the area to be monitored by taking the monitored area as a reference according to the monitored area and the area to be monitored which are divided in the grid layout chart to obtain the building information of the area to be monitored as first dimension information.
In this embodiment, the step of adjusting the three-dimensional information of the building features in the area to be monitored by using the monitored area as a reference according to the monitored area and the area to be monitored divided in the grid layout drawing by the first dimension module specifically includes: determining a region boundary between the monitored region and a region to be monitored according to a monitoring range region corresponding to monitoring equipment arranged in the monitored region; identifying the region boundary in the image data, and determining a monitoring blind spot on the region boundary according to a preset rule; calculating the data proportion of the monitoring blind point to the three-dimensional information corresponding to the characteristics of the buildings in the area to be monitored, and taking the data proportion as a first data relation value; selecting a plurality of reference points in the monitored area, and determining the data proportion of the three-dimensional information corresponding to the monitoring blind points and the reference points as a second data relation value; and correspondingly adjusting the three-dimensional information of the building characteristics in the area to be monitored according to the proportion between the first data relation value and the second data relation value.
And the second dimension module is used for generating security protection information of the area to be monitored as second dimension information according to the infrared data and the image data.
In this embodiment, the second dimension module is specifically configured to: judging the movement condition of the creature in the area to be monitored in real time through the infrared data, and calculating the target position of the currently moving creature in the grid layout according to the infrared data changed in two continuous time units when the infrared data corresponding to the same creature is determined to be changed in the two continuous time units; acquiring the target position, extracting image data at the current time, performing feature recognition on the extracted image data, and determining a biological type and a behavior type corresponding to the currently moving organism; generating a security risk value according to the biological type and the behavior type, and generating a corresponding security early warning level according to the security risk value; and according to the biological type, the behavior type and the security early warning level, correlating in the target position of the grid layout drawing to generate security information as second dimension information.
In this embodiment, the step of generating the security risk value according to the biological type and the behavior type by the second dimension module specifically includes: predicting the next execution action of the currently moving creature according to the behavior type of the currently moving creature to obtain a predicted behavior; determining the visual direction of the currently moving living being according to the extracted feature recognition result of the image data, and determining the target object of the currently moving living being according to the visual direction and the prediction behavior; calculating a distance value between the target object and a currently moving living being according to the position of the target object in the grid layout; and according to the distance value, the predicted behavior and the type of the living beings, judging the risk condition brought to the target object by the currently moving living beings in the next execution action, and generating a security risk value. The calculation formula for generating the security risk value is as follows:
R = α_j · β_i · ( w1 · (d_max − d) / (d_max − d_min) + w2 )

wherein R is the security risk value; α_j is a constant defined for each biological type, j being the biological type; β_i is a constant defined for each predicted behavior, i being the predicted behavior; d_min is the minimum distance, defined according to a preset rule, at which the different predicted behaviors affect the target object, and d_max is the maximum distance, defined according to a preset rule, at which the different predicted behaviors affect the target object; w1 and w2 are preset weight values, which are constants; and d is the distance value.
And the dynamic monitoring module is used for generating dynamic monitoring information of the area to be monitored in real time according to the environment data, the first dimension information and the second dimension information.
In this embodiment, the dynamic monitoring module is specifically configured to: establishing a three-dimensional model, and constructing a building frame in the three-dimensional model by taking the first dimension information as bottom data; selecting a plurality of grid points in the area to be monitored on the grid layout chart, and adjusting the position of the building frame according to the actual positions of the grid points in the three-dimensional model to ensure that the positions on the grid layout chart are consistent with the actual positions; according to the position of the second dimension information in the building frame, associating the second dimension information to a corresponding node of the building frame, and associating the second dimension information with corresponding first dimension information; according to the position corresponding to the environment data, the environment data are correlated to the corresponding node of the building frame, and the environment data are correlated with the corresponding first dimension information; according to a position instruction input by a user, determining first dimension information corresponding to the position instruction in the three-dimensional model, and dynamically displaying environment data and second dimension information associated with the first dimension information.
EXAMPLE III
An embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium includes a stored computer program; when the computer program runs, the apparatus where the computer readable storage medium is located is controlled to execute the method for dynamically generating the UAV-based WSN data acquisition trajectory according to any of the above embodiments.
Example four
Referring to fig. 3, the terminal device according to an embodiment of the present invention includes a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, where the processor implements the method for dynamically generating the WSN data acquisition trajectory based on the UAV according to any of the embodiments when executing the computer program.
Preferably, the computer program may be divided into one or more modules/units, which are stored in the memory and executed by the processor to implement the invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, and the instruction segments describe the execution process of the computer program in the terminal device.
The Processor may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor. The processor is the control center of the terminal device and connects the various parts of the terminal device through various interfaces and lines.
The memory mainly includes a program storage area and a data storage area: the program storage area may store an operating system, the application programs required for at least one function, and the like, and the data storage area may store related data and the like. In addition, the memory may be a high-speed random access memory, or a non-volatile memory such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a Flash Card, or other non-volatile solid-state memory devices.
It should be noted that the terminal device may include, but is not limited to, a processor and a memory. Those skilled in the art will understand that the above description of the terminal device is only an example and does not constitute a limitation; the terminal device may include more or fewer components, combine certain components, or use different components.
The above-mentioned embodiments are provided to further explain the objects, technical solutions and advantages of the present invention in detail, and it should be understood that the above-mentioned embodiments are only examples of the present invention and are not intended to limit the scope of the present invention. It should be understood that various changes, substitutions and alterations can be made herein without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. A WSN data acquisition track dynamic generation method based on UAVs is characterized by being applied to an unmanned aerial vehicle carrier, wherein a plurality of monitoring devices are arranged on the unmanned aerial vehicle carrier, and each monitoring device comprises a camera, an infrared sensor, a radar sensor and an environment monitoring sensor; the method comprises the following steps:
acquiring an overall layout on a target area, and carrying out gridding processing on the overall layout to obtain a grid layout; the overall layout diagram is marked with a place for setting the monitoring equipment on site and a corresponding monitoring range;
dividing a monitored area and an area to be monitored in the grid layout according to a place where monitoring equipment is arranged on site and a corresponding monitoring range; the monitored area refers to a monitoring range area corresponding to the field-set monitoring equipment, and the area to be monitored refers to an area monitored by the unmanned aerial vehicle carrier;
receiving monitoring data of the unmanned aerial vehicle carrier in the area to be monitored, wherein the monitoring data comprises image data collected by a camera, infrared data collected by an infrared sensor, radar data collected by a radar sensor and environmental data collected by an environmental monitoring sensor;
generating building information of an area to be monitored as first dimension information according to the image data and the radar data;
generating security information of an area to be monitored as second dimension information according to the infrared data and the image data;
and generating dynamic monitoring information of the area to be monitored in real time according to the environment data, the first dimension information and the second dimension information.
2. The method as claimed in claim 1, wherein the step of generating building information of the area to be monitored as the first dimension information according to the image data and the radar data specifically comprises:
identifying building features in the image data to obtain building plane information of the area to be monitored;
extracting the radar data corresponding to the building features to obtain building space information corresponding to each building feature;
constructing a space coordinate system for the area to be monitored, and generating three-dimensional information of the building features in the area to be monitored according to the building plane information and the building space information;
and adjusting, with the monitored area as a reference, the three-dimensional information of the building features in the area to be monitored according to the monitored area and the area to be monitored divided in the grid layout map, to obtain the building information of the area to be monitored as the first dimension information.
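The fusion of image-derived plane information with radar-derived space information in claim 2 can be sketched like this. The data layout is an assumption for illustration: footprints as axis-aligned `(x, y, width, depth)` rectangles and radar returns as per-feature heights.

```python
def build_3d_features(plane_features, radar_heights):
    """Fuse image-derived building footprints with radar-derived heights
    into three-dimensional building features.

    plane_features: feature id -> (x, y, width, depth) footprint from
                    image recognition (hypothetical format).
    radar_heights:  feature id -> height in metres from radar data.
    """
    features = {}
    for fid, (x, y, w, d) in plane_features.items():
        # A feature with no radar return is kept as a flat footprint.
        h = radar_heights.get(fid, 0.0)
        features[fid] = {"origin": (x, y, 0.0), "size": (w, d, h)}
    return features

# One building footprint plus its radar-measured height.
feat = build_3d_features({"b1": (0, 0, 10, 20)}, {"b1": 35.0})
```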
3. The UAV-based WSN data acquisition trajectory dynamic generation method of claim 2, wherein the step of adjusting, with the monitored area as a reference, the three-dimensional information of the building features in the area to be monitored according to the monitored area and the area to be monitored divided in the grid layout map specifically comprises:
determining a region boundary between the monitored area and the area to be monitored according to the monitoring-range area corresponding to the monitoring devices installed in the monitored area;
identifying the region boundary in the image data, and determining a monitoring blind spot on the region boundary according to a preset rule;
calculating the data ratio between the monitoring blind spot and the three-dimensional information corresponding to the building features in the area to be monitored as a first data relation value;
selecting a plurality of reference points in the monitored area, and determining the data ratio between the monitoring blind spot and the three-dimensional information corresponding to the reference points as a second data relation value;
and adjusting the three-dimensional information of the building features in the area to be monitored accordingly, according to the ratio between the first data relation value and the second data relation value.
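The two-ratio correction scheme of claim 3 can be sketched with scalar stand-ins. The claim does not define the "data proportion" concretely, so a single measurement per blind spot, feature, and reference point is assumed here; the effect is that UAV-derived feature values are rescaled toward the trusted reference statistics of the monitored area.

```python
def adjust_by_reference(blind_value, feature_values, reference_values):
    """Rescale UAV-derived 3-D measurements using the monitored area as
    ground truth, following the ratio scheme of claim 3.

    blind_value:      measurement at the monitoring blind spot.
    feature_values:   measurements of building features (to be adjusted).
    reference_values: measurements at reference points in the monitored area.
    """
    # First relation value: blind spot vs. mean of the to-be-monitored features.
    first = blind_value / (sum(feature_values) / len(feature_values))
    # Second relation value: blind spot vs. mean of the trusted reference points.
    second = blind_value / (sum(reference_values) / len(reference_values))
    # Ratio between the two relation values becomes the correction factor.
    scale = first / second
    return [v * scale for v in feature_values]

# Feature mean 3.0 vs. reference mean 4.5 -> features scaled up by 1.5.
adjusted = adjust_by_reference(10.0, [2.0, 4.0], [3.0, 6.0])
```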
4. The UAV-based WSN data acquisition trajectory dynamic generation method of claim 1, wherein the step of generating security information of the area to be monitored as second dimension information according to the infrared data and the image data specifically comprises:
judging the movement of living beings in the area to be monitored in real time through the infrared data, and when it is determined that the infrared data corresponding to the same living being changes over two consecutive time units, calculating, from the infrared data of those two consecutive time units, the target position in the grid layout map toward which the living being is currently moving;
acquiring the target position, extracting the image data at the current time, performing feature recognition on the extracted image data, and determining the biological type and behavior type of the currently moving living being;
generating a security risk value according to the biological type and the behavior type, and generating a corresponding security early-warning level according to the security risk value;
and associating the biological type, the behavior type and the security early-warning level with the target position in the grid layout map to generate security information as the second dimension information.
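The movement detection and target-position calculation of claim 4 can be sketched as follows. The infrared reading format (an `(x, y)` centroid per time unit), the change threshold, and linear extrapolation are all assumptions for illustration; the claim only requires that a change over two consecutive time units triggers the calculation.

```python
def has_moved(prev_reading, curr_reading, eps=0.5):
    """Treat the living being as moving when its infrared centroid changed
    across two consecutive time units (threshold eps is an assumption)."""
    (x0, y0), (x1, y1) = prev_reading, curr_reading
    return abs(x1 - x0) > eps or abs(y1 - y0) > eps

def target_position(prev_reading, curr_reading):
    """Estimate the grid position the living being is heading toward by
    extrapolating the observed motion one time unit ahead."""
    (x0, y0), (x1, y1) = prev_reading, curr_reading
    return (2 * x1 - x0, 2 * y1 - y0)
```

A reading that barely changes is filtered out by `has_moved`, so feature recognition is only triggered for genuine movement.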
5. The UAV-based WSN data acquisition trajectory dynamic generation method of claim 4, wherein the step of generating a security risk value according to the biological type and the behavior type specifically comprises:
predicting the next action of the currently moving living being according to its behavior type to obtain a predicted behavior;
determining the visual direction of the currently moving living being according to the feature recognition result of the extracted image data, and determining the target object of the currently moving living being according to the visual direction and the predicted behavior;
calculating a distance value between the target object and the currently moving living being according to the position of the target object in the grid layout map;
and judging, according to the distance value, the predicted behavior and the biological type, the risk that the currently moving living being poses to the target object in its next action, and generating the security risk value.
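The risk scoring of claim 5 can be sketched with hypothetical weight tables; the patent discloses no concrete weights, thresholds, or combining function, so the lookup values, the multiplicative combination, and the warning-level cut-offs below are all assumptions.

```python
# Hypothetical per-type and per-behavior weights (not from the patent).
TYPE_RISK = {"human": 0.6, "dog": 0.4, "bird": 0.1}
BEHAVIOR_RISK = {"walking": 0.2, "running": 0.6, "climbing": 0.9}

def security_risk(bio_type, predicted_behavior, distance_m):
    """Combine biological type, predicted next behavior and the distance to
    the target object into a scalar risk value (closer -> higher risk)."""
    proximity = 1.0 / (1.0 + distance_m)
    risk = (TYPE_RISK.get(bio_type, 0.5)
            * BEHAVIOR_RISK.get(predicted_behavior, 0.5))
    return round(risk * proximity, 4)

def warning_level(risk):
    """Map a risk value to a security early-warning level (assumed cut-offs)."""
    return "high" if risk > 0.2 else "medium" if risk > 0.05 else "low"
```

In this sketch a climbing human one metre from the target object scores 0.6 x 0.9 x 0.5 = 0.27 and triggers a "high" warning.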
6. The UAV-based WSN data acquisition trajectory dynamic generation method of claim 1, wherein the step of generating the dynamic monitoring information of the area to be monitored in real time according to the environmental data, the first dimension information and the second dimension information specifically comprises:
establishing a three-dimensional model, and constructing a building frame in the three-dimensional model with the first dimension information as base data;
selecting a plurality of grid points of the area to be monitored on the grid layout map, and adjusting the position of the building frame in the three-dimensional model according to the actual positions of the grid points, so that the positions on the grid layout map are consistent with the actual positions;
associating the second dimension information with the corresponding node of the building frame according to its position in the building frame, and associating the second dimension information with the corresponding first dimension information;
associating the environmental data with the corresponding node of the building frame according to the position corresponding to the environmental data, and associating the environmental data with the corresponding first dimension information;
and determining, according to a position instruction input by a user, the first dimension information corresponding to the position instruction in the three-dimensional model, and dynamically displaying the environmental data and second dimension information associated with that first dimension information.
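The node-association and query steps of claim 6 can be sketched with a minimal data structure. The node keying by grid position and the dictionary payloads are assumptions; the point illustrated is that environmental data and security (second dimension) information both hang off the building-frame node so a single position instruction retrieves all three.

```python
class BuildingFrame:
    """Minimal stand-in for the three-dimensional model of claim 6: each
    node holds first dimension (building) info and carries associated
    environmental data and second dimension (security) info."""

    def __init__(self):
        # position -> {"building": ..., "env": ..., "security": ...}
        self.nodes = {}

    def add_building(self, pos, info):
        self.nodes[pos] = {"building": info, "env": None, "security": None}

    def associate(self, pos, kind, data):
        # kind is "env" or "security"; both attach to the same node.
        self.nodes[pos][kind] = data

    def query(self, pos):
        """Resolve a user position instruction to the node's dynamic view."""
        node = self.nodes[pos]
        return node["building"], node["env"], node["security"]

frame = BuildingFrame()
frame.add_building((3, 4), "warehouse")
frame.associate((3, 4), "env", {"temp_c": 21.5})
frame.associate((3, 4), "security", {"level": "low"})
```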
7. The UAV-based WSN data acquisition trajectory dynamic generation method of claim 1, wherein the environmental monitoring sensor comprises: a temperature sensor, an air temperature and humidity sensor, an evaporation sensor, a rainfall sensor, an illumination sensor, and a wind speed and wind direction sensor.
8. A UAV-based WSN data acquisition trajectory dynamic generation system, characterized in that the system is applied to an unmanned aerial vehicle carrier, the unmanned aerial vehicle carrier is provided with a plurality of monitoring devices, and each monitoring device comprises a camera, an infrared sensor, a radar sensor and an environmental monitoring sensor; the system comprises: a grid processing module, a region division module, a data acquisition module, a first dimension module, a second dimension module and a dynamic monitoring module;
the grid processing module is used for acquiring an overall layout map of a target area and gridding the overall layout map to obtain a grid layout map; the overall layout map is marked with the locations of field-installed monitoring devices and their corresponding monitoring ranges;
the region division module is used for dividing the grid layout map into a monitored area and an area to be monitored according to the locations of the field-installed monitoring devices and their corresponding monitoring ranges; the monitored area refers to the monitoring-range area covered by the field-installed monitoring devices, and the area to be monitored refers to the area monitored by the unmanned aerial vehicle carrier;
the data acquisition module is used for receiving monitoring data collected by the unmanned aerial vehicle carrier in the area to be monitored, wherein the monitoring data comprises image data collected by the camera, infrared data collected by the infrared sensor, radar data collected by the radar sensor and environmental data collected by the environmental monitoring sensor;
the first dimension module is used for generating building information of the area to be monitored as first dimension information according to the image data and the radar data;
the second dimension module is used for generating security information of the area to be monitored as second dimension information according to the infrared data and the image data;
and the dynamic monitoring module is used for generating dynamic monitoring information of the area to be monitored in real time according to the environmental data, the first dimension information and the second dimension information.
9. A computer-readable storage medium, characterized in that the computer-readable storage medium comprises a stored computer program; wherein, when executed, the computer program controls a device on which the computer-readable storage medium is located to perform the UAV-based WSN data acquisition trajectory dynamic generation method of any one of claims 1-7.
10. A terminal device, comprising a processor, a memory, and a computer program stored in the memory and configured to be executed by the processor, wherein the processor, when executing the computer program, implements the UAV-based WSN data acquisition trajectory dynamic generation method of any one of claims 1-7.
CN202211465106.XA 2022-11-22 2022-11-22 UAV-based WSN data acquisition track dynamic generation method and system Active CN115515077B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211465106.XA CN115515077B (en) 2022-11-22 2022-11-22 UAV-based WSN data acquisition track dynamic generation method and system

Publications (2)

Publication Number Publication Date
CN115515077A true CN115515077A (en) 2022-12-23
CN115515077B CN115515077B (en) 2023-02-14

Family

ID=84513621

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211465106.XA Active CN115515077B (en) 2022-11-22 2022-11-22 UAV-based WSN data acquisition track dynamic generation method and system

Country Status (1)

Country Link
CN (1) CN115515077B (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110049448A * 2019-04-22 2019-07-23 Fuzhou University Wireless sensor network data collection method based on a UAV swarm
CN114637326A * 2022-03-15 2022-06-17 Ping An International Smart City Technology Co., Ltd. Regional strategy formulation method, apparatus, device and storage medium

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
李茜雯等 (Li et al.): "Flight route and communication scheduling optimization for a rechargeable-UAV-assisted data collection system", Chinese Journal on Internet of Things (《物联网学报》) *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116012368A * 2023-02-16 2023-04-25 Jiangxi Xineng Lighting Co., Ltd. Security monitoring method and system based on smart lamp posts, storage medium and computer
CN116881386A * 2023-09-08 2023-10-13 Beijing Guoxing Chuangtu Technology Co., Ltd. Construction method and system of a space environment spatio-temporal reference model
CN116881386B * 2023-09-08 2023-12-05 Beijing Guoxing Chuangtu Technology Co., Ltd. Construction method and system of a space environment spatio-temporal reference model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant