CN112908039A - Airspace control method based on intelligent street lamp and intelligent street lamp

Airspace control method based on intelligent street lamp and intelligent street lamp

Info

Publication number
CN112908039A
Authority
CN
China
Prior art keywords
aircraft
street lamp
intelligent street
radar
auxiliary
Prior art date
Legal status
Granted
Application number
CN202110113706.9A
Other languages
Chinese (zh)
Other versions
CN112908039B (English)
Inventor
刘中岭
石玉波
郑海钦
侯晓青
周骉
刘迪
刘培霖
冯飞
Current Assignee
Shenzhen GCL Smart Energy Co., Ltd.
Original Assignee
Shenzhen GCL Smart Energy Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shenzhen GCL Smart Energy Co., Ltd.
Priority to CN202110113706.9A
Publication of CN112908039A
Application granted
Publication of CN112908039B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0073 - Surveillance aids
    • G08G 5/0082 - Surveillance aids for monitoring traffic from a ground station
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 - LIGHTING
    • F21V - FUNCTIONAL FEATURES OR DETAILS OF LIGHTING DEVICES OR SYSTEMS THEREOF; STRUCTURAL COMBINATIONS OF LIGHTING DEVICES WITH OTHER ARTICLES, NOT OTHERWISE PROVIDED FOR
    • F21V 33/00 - Structural combinations of lighting devices with other articles, not otherwise provided for
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/88 - Radar or analogous systems specially adapted for specific applications
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/044 - Recurrent networks, e.g. Hopfield networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/04 - Architecture, e.g. interconnection topology
    • G06N 3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 - Computing arrangements based on biological models
    • G06N 3/02 - Neural networks
    • G06N 3/08 - Learning methods
    • G06N 3/084 - Backpropagation, e.g. using gradient descent
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/35 - Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V 20/38 - Outdoor scenes
    • G06V 20/39 - Urban scenes
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 - Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/003 - Flight plan management
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 - Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 - Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44 - Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F21 - LIGHTING
    • F21W - INDEXING SCHEME ASSOCIATED WITH SUBCLASSES F21K, F21L, F21S and F21V, RELATING TO USES OR APPLICATIONS OF LIGHTING DEVICES OR SYSTEMS
    • F21W 2131/00 - Use or application of lighting devices or systems not provided for in codes F21W2102/00-F21W2121/00
    • F21W 2131/10 - Outdoor lighting
    • F21W 2131/103 - Outdoor lighting of streets or roads

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Engineering & Computer Science (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application belongs to the technical field of intelligent street lamps, and particularly relates to an airspace control method based on an intelligent street lamp, and to the intelligent street lamp itself. The method comprises the following steps: acquiring an image of a designated airspace through a camera device on the intelligent street lamp, and performing object detection in the image of the designated airspace; if an object is detected in the image of the designated airspace, judging whether the object is an aircraft by using a preset object identification model; if the object is an aircraft, measuring flight parameters of the aircraft through a radar on the intelligent street lamp; predicting the flight trajectory of the aircraft according to the flight parameters to obtain the predicted trajectory of the aircraft; and performing a recovery operation on the aircraft along the predicted trajectory through a recovery device on the intelligent street lamp. With the method and the device, an aircraft can be identified, its trajectory predicted, and the aircraft recovered, so that control of the aircraft is effectively realized and the potential safety hazards caused by aircraft misuse are reduced.

Description

Airspace control method based on intelligent street lamp and intelligent street lamp
Technical Field
The application belongs to the technical field of intelligent street lamps, and particularly relates to an airspace control method based on an intelligent street lamp and the intelligent street lamp.
Background
With the continuous development of technology, civil aircraft of various kinds, including unmanned aerial vehicles, have become increasingly popular and bring great convenience to people's daily work and life. At the same time, however, misuse of such aircraft creates many safety hazards: some users lack the necessary safety awareness and fly aircraft privately, without authorization, over traffic arteries and other important places, disturbing public order and even causing serious accidents. The prior art lacks effective technical means to regulate such misuse of aircraft.
Disclosure of Invention
In view of this, the embodiments of the present application provide an airspace control method based on an intelligent street lamp, and an intelligent street lamp, so as to solve the problem that the prior art lacks an effective technical means to regulate the misuse of aircraft.
The first aspect of the embodiment of the application provides an airspace control method based on an intelligent street lamp, which may include:
acquiring an image of a designated airspace through a camera device on the intelligent street lamp, and detecting an object in the image of the designated airspace;
if an object is detected in the image of the designated airspace, judging whether the object is an aircraft or not by using a preset object identification model;
if the object is an aircraft, measuring flight parameters of the aircraft through a radar on the intelligent street lamp;
predicting the flight trajectory of the aircraft according to the flight parameters to obtain the predicted trajectory of the aircraft;
and carrying out recovery operation on the aircraft on the predicted track through a recovery device on the intelligent street lamp.
Further, the performing a recovery operation on the aircraft on the predicted track through a recovery device on the intelligent street lamp may include:
selecting a target track point on the predicted track, and calculating a target time when the aircraft travels to the target track point;
determining the relative position relationship between the recovery device and the target track point;
calculating the emission parameters of the recovery device according to the target time and the relative position relation;
and launching the recovery device according to the launching parameters to perform recovery operation on the aircraft.
Further, the calculating the emission parameter of the recovery device according to the target time and the relative position relationship may include:
establishing a plane rectangular coordinate system on a recovery plane, wherein the recovery plane is a plane determined by the intelligent street lamp and the target track point;
constructing a motion trail equation of the recovery device in the plane rectangular coordinate system according to the target time and the relative position relation;
and solving the motion trail equation to obtain the emission parameters of the recovery device.
Furthermore, the recovery device is connected with the intelligent street lamp through a preset connecting wire, and the connecting wire is wound on a rotating shaft of the intelligent street lamp;
in the process of moving the recovery device to the target track point, the method may further include:
driving the rotating shaft to rotate according to a first rotating direction so as to gradually release the connecting wire;
after the recovery device reaches the target track point, the method may further include:
and driving the rotating shaft to rotate according to a second rotating direction so as to gradually recover the connecting wire.
Further, before predicting the flight trajectory of the aircraft according to the flight parameters, the method may further include:
measuring auxiliary flight parameters of the aircraft through a preset auxiliary radar group, wherein the auxiliary radar group comprises a plurality of auxiliary radars, and each auxiliary radar is distributed at different preset positions;
respectively determining the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft;
and fusing the auxiliary flight parameters into the flight parameters according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft to obtain fused flight parameters.
Further, the fusing the auxiliary flight parameters into the flight parameters according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the smart street lamp and the aircraft may include:
respectively calculating the weight coefficients of each auxiliary radar and the radar on the intelligent street lamp according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft;
and carrying out weighted average on the auxiliary flight parameters and the flight parameters according to the weight coefficients to obtain fused flight parameters.
Further, before the flight parameters of the aircraft are measured by the radar on the intelligent street lamp, the method may further include:
sending identity verification request information to the aircraft through a preset communication frequency, and receiving feedback information of the aircraft;
if the feedback information is received within a preset time length, extracting an aircraft identifier and first verification information corresponding to the aircraft identifier from the feedback information;
performing hash calculation on the aircraft identifier by using a hash function stored in the intelligent street lamp to obtain second verification information corresponding to the aircraft identifier;
and if the first verification information is consistent with the second verification information, the aircraft is judged to be a compliant aircraft, and the step of measuring the flight parameters of the aircraft through the radar on the intelligent street lamp and the subsequent steps are not executed.
Further, before performing hash calculation on the aircraft identifier by using the hash function stored in the intelligent street lamp, the method may further include:
sending update timestamp query request information to a preset server, and receiving an update timestamp fed back by the server; the update timestamp is a timestamp when the server updates the hash function;
if the update timestamp fed back by the server is inconsistent with the update timestamp stored in the intelligent street lamp, sending hash function update request information to the server, and receiving the hash function fed back by the server;
and replacing the hash function stored in the intelligent street lamp by using the hash function fed back by the server, and replacing the update timestamp stored in the intelligent street lamp by using the update timestamp fed back by the server.
Further, the predicting the flight trajectory of the aircraft according to the flight parameters to obtain the predicted trajectory of the aircraft may include:
arranging the flight parameters into a reference sequence according to the time sequence;
processing the reference sequence by using a preset time sequence prediction model to obtain a prediction parameter of the aircraft;
calculating the next predicted track point of the aircraft according to the predicted parameters;
adding the prediction parameters into the reference sequence, and returning to execute the step of processing the reference sequence by using a preset time sequence prediction model and the subsequent steps until a preset number of predicted track points are obtained through calculation;
and determining the predicted track of the aircraft according to the preset number of predicted track points.
A second aspect of the embodiments of the present application provides an airspace control apparatus, which may include functional modules that implement the steps of any of the airspace control methods described above.
A third aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of any of the above-mentioned airspace control methods.
A fourth aspect of the embodiments of the present application provides an intelligent street lamp, which includes a memory, a processor, and a computer program stored in the memory and capable of running on the processor, wherein the processor implements any one of the above-mentioned airspace control methods when executing the computer program.
A fifth aspect of the embodiments of the present application provides a computer program product which, when run on an intelligent street lamp, enables the intelligent street lamp to execute the steps of any of the airspace control methods described above.
Compared with the prior art, the embodiments of the present application have the following advantages: an image of the designated airspace is acquired through the camera device on the intelligent street lamp, and object detection is performed in the image of the designated airspace; if an object is detected in the image of the designated airspace, whether the object is an aircraft is judged by using a preset object identification model; if the object is an aircraft, the flight parameters of the aircraft are measured through the radar on the intelligent street lamp; the flight trajectory of the aircraft is predicted according to the flight parameters to obtain the predicted trajectory of the aircraft; and a recovery operation is performed on the aircraft along the predicted trajectory through the recovery device on the intelligent street lamp. In the embodiments of the present application, a camera device, a radar and a recovery device are mounted on the intelligent street lamp, so that an aircraft in the designated airspace can be identified, its trajectory predicted, and a recovery operation performed on it, thereby effectively realizing control of the aircraft and reducing the potential safety hazards caused by aircraft misuse.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or of the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart illustrating an embodiment of an airspace control method based on intelligent street lamps according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow diagram of a recovery operation performed on an aircraft on a predicted trajectory by a recovery device on an intelligent street lamp;
FIG. 3 is a block diagram of an embodiment of an airspace control device according to an embodiment of the present application;
fig. 4 is a schematic block diagram of an intelligent street lamp in the embodiment of the present application.
Detailed Description
In order to make the objects, features and advantages of the present invention more apparent and understandable, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application, and it is apparent that the embodiments described below are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In addition, in the description of the present application, the terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
Referring to fig. 1, an embodiment of an airspace control method based on an intelligent street lamp in the embodiment of the present application may include:
s101, acquiring an image of a designated airspace through a camera device on the intelligent street lamp, and detecting an object in the image of the designated airspace.
An intelligent street lamp is a street lamp with data storage and data processing capabilities, and the execution subject of the embodiments of the present application may be an intelligent street lamp. The intelligent street lamp is provided with a camera device in advance for acquiring images of the designated airspace. The designated airspace may be set according to the actual situation; for example, a cylindrical spatial region with the intelligent street lamp as its axis may be set as the designated airspace, and the radius of the cylindrical region and the heights of its upper and lower base surfaces may be set according to the actual situation. Of course, a designated airspace of another shape may also be set according to the actual situation, which is not specifically limited in the embodiments of the present application.
In the embodiment of the present application, the image may be periodically acquired by using a camera device, and the image acquisition period may be set according to an actual situation, which is not specifically limited in the embodiment of the present application.
After obtaining the image of the designated airspace acquired by the camera device, the intelligent street lamp may perform object detection in the image to judge whether an abnormal object exists in the image. In general, a reference image of the designated airspace, that is, an image of the designated airspace when no abnormal object is present, is stored in the intelligent street lamp in advance and used as the reference for object detection. The image of the designated airspace acquired by the camera device is taken as the image to be detected and compared with the reference image; if a region with a large difference exists between the two, it can be determined that an object has been detected and is located in that region.
In a specific implementation of the embodiments of the present application, binarization processing may be performed on the image to be detected and the reference image respectively to obtain a binarized image to be detected and a binarized reference image, and an exclusive-or operation may be performed on the pixels at corresponding positions of the two images to obtain a difference image. A pixel with the value "1" in the difference image (denoted as a difference pixel) indicates that the two images differ at that position, and a pixel with the value "0" indicates that the two images are the same at that position. The difference pixels are then clustered according to their position distribution, with difference pixels that are close in position grouped into one pixel class, and any pixel class whose number of difference pixels is smaller than a preset threshold is eliminated. If any pixel classes remain, an object can be considered detected, and the regions occupied by the remaining pixel classes are the regions where the object is located; otherwise, if no pixel classes remain, it is considered that no object has been detected, and the next frame is acquired for object detection.
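A minimal sketch of this differencing-and-clustering procedure is given below in Python, assuming OpenCV is available and grayscale input images; the function name, the threshold values, and the use of connected components as the position-based clustering step are illustrative assumptions rather than the patent's specified implementation.

```python
import cv2

def detect_object_regions(frame, reference, bin_thresh=128, min_pixels=50):
    """Return bounding boxes of regions where `frame` differs from `reference`."""
    # Binarize both grayscale images to 0/1.
    _, frame_bin = cv2.threshold(frame, bin_thresh, 1, cv2.THRESH_BINARY)
    _, ref_bin = cv2.threshold(reference, bin_thresh, 1, cv2.THRESH_BINARY)

    # Exclusive-or of corresponding pixels gives the difference image:
    # 1 = the images differ at this position, 0 = they agree.
    diff = cv2.bitwise_xor(frame_bin, ref_bin)

    # Group nearby difference pixels via connected components (one form of
    # clustering by position distribution).
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(diff, connectivity=8)

    regions = []
    for label in range(1, num_labels):           # label 0 is the background
        if stats[label, cv2.CC_STAT_AREA] < min_pixels:
            continue                              # discard small clusters as noise
        x, y, w, h = stats[label, :4]
        regions.append((x, y, w, h))
    return regions                                # an empty list means no object detected
```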
And S102, if the object is detected in the image of the designated airspace, judging whether the object is an aircraft or not by using a preset object identification model.
The object recognition model is a neural network model that has been trained in advance for object recognition. The specific type of model can be set according to the actual situation, including but not limited to Convolutional Neural Networks (CNNs), Deep Convolutional Neural Networks (DCNNs), Inverse Graphics Networks (IGNs), Generative Adversarial Networks (GANs), Recurrent Neural Networks (RNNs), Deep Residual Networks (DRNs), Support Vector Machines (SVMs) and other models.
In order to improve the ability of the object recognition model to distinguish various objects under any conditions, before the object recognition model is used, a training data set for training the object recognition model can be constructed, and then the initial object recognition model is trained by using the training data set until a preset training condition is met, so that the trained object recognition model is obtained.
The training data set includes a number of training samples, each training sample comprising an image of an object and the expected output object class (i.e. aircraft or non-aircraft) corresponding to that image. To improve the accuracy of the model, the training samples should cover as many types of aircraft and non-aircraft images as possible. Each object class may be given a corresponding numerical representation; for example, an aircraft may be represented by 1 and a non-aircraft by 0. Of course, other numerical representations may be adopted according to the actual situation, which is not specifically limited in the embodiments of the present application.
In the training process, for each training sample in the training data set, the object recognition model is used to process the object image in the training sample to obtain the actually output object class, and then the training loss value is calculated according to the expected output object class and the actually output object class in the training sample. The specific calculation manner of the training loss value may be set according to an actual situation, and in a specific implementation of the embodiment of the present application, a square error between an object class of the expected output and an object class of the actual output may be calculated and determined as the training loss value.
After the training loss value is calculated, the model parameters of the object recognition model can be adjusted according to the training loss value. In the embodiments of the present application, assuming that the model parameters of the object recognition model are W1, the training loss value is back-propagated to modify the model parameters W1, obtaining modified model parameters W2. After the parameters are modified, the next training pass is performed: a new training loss value is calculated and back-propagated to modify the model parameters W2, obtaining modified model parameters W3, and so on. The above process is repeated, with the model parameters modified in each training pass, until a preset training condition is met. The training condition may be that the number of training passes reaches a preset threshold, which can be set according to the actual situation, for example to a value in the thousands, tens of thousands, hundreds of thousands or even larger. The training condition may also be convergence of the object recognition model: the model may converge before the threshold number of passes is reached, in which case further training would only repeat unnecessary work, or the model may never converge, which would lead to an endless loop and the training process never terminating. Based on these two cases, the training condition may also be that the number of training passes reaches the threshold or the object recognition model converges. When the training condition is satisfied, the trained object recognition model is obtained.
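A minimal training-loop sketch consistent with the description above, written in PyTorch under the assumption of a binary aircraft/non-aircraft label encoded as 1/0; the small stand-in network, learning rate and convergence test are illustrative assumptions, not details given in the patent.

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the object recognition model (any CNN would do).
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(16, 1), nn.Sigmoid(),            # output in [0, 1]: 1 = aircraft, 0 = non-aircraft
)
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

def train(dataset, max_passes=10_000, converge_eps=1e-4):
    """Train until the pass threshold is reached or the loss change indicates convergence."""
    prev_loss = float("inf")
    for step, (image, expected_class) in enumerate(dataset):
        if step >= max_passes:
            break
        predicted = model(image.unsqueeze(0)).squeeze()
        # Squared error between expected and actual output, used as the training loss.
        loss = (predicted - expected_class) ** 2
        optimizer.zero_grad()
        loss.backward()                          # back-propagate to adjust parameters W1 -> W2 -> ...
        optimizer.step()
        if abs(prev_loss - loss.item()) < converge_eps:
            break                                # treat a vanishing loss change as convergence
        prev_loss = loss.item()
```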
Optionally, in order to further improve the applicability of the model to the real scene, after the trained object recognition model is obtained, knowledge distillation may be performed on the object recognition model to obtain the knowledge-distilled object recognition model. In the knowledge distillation process, an object recognition model obtained by training is used as a teacher model, another randomly initialized neural network model is used as a student model, and real data is used as a learning object. The student model improves the prediction capability on the real data set through the soft target generated by the learning teacher model, and can achieve better prediction precision.
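As an illustration of this knowledge-distillation step, a sketch in PyTorch is given below; the temperature T, the loss weighting alpha and the assumption of two-class logits are illustrative choices not specified in the patent.

```python
import torch
import torch.nn.functional as F

def distill_step(teacher, student, optimizer, images, labels, T=2.0, alpha=0.5):
    """One distillation step: the student learns from the teacher's soft targets
    and from the ground-truth labels of the real data."""
    with torch.no_grad():
        teacher_logits = teacher(images)                  # soft targets from the trained teacher
    student_logits = student(images)

    # KL divergence between the softened teacher and student distributions.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard_loss = F.cross_entropy(student_logits, labels)   # supervision from the real data
    loss = alpha * soft_loss + (1 - alpha) * hard_loss

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```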
When an object is detected in the image of the designated airspace, the sub-image of the area where the object is located can be input into the object recognition model for processing to obtain the output object class. If the object is a non-aircraft, the next frame is acquired and object detection is performed again.
And S103, if the object is an aircraft, measuring the flight parameters of the aircraft through the radar on the intelligent street lamp.
In this embodiment, a spatial stereo coordinate system may be pre-established, where the spatial stereo coordinate system includes three coordinate dimensions, which are respectively marked as a first coordinate dimension (x-axis), a second coordinate dimension (y-axis), and a third coordinate dimension (z-axis), where the second coordinate dimension is perpendicular to the first coordinate dimension, and the third coordinate dimension is perpendicular to the first coordinate dimension and the second coordinate dimension. Flight parameters of the aircraft in three dimensions, which may include, but are not limited to, position and speed, may be measured by radar (denoted as primary radar) on the smart street lamp, respectively.
The radar on the intelligent street lamp is limited by its mounting position, so the flight parameters it measures may contain a certain error. To further improve the measurement accuracy, in a specific implementation of the embodiments of the present application, in addition to the radar mounted on the intelligent street lamp, a plurality of radars may be deployed in advance at different preset positions in the area surrounding the intelligent street lamp; these are denoted as auxiliary radars and together constitute an auxiliary radar group.
Flight parameters of the aircraft can also be measured by the auxiliary radar group; for the sake of distinction these are denoted as auxiliary flight parameters. After the measurement of flight parameters is completed, the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft can be determined respectively; the auxiliary flight parameters are then fused into the flight parameters measured by the primary radar according to these distances, obtaining the fused flight parameters.
Specifically, the weight coefficients of each auxiliary radar and of the radar on the intelligent street lamp can be calculated respectively according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft. For any radar (the primary radar or an auxiliary radar), the weight coefficient is inversely related to the distance between the radar and the aircraft; that is, a closer radar has a larger weight coefficient, whereas a more distant radar has a smaller one. After the weight coefficients are obtained, the auxiliary flight parameters and the flight parameters measured by the primary radar can be weighted and averaged according to the weight coefficients to obtain the fused flight parameters.
In the embodiments of the present application, either the flight parameters measured by the primary radar or the fused flight parameters may be used directly for the subsequent prediction, depending on the actual situation; in the latter case, the flight parameters referred to in the following description are the fused flight parameters.
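As a sketch of the distance-weighted fusion just described, the following Python function uses 1/distance as the un-normalized weight; the patent only requires that a closer radar receive a larger weight, so this particular weighting, together with the function and argument names, is an illustrative assumption.

```python
import numpy as np

def fuse_flight_parameters(primary_params, primary_dist, aux_params, aux_dists, eps=1e-6):
    """Weighted average of primary-radar and auxiliary-radar measurements.

    primary_params : (D,) array, e.g. [x, y, z, vx, vy, vz] from the street-lamp radar
    primary_dist   : distance from the street-lamp radar to the aircraft
    aux_params     : (N, D) array of the same parameters from the N auxiliary radars
    aux_dists      : (N,) distances from each auxiliary radar to the aircraft
    """
    params = np.vstack([primary_params, aux_params])        # (N+1, D)
    dists = np.concatenate([[primary_dist], aux_dists])     # (N+1,)

    weights = 1.0 / (dists + eps)        # a closer radar gets a larger weight
    weights /= weights.sum()             # normalize so the weights sum to 1

    return weights @ params              # fused flight parameters, shape (D,)
```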
And S104, predicting the flight trajectory of the aircraft according to the flight parameters to obtain the predicted trajectory of the aircraft.
In the embodiments of the present application, the flight parameters may be measured periodically, with the measurement period denoted as Cycle; the specific value of Cycle may be set according to the actual situation, which is not specifically limited in the embodiments of the present application.
When predicting the flight trajectory, for each dimension the periodically measured flight parameters can be arranged in chronological order into a sequence, denoted the reference sequence. It is mainly the speed of the aircraft that is predicted here, so the reference sequence is the sequence of measured speeds arranged in measurement order.
And processing the reference sequence by using a preset time sequence prediction model to obtain the prediction parameters of the aircraft, namely the prediction speed of the aircraft at the next measurement moment. The time sequence prediction model can be set according to actual conditions, and includes but is not limited to an LSTM model, an ARMA model, an ARIMA model, an ARCH model, a GARCH model and other time sequence prediction models.
After the predicted parameters are obtained, the next predicted trajectory point of the aircraft can be calculated according to the predicted parameters. Specifically, the next predicted trajectory point of the aircraft may be calculated according to:
NextPosX=PosX+(NextVelX+VelX)×Cycle÷2
NextPosY=PosY+(NextVelY+VelY)×Cycle÷2
NextPosZ=PosZ+(NextVelZ+VelZ)×Cycle÷2
wherein, (PosX, PosY, PosZ) is the aircraft position obtained by the last measurement, VelX, VelY, VelZ are the speeds in three dimensions obtained by the last measurement, NextVelX, NextVelY, NextVelZ are the predicted speeds in three dimensions, respectively, (NextPosX, NextPosY, NextPosZ) is the position of the next predicted track point.
And adding the prediction parameters into the reference sequence, and repeating the time sequence prediction process to continuously obtain new predicted track points until the number of the predicted track points reaches a preset number. The preset number may be set according to actual conditions, and this is not particularly limited in the embodiment of the present application. And finally, sequentially connecting the predicted track points with the preset number to obtain the predicted track of the aircraft.
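The iterative prediction loop can be sketched as follows in Python; `predict_next_velocity` is a hypothetical stand-in for whichever time-series prediction model (LSTM, ARIMA, etc.) is actually used, and the list-based bookkeeping is an illustrative assumption.

```python
def predict_trajectory(positions, velocities, cycle, num_points, predict_next_velocity):
    """Iteratively predict `num_points` future trajectory points.

    positions  : list of measured (x, y, z) positions, last entry is the latest
    velocities : list of measured (vx, vy, vz) speeds, i.e. the reference sequence
    cycle      : measurement period Cycle
    """
    reference = list(velocities)
    pos = positions[-1]                      # (PosX, PosY, PosZ) from the last measurement
    vel = velocities[-1]                     # (VelX, VelY, VelZ) from the last measurement
    trajectory = []

    for _ in range(num_points):
        next_vel = predict_next_velocity(reference)      # (NextVelX, NextVelY, NextVelZ)
        # NextPos = Pos + (NextVel + Vel) * Cycle / 2, applied per dimension
        pos = tuple(p + (nv + v) * cycle / 2 for p, nv, v in zip(pos, next_vel, vel))
        trajectory.append(pos)
        reference.append(next_vel)           # feed the prediction back into the reference sequence
        vel = next_vel

    return trajectory
```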
And S105, performing recovery operation on the aircraft on the predicted track through a recovery device on the intelligent street lamp.
As shown in fig. 2, step S105 may specifically include the following processes:
and S1051, selecting a target track point on the predicted track, and calculating the target time when the aircraft travels to the target track point.
In the embodiment of the application, one predicted track point can be selected from the predicted track as the target track point according to the actual situation. For example, the first, second, third, or other subsequent predicted trajectory points after the last flight parameter measurement may be selected as target trajectory points. Here, the time of the last flight parameter measurement is recorded as InitialT, and if the selected target track point is the kth predicted track point (k is a positive integer) after the last flight parameter measurement, the time when the aircraft travels to the target track point can be calculated according to the following formula:
TargetT=InitialT+k×Cycle
wherein, TargetT is the time when the aircraft travels to the target track point, and is referred to as the target time here.
And step S1052, determining the relative position relationship between the recovery device and the target track point.
Because the initial position of the recovery device on the intelligent street lamp is fixed, once the target track point is selected, the relative position relationship between the two can be determined.
And S1053, calculating the emission parameters of the recovery device according to the target time and the relative position relation.
First, the intelligent street lamp is regarded as a vertical line segment; this line segment and the target track point determine a unique plane, which is recorded as the recovery plane. A rectangular plane coordinate system is then established on the recovery plane; for example, the intersection point of the intelligent street lamp and the ground can be taken as the origin of the coordinate system, the vertically upward direction as the positive direction of the vertical axis, and the direction perpendicular to the vertical axis and pointing towards the target track point as the positive direction of the horizontal axis.
After the rectangular plane coordinate system is established, a motion trajectory equation of the recovery device can be constructed in this coordinate system according to the target time and the relative position relationship between the recovery device and the target track point. Specifically, the relative position relationship between the two is projected into the rectangular plane coordinate system, yielding their vertical distance and horizontal distance in that coordinate system, so that the following motion trajectory equations can be established:
VerticalDis = VerVel × (TargetT - LaunchT) - (1/2) × g × (TargetT - LaunchT)²
HorizonDis = HorVel × (TargetT - LaunchT)
the vertical distance between the recovery device and the target track point under the rectangular plane coordinate system, namely the distance on the longitudinal axis, the horizontal distance between the recovery device and the target track point under the rectangular plane coordinate system, namely the distance on the horizontal axis, g is the gravitational acceleration, and LaunchT is the launching time of the recovery device, and can be set according to actual conditions. VerVel is the vertical launch velocity of the reclaimer and HorVel is the horizontal launch velocity of the reclaimer. And solving the motion trail equation to obtain VerVel, HorVel and other emission parameters.
It should be noted that the solution for the emission parameters may not be unique, and a set of emission parameters with smaller values may be selected according to the actual situation, so as to reduce the energy consumed in launching the recovery device.
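As a worked sketch of solving these equations, the following Python function computes VerVel and HorVel for a chosen launch time; it is built directly from the two equations above, and the function name, the example numbers and the value used for g are illustrative assumptions. Choosing a different LaunchT yields a different, equally valid set of emission parameters, consistent with the note above that the solution need not be unique.

```python
G = 9.81  # gravitational acceleration, m/s^2

def launch_parameters(vertical_dis, horizon_dis, target_t, launch_t):
    """Solve the motion trajectory equations for VerVel and HorVel."""
    flight_time = target_t - launch_t
    if flight_time <= 0:
        raise ValueError("launch time must precede the target time")

    # HorizonDis = HorVel * (TargetT - LaunchT)
    hor_vel = horizon_dis / flight_time
    # VerticalDis = VerVel * (TargetT - LaunchT) - 1/2 * g * (TargetT - LaunchT)^2
    ver_vel = (vertical_dis + 0.5 * G * flight_time ** 2) / flight_time

    return ver_vel, hor_vel

# Example: target point 5 m above and 12 m away from the recovery device,
# with the target time 1.5 s after launch.
print(launch_parameters(5.0, 12.0, target_t=1.5, launch_t=0.0))
```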
And S1054, launching the recovery device according to the emission parameters so as to perform the recovery operation on the aircraft.
The recovery device comprises a rope net for wrapping the aircraft, a container that holds the rope net, and a timer, with an ejection device arranged in the container. When the intelligent street lamp launches the recovery device, the rope net is in a folded state and loaded in the container; when the timer reaches the target time, the ejection device is started to eject the rope net from the container, so that the net wraps around the aircraft and the recovery operation is accomplished.
In a specific implementation of the embodiments of the present application, the recovery device can be connected to the intelligent street lamp through a preset connecting wire, and the connecting wire is wound on a rotating shaft of the intelligent street lamp. After the intelligent street lamp launches the recovery device, while the recovery device is moving towards the target track point, the intelligent street lamp can drive the rotating shaft to rotate in a first rotation direction so as to gradually release the connecting wire, the first rotation direction being the direction that pays out the wire. After the recovery device reaches the target track point and wraps the aircraft, the intelligent street lamp can drive the rotating shaft to rotate in a second rotation direction so as to gradually retract the connecting wire and reel in the recovery device together with the wrapped aircraft, the second rotation direction being the direction that retracts the wire. In this way, injury to people or damage to property caused by a falling aircraft can be avoided, further improving the safety of the recovery operation.
In the above process, it is assumed by default that recovery operations are performed on all aircraft. In practice, however, some aircraft may be compliant aircraft that have been approved in advance by the relevant authorities to fly in the designated airspace. Therefore, in another implementation of the embodiments of the present application, after the detected object is determined to be an aircraft, its identity may first be verified.
Specifically, the intelligent street lamp can send identity verification request information to the aircraft through a preset communication frequency and receive feedback information of the aircraft.
If the feedback information is not received within the preset time length, the aircraft can be judged to be not a compliant aircraft, and the step of measuring the flight parameters of the aircraft through the radar on the intelligent street lamp and the subsequent steps are continuously executed.
If the feedback information is received within the preset time length, the aircraft identifier and the first verification information corresponding to the aircraft identifier can be extracted from the feedback information. Aircraft identifiers are uniformly assigned by a preset server, and each compliant aircraft has a different identifier. The first verification information is the result of performing a hash calculation on the aircraft identifier using the hash function stored by the aircraft. Both the compliant aircraft and the intelligent street lamp can obtain the same hash function from the preset server in advance and store it locally for identity verification.
Then, the intelligent street lamp can use a hash function stored in the intelligent street lamp to perform hash calculation on the aircraft identifier, second verification information corresponding to the aircraft identifier is obtained, and the first verification information and the second verification information are compared. If the first verification information is inconsistent with the second verification information, the aircraft can be judged to be not a compliant aircraft, and the step of measuring the flight parameters of the aircraft through the radar on the intelligent street lamp and the subsequent steps are continuously executed. If the first verification information is consistent with the second verification information, the aircraft can be judged to be a compliant aircraft, and the step of measuring the flight parameters of the aircraft through the radar on the intelligent street lamp and the subsequent steps are not executed, namely, the aircraft is not recycled.
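A minimal sketch of this comparison is given below, assuming SHA-256 from Python's hashlib as the shared hash function; the patent does not name a specific hash function, so this choice and the field names in the feedback dictionary are illustrative assumptions.

```python
import hashlib

def compute_verification(aircraft_id: str) -> str:
    """Hash the aircraft identifier with the locally stored hash function."""
    return hashlib.sha256(aircraft_id.encode("utf-8")).hexdigest()

def is_compliant(feedback: dict) -> bool:
    """Compare the verification info reported by the aircraft with the locally computed one."""
    first_verification = feedback["verification"]              # sent by the aircraft
    second_verification = compute_verification(feedback["aircraft_id"])
    return first_verification == second_verification           # consistent -> compliant aircraft
```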
Considering that, if the same hash function is used for a long time, the calculated verification information remains fixed, its interception by a lawbreaker would create a long-term safety hazard. Therefore, in another specific implementation of the embodiments of the present application, the server may periodically update the hash function and, each time it does so, record the timestamp of the update as the update timestamp. After the latest update timestamp is recorded, the previously recorded update timestamp may be deleted.
Before each hash operation, the intelligent street lamp and the compliant aircraft can obtain the latest hash function from the server. Taking the intelligent street lamp as an example, it can send update-timestamp query request information to the server and receive the update timestamp fed back by the server. If the update timestamp fed back by the server is inconsistent with the update timestamp stored by the intelligent street lamp (in the initial state, the stored update timestamp is empty), the intelligent street lamp sends hash function update request information to the server and receives the hash function fed back by the server; it then replaces its stored hash function with the one fed back by the server (in the initial state, the stored hash function is empty) and replaces its stored update timestamp with the one fed back by the server. If the update timestamp fed back by the server is consistent with the stored update timestamp, the hash function does not need to be updated and the stored hash function is used directly for the hash operation. In this way, even if a lawbreaker intercepts the verification information, it will soon become invalid and cannot create a lasting safety hazard.
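The timestamp-based update check can be sketched as follows; the `server` object and its two method names are hypothetical stand-ins for the actual communication with the preset server.

```python
class HashFunctionCache:
    """Keeps the locally stored hash function in sync with the server."""

    def __init__(self, server):
        self.server = server
        self.update_timestamp = None      # empty in the initial state
        self.hash_function = None         # empty in the initial state

    def current_hash_function(self):
        server_timestamp = self.server.query_update_timestamp()
        if server_timestamp != self.update_timestamp:
            # The stored copy is stale (or empty): fetch the new hash function.
            self.hash_function = self.server.request_hash_function()
            self.update_timestamp = server_timestamp
        return self.hash_function         # otherwise the stored function is used directly
```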
In summary, in the embodiments of the present application an image of the designated airspace is acquired through the camera device on the intelligent street lamp, and object detection is performed in the image of the designated airspace; if an object is detected in the image of the designated airspace, whether the object is an aircraft is judged by using a preset object identification model; if the object is an aircraft, the flight parameters of the aircraft are measured through the radar on the intelligent street lamp; the flight trajectory of the aircraft is predicted according to the flight parameters to obtain the predicted trajectory of the aircraft; and a recovery operation is performed on the aircraft along the predicted trajectory through the recovery device on the intelligent street lamp. In the embodiments of the present application, a camera device, a radar and a recovery device are mounted on the intelligent street lamp, so that an aircraft in the designated airspace can be identified, its trajectory predicted, and a recovery operation performed on it, thereby effectively realizing control of the aircraft and reducing the potential safety hazards caused by aircraft misuse.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Corresponding to the above-mentioned embodiment of the airspace control method based on the intelligent street lamp, fig. 3 shows a structural diagram of an embodiment of an airspace control device provided in the embodiment of the present application.
In this embodiment, an airspace control apparatus may include:
the object detection module 301 is used for acquiring an image of a designated airspace through a camera device on the intelligent street lamp and detecting an object in the image of the designated airspace;
an aircraft determining module 302, configured to determine whether an object is an aircraft by using a preset object identification model if the object is detected in the image of the designated airspace;
the flight parameter measuring module 303 is configured to measure a flight parameter of the aircraft through a radar on the smart street lamp if the object is the aircraft;
a flight trajectory prediction module 304, configured to predict a flight trajectory of the aircraft according to the flight parameters, so as to obtain a predicted trajectory of the aircraft;
and an aircraft recovery module 305, configured to perform a recovery operation on the aircraft on the predicted trajectory through a recovery device on the smart street lamp.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Fig. 4 shows a schematic block diagram of an intelligent street lamp provided in an embodiment of the present application, and for convenience of description, only the parts related to the embodiment of the present application are shown.
As shown in fig. 4, the intelligent street lamp 4 of this embodiment includes: a processor 40, a memory 41 and a computer program 42 stored in the memory 41 and executable on the processor 40. The processor 40, when executing the computer program 42, implements the steps of each of the aforementioned embodiments of the intelligent street lamp-based airspace control method, such as steps S101 to S105 shown in fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of each module/unit in the above-mentioned device embodiments, such as the functions of the modules 301 to 305 shown in fig. 3.
Illustratively, the computer program 42 may be partitioned into one or more modules/units that are stored in the memory 41 and executed by the processor 40 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 42 in the intelligent street lamp 4.
Those skilled in the art will appreciate that fig. 4 is only an example of the intelligent street lamp 4 and does not constitute a limitation on it; the intelligent street lamp 4 may include more or fewer components than those shown, or combine certain components, or have different components; for example, it may further include an input/output device, a network access device, a bus, etc.
The processor 40 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The memory 41 may be an internal storage unit of the intelligent street lamp 4, such as a hard disk or a memory of the intelligent street lamp 4. The memory 41 may also be an external storage device of the intelligent street lamp 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card or a flash memory card (Flash Card) provided on the intelligent street lamp 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the intelligent street lamp 4. The memory 41 is used for storing the computer program and other programs and data required by the intelligent street lamp 4. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed device/intelligent street lamp and method can be implemented in other ways. For example, the above-described device/intelligent street lamp embodiments are merely illustrative, and for example, the division of the modules or units is only one logical function division, and there may be other division ways in actual implementation, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated modules/units are implemented in the form of software functional units and sold or used as independent products, they may be stored in a computer-readable storage medium. Based on this understanding, all or part of the flow of the methods of the above embodiments may be implemented by a computer program; the computer program may be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form. The computer-readable storage medium may include any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable storage medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, the computer-readable storage medium does not include electrical carrier signals or telecommunications signals.
The above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not cause the corresponding technical solutions to depart in substance from the spirit and scope of the embodiments of the present application, and are intended to be included within the protection scope of the present application.

Claims (10)

1. An airspace control method based on an intelligent street lamp is characterized by comprising the following steps:
acquiring an image of a designated airspace through a camera device on the intelligent street lamp, and detecting an object in the image of the designated airspace;
if an object is detected in the image of the designated airspace, judging, by using a preset object identification model, whether the object is an aircraft;
if the object is an aircraft, measuring flight parameters of the aircraft through a radar on the intelligent street lamp;
predicting the flight trajectory of the aircraft according to the flight parameters to obtain the predicted trajectory of the aircraft;
and carrying out a recovery operation on the aircraft on the predicted trajectory through a recovery device on the intelligent street lamp.
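To make the recited control flow concrete, the following is a minimal Python sketch of the claim-1 loop; every helper function (detect_object, is_aircraft, measure_flight_parameters, predict_trajectory, recover_on_track) is an illustrative stand-in introduced here for explanation and is not part of the claimed method or of any disclosed implementation.

```python
# Illustrative sketch only: each helper below is a stand-in, not the patented model or device.

def detect_object(image):
    """Stand-in detector: return a detection dict, or None if nothing is found in the image."""
    return {"bbox": (120, 80, 40, 40)} if image is not None else None

def is_aircraft(detection):
    """Stand-in for the preset object identification model."""
    return True

def measure_flight_parameters(detection):
    """Stand-in for the street-lamp radar: (time, x, y, z) samples in seconds and metres."""
    return [(0.0, 50.0, 20.0, 30.0), (1.0, 48.0, 21.0, 30.5)]

def predict_trajectory(flight_params):
    """Stand-in trajectory predictor (claim 9 recites one iterative approach)."""
    return flight_params[-1:]

def recover_on_track(predicted_track):
    """Stand-in for the net-capture recovery operation (detailed in claims 2 to 4)."""
    print("launching recovery device towards", predicted_track[-1])

def regulate_airspace(image):
    detection = detect_object(image)
    if detection is None:
        return                                   # nothing in the designated airspace
    if not is_aircraft(detection):
        return                                   # ignore objects that are not aircraft
    track = predict_trajectory(measure_flight_parameters(detection))
    recover_on_track(track)

regulate_airspace("frame-0001.jpg")              # hypothetical camera frame
```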
2. The intelligent street lamp-based airspace control method according to claim 1, wherein the carrying out of the recovery operation on the aircraft on the predicted trajectory through the recovery device on the intelligent street lamp comprises:
selecting a target track point on the predicted track, and calculating a target time when the aircraft travels to the target track point;
determining the relative position relationship between the recovery device and the target track point;
calculating emission parameters of the recovery device according to the target time and the relative position relationship;
and launching the recovery device according to the launching parameters to perform recovery operation on the aircraft.
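A hedged sketch of the claim-2 planning step is given below; the track format, the 60 m reach, and the choice of the earliest reachable track point as the target are assumptions made only for illustration.

```python
import math

# Illustrative sketch of the claim-2 planning step; the 60 m reach and the "earliest
# reachable point" rule are assumptions, not limitations recited in the claim.

def plan_recovery(predicted_track, device_position, max_range=60.0):
    """predicted_track: list of (t, x, y, z); device_position: (x, y, z) of the recovery device."""
    px, py, pz = device_position
    for t, x, y, z in predicted_track:               # walk the predicted track in time order
        dx, dy, dz = x - px, y - py, z - pz          # relative position of the target point
        if math.sqrt(dx * dx + dy * dy + dz * dz) <= max_range:
            return (x, y, z), t, (dx, dy, dz)        # target track point, target time, relative position
    return None                                      # no reachable point on the predicted track

# A short predicted track and a lamp-mounted recovery device 10 m above the origin:
print(plan_recovery([(2.0, 40.0, 10.0, 30.0), (3.0, 30.0, 8.0, 28.0)], (0.0, 0.0, 10.0)))
```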
3. The intelligent street lamp-based airspace control method according to claim 2, wherein the calculating of the emission parameters of the recovery device according to the target time and the relative position relationship comprises:
establishing a plane rectangular coordinate system on a recovery plane, wherein the recovery plane is a plane determined by the intelligent street lamp and the target track point;
constructing a motion trajectory equation of the recovery device in the plane rectangular coordinate system according to the target time and the relative position relationship;
and solving the motion trajectory equation to obtain the emission parameters of the recovery device.
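The following sketch solves a simple instance of the claim-3 motion trajectory equation in the recovery plane, assuming an unpowered projectile, negligible drag and tether forces, a known flight time, and g = 9.81 m/s^2, all of which are assumptions made for illustration rather than limitations of the claim.

```python
import math

# Simplified claim-3 solve: an unpowered projectile in the recovery plane, no drag, no
# tether force, a given flight time, and g = 9.81 m/s^2 (assumptions for illustration).

def emission_parameters(relative_position, flight_time):
    """relative_position: (dx, dy, dz) from device to target; returns (speed, elevation, azimuth) in SI units/radians."""
    dx, dy, dz = relative_position
    g = 9.81
    d = math.hypot(dx, dy)                  # horizontal distance within the recovery plane
    azimuth = math.atan2(dy, dx)            # orientation of the recovery plane
    # In-plane motion: x(t) = vx * t, z(t) = vz * t - 0.5 * g * t^2; solve at t = flight_time.
    vx = d / flight_time
    vz = (dz + 0.5 * g * flight_time ** 2) / flight_time
    return math.hypot(vx, vz), math.atan2(vz, vx), azimuth

# Target 40 m east, 10 m north and 20 m above the device, to be reached in 2 s:
print(emission_parameters((40.0, 10.0, 20.0), 2.0))
```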
4. The intelligent street lamp-based airspace control method according to claim 2, wherein the recovery device is connected with the intelligent street lamp through a preset connecting wire, and the connecting wire is wound on a rotating shaft of the intelligent street lamp;
during the movement of the recovery device towards the target track point, the method further comprises:
driving the rotating shaft to rotate according to a first rotating direction so as to gradually release the connecting wire;
after the recovery device reaches the target track point, the method further comprises the following steps:
and driving the rotating shaft to rotate according to a second rotating direction so as to gradually recover the connecting wire.
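A minimal sketch of the claim-4 tether handling is shown below; the motor interface, the line length per turn, and the print-based driver are placeholders rather than a disclosed design.

```python
# Illustrative claim-4 tether handling; the motor driver, line length per turn, and the
# forward/reverse encoding are placeholders, not a disclosed design.

FORWARD, REVERSE = +1, -1                    # first and second rotating directions of the shaft

def drive_shaft(direction, turns):
    """Stand-in for the motor driver that rotates the spool by the given number of turns."""
    word = "forward" if direction == FORWARD else "reverse"
    print(f"rotating shaft {word} for {turns:.1f} turns")

def pay_out_wire(distance_m, wire_per_turn_m=0.5):
    """Gradually release the connecting wire while the device flies towards the target point."""
    drive_shaft(FORWARD, distance_m / wire_per_turn_m)

def reel_in_wire(distance_m, wire_per_turn_m=0.5):
    """Gradually recover the connecting wire (and any captured aircraft) after the device arrives."""
    drive_shaft(REVERSE, distance_m / wire_per_turn_m)

pay_out_wire(45.0)
reel_in_wire(45.0)
```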
5. The intelligent street lamp-based airspace control method according to claim 1, further comprising, before predicting the flight trajectory of the aircraft according to the flight parameters:
measuring auxiliary flight parameters of the aircraft through a preset auxiliary radar group, wherein the auxiliary radar group comprises a plurality of auxiliary radars arranged at different preset positions;
respectively determining the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft;
and fusing the auxiliary flight parameters into the flight parameters according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft to obtain fused flight parameters.
6. The intelligent street lamp-based airspace control method according to claim 5, wherein the fusing the auxiliary flight parameters into the flight parameters according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft comprises:
respectively calculating the weight coefficients of each auxiliary radar and the radar on the intelligent street lamp according to the distance between each auxiliary radar in the auxiliary radar group and the aircraft and the distance between the radar on the intelligent street lamp and the aircraft;
and carrying out weighted average on the auxiliary flight parameters and the flight parameters according to the weight coefficients to obtain fused flight parameters.
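The sketch below illustrates one possible reading of the claim-5/6 fusion, using inverse-distance weights followed by a normalised weighted average; the inverse-distance rule is an assumption, since the claims only require weight coefficients derived from the radar-to-aircraft distances.

```python
# One possible reading of the claim-5/6 fusion: inverse-distance weights, then a normalised
# weighted average. The inverse-distance rule is an assumption; the claims only require
# weights derived from the radar-to-aircraft distances.

def fuse_flight_parameters(main_params, main_distance, aux_params_list, aux_distances):
    """main_params / aux_params_list[i]: equal-length tuples such as (x, y, z);
    main_distance / aux_distances[i]: radar-to-aircraft distances in metres."""
    distances = [main_distance] + list(aux_distances)
    weights = [1.0 / d for d in distances]           # closer radars contribute more
    total = sum(weights)
    weights = [w / total for w in weights]           # normalise so the weights sum to 1

    all_params = [main_params] + list(aux_params_list)
    return tuple(sum(w * p[i] for w, p in zip(weights, all_params))
                 for i in range(len(main_params)))

# Street-lamp radar at 50 m plus two auxiliary radars at 80 m and 120 m from the aircraft:
print(fuse_flight_parameters((50.0, 20.0, 30.0), 50.0,
                             [(51.0, 19.5, 30.5), (49.0, 20.5, 29.0)], [80.0, 120.0]))
```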
7. The intelligent street lamp-based airspace control method according to claim 1, further comprising, before measuring the flight parameters of the aircraft by the radar on the intelligent street lamp:
sending identity verification request information to the aircraft through a preset communication frequency, and receiving feedback information of the aircraft;
if the feedback information is received within a preset time length, extracting an aircraft identifier and first verification information corresponding to the aircraft identifier from the feedback information;
performing hash calculation on the aircraft identifier by using a hash function stored in the intelligent street lamp to obtain second verification information corresponding to the aircraft identifier;
and if the first verification information is consistent with the second verification information, judging that the aircraft is a compliant aircraft, and not executing the step of measuring the flight parameters of the aircraft through the radar on the intelligent street lamp or the subsequent steps.
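A hedged Python sketch of the claim-7 compliance check follows; SHA-256 over the UTF-8 identifier is an assumed choice, as the claim only refers to the hash function stored in the intelligent street lamp.

```python
import hashlib

# Hedged sketch of the claim-7 check; SHA-256 over the UTF-8 identifier is an assumed
# choice of "the hash function stored in the intelligent street lamp".

def is_compliant(aircraft_id: str, first_verification_info: str) -> bool:
    """Compare the received verification info against a locally computed hash of the identifier."""
    second_verification_info = hashlib.sha256(aircraft_id.encode("utf-8")).hexdigest()
    return second_verification_info == first_verification_info

# A compliant aircraft would send the hash of its own identifier as the first verification info:
aircraft_id = "UAV-0001"
feedback = hashlib.sha256(aircraft_id.encode("utf-8")).hexdigest()
print(is_compliant(aircraft_id, feedback))           # True -> the recovery steps are not executed
```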
8. The intelligent street lamp-based airspace control method according to claim 7, further comprising, before performing hash calculation on the aircraft identifier by using the hash function stored in the intelligent street lamp:
sending update timestamp query request information to a preset server, and receiving an update timestamp fed back by the server, wherein the update timestamp is the timestamp at which the server updated the hash function;
if the update timestamp fed back by the server is inconsistent with the update timestamp stored in the intelligent street lamp, sending hash function update request information to the server, and receiving the hash function fed back by the server;
and replacing the hash function stored in the intelligent street lamp by using the hash function fed back by the server, and replacing the update timestamp stored in the intelligent street lamp by using the update timestamp fed back by the server.
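The claim-8 synchronisation logic can be sketched as below; the server interface (get_update_timestamp, get_hash_function) and the lamp-side state dictionary are hypothetical names introduced for illustration.

```python
# Hedged sketch of the claim-8 synchronisation; the server methods and the lamp-side
# state dictionary are hypothetical names used only for illustration.

def sync_hash_function(lamp_state, server):
    """Refresh the locally stored hash function if the server has updated it since the last sync."""
    server_ts = server.get_update_timestamp()        # timestamp of the server-side update
    if server_ts != lamp_state["update_timestamp"]:  # local copy is stale
        lamp_state["hash_function"] = server.get_hash_function()
        lamp_state["update_timestamp"] = server_ts   # keep the stored timestamp in step

class _StubServer:                                   # stand-in server for the example below
    def get_update_timestamp(self):
        return "2021-01-27T00:00:00Z"
    def get_hash_function(self):
        return "sha256"                              # placeholder for the distributed hash function

lamp_state = {"update_timestamp": "2020-12-01T00:00:00Z", "hash_function": "sha1"}
sync_hash_function(lamp_state, _StubServer())
print(lamp_state)                                    # stale local copy replaced by the server version
```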
9. The intelligent street lamp-based airspace control method according to any one of claims 1-8, wherein the predicting the flight trajectory of the aircraft according to the flight parameters to obtain the predicted trajectory of the aircraft comprises:
arranging the flight parameters into a reference sequence in chronological order;
processing the reference sequence by using a preset time sequence prediction model to obtain prediction parameters of the aircraft;
calculating the next predicted track point of the aircraft according to the prediction parameters;
adding the prediction parameters into the reference sequence, and returning to the step of processing the reference sequence by using the preset time sequence prediction model and the subsequent steps, until a preset number of predicted track points are obtained;
and determining the predicted track of the aircraft according to the preset number of predicted track points.
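Below is an illustrative rollout of the claim-9 prediction loop; the constant-velocity extrapolator stands in for the preset time sequence prediction model (which could be, for example, a recurrent network), and the (t, x, y, z) track format is assumed.

```python
# Illustrative rollout of the claim-9 loop; the constant-velocity extrapolator below stands
# in for the preset time sequence prediction model, and the (t, x, y, z) format is assumed.

def predict_next(reference_sequence):
    """Stand-in predictor: linearly extrapolate the last step of the sequence."""
    (t0, x0, y0, z0), (t1, x1, y1, z1) = reference_sequence[-2], reference_sequence[-1]
    return (2 * t1 - t0, 2 * x1 - x0, 2 * y1 - y0, 2 * z1 - z0)

def predict_track(flight_params, num_points=5):
    """Feed each prediction back into the reference sequence until num_points are produced."""
    reference = sorted(flight_params)                # arrange the parameters in chronological order
    track = []
    for _ in range(num_points):
        point = predict_next(reference)              # next predicted track point
        track.append(point)
        reference.append(point)                      # claim 9: add the prediction to the sequence
    return track

print(predict_track([(0.0, 50.0, 20.0, 30.0), (1.0, 48.0, 21.0, 30.5)]))
```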
10. An intelligent street lamp comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the steps of the airspace control method according to any one of claims 1 to 9.
CN202110113706.9A 2021-01-27 2021-01-27 Airspace control method based on intelligent street lamp and intelligent street lamp Active CN112908039B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110113706.9A CN112908039B (en) 2021-01-27 2021-01-27 Airspace control method based on intelligent street lamp and intelligent street lamp

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110113706.9A CN112908039B (en) 2021-01-27 2021-01-27 Airspace control method based on intelligent street lamp and intelligent street lamp

Publications (2)

Publication Number Publication Date
CN112908039A true CN112908039A (en) 2021-06-04
CN112908039B CN112908039B (en) 2022-02-25

Family

ID=76119157

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110113706.9A Active CN112908039B (en) 2021-01-27 2021-01-27 Airspace control method based on intelligent street lamp and intelligent street lamp

Country Status (1)

Country Link
CN (1) CN112908039B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113359835A (en) * 2021-06-23 2021-09-07 广东万嘉通通信科技有限公司 Smart rod and distributed cloud system based on smart rod
CN117522633A (en) * 2023-12-29 2024-02-06 宝德照明集团有限公司 Management system based on wisdom street lamp cloud platform

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170261604A1 (en) * 2016-03-11 2017-09-14 Raytheon Bbn Technologies Corp. Intercept drone tasked to location of lidar tracked drone
CN107579817A (en) * 2017-09-12 2018-01-12 广州广电运通金融电子股份有限公司 User ID authentication method, apparatus and system based on block chain
CN108255191A (en) * 2016-12-29 2018-07-06 上海三思电子工程有限公司 Unmanned plane management-control method and wisdom road lamp system
CN110329530A (en) * 2019-07-22 2019-10-15 黑龙江大学 A kind of aerial low-speed unmanned aerial vehicle recyclable device of ejection net catching type
CN110471055A (en) * 2019-07-08 2019-11-19 岭澳核电有限公司 Flying object trajectory predictions method, apparatus, readable storage medium storing program for executing and terminal device
CN110703760A (en) * 2019-10-30 2020-01-17 杭州叙简科技股份有限公司 Newly-increased suspicious object detection method for security inspection robot
CN210070745U (en) * 2019-03-13 2020-02-14 北京天剑维安科技发展有限公司 Novel anti-unmanned aerial vehicle system of intelligent net catch
CN111444760A (en) * 2020-02-19 2020-07-24 天津大学 Traffic sign detection and identification method based on pruning and knowledge distillation

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170261604A1 (en) * 2016-03-11 2017-09-14 Raytheon Bbn Technologies Corp. Intercept drone tasked to location of lidar tracked drone
CN108255191A (en) * 2016-12-29 2018-07-06 上海三思电子工程有限公司 Unmanned plane management-control method and wisdom road lamp system
CN107579817A (en) * 2017-09-12 2018-01-12 广州广电运通金融电子股份有限公司 User ID authentication method, apparatus and system based on block chain
CN210070745U (en) * 2019-03-13 2020-02-14 北京天剑维安科技发展有限公司 Novel anti-unmanned aerial vehicle system of intelligent net catch
CN110471055A (en) * 2019-07-08 2019-11-19 岭澳核电有限公司 Flying object trajectory predictions method, apparatus, readable storage medium storing program for executing and terminal device
CN110329530A (en) * 2019-07-22 2019-10-15 黑龙江大学 A kind of aerial low-speed unmanned aerial vehicle recyclable device of ejection net catching type
CN110703760A (en) * 2019-10-30 2020-01-17 杭州叙简科技股份有限公司 Newly-increased suspicious object detection method for security inspection robot
CN111444760A (en) * 2020-02-19 2020-07-24 天津大学 Traffic sign detection and identification method based on pruning and knowledge distillation

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113359835A (en) * 2021-06-23 2021-09-07 广东万嘉通通信科技有限公司 Smart rod and distributed cloud system based on smart rod
CN117522633A (en) * 2023-12-29 2024-02-06 宝德照明集团有限公司 Management system based on wisdom street lamp cloud platform
CN117522633B (en) * 2023-12-29 2024-03-19 宝德照明集团有限公司 Management system based on wisdom street lamp cloud platform

Also Published As

Publication number Publication date
CN112908039B (en) 2022-02-25

Similar Documents

Publication Publication Date Title
US20210294320A1 (en) Data acquisition method and apparatus
CN112908039B (en) Airspace control method based on intelligent street lamp and intelligent street lamp
EP4072173A1 (en) Data transmission method and device
US10885240B2 (en) Deterministic simulation framework for autonomous vehicle testing
CN112712023B (en) Vehicle type recognition method and system and electronic equipment
CN112885112B (en) Vehicle driving detection method, vehicle driving early warning method and device
CN109544996A (en) A kind of unmanned plane management-control method, device, system and readable storage medium storing program for executing
CN113569406A (en) Data testing method and device based on automatic driving and readable storage medium
WO2021146906A1 (en) Test scenario simulation method and apparatus, computer device, and storage medium
CN113129596A (en) Travel data processing method, travel data processing device, travel data processing apparatus, storage medium, and program product
CN113189989A (en) Vehicle intention prediction method, device, equipment and storage medium
JP2023530731A (en) MAP UPDATE DATA PROCESSING METHOD, APPARATUS AND SYSTEM
CN113793080A (en) Real-time simulation method and device for warehouse operation state
KR20190143832A (en) Method for testing air traffic management electronic system, associated electronic device and platform
KR102323228B1 (en) Safety inspection maintenance method and system for structure using drone
CN117094660A (en) Construction monitoring method and system based on digital twin technology
CN110853364A (en) Data monitoring method and device
CN112509384B (en) Intelligent street lamp-based aircraft control method and intelligent street lamp
CN113793490B (en) Pressure testing method and device for electronic fence, storage medium and terminal
CN114202272A (en) Vehicle and goods matching method and device based on electronic fence, storage medium and terminal
KR102221158B1 (en) Apparatus and method for detecting vehicle type, speed and traffic using radar device and image processing
CN113048988B (en) Method and device for detecting change elements of scene corresponding to navigation map
JP7232727B2 (en) Map data management device and map data management method
CN111739322B (en) Data processing method and device
CN114489714A (en) Vehicle-mounted data processing method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant