CN116506495A - Unmanned aerial vehicle information sharing system, method, device, equipment and storage medium


Info

Publication number
CN116506495A
Authority
CN
China
Prior art keywords
real-time image
unmanned aerial vehicle
target position
Prior art date
Legal status
Pending
Application number
CN202310550897.4A
Other languages
Chinese (zh)
Inventor
王劲
董继鹏
董杰
Current Assignee
Shenzhen Huku Technology Co ltd
Zhejiang Geely Holding Group Co Ltd
Original Assignee
Shenzhen Huku Technology Co ltd
Zhejiang Geely Holding Group Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Huku Technology Co ltd and Zhejiang Geely Holding Group Co Ltd
Priority to CN202310550897.4A
Publication of CN116506495A
Current legal status: Pending


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/50 Network services
    • H04L67/51 Discovery or management thereof, e.g. service location protocol [SLP] or web services
    • H04B TRANSMISSION
    • H04B7/00 Radio transmission systems, i.e. using radiation field
    • H04B7/14 Relay systems
    • H04B7/15 Active relay systems
    • H04B7/185 Space-based or airborne stations; Stations for satellite systems
    • H04B7/18502 Airborne stations
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H04L67/125 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
    • H04L67/53 Network services using third party service providers
    • H04L67/55 Push-based network services
    • H04L67/56 Provisioning of proxy services
    • H04L67/566 Grouping or aggregating service requests, e.g. for unified processing
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses an unmanned aerial vehicle information sharing system, method, device, equipment and storage medium, wherein the method includes the following steps: receiving an operation instruction for a target position sent by a client; and feeding back acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information. The application thereby enables real-time image data of the target position to be viewed at the corresponding client.

Description

Unmanned aerial vehicle information sharing system, method, device, equipment and storage medium
Technical Field
The application relates to the technical field of unmanned aerial vehicles, and in particular to an unmanned aerial vehicle information sharing system, method, device, equipment and storage medium.
Background
At present, existing electronic maps are mainly built by photographing streets with cameras or by panoramic shooting with an unmanned aerial vehicle, and the captured pictures are then uploaded to the corresponding positions of the electronic map to form street views. When using navigation software or an electronic map, a user can click a specific position to view the prestored street-view image. However, because the data shot by the unmanned aerial vehicle are prestored in the cloud, the user can only consult these prestored images through the electronic map and cannot observe a real-time image.
Because the prestored images cannot tell the user what the destination actually looks like at the moment of travel, the user may arrive at a destination only to find that it is overcrowded or that the scenic spot is not open.
Disclosure of Invention
The main purpose of the application is to provide an unmanned aerial vehicle information sharing system, method, device, equipment and storage medium, aiming to solve the technical problem in the related art that a user can only view a prestored street-view image through an electronic map and cannot view real-time image data of a target position.
To achieve the above object, an embodiment of the present application provides an unmanned aerial vehicle information sharing system, including:
the server side, which is used for: receiving an operation instruction for a target position sent by a client; and feeding back acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information; wherein, after receiving the operation instruction, if it is detected that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position, and feeds back real-time image information corresponding to the real-time image data;
and wherein, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, the server side directly determines the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, the historical time period being smaller than a first preset time period;
the unmanned aerial vehicle, which is used for: flying along a preset planned route, shooting real-time image data of a plurality of places on the preset planned route, and sending the real-time image data to a control end; and for processing the real-time image data to obtain real-time image information and feeding the real-time image information back to the server side and the control end respectively;
the control end, which is used for: receiving the real-time image data of the target position fed back by the unmanned aerial vehicle and processing the real-time image data to obtain real-time image information; and for controlling the flight of the unmanned aerial vehicle and judging the take-off environment of the unmanned aerial vehicle.
In order to achieve the above object, an embodiment of the present application provides an unmanned aerial vehicle information sharing method, which is applied to a server side in an unmanned aerial vehicle information sharing system, and the unmanned aerial vehicle information sharing method includes:
Receiving an operation instruction aiming at a target position, which is sent by a client;
feeding back the acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information; after receiving the operation instruction, the server side requests the unmanned aerial vehicle to fly to a target position to acquire real-time image data of the target position and feeds back real-time image information corresponding to the real-time image data if the unmanned aerial vehicle is detected not to be at the target position;
or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, directly determining the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, wherein the historical time period is smaller than a first preset time period.
In one possible implementation manner of the present application, before the step of feeding back the acquired real-time image information of the target location to the client, the method includes any one of the following steps:
receiving real-time image information of the target position sent by an unmanned aerial vehicle, wherein the real-time image information is obtained by processing the unmanned aerial vehicle according to the real-time image data;
And receiving the real-time image information of the target position sent by the control end, wherein the real-time image information is obtained by the control end of the unmanned aerial vehicle processing the real-time image data; the unmanned aerial vehicle sends the captured real-time image data of the target position to the control end for processing, or the unmanned aerial vehicle performs preliminary processing on the real-time image data and then sends it to the control end for further processing so as to obtain the real-time image information.
In one possible implementation manner of the present application, before the step of receiving the operation instruction for the target location sent by the client, the method further includes:
acquiring a feature code of the unmanned aerial vehicle and an unmanned aerial vehicle icon;
and sending the feature code of the unmanned aerial vehicle, the unmanned aerial vehicle icon and the data corresponding to the current position of the unmanned aerial vehicle to the client in real time, so that the client can visually display the unmanned aerial vehicle icon and/or the motion trail of the unmanned aerial vehicle icon, wherein the unmanned aerial vehicle icon moves in real time along the preset planned route.
In a possible implementation manner of the present application, when the real-time image information is sent to a server side by an unmanned aerial vehicle, the unmanned aerial vehicle further sends the real-time image information to the control side for backup.
The application also provides an unmanned aerial vehicle information sharing method, which is applied to the unmanned aerial vehicle in the unmanned aerial vehicle information sharing system, and comprises the following steps:
shooting the target position when the unmanned aerial vehicle is located at the target position or when a shooting request is received, and obtaining real-time image data, wherein the target position is a position where a client clicks to request real-time image information;
extracting features of the real-time image data based on a preset analysis model to obtain density feature information, wherein the density feature information at least comprises a group density map;
and integrating the group density map to obtain real-time image information of the target position, wherein the real-time image information further includes real-time people flow, real-time vehicle flow and/or travel advice for the target position.
In a possible embodiment of the present application, after the step of capturing the target position to obtain real-time image data, the method includes:
extracting picture characteristic information from the real-time image data according to a preset sensitive characteristic extraction model;
comparing the picture characteristic information with preset sensitive information in a preset sensitive characteristic information library, and if the picture characteristic information is determined to contain the sensitive information, blurring or filtering the sensitive data in the picture characteristic information until real-time image information which does not contain the sensitive information is obtained.
The application also provides an unmanned aerial vehicle information sharing device, unmanned aerial vehicle information sharing device still includes:
the receiving module is used for receiving an operation instruction aiming at a target position and sent by the client;
the feedback module is used for feeding back the acquired real-time image information of the target position to the client according to the operation instruction so that the client can make a corresponding trip decision according to the real-time image information;
after receiving the operation instruction, the server side requests the unmanned aerial vehicle to fly to a target position to acquire real-time image data of the target position and feeds back real-time image information corresponding to the real-time image data if the unmanned aerial vehicle is detected not to be at the target position;
or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, directly determining the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, wherein the historical time period is smaller than a first preset time period.
The application also provides unmanned aerial vehicle information sharing equipment, unmanned aerial vehicle information sharing equipment is entity node equipment, unmanned aerial vehicle information sharing equipment includes: the unmanned aerial vehicle information sharing system comprises a memory, a processor and a program of the unmanned aerial vehicle information sharing method, wherein the program of the unmanned aerial vehicle information sharing method is stored in the memory and can run on the processor, and the steps of the unmanned aerial vehicle information sharing method can be realized when the program of the unmanned aerial vehicle information sharing method is executed by the processor.
In order to achieve the above object, there is also provided a storage medium having stored thereon an unmanned aerial vehicle information sharing program which, when executed by a processor, implements the steps of any one of the unmanned aerial vehicle information sharing methods described above.
The application provides an unmanned aerial vehicle information sharing system, method, device, equipment and storage medium. In the related art, when a user views the environment information of a destination, the user can only view prestored images through an electronic map and therefore cannot observe a real-time image. In contrast, in the present application, an operation instruction for a target position sent by a client is received, and acquired real-time image information of the target position is fed back to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information. After receiving the operation instruction, if it is detected that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position and feeds back the real-time image information corresponding to the real-time image data; or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, the server side directly determines the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the corresponding real-time image information, wherein the historical time period is smaller than a first preset time period. In other words, when the unmanned aerial vehicle is not at the target position, it is requested to fly there in real time and shoot real-time image data, which is then fed back to the client; when the unmanned aerial vehicle is at the target position, its real-time image information is acquired directly; and when the unmanned aerial vehicle has flown past the target position within a historical time period shorter than the first preset time period, the image data it shot a short time ago at the target position can be retrieved and still serve as real-time image data of the target position. A user can therefore acquire real-time image information of the target position through the client and make a corresponding trip decision according to that information.
Drawings
Fig. 1 is a schematic flow chart of a first embodiment of the unmanned aerial vehicle information sharing method of the present application;
Fig. 2 is a schematic diagram of a refined flow for receiving real-time image information in a second embodiment of the unmanned aerial vehicle information sharing method of the present application;
Fig. 3 is a schematic diagram of the device structure of the hardware operating environment according to an embodiment of the present application;
Fig. 4 is a schematic diagram of data interaction related to the unmanned aerial vehicle information sharing method of the present application;
Fig. 5 is a general flow diagram related to the unmanned aerial vehicle information sharing method of the present application.
Detailed Description
It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
In a first embodiment of the unmanned aerial vehicle information sharing method, referring to fig. 1, the method is applied to a server side in an unmanned aerial vehicle information sharing system, and includes:
step S10, receiving an operation instruction aiming at a target position, which is sent by a client;
step S20, feeding back the acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information;
After receiving the operation instruction, the server side requests the unmanned aerial vehicle to fly to a target position to acquire real-time image data of the target position and feeds back real-time image information corresponding to the real-time image data if the unmanned aerial vehicle is detected not to be at the target position;
or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, directly determining the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, wherein the historical time period is smaller than a first preset time period.
In this embodiment, the research and development background is aimed at:
when a user uses an electronic map or navigation software, the user can view image information of the target position, but this image information is generally prestored and not real-time; as a result, when the user wants to go to a certain place, the user may arrive only to find that the people flow there is too large, which makes the visiting experience poor and causes considerable inconvenience to the user.
The present embodiment aims at: and receiving real-time image data aiming at the target position and shot by the unmanned aerial vehicle, so that a user corresponding to the client can view the real-time image information of the target position.
The method comprises the following specific steps:
step S10, receiving an operation instruction aiming at a target position, which is sent by a client;
as one example, the location information sharing method may be applied to a location information sharing device that belongs to a location information sharing system that belongs to a location information sharing apparatus.
As an example, the location information sharing method may be applied to a server side, which may be a third party server side subordinate to an electronic map or navigation software.
As one example, the target location is a specific location where the user wants to view real-time image information, and the target location may be a scenic spot, a school, a mountain area, or the like.
As an example, the operation instruction is specifically an instruction issued by a user through a client such as a mobile phone or a computer, and the operation instruction may be to view an image or view a movement track of the unmanned aerial vehicle.
Step S20, feeding back the acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information;
after receiving the operation instruction, the server side requests the unmanned aerial vehicle to fly to a target position to acquire real-time image data of the target position and feeds back real-time image information corresponding to the real-time image data if the unmanned aerial vehicle is detected not to be at the target position;
Or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, directly determining the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, wherein the historical time period is smaller than a first preset time period.
As an example, the server side feeds back the real-time image information of the target position acquired from the unmanned aerial vehicle to the client side according to the operation instruction, so that the user can acquire the real-time image information of the target position.
As an example, there is at least one unmanned aerial vehicle, and each unmanned aerial vehicle is equipped with GPS for collecting its corresponding position information; a plurality of unmanned aerial vehicles are controlled through the control end of the unmanned aerial vehicles, wherein each unmanned aerial vehicle has a corresponding unmanned aerial vehicle feature code, and the feature code serves as the unique identification of the unmanned aerial vehicle for identifying the corresponding unmanned aerial vehicle.
As an example, before starting the unmanned aerial vehicle, the control end needs to confirm whether the current environment of the unmanned aerial vehicle belongs to the no-fly area.
As an example, the no-fly area may be a mountain area, a forest, or the like that is unsuitable for unmanned aerial vehicle flight.
As an example, when the area where the unmanned aerial vehicle is located belongs to the no-fly area, the unmanned aerial vehicle is not started, or the unmanned aerial vehicle flies according to other preset planned routes.
As an example, if the area where the unmanned aerial vehicle is located does not belong to the no-fly area, the control end issues a corresponding flight instruction to the unmanned aerial vehicle, and makes the unmanned aerial vehicle fly according to a preset planned route.
As an example, the control end of the unmanned aerial vehicle is used for judging the area where the unmanned aerial vehicle is located, so that the situation that equipment is damaged due to the fact that the unmanned aerial vehicle flies in the no-fly area is avoided.
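The pre-takeoff judgment described above can be pictured as a simple geofence test. The sketch below is a minimal illustration only: the zone coordinates, the helper names (NO_FLY_ZONES, point_in_polygon, pre_takeoff_check) and the fallback to an alternative planned route are assumptions, not details disclosed by the application.

```python
# Hypothetical sketch of the control end's pre-takeoff geofence check.
NO_FLY_ZONES = [
    # Each zone is a list of (longitude, latitude) vertices; values are illustrative.
    [(114.05, 22.54), (114.08, 22.54), (114.08, 22.57), (114.05, 22.57)],
]

def point_in_polygon(lon, lat, polygon):
    """Ray-casting test: True if the point (lon, lat) lies inside the polygon."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > lat) != (y2 > lat):
            x_cross = x1 + (lat - y1) * (x2 - x1) / (y2 - y1)
            if lon < x_cross:
                inside = not inside
    return inside

def pre_takeoff_check(drone_lon, drone_lat, planned_route, alternative_routes):
    """Return a route the drone may fly, or None if takeoff is not allowed."""
    in_no_fly = any(point_in_polygon(drone_lon, drone_lat, z) for z in NO_FLY_ZONES)
    if not in_no_fly:
        return planned_route          # normal case: fly the preset planned route
    # The drone sits in a no-fly area: do not start, or pick another preset route.
    for route in alternative_routes:
        start_lon, start_lat = route[0]
        if not any(point_in_polygon(start_lon, start_lat, z) for z in NO_FLY_ZONES):
            return route
    return None                       # takeoff refused
```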
As an example, the control end of the unmanned aerial vehicle may be a ground station or a vehicle (hereinafter, the unmanned aerial vehicle control end is simply referred to as a vehicle), and each unmanned aerial vehicle establishes a communication connection with the vehicle or a third party platform, so that the unmanned aerial vehicle transmits the corresponding unmanned aerial vehicle data to the vehicle or the third party platform (a software service platform such as an electronic map).
As an example, the transmitted drone data may be location information, drone feature codes, images or text information, or the like.
As an example, when capturing images, a plurality of unmanned aerial vehicles fly in real time along preset planned routes, so that after determining the corresponding target position the user can find an unmanned aerial vehicle flying near the target position and send a request instruction to it, so that it shoots real-time images of the target position. The manner in which the user views the real-time image of the target position may also be: the user clicks the target position, at which moment the unmanned aerial vehicle is started, flies to the target position to shoot, and then feeds back the real-time image data to the user.
As an example, when a user is linked with the third party platform and the unmanned aerial vehicle through a client, the user selects the position information to be checked through the third party platform. After determining the target position, the user clicks the corresponding target position, and the third party platform sends a request to the vehicle according to the user's operation instruction. The vehicle calls the unmanned aerial vehicle and controls it to fly to the target position to shoot a real-time image, and the unmanned aerial vehicle or the vehicle sends the processed real-time image information to the third party platform. The third party platform analyzes the received real-time image information, where the analysis may be matching the image data according to the position information or processing the real-time image information through a corresponding data path, and feeds the result back to the client.
As an example, a schematic diagram of data interaction related to an unmanned aerial vehicle information sharing method is shown in fig. 4, where a server side may be a third party platform, when a client side corresponding to a user wants to view image data of a target position, an operation instruction is sent to the server side, the server side issues an instruction to a control side, instructs the unmanned aerial vehicle to fly to the target position to shoot an image, processes the shot image data by the unmanned aerial vehicle or the control side, sends the processed image data to the control side and/or the server side, and finally feeds back the processed image data to the client side.
As an example, after the server receives the operation instruction of the client, for the unmanned aerial vehicle, there are three cases:
1. The unmanned aerial vehicle is not at the target position but is flying on a preset planned route; the manner in which the real-time image of the target position is viewed at this time may be: finding the unmanned aerial vehicle flying near the target position and sending a request instruction to it, so that it shoots a real-time image of the target position, and the real-time image information is fed back to the client.
2. The unmanned aerial vehicle is at the target position; the manner in which the real-time image of the target location is viewed at this time may be: and the user clicks the unmanned aerial vehicle icon corresponding to the unmanned aerial vehicle at the target position through the client so as to acquire the real-time image information of the target position shot by the unmanned aerial vehicle.
3. The unmanned aerial vehicle has just flown past the target position a short time ago and is on a preset planned route; the manner in which the real-time image of the target position is viewed at this time may be: clicking the unmanned aerial vehicle on the preset planned route and retrieving the image data of the target position within the historical time period, wherein the historical time period is smaller than the first preset time period (the first preset time period varies with different user requirements and may be, for example, 2 minutes or 10 minutes), so that the image data within the historical time period can also serve as real-time image information of the target position.
As an example, in various cases, the client may obtain real-time data of the corresponding target location through the third party platform and the unmanned aerial vehicle, so as to meet the requirement of the user for viewing the real-time image information of the target location.
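Taken together, the three cases can be read as a single dispatch rule on the server side. The following Python sketch is one hypothetical way to express it; the DroneInfoServer class, its fleet and cache interfaces, and the FIRST_PRESET_PERIOD value are illustrative assumptions rather than the disclosed implementation.

```python
import time

FIRST_PRESET_PERIOD = 10 * 60   # seconds; e.g. 2 or 10 minutes depending on user requirements

class DroneInfoServer:
    """Hypothetical server-side dispatcher covering the three cases above."""

    def __init__(self, drone_fleet, image_cache):
        self.fleet = drone_fleet    # assumed to expose find_at() / find_nearest()
        self.cache = image_cache    # assumed dict: target position -> (timestamp, image_info)

    def handle_operation_instruction(self, client, target_position):
        # Case 2: a drone is already at the target position.
        drone = self.fleet.find_at(target_position)
        if drone is not None:
            return client.send(drone.current_image_info())

        # Case 3: a drone flew past the target within the historical time period.
        cached = self.cache.get(target_position)
        if cached is not None:
            timestamp, image_info = cached
            if time.time() - timestamp < FIRST_PRESET_PERIOD:
                return client.send(image_info)

        # Case 1: no drone at the target; ask the nearest one to fly there and shoot.
        drone = self.fleet.find_nearest(target_position)
        image_info = drone.fly_and_capture(target_position)
        self.cache[target_position] = (time.time(), image_info)
        return client.send(image_info)
```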
The application provides an unmanned aerial vehicle information sharing system, method, device, equipment and storage medium. In the related art, when a user views the environment information of a destination, the user can only view prestored images through an electronic map and therefore cannot observe a real-time image. In contrast, in the present application, an operation instruction for a target position sent by a client is received, and acquired real-time image information of the target position is fed back to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information. After receiving the operation instruction, if it is detected that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position and feeds back the real-time image information corresponding to the real-time image data; or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, the server side directly determines the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the corresponding real-time image information, wherein the historical time period is smaller than a first preset time period. In other words, when the unmanned aerial vehicle is not at the target position, it is requested to fly there in real time and shoot real-time image data, which is then fed back to the client; when the unmanned aerial vehicle is at the target position, its real-time image information is acquired directly; and when the unmanned aerial vehicle has flown past the target position within a historical time period shorter than the first preset time period, the image data it shot a short time ago at the target position can be retrieved and still serve as real-time image data of the target position. A user can therefore acquire real-time image information of the target position through the client and make a corresponding trip decision according to that information.
Further, based on the first embodiment in the present application, another embodiment in the present application is provided, in this embodiment, referring to fig. 2, before the step of feeding back the acquired real-time image information of the target location to the client, the method includes any one of the following:
step S101, receiving real-time image information of the target position sent by an unmanned aerial vehicle, wherein the real-time image information is obtained by processing the unmanned aerial vehicle according to the real-time image data;
as an example, the real-time image information received by the server side may be obtained by the unmanned aerial vehicle processing the captured image data; in this case the unmanned aerial vehicle directly analyzes and processes the real-time image data it captured, and then sends the resulting real-time image information to the server side.
Step S102, receiving real-time image information of the target position sent by a control end, wherein the real-time image information is obtained by processing the real-time image data by a control end of an unmanned aerial vehicle, and the unmanned aerial vehicle sends the real-time image data of the shot target position to the control end for processing, or the unmanned aerial vehicle sends the real-time image data to the control end for further processing after preliminary processing so as to obtain the real-time image information;
As an example, the real-time image information received by the server may be real-time image information of the target location obtained by processing the real-time image data by the control end of the unmanned aerial vehicle.
As an example, the process by which the unmanned aerial vehicle and/or the control end processes the real-time image data to obtain the real-time image information may be: the unmanned aerial vehicle directly sends the captured real-time image data to the control end, and the control end performs the corresponding analysis and processing; or the unmanned aerial vehicle performs preliminary processing on the captured real-time image data, and the preliminarily processed data are sent to the control end for further processing. The second mode is mainly used when the computing capacity of the unmanned aerial vehicle itself is insufficient, so that the unmanned aerial vehicle and the control end process the data jointly and the computing speed for the image data is improved.
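The joint processing described above amounts to a two-stage pipeline: lightweight preliminary work on board, heavier analysis at the control end. The sketch below assumes OpenCV for the image operations and invents the stage boundary (downscaling plus JPEG encoding on the drone, decoding plus analysis at the control end); none of these specifics come from the disclosure.

```python
import cv2            # OpenCV is an assumed choice for the image operations shown here
import numpy as np

def drone_preliminary_processing(frame):
    """On-board step: downscale and JPEG-encode the frame so the downlink stays light."""
    small = cv2.resize(frame, None, fx=0.5, fy=0.5, interpolation=cv2.INTER_AREA)
    ok, encoded = cv2.imencode(".jpg", small, [int(cv2.IMWRITE_JPEG_QUALITY), 80])
    return encoded.tobytes() if ok else None

def control_end_further_processing(payload, analyse):
    """Control-end step: decode the frame and run the heavier analysis
    (e.g. crowd-density estimation, sensitive-region blurring)."""
    frame = cv2.imdecode(np.frombuffer(payload, dtype=np.uint8), cv2.IMREAD_COLOR)
    return analyse(frame)
```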
Before the step of receiving the operation instruction for the target position sent by the client, the method further includes:
step A1, acquiring a feature code of an unmanned aerial vehicle and an unmanned aerial vehicle icon;
as an example, the feature code of the drone is used as a unique identification of the drone for identifying the corresponding drone.
As an example, the unmanned aerial vehicle icon corresponds to the position of the unmanned aerial vehicle, and the server side identifies the unmanned aerial vehicle and displays the position of the unmanned aerial vehicle on the client side through the unmanned aerial vehicle icon.
And step A2, sending the feature code of the unmanned aerial vehicle, the unmanned aerial vehicle icon and the data corresponding to the current position of the unmanned aerial vehicle to the client in real time, so that the client can visually display the unmanned aerial vehicle icon and/or the motion trail of the unmanned aerial vehicle icon, wherein the unmanned aerial vehicle icon moves in real time along the preset planned route.
As an example, the real-time image data shot by the unmanned aerial vehicle is added with the feature code of the unmanned aerial vehicle and the current position of the unmanned aerial vehicle, and the user can check the real-time image data of the corresponding position by clicking the unmanned aerial vehicle icon displayed by the client, and the real-time image data at this time is displayed with the feature code of the unmanned aerial vehicle and the current position of the unmanned aerial vehicle.
As an example, the content visually displayed by the client may be one or both of the unmanned aerial vehicle icon and the motion trail of the unmanned aerial vehicle icon, and the user may choose whether to view the icon or its trail. The unmanned aerial vehicle icon moves in real time along the preset planned route. When the user needs to view image data of a certain position, the user may directly click the unmanned aerial vehicle icon that is moving in real time and issue a corresponding instruction to make the unmanned aerial vehicle fly to the target position. When the user views the motion trail of the unmanned aerial vehicle icon through the client and the trail passes through the target position the user wants to see, the historical image data captured by the unmanned aerial vehicle within the historical time period can be retrieved directly, because the icon moves in real time; and because the historical time period is smaller than the first preset time period, the historical image data retrieved in this way can reflect a real-time image of the target position from a short time ago.
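For the visualization above, the server side has to push each drone's feature code, icon and current position to the client on a regular basis. The sketch below shows one hypothetical JSON payload for that update; every field name and the transport are assumptions made for illustration.

```python
import json
import time

def build_drone_status_message(feature_code, icon_id, lon, lat, route_points):
    """Assemble the per-drone update the client uses to draw the icon and its track."""
    return json.dumps({
        "drone_feature_code": feature_code,    # unique identifier of the drone
        "icon": icon_id,                       # icon shown on the electronic map
        "position": {"lon": lon, "lat": lat},  # current GPS position
        "planned_route": route_points,         # preset planned route for track display
        "timestamp": time.time(),
    })

# Example: the client receives such a message periodically and moves the icon in real time.
msg = build_drone_status_message("UAV-0001", "drone_icon_blue", 114.06, 22.55,
                                 [(114.05, 22.54), (114.07, 22.56)])
```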
In this embodiment, the real-time image data shot by the unmanned aerial vehicle is jointly processed through the unmanned aerial vehicle and/or the control terminal, so that corresponding real-time image information is obtained, and the calculation speed of the real-time image data can be improved.
Further, based on the first embodiment and the second embodiment in the present application, another embodiment of the present application is provided, in this embodiment, the method for sharing information by a drone applied to a drone information sharing system includes:
step B1, shooting the target position when the unmanned aerial vehicle is located at the target position or when a shooting request is received, and obtaining real-time image data, wherein the target position is a position clicked by a client to request real-time image information;
as an example, the shooting request corresponds to an operation instruction of the client, and after receiving the operation instruction, the server side sends the shooting request to the unmanned aerial vehicle, and the unmanned aerial vehicle shoots real-time image data of the target position according to the shooting request.
As an example, the target position is a position where the user wants to view the real-time image, and the user clicks the target position on the third party platform, at this time, the server side sends a request to the control side to call the unmanned aerial vehicle, so that the control side calls the unmanned aerial vehicle to fly to the target position to shoot image data.
Step B2, extracting the characteristics of the real-time image data based on a preset analysis model to obtain density characteristic information, wherein the density characteristic information at least comprises a group density map;
as an example, the preset analysis model is specifically a model with a convolutional neural network as its core. Feature extraction is performed on the real-time image data through the preset analysis model: the real-time image data are converted into a number of pictures, and the head features of people and the features of vehicles in the pictures are extracted through the convolutional neural network, so that the density feature information of the target position can be obtained.
As one example, the density feature information may be a group density map of the target position, and the group density map is used to reflect the real-time people flow and/or vehicle flow of the target position.
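As a rough illustration of a "preset analysis model with a convolutional neural network as its core" that outputs a group density map, the toy network below regresses a one-channel density map from an RGB frame. It is a minimal sketch using PyTorch; the architecture and layer sizes are arbitrary assumptions and not the model described in the application.

```python
import torch
import torch.nn as nn

class DensityNet(nn.Module):
    """Toy stand-in for the preset analysis model: a small fully convolutional
    network that regresses a single-channel group density map from an RGB frame."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 1, kernel_size=1),              # 1-channel density map
        )

    def forward(self, frame):
        return torch.relu(self.features(frame))           # densities are non-negative

# A 1x3xHxW frame in, a downsampled 1x1x(H/4)x(W/4) density map out.
density_map = DensityNet()(torch.rand(1, 3, 240, 320))
```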
And B3, integrating the group density map to obtain real-time image information of the target position, wherein the real-time image information further includes real-time people flow, real-time vehicle flow and/or travel advice for the target position.
As an example, the step of integrating the group density map specifically includes obtaining the total number of people or the total number of vehicles in the picture through integration of the density map, so as to obtain the real-time people flow and the real-time vehicle flow of the target position, and feeding back travel advice for the target position (the advice may be "travel advised" or "travel not advised") according to the corresponding people-flow and vehicle-flow information. Take a scene of analyzing and judging the people flow of a scenic spot from images as an example: the unmanned aerial vehicle shoots an image of the scenic spot, the image is analyzed by the unmanned aerial vehicle or by the vehicle/ground station, the number of faces in the image is identified, and the people flow is judged according to the number of faces, the maximum number of people the scenic spot can accommodate and the open area. If the number of faces/heads is large but the open area is small, the people flow is judged to be high and the specific number of people is counted; if the number of faces/heads is large and the open area is also large, the people flow is considered medium; and if the number of heads/faces is small and the open area is large, the people flow is low. The user can view the scene shot by the unmanned aerial vehicle in real time by clicking the unmanned aerial vehicle icon on the electronic map, and at the same time the user side can receive the people-flow situation obtained from image recognition, analysis and statistics. The analysis is not limited to recognizing people flow; it can also recognize whether a scenic spot is open, and so on. For example, if the user intends to see rape flowers, the unmanned aerial vehicle can be used to shoot in real time whether the flowers are in full bloom and whether they are worth viewing on site, which helps the user's decision. The above examples are not limited to judging people flow: vehicle flow and other information that users could otherwise only obtain by visiting in person can also be judged, and the scene can be seen intuitively by clicking the unmanned aerial vehicle icon on the map, which is equivalent to seeing the shooting result and knowing the landscape of the area through the unmanned aerial vehicle on the map without going out.
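The integration and judgment logic described above can be sketched as follows: summing the density map gives the estimated head (or vehicle) count, and the count is compared against the open area and maximum capacity to classify the people flow and produce travel advice. The thresholds, the open_area_m2 parameter and the function names are illustrative assumptions.

```python
import numpy as np

def integrate_density_map(density_map):
    """Estimated number of people (or vehicles): the integral of the density map."""
    return float(np.sum(density_map))

def flow_level(count, open_area_m2, capacity):
    """Rough classification following the reasoning in the text above."""
    density = count / max(open_area_m2, 1.0)     # people per square metre (assumed units)
    if count >= capacity or density > 0.5:
        return "high"
    if density > 0.1:
        return "medium"
    return "low"

def travel_advice(count, open_area_m2, capacity):
    level = flow_level(count, open_area_m2, capacity)
    return "travel not advised" if level == "high" else "travel advised"

# Example with a fake 2x2 density map (values are per-cell head densities).
crowd = integrate_density_map(np.array([[0.3, 0.7], [1.2, 0.8]]))   # ~3 people
print(travel_advice(crowd, open_area_m2=50.0, capacity=200))
```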
After the step of shooting the target position to obtain the real-time image data, the method further includes:
step C1, extracting picture feature information from the real-time image data according to a preset sensitive feature extraction model;
as an example, the sensitive feature extraction model is specifically a deep reinforcement learning model, and the sensitive feature extraction model is used for extracting the picture feature information of the real-time image data.
And C2, comparing the picture characteristic information with preset sensitive information in a preset sensitive characteristic information library, and blurring or filtering the sensitive data in the picture characteristic information if the picture characteristic information is determined to contain the sensitive information until the real-time image information which does not contain the sensitive information is obtained.
As an example, the picture feature information is compared with preset sensitive information in a preset sensitive feature information library, and when the picture feature information is determined to contain the sensitive information, desensitization processing is required to be performed on the picture feature information, so that the obtained real-time image information is prevented from containing the sensitive information.
As an example, in the desensitization processing, the single pictures in the picture feature information are compared, if all the areas of the single pictures are sensitive information, the picture containing the sensitive information is directly filtered, and if the partial areas of the single pictures are sensitive information, the partial areas in the picture are subjected to blurring processing.
As an example, after the desensitization processing of the real-time image data is completed, there is also a desensitization review processing of the real-time image data until the real-time image information does not contain sensitive information.
As an example, the desensitization processing and the people-flow analysis of the real-time image data can be completed by the unmanned aerial vehicle, or the unmanned aerial vehicle can send the captured image data to the control end for processing. Whether the unmanned aerial vehicle or the control end does the processing, the control end stores both the desensitized image data and the image data before desensitization, which facilitates subsequent retrieval of the image data.
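A minimal sketch of the blur-or-filter rule described above is given below, assuming the sensitive-feature extraction step yields bounding boxes for the sensitive regions and using OpenCV's Gaussian blur; the box format and the criterion for filtering out the whole picture are assumptions.

```python
import cv2

def desensitize(frame, sensitive_boxes):
    """Blur sensitive regions, or drop the frame entirely if it is all sensitive.

    sensitive_boxes: list of (x, y, w, h) regions flagged against the preset
    sensitive-feature library (assumed format).
    Returns the cleaned frame, or None if the whole picture must be filtered out.
    """
    h, w = frame.shape[:2]
    frame_area = h * w
    covered = sum(bw * bh for (_, _, bw, bh) in sensitive_boxes)
    if covered >= frame_area:            # the entire picture is sensitive: filter it out
        return None
    out = frame.copy()
    for (x, y, bw, bh) in sensitive_boxes:
        region = out[y:y + bh, x:x + bw]
        out[y:y + bh, x:x + bw] = cv2.GaussianBlur(region, (31, 31), 0)
    return out
```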
In this embodiment, by analyzing and processing the image shot by the unmanned aerial vehicle, the corresponding traffic flow or traffic flow information of the target position can be obtained, so that the user can more intuitively understand the real-time environment of the target position, and according to the real-time environment, a corresponding trip decision is made, and the transmission of sensitive data can be avoided by performing desensitization processing on the real-time image data.
Further, based on the first, second and third embodiments in the present application, another embodiment of the present application is provided, in which, when the real-time image information is sent to the server side by the unmanned aerial vehicle, the unmanned aerial vehicle further sends the real-time image information to the control side for backup.
As an example, when the real-time image information is sent to the server by the unmanned aerial vehicle, the unmanned aerial vehicle processes the real-time image data, and at this time, the unmanned aerial vehicle sends the real-time image information to the server and sends the real-time image information to the control end for backup, so that the corresponding position data can be conveniently and subsequently invoked.
As an example, the data sent to the control end by the unmanned aerial vehicle may be real-time image data, real-time image information, sensitive image information, image information after desensitization, etc., and the unmanned aerial vehicle may perform data backup before sending the data to the server end, so as to avoid the failure to retrieve when the information is lost.
In this embodiment, when the unmanned aerial vehicle processes the real-time image data alone, the real-time image data or the real-time image information is sent to the control end at the same time, so as to perform data backup, and facilitate subsequent call of the historical image data.
As an example, an implementation flow diagram of the unmanned aerial vehicle information sharing method is shown in fig. 5. As can be seen from fig. 5, when the user has not issued an operation instruction, the unmanned aerial vehicle flies in real time along the preset planned route. After receiving the user's operation instruction for a target position, if the unmanned aerial vehicle is at the target position it directly shoots the target position; if it is not at the target position it flies there to shoot. Position information (longitude and latitude information and the like) and the unmanned aerial vehicle feature code are added to the real-time image data obtained by shooting, which facilitates recognition of subsequent images. The real-time image data are then analyzed or desensitized by the unmanned aerial vehicle or the control end to obtain real-time image information. After the real-time image information is sent to the server side, the control end also backs it up, and the server side decodes the received real-time image information and feeds it back to the corresponding client according to the corresponding unmanned aerial vehicle feature code and the position information of the target position.
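The tagging and routing step in this flow (attach the feature code and longitude/latitude to the image information, then let the server side match it to the client that asked about that position) could look roughly like the sketch below; the pending-request dictionary, the coordinate rounding and all names are hypothetical.

```python
def tag_image_info(image_info, feature_code, lon, lat):
    """Attach the drone feature code and longitude/latitude before sending upstream."""
    return {"feature_code": feature_code, "lon": lon, "lat": lat, "payload": image_info}

class RequestRouter:
    """Server-side lookup: which client is waiting for imagery of which position."""
    def __init__(self):
        self.pending = {}                          # (rounded lon, lat) -> client

    def register(self, client, lon, lat):
        self.pending[(round(lon, 4), round(lat, 4))] = client

    def route(self, tagged):
        key = (round(tagged["lon"], 4), round(tagged["lat"], 4))
        client = self.pending.pop(key, None)
        if client is not None:
            client.send(tagged["payload"])         # feed back to the requesting client
```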
Referring to fig. 3, fig. 3 is a schematic device structure diagram of a hardware running environment according to an embodiment of the present application.
As shown in fig. 3, the unmanned aerial vehicle information sharing apparatus may include: a processor 1001, a memory 1005, and a communication bus 1002. The communication bus 1002 is used to enable connected communication between the processor 1001 and the memory 1005.
Optionally, the unmanned aerial vehicle information sharing device may further include a user interface, a network interface, a camera, an RF (Radio Frequency) circuit, a sensor, a WiFi module, and so on. The user interface may include a Display, an input sub-module such as a Keyboard (Keyboard), and the optional user interface may also include a standard wired interface, a wireless interface. The network interface may include a standard wired interface, a wireless interface (e.g., WI-FI interface).
It will be appreciated by those skilled in the art that the unmanned aerial vehicle information sharing device structure shown in fig. 3 does not constitute a limitation of the unmanned aerial vehicle information sharing device, and the device may include more or fewer components than shown, or combine certain components, or have a different arrangement of components.
As shown in fig. 3, an operating system, a network communication module, and a drone information sharing program may be included in the memory 1005 as one type of storage medium. The operating system is a program that manages and controls the hardware and software resources of the unmanned aerial vehicle information sharing device, supporting the operation of the unmanned aerial vehicle information sharing program and other software and/or programs. The network communication module is used for realizing communication among components in the memory 1005 and communication with other hardware and software in the unmanned aerial vehicle information sharing system.
In the unmanned aerial vehicle information sharing apparatus shown in fig. 3, the processor 1001 is configured to execute the unmanned aerial vehicle information sharing program stored in the memory 1005, and implement the steps of the unmanned aerial vehicle information sharing method described in any one of the above.
The specific implementation manner of the unmanned aerial vehicle information sharing device is basically the same as the above embodiments of the unmanned aerial vehicle information sharing method, and will not be described herein again.
The application provides an unmanned aerial vehicle information sharing system, and the unmanned aerial vehicle information sharing system includes:
the server side, which is used for: receiving an operation instruction for a target position sent by a client; and feeding back acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information; wherein, after receiving the operation instruction, if it is detected that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position, and feeds back real-time image information corresponding to the real-time image data;
and wherein, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, the server side directly determines the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, the historical time period being smaller than a first preset time period;
the unmanned aerial vehicle, which is used for: flying along a preset planned route, shooting real-time image data of a plurality of places on the preset planned route, and sending the real-time image data to the control end; and for processing the real-time image data to obtain real-time image information and feeding the real-time image information back to the server side and the control end respectively;
the control end, which is used for: receiving the real-time image data of the target position fed back by the unmanned aerial vehicle and processing the real-time image data to obtain real-time image information; and for controlling the flight of the unmanned aerial vehicle and judging the take-off environment of the unmanned aerial vehicle.
In a possible implementation manner of the present application, the server side is further configured to:
receiving an operation instruction aiming at a target position, which is sent by a client;
feeding back the acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding trip decision according to the real-time image information; after receiving the operation instruction, the server side requests the unmanned aerial vehicle to fly to a target position to acquire real-time image data of the target position and feeds back real-time image information corresponding to the real-time image data if the unmanned aerial vehicle is detected not to be at the target position;
Or, if it is detected that the unmanned aerial vehicle is at the target position or has flown past the target position within a historical time period, directly determining the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the real-time image information corresponding to the real-time image data, wherein the historical time period is smaller than a first preset time period.
In a possible implementation of the present application, the server side is further configured to:
receive real-time image information of the target position sent by the unmanned aerial vehicle, the real-time image information being obtained by the unmanned aerial vehicle processing the real-time image data;
and receive real-time image information of the target position sent by the control end, the real-time image information being obtained by the control end of the unmanned aerial vehicle processing the real-time image data, wherein the unmanned aerial vehicle either sends the captured real-time image data of the target position to the control end for processing, or performs preliminary processing on the real-time image data and then sends it to the control end for further processing to obtain the real-time image information.
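As one way to picture the split-processing path just described, the short Python sketch below (not part of the disclosure) has the unmanned aerial vehicle perform a preliminary pass on board and the control end complete the processing; the function names and the use of compression as the preliminary step are assumptions for illustration.

    # Sketch of the split-processing path: the unmanned aerial vehicle performs a
    # preliminary pass on board (compression is an assumed example) and the control
    # end completes the processing before the result reaches the server side.
    import zlib

    def drone_preliminary_processing(raw_frame: bytes) -> bytes:
        """On-board first pass over the captured real-time image data."""
        return zlib.compress(raw_frame)

    def control_end_further_processing(preliminary: bytes) -> dict:
        """Control end finishes processing and produces the real-time image information."""
        frame = zlib.decompress(preliminary)
        return {"bytes": len(frame), "summary": "real-time image information for the server side"}

    info = control_end_further_processing(drone_preliminary_processing(b"<raw jpeg bytes>"))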
In a possible implementation of the present application, the server side is further configured to:
acquire a feature code of the unmanned aerial vehicle and an unmanned aerial vehicle icon;
and send the data corresponding to the current position of the unmanned aerial vehicle, together with the added feature code of the unmanned aerial vehicle and the unmanned aerial vehicle icon, to the client in real time, so that the client can visually display the unmanned aerial vehicle icon and/or the motion trail of the unmanned aerial vehicle icon, wherein the unmanned aerial vehicle icon moves along the preset planned route in real time.
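The following short Python sketch illustrates, under assumed field names and placeholder values, the kind of payload the server side could push to the client so that the client can draw the unmanned aerial vehicle icon and its motion trail; it is illustrative only and not the disclosed format.

    # Illustrative telemetry payload for the client-side display of the drone icon
    # and its motion trail; all field names and values are assumptions.
    import json
    import time

    def build_telemetry(feature_code: str, icon_url: str, position: tuple) -> str:
        """Attach the drone's feature code and icon to its current position data."""
        return json.dumps({
            "feature_code": feature_code,
            "icon": icon_url,
            "position": {"lat": position[0], "lon": position[1]},
            "ts": time.time(),
        })

    # On the client, each received position is appended to a polyline that renders
    # the motion trail of the unmanned aerial vehicle icon along the planned route.
    trail = []

    def on_client_receive(message: str) -> None:
        point = json.loads(message)["position"]
        trail.append((point["lat"], point["lon"]))

    on_client_receive(build_telemetry("UAV-0001", "https://example.com/uav.png", (30.0, 120.0)))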
In a possible embodiment of the present application, the unmanned aerial vehicle is further configured to:
capture the target position when it is located at the target position or when a capture request is received, to obtain real-time image data, the target position being a position for which the client has clicked to request real-time image information;
extract features from the real-time image data based on a preset analysis model to obtain density feature information, the density feature information including at least a group density map;
and integrate the group density map to obtain real-time image information of the target position, the real-time image information further including real-time people flow, real-time vehicle flow, and/or travel advice for the target position.
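As one way to picture the density-feature step, the sketch below assumes a generic crowd-density model exposing a Keras-style predict method; the model, the 200-person threshold, and the advice strings are illustrative assumptions, not part of the disclosure.

    # Sketch of the density-feature step: a generic crowd-density model (assumed to
    # expose a Keras-style predict method) yields a group density map whose integral
    # approximates the head count; thresholds and advice strings are assumptions.
    import numpy as np

    def extract_density_map(frame: np.ndarray, model) -> np.ndarray:
        """Run the preset analysis model on one frame to get a per-pixel group density map."""
        return model.predict(frame[np.newaxis, ...])[0]          # shape (H, W)

    def integrate_density_map(density: np.ndarray) -> dict:
        """Integrate (sum) the density map and derive simple travel advice from the count."""
        people_count = float(density.sum())
        advice = "crowded - consider another time" if people_count > 200 else "normal flow"
        return {"real_time_people_flow": people_count, "travel_advice": advice}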
In a possible embodiment of the present application, the unmanned aerial vehicle is further configured to:
extract picture feature information from the real-time image data according to a preset sensitive-feature extraction model;
compare the picture feature information with preset sensitive information in a preset sensitive-feature information library, and, if the picture feature information is determined to contain sensitive information, blur or filter the sensitive data in the picture feature information until real-time image information that does not contain sensitive information is obtained.
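The sketch below illustrates the blurring step, using face regions as one assumed example of sensitive content and OpenCV's bundled Haar cascade as a stand-in for the preset sensitive-feature extraction model; it is not the disclosed model.

    # Sketch of the sensitive-information filtering step, with faces as an assumed
    # example of sensitive content and a Haar cascade standing in for the preset
    # sensitive-feature extraction model.
    import cv2

    _cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    def blur_sensitive_regions(frame):
        """Blur every detected sensitive region so the shared image contains no sensitive data."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in _cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
            frame[y:y + h, x:x + w] = cv2.GaussianBlur(frame[y:y + h, x:x + w], (51, 51), 0)
        return frame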
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or system that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or system. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or system that comprises that element.
The above embodiment numbers of the present application are for description only and do not represent the relative merits of the embodiments.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware alone, although in many cases the former is preferred. Based on this understanding, the technical solution of the present application, or the part that contributes to the prior art, may be embodied in the form of a software product stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods described in the embodiments of the present application.
The above is only a preferred embodiment of the present application and does not limit the scope of the claims; any equivalent structure or equivalent process transformation made using the description and drawings of the present application, or any direct or indirect application in other related technical fields, is likewise included within the scope of protection of the present application.

Claims (10)

1. An unmanned aerial vehicle information sharing system, the unmanned aerial vehicle information sharing system comprising:
a server side, configured to: receive an operation instruction for a target position sent by a client; and feed back acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding travel decision based on the real-time image information; wherein, after receiving the operation instruction, if the server side detects that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position, and feeds back real-time image information corresponding to that real-time image data;
and, if it is detected that the unmanned aerial vehicle is at the target position or has flown over the target position within a historical time period, directly determine the real-time image data of the target position fed back by the unmanned aerial vehicle and feed back the corresponding real-time image information, wherein the historical time period is shorter than a first preset time period;
an unmanned aerial vehicle, configured to: fly along a preset planned route, capture real-time image data of a plurality of places on the preset planned route, and send the real-time image data to a control end; and further configured to process the real-time image data to obtain real-time image information and feed the real-time image information back to the server side and the control end respectively;
a control end, configured to: receive the real-time image data of the target position fed back by the unmanned aerial vehicle and process the real-time image data to obtain real-time image information; and further configured to control the flight of the unmanned aerial vehicle and to evaluate its take-off environment.
2. An unmanned aerial vehicle information sharing method, characterized in that it is applied to the server side of an unmanned aerial vehicle information sharing system, the unmanned aerial vehicle information sharing method comprising the following steps:
receiving an operation instruction for a target position sent by a client;
feeding back the acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding travel decision based on the real-time image information; wherein, after receiving the operation instruction, if the server side detects that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position, and feeds back real-time image information corresponding to that real-time image data;
or, if it is detected that the unmanned aerial vehicle is at the target position or has flown over the target position within a historical time period, directly determining the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the corresponding real-time image information, wherein the historical time period is shorter than a first preset time period.
3. The unmanned aerial vehicle information sharing method of claim 2, wherein, before the step of feeding back the acquired real-time image information of the target position to the client, the method comprises any one of the following:
receiving real-time image information of the target position sent by the unmanned aerial vehicle, the real-time image information being obtained by the unmanned aerial vehicle processing the real-time image data;
or receiving real-time image information of the target position sent by the control end, the real-time image information being obtained by the control end of the unmanned aerial vehicle processing the real-time image data, wherein the unmanned aerial vehicle either sends the captured real-time image data of the target position to the control end for processing, or performs preliminary processing on the real-time image data and then sends it to the control end for further processing to obtain the real-time image information.
4. The unmanned aerial vehicle information sharing method of claim 2 or 3, wherein the step of receiving the operation instruction for the target position sent by the client further comprises:
acquiring a feature code and/or an unmanned aerial vehicle icon of the unmanned aerial vehicle;
and sending the data corresponding to the current position of the unmanned aerial vehicle, together with the added feature code of the unmanned aerial vehicle and the unmanned aerial vehicle icon, to the client in real time, so that the client can visually display the unmanned aerial vehicle icon and/or the motion trail of the unmanned aerial vehicle icon, wherein the unmanned aerial vehicle icon moves along the preset planned route in real time.
5. The unmanned aerial vehicle information sharing method of claim 3, wherein, when the real-time image information is sent by the unmanned aerial vehicle to the server side, the unmanned aerial vehicle further sends the real-time image information to the control end for backup.
6. An unmanned aerial vehicle information sharing method, characterized in that it is applied to the unmanned aerial vehicle in an unmanned aerial vehicle information sharing system, the unmanned aerial vehicle information sharing method comprising the following steps:
capturing the target position when the unmanned aerial vehicle is located at the target position or when a capture request is received, to obtain real-time image data, the target position being a position for which the client has clicked to request real-time image information;
extracting features from the real-time image data based on a preset analysis model to obtain density feature information, the density feature information including at least a group density map;
and integrating the group density map to obtain real-time image information of the target position, the real-time image information further including real-time people flow, real-time vehicle flow, and/or travel advice for the target position.
7. The unmanned aerial vehicle information sharing method of claim 6, wherein, after the step of capturing the target position to obtain real-time image data, the method comprises:
extracting picture feature information from the real-time image data according to a preset sensitive-feature extraction model;
and comparing the picture feature information with preset sensitive information in a preset sensitive-feature information library, and, if the picture feature information is determined to contain sensitive information, blurring or filtering the sensitive data in the picture feature information until real-time image information that does not contain sensitive information is obtained.
8. An unmanned aerial vehicle information sharing device, characterized in that the unmanned aerial vehicle information sharing device comprises:
a receiving module, configured to receive an operation instruction for a target position sent by a client;
a feedback module, configured to feed back the acquired real-time image information of the target position to the client according to the operation instruction, so that the client can make a corresponding travel decision based on the real-time image information;
wherein, after receiving the operation instruction, if the server side detects that the unmanned aerial vehicle is not at the target position, the server side requests the unmanned aerial vehicle to fly to the target position to acquire real-time image data of the target position and feeds back real-time image information corresponding to that real-time image data;
or, if it is detected that the unmanned aerial vehicle is at the target position or has flown over the target position within a historical time period, directly determines the real-time image data of the target position fed back by the unmanned aerial vehicle so as to feed back the corresponding real-time image information, wherein the historical time period is shorter than a first preset time period.
9. Unmanned aerial vehicle information sharing equipment, characterized in that the equipment comprises: a memory, a processor, and an unmanned aerial vehicle information sharing program stored on the memory and executable on the processor, wherein the unmanned aerial vehicle information sharing program is configured to implement the steps of the unmanned aerial vehicle information sharing method of any one of claims 2 to 7.
10. A computer-readable storage medium, characterized in that an unmanned aerial vehicle information sharing program is stored on the computer-readable storage medium, and when executed by a processor, the unmanned aerial vehicle information sharing program implements the steps of the unmanned aerial vehicle information sharing method of any one of claims 2 to 7.
CN202310550897.4A 2023-05-16 2023-05-16 Unmanned aerial vehicle information sharing system, unmanned aerial vehicle information sharing method, unmanned aerial vehicle information sharing device, unmanned aerial vehicle information sharing equipment and storage medium Pending CN116506495A (en)

Priority Applications (1)

Application Number: CN202310550897.4A (publication CN116506495A) | Priority Date: 2023-05-16 | Filing Date: 2023-05-16 | Title: Unmanned aerial vehicle information sharing system, unmanned aerial vehicle information sharing method, unmanned aerial vehicle information sharing device, unmanned aerial vehicle information sharing equipment and storage medium

Applications Claiming Priority (1)

Application Number: CN202310550897.4A (publication CN116506495A) | Priority Date: 2023-05-16 | Filing Date: 2023-05-16 | Title: Unmanned aerial vehicle information sharing system, unmanned aerial vehicle information sharing method, unmanned aerial vehicle information sharing device, unmanned aerial vehicle information sharing equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116506495A true CN116506495A (en) 2023-07-28

Family

ID=87324718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310550897.4A Pending CN116506495A (en) 2023-05-16 2023-05-16 Unmanned aerial vehicle information sharing system, unmanned aerial vehicle information sharing method, unmanned aerial vehicle information sharing device, unmanned aerial vehicle information sharing equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116506495A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination