CN111739346A - Air-ground cooperative scheduling command method and platform system - Google Patents

Info

Publication number
CN111739346A
Authority
CN
China
Prior art keywords
information
target
event
unmanned aerial
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010393924.8A
Other languages
Chinese (zh)
Inventor
曾崛
贾泽露
赖海斌
潘梓颖
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Zhongke Baotai Aerospace Technology Co.,Ltd.
Original Assignee
Shenzhen Zhongke Baotai Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Zhongke Baotai Technology Co ltd filed Critical Shenzhen Zhongke Baotai Technology Co ltd
Priority to CN202010393924.8A
Publication of CN111739346A
Legal status: Pending

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/0047 Navigation or guidance aids for a single aircraft
    • G08G 5/0069 Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B 7/00 Radio transmission systems, i.e. using radiation field
    • H04B 7/14 Relay systems
    • H04B 7/15 Active relay systems
    • H04B 7/185 Space-based or airborne stations; Stations for satellite systems
    • H04B 7/18502 Airborne stations
    • H04B 7/18506 Communications with or from aircraft, i.e. aeronautical mobile service
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/50 Network services
    • H04L 67/52 Network services specially adapted for the location of the user terminal
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/14 Systems for two-way working
    • H04N 7/15 Conference systems
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00 Television systems
    • H04N 7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources

Landscapes

  • Engineering & Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Multimedia (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Astronomy & Astrophysics (AREA)
  • Traffic Control Systems (AREA)

Abstract

The application discloses an air-ground cooperative scheduling command method and a platform system. The method comprises the following steps: determining event information of a target event, wherein the event information comprises position information; determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range; determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning a route from the target unmanned aerial vehicle to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle; after sending a scheduling instruction to the camera-enabled mobile terminal of the target personnel and to the target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target personnel and global information returned by the target unmanned aerial vehicle; and displaying the global information and the local information. By means of the method and the device, the global information and the local information of the scene of a target event (such as an emergency like a fire or a traffic accident) can be acquired in a timely and effective manner, facilitating command and dispatch work.

Description

Air-ground cooperative scheduling command method and platform system
Technical Field
The application belongs to the technical field of unmanned aerial vehicles and air-ground cooperation, and particularly relates to an air-ground cooperative dispatching command method and a platform system.
Background
At present, when a target event occurs, especially an emergency such as a fire or a traffic accident, it is mainly ground personnel who drive to the scene to check the situation and either report back or direct the handling of the event. On the one hand, because of road construction, traffic congestion, and similar problems, this approach cannot respond quickly; that is, the target event cannot be discovered, handled, and resolved at the earliest possible time. On the other hand, after arriving at the scene, ground personnel may be limited by terrain and other factors, and it is difficult for the dispatching command center to obtain a global view from their video feeds, which hinders the command of event handling.
Disclosure of Invention
The embodiment of the application provides an air-ground cooperative scheduling command method and a platform system, which are used for solving the problem that global information and local information of a target event site cannot be timely and effectively acquired in the prior art.
In a first aspect, an embodiment of the present application provides an air-ground cooperative scheduling command method, which may include:
determining event information of a target event, wherein the event information comprises position information used for indicating the occurrence place of the target event;
determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is a region range including a target event occurrence place, and the scheduling information comprises unmanned aerial vehicle information and personnel information in the event range;
determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning to obtain a route from the target unmanned aerial vehicle to a target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle;
after sending a scheduling instruction to a camera-enabled mobile terminal of a target person and a target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle; the scheduling instructions comprise instructions for instructing target personnel to reach the target event occurrence location and instructions for instructing the target unmanned aerial vehicle to use the air route to reach the target event occurrence location;
and displaying the global information and the local information.
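Taken together, the first-aspect steps can be summarized as a small orchestration sketch in Python. The Platform class and every method body below are illustrative stand-ins for subsystems the patent describes, not a real API:

```python
# Hedged end-to-end sketch of the claimed method flow. All names and the
# trivial method bodies are assumptions made for illustration only.

class Platform:
    def determine_range(self, loc):          # event range around the location
        return {"center": loc, "radius_km": 2.0}
    def survey_info(self, rng):              # no-fly zones, obstacles, signals
        return {"no_fly": [], "obstacles": []}
    def scheduling_info(self, rng):          # drones/personnel inside the range
        return {"drones": ["uav-1"], "people": ["p-1"]}
    def select_targets(self, sched, loc):    # nearest drone and person
        return sched["drones"][0], sched["people"][0]
    def plan_route(self, survey, drone, loc):
        return ["takeoff", loc]
    def dispatch_and_display(self, person, drone, route):
        # send instructions, then receive local (person) + global (drone) feeds
        return {"global": f"video from {drone}", "local": f"video from {person}"}

def run_dispatch(event_info, platform):
    loc = event_info["location"]                          # determine event info
    rng = platform.determine_range(loc)                   # determine event range
    survey = platform.survey_info(rng)
    sched = platform.scheduling_info(rng)
    drone, person = platform.select_targets(sched, loc)   # pick the targets
    route = platform.plan_route(survey, drone, loc)       # plan the drone route
    return platform.dispatch_and_display(person, drone, route)

screens = run_dispatch({"location": (22.54, 114.05), "type": "fire"}, Platform())
print(screens["global"], "|", screens["local"])
```

A real system would replace each stub with the corresponding subsystem (survey database, position reporting, route planner, video push).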
Therefore, by dispatching an unmanned aerial vehicle near the place where the target event occurs to collect global information, and dispatching target personnel nearby to travel to the scene to collect local information, the global information and the local information of the target event (especially emergencies such as fires and traffic accidents) can be acquired in a timely and effective manner, which facilitates the user's command and scheduling work.
In a possible implementation manner of the first aspect, determining the target drone and the target person according to the scheduling information and the location information includes:
calculating the distance between the position of each person in the event range and the occurrence place of the target event according to the person position information in the person information and the position information of the target event;
calculating the distance between the position of each unmanned aerial vehicle and the occurrence place of the target event in the event range according to the position information of the unmanned aerial vehicle and the position information of the target event in the unmanned aerial vehicle information;
selecting a person closest to the target event occurrence place in the event range as a target person according to the distance between the position of each person in the event range and the target event occurrence place;
and selecting the unmanned aerial vehicle closest to the target event occurrence place in the event range as the target unmanned aerial vehicle according to the distance between the position of each unmanned aerial vehicle in the event range and the target event occurrence place.
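A minimal sketch of this nearest-candidate selection in Python. The data layout and the planar distance helper are assumptions for illustration; the patent itself computes a great-circle distance:

```python
def nearest(candidates, event_pos, dist_fn):
    """Return the candidate whose position is closest to the event location."""
    return min(candidates, key=lambda c: dist_fn(c["pos"], event_pos))

def euclid(p, q):
    # planar placeholder; the patent uses a great-circle distance instead
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

event_pos = (22.54, 114.05)   # occurrence place of the target event
drones = [{"id": "uav-1", "pos": (22.50, 114.00)},
          {"id": "uav-2", "pos": (22.55, 114.06)}]
people = [{"id": "p-1", "pos": (22.60, 114.10)},
          {"id": "p-2", "pos": (22.54, 114.04)}]

target_drone = nearest(drones, event_pos, euclid)
target_person = nearest(people, event_pos, euclid)
print(target_drone["id"], target_person["id"])  # uav-2 p-2
```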
In a possible implementation manner of the first aspect, the calculating, according to the position information of the person or the position information of the unmanned aerial vehicle, a distance between the position of the person or the position of the unmanned aerial vehicle and the occurrence location of the target event specifically includes:
calculating the distance between the two points by the following formula:
Z = R · arccos( cos(WA) · cos(WB) · cos(JA − JB) + sin(WA) · sin(WB) )
Z represents the distance between the two points, and R represents the approximate radius of the earth; WA represents the latitude value of point A, WB represents the latitude value of point B, JA represents the longitude value of point A, and JB represents the longitude value of point B;
the point A is the position of a person or an unmanned aerial vehicle, and the point B is the target event occurrence place.
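A hedged Python version of this two-point distance, assuming the formula is the spherical law of cosines over latitudes WA, WB and longitudes JA, JB given in degrees (the original figure is not reproduced in the text, so this reading is an assumption):

```python
import math

EARTH_RADIUS_KM = 6371.0  # approximate earth radius R

def spherical_distance(point_a, point_b, r=EARTH_RADIUS_KM):
    """Great-circle distance Z between A (person/drone) and B (event site).
    Points are (latitude, longitude) pairs in degrees: (WA, JA) and (WB, JB)."""
    wa, ja = map(math.radians, point_a)
    wb, jb = map(math.radians, point_b)
    cos_z = (math.cos(wa) * math.cos(wb) * math.cos(ja - jb)
             + math.sin(wa) * math.sin(wb))
    # clamp against floating-point rounding before taking the arc cosine
    return r * math.acos(max(-1.0, min(1.0, cos_z)))

# roughly 1 km of longitude at this latitude
print(round(spherical_distance((22.54, 114.05), (22.54, 114.06)), 2))
```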
In a possible implementation manner of the first aspect, if there is no unmanned aerial vehicle and/or no person within the event range, the method further includes:
and according to the unmanned aerial vehicle position information and the personnel position information, taking the unmanned aerial vehicle closest to the target event occurrence place as a target unmanned aerial vehicle, and/or taking the personnel closest to the target event occurrence place as target personnel.
In a possible implementation manner of the first aspect, planning a route from the target drone to a place where the target event occurs according to the exploration survey information and a location of the target drone includes:
and marking out a route by a route planning and calculating rule according to the position of the target unmanned aerial vehicle and the flight limiting area information, the obstacle information, the communication signal information and the wireless link information in the exploration and investigation information.
In a possible implementation manner of the first aspect, determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range includes:
taking, as the event range, an area range whose area is a preset area and which includes the position corresponding to the position information;
acquiring exploration information of an event range from exploration information acquired in advance;
and obtaining scheduling information within an event range according to the real-time position information of the personnel and the real-time position information of the unmanned aerial vehicle.
In a possible implementation manner of the first aspect, the global information includes audio information and/or image information of a target event occurrence location acquired by the target drone, and the local information includes audio information and/or image information of a target event occurrence location acquired by a terminal of the target person.
In a second aspect, an air-ground cooperative dispatching command platform system provided in the embodiments of the present application includes a dispatching command center, an unmanned aerial vehicle, and a camera-shooting mobile terminal;
the scheduling command center is used for acquiring event information of the target event, and the event information comprises position information used for indicating the occurrence place of the target event; determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is a region range including a target event occurrence place, and the scheduling information comprises unmanned aerial vehicle information and personnel information in the event range; determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning to obtain a route from the target unmanned aerial vehicle to a target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle; after sending a scheduling instruction to a camera-enabled mobile terminal of a target person and a target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle; the scheduling instructions comprise instructions for instructing target personnel to reach the target event occurrence location and instructions for instructing the target unmanned aerial vehicle to use the air route to reach the target event occurrence location; displaying the global information and the local information;
the unmanned aerial vehicle is used for acquiring route information transmitted by the dispatching command center, acquiring global information of a target event according to the route information and transmitting the global information to the dispatching command center;
the camera shooting mobile terminal is used for collecting local information of the target event and transmitting the local information to the dispatching command center.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the method according to any one of the first aspect is implemented.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method according to any one of the above first aspects.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the method of any one of the first aspect.
It is understood that the beneficial effects of the second aspect to the fifth aspect can be referred to the related description of the first aspect, and are not described herein again.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a schematic architecture diagram of an air-ground cooperative dispatching command platform system according to an embodiment of the present application;
fig. 2 is another schematic diagram of an air-ground cooperative dispatching command platform system according to an embodiment of the present application;
fig. 3 is a schematic diagram of a three-way video call provided in an embodiment of the present application;
fig. 4 is a schematic block diagram of a flow of an air-ground cooperative scheduling command method according to an embodiment of the present application;
FIG. 5 is a schematic illustration of an event range provided by an embodiment of the present application;
fig. 6 is a schematic diagram of selecting an unmanned aerial vehicle, an unmanned aerial vehicle driver, and a ground handler according to an embodiment of the present application;
fig. 7 is a schematic block diagram of a structure of an air-ground cooperative scheduling directing device according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application.
The following first describes a system architecture to which embodiments of the present application may relate.
Referring to the schematic architecture diagram of the air-ground cooperative dispatching command platform system shown in fig. 1, the platform system may include a server (or dispatching command center) 11, an unmanned aerial vehicle 12, and a camera-enabled mobile terminal 13. The camera-enabled mobile terminal 13 refers to a camera-enabled mobile terminal device corresponding to the ground handling personnel, and may be, but is not limited to, a mobile phone, a tablet computer, a wearable device, or the like, and fig. 1 exemplarily shows that the camera-enabled mobile terminal is a mobile phone.
The server (or dispatch command center) 11 may communicate with the camera-enabled mobile terminal 13 of the ground handling personnel; any suitable communication mode may be used, for example, the mobile phone of the ground handling personnel may communicate with the server (or dispatch command center) 11 through 4G, 5G, or Wi-Fi. The communication between the server 11 and the unmanned aerial vehicle 12 can pass through the camera-enabled mobile terminal device of the drone driver; that is, the unmanned aerial vehicle 12 communicates first with the camera-enabled mobile terminal device of the drone driver, which then communicates with the server 11, or conversely, the server 11 communicates first with the camera-enabled mobile terminal device of the drone driver, which then communicates with the unmanned aerial vehicle 12.
The camera-enabled mobile terminal 13 and the unmanned aerial vehicle 12 both have audio and/or video capturing capabilities, and specifically, the camera-enabled mobile terminal 13 and the unmanned aerial vehicle 12 may each include, but are not limited to, a camera, a speaker, an audio capturing module, and the like, for taking pictures, capturing video pictures, capturing audio information, and the like.
By way of example and not limitation, refer to another schematic diagram of the air-ground cooperative dispatching command platform system shown in fig. 2, which includes a dispatching command center 21, an air-ground cooperative dispatching command cloud platform 22, an aerial unmanned aerial vehicle side 23, an event site 24, and a ground handling personnel side 25.
The dispatching command center 21 includes a video dispatching command interface (i.e., a Web browser) and a commander; the video dispatching command interface may include a large screen for receiving and displaying the global information of the event site 24 collected by the aerial unmanned aerial vehicle and the local information of the event site 24 collected by the ground handling personnel, both pushed by the air-ground cooperative dispatching command cloud platform 22. The dispatching command center 21 communicates with the air-ground cooperative dispatching command cloud platform 22. The air-ground cooperative dispatching command cloud platform 22 may be equivalent to the server in fig. 1. It communicates with the dispatching command center 21, the aerial unmanned aerial vehicle side 23, and the ground disposal personnel side 25, and is used to receive the global information of the event site 24 collected on the aerial unmanned aerial vehicle side 23 and the local information of the event site 24 collected on the ground disposal personnel side 25, and then push both to the video dispatching command interface of the dispatching command center 21.
The aerial unmanned aerial vehicle side 23 includes the unmanned aerial vehicle, the unmanned aerial vehicle driver controlling it, and the driver's camera-enabled mobile terminal device (not shown in the figure). The drone driver can conduct a three-way video call through the WeChat applet on the camera-enabled mobile terminal device (for example, the WeChat applet on a mobile phone). The drone driver may control the drone to capture audio and/or video information of the global perspective of the incident scene 24 through the pan-tilt camera. It should be noted that the drone driver can both operate the drone and take part in the three-way video call with the dispatch command center 21 and the ground handling personnel. However, this may carry risk; for example, the drone driver may lose control of the drone, causing a crash. In the usual case, an assistant stands beside the drone driver: the driver concentrates on controlling the drone while the assistant conducts the three-way video call and relays information.
The ground disposal personnel side 25 may collect audio and/or video information of a local perspective of the event site 24 through the WeChat applet on the camera-enabled mobile terminal device; specifically, the WeChat applet may collect this information through the camera of the camera-enabled mobile terminal device (e.g., a mobile phone).
The audio and/or video information of the global perspective collected by the aerial unmanned aerial vehicle and the audio and/or video information of the local perspective collected by the ground disposal personnel are transmitted to the air-ground cooperative dispatching command cloud platform 22, which pushes this global and local information to the dispatching command center 21 for display on the video dispatching command interface. In this way, the commander of the dispatching command center 21 can make real-time decisions based on the global-perspective and local-perspective audio and/or video information in order to direct the dispatching.
The dispatch command center 21 can learn that a target event has occurred at some place through reports from the public or through unmanned aerial vehicle patrols; the target event is generally an emergency such as a fire or a traffic accident. After the background system of the dispatching command center 21 learns that an emergency such as a fire or traffic accident has occurred at a certain place, the emergency range can be determined according to the place of occurrence. The emergency range may be an area that includes the place where the emergency occurred. Once the emergency range is determined, the flight-restriction area information, obstacle information, 4G signal and wireless link information within the range are obtained, together with the related information on the ground disposal personnel, the drone driver, and the unmanned aerial vehicle within the range.
According to reported information such as the real-time position of the ground disposal personnel, the real-time position of the drone driver, and the real-time position of the unmanned aerial vehicle, the background system of the dispatching command center 21 selects the ground disposal personnel and the unmanned aerial vehicle closest to the place where the emergency occurred, and dispatches the selected ground disposal personnel to that place. The ground disposal personnel conduct a three-way video call with the dispatching command center and the aerial unmanned aerial vehicle side through the WeChat applet on a mobile phone. After receiving the command of the dispatching command center, the ground disposal personnel collect local information of the event site through the mobile phone camera and transmit it back to the air-ground cooperative dispatching command cloud platform in real time, so that it can be pushed to the video dispatching command interface of the dispatching command center for display. After the target unmanned aerial vehicle is determined, the drone driver on the aerial unmanned aerial vehicle side, or an assistant beside the driver, can conduct the three-way video call with the dispatching command center and the ground disposal personnel side through the WeChat applet on a mobile phone. The drone driver controls the unmanned aerial vehicle to collect global information of the event site according to the scheduling of the dispatching command center and transmits it back to the air-ground cooperative dispatching command cloud platform, which pushes it to the video dispatching command interface of the dispatching command center for display.
Therefore, the scheduling command center can timely and effectively acquire local information transmitted by mobile equipment of ground disposal personnel and global information transmitted by the aerial unmanned aerial vehicle. After global information and local information of a target event such as a fire, a traffic accident, and the like are timely and effectively acquired, a commander can perform command and scheduling work, for example, fire rescue command work, based on the acquired global information and local information.
In specific application, a background system of the dispatching command center can plan a flight path from the position where the unmanned aerial vehicle flies to an event occurrence place according to information such as flight limiting area information, obstacle information, 4G signals and wireless links of an emergency range. And then, the background system of the dispatching command center transmits the air route information to the terminal equipment of the unmanned aerial vehicle driver, and the unmanned aerial vehicle driver uses the air route to control the unmanned aerial vehicle to fly to the place where the emergency happens. The unmanned aerial vehicle is connected with the camera mobile terminal equipment of the unmanned aerial vehicle driver, the unmanned aerial vehicle transmits the global information of the acquired emergency to the camera mobile terminal equipment in real time, the camera mobile terminal equipment transmits the global information to the air-ground cooperative dispatching command cloud platform, and the air-ground cooperative dispatching command cloud platform pushes the global information to the video dispatching command interface of the dispatching command center.
It should be noted that, in some embodiments, the scheduling command center 21 includes an air-ground collaborative scheduling command cloud platform 22 and a video scheduling command interface. In other embodiments, the air-ground cooperative dispatching command cloud platform 22 may not be included in the dispatching command center 21.
Referring to fig. 3, a three-way video call diagram is shown, which includes a dispatch command center interface 31, a ground handler side 32, and a drone driver side 33. The large screen of the dispatch command center interface 31 displays a video image 311 returned by the unmanned aerial vehicle, a video image 312 from ground monitoring, and a video image 313 of the WeChat applet. That is, the dispatch command center interface 31 displays the global-view video frame collected by the aerial unmanned aerial vehicle, the local-view video frame collected by the ground handler, the call video frames of the drone driver, the ground handler, and the commander of the dispatch command center, and the like.
On the ground attendant side 32, the ground attendant joins the XXX conference via the WeChat applet on the mobile phone to enter the three-way video call screen 321. Similarly, the drone driver, or an assistant beside the drone driver, may join the XXX conference through the WeChat applet on a mobile phone to enter the three-way video call screen 331.
It should be noted that the multi-party video call among the dispatch command center, the drone video, and the applet video may be implemented based on real-time audio and video communication (RTC), but the specific implementation manner is not limited thereto.
After the system architecture that may be involved in the embodiments of the present application is described, an air-ground cooperative dispatching command process from the dispatching command center side will be described below.
Referring to fig. 4, a schematic flow chart of an air-ground cooperative dispatching command method is shown, and the method is applied to a dispatching command center. The method specifically comprises the following steps:
step S401, determining event information of the target event, wherein the event information comprises position information used for indicating the occurrence place of the target event.
It should be noted that the target event generally refers to an emergency such as a fire or a traffic accident. Of course, the target event may also be a non-emergency event. The event information of the target event may include, but is not limited to, location information, event type, and the like. The location information refers to the geographical location information of the place where the target event occurs, and the event type indicates what kind of event the target event is, for example a fire or a traffic accident.
The event information of the target event can be a system discovered and reported to a dispatching command center in the unmanned aerial vehicle patrolling and patrolling process, and can also be a system reported to the dispatching command center by the masses. According to the related information reported by the unmanned aerial vehicle or the related information reported by the masses, the dispatching command center can determine where to send the emergency event and the related information such as the event type of the emergency event.
Step S402, determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is an area range including a target event occurrence place, and the scheduling information includes unmanned aerial vehicle information and personnel information in the event range.
Specifically, the geographical position information of the target event occurrence place is obtained, and the event range can be determined based on the geographical position information. Then, the exploration and survey information of the event range, and the information of personnel and unmanned aerial vehicles located within the event range, are acquired.
In some embodiments, a region whose area equals a preset area and which includes the position corresponding to the position information is used as the event range.
The event range may be a regular graphic area. For example, the position corresponding to the position information may be used as the center of a circle, and the range within a preset radius value may be used as the event range; in this case, the event range is a circular area. The event range may also be an irregular graphic area, which only needs to include the event occurrence place and have an area equal to the preset area.
The preset radius value may be, for example, 2 kilometers; in this case, the event range is a circular area with a radius of 2 kilometers centered on the event occurrence location. The preset area can be set according to actual conditions. See in particular the event range diagram shown in fig. 5. As shown in fig. 5, point b in area A is the target event occurrence point. In the left diagram, a circular area with point b as the center is the event range; in the right diagram, an irregular area including point b is the event range. The dashed graph in fig. 5 is the event range.
After the event range of the target event is determined, exploration and investigation information of the event range is obtained from exploration and investigation information obtained in advance. The pre-acquired survey information may include, but is not limited to, flight-limit zone information, obstacle information, 4G signals, and wireless links.
Taking fig. 5 as an example, area A is surveyed in advance to obtain the exploration and survey information of area A. When a fire occurs at point b in area A, the event range corresponding to point b is determined, and then the exploration and survey information of the event range is obtained from the exploration and survey information of area A.
After the event range of the target event is determined, the scheduling information within the event range can be obtained according to the real-time personnel position information and the real-time unmanned aerial vehicle position information. The scheduling information includes information on the ground handling personnel, unmanned aerial vehicle drivers, and the like located within the event range.
Step S403, determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning a route for the target unmanned aerial vehicle to fly to the target event occurrence place according to the exploration and survey information and the position of the target unmanned aerial vehicle.
In a specific application, the unmanned aerial vehicle and the ground handling personnel closest to the target event occurrence place may be scheduled to go to the event scene to collect information. Alternatively, the unmanned aerial vehicle and the ground handling personnel may be scheduled according to both distance and working state; in this case, besides the distance between the unmanned aerial vehicle or ground handling personnel and the event occurrence place, working-state information such as whether an unmanned aerial vehicle is executing a task and whether ground handling personnel are idle may also be considered.
The scheduling information includes the real-time position information of the ground handling personnel, the unmanned aerial vehicles, and the unmanned aerial vehicle drivers within the event range. The distance between each ground handler or unmanned aerial vehicle within the event range and the event occurrence place is calculated according to the real-time position information and the position information of the event occurrence place. The unmanned aerial vehicle closest to the event occurrence place is selected as the target unmanned aerial vehicle, and the ground handler closest to the event occurrence place is selected as the target person.
In specific application, after the background system determines the longitude and latitude information of the occurrence place of the target event, the longitude and latitude information of each ground handler, namely the real-time position information of each ground handler, is obtained in real time. And then, calculating the distance between each ground handling personnel and the event occurrence place according to the longitude and latitude information of the event occurrence place and the real-time position information of each ground handling personnel.
By way of example and not limitation, the formula for calculating the distance between two points from the longitudes and latitudes of the two points is as follows:
Z = R × arccos(sin(WA) × sin(WB) + cos(WA) × cos(WB) × cos(JA − JB))
wherein Z represents the distance between two points (i.e., point-to-point distance), and R represents the approximate radius of the earth; WA represents a latitude value of a point a, WB represents a latitude value of a point B, JA represents a longitude value of a point a, and JB represents a longitude value of a point B.
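The formula above (the spherical law of cosines, with the latitudes and longitudes converted to radians) can be sketched in Python as follows. The function name and the exact Earth-radius value are illustrative assumptions, since the text only refers to an "approximate radius of the earth":

```python
import math

# Mean Earth radius in kilometers (assumed value; the text only
# says "approximate radius of the earth").
EARTH_RADIUS_KM = 6371.0

def great_circle_distance(lat_a, lon_a, lat_b, lon_b):
    """Distance Z between point A and point B, given their latitudes
    (WA, WB) and longitudes (JA, JB) in decimal degrees."""
    wa, ja, wb, jb = map(math.radians, (lat_a, lon_a, lat_b, lon_b))
    # Spherical law of cosines; clamp to [-1, 1] to guard against
    # floating-point rounding before taking the arccos.
    cos_z = (math.sin(wa) * math.sin(wb)
             + math.cos(wa) * math.cos(wb) * math.cos(ja - jb))
    return EARTH_RADIUS_KM * math.acos(max(-1.0, min(1.0, cos_z)))
```

With this radius, one degree of latitude corresponds to roughly 111 kilometers, which is a quick sanity check for the implementation.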
In some embodiments, the backend system may first calculate a distance between each ground handler and the site where the target event occurs, and then determine the target drone and the target person according to the distance.
In yet other embodiments, the background system may also display the real-time location of each ground handler, as well as the location information of the event occurrence place, on a large screen. In this way, personnel of the dispatching command center can manually select, by viewing the large screen, the target personnel and target unmanned aerial vehicle close to the event occurrence place.
In other words, the distance between the position of each person in the event range and the occurrence place of the target event can be calculated through the person position information in the person information and the position information of the target event; and selecting the person closest to the target event occurrence place in the event range as the target person according to the distance between the position of each person in the event range and the target event occurrence place. Calculating the distance between the position of each unmanned aerial vehicle and the occurrence place of the target event in the event range according to the position information of the unmanned aerial vehicle and the position information of the target event in the unmanned aerial vehicle information; and selecting the unmanned aerial vehicle closest to the target event occurrence place in the event range as the target unmanned aerial vehicle according to the distance between the position of each unmanned aerial vehicle in the event range and the target event occurrence place.
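The selection step described above — filtering candidates to the event range and picking the nearest one — can be sketched as follows. The candidate record layout, function names, and radius-based event range are assumptions for the sketch (the event range may also be irregular, as noted earlier):

```python
import math

EARTH_RADIUS_KM = 6371.0  # assumed approximate Earth radius

def distance_km(lat_a, lon_a, lat_b, lon_b):
    """Great-circle distance via the spherical law of cosines."""
    wa, ja, wb, jb = map(math.radians, (lat_a, lon_a, lat_b, lon_b))
    cos_z = (math.sin(wa) * math.sin(wb)
             + math.cos(wa) * math.cos(wb) * math.cos(ja - jb))
    return EARTH_RADIUS_KM * math.acos(max(-1.0, min(1.0, cos_z)))

def select_nearest(candidates, event_lat, event_lon, event_radius_km):
    """Return the in-range candidate (a dict with 'lat'/'lon' keys)
    nearest to the event occurrence place, or None if no candidate
    lies within the event range."""
    in_range = [c for c in candidates
                if distance_km(c["lat"], c["lon"], event_lat, event_lon)
                <= event_radius_km]
    if not in_range:
        return None
    return min(in_range, key=lambda c: distance_km(
        c["lat"], c["lon"], event_lat, event_lon))
```

The same helper would be called once with the personnel list to pick the target person and once with the drone list to pick the target drone; a None result corresponds to the fallback case of calling resources from outside the event range.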
After the target unmanned aerial vehicle is determined, the background system can plan a route with a route planning algorithm according to the real-time position of the target unmanned aerial vehicle and the position of the event occurrence place, in combination with the flight-limit zone information, obstacle information, wireless links, and the like within the event range. The unmanned aerial vehicle driver then receives the route and loads it onto the target unmanned aerial vehicle, and the target unmanned aerial vehicle can fly from its current location to the event occurrence place along the route.
In other embodiments, there may be no unmanned aerial vehicle, no ground handling personnel, or neither within the event range. In this case, unmanned aerial vehicles and/or ground handling personnel outside the event range can be called according to their distance to the event scene.
Referring to fig. 6, which is a schematic diagram of selecting the unmanned aerial vehicle and the ground handler: as shown in fig. 6, point b is the event occurrence location, and the dotted-line area is the event range corresponding to point b. A plurality of ground handling personnel and a plurality of unmanned aerial vehicles are included within the event range. The two ground handlers nearest to the event occurrence place, and the unmanned aerial vehicle and unmanned aerial vehicle driver nearest to the event occurrence place, are selected; the dispatching command center carries out a video call with the selected ground handlers and the selected unmanned aerial vehicle driver and transmits information. The unmanned aerial vehicle driver is instructed to use the unmanned aerial vehicle to go to the event scene to collect global information, and the ground handling personnel are instructed to go to the event scene to collect local information.
Step S404, after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and to the target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle. The scheduling instruction includes an instruction for instructing the target person to reach the target event occurrence location and an instruction for instructing the target unmanned aerial vehicle to reach the target event occurrence location using the route.
In a specific application, after the target unmanned aerial vehicle and the target person are determined, scheduling instructions can be sent to the target unmanned aerial vehicle and the target person respectively, instructing the target unmanned aerial vehicle to fly to the event occurrence place and instructing the target person to arrive at the event occurrence place. The background system of the dispatching command center can mark out a route through a route planning algorithm; the unmanned aerial vehicle driver can then receive the route on the camera-enabled mobile terminal device and load it onto the target unmanned aerial vehicle, the target unmanned aerial vehicle flies to the event occurrence site according to the route, and its image acquisition device then collects the global information of the event site.
After receiving the scheduling instruction, the target person can go to the event scene and collect local information of the event scene through the camera-enabled mobile terminal. The camera-enabled mobile terminal transmits the local information to the server, and the server pushes the local information to the video dispatching command interface of the dispatching command center.
It should be noted that, in some embodiments, the dispatching command center includes a server, which is equivalent to the air-ground cooperative scheduling command cloud platform of fig. 2. In other embodiments, the dispatching command center does not include the server; this corresponds to the case in fig. 2 in which the air-ground cooperative scheduling command cloud platform is not part of the dispatching command center.
The global information includes audio information and/or image data of the target event occurrence place acquired by the target unmanned aerial vehicle, and the local information includes audio information and/or image data of the target event occurrence place acquired by the camera-enabled mobile terminal of the target person. The image data may be embodied as picture data or video data. Generally, the target unmanned aerial vehicle collects audio and/or video of the event site, and the target person collects audio and/or video of the event site through the camera-enabled mobile terminal; the global information is video information of a global view of the event site collected by the unmanned aerial vehicle, and the local information is video information of a local view of the event site.
The global information differs from the local information in that the global information is information of the event scene obtained by the unmanned aerial vehicle inspecting and shooting from the sky, while the local information is information shot or seen by the ground handling personnel at the emergency scene, which the unmanned aerial vehicle may be unable to obtain. For example, when a fire breaks out in a building, the unmanned aerial vehicle can shoot video information or picture information outside the building but cannot capture the situation inside the building, whereas ground handling personnel can acquire video information and the like inside and below the building through terminals such as mobile phones.
Step S405, displaying the global information and the local information.
In a specific application, the global information collected by the unmanned aerial vehicle and the local information collected by the target person can be displayed. For example, the global-view video acquired by the unmanned aerial vehicle and the local-view video acquired by the target person may be displayed on one large screen, with the global-view picture and the local-view picture integrated on the same display interface. In this way, the dispatching command center can command the unmanned aerial vehicle driver to control the unmanned aerial vehicle to acquire the global situation of the event site according to the displayed picture, and at the same time command the target person to acquire the local details of the event site through the user terminal.
In other words, the global-view video shot by the unmanned aerial vehicle from the air and the ground local-view video collected by the ground handling personnel are transmitted to the air-ground cooperative dispatching command cloud platform, which then pushes the global-view video and the ground local-view video to the dispatching command center for display on the video dispatching interface, so that a commander can carry out real-time command and immediate decision-making.
Therefore, by having an unmanned aerial vehicle near the target event occurrence place collect global information, and having target personnel near the occurrence place go to the event scene to collect local information, the global information and local information of the target event (especially emergencies such as fires and traffic accidents) can be acquired timely and effectively.
Corresponding to the air-ground cooperative scheduling commanding method described in the above embodiments, fig. 7 shows a schematic block diagram of a structure of an air-ground cooperative scheduling commanding device provided in the embodiments of the present application, and for convenience of description, only the parts related to the embodiments of the present application are shown. The apparatus may be integrated with a server.
Referring to fig. 7, the apparatus may include:
an event information acquiring module 71, configured to determine event information of the target event, where the event information includes location information indicating an occurrence location of the target event;
the event range determining module 72 is configured to determine an event range according to the position information, and acquire exploration survey information and scheduling information of the event range, where the event range is an area range including a place where the target event occurs, and the scheduling information includes unmanned aerial vehicle information and personnel information within the event range;
the scheduling target determining module 73 is used for determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning to obtain a route from the target unmanned aerial vehicle to a target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle;
the information acquisition module 74 is configured to receive local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and the target unmanned aerial vehicle; the scheduling instructions comprise instructions for instructing target personnel to reach the target event occurrence location and instructions for instructing the target unmanned aerial vehicle to use the air route to reach the target event occurrence location;
and a display module 75 for displaying the global information and the local information.
In a possible implementation manner, the scheduling objective determining module is specifically configured to:
calculating the distance between the position of each person in the event range and the occurrence place of the target event according to the person position information in the person information and the position information of the target event;
calculating the distance between the position of each unmanned aerial vehicle and the occurrence place of the target event in the event range according to the position information of the unmanned aerial vehicle and the position information of the target event in the unmanned aerial vehicle information;
selecting a person closest to the target event occurrence place in the event range as a target person according to the distance between the position of each person in the event range and the target event occurrence place;
and selecting the unmanned aerial vehicle closest to the target event occurrence place in the event range as the target unmanned aerial vehicle according to the distance between the position of each unmanned aerial vehicle in the event range and the target event occurrence place.
In a possible implementation manner, the scheduling objective determining module is specifically configured to:
calculating the distance between the two points by the following formula;
Z = R × arccos(sin(WA) × sin(WB) + cos(WA) × cos(WB) × cos(JA − JB))
z represents the distance between two points, and R represents the approximate radius of the earth; WA represents the latitude value of the point A, WB represents the latitude value of the point B, JA represents the longitude value of the point A, and JB represents the longitude value of the point B;
the point A is the position of a person or an unmanned aerial vehicle, and the point B is the target event occurrence place.
In a possible implementation manner, if there is no unmanned aerial vehicle and/or person within the event range, the scheduling objective determining module is further specifically configured to:
and according to the unmanned aerial vehicle position information and the personnel position information, taking the unmanned aerial vehicle closest to the target event occurrence place as a target unmanned aerial vehicle, and/or taking the personnel closest to the target event occurrence place as target personnel.
In a possible implementation manner, the scheduling objective determining module is specifically configured to:
and marking out a route by a route planning and calculating rule according to the position of the target unmanned aerial vehicle and the flight limiting area information, the obstacle information, the communication signal information and the wireless link information in the exploration and investigation information.
In a possible implementation manner, the event range determining module is specifically configured to:
taking an area which is a preset area and comprises a region range of a position corresponding to the position information as an event range;
acquiring exploration information of an event range from exploration information acquired in advance;
and obtaining scheduling information within an event range according to the real-time position information of the personnel and the real-time position information of the unmanned aerial vehicle.
In one possible implementation manner, the global information includes audio information and/or image information of a target event occurrence place acquired by the target unmanned aerial vehicle, and the local information includes audio information and/or image information of a target event occurrence place acquired by a terminal of the target person.
The air-ground cooperative scheduling commanding device has the function of implementing the above air-ground cooperative scheduling commanding method. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function, and the modules may be software and/or hardware.
It should be noted that, for the information interaction, execution process, and other contents between the above-mentioned devices/modules, the specific functions and technical effects thereof are based on the same concept as those of the embodiment of the method of the present application, and reference may be made to the part of the embodiment of the method specifically, and details are not described here.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 8, the terminal device 8 of this embodiment includes: at least one processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the at least one processor 80, the processor 80 implementing the steps in any of the various method embodiments described above when executing the computer program 82.
The terminal device 8 may be a computing device such as a server. The terminal device may include, but is not limited to, the processor 80 and the memory 81. Those skilled in the art will appreciate that fig. 8 is merely an example of the terminal device 8 and does not constitute a limitation of the terminal device 8, which may include more or fewer components than those shown, combine some components, or have different components, such as an input-output device, a network access device, and the like.
The processor 80 may be a Central Processing Unit (CPU), and the processor 80 may also be another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 81 may in some embodiments be an internal storage unit of the terminal device 8, such as a hard disk or a memory of the terminal device 8. In other embodiments, the memory 81 may also be an external storage device of the terminal device 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the terminal device 8. The memory 81 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps that can be implemented in the above method embodiments.
The embodiments of the present application provide a computer program product, which, when running on a terminal device, enables the terminal device to implement the steps in the above method embodiments when executed.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/terminal apparatus, a recording medium, computer Memory, Read-Only Memory (ROM), random-access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An air-ground cooperative scheduling command method is characterized by comprising the following steps:
determining event information of a target event, wherein the event information comprises position information used for indicating the occurrence place of the target event;
determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is an area range including a target event occurrence place, and the scheduling information comprises unmanned aerial vehicle information and personnel information in the event range;
determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning to obtain a route from the target unmanned aerial vehicle to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle;
after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and the target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle; the scheduling instructions include instructions for instructing the target personnel to reach a target event occurrence and instructions for instructing the target drone to reach a target event occurrence using the airline;
and displaying the global information and the local information.
2. The method of claim 1, wherein determining a target drone and a target person from the scheduling information and the location information comprises:
calculating the distance between the position of each person in the event range and the occurrence place of the target event according to the person position information in the person information and the position information of the target event;
calculating the distance between the position of each unmanned aerial vehicle and the occurrence place of the target event in the event range according to the unmanned aerial vehicle position information in the unmanned aerial vehicle information and the position information of the target event;
selecting the person closest to the target event occurrence place in the event range as the target person according to the distance between the position of each person in the event range and the target event occurrence place;
and selecting the unmanned aerial vehicle closest to the target event occurrence place in the event range as the target unmanned aerial vehicle according to the distance between the position of each unmanned aerial vehicle in the event range and the target event occurrence place.
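The nearest-candidate selection in claim 2 can be sketched in a few lines of Python. This is an illustrative sketch only, not the patented implementation; the entity names, coordinates and the placeholder `flat_distance` function are hypothetical (any great-circle distance function, such as the one in claim 3, could be passed in instead):

```python
import math

def nearest_to_event(candidates, event_pos, distance_fn):
    """Pick the candidate (person or drone) closest to the event site.

    candidates: list of (identifier, (lat, lon)) tuples.
    event_pos:  (lat, lon) of the target event occurrence place.
    distance_fn: any distance function over two (lat, lon) points.
    """
    return min(candidates, key=lambda c: distance_fn(c[1], event_pos))

# Hypothetical placeholder metric; adequate only for a small local area.
def flat_distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

# Hypothetical real-time positions within the event range.
people = [("officer-1", (22.55, 114.05)), ("officer-2", (22.60, 114.10))]
drones = [("uav-7", (22.54, 114.06)), ("uav-9", (22.70, 114.00))]
event = (22.553, 114.052)

target_person = nearest_to_event(people, event, flat_distance)
target_drone = nearest_to_event(drones, event, flat_distance)
```

The same helper serves both selections, since claim 2 applies the identical nearest-distance rule to personnel and to unmanned aerial vehicles.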
3. The method according to claim 2, wherein the step of calculating the distance between the position of the person or the position of the unmanned aerial vehicle and the occurrence location of the target event according to the position information of the person or the position information of the unmanned aerial vehicle specifically comprises:
calculating the distance between the two points by the following formula;
z = R × arccos(sin(WA) × sin(WB) + cos(WA) × cos(WB) × cos(JA − JB))
z represents the distance between two points, and R represents the approximate radius of the earth; WA represents the latitude value of the point A, WB represents the latitude value of the point B, JA represents the longitude value of the point A, and JB represents the longitude value of the point B;
the point A is the position of a person or an unmanned aerial vehicle, and the point B is the target event occurrence place.
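A minimal Python sketch of the distance calculation of claim 3, assuming the formula above is the spherical law of cosines over the claim's symbols (WA, WB, JA, JB in degrees; R taken as an approximate 6371 km); the function name is a hypothetical choice:

```python
import math

EARTH_RADIUS_KM = 6371.0  # approximate radius of the earth, R in the formula

def great_circle_distance_km(lat_a, lon_a, lat_b, lon_b):
    """Distance z between point A (person/drone) and point B (event site)."""
    wa, wb = math.radians(lat_a), math.radians(lat_b)  # WA, WB
    dj = math.radians(lon_a - lon_b)                   # JA - JB
    # z = R * arccos(sin WA * sin WB + cos WA * cos WB * cos(JA - JB))
    central = math.sin(wa) * math.sin(wb) + math.cos(wa) * math.cos(wb) * math.cos(dj)
    central = max(-1.0, min(1.0, central))  # clamp against floating-point drift
    return EARTH_RADIUS_KM * math.acos(central)
```

One degree of longitude at the equator comes out near 111 km, which is a quick sanity check on the constants.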
5. The method of claim 2, wherein if no unmanned aerial vehicle and/or no person is present within the event range, the method further comprises:
and according to the unmanned aerial vehicle position information and the personnel position information, taking the unmanned aerial vehicle closest to the target event occurrence place as a target unmanned aerial vehicle, and/or taking the personnel closest to the target event occurrence place as target personnel.
5. The method of claim 1, wherein planning a route for the target drone to fly to the location of the target event based on the survey information and the location of the target drone comprises:
and planning the route through a route planning calculation rule according to the position of the target unmanned aerial vehicle and the restricted flight area information, obstacle information, communication signal information and wireless link information in the exploration survey information.
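Claim 5 does not disclose the concrete route planning calculation rule. As one illustrative possibility only (not the patented rule), the restricted flight areas and obstacles can be rasterized onto an occupancy grid and a path found with breadth-first search; the grid layout below is hypothetical:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Shortest 4-connected path on a grid; cell value 1 = blocked airspace.

    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    parent = {start: None}
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk back through parents
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 \
                    and nxt not in parent:
                parent[nxt] = cell
                queue.append(nxt)
    return None

# 0 = free airspace, 1 = restricted (no-fly zone or obstacle); hypothetical map.
grid = [[0, 0, 0, 0],
        [1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 0, 0, 0]]
route = plan_route(grid, (0, 0), (3, 0))
```

A production planner would also weight cells by the communication signal and wireless link information mentioned in the claim, rather than treating all free cells as equal cost.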
6. The method of any one of claims 1 to 5, wherein determining an event range from the location information and obtaining survey information and scheduling information for the event range comprises:
taking a preset area that includes the position corresponding to the position information as the event range;
acquiring exploration survey information of the event range from exploration survey information acquired in advance;
and obtaining the scheduling information in the event range according to the real-time personnel position information and the real-time unmanned aerial vehicle position information.
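As a sketch of claim 6's scoping step, under the assumption that the preset area is a fixed-radius disc around the reported position (the radius value, names and coordinates below are hypothetical), the scheduling information is whichever real-time drone or personnel positions fall inside that disc:

```python
import math

EVENT_RANGE_KM = 5.0  # preset area size; an assumed value

def within_event_range(entity_positions, event_pos, radius_km=EVENT_RANGE_KM):
    """Keep entities (drones or personnel) whose real-time position lies
    inside the event range centred on the event position."""
    def dist_km(a, b):
        # Equirectangular approximation; adequate at city scale.
        lat = math.radians((a[0] + b[0]) / 2)
        dx = math.radians(b[1] - a[1]) * math.cos(lat)
        dy = math.radians(b[0] - a[0])
        return 6371.0 * math.hypot(dx, dy)
    return {name: pos for name, pos in entity_positions.items()
            if dist_km(pos, event_pos) <= radius_km}

# Hypothetical real-time positions: one drone nearby, one far away.
drones = {"uav-7": (22.54, 114.06), "uav-9": (23.10, 114.60)}
in_range = within_event_range(drones, (22.553, 114.052))
```

The same filter applied to the personnel position stream yields the personnel half of the scheduling information.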
7. The method of claim 6, wherein the global information comprises audio information and/or image information of a target event occurrence location collected by the target drone, and the local information comprises audio information and/or image information of a target event occurrence location collected by a camera-enabled mobile terminal of the target person.
8. An air-ground cooperative dispatching command platform system, characterized by comprising a dispatching command center, an unmanned aerial vehicle and a camera-enabled mobile terminal;
the dispatching command center is used for acquiring event information of a target event, wherein the event information comprises position information used for indicating the occurrence place of the target event; determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is an area range including the target event occurrence place, and the scheduling information comprises unmanned aerial vehicle information and personnel information in the event range; determining a target unmanned aerial vehicle and a target person according to the scheduling information and the position information, and planning a route for the target unmanned aerial vehicle to fly to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle; after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and to the target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle; the scheduling instruction comprises an instruction for instructing the target person to reach the target event occurrence place and an instruction for instructing the target unmanned aerial vehicle to fly to the target event occurrence place along the route; and displaying the global information and the local information;
the unmanned aerial vehicle is used for acquiring route information transmitted by the dispatching command center, acquiring global information of a target event according to the route information, and transmitting the global information to the dispatching command center;
and the camera-enabled mobile terminal is used for collecting local information of the target event and transmitting the local information to the dispatching command center.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 7.
CN202010393924.8A 2020-05-11 2020-05-11 Air-ground cooperative scheduling command method and platform system Pending CN111739346A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010393924.8A CN111739346A (en) 2020-05-11 2020-05-11 Air-ground cooperative scheduling command method and platform system


Publications (1)

Publication Number Publication Date
CN111739346A true CN111739346A (en) 2020-10-02

Family

ID=72647066

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010393924.8A Pending CN111739346A (en) 2020-05-11 2020-05-11 Air-ground cooperative scheduling command method and platform system

Country Status (1)

Country Link
CN (1) CN111739346A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113869598A (en) * 2021-10-13 2021-12-31 深圳联和智慧科技有限公司 Unmanned aerial vehicle intelligent remote management method and system based on smart city and cloud platform
WO2023040985A1 (en) * 2021-09-15 2023-03-23 深圳市道通智能航空技术股份有限公司 Method, apparatus, and device for displaying data of unmanned aerial vehicle, and storage medium
CN113869598B (en) * 2021-10-13 2024-05-28 深圳联和智慧科技有限公司 Intelligent remote management method, system and cloud platform for unmanned aerial vehicle based on smart city

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20000012360U (en) * 1998-12-17 2000-07-05 전주범 V-Cal charge status display circuit of external device
CN102044011A (en) * 2009-10-20 2011-05-04 北京交通大学 Method and system for scheduling police force resource
CN102457814A (en) * 2010-10-22 2012-05-16 中兴通讯股份有限公司 Cluster dispatching method and system
CN106210627A (en) * 2016-07-04 2016-12-07 广东天米教育科技有限公司 A kind of unmanned plane fire dispatch system
CN107491827A (en) * 2016-06-13 2017-12-19 滴滴(中国)科技有限公司 A kind of vehicle resources scheduling processing method and system
CN107680378A (en) * 2017-11-07 2018-02-09 中车株洲电力机车有限公司 A kind of accident surveying method, system, equipment and computer-readable storage medium
CN109034446A (en) * 2018-06-13 2018-12-18 南京理工大学 The smart city traffic incident emergency response system collected evidence online based on unmanned plane
CN109255519A (en) * 2018-08-02 2019-01-22 佛山世寰智能科技有限公司 A kind of public security intelligence command scheduling method and system based on unmanned plane
CN109272755A (en) * 2018-09-30 2019-01-25 侍雨 A kind of additional transport Command Management System based on unmanned plane
CN109345871A (en) * 2018-11-08 2019-02-15 上海歌尔泰克机器人有限公司 Safety method for early warning, device, unmanned plane and computer readable storage medium
CN208691298U (en) * 2018-10-09 2019-04-02 蔡永生 A kind of urban fire control system based on Internet of Things
CN110365946A (en) * 2019-07-26 2019-10-22 浙江大华技术股份有限公司 Generation method, device, storage medium and the electronic device of the condition of a disaster counte-rplan



Similar Documents

Publication Publication Date Title
CN206481394U (en) Wide-angle view video conference promotion system
DE112018006556T5 (en) TRAINING A MACHINE LEARNING MODEL WITH DIGITAL AUDIO AND / OR VIDEO
CN106054928B (en) A kind of full region fire generation measuring method based on unmanned plane network
US7720458B2 (en) Rapidly deployable emergency communications system and method
CN107481507A (en) A kind of intelligent traffic administration system method and system
CN107103750A (en) A kind of alert system and method for command scheduling group based on alert rank
US20200106818A1 (en) Drone real-time interactive communications system
CN110365946A (en) Generation method, device, storage medium and the electronic device of the condition of a disaster counte-rplan
CN102654940A (en) Traffic information acquisition system based on unmanned aerial vehicle and processing method of traffic information acquisition system
CN113643520A (en) Intelligent traffic accident processing system and method
US20190196473A1 (en) Information collection system and server apparatus
JP4555884B1 (en) Movable information collection device
US20170269590A1 (en) Controlled device and communication system and method utilizing the same
CN109523193A (en) Flight control and task management system, method, apparatus and readable storage medium storing program for executing
CN107872656A (en) Monitoring system and command system
CN113447623A (en) Atmospheric environment monitoring method and system
CN110636255A (en) Unmanned aerial vehicle image and video transmission and distribution system and method based on 4G network
JP2012216102A (en) Image storage system
KR101483949B1 (en) Total mornitoring system
US11699348B2 (en) Air traffic tolling system
AU2021282389A1 (en) System for monitoring and influencing objects of interest and processes carried out by the objects, and corresponding method
CN115550860A (en) Unmanned aerial vehicle networking communication system and method
CN111739346A (en) Air-ground cooperative scheduling command method and platform system
KR20210041337A (en) Control system and control device for patrol and moving out, and operation method thereof
CN111212272B (en) Disaster monitoring method and device, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20220215

Address after: 518000 2515, building 2, Huilong business center, North Station community, Minzhi street, Longhua District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Zhongke Baotai Aerospace Technology Co.,Ltd.

Address before: Room 1101-1102, building 1, Changfu Jinmao building, No.5, Shihua Road, free trade zone, Fubao street, Futian District, Shenzhen, Guangdong 518000

Applicant before: Shenzhen Zhongke Baotai Technology Co.,Ltd.

RJ01 Rejection of invention patent application after publication

Application publication date: 20201002
