Disclosure of Invention
Embodiments of the present application provide an air-ground cooperative dispatching command method and platform system, which are used to solve the problem in the prior art that the global information and the local information of a target event site cannot be acquired in a timely and effective manner.
In a first aspect, an embodiment of the present application provides an air-ground cooperative dispatching command method, which may include:
determining event information of a target event, wherein the event information comprises position information used for indicating the occurrence place of the target event;
determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is a region range including a target event occurrence place, and the scheduling information comprises unmanned aerial vehicle information and personnel information in the event range;
determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning a route for the target unmanned aerial vehicle to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle;
after sending a scheduling instruction to a camera-enabled mobile terminal of a target person and to a target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle, wherein the scheduling instructions comprise an instruction for instructing the target personnel to reach the target event occurrence place and an instruction for instructing the target unmanned aerial vehicle to use the route to reach the target event occurrence place;
and displaying the global information and the local information.
In this way, an unmanned aerial vehicle near the target event occurrence place acquires the global information of the target event, and target personnel near the occurrence place go to the event site to acquire the local information, so that the global information and the local information of the target event (especially an emergency event such as a fire or a traffic accident) can be acquired in a timely and effective manner, which facilitates the command and scheduling work of a user.
In a possible implementation manner of the first aspect, determining the target drone and the target person according to the scheduling information and the location information includes:
calculating the distance between the position of each person in the event range and the occurrence place of the target event according to the person position information in the person information and the position information of the target event;
calculating the distance between the position of each unmanned aerial vehicle within the event range and the target event occurrence place according to the unmanned aerial vehicle position information in the unmanned aerial vehicle information and the position information of the target event;
selecting a person closest to the target event occurrence place in the event range as a target person according to the distance between the position of each person in the event range and the target event occurrence place;
and selecting the unmanned aerial vehicle closest to the target event occurrence place in the event range as the target unmanned aerial vehicle according to the distance between the position of each unmanned aerial vehicle in the event range and the target event occurrence place.
In a possible implementation manner of the first aspect, the calculating, according to the position information of the person or the position information of the unmanned aerial vehicle, a distance between the position of the person or the position of the unmanned aerial vehicle and the occurrence location of the target event specifically includes:
calculating the distance between the two points by the following great-circle distance formula:

Z = R · arccos(cos(WA) · cos(WB) · cos(JA − JB) + sin(WA) · sin(WB))

wherein Z represents the distance between the two points, and R represents the approximate radius of the earth; WA represents the latitude value of the point A, WB represents the latitude value of the point B, JA represents the longitude value of the point A, and JB represents the longitude value of the point B;
the point A is the position of a person or an unmanned aerial vehicle, and the point B is the target event occurrence place.
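As an illustrative aid (not part of the claimed method), the above formula can be implemented directly. The following minimal Python sketch assumes the latitude and longitude values are given in degrees and returns the distance in kilometers:

```python
import math

EARTH_RADIUS_KM = 6371.0  # approximate radius of the earth (R)

def great_circle_distance(lat_a, lon_a, lat_b, lon_b):
    """Distance Z between point A and point B by the spherical law of cosines.

    lat_a/lon_a correspond to WA/JA and lat_b/lon_b to WB/JB, in degrees.
    """
    wa, wb = math.radians(lat_a), math.radians(lat_b)
    ja, jb = math.radians(lon_a), math.radians(lon_b)
    c = math.cos(wa) * math.cos(wb) * math.cos(ja - jb) + math.sin(wa) * math.sin(wb)
    # Clamp guards against floating-point drift pushing |c| slightly above 1.
    return EARTH_RADIUS_KM * math.acos(max(-1.0, min(1.0, c)))
```

The clamp on the cosine term is a practical safeguard: for two nearly identical points, rounding can push the term slightly above 1, which would otherwise cause math.acos to raise an error.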
In a possible implementation manner of the first aspect, if there is no unmanned aerial vehicle and/or no person within the event range, the method further includes:
and according to the unmanned aerial vehicle position information and the personnel position information, taking the unmanned aerial vehicle closest to the target event occurrence place as a target unmanned aerial vehicle, and/or taking the personnel closest to the target event occurrence place as target personnel.
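A minimal Python sketch of this selection logic, including the fallback just described (illustrative only; the tuple representation and function names are assumptions, and great_circle_distance is the sketch given above):

```python
def select_nearest(candidates, event_lat, event_lon, in_range_ids=None):
    """Select the candidate (person or drone) closest to the event occurrence place.

    candidates: iterable of (identifier, latitude, longitude) tuples.
    in_range_ids: optional set of identifiers located within the event range;
    if the event range contains no candidate, all candidates are considered,
    matching the fallback described above.
    """
    pool = [c for c in candidates if in_range_ids is None or c[0] in in_range_ids]
    if not pool:  # no drone/person within the event range
        pool = list(candidates)
    return min(pool, key=lambda c: great_circle_distance(c[1], c[2], event_lat, event_lon))
```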
In a possible implementation manner of the first aspect, planning a route for the target drone to the place where the target event occurs according to the exploration survey information and the location of the target drone includes:
planning a route by a route planning algorithm according to the position of the target unmanned aerial vehicle and the flight-limiting area information, obstacle information, communication signal information and wireless link information in the exploration survey information.
In a possible implementation manner of the first aspect, determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range includes:
taking, as the event range, a region whose area is a preset area and which includes the position corresponding to the position information;
acquiring the exploration survey information of the event range from exploration survey information acquired in advance;
and obtaining scheduling information within an event range according to the real-time position information of the personnel and the real-time position information of the unmanned aerial vehicle.
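One illustrative way to realize these steps, assuming the circular event range with a preset radius described in the detailed embodiments below and the same position-tuple convention as the sketches above (all names hypothetical):

```python
def build_event_range(event_lat, event_lon, radius_km=2.0):
    """Represent a circular event range centered on the event occurrence place."""
    return {"lat": event_lat, "lon": event_lon, "radius_km": radius_km}

def in_event_range(event_range, lat, lon):
    """Check whether a position lies within the event range."""
    return great_circle_distance(
        event_range["lat"], event_range["lon"], lat, lon
    ) <= event_range["radius_km"]

def scheduling_info(event_range, persons, drones):
    """Filter real-time person and drone positions down to those inside the range.

    persons and drones are iterables of (identifier, latitude, longitude) tuples.
    """
    persons_in = [p for p in persons if in_event_range(event_range, p[1], p[2])]
    drones_in = [d for d in drones if in_event_range(event_range, d[1], d[2])]
    return persons_in, drones_in
```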
In a possible implementation manner of the first aspect, the global information includes audio information and/or image information of a target event occurrence location acquired by the target drone, and the local information includes audio information and/or image information of a target event occurrence location acquired by a terminal of the target person.
In a second aspect, an air-ground cooperative dispatching command platform system provided in the embodiments of the present application includes a dispatching command center, an unmanned aerial vehicle and a camera-enabled mobile terminal;
the dispatching command center is used for acquiring event information of a target event, wherein the event information comprises position information used for indicating the occurrence place of the target event; determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is a region range including the target event occurrence place, and the scheduling information comprises unmanned aerial vehicle information and personnel information within the event range; determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning a route for the target unmanned aerial vehicle to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle; after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and to the target unmanned aerial vehicle, receiving local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle, wherein the scheduling instructions comprise an instruction for instructing the target personnel to reach the target event occurrence place and an instruction for instructing the target unmanned aerial vehicle to use the route to reach the target event occurrence place; and displaying the global information and the local information;
the unmanned aerial vehicle is used for acquiring route information transmitted by the dispatching command center, acquiring global information of a target event according to the route information and transmitting the global information to the dispatching command center;
the camera-enabled mobile terminal is used for collecting local information of the target event and transmitting the local information to the dispatching command center.
In a third aspect, an embodiment of the present application provides a terminal device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the method according to any one of the first aspect is implemented.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the method according to any one of the above first aspects.
In a fifth aspect, embodiments of the present application provide a computer program product, which, when run on a terminal device, causes the terminal device to perform the method of any one of the first aspect.
It can be understood that, for the beneficial effects of the second aspect to the fifth aspect, reference may be made to the related description of the first aspect; details are not described herein again.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application.
The following first describes a system architecture to which embodiments of the present application may relate.
Referring to the schematic architecture diagram of the air-ground cooperative dispatching command platform system shown in fig. 1, the platform system may include a server (or dispatching command center) 11, an unmanned aerial vehicle 12, and a camera-enabled mobile terminal 13. The camera-enabled mobile terminal 13 refers to a camera-enabled mobile terminal device corresponding to the ground handling personnel, and may be, but is not limited to, a mobile phone, a tablet computer, a wearable device, or the like, and fig. 1 exemplarily shows that the camera-enabled mobile terminal is a mobile phone.
The server (or dispatching command center) 11 may communicate with the camera-enabled mobile terminal 13 of the ground handling personnel, and the communication mode is not limited; for example, the mobile phone of the ground handling personnel may communicate with the server (or dispatching command center) 11 through 4G, 5G or Wi-Fi. The communication between the server 11 and the unmanned aerial vehicle 12 can pass through the camera-enabled mobile terminal device of the unmanned aerial vehicle pilot, that is, the unmanned aerial vehicle 12 communicates with the camera-enabled mobile terminal device of the unmanned aerial vehicle pilot first, which then communicates with the server 11; or the server 11 communicates with the camera-enabled mobile terminal device of the unmanned aerial vehicle pilot first, which then communicates with the unmanned aerial vehicle 12.
The camera-enabled mobile terminal 13 and the unmanned aerial vehicle 12 both have audio and/or video capturing capabilities, and specifically, the camera-enabled mobile terminal 13 and the unmanned aerial vehicle 12 may each include, but are not limited to, a camera, a speaker, an audio capturing module, and the like, for taking pictures, capturing video pictures, capturing audio information, and the like.
By way of example and not limitation, referring to another schematic diagram of the air-ground cooperative dispatching command platform system shown in fig. 2, the platform system includes a dispatching command center 21, an air-ground cooperative dispatching command cloud platform 22, an aerial unmanned aerial vehicle side 23, an event site 24 and a ground handling personnel side 25.
The dispatching command center 21 includes a video dispatching command interface (for example, a Web browser) and a commander. The video dispatching command interface may include a large screen for receiving and displaying the global information of the event site 24 collected by the aerial unmanned aerial vehicle and the local information of the event site 24 collected by the ground handling personnel, both pushed by the air-ground cooperative dispatching command cloud platform 22. The air-ground cooperative dispatching command cloud platform 22, which may be equivalent to the server in fig. 1, communicates with the dispatching command center 21, the aerial unmanned aerial vehicle side 23 and the ground handling personnel side 25, and is used for receiving the global information of the event site 24 collected by the aerial unmanned aerial vehicle side 23 and the local information of the event site 24 collected by the ground handling personnel side 25, and then pushing the global information and the local information to the video dispatching command interface of the dispatching command center 21.
The aerial unmanned aerial vehicle side 23 includes an unmanned aerial vehicle, an unmanned aerial vehicle pilot who controls the unmanned aerial vehicle, and a camera-enabled mobile terminal device (not shown in the figure) corresponding to the pilot. The pilot can conduct a three-party video call through a WeChat applet on the camera-enabled mobile terminal device (for example, a WeChat applet on a mobile phone), and can control the unmanned aerial vehicle to capture audio and/or video information of the global perspective of the event site 24 through a pan-tilt camera. It should be noted that the pilot can both operate the unmanned aerial vehicle and conduct the three-party video call with the dispatching command center 21 and the ground handling personnel. However, this approach carries a risk; for example, the pilot may lose control of the unmanned aerial vehicle, resulting in a crash. In general, an assistant accompanies the pilot, so that the pilot can concentrate on controlling the unmanned aerial vehicle while the assistant conducts the three-party video call and relays information.
The ground handling personnel side 25 may collect audio and/or video information of a local perspective of the event site 24 through a WeChat applet on a camera-enabled mobile terminal device; specifically, the WeChat applet may collect the audio and/or video information of the local perspective of the event site through a camera on the camera-enabled mobile terminal device (for example, a mobile phone).
The audio and/or video information of the global perspective collected by the aerial unmanned aerial vehicle and the audio and/or video information of the local perspective collected by the ground handling personnel are transmitted to the air-ground cooperative dispatching command cloud platform 22, which pushes the global information and the local information to the dispatching command center 21 for display on the video dispatching command interface. In this way, the commander of the dispatching command center 21 can make real-time decisions for dispatching and command according to the audio and/or video information of the global perspective and of the local perspective.
The dispatching command center 21 can learn that a target event has occurred at a certain place through personnel or unmanned aerial vehicle patrol; the target event is generally an emergency event such as a fire or a traffic accident. After the background system of the dispatching command center 21 learns that an emergency event such as a fire or a traffic accident has occurred at a certain place, the emergency event range can be determined according to the place where the emergency event occurs. The emergency event range may be an area including the place where the emergency event occurs. After the emergency event range is determined, the flight-limiting area information, obstacle information, 4G signal and wireless link information within the event range, as well as the related information of the ground handling personnel, unmanned aerial vehicle pilots and unmanned aerial vehicles within the emergency event range, are acquired.
According to information such as the reported real-time positions of the ground handling personnel, the unmanned aerial vehicle pilots and the unmanned aerial vehicles, the background system of the dispatching command center 21 selects the ground handling personnel and the unmanned aerial vehicle closest to the emergency event occurrence place. The selected ground handling personnel go to the emergency event occurrence place and conduct a three-party video call with the dispatching command center and the aerial unmanned aerial vehicle side through a WeChat applet on a mobile phone. After receiving the command of the dispatching command center, the ground handling personnel collect local information of the event site through the camera of the mobile phone and transmit it back to the air-ground cooperative dispatching command cloud platform in real time, so that the local information is pushed to the video dispatching command interface of the dispatching command center for display. After the target unmanned aerial vehicle is determined, the unmanned aerial vehicle pilot on the aerial unmanned aerial vehicle side, or an assistant beside the pilot, can conduct the three-party video call with the dispatching command center and the ground handling personnel side through the WeChat applet of a mobile phone. The pilot controls the unmanned aerial vehicle to collect global information of the event site according to the scheduling of the dispatching command center and transmits the global information back to the air-ground cooperative dispatching command cloud platform, which pushes it to the video dispatching command interface of the dispatching command center for display. In this way, the dispatching command center can timely and effectively acquire the local information transmitted by the mobile devices of the ground handling personnel and the global information transmitted by the aerial unmanned aerial vehicle. After the global information and the local information of a target event such as a fire or a traffic accident are acquired in a timely and effective manner, the commander can perform command and scheduling work, for example, fire rescue command work, based on the acquired global information and local information.
In specific applications, the background system of the dispatching command center can plan a route from the position of the unmanned aerial vehicle to the event occurrence place according to information such as the flight-limiting area information, obstacle information, 4G signals and wireless links of the emergency event range. The background system then transmits the route information to the terminal device of the unmanned aerial vehicle pilot, and the pilot uses the route to control the unmanned aerial vehicle to fly to the place where the emergency event occurs. The unmanned aerial vehicle is connected with the camera-enabled mobile terminal device of the pilot; the unmanned aerial vehicle transmits the collected global information of the emergency event to the camera-enabled mobile terminal device in real time, the camera-enabled mobile terminal device transmits the global information to the air-ground cooperative dispatching command cloud platform, and the cloud platform pushes the global information to the video dispatching command interface of the dispatching command center.
It should be noted that, in some embodiments, the dispatching command center 21 includes the air-ground cooperative dispatching command cloud platform 22 and the video dispatching command interface. In other embodiments, the dispatching command center 21 may not include the air-ground cooperative dispatching command cloud platform 22.
Referring to fig. 3, which is a schematic diagram of a three-party video call, the diagram includes a dispatching command center interface 31, a ground handling personnel side 32 and an unmanned aerial vehicle pilot side 33. The large screen of the dispatching command center interface 31 displays a video picture 311 returned by the unmanned aerial vehicle, a ground monitoring video picture 312 and a video picture 313 of the WeChat applet. That is, the dispatching command center interface 31 can display the global perspective video picture collected by the unmanned aerial vehicle, the local perspective video picture collected by the ground handling personnel, the call video pictures of the unmanned aerial vehicle pilot, the ground handling personnel and the commander of the dispatching command center, and the like.
On the ground handling personnel side 32, the ground handling personnel join the XXX conference through the WeChat applet on a mobile phone to enter the three-party video call screen 321. Similarly, the unmanned aerial vehicle pilot, or an assistant beside the pilot, can join the XXX conference through the WeChat applet of a mobile phone to enter the three-party video call screen 331.
It should be noted that the multi-party video call among the dispatching command center, the unmanned aerial vehicle video and the applet video may be implemented based on real-time audio and video communication (RTC) technology, but the specific implementation manner is not limited thereto.
After the system architecture that may be involved in the embodiments of the present application has been described, the air-ground cooperative dispatching command process is described below from the dispatching command center side.
Referring to fig. 4, a schematic flow chart of an air-ground cooperative dispatching command method is shown, and the method is applied to a dispatching command center. The method specifically comprises the following steps:
Step S401, determining event information of the target event, wherein the event information comprises position information used for indicating the occurrence place of the target event.
It should be noted that the target event generally refers to an emergency event such as a fire or a traffic accident; of course, the target event may also be a non-emergency event. The event information of the target event may include, but is not limited to, position information, an event type and the like. The position information refers to the geographical position information of the place where the target event occurs, and the event type indicates what the target event is, for example, a fire or a traffic accident.
The event information of the target event may be discovered and reported to the dispatching command center during unmanned aerial vehicle patrol, or may be reported to the dispatching command center by the public. According to the information reported by the unmanned aerial vehicle or by the public, the dispatching command center can determine where the emergency event has occurred and related information such as the event type of the emergency event.
Step S402, determining an event range according to the position information, and acquiring exploration survey information and scheduling information of the event range, wherein the event range is an area range including a target event occurrence place, and the scheduling information includes unmanned aerial vehicle information and personnel information in the event range.
Specifically, the geographical position information of the target event occurrence place is obtained, and the event range can be determined based on the geographical position information. Then, the exploration survey information of the event range, and the information of the personnel and unmanned aerial vehicles located within the event range, are acquired.
In some embodiments, a region whose area is a preset area and which includes the position corresponding to the position information is used as the event range.
The event range may be a regular graphic region; for example, a range taking the position corresponding to the position information as the circle center and a preset radius value as the radius may be used as the event range, in which case the event range is a circular region. The event range may also be an irregular graphic region, which only needs to include the event occurrence place and have an area equal to the preset area.
The preset radius value may be, for example, 2 kilometers; in this case, the event range is a circular area with the event occurrence place as the circle center and a radius of 2 kilometers. The preset area can be set according to actual conditions. See the event range diagram shown in fig. 5. As shown in fig. 5, point b in area A is the target event occurrence place; in the left diagram, the circular area with point b as the center is the event range, and in the right diagram, an irregular area range including point b is the event range. The dashed graph in fig. 5 indicates the event range.
After the event range of the target event is determined, the exploration survey information of the event range is obtained from exploration survey information acquired in advance. The pre-acquired exploration survey information may include, but is not limited to, flight-limiting area information, obstacle information, 4G signals and wireless links.
Taking fig. 5 as an example, area A is surveyed in advance to obtain the exploration survey information of area A. When a fire occurs at point b in area A, the event range corresponding to point b is determined, and then the exploration survey information of the event range is obtained from the exploration survey information of area A.
After the event range of the target event is determined, the scheduling information within the event range can be obtained according to the real-time position information of the personnel and the real-time position information of the unmanned aerial vehicles. The scheduling information includes the ground handling personnel information, the unmanned aerial vehicle information, the unmanned aerial vehicle pilot information and the like within the event range.
Step S403, determining a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and planning a route for the target unmanned aerial vehicle to fly to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle.
In specific applications, the unmanned aerial vehicle and the ground handling personnel closest to the target event occurrence place may be scheduled to go to the event site to collect information. Alternatively, the unmanned aerial vehicle and the ground handling personnel may be scheduled according to both distance and working state; in this case, in addition to the distance from the event occurrence place, working state information such as whether the unmanned aerial vehicle is executing a task and whether the ground handling personnel are idle is also considered.
The scheduling information comprises the real-time position information of the ground handling personnel, the unmanned aerial vehicles and the unmanned aerial vehicle pilots within the event range. The distance between each ground handler or unmanned aerial vehicle within the event range and the event occurrence place is calculated according to the real-time position information and the position information of the event occurrence place. The unmanned aerial vehicle closest to the event occurrence place is selected as the target unmanned aerial vehicle, and the ground handler closest to the event occurrence place is selected as the target person.
In specific applications, after the background system determines the longitude and latitude information of the target event occurrence place, the longitude and latitude information of each ground handler, that is, the real-time position information of each ground handler, is obtained in real time. Then, the distance between each ground handler and the event occurrence place is calculated according to the longitude and latitude information of the event occurrence place and the real-time position information of each ground handler.
By way of example and not limitation, the distance between two points can be calculated from their longitudes and latitudes by the following great-circle distance formula:

Z = R · arccos(cos(WA) · cos(WB) · cos(JA − JB) + sin(WA) · sin(WB))

wherein Z represents the distance between the two points (that is, the point-to-point distance), and R represents the approximate radius of the earth; WA represents the latitude value of point A, WB represents the latitude value of point B, JA represents the longitude value of point A, and JB represents the longitude value of point B.
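For example, taking R as 6371 kilometers, point A at (30° N, 120° E) and point B at (31° N, 120° E): since JA − JB = 0, the formula reduces to Z = R · arccos(cos(WA − WB)) = 6371 × (π/180) ≈ 111.2 kilometers, that is, roughly the length of one degree of latitude, as expected.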
In some embodiments, the background system may first calculate the distance between each ground handler or unmanned aerial vehicle and the target event occurrence place, and then determine the target person and the target unmanned aerial vehicle according to the distances.
In still other embodiments, the background system may also display the real-time position of each ground handler, as well as the position information of the event occurrence place, on a large screen. In this way, personnel of the dispatching command center can manually select, by viewing the large screen, a target person and a target unmanned aerial vehicle close to the event occurrence place.
In other words, the distance between the position of each person within the event range and the target event occurrence place can be calculated from the person position information in the personnel information and the position information of the target event, and the person within the event range closest to the target event occurrence place is selected as the target person according to those distances. Likewise, the distance between the position of each unmanned aerial vehicle within the event range and the target event occurrence place is calculated from the unmanned aerial vehicle position information in the unmanned aerial vehicle information and the position information of the target event, and the unmanned aerial vehicle within the event range closest to the target event occurrence place is selected as the target unmanned aerial vehicle according to those distances.
After the target unmanned aerial vehicle is determined, the background system can plan a route through a route planning algorithm according to the real-time position of the target unmanned aerial vehicle and the position of the event occurrence place, in combination with the flight-limiting area information, obstacle information, wireless links and the like within the event range. The unmanned aerial vehicle pilot receives the route and loads it to the target unmanned aerial vehicle, and the target unmanned aerial vehicle can then fly from its current position to the event occurrence place along the route.
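The embodiments do not prescribe a particular route planning algorithm. As one illustrative Python sketch (an assumption, not the claimed algorithm), the surveyed event range can be discretized into a grid in which cells covered by flight-limiting areas, obstacles or poor signal coverage are blocked, and a breadth-first search can then return a feasible route:

```python
from collections import deque

def plan_route(grid, start, goal):
    """Breadth-first search over a 2D occupancy grid.

    grid: list of lists, 0 = free cell, 1 = blocked (flight-limiting area,
    obstacle, or cell without a usable communication signal / wireless link).
    start, goal: (row, col) tuples. Returns a list of cells or None.
    """
    rows, cols = len(grid), len(grid[0])
    parents = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:  # walk parents back to the start
                path.append(cell)
                cell = parents[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0 and (nr, nc) not in parents:
                parents[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no feasible route within the surveyed area
```

Breadth-first search is chosen here only for brevity; on a uniform grid it returns a shortest path in cell count, and a weighted planner such as A* could account for signal quality or flight time.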
In other embodiments, there may be no unmanned aerial vehicle, or no ground handling personnel, or neither, within the event range. In this case, unmanned aerial vehicles and/or ground handling personnel outside the event range can be called according to their distance from the event scene.
Referring to fig. 6, which is a schematic diagram of selecting the unmanned aerial vehicle and the ground handling personnel, point b is the event occurrence place, and the dotted-line area is the event range corresponding to point b. A plurality of ground handling personnel and a plurality of unmanned aerial vehicles are included within the event range. The two ground handlers closest to the event occurrence place, and the unmanned aerial vehicle and unmanned aerial vehicle pilot closest to the event occurrence place, are selected. The dispatching command center conducts a video call with the selected ground handling personnel and the selected unmanned aerial vehicle pilot to convey information, instructing the pilot to use the unmanned aerial vehicle to go to the event site to collect global information and instructing the ground handling personnel to go to the event site to collect local information.
Step S404, after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and to the target unmanned aerial vehicle, receiving the local information returned by the camera-enabled mobile terminal of the target person and the global information returned by the target unmanned aerial vehicle. The scheduling instructions include an instruction for instructing the target person to reach the target event occurrence place and an instruction for instructing the target unmanned aerial vehicle to use the route to reach the target event occurrence place.
In specific applications, after the target unmanned aerial vehicle and the target personnel are determined, scheduling instructions can be sent to the target unmanned aerial vehicle and the target personnel respectively, instructing the target unmanned aerial vehicle to fly to the event occurrence place and instructing the target personnel to go to the event occurrence place. The background system of the dispatching command center can plan a route through a route planning algorithm; the unmanned aerial vehicle pilot can then receive the route on the camera-enabled mobile terminal device and load it to the target unmanned aerial vehicle, and the target unmanned aerial vehicle flies to the event occurrence place along the route and collects global information of the event site through its image acquisition device.
After receiving the scheduling instruction, the target person can go to the event scene and collect local information of the event scene through the camera-enabled mobile terminal; the camera-enabled mobile terminal transmits the local information to the server, and the server pushes the local information to the video dispatching command interface of the dispatching command center.
It should be noted that, in some embodiments, the dispatching command center includes a server, which is equivalent to the air-ground cooperative dispatching command cloud platform of fig. 2. In other embodiments, as in the case shown in fig. 2, the dispatching command center does not include a server, that is, the dispatching command center does not include the air-ground cooperative dispatching command cloud platform.
The global information comprises audio information and/or image data of the target event occurrence place collected by the target unmanned aerial vehicle, and the local information comprises audio information and/or image data of the target event occurrence place collected by the camera-enabled mobile terminal of the target person. The image data may be picture data or video data. Generally, the global information is video information of a global perspective of the event site collected by the target unmanned aerial vehicle, and the local information is video information of a local perspective of the event site collected by the target person through the camera-enabled mobile terminal.
The global information differs from the local information in the following respect: the global information is information of the event scene obtained by the unmanned aerial vehicle inspecting and shooting from the sky, while the local information is information shot or seen by the ground handling personnel at the event scene, which the unmanned aerial vehicle cannot obtain. For example, when a fire breaks out in a building, the unmanned aerial vehicle can capture video or picture information outside the building but cannot capture the situation inside the building, whereas the ground handling personnel can collect video information inside and below the building through a terminal such as a mobile phone.
Step S405, displaying the global information and the local information.
In specific applications, the global information collected by the unmanned aerial vehicle and the local information collected by the target personnel can be displayed. For example, the global perspective video collected by the unmanned aerial vehicle and the local perspective video collected by the target person may be displayed on one large screen, with the global picture and the local picture integrated on the same display interface. In this way, the dispatching command center can command the unmanned aerial vehicle pilot to control the unmanned aerial vehicle to capture the global situation of the event site according to the displayed pictures, and can at the same time command the target personnel to capture the local details of the event site through the user terminal.
In other words, the global perspective video shot by the unmanned aerial vehicle from the air and the local perspective video collected by the ground handling personnel on the ground are transmitted to the air-ground cooperative dispatching command cloud platform, which then pushes the global perspective video and the local perspective video to the dispatching command center and displays them on the video dispatching command interface, so that the commander can command in real time and make immediate decisions.
In this way, the unmanned aerial vehicle near the target event occurrence place acquires the global information of the target event, and the target person near the target event occurrence place goes to the event site to acquire the local information, so that the global information and the local information of the target event (especially an emergency event such as a fire or a traffic accident) can be acquired in a timely and effective manner.
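Drawing the earlier sketches together (great_circle_distance, build_event_range, scheduling_info and select_nearest, all illustrative assumptions rather than the claimed implementation), a condensed flow of steps S401 to S403 over hypothetical data might look like this:

```python
# Hypothetical data; each position is an (identifier, latitude, longitude) tuple.
event_lat, event_lon = 30.000, 120.000  # step S401: target event occurrence place
persons = [("p1", 30.005, 120.002), ("p2", 30.300, 120.300)]
drones = [("d1", 30.010, 119.995), ("d2", 29.000, 119.000)]

# Step S402: determine the event range and the scheduling information inside it.
event_range = build_event_range(event_lat, event_lon, radius_km=2.0)
persons_in, drones_in = scheduling_info(event_range, persons, drones)

# Step S403: select the nearest person and drone, falling back to candidates
# outside the event range when none are inside it.
target_person = select_nearest(persons, event_lat, event_lon, {p[0] for p in persons_in})
target_drone = select_nearest(drones, event_lat, event_lon, {d[0] for d in drones_in})

print(target_person[0], target_drone[0])  # -> p1 d1
# Steps S404/S405 (dispatch instructions, receive and display the global and
# local streams) involve communication with real devices and are omitted here.
```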
Corresponding to the air-ground cooperative dispatching command method described in the above embodiments, fig. 7 shows a schematic structural block diagram of an air-ground cooperative dispatching command device provided in the embodiments of the present application. For convenience of description, only the parts related to the embodiments of the present application are shown. The device may be integrated in a server.
Referring to fig. 7, the apparatus may include:
an event information acquiring module 71, configured to determine event information of the target event, where the event information includes location information indicating an occurrence location of the target event;
the event range determining module 72 is configured to determine an event range according to the position information, and acquire exploration survey information and scheduling information of the event range, where the event range is an area range including a place where the target event occurs, and the scheduling information includes unmanned aerial vehicle information and personnel information within the event range;
the scheduling target determining module 73 is configured to determine a target unmanned aerial vehicle and target personnel according to the scheduling information and the position information, and plan a route for the target unmanned aerial vehicle to the target event occurrence place according to the exploration survey information and the position of the target unmanned aerial vehicle;
the information acquisition module 74 is configured to receive local information returned by the camera-enabled mobile terminal of the target person and global information returned by the target unmanned aerial vehicle after sending a scheduling instruction to the camera-enabled mobile terminal of the target person and the target unmanned aerial vehicle; the scheduling instructions comprise instructions for instructing target personnel to reach the target event occurrence location and instructions for instructing the target unmanned aerial vehicle to use the air route to reach the target event occurrence location;
and a display module 75 for displaying the global information and the local information.
In a possible implementation manner, the scheduling objective determining module is specifically configured to:
calculating the distance between the position of each person in the event range and the occurrence place of the target event according to the person position information in the person information and the position information of the target event;
calculating the distance between the position of each unmanned aerial vehicle within the event range and the target event occurrence place according to the unmanned aerial vehicle position information in the unmanned aerial vehicle information and the position information of the target event;
selecting a person closest to the target event occurrence place in the event range as a target person according to the distance between the position of each person in the event range and the target event occurrence place;
and selecting the unmanned aerial vehicle closest to the target event occurrence place in the event range as the target unmanned aerial vehicle according to the distance between the position of each unmanned aerial vehicle in the event range and the target event occurrence place.
In a possible implementation manner, the scheduling objective determining module is specifically configured to:
calculating the distance between the two points by the following great-circle distance formula:

Z = R · arccos(cos(WA) · cos(WB) · cos(JA − JB) + sin(WA) · sin(WB))

wherein Z represents the distance between the two points, and R represents the approximate radius of the earth; WA represents the latitude value of the point A, WB represents the latitude value of the point B, JA represents the longitude value of the point A, and JB represents the longitude value of the point B;
the point A is the position of a person or an unmanned aerial vehicle, and the point B is the target event occurrence place.
In a possible implementation manner, if there is no unmanned aerial vehicle and/or person within the event range, the scheduling objective determining module is further specifically configured to:
and according to the unmanned aerial vehicle position information and the personnel position information, taking the unmanned aerial vehicle closest to the target event occurrence place as a target unmanned aerial vehicle, and/or taking the personnel closest to the target event occurrence place as target personnel.
In a possible implementation manner, the scheduling objective determining module is specifically configured to:
planning a route by a route planning algorithm according to the position of the target unmanned aerial vehicle and the flight-limiting area information, obstacle information, communication signal information and wireless link information in the exploration survey information.
In a possible implementation manner, the event range determining module is specifically configured to:
taking, as the event range, a region whose area is a preset area and which includes the position corresponding to the position information;
acquiring the exploration survey information of the event range from exploration survey information acquired in advance;
and obtaining scheduling information within an event range according to the real-time position information of the personnel and the real-time position information of the unmanned aerial vehicle.
In one possible implementation manner, the global information includes audio information and/or image information of a target event occurrence place acquired by the target unmanned aerial vehicle, and the local information includes audio information and/or image information of a target event occurrence place acquired by a terminal of the target person.
The air-ground cooperative dispatching command device has the function of implementing the foregoing air-ground cooperative dispatching command method. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function, and the modules may be software and/or hardware.
It should be noted that the information interaction and execution processes between the above devices/modules are based on the same concept as the method embodiments of the present application; for their specific functions and technical effects, reference may be made to the method embodiments, and details are not described herein again.
Fig. 8 is a schematic structural diagram of a terminal device according to an embodiment of the present application. As shown in fig. 8, the terminal device 8 of this embodiment includes: at least one processor 80, a memory 81, and a computer program 82 stored in the memory 81 and executable on the at least one processor 80, the processor 80 implementing the steps in any of the various method embodiments described above when executing the computer program 82.
The terminal device 8 may be a computing device such as a server. The terminal device may include, but is not limited to, a processor 80, a memory 81. Those skilled in the art will appreciate that fig. 8 is merely an example of the terminal device 8, and does not constitute a limitation of the terminal device 8, and may include more or less components than those shown, or combine some components, or different components, such as an input-output device, a network access device, and the like.
The processor 80 may be a Central Processing Unit (CPU), or another general-purpose processor, a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 81 may in some embodiments be an internal storage unit of the terminal device 8, such as a hard disk or a memory of the terminal device 8. In other embodiments, the memory 81 may also be an external storage device of the terminal device 8, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 8. Further, the memory 81 may also include both an internal storage unit and an external storage device of the terminal device 8. The memory 81 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, and other programs, such as program codes of the computer program. The memory 81 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps that can be implemented in the above method embodiments.
The embodiments of the present application provide a computer program product; when the computer program product runs on a terminal device, the terminal device is enabled to implement the steps in the above method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, can implement the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/terminal apparatus, a recording medium, a computer memory, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an electrical carrier signal, a telecommunications signal, and a software distribution medium, for example, a USB flash disk, a removable hard disk, a magnetic disk or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, the computer-readable medium may not be an electrical carrier signal or a telecommunications signal.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.