CN109996039B - Target tracking method and device based on edge calculation - Google Patents


Info

Publication number
CN109996039B
CN109996039B (application CN201910270937.3A)
Authority
CN
China
Prior art keywords
target
aerial vehicle
unmanned aerial
ground
edge node
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910270937.3A
Other languages
Chinese (zh)
Other versions
CN109996039A (en)
Inventor
邓晓衡
刘亚军
李君�
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Central South University
Original Assignee
Central South University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Central South University
Priority to CN201910270937.3A
Publication of CN109996039A
Application granted
Publication of CN109996039B
Legal status: Active

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/10: Terrestrial scenes
    • G06V 20/13: Satellite images
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00: Scenes; Scene-specific elements
    • G06V 20/50: Context or environment of the image
    • G06V 20/52: Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04B: TRANSMISSION
    • H04B 7/00: Radio transmission systems, i.e. using radiation field
    • H04B 7/14: Relay systems
    • H04B 7/15: Active relay systems
    • H04B 7/185: Space-based or airborne stations; Stations for satellite systems
    • H04B 7/18502: Airborne stations
    • H04B 7/18506: Communications with or from aircraft, i.e. aeronautical mobile service
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00: Network arrangements or protocols for supporting network services or applications
    • H04L 67/01: Protocols
    • H04L 67/10: Protocols in which an application is distributed across nodes in the network
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/60: Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
    • H04N 21/63: Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STBs; Communication protocols; Addressing
    • H04N 21/643: Communication protocols
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/14: Picture signal circuitry for video frequency region
    • H04N 5/144: Movement detection
    • H04N 5/145: Movement estimation
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 7/00: Television systems
    • H04N 7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N 7/181: Closed-circuit television [CCTV] systems for receiving images from a plurality of remote sources
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D 30/00: Reducing energy consumption in communication networks
    • Y02D 30/70: Reducing energy consumption in wireless communication networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Astronomy & Astrophysics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a target tracking method based on edge computing, in which an unmanned aerial vehicle is primarily responsible for filming the target, a ground camera network assists in filming when the unmanned aerial vehicle cannot work, and an edge-node computing cluster formed from ground terminal devices processes and analyses the video data stream. When the unmanned aerial vehicle cannot track normally, the ground camera network and the edge nodes are selectively requested to join the tracking task, which improves the robustness of unmanned-aerial-vehicle tracking; even if the target is completely lost, the short-range communication between the edge nodes and the ground camera network allows the target to be efficiently recovered so that tracking can continue. Container technology isolates platform differences, solving the incompatibility of the video-analysis application across different platforms. The ground-air cooperation mode is chosen according to the current tracking state of the target, which reduces the energy consumption of the whole network and shortens its response time.

Description

Target tracking method and device based on edge calculation
Technical Field
The invention relates to the technical field of monitoring and tracking, and in particular to a target tracking method and device based on edge computing.
Background
A conventional urban target-tracking network relies on a large number of ground cameras and a cloud server, as shown in fig. 1: the ground cameras film continuously, massive amounts of data are uploaded to the cloud server for centralised processing and analysis, and the cloud server feeds the results back to a client for viewing and tracking. Network bandwidth is a time-varying and limited resource, so uploading massive video data to the cloud server for centralised processing occupies a large amount of bandwidth and causes congestion; the delay of the whole communication process then becomes too high for real-time tracking, and the target may even be lost.
With the development of unmanned-aerial-vehicle technology, applications based on unmanned aerial vehicles are increasingly common, and target tracking with an unmanned aerial vehicle is one of the hot topics in the security field. As shown in fig. 2, an unmanned-aerial-vehicle-based target tracking system consists of the unmanned aerial vehicle and a ground station: the camera carried by the unmanned aerial vehicle is responsible for filming and data transmission, and the ground station is a communication station with high-performance computing that processes the video stream transmitted to it. Exploiting the flexibility and wireless communication capability of the unmanned aerial vehicle, the system can follow a moving target and, compared with a conventional tracking network, reduce the amount of video data handled and hence the delay. However, the limited endurance and wireless communication range of the unmanned aerial vehicle mean that, on the one hand, the tracking mission must not last too long and, on the other hand, the ground station must follow the unmanned aerial vehicle to guarantee reliable communication.
Therefore, when designing an urban target-tracking network, the high video-processing delay of the conventional tracking network must be overcome, as must the endurance and robustness problems of the newer unmanned-aerial-vehicle tracking network. The conventional tracking methods above are clearly unsuited to real urban target-tracking applications, so non-real-time tracking and target loss occur frequently.
In short, the existing target tracking methods cannot satisfy the real-time and reliability requirements of target-tracking applications.
Disclosure of Invention
The present invention is directed to solving at least the problems of the prior art. The invention therefore discloses a target tracking method based on edge computing, in which an unmanned aerial vehicle is primarily responsible for filming the target, a ground camera network assists in filming when the unmanned aerial vehicle cannot work, and an edge-node computing cluster formed from ground terminal devices processes and analyses the video data streams filmed by the unmanned aerial vehicle and the ground camera network.
Furthermore, the terminal devices are the edge nodes of edge computing and include mobile phones, notebooks, routers, edge servers and base stations: devices located at the edge of the network that have computing, storage and communication capability.
Furthermore, while the unmanned aerial vehicle tracks the target, it offloads the video-analysis task to available edge nodes on the ground; when the unmanned aerial vehicle is about to lose the target because of insufficient endurance or an obstructed line of sight, it offloads the target-tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the tracking task temporarily until the unmanned aerial vehicle returns to normal operation or the target reappears in its line of sight.
Furthermore, when the target is determined to be lost, the loss point and the direction and speed of the target's movement are used to work out in which nearby cameras' coverage the target may appear; the video between the time the target last appeared and the time its loss was discovered is offloaded to an edge node for analysis, the target's escape direction and speed are found, and the target is recovered.
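The loss-recovery step above (using the loss point plus the estimated direction and speed to decide which nearby cameras to query) can be sketched as follows. The function name, camera records and slack factors are illustrative assumptions, not taken from the patent:

```python
import math

def candidate_cameras(loss_point, direction_deg, speed_mps, elapsed_s, cameras, margin=1.0):
    """Select cameras whose coverage circle may contain the escaped target.

    loss_point: (x, y) last-known position in metres.
    cameras: list of dicts with 'id', 'pos' (x, y) and 'radius' (coverage, m).
    margin: multiplicative slack on the predicted travel distance.
    """
    # Predict where the target could be after elapsed_s seconds.
    travel = speed_mps * elapsed_s * margin
    rad = math.radians(direction_deg)
    predicted = (loss_point[0] + travel * math.cos(rad),
                 loss_point[1] + travel * math.sin(rad))
    # A camera is a candidate if the predicted point, with extra slack for
    # error in the direction estimate, falls inside its coverage circle.
    hits = []
    for cam in cameras:
        if math.dist(predicted, cam['pos']) <= cam['radius'] + travel * 0.5:
            hits.append(cam['id'])
    return hits
```

Only the selected cameras (and the edge nodes near them) would then receive the offloaded recovery task, rather than the whole network.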
Further, the unmanned aerial vehicle and the ground cameras are connected via 5G or Wi-Fi; the unmanned aerial vehicle and the ground edge nodes via 5G or Wi-Fi; the ground cameras with each other via wired links or 5G; and the ground cameras and the ground edge nodes via Wi-Fi, 5G or wired links.
Furthermore, the differences among the heterogeneous platforms are isolated with a virtualised container technology (for example Docker) or another container technology. An API of the target-tracking application program is installed in a container on the edge node; the unmanned aerial vehicle sends the tracked video to the ground edge node via Wi-Fi or another channel and issues a request through the WAMP interface in the container; after receiving the request, the edge node processes the target-tracking task and returns the result to the unmanned aerial vehicle.
Furthermore, the tracking algorithm adopted by the method is a deep-learning-based target tracking algorithm; its processing comprises the two stages of target recognition and target tracking, and its results comprise the target's moving direction, moving speed and position.
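The tracking output described above (moving direction, moving speed and position) can be derived from the per-frame bounding boxes that a recognition stage produces. This is a minimal sketch under assumed conventions (pixel coordinates, a fixed frame rate, one box per consecutive frame); all names are illustrative:

```python
import math

def motion_from_boxes(boxes, fps):
    """Estimate direction (degrees), speed (pixels/s) and last position
    from per-frame bounding boxes (x, y, w, h), one per consecutive frame."""
    # Use box centres as the target trajectory.
    centres = [(x + w / 2, y + h / 2) for x, y, w, h in boxes]
    dx = centres[-1][0] - centres[0][0]
    dy = centres[-1][1] - centres[0][1]
    frames = len(boxes) - 1
    speed = math.hypot(dx, dy) * fps / frames      # pixels per second
    direction = math.degrees(math.atan2(dy, dx))   # 0 deg = +x axis
    return direction, speed, centres[-1]
```

A deployed system would convert pixel velocities to ground coordinates via the camera calibration, but the direction/speed/position triple is the same.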
Further, the container technology communicates over the WAMP protocol through the API exposed by Docker. WAMP is built on WebSocket and supports both publish/subscribe and remote-procedure-call (RPC) messaging. The Docker container packages the video-analysis application and isolates its resources using Linux kernel namespaces and cgroups (control groups), standardising the application.
Furthermore, when the ground camera network needs to request an edge node to perform video analysis, the request is announced with a publish message; an edge node near a given camera subscribes to that camera's service with a subscribe message; once the camera and the edge node are connected, the target-tracking application to be offloaded is transmitted and processed via RPC.
The invention also discloses a target tracking system based on edge computing, comprising: a main tracking module, which films the target with the unmanned aerial vehicle; an auxiliary tracking module, which has the ground camera network assist in filming the target when the unmanned aerial vehicle cannot work; and an analysis module, in which an edge-node computing cluster formed from ground terminal devices processes and analyses the video data streams filmed by the unmanned aerial vehicle and the ground camera network. The terminal devices are the edge nodes of edge computing and include mobile phones, notebooks, routers, edge servers and base stations: devices with computing, storage and communication capability located at the edge of the network. While the unmanned aerial vehicle tracks the target, it offloads the video-analysis task to available edge nodes on the ground; when it is about to lose the target because of insufficient endurance or an obstructed line of sight, it offloads the target-tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the tracking task temporarily until the unmanned aerial vehicle returns to normal operation or the target reappears in its line of sight. When the target is determined to be lost, the analysis module uses the loss point and the direction and speed of the target's movement to work out in which nearby cameras' coverage the target may appear, and the video between the time the target last appeared and the time its loss was discovered is offloaded to an edge node for analysis, so that the escape direction and speed are found and the target is recovered. The main tracking module, the auxiliary tracking module and the analysis module run on different heterogeneous platforms, whose differences are isolated with container technology. An API of the target-tracking application program is installed in a container on the edge node; the unmanned aerial vehicle sends the tracked video to the ground edge node via Wi-Fi or another channel and issues a request through the WAMP interface in the container; after receiving the request, the edge node processes the target-tracking task and returns the result to the unmanned aerial vehicle.
The invention further discloses an electronic device comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above target tracking method based on edge computing by executing the executable instructions.
The invention further discloses a computer-readable storage medium storing a computer program which, when executed by a processor, implements the above target tracking method based on edge computing.
The method provided by the invention addresses the high delay and low reliability of existing urban target-tracking methods with a target tracking method based on edge computing: an unmanned aerial vehicle is responsible for the main filming, a ground camera network assists when the unmanned aerial vehicle cannot work, and ground terminal devices process and analyse the video data stream, as shown in fig. 3. The terminal devices are the edge nodes of edge computing and include mobile phones, notebooks, routers, edge servers and base stations: devices with computing, storage and communication capability located at the edge of the network. While the unmanned aerial vehicle tracks the target, it offloads the video-analysis task to available edge nodes on the ground. When it is about to lose the target because of insufficient endurance or an obstructed line of sight, it offloads the target-tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the tracking task temporarily until the unmanned aerial vehicle returns to normal operation or the target reappears in its line of sight. In the worst case, when the target is determined to be lost, the loss point and the direction and speed of the target's movement are used to work out in which nearby cameras' coverage the target may appear; the video between the time the target last appeared and the time its loss was discovered is offloaded to an edge node for analysis, the escape direction and speed are found, and the target is recovered.
In the target tracking method, the communication modes involved are: the unmanned aerial vehicle and the ground cameras connect via 5G or Wi-Fi; the unmanned aerial vehicle and the ground edge nodes via 5G or Wi-Fi; the ground cameras with each other via wired links or 5G; and the ground cameras and the ground edge nodes via Wi-Fi, 5G or wired links. The differences among the heterogeneous platforms are isolated with a virtualised container technology or another container technology, for example the container engine Docker. Docker or another container is installed on the unmanned aerial vehicle, the ground camera network and the ground edge nodes, isolating the platform differences among the three classes of equipment and among individual devices. The tracking algorithm used is a deep-learning-based target tracking algorithm; its processing comprises the two stages of target recognition and target tracking, and its results comprise the target's moving direction, moving speed and position.
The method combines the reliability of conventional target tracking with the real-time performance of unmanned-aerial-vehicle tracking. On the one hand, the unmanned aerial vehicle and the ground edge nodes jointly process the tracking task, reducing the video-processing delay of conventional target tracking; on the other hand, when the unmanned aerial vehicle cannot track normally, selectively requesting the ground camera network and the edge nodes to join the tracking task improves the robustness of the tracking, and even if the target is eventually lost completely, the short-range communication between the edge nodes and the ground camera network allows it to be recovered efficiently so that tracking can continue. For deployment, Docker or another container technology isolates platform differences, solving the incompatibility of the video-analysis application across platforms. At the level of the whole network, choosing the ground-air cooperation mode according to the current tracking state of the target reduces the energy consumption of the whole network and shortens its response time.
A problem addressed by the invention is that the unmanned aerial vehicle, the ground cameras, the edge computing cluster and other heterogeneous platforms are mutually incompatible: if the platforms transmitted and processed the video data stream directly, communications would not be recognised and video transcoding would fail. For this reason we adopt the currently popular heterogeneous-platform isolation technology, Docker containers, and apply it with improvements. A Docker container packages the video-analysis application and isolates its resources with Linux kernel namespaces and cgroups, standardising the application so that different applications on the same platform do not interfere with one another's resources.
Second, to solve the communication problem between heterogeneous platforms and across multiple Docker containers, the WAMP protocol is used over the API exposed by Docker. WAMP is a protocol built on WebSocket that supports both publish/subscribe and RPC communication. When a ground camera needs to request an edge computing node to perform video analysis, it announces the request with a publish message; an edge computing node near the camera that is interested in the camera's service subscribes to it with a subscribe message; once the camera and the edge computing node are connected, the target-tracking application to be offloaded is transmitted and processed via RPC. This approach decouples the communication endpoints from one another, and is the main contribution of the invention to the multi-platform data-transmission problem.
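The publish/subscribe/RPC offloading pattern described above can be illustrated with a toy in-process broker. This is not the real WAMP or Autobahn API, only a sketch of the decoupling the scheme relies on: the camera publishes to a topic without knowing which edge nodes are listening, and the offloaded task then runs through a named procedure:

```python
class MiniBroker:
    """Toy stand-in for a WAMP-style router: topics for publish/subscribe,
    named procedures for RPC. Not the real WAMP/Autobahn API."""

    def __init__(self):
        self.subscribers = {}   # topic -> list of callbacks
        self.procedures = {}    # name  -> callable

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        # Deliver to every subscriber; the publisher never learns who they
        # are, which is the decoupling the WAMP pattern provides.
        for cb in self.subscribers.get(topic, []):
            cb(payload)

    def register(self, name, func):
        self.procedures[name] = func

    def call(self, name, *args):
        return self.procedures[name](*args)

# A camera announces an analysis request; an edge node subscribed to the
# topic receives it, and the offloaded task runs via an RPC-style call.
broker = MiniBroker()
received = []
broker.subscribe("camera.42.analysis", received.append)
broker.register("track", lambda frames: f"analysed {frames} frames")
broker.publish("camera.42.analysis", {"frames": 120})
result = broker.call("track", received[0]["frames"])
```

In a real deployment the broker would be a WAMP router reachable over WebSocket, but the subscribe/publish/register/call roles are the same.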
Drawings
The invention will be further understood from the following description in conjunction with the accompanying drawings. The components in the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the embodiments. In the drawings, like reference numerals designate corresponding parts throughout the different views.
FIG. 1 is a prior art conventional target tracking network topology;
FIG. 2 is a prior art drone-based target tracking network topology;
FIG. 3 is a network topology diagram of a target tracking method based on edge computing according to an embodiment of the present invention;
FIG. 4 is a flowchart of a target tracking method based on edge calculation according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention clearer, the present invention will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the detailed description set forth in connection with the appended drawings is intended as a description of various configurations and is not intended to represent the only configurations in which the concepts described herein may be practised. The apparatus and method embodiments described herein are illustrated in the detailed description and the accompanying drawings by various blocks, modules, units, components, circuits, steps, processes, algorithms and the like (collectively, "elements"). These elements may be implemented using electronic hardware, computer software, or any combination thereof; whether they are implemented as hardware or software depends upon the particular application and the design constraints imposed on the overall system. The terms "first", "second", etc. in the description, claims and drawings, where used, distinguish between different objects and do not describe a particular order.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It is also to be understood that the terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used in the specification of the present invention and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
It should be noted that, unless otherwise specified, the technical features in the embodiments of the present invention may be combined or coupled with one another as long as the combination is not technically impossible. While certain exemplary, optional or preferred features may be described in combination with other features for a more complete description of the invention, such features may also be separated from one another, provided the separation is not technically impossible. A functional description of a technical feature in a method embodiment may be understood as performing that function, method or step; in an apparatus embodiment, as using the apparatus to perform that function, method or step.
Example one
As shown in fig. 4, in the target tracking method based on edge computing, the unmanned aerial vehicle does the main filming of the target, the ground camera network assists in filming when the unmanned aerial vehicle cannot work, and the video data streams filmed by both are processed and analysed by an edge-node computing cluster formed from ground terminal devices.
Furthermore, the terminal devices are the edge nodes of edge computing and include mobile phones, notebooks, routers, edge servers and base stations: devices located at the edge of the network that have computing, storage and communication capability.
Furthermore, while the unmanned aerial vehicle tracks the target, it offloads the video-analysis task to available edge nodes on the ground; when the unmanned aerial vehicle is about to lose the target because of insufficient endurance or an obstructed line of sight, it offloads the target-tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the tracking task temporarily until the unmanned aerial vehicle returns to normal operation or the target reappears in its line of sight.
Furthermore, when the target is determined to be lost, the loss point and the direction and speed of the target's movement are used to work out in which nearby cameras' coverage the target may appear; the video between the time the target last appeared and the time its loss was discovered is offloaded to an edge node for analysis, the target's escape direction and speed are found, and the target is recovered.
Furthermore, the unmanned aerial vehicle and the ground cameras are connected via 5G or Wi-Fi; the unmanned aerial vehicle and the ground edge nodes via 5G or Wi-Fi; the ground cameras with each other via wired links or 5G; and the ground cameras and the ground edge nodes via Wi-Fi, 5G or wired links.
Furthermore, differences between the heterogeneous platforms are isolated using Docker or another container technology. An API (application programming interface) of the target tracking application is installed in the container on each edge node; the unmanned aerial vehicle transfers the tracked video to a ground edge node over Wi-Fi or another link and sends a request through the WAMP (Web Application Messaging Protocol) interface in the container, and the edge node processes the target tracking task after receiving the request and returns the result to the unmanned aerial vehicle.
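The request/response exchange between the drone and the container-hosted edge service might look like the sketch below. The JSON field names and the injected `run_tracker` callable are illustrative assumptions; in a real deployment these messages would travel over a WAMP session inside the container rather than as plain JSON strings.

```python
import json

def build_offload_request(drone_id, video_uri, task="target_tracking"):
    # Drone side: package the offload request for the edge node's API
    # (field names here are hypothetical, not the patent's wire format).
    return json.dumps({"drone_id": drone_id, "task": task, "video": video_uri})

def handle_offload_request(raw, run_tracker):
    # Edge-node side: parse the request, run the tracking task via the
    # injected `run_tracker` callable, and return the response payload.
    req = json.loads(raw)
    result = run_tracker(req["video"])
    return json.dumps({"drone_id": req["drone_id"], "result": result})
```

Injecting `run_tracker` keeps this transport sketch independent of any particular deep-learning tracker running in the container.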
Furthermore, the tracking algorithm adopted by the method is a deep-learning-based target tracking algorithm; the processing comprises two stages, target identification and target tracking, and the results include the target's direction of movement, speed, and position.
The embodiment further discloses an electronic device, characterized by comprising: a processor; and a memory for storing executable instructions of the processor; wherein the processor is configured to perform the above edge computing-based target tracking method via execution of the executable instructions.
In this embodiment, the electronic device may be an unmanned-aerial-vehicle flight monitoring device that tracks and shoots the target by running the algorithm with GPS or BeiDou positioning and a camera.
The present embodiment further discloses a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the above edge computing-based target tracking method.
In this embodiment, a system built with the edge computing-based target tracking method includes: the unmanned aerial vehicle, which carries a positioning and navigation module such as GPS or BeiDou, an accelerometer, a gyroscope, a camera, a micro-processing module, and a communication module; the ground camera network, which may consist of roadside monitoring or traffic cameras from various manufacturers; and the ground edge nodes, which may be any nodes with computing, storage, and communication capability, such as mobile terminals, routers, edge servers, or base stations.

Each platform runs Docker or another container. The container on each edge node exposes the API of the target tracking application; the unmanned aerial vehicle transfers the tracked video to a ground edge node over Wi-Fi or another link, sends a request through an interface such as WAMP in the container, and the edge node processes the target tracking task after receiving the request and returns the result to the unmanned aerial vehicle.

When the unmanned aerial vehicle is about to lose the target, it requests the ground cameras to take over tracking. Only part of the cameras are requested, selected according to the unmanned aerial vehicle's current position, its maximum communication range, and the topology of the ground camera network, so that the target remains within the monitored area; the ground cameras can also communicate and interact with one another, which reduces the unmanned aerial vehicle's energy and time consumption. When the target is completely lost, the video captured by cameras near the loss position is offloaded to nearby edge nodes for analysis and recovery.
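Selecting which cameras to hand the tracking task to, based on the drone's position, its maximum communication range, and the camera network topology, could be sketched like this. The hop-expansion rule and all names are illustrative assumptions, not the patent's actual selection algorithm.

```python
import math
from collections import deque

def select_relay_cameras(uav_pos, comm_range, positions, adjacency, hops=1):
    # Seed with cameras inside the UAV's communication range.
    seeds = {c for c, (x, y) in positions.items()
             if math.hypot(x - uav_pos[0], y - uav_pos[1]) <= comm_range}
    # Expand `hops` steps through the camera network topology (BFS) so
    # neighbouring cameras can pick the target up as it moves.
    selected, frontier = set(seeds), deque((c, 0) for c in seeds)
    while frontier:
        cam, depth = frontier.popleft()
        if depth == hops:
            continue
        for nb in adjacency.get(cam, ()):
            if nb not in selected:
                selected.add(nb)
                frontier.append((nb, depth + 1))
    return selected
```

Requesting only this subset, rather than the whole camera network, is what keeps the drone's communication energy and time cost low.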
The available edge nodes are computing devices with a certain amount of computing, storage, and communication capability; such devices can temporarily form a computing cluster over whatever communication links are available in order to handle the video analysis tasks.
In this embodiment, multi-container isolation based on edge computing bridges the differences between the platforms in the tracking network, and the unmanned aerial vehicle performs target tracking jointly with the ground camera network and the ground edge nodes. The joint mode is selected according to the target's current tracking state.
In a possible scenario, some areas covered by the ground tracking system have no roadside cameras, and/or the target's surroundings, such as roads and buildings, prevent the unmanned aerial vehicle from operating; in such situations a conventional tracking method easily loses the target. In the present method, a computing cluster formed from edge nodes analyzes the target's escape direction and speed and predicts its escape trajectory with a preset algorithm, and the result is fed back to the unmanned aerial vehicle so that the lost target can be found again. In the case where the target is completely lost, the loss point and the target's direction and speed of movement are calculated to determine within which nearby cameras' monitoring coverage the target may appear; the video between the target's last appearance and the detection of its loss is offloaded to the edge nodes for analysis, the escape direction and speed are recovered, and the unmanned aerial vehicle is directed to cooperatively monitor key sites along the way, yielding a more accurate prediction. Once the target's location is determined, the unmanned aerial vehicle resumes tracking it.
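The offloading step in this recovery flow, cutting out the video window between the target's last appearance and the detected loss and distributing it across the edge-node cluster, can be sketched as below. The greedy least-loaded dispatch is an illustrative assumption, since the patent does not fix a scheduling policy.

```python
def frames_to_offload(frames, t_last_seen, t_loss_detected):
    """Select the (timestamp, frame) pairs between the target's last
    appearance and the moment the loss was detected; only this window
    is offloaded for analysis."""
    return [f for t, f in frames if t_last_seen <= t <= t_loss_detected]

def assign_to_cluster(segments, node_loads):
    """Greedy dispatch: each video segment goes to the currently
    least-loaded edge node in the cluster (loads counted in segments)."""
    plan = {}
    for seg in segments:
        node = min(node_loads, key=node_loads.get)
        plan[seg] = node
        node_loads[node] += 1
    return plan
```

In practice the loads would reflect each node's actual compute and queue depth rather than a simple segment count.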
Example two
This embodiment describes the invention from a hardware perspective. Referring to the topology diagram in fig. 3, this embodiment discloses an edge computing-based target tracking system, characterized by comprising: a main tracking module, which shoots the target primarily through the unmanned aerial vehicle; an auxiliary tracking module, which uses the camera network to assist in shooting the target when the unmanned aerial vehicle cannot work; and an analysis module, in which an edge node computing cluster formed from ground terminal devices processes and analyzes the video data streams captured by the unmanned aerial vehicle and the ground camera network. The terminal devices are edge nodes in the edge computing sense, including mobile phones, laptops, routers, edge servers, and base stations, that is, devices with computing, storage, and communication capability located at the edge of the network. While the unmanned aerial vehicle tracks the target, it offloads the video analysis task to available edge nodes on the ground; when it is about to lose the target due to insufficient flight endurance or an obstructed line of sight, it offloads the target tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the temporary tracking task until the unmanned aerial vehicle returns to normal operation or the target reappears within its line of sight. When the target is determined to be lost, the analysis module analyzes the target's video data stream: the loss point and the target's direction and speed of movement are calculated to determine within which nearby cameras' monitoring coverage the target may appear, and the video between the target's last appearance and the detection of its loss is offloaded to the edge nodes for analysis, so that the escape direction and speed are recovered and the target is found again. The main tracking module, auxiliary tracking module, and analysis module run on different heterogeneous platforms, whose differences are isolated using Docker or another container technology; an API (application programming interface) of the target tracking application is installed in the container on each edge node, the unmanned aerial vehicle transfers the tracked video to a ground edge node over Wi-Fi or another link and sends a request through the WAMP (Web Application Messaging Protocol) interface in the container, and the edge node processes the target tracking task after receiving the request and returns the result to the unmanned aerial vehicle.
In this embodiment the terminal device may take various forms. For example, the terminal described in the present invention may be a mobile terminal such as a mobile phone, tablet computer, laptop, palmtop computer, personal digital assistant (PDA), portable media player (PMP), navigation device, wearable device, smart band, or pedometer, or a fixed terminal such as a digital TV or desktop computer, provided the electronic terminal has computing, storage, and communication capability.
The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex Long Term Evolution), and TDD-LTE (Time Division Duplex Long Term Evolution).
The integrated unit in the above embodiments, if implemented in the form of a software functional unit and sold or used as a separate product, may be stored in the above computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing one or more computer devices (which may be personal computers, servers, network devices, etc.) to execute all or part of the steps of the method according to the embodiments of the present invention.
In the above embodiments of the present invention, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed client may be implemented in other manners. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one type of division of logical functions, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, units or modules, and may be in an electrical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The foregoing is only a preferred embodiment of the present invention, and it should be noted that, for those skilled in the art, various modifications and decorations can be made without departing from the principle of the present invention, and these modifications and decorations should also be regarded as the protection scope of the present invention.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
Although the invention has been described above with reference to various embodiments, it should be understood that many changes and modifications may be made without departing from the scope of the invention. It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention. The above examples are to be construed as merely illustrative and not limitative of the remainder of the disclosure. After reading the description of the invention, the skilled person can make various changes or modifications to the invention, and these equivalent changes and modifications also fall into the scope of the invention defined by the claims.

Claims (5)

1. An edge computing-based target tracking method, characterized in that a target is primarily shot by an unmanned aerial vehicle, a ground camera network assists in shooting when the unmanned aerial vehicle cannot work, and an edge node computing cluster formed from ground terminal devices processes and analyzes the video data streams captured by the unmanned aerial vehicle and the ground camera network;
the terminal devices are edge nodes in edge computing, including mobile phones, laptops, routers, edge servers, and base stations; these devices have computing, storage, and communication capability, are located at the edge of the network, and process the video data stream information by cooperative analysis;
while the unmanned aerial vehicle tracks the target, it offloads the video analysis task to available edge nodes on the ground; when the unmanned aerial vehicle loses the target due to insufficient flight endurance or an obstructed line of sight, it offloads the target tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the temporary tracking task until the unmanned aerial vehicle returns to normal operation or the target reappears within its line of sight;
when the target is determined to be lost, the loss point and the target's direction and speed of movement are calculated to determine within which nearby cameras' monitoring coverage the target may appear; the video between the time the target last appeared and the time its loss was detected is offloaded to an edge node for analysis, the target's escape direction and speed are found, and the unmanned aerial vehicle is notified to shoot cooperatively so that the target is found again;
the unmanned aerial vehicle and the ground cameras are connected via 5G or Wi-Fi; the unmanned aerial vehicle and the ground edge nodes via 5G or Wi-Fi; the ground cameras to one another via wired links or 5G; and the ground cameras and the ground edge nodes via Wi-Fi, 5G, or wired links;
differences between the heterogeneous platforms are isolated using a virtualization container technology or another container technology; an API (application programming interface) of the target tracking application is installed in the container on each edge node, the unmanned aerial vehicle transfers the tracked video to a ground edge node over Wi-Fi or another link and sends a request through the WAMP (Web Application Messaging Protocol) interface in the container, and the edge node processes the target tracking task after receiving the request and returns the result to the unmanned aerial vehicle.
2. The edge computing-based target tracking method according to claim 1, wherein the tracking algorithm adopted is a deep-learning-based target tracking algorithm, the processing comprises two stages, target identification and target tracking, and the results include the target's direction of movement, speed, and position.
3. The edge computing-based target tracking method according to claim 2, wherein the virtual container technology communicates through its reserved API using the WAMP protocol; the WAMP protocol is built on WebSocket and supports both publish/subscribe and remote procedure call (RPC) messaging patterns; the virtual container packages the video analysis application and then isolates its resources using Linux kernel namespaces and cgroups (control groups), thereby standardizing the video analysis application.
4. The edge computing-based target tracking method according to claim 3, wherein when the ground camera network needs to request an edge node to perform video analysis, the request is announced via the publish operation, an edge node near a given camera in the ground camera network subscribes to that camera's service via the subscribe operation, and once the ground camera network and the edge node are connected, the ground camera and the edge node transfer and process the target tracking application to be offloaded via the remote procedure call (RPC) operation.
5. An edge computing-based target tracking device, comprising: a main tracking module, which shoots the target primarily through the unmanned aerial vehicle; an auxiliary tracking module, which uses the camera network to assist in shooting the target when the unmanned aerial vehicle cannot work; and an analysis module, in which an edge node computing cluster formed from ground terminal devices processes and analyzes the video data streams captured by the unmanned aerial vehicle and the ground camera network, the terminal devices being edge nodes in edge computing, including mobile phones, laptops, routers, edge servers, and base stations, that is, devices with computing, storage, and communication capability located at the edge of the network; while the unmanned aerial vehicle tracks the target, it offloads the video analysis task to available edge nodes on the ground; when it is about to lose the target due to insufficient flight endurance or an obstructed line of sight, it offloads the target tracking task to part of the ground camera network, and the ground camera network and the edge nodes jointly take over the temporary tracking task until the unmanned aerial vehicle returns to normal operation or the target reappears within its line of sight; when the target is determined to be lost, the analysis module analyzes the target's video data stream: the loss point and the target's direction and speed of movement are calculated to determine within which nearby cameras' monitoring coverage the target may appear, and the video between the target's last appearance and the detection of its loss is offloaded to the edge nodes for analysis, so that the escape direction and speed are found and the target is recovered; the main tracking module, auxiliary tracking module, and analysis module run on different heterogeneous platforms, whose differences are isolated using a virtualization container technology or another container technology; an API (application programming interface) of the target tracking application is installed in the container on each edge node, the unmanned aerial vehicle transfers the tracked video to a ground edge node over Wi-Fi or another link and sends a request through the WAMP (Web Application Messaging Protocol) interface in the container, and the edge node processes the target tracking task after receiving the request and returns the result to the unmanned aerial vehicle.
CN201910270937.3A 2019-04-04 2019-04-04 Target tracking method and device based on edge calculation Active CN109996039B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910270937.3A CN109996039B (en) 2019-04-04 2019-04-04 Target tracking method and device based on edge calculation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910270937.3A CN109996039B (en) 2019-04-04 2019-04-04 Target tracking method and device based on edge calculation

Publications (2)

Publication Number Publication Date
CN109996039A CN109996039A (en) 2019-07-09
CN109996039B true CN109996039B (en) 2021-06-25

Family

ID=67132381

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910270937.3A Active CN109996039B (en) 2019-04-04 2019-04-04 Target tracking method and device based on edge calculation

Country Status (1)

Country Link
CN (1) CN109996039B (en)

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110429973B (en) * 2019-08-05 2022-02-08 西北工业大学 Unmanned aerial vehicle and ground base station collaborative computing unloading and optimizing method
CN110553629B (en) * 2019-09-20 2020-12-15 中南大学 Unmanned aerial vehicle target tracking power consumption optimization method and system based on edge calculation
CN112788227B (en) * 2019-11-07 2022-06-14 富泰华工业(深圳)有限公司 Target tracking shooting method, target tracking shooting device, computer device and storage medium
CN111405242A (en) * 2020-02-26 2020-07-10 北京大学(天津滨海)新一代信息技术研究院 Ground camera and sky moving unmanned aerial vehicle linkage analysis method and system
CN111565225B (en) * 2020-04-27 2023-08-04 银河水滴科技(宁波)有限公司 Character action track determining method and device
CN111753664A (en) * 2020-05-25 2020-10-09 西南石油大学 Suspect identification and positioning tracking system and method based on 5G wireless network
CN111836326B (en) * 2020-07-03 2022-06-14 杭州电子科技大学 Edge network routing method based on target tracking scene
CN112153334B (en) * 2020-09-15 2023-02-21 公安部第三研究所 Intelligent video box equipment for safety management and corresponding intelligent video analysis method
CN114531559A (en) * 2020-11-02 2022-05-24 青岛海尔多媒体有限公司 Method, device and system for joining video call
CN112562405A (en) * 2020-11-27 2021-03-26 山东高速建设管理集团有限公司 Radar video intelligent fusion and early warning method and system
WO2022126415A1 (en) * 2020-12-16 2022-06-23 深圳市大疆创新科技有限公司 Method and apparatus for operating tracking algorithm, and electronic device and computer-readable storage medium
CN112788238A (en) * 2021-01-05 2021-05-11 中国工商银行股份有限公司 Control method and device for robot following
CN112822450B (en) * 2021-01-08 2024-03-19 鹏城实验室 Effective node dynamic selection method in large-scale visual computing system
CN113115216B (en) * 2021-02-22 2022-09-06 浙江大华技术股份有限公司 Indoor positioning method, service management server and computer storage medium
CN113724295A (en) * 2021-09-02 2021-11-30 中南大学 Unmanned aerial vehicle tracking system and method based on computer vision
CN114170619B (en) * 2021-10-18 2022-08-19 中标慧安信息技术股份有限公司 Data checking method and system based on edge calculation
CN114172809B (en) * 2021-12-13 2023-10-03 重庆邮电大学 Video computing cloud edge collaborative task scheduling method based on target tracking
CN114760605A (en) * 2022-03-04 2022-07-15 福云智控(厦门)智能科技有限公司 Multi-address edge computing system of unmanned aerial vehicle network
CN114979497B (en) * 2022-07-28 2022-11-08 深圳联和智慧科技有限公司 Unmanned aerial vehicle linkage tracking method and system based on pole loading and cloud platform

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102045549A (en) * 2010-12-28 2011-05-04 天津市亚安科技电子有限公司 Method and device for controlling linkage-tracking moving target of monitoring device
CN104777847A (en) * 2014-01-13 2015-07-15 中南大学 Unmanned aerial vehicle target tracking system based on machine vision and ultra-wideband positioning technology
CN105357296A (en) * 2015-10-30 2016-02-24 河海大学 Elastic caching system based on Docker cloud platform
CN107809277A (en) * 2017-10-17 2018-03-16 安徽工业大学 A kind of emergency management and rescue communication network and network-building method based on unmanned plane and wireless device
CN109408234A (en) * 2018-10-19 2019-03-01 国云科技股份有限公司 A kind of augmented reality data-optimized systems and method based on edge calculations

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2924831B1 (en) * 2007-12-11 2010-11-19 Airbus France METHOD AND DEVICE FOR GENERATING A LACET SPEED ORDER FOR A GROUNDING AIRCRAFT
US9940842B2 (en) * 2015-11-02 2018-04-10 At&T Intellectual Property I, L.P. Intelligent drone traffic management via radio access network
WO2018009159A1 (en) * 2016-07-02 2018-01-11 Intel Corporation Resource orchestration brokerage for internet-of-things networks

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102045549A (en) * 2010-12-28 2011-05-04 天津市亚安科技电子有限公司 Method and device for controlling linkage-tracking moving target of monitoring device
CN104777847A (en) * 2014-01-13 2015-07-15 中南大学 Unmanned aerial vehicle target tracking system based on machine vision and ultra-wideband positioning technology
CN105357296A (en) * 2015-10-30 2016-02-24 河海大学 Elastic caching system based on Docker cloud platform
CN107809277A (en) * 2017-10-17 2018-03-16 安徽工业大学 A kind of emergency management and rescue communication network and network-building method based on unmanned plane and wireless device
CN109408234A (en) * 2018-10-19 2019-03-01 国云科技股份有限公司 A kind of augmented reality data-optimized systems and method based on edge calculations

Also Published As

Publication number Publication date
CN109996039A (en) 2019-07-09

Similar Documents

Publication Publication Date Title
CN109996039B (en) Target tracking method and device based on edge calculation
US20220086195A1 (en) Systems and methods for allocating and managing resources in an internet of things environment using location based focus of attention
US10973083B2 (en) Multiple mesh drone communication
CN109769207B (en) System and method for sharing computing power in dynamic networking of mobile equipment
Amento et al. FocusStack: Orchestrating edge clouds using location-based focus of attention
US20210219004A1 (en) Method and apparatus for an enhanced data pipeline
CN112351055B (en) Searching method of edge computing server and related equipment
CN109739221A (en) Automatic driving vehicle monitoring method, device and storage medium
US10111033B2 (en) GIS based compression and reconstruction of GPS data for transmission from a vehicular edge platform to the cloud
CA2877360C (en) Methods and systems for content consumption
CN113872680B (en) TDOA (time difference of arrival) auxiliary RID (Rich Internet protocol) signal receiving control method, device and system
CN114760605A (en) Multi-address edge computing system of unmanned aerial vehicle network
EP3591937B1 (en) Communication device, method and computer program product for processing sensor data with edge server assistance
CN114071706A (en) Positioning method, positioning device, positioning apparatus, positioning system, and storage medium
CN116233790A (en) Unmanned aerial vehicle cluster edge computing architecture system based on 5G and processing method
CN109714437B (en) Emergency communication network system
Jang et al. A mobile ad hoc cloud for automated video surveillance system
CN111431950A (en) Task unloading method and device, mobile terminal, fog node and storage medium
US11570594B2 (en) Method of facilitating on-demand wireless connectivity using device-to-device resources and data pooling with a vehicle platoon
CN113783963A (en) Data transmission method, server node, gateway device and network system
Chang et al. Mobile fog computing
CN110784512A (en) Airborne dynamic cloud system and real-time response resource allocation method thereof
CN113885515B (en) Network architecture system for connecting various automatic driving sensors
CN117177306B (en) Unmanned aerial vehicle MEC network system based on NFV and SDN
Metzler et al. N-CET: Network-centric exploitation and tracking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant