CN113536993A - Target tracking method, device, system, electronic device and storage medium - Google Patents

Target tracking method, device, system, electronic device and storage medium

Info

Publication number
CN113536993A
Authority
CN
China
Prior art keywords
user terminal
video stream
flight
target
acquiring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110731985.5A
Other languages
Chinese (zh)
Inventor
包剑冰
葛主办
史正涛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Dahua Technology Co Ltd
Original Assignee
Zhejiang Dahua Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Dahua Technology Co Ltd filed Critical Zhejiang Dahua Technology Co Ltd
Priority to CN202110731985.5A
Publication of CN113536993A
Legal status: Pending

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The present application relates to a target tracking method, apparatus, system, electronic apparatus, and storage medium. The target tracking method includes: acquiring a video stream of a video monitoring device, and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream, the camera coordinate system being the coordinate system of the video monitoring device; acquiring a flight positioning message; generating an airplane AR target frame according to the conversion relation, the flight positioning message, and the video stream, and sending the airplane AR target frame to a user terminal for display; and, when a tracking instruction of the user terminal for the airplane AR target frame is received, acquiring a first tracking result for the airplane dynamic target. The present application solves the problem of low target tracking accuracy in airport monitoring scenes and realizes high-precision airport target tracking based on video monitoring and AR technology.

Description

Target tracking method, device, system, electronic device and storage medium
Technical Field
The present application relates to the field of video surveillance technology, and in particular, to a target tracking method, apparatus, system, electronic apparatus, and storage medium.
Background
In an airport monitoring system, the support activities of individual airplanes compete with one another for resources. Activities such as plan decision-making, airplane selection, and resource allocation are all carried out before support activities begin, and it is difficult to make purposeful adjustments after they have started. The main reason is that the support personnel of a single airplane know little about the airplane's intended use and the overall state of the fleet, while the fleet manager, who does know the airplane's intended use, cannot adjust the airplanes to follow changes because status information is gathered inefficiently and without real-time updates. This results in low target tracking accuracy in the airport monitoring scene.
At present, no effective solution has been proposed in the related art for the problem of low target tracking accuracy in airport monitoring scenes.
Disclosure of Invention
The embodiment of the application provides a target tracking method, a target tracking device, a target tracking system, an electronic device and a storage medium, and aims to at least solve the problem of low target tracking accuracy in an airport monitoring scene in the related art.
In a first aspect, an embodiment of the present application provides a target tracking method, including:
acquiring a video stream of a video monitoring device, and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment;
acquiring flight positioning information;
generating an aircraft Augmented Reality (AR) target frame according to the conversion relation, the flight positioning message and the video stream, and sending the AR target frame to a user terminal for displaying;
and under the condition that a tracking instruction of the user terminal for the AR target frame of the airplane is received, acquiring a first tracking result for the dynamic target of the airplane.
In one embodiment, after the obtaining the conversion relationship between the camera coordinate system and the world coordinate system based on the video stream, the method further comprises:
acquiring vehicle positioning information of an airport vehicle system;
generating a vehicle AR target frame according to the conversion relation, the vehicle positioning message and the video stream, and sending the vehicle AR target frame to a user terminal for displaying;
and under the condition that a tracking instruction of the user terminal for the vehicle AR target frame is received, acquiring a second tracking result for the vehicle dynamic target.
In one embodiment, after the obtaining the vehicle positioning message of the airport vehicle system, the method further comprises:
pushing the vehicle positioning message to the user terminal; and the user terminal displays the license plate number in the vehicle positioning message in the vehicle AR target frame.
In one embodiment, the method further comprises:
pushing a flight status message of an airport flight support system to the user terminal;
and the user terminal displays the flight status message and displays the video stream matched with the flight status message under the condition of receiving a viewing instruction for the AR target frame.
In one embodiment, the obtaining a transformation relationship between a camera coordinate system and a world coordinate system based on the video stream comprises:
acquiring a plurality of point locations on the video stream through the user terminal, and acquiring point location pixel coordinates on the video stream corresponding to the point locations;
acquiring a geographical coordinate corresponding to the point location through positioning equipment;
and acquiring the conversion relation according to the point location pixel coordinate and the geographic coordinate.
In one embodiment, after the obtaining the flight location message, the method further comprises:
acquiring the flight positioning message through a broadcast monitoring system, and storing the flight positioning message into a message queue;
and sequentially taking the flight positioning messages out of the message queue and pushing them to the user terminal; or storing the flight positioning messages into a database through the message queue.
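The queue flow described in this embodiment — store each incoming message, then take messages out in order for pushing and persistence — can be sketched as follows. This is a minimal in-process illustration, assuming Python's standard `queue` module; in a deployed system a message broker would likely stand in for it, and all names here are illustrative rather than taken from the patent.

```python
import queue

# In-process stand-in for the message queue deployed in the airport
# management system.
flight_queue = queue.Queue()

def enqueue_flight_message(msg):
    # Store each incoming flight positioning message in the queue.
    flight_queue.put(msg)

def drain_queue(push_to_terminal, store_to_db):
    # Take messages out in arrival order; each one is both pushed to the
    # user terminal and persisted to the database.
    while not flight_queue.empty():
        msg = flight_queue.get()
        push_to_terminal(msg)
        store_to_db(msg)
```

Because `queue.Queue` is FIFO, terminals receive the positioning messages in the same order they arrived, which keeps the displayed aircraft positions consistent with the broadcast sequence.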
In one embodiment, the acquiring the video stream of the video monitoring device includes:
acquiring the identity information of the user uploaded by the user terminal, and verifying the identity information;
under the condition that the verification is passed, sending the acquired video stream to the user terminal for displaying; or sending the video stream to a storage device for storage.
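The identity verification step above could take many forms; one minimal sketch is a keyed-hash token that the terminal presents and the server recomputes. The secret, token scheme, and function names are assumptions for illustration only — the patent does not specify the verification mechanism.

```python
import hashlib
import hmac

# Illustrative shared secret; a real deployment would manage this securely.
SECRET = b"demo-secret"

def sign(user_id: str) -> str:
    # Derive a token from the user's identity information.
    return hmac.new(SECRET, user_id.encode(), hashlib.sha256).hexdigest()

def verify_identity(user_id: str, token: str) -> bool:
    # compare_digest avoids leaking information through comparison timing.
    return hmac.compare_digest(sign(user_id), token)
```

Only when `verify_identity` returns `True` would the server start forwarding the video stream to that terminal.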
In one embodiment, after the sending the airplane AR target box to the user terminal for displaying, the method further includes:
pushing the flight location message to the user terminal; wherein the user terminal displays the flight information in the flight positioning message in the airplane AR target box.
In a second aspect, an embodiment of the present application provides a target tracking apparatus, including: the device comprises a conversion module, an acquisition module, a generation module and a tracking module;
the conversion module is used for acquiring a video stream of the video monitoring equipment and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment;
the acquisition module is used for acquiring flight positioning information;
the generating module is used for generating an aircraft AR target frame according to the conversion relation, the flight positioning message and the video stream, and sending the aircraft AR target frame to a user terminal for displaying;
the tracking module is used for acquiring a first tracking result aiming at the aircraft dynamic target under the condition that a tracking instruction aiming at the aircraft AR target frame of the user terminal is received.
In a third aspect, an embodiment of the present application provides a target tracking system, where the system includes: the system comprises a server, video monitoring equipment and a user terminal; the server is respectively connected with the video monitoring equipment and the user terminal;
the video monitoring equipment is used for acquiring a video stream and sending the video stream to the server;
the user terminal is used for receiving the video stream of the server and displaying the video stream;
the server is configured to perform the object tracking method according to the first aspect.
In a fourth aspect, an embodiment of the present application provides an electronic apparatus, comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the target tracking method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the target tracking method according to the first aspect.
Compared with the related art, the target tracking method, apparatus, system, electronic apparatus, and storage medium provided by the embodiments of the present application acquire the video stream of the video monitoring device and obtain the conversion relation between the camera coordinate system and the world coordinate system based on the video stream, the camera coordinate system being the coordinate system of the video monitoring device; acquire the flight positioning message; generate an airplane AR target frame according to the conversion relation, the flight positioning message, and the video stream, and send the airplane AR target frame to the user terminal for display; and, when a tracking instruction of the user terminal for the airplane AR target frame is received, acquire a first tracking result for the airplane dynamic target. This solves the problem of low target tracking accuracy in airport monitoring scenes and realizes high-precision airport target tracking based on video monitoring and AR technology.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic diagram of an application scenario of a target tracking method according to an embodiment of the present application;
FIG. 2 is a flow chart of a target tracking method according to an embodiment of the present application;
FIG. 3 is a flow chart of another method of target tracking according to an embodiment of the present application;
FIG. 4 is a flow chart of yet another method of target tracking according to an embodiment of the present application;
FIG. 5 is a schematic diagram of a video frame according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a target tracking architecture according to an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating the effect of a target tracking method according to an embodiment of the present application;
FIG. 8 is a block diagram of a target tracking device according to an embodiment of the present application;
FIG. 9 is a block diagram of a target tracking system according to an embodiment of the present application;
fig. 10 is a block diagram of the inside of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be described and illustrated below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments provided in the present application without any inventive step are within the scope of protection of the present application. Moreover, it should be appreciated that in the development of any such actual implementation, as in any engineering or design project, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which may vary from one implementation to another.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of ordinary skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms referred to herein shall have the ordinary meaning as understood by those of ordinary skill in the art to which this application belongs. Reference to "a," "an," "the," and similar words throughout this application are not to be construed as limiting in number, and may refer to the singular or the plural. The present application is directed to the use of the terms "including," "comprising," "having," and any variations thereof, which are intended to cover non-exclusive inclusions; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to the listed steps or elements, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. Reference to "connected," "coupled," and the like in this application is not intended to be limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. Reference herein to "a plurality" means greater than or equal to two. "and/or" describes an association relationship of associated objects, meaning that three relationships may exist, for example, "A and/or B" may mean: a exists alone, A and B exist simultaneously, and B exists alone. Reference herein to the terms "first," "second," "third," and the like, are merely to distinguish similar objects and do not denote a particular ordering for the objects.
In the present embodiment, an application scenario of a target tracking method is provided. Fig. 1 is a schematic diagram of an application scenario of a target tracking method according to an embodiment of the present application. As shown in fig. 1, the application environment includes a user terminal 12 and a server 14. The user terminal 12 communicates with the server 14 via a network. The server 14 acquires a video stream of the video monitoring device and obtains a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the server 14 then acquires the flight positioning message, generates an airplane AR target frame according to the conversion relation, the flight positioning message, and the video stream, sends the airplane AR target frame to the user terminal 12 for display, and finally acquires a first tracking result for the airplane dynamic target. The user terminal 12 may be, but is not limited to, a smart phone, personal computer, notebook computer, or tablet computer, and the server 14 may be implemented by a stand-alone server or by a server cluster composed of a plurality of servers.
The present embodiment provides a target tracking method, which is applied to an airport management system, and fig. 2 is a flowchart of a target tracking method according to an embodiment of the present application, as shown in fig. 2, the flowchart includes the following steps:
step S202, acquiring a video stream of the video monitoring equipment, and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment.
The video monitoring device may be a panoramic bullet camera, a panoramic camera, a binocular camera, a video recorder, or similar equipment, and may be deployed in a scene such as an airport. The video monitoring device is calibrated using the video stream it captures of the airport, and the conversion relation between the coordinates of the camera coordinate system and the world coordinate system is obtained based on the calibration result.
And step S204, acquiring flight positioning information.
The flight positioning message refers to the positioning information corresponding to each flight, and includes flight information, flight position longitude, flight position latitude, flight position altitude, and the like. Specifically, the airport management system acquires flight messages from an external system connected to it, such as an Automatic Dependent Surveillance-Broadcast (ADS-B) system, filters the messages to retain those related to the airport, and assembles them into flight positioning messages.
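The filter-and-assemble step above can be sketched as a small data structure plus a constructor. The raw field names (`flight`, `lon`, `lat`, `alt`) are illustrative assumptions, since the patent does not specify the decoded ADS-B report format.

```python
from dataclasses import dataclass

@dataclass
class FlightPositioningMessage:
    # Fields named in the patent: flight information plus the flight
    # position longitude, latitude, and altitude.
    flight_info: str
    longitude: float
    latitude: float
    altitude: float

def assemble_message(raw: dict, airport_flights: set):
    # Discard messages unrelated to this airport, then assemble the rest.
    if raw["flight"] not in airport_flights:
        return None
    return FlightPositioningMessage(raw["flight"], raw["lon"], raw["lat"], raw["alt"])
```

A message for a flight not operating at this airport is simply dropped (`None`), so only airport-related positioning messages continue downstream to AR rendering and the message queue.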
And step S206, generating an airplane AR target frame according to the conversion relation, the flight positioning message and the video stream, and sending the airplane AR target frame to the user terminal for displaying.
Specifically, the airport management system converts the geographic coordinates of the airplane in the flight positioning message, i.e., the flight position longitude, the flight position latitude and the flight position altitude, into the camera coordinate system of the video monitoring device by using the conversion relationship, so as to obtain the airplane pixel coordinates in the camera coordinate system. And then drawing and generating the airplane AR target frame at a corresponding position in the frame of the video stream based on the airplane pixel coordinate, wherein the airplane AR target frame is sent to the user terminal, and the user terminal can display the video stream frame on which the airplane AR target frame is superimposed to a user, namely airport staff.
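The projection described here — geographic coordinates through the calibrated conversion relation into pixel coordinates, then a box drawn around the result — can be sketched as below. The sketch assumes the conversion relation is a 3x3 planar homography and a fixed box size; both are illustrative assumptions, as the patent does not fix the conversion model or the frame geometry.

```python
import numpy as np

def geo_to_pixel(H_geo2pix, lon, lat):
    # Apply the calibrated conversion relation (here modeled as a planar
    # homography) to map a geographic position onto the video frame.
    u, v, w = H_geo2pix @ np.array([lon, lat, 1.0])
    return u / w, v / w

def ar_target_box(H_geo2pix, lon, lat, half_size=40):
    # Center an AR target box on the projected aircraft position.
    # Returns (left, top, right, bottom) in pixel coordinates.
    u, v = geo_to_pixel(H_geo2pix, lon, lat)
    return (u - half_size, v - half_size, u + half_size, v + half_size)
```

The returned box coordinates are what would be drawn onto the video-stream picture and sent to the user terminal for overlay display.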
In an embodiment, after the sending the aircraft AR target frame to the user terminal for displaying, the target tracking method further includes the following steps: pushing the flight location message to the user terminal; wherein the user terminal displays the flight information in the flight positioning message in the airplane AR target box. That is, when the plane AR target frame is generated by drawing at the specified position of the screen in the video stream, the flight positioning message may be pushed to the user terminal, and the flight positioning message and the plane AR target frame are synchronously displayed for the user to view.
Step S208, under the condition that a tracking instruction of the user terminal for the aircraft AR target frame is received, a first tracking result for the aircraft dynamic target is obtained.
Specifically, the user may interact with the user terminal, which sends the tracking instruction for the airplane AR target frame to the airport management system; for example, the user may click the airplane AR target frame on the display screen of the user terminal. The airport management system then receives the tracking instruction for the airplane AR target frame and may acquire the first tracking result from an external monitoring device based on it. It should be noted that the external monitoring device may be a dome camera, a bullet camera, a magnetic-levitation camera, or another device capable of capturing detailed scenes and adjusting its shooting angle. The external monitoring device may be bound to the video monitoring device, with the airport management system controlling the rotation angle of the external monitoring device through the video monitoring device so as to monitor the airplane dynamic target indicated by the tracking instruction; alternatively, the external monitoring device may be mounted on a pan-tilt unit, with the airport management system instructing the pan-tilt unit to rotate the external monitoring device, which is not described in detail here. Taking a dome camera as an example, the airport management system synchronizes the aircraft geographic coordinates in the flight positioning message to the dome camera in real time, so as to control the dome camera to continuously track the airplane dynamic target, and takes the magnified video image of the continuously tracked airplane dynamic target as the first tracking result.
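Steering a dome camera from synchronized coordinates amounts to converting the target's position relative to the camera into pan and tilt angles. The sketch below assumes both positions are expressed in a local east/north/up metric frame with pan measured clockwise from north; the frame and angle conventions are illustrative assumptions, not taken from the patent.

```python
import math

def pan_tilt_to(target, camera):
    # target, camera: (east, north, up) positions in meters.
    dx, dy, dz = (t - c for t, c in zip(target, camera))
    pan = math.degrees(math.atan2(dx, dy))                   # azimuth from north
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation angle
    return pan, tilt
```

Feeding each incoming positioning update through `pan_tilt_to` and commanding the dome camera to those angles keeps the magnified picture centered on the moving aircraft.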
It is understood that the user terminal may display the first tracking result at a suitable position in the frame of the video stream, for example, an enlarged video frame of the airplane dynamic target may be displayed at the lower right corner of the video stream frame for the user to preview.
In the related art, the efficiency and accuracy of status information summarization in an airport are low, and augmented reality is supported only for airplanes in a static state, so no intuitive and effective decision-support means can be provided. In the embodiment of the present application, through the above steps S202 to S208, an AR target frame is generated for a static or dynamic target on the video picture of a video monitoring device such as a panoramic bullet camera, based on the obtained conversion relation, flight positioning message, and video stream, by means of video monitoring and AR technology, and the target is tracked and magnified in the video. This provides the user with a very intuitive picture display, feeds dynamic targets in the airport flight area back to airport staff in real time and comprehensively, upgrades manual scheduling in the airport flight area to intelligent scheduling, and provides an intelligent decision-support means, thereby effectively solving the problem of low target tracking accuracy in the airport monitoring scene and realizing high-precision airport target tracking based on video monitoring and AR technology.
In one embodiment, a target tracking method is provided, and fig. 3 is a flowchart of another target tracking method according to an embodiment of the present application, and as shown in fig. 3, the flowchart includes the following steps:
step S302, vehicle positioning information of the airport vehicle system is obtained.
The airport vehicle system is an external system that generates vehicle information in the airport scene. The airport management system to which the target tracking method of the present application is applied is connected with the airport vehicle system, so it can acquire the real-time vehicle information stored in the airport vehicle system and assemble that information into vehicle positioning messages. A vehicle positioning message comprises the license plate number, the geographic coordinates of the vehicle, and the like; the geographic coordinates of the vehicle include the vehicle position longitude, vehicle position latitude, and vehicle position altitude.
And step S304, generating a vehicle AR target frame according to the conversion relation, the vehicle positioning message and the video stream, and sending the vehicle AR target frame to the user terminal for displaying.
Specifically, the airport management system converts the geographic coordinates of the vehicle in the vehicle positioning message, i.e., the vehicle position longitude, the vehicle position latitude and the vehicle position altitude, into the camera coordinate system of the video monitoring device by using the conversion relationship, and further obtains the vehicle pixel coordinates in the camera coordinate system. And then, based on the vehicle pixel coordinates, drawing and generating the vehicle AR target frame at a corresponding position in the picture of the video stream, wherein the vehicle AR target frame is sent to the user terminal, and the user terminal can display the video stream picture on which the vehicle AR target frame is superimposed to a user, namely an airport staff.
And step S306, acquiring a second tracking result aiming at the vehicle dynamic target under the condition that a tracking instruction aiming at the vehicle AR target frame of the user terminal is received.
Specifically, the user may interact with the user terminal and send the tracking instruction for the AR target frame of the vehicle to the airport management system by the user terminal; for example, the user may click on the vehicle AR target frame in a display screen of the user terminal. Similar to the step S208, when the airport management system receives the tracking instruction for the vehicle AR target frame, the second tracking result for the vehicle dynamic target may be obtained from the external monitoring device based on the tracking instruction for the vehicle AR target frame; for example, the geographic coordinates of the vehicle in the vehicle positioning message are synchronized to the dome camera device bound to the video monitoring device in real time, so that the dome camera device is controlled to continuously track the dynamic target of the vehicle, and an enlarged video image for continuously tracking the dynamic target of the vehicle is used as the second tracking result. The user terminal may display the second tracking result at an appropriate position in the video stream picture, for example, an enlarged video picture of the vehicle dynamic target may be displayed at a lower right corner of the video stream picture, so as to be previewed by the user.
Through the steps S302 to S306, the vehicle AR target frame is generated and displayed through the user terminal based on the vehicle positioning information of the airport vehicle system, and then the vehicle dynamic target is tracked in a targeted manner according to the vehicle positioning information when the tracking instruction of the user terminal is received, so that the vehicle dynamic target in the airport flight area is fed back to airport staff in real time and comprehensively, and the target tracking accuracy in the airport application scene is effectively improved.
In an embodiment, after the step S302, the target tracking method further includes the following steps: pushing the vehicle positioning message to the user terminal; and the user terminal displays the license plate number in the vehicle positioning message in the vehicle AR target frame.
After the airport management system acquires the vehicle positioning message, it can send the message into a message queue deployed in the airport management system, and the vehicle positioning messages stored in the message queue are sequentially taken out for storage in a database of the airport management system. In addition, the airport flight area management service of the airport management system can also consume the vehicle positioning messages in the message queue and push them to the user terminal through the message push service. When the user terminal receives a pushed vehicle positioning message, it can display the license plate number of the corresponding vehicle in the vehicle AR target frame drawn in the video stream.
By the embodiment, the vehicle positioning information is pushed to the user terminal, and the license plate number of the corresponding vehicle is displayed in the vehicle AR target frame by the user terminal based on the vehicle positioning information, so that a user can quickly and conveniently search the target dynamic vehicle in a video stream picture, and further the dynamic target in the airport flight area is fed back to airport staff in real time and comprehensively, and the accuracy of airport management is improved.
In one embodiment, a target tracking method is provided, and fig. 4 is a flowchart of another target tracking method according to an embodiment of the present application, as shown in fig. 4, the flowchart includes the following steps:
step S402, pushing the flight status message of the airport flight support system to the user terminal; and the user terminal displays the flight status message and displays the video stream matched with the flight status message under the condition of receiving a viewing instruction aiming at the AR target frame of the airplane.
The airport management system acquires flight status messages from an external system connected to it, such as the airport flight support system. A flight status message comprises a flight number and a status, such as pre-takeoff, stopped, landed, off-runway, docked, or at-gate. The airport management system may push the flight status message to the user terminal. If the user interacts with the user terminal, for example by clicking the airplane AR target frame on the user terminal, the user terminal receives a viewing instruction for the airplane AR target frame corresponding to the user operation; the received flight status messages are then displayed to the user through the user terminal so that the details of the flight's historical status can be viewed. Meanwhile, the airport management system can also support playback of the video stream corresponding to the time of each status in the flight history.
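Matching a historical flight status to the video stream that covers it reduces to a timestamp lookup over indexed recordings. The data shapes below (recordings as `(start, end, url)` tuples) are an illustrative assumption; the patent does not describe how recordings are indexed.

```python
def find_matching_clip(recordings, status_time):
    # recordings: list of (start_time, end_time, url) for stored video
    # segments; return the segment whose time range covers status_time.
    for start, end, url in recordings:
        if start <= status_time <= end:
            return url
    return None
```

The terminal can then request playback of the returned segment alongside the displayed flight status message.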
Through step S402, the user terminal is instructed to display the flight history status and the matching video stream based on the flight status message of the airport flight support system, so that the user can trace the flight status back to its source, further improving the convenience of airport flight management.
In one embodiment, the obtaining of the conversion relationship between the camera coordinate system and the world coordinate system based on the video stream further includes:
Step S502, acquiring a plurality of point locations on the video stream through the user terminal, and acquiring the point location pixel coordinates on the video stream corresponding to the point locations.
Specifically, the video monitoring device is exemplified by a panoramic camera device. A panoramic camera device usually includes several lenses whose images are stitched into one panoramic picture; the following description takes a panoramic camera device with 4 lenses as an example. Fig. 5 is a schematic diagram of a video frame according to an embodiment of the present application. As shown in fig. 5, the real-time video frame of the panoramic camera device is displayed on the user terminal and divided into 4 equal intervals, and at least 3 point locations can be selected in each interval. The user terminal analyzes the point locations through analysis software to obtain the point location pixel coordinates of each point location in the video frame based on the video frame resolution. The point location pixel coordinates may be represented by (A, B), where A represents the horizontal distance in pixels from the starting point and B represents the vertical distance in pixels from the starting point. In this embodiment of the present application, the starting point is the upper left corner of the video frame, with pixel coordinate (0, 0).
Step S504, the geographical coordinates corresponding to the point location are obtained through the positioning equipment.
Specifically, airport staff can move on the ground of the airport to each point location marked on the panoramic camera device, and then use high-precision positioning equipment to collect the geographic coordinates corresponding to each point location. The geographic coordinates may be represented by (X, Y, H), where X represents the longitude of a point location deployed in the airport, Y represents the latitude, and H represents the altitude.
Step S506, obtaining the conversion relationship according to the point location pixel coordinate and the geographic coordinate.
The user can input the point location pixel coordinates (A, B) and geographic coordinates (X, Y, H) of the point locations one by one through the user terminal, and the user terminal then calculates the conversion relationship between pixel coordinates and geographic coordinates in the video picture. It should be noted that, if there are multiple video monitoring devices in the airport, the above calibration procedure is repeated for each video monitoring device.
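The patent does not specify how the conversion relationship is computed from the point pairs. A minimal sketch, assuming an affine approximation over a locally flat apron (so altitude H is omitted), is a least-squares fit from pixel coordinates (A, B) to geographic coordinates (X, Y); an affine model needs at least 3 non-collinear point pairs, which matches the "at least 3 point locations per interval" rule above.

```python
import numpy as np

# Minimal calibration sketch: fit an affine map from pixel (A, B) to
# geographic (X, Y) by least squares. The fitting method is an assumption;
# the patent only requires that a conversion relationship be obtained.
def fit_pixel_to_geo(pixel_pts, geo_pts):
    pixel_pts = np.asarray(pixel_pts, dtype=float)   # shape (n, 2)
    geo_pts = np.asarray(geo_pts, dtype=float)       # shape (n, 2)
    ones = np.ones((len(pixel_pts), 1))
    design = np.hstack([pixel_pts, ones])            # rows are [A, B, 1]
    # Solve design @ M ~= geo for the 3x2 affine matrix M.
    M, *_ = np.linalg.lstsq(design, geo_pts, rcond=None)
    return M

def pixel_to_geo(M, a, b):
    return np.array([a, b, 1.0]) @ M

# Toy calibration data: pixel axes scaled by 1e-5 degrees per pixel,
# with pixel (0, 0) mapping to longitude 120.0, latitude 30.0.
pix = [(0, 0), (1000, 0), (0, 1000), (1000, 1000)]
geo = [(120.0, 30.0), (120.01, 30.0), (120.0, 29.99), (120.01, 29.99)]
M = fit_pixel_to_geo(pix, geo)
```

For wide-angle or panoramic lenses a projective (homography) or per-lens model would be more accurate, but the least-squares structure is the same.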
Through steps S502 to S506, each video monitoring device is calibrated based on the point location pixel coordinates, and the conversion relationship between the camera coordinate system and the world coordinate system of each video monitoring device is obtained. This effectively improves the calibration efficiency and accuracy of the video monitoring devices, and further improves the target tracking efficiency and accuracy.
In one embodiment, after step S204 is executed, the target tracking method further includes the following steps: acquiring the flight positioning message through a broadcast monitoring system, and storing the flight positioning message into a message queue; sequentially taking the flight positioning messages out of the message queue and pushing them to the user terminal; or storing the flight positioning message into a database through the message queue.
The broadcast monitoring system may be an ADS-B system. The airport management system sequentially stores the flight positioning messages acquired by the broadcast monitoring system into a message queue, and sequentially takes them out to store them in a database of the airport management system. In addition, the airport flight area management service of the airport management system may obtain the flight positioning messages from the message queue and push them to the user terminal through a message push service. When the user terminal receives a pushed flight positioning message, the flight details of the corresponding flight may be displayed in the airplane AR target frame drawn in the video stream by the user terminal. Through this embodiment, storing the flight positioning messages in a message queue makes access to them more orderly and improves data processing efficiency in the target tracking process.
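The queue-based flow above can be sketched as a producer/consumer pair: the ADS-B listener enqueues messages as they arrive, and a drain loop takes them out in order, pushing each to the terminal and persisting it to the database. The handler names here are placeholders, not APIs from the patent.

```python
import queue

# FIFO queue holding flight positioning messages from the ADS-B listener.
flight_queue: "queue.Queue[dict]" = queue.Queue()

def on_adsb_message(msg: dict) -> None:
    """Producer side: called by the broadcast monitoring (ADS-B) listener."""
    flight_queue.put(msg)

def drain(push_to_terminal, store_to_db) -> int:
    """Consumer side: take messages out in arrival order.

    push_to_terminal stands in for the message push service and
    store_to_db for database persistence; both are hypothetical hooks.
    Returns the number of messages handled.
    """
    handled = 0
    while not flight_queue.empty():
        msg = flight_queue.get()
        push_to_terminal(msg)
        store_to_db(msg)
        handled += 1
    return handled
```

Because `queue.Queue` is thread-safe, the listener and the drain loop can run on separate threads without extra locking.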
In an embodiment, the acquiring the video stream of the video monitoring device further includes the following steps: acquiring the identity information of the user uploaded by the user terminal, and verifying the identity information; under the condition that the verification is passed, sending the acquired video stream to the user terminal for displaying; or sending the video stream to a storage device for storage.
The identity of the logged-in user can be verified through the user terminal, that is, the identity information of the user is matched against the password information stored in the user terminal or the airport management system. If the match succeeds, the verification passes, and the video stream can be sent to the user terminal so that the authorized user can view the video stream picture. Alternatively, the airport management system may store the video stream picture data of the video monitoring device to the storage device through a streaming media service. The storage device may be a device deployed in the airport management system that can store image or audio-visual data.
The verification process may be as follows: the password information is set as preset password data; the user terminal performs password recognition, takes the password input by the user on the user terminal as the user information, and compares the user information with the corresponding password data for verification. Alternatively, the password information may be pre-stored fingerprint information; the user terminal performs fingerprint security recognition, takes the fingerprint input by the user on the user terminal as the user information, and compares the user information with the corresponding fingerprint information for verification. It can be understood that the password information may also be information such as smart card identification information or face recognition data, with the corresponding identity recognition performed through the user terminal, which is not described again here.
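The password branch of this verification flow can be sketched as follows. The scheme details (salted PBKDF2 hashing, constant-time comparison) are assumptions about a reasonable implementation; the patent only requires that the entered credential be matched against stored password information.

```python
import hashlib
import hmac

def hash_password(password: str, salt: bytes) -> bytes:
    """Derive a salted hash of the password (PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def verify_user(entered: str, salt: bytes, stored_hash: bytes) -> bool:
    """Compare the credential entered on the terminal with the stored hash."""
    candidate = hash_password(entered, salt)
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(candidate, stored_hash)

# Illustrative enrollment: salt and password are demo values only.
salt = b"airport-demo-salt"
stored = hash_password("tower123", salt)
```

On a successful `verify_user` the system would proceed to send the video stream to the terminal; on failure the stream is withheld.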
Through this embodiment, the identity of the user is verified using the identity information uploaded by the user terminal, and the pushed video stream is displayed through the user terminal only when the verification passes. This realizes identity authentication of the user, prevents unauthorized personnel from querying and managing the flight status of the airport, and effectively improves the management security of target tracking in the airport scenario.
In the following, embodiments of the present application are described with reference to a practical application scenario. Fig. 6 is a schematic diagram of a target tracking architecture according to an embodiment of the present application. As shown in fig. 6, the target tracking architecture includes a user terminal, an airport management system, and a video monitoring device. The airport management system is provided with a storage device, a streaming media service, a message queue, a database, a message push service, an airport flight area management service, and a plurality of docking services. The message queue in the airport management system is connected to the corresponding external systems through the docking services, and thereby obtains the flight positioning messages, vehicle positioning messages, and flight status messages of the external systems. The message queue sends the flight positioning messages and other messages to the database for storage, and pushes them to the user terminal through the airport flight area management service and the message push service. The video monitoring device sends its audio and video data to the streaming media service, which stores the data to the storage device and sends it to the user terminal for display. Finally, the user terminal displays the video monitoring picture with real-time dynamic AR labels for the airplanes and vehicles.
Fig. 7 is a schematic diagram of the effect of a target tracking method according to an embodiment of the present application, in which the video monitoring device is a panoramic camera device. The overall picture in fig. 7 is the video picture taken by the panoramic camera device. Two identified airplane AR target frames and one vehicle AR target frame are distributed over the picture, and a flight number is displayed in each airplane AR target frame, namely QDA9794 and CSN6224. The aircraft dynamic target with flight number QDA9794 is in the selected state, and the airport management system controls the dome camera (PTZ) device to continuously track this aircraft dynamic target and acquire a tracking video picture. The lower left corner of fig. 7 shows the tracking video picture, in which the dome camera device tracks and enlarges the target clicked by the user.
It should be noted that the steps illustrated in the above-described flow diagrams or in the flow diagrams of the figures may be performed in a computer system, such as a set of computer-executable instructions, and that, although a logical order is illustrated in the flow diagrams, in some cases, the steps illustrated or described may be performed in an order different than here.
The present embodiment further provides a target tracking apparatus, which is used to implement the foregoing embodiments and preferred embodiments; details that have already been described are not repeated here. As used hereinafter, the terms "module," "unit," "subunit," and the like may implement a combination of software and/or hardware for a predetermined function. Although the apparatus described in the embodiments below is preferably implemented in software, an implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 8 is a block diagram of a target tracking apparatus according to an embodiment of the present application, and as shown in fig. 8, the apparatus includes: a conversion module 82, an acquisition module 84, a generation module 86, and a tracking module 88.
The conversion module 82 is configured to obtain a video stream of the video monitoring device, and obtain a conversion relationship between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment; the obtaining module 84 is configured to obtain a flight location message; the generating module 86 is configured to generate an airplane AR target frame according to the conversion relationship, the flight positioning message, and the video stream, and send the airplane AR target frame to the user terminal for display; the tracking module 88 is configured to obtain a first tracking result for the aircraft dynamic target when receiving a tracking instruction of the user terminal for the aircraft AR target frame.
Through this embodiment, based on the acquired conversion relationship, flight positioning message, and video stream, the tracking module 88 generates AR target frames for static or dynamic targets on the video picture of the video monitoring device, such as a panoramic bullet camera, by means of video monitoring and AR technology, and at the same time tracks the target on video and enlarges it. This provides the user with a very intuitive picture display, feeds the dynamic targets in the airport flight area back to airport staff in real time and comprehensively, upgrades the manual scheduling of the airport flight area to intelligent scheduling, and provides an intelligent auxiliary decision-making means, thereby effectively solving the problem of low target tracking accuracy in the airport monitoring scenario and realizing high-precision airport target tracking based on video monitoring and AR technology.
In one embodiment, the acquisition module 84 is further configured to acquire a vehicle location message for an airport vehicle system; the generating module 86 is further configured to generate a vehicle AR target frame according to the conversion relationship, the vehicle positioning message, and the video stream, and send the vehicle AR target frame to the user terminal for display; the tracking module 88 is further configured to, in a case that a tracking instruction for the vehicle AR target frame of the user terminal is received, control to obtain a second tracking result for the vehicle dynamic target.
In one embodiment, the target tracking device further comprises a first push module; the first pushing module is used for pushing the vehicle positioning message to the user terminal; and the user terminal displays the license plate number in the vehicle positioning message in the vehicle AR target frame.
In one embodiment, the target tracking device further comprises a second push module; the second push module is used for pushing the flight status message of the airport flight support system to the user terminal; and the user terminal displays the flight status message and displays the video stream matched with the flight status message in the case of receiving a viewing instruction for the airplane AR target frame.
In one embodiment, the conversion module 82 is further configured to obtain, by the user terminal, a plurality of point locations on the video stream, and obtain point location pixel coordinates on the video stream corresponding to the point locations; the conversion module acquires the geographic coordinate corresponding to the point location through the positioning equipment; the conversion module acquires the conversion relation according to the point location pixel coordinate and the geographic coordinate.
In one embodiment, the target tracking device further comprises a queue module; the queue module is used for acquiring the flight positioning message through a broadcast monitoring system and storing the flight positioning message into a message queue; the queue module sequentially takes out the flight positioning messages from the message queue and pushes the flight positioning messages to the user terminal; alternatively, the queue module stores the flight location message in a database via the message queue.
In one embodiment, the conversion module 82 is further configured to obtain identity information of the user uploaded by the user terminal, and verify the identity information; the conversion module sends the acquired video stream to the user terminal for display under the condition that the verification is passed; or sending the video stream to a storage device for storage.
In one embodiment, the target tracking apparatus further comprises a third push module; the third push module is configured to push the flight positioning message to the user terminal; wherein the user terminal displays the flight information in the flight positioning message in the airplane AR target frame.
The above modules may be functional modules or program modules, and may be implemented by software or hardware. For a module implemented by hardware, the modules may be located in the same processor; or the modules can be respectively positioned in different processors in any combination.
Fig. 9 is a block diagram of a structure of a target tracking system according to an embodiment of the present application, and as shown in fig. 9, the system includes: server 14, video monitoring device 92, and user terminal 12; wherein, the server 14 is connected to the video monitoring device 92 and the user terminal 12, respectively.
The video monitoring device 92 is configured to obtain a video stream and send the video stream to the server 14; the user terminal 12 is configured to receive the video stream from the server 14 for display.
The server 14 is configured to obtain a video stream of the video monitoring device, and obtain a conversion relationship between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment; acquiring flight positioning information; the server 14 generates an airplane AR target frame according to the conversion relationship, the flight positioning message and the video stream, and sends the airplane AR target frame to the user terminal for display; the server 14 obtains a first tracking result for the aircraft dynamic target upon receiving a tracking instruction for the aircraft AR target frame from the user terminal 12.
Through this embodiment, based on the obtained conversion relationship, flight positioning message, and video stream, the server 14 generates AR target frames for static or dynamic targets on the video picture of the video monitoring device 92, such as a panoramic bullet camera, by means of video monitoring and AR technology, and at the same time tracks the target on video and enlarges it. This provides the user with a very intuitive picture display, feeds the dynamic targets in the airport flight area back to airport staff in real time and comprehensively, upgrades the manual scheduling of the airport flight area to intelligent scheduling, and provides an intelligent auxiliary decision-making means, thereby effectively solving the problem of low target tracking accuracy in the airport monitoring scenario and realizing high-precision airport target tracking based on video monitoring and AR technology.
In this embodiment, a computer device is provided, which may be a server. Fig. 10 is a diagram of the internal structure of a computer device according to an embodiment of the present application, as shown in fig. 10. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing the video stream. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by the processor to implement the above-described target tracking method.
Those skilled in the art will appreciate that the architecture shown in fig. 10 is merely a block diagram of part of the structure related to the present solution and does not limit the computer devices to which the solution applies; a particular computer device may include more or fewer components than shown, combine certain components, or have a different arrangement of components.
The present embodiment also provides an electronic device comprising a memory having a computer program stored therein and a processor configured to execute the computer program to perform the steps of any of the above method embodiments.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, wherein the transmission device is connected to the processor, and the input/output device is connected to the processor.
Optionally, in this embodiment, the processor may be configured to execute the following steps by a computer program:
s1, acquiring a video stream of the video monitoring equipment, and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment.
And S2, acquiring the flight positioning message.
And S3, generating an airplane AR target frame according to the conversion relation, the flight positioning message and the video stream, and sending the airplane AR target frame to the user terminal for display.
And S4, under the condition that a tracking instruction of the user terminal for the aircraft AR target frame is received, controlling to acquire a first tracking result for the aircraft dynamic target.
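Step S3 above can be sketched end to end: project a flight's geographic position into the video frame using the calibrated conversion relation, then build the airplane AR target frame as a pixel box around the projected point. The affine matrix here maps geographic (X, Y) to pixel (A, B); its values and the fixed box size are illustrative assumptions, not values from the patent.

```python
import numpy as np

def geo_to_pixel(M_geo2pix, x, y):
    """Project geographic (X, Y) to pixel (A, B) via a 3x2 affine matrix."""
    a, b = np.array([x, y, 1.0]) @ M_geo2pix
    return a, b

def make_ar_target_frame(M_geo2pix, x, y, half_size=40):
    """Build an axis-aligned AR target frame (left, top, right, bottom)
    centered on the projected position; half_size is an assumed box radius."""
    a, b = geo_to_pixel(M_geo2pix, x, y)
    return (a - half_size, b - half_size, a + half_size, b + half_size)

# Toy mapping: 1e5 pixels per degree, with geographic point (120.0, 30.0)
# projecting to pixel (0, 0) and latitude increasing upward in the frame.
M = np.array([[1e5, 0.0],
              [0.0, -1e5],
              [-1.2e7, 3e6]])
box = make_ar_target_frame(M, 120.001, 29.999)
```

The resulting box is what would be sent to the user terminal for drawing; clicking inside it would issue the tracking instruction of step S4.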
It should be noted that, for specific examples in this embodiment, reference may be made to examples described in the foregoing embodiments and optional implementations, and details of this embodiment are not described herein again.
In addition, in combination with the target tracking method in the foregoing embodiments, the embodiments of the present application may provide a storage medium to implement. The storage medium having stored thereon a computer program; the computer program, when executed by a processor, implements any of the object tracking methods in the above embodiments.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the computer program can be stored in a non-volatile computer-readable storage medium and, when executed, may include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), and direct Rambus dynamic RAM (DRDRAM).
It should be understood by those skilled in the art that the features of the above-described embodiments can be combined in any combination. For the sake of brevity, not all possible combinations of features in the above-described embodiments are described, but all combinations of features that are not mutually inconsistent should be construed as being within the scope of the present disclosure.
The above-mentioned embodiments express only several implementations of the present application, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the concept of the present application, all of which fall within the protection scope of the present application. Therefore, the protection scope of this patent shall be subject to the appended claims.

Claims (12)

1. A method of target tracking, the method comprising:
acquiring a video stream of a video monitoring device, and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment;
acquiring flight positioning information;
generating an aircraft AR target frame according to the conversion relation, the flight positioning message and the video stream, and sending the aircraft AR target frame to a user terminal for displaying;
and under the condition that a tracking instruction of the user terminal for the AR target frame of the airplane is received, acquiring a first tracking result for the dynamic target of the airplane.
2. The target tracking method of claim 1, wherein after the obtaining a transformation relationship between a camera coordinate system and a world coordinate system based on the video stream, the method further comprises:
acquiring vehicle positioning information of an airport vehicle system;
generating a vehicle AR target frame according to the conversion relation, the vehicle positioning message and the video stream, and sending the vehicle AR target frame to a user terminal for displaying;
and under the condition that a tracking instruction of the user terminal for the vehicle AR target frame is received, acquiring a second tracking result for the vehicle dynamic target.
3. The method of claim 2, wherein after obtaining the vehicle location message for the airport vehicle system, the method further comprises:
pushing the vehicle positioning message to the user terminal; and the user terminal displays the license plate number in the vehicle positioning message in the vehicle AR target frame.
4. The target tracking method of claim 1, further comprising:
pushing a flight status message of an airport flight support system to the user terminal;
and the user terminal displays the flight status message and displays the video stream matched with the flight status message under the condition of receiving a viewing instruction for the AR target frame.
5. The target tracking method of claim 1, wherein the obtaining a translation between a camera coordinate system and a world coordinate system based on the video stream comprises:
acquiring a plurality of point locations on the video stream through the user terminal, and acquiring point location pixel coordinates on the video stream corresponding to the point locations;
acquiring a geographical coordinate corresponding to the point location through positioning equipment;
and acquiring the conversion relation according to the point location pixel coordinate and the geographic coordinate.
6. The method of claim 1, wherein after obtaining the flight location message, the method further comprises:
acquiring the flight positioning message through a broadcast monitoring system, and storing the flight positioning message into a message queue;
the flight positioning messages are sequentially taken out from the message queue and pushed to the user terminal; or storing the flight positioning message into a database through the message queue.
7. The method of claim 1, wherein the obtaining the video stream of the video surveillance device comprises:
acquiring the identity information of the user uploaded by the user terminal, and verifying the identity information;
under the condition that the verification is passed, sending the acquired video stream to the user terminal for displaying; or sending the video stream to a storage device for storage.
8. The target tracking method according to any one of claims 1 to 7, wherein after the sending the aircraft AR target frame to a user terminal for display, the method further comprises:
pushing the flight location message to the user terminal; wherein the user terminal displays the flight information in the flight positioning message in the airplane AR target box.
9. An object tracking apparatus, characterized in that the apparatus comprises: the device comprises a conversion module, an acquisition module, a generation module and a tracking module;
the conversion module is used for acquiring a video stream of the video monitoring equipment and acquiring a conversion relation between a camera coordinate system and a world coordinate system based on the video stream; the camera coordinate system refers to a coordinate system of the video monitoring equipment;
the acquisition module is used for acquiring flight positioning information;
the generating module is used for generating an aircraft AR target frame according to the conversion relation, the flight positioning message and the video stream, and sending the aircraft AR target frame to a user terminal for displaying;
the tracking module is used for acquiring a first tracking result aiming at the aircraft dynamic target under the condition that a tracking instruction aiming at the aircraft AR target frame of the user terminal is received.
10. An object tracking system, characterized in that the system comprises: the system comprises a server, video monitoring equipment and a user terminal; the server is respectively connected with the video monitoring equipment and the user terminal;
the video monitoring equipment is used for acquiring a video stream and sending the video stream to the server;
the user terminal is used for receiving the video stream of the server and displaying the video stream;
the server is configured to perform the object tracking method according to any one of claims 1 to 8.
11. An electronic device comprising a memory and a processor, wherein the memory has stored therein a computer program, and the processor is arranged to run the computer program to perform the object tracking method of any of claims 1 to 8.
12. A storage medium having a computer program stored thereon, wherein the computer program is arranged to perform the object tracking method of any of claims 1 to 8 when executed.
CN202110731985.5A 2021-06-29 2021-06-29 Target tracking method, device, system, electronic device and storage medium Pending CN113536993A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110731985.5A CN113536993A (en) 2021-06-29 2021-06-29 Target tracking method, device, system, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110731985.5A CN113536993A (en) 2021-06-29 2021-06-29 Target tracking method, device, system, electronic device and storage medium

Publications (1)

Publication Number Publication Date
CN113536993A true CN113536993A (en) 2021-10-22

Family

ID=78097272

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110731985.5A Pending CN113536993A (en) 2021-06-29 2021-06-29 Target tracking method, device, system, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN113536993A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117455202A (en) * 2023-12-25 2024-01-26 青岛民航凯亚系统集成有限公司 Positioning and scheduling method, system and device for apron equipment
CN117938927A (en) * 2024-03-25 2024-04-26 厦门瑞为信息技术有限公司 AR-assisted self-boarding system and self-boarding method
CN117938927B (en) * 2024-03-25 2024-06-04 厦门瑞为信息技术有限公司 AR-assisted self-boarding system and self-boarding method

Similar Documents

Publication Publication Date Title
US7460148B1 (en) Near real-time dissemination of surveillance video
CN106550182B (en) Shared unmanned aerial vehicle viewing system
US9141866B2 (en) Summarizing salient events in unmanned aerial videos
US7956891B2 (en) Camera control apparatus and method, and camera control system
EP3420544B1 (en) A method and apparatus for conducting surveillance
US8929596B2 (en) Surveillance including a modified video data stream
CN110555876B (en) Method and apparatus for determining position
KR102183473B1 (en) Method for monitoring images and apparatus for the same
CN108924590B (en) Video playing and photographing system
KR102329077B1 (en) Method and system for real-time communication between satellites and mobile devices
KR102242694B1 (en) Monitoring method and apparatus using video wall
KR20200094444A (en) Intelligent image photographing apparatus and apparatus and method for object tracking using the same
CN113536993A (en) Target tracking method, device, system, electronic device and storage medium
CN111565298B (en) Video processing method, device, equipment and computer readable storage medium
CN109118233B (en) Authentication method and device based on face recognition
CN110458108A (en) Method for real-time monitoring, system, terminal device and storage medium hand-manipulated
CN108961424B (en) Virtual information processing method, device and storage medium
EP2808805A1 (en) Method and apparatus for displaying metadata on a display and for providing metadata for display
CN108108396B (en) Aerial photography picture splicing management system for aircraft
WO2019056492A1 (en) Contract investigation processing method, storage medium, and server
JP5864371B2 (en) Still image automatic generation system, worker information processing terminal, instructor information processing terminal, and determination device in still image automatic generation system
KR102242693B1 (en) Monitoring method and apparatus based on video management system
CN113099248B (en) Panoramic video filling method, device, equipment and storage medium
CN112672057B (en) Shooting method and device
KR102372450B1 (en) Aerial video trading system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination