CN116189484A - Method and device for determining flight guarantee node, electronic equipment and storage medium - Google Patents

Method and device for determining flight guarantee node, electronic equipment and storage medium

Info

Publication number
CN116189484A
Authority
CN
China
Prior art keywords
image data
vehicle
vehicle information
determining
picture
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310082722.5A
Other languages
Chinese (zh)
Inventor
曹利波
党婉丽
耿龙
王朝
郑怀宇
牛杰
陈肇欣
张涛
敬亦婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Second Research Institute of CAAC
Original Assignee
Second Research Institute of CAAC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Second Research Institute of CAAC filed Critical Second Research Institute of CAAC
Priority to CN202310082722.5A priority Critical patent/CN116189484A/en
Publication of CN116189484A publication Critical patent/CN116189484A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G5/06 Traffic control systems for aircraft, e.g. air-traffic control [ATC] for control when on the ground
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 Target detection

Abstract

The application provides a method, a device, electronic equipment and a storage medium for determining a flight guarantee node. The method comprises: comparing image data with each first vehicle picture in a first template library to obtain initial vehicle information corresponding to the image data; if the initial vehicle information is empty, comparing the image data with the second vehicle pictures corresponding to the aircraft stand in a second template library to obtain target vehicle information corresponding to the image data; if the initial vehicle information is non-empty, taking the initial vehicle information as the target vehicle information; and determining the preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data. By first comparing the image data with each first vehicle picture in the first template library to obtain the initial vehicle information and then determining the target vehicle information from it, the current flight guarantee node can be determined, improving both the accuracy and the efficiency of determining flight guarantee nodes.

Description

Method and device for determining flight guarantee node, electronic equipment and storage medium
Technical Field
The invention relates to the technical field of aviation, in particular to a method and a device for determining a flight guarantee node, electronic equipment and a storage medium.
Background
Flight ground service guarantee is a key link in the whole chain of civil aviation airport operation. The flight guarantee node is an important component of flight ground service guarantee, and includes nodes such as guide vehicle in place, passenger stairs vehicle in place, aircraft in place and fuelling vehicle in place.
Currently, most airports rely on manual work to determine and report the current flight guarantee node. However, the manual approach is inefficient and suffers from missed and false reports.
Disclosure of Invention
In view of the above, an object of the present application is to provide a method, an apparatus, an electronic device, and a storage medium for determining a flight guarantee node, which can determine the current flight guarantee node, improving both the accuracy and the efficiency of determining flight guarantee nodes.
In a first aspect, an embodiment of the present application provides a method for determining a flight guarantee node, the method comprising:
acquiring image data monitored at each aircraft stand of the airport;
comparing the image data with each first vehicle picture in a first template library to obtain initial vehicle information corresponding to the image data;
if the initial vehicle information is empty, comparing the image data with the second vehicle pictures corresponding to the aircraft stand in a second template library to obtain target vehicle information corresponding to the image data;
if the initial vehicle information is non-empty, taking the initial vehicle information as the target vehicle information;
and determining the preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data.
In one possible implementation, comparing the image data with each first vehicle picture in the first template library includes:
extracting image features from the extraction area corresponding to the aircraft stand in the image data;
comparing the image features with the template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features;
and determining the vehicle information corresponding to template features whose difference value is smaller than a preset difference value as the initial vehicle information.
In one possible implementation, comparing the image data with the second vehicle pictures corresponding to the aircraft stand in the second template library includes:
extracting sub-image data of the same size as the second vehicle picture from the image data;
counting the number of boundary points of the second vehicle picture;
calculating the angle cosine of the direction vectors between each boundary point in the sub-image data and each boundary point in the second vehicle picture;
calculating a matching score between the sub-image data and each second vehicle picture according to the number of boundary points and all the angle cosines;
and determining the vehicle information of a second vehicle picture whose matching score is greater than a preset score as the target vehicle information.
In one possible implementation, calculating the angle cosine of the direction vectors between each boundary point in the sub-image data and each boundary point in the second vehicle picture includes:
determining the direction gradient of each boundary point in the sub-image data and in the second vehicle picture respectively;
and calculating the angle cosine of the direction vectors between each boundary point in the sub-image data and each boundary point in the second vehicle picture according to the direction gradients.
In one possible implementation, calculating the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and all the angle cosines includes:
calculating the sum of the angle cosines between each boundary point in the sub-image data and each boundary point in the second vehicle picture;
and taking the ratio of this sum to the number of boundary points as the matching score of the sub-image data and the second vehicle picture.
In one possible implementation, comparing the image features with the template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features includes:
calculating the absolute value of the difference between the image features and the template features of each first vehicle picture in the first template library as the difference value between the image features and the template features.
In a second aspect, an embodiment of the present application further provides a device for determining a flight guarantee node, the device comprising:
an acquisition module, used for acquiring the image data monitored at each aircraft stand of the airport;
a comparison module, used for comparing the image data with each first vehicle picture in the first template library to obtain initial vehicle information corresponding to the image data;
the comparison module is further used for comparing the image data with the second vehicle pictures corresponding to the aircraft stand in the second template library if the initial vehicle information is empty, so as to obtain target vehicle information corresponding to the image data;
a determining module, used for taking the initial vehicle information as the target vehicle information if the initial vehicle information is non-empty;
the determining module is further used for determining the preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data.
In one possible implementation, the comparison module is specifically used for: extracting image features from the extraction area corresponding to the aircraft stand in the image data; comparing the image features with the template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features; and determining the vehicle information corresponding to template features whose difference value is smaller than the preset difference value as the initial vehicle information.
In one possible embodiment, the comparison module is specifically configured to: extract sub-image data of the same size as the second vehicle picture from the image data; count the number of boundary points of the second vehicle picture; calculate the angle cosine of the direction vectors between each boundary point in the sub-image data and each boundary point in the second vehicle picture; calculate the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and all the angle cosines; and determine the vehicle information of a second vehicle picture whose matching score is greater than the preset score as the target vehicle information.
In one possible embodiment, the comparison module is further configured to:
determine the direction gradient of each boundary point in the sub-image data and in the second vehicle picture respectively;
and calculate the angle cosine of the direction vectors between each boundary point in the sub-image data and each boundary point in the second vehicle picture according to the direction gradients.
In one possible embodiment, the comparison module is further configured to:
calculate the sum of the angle cosines between each boundary point in the sub-image data and each boundary point in the second vehicle picture;
and take the ratio of this sum to the number of boundary points as the matching score of the sub-image data and the second vehicle picture.
In one possible embodiment, the comparison module is further configured to:
calculate the absolute value of the difference between the image features and the template features of each first vehicle picture in the first template library as the difference value between the image features and the template features.
In a third aspect, an embodiment of the present application further provides an electronic device, comprising: a processor, a storage medium and a bus, wherein the storage medium stores machine-readable instructions executable by the processor; when the electronic device is running, the processor communicates with the storage medium through the bus, and the processor executes the machine-readable instructions to perform the steps of the method for determining a flight guarantee node according to any one of the first aspect.
In a fourth aspect, an embodiment of the present application further provides a computer-readable storage medium having a computer program stored thereon, which, when executed by a processor, performs the steps of the method for determining a flight guarantee node according to any one of the first aspect.
The embodiment of the application provides a method, a device, electronic equipment and a storage medium for determining a flight guarantee node. The method comprises: acquiring image data monitored at each aircraft stand of the airport; comparing the image data with each first vehicle picture in the first template library to obtain initial vehicle information corresponding to the image data; if the initial vehicle information is empty, comparing the image data with the second vehicle pictures corresponding to the aircraft stand in the second template library to obtain target vehicle information corresponding to the image data; if the initial vehicle information is non-empty, taking the initial vehicle information as the target vehicle information; and determining the preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data. By first comparing the image data with each first vehicle picture in the first template library to obtain the initial vehicle information and then determining the target vehicle information from it, the current flight guarantee node can be determined, improving both the accuracy and the efficiency of determining flight guarantee nodes.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments are briefly described below. It should be understood that the following drawings illustrate only some embodiments of the present application and therefore should not be considered limiting of the scope; other related drawings may be obtained from these drawings without inventive effort by a person skilled in the art.
Fig. 1 shows a flowchart of a method for determining a flight guarantee node according to an embodiment of the present application;
Fig. 2 shows a flowchart of another method for determining a flight guarantee node according to an embodiment of the present application;
Fig. 3 shows a schematic structural diagram of a device for determining a flight guarantee node according to an embodiment of the present application;
fig. 4 shows a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions of the embodiments are described below clearly and completely with reference to the accompanying drawings. It should be understood that the accompanying drawings in the present application are only for illustration and description and are not intended to limit the protection scope of the present application; in addition, the schematic drawings are not drawn to scale. A flowchart, as used in this application, illustrates operations implemented according to some embodiments of the present application. It should be understood that the operations of a flowchart may be implemented out of order, and that steps without logical dependency may be performed in reverse order or concurrently. Moreover, those skilled in the art may add one or more other operations to, or remove one or more operations from, a flowchart.
In addition, the described embodiments are only some, but not all, of the embodiments of the present application. The components of the embodiments of the present application, which are generally described and illustrated in the figures herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as provided in the accompanying drawings, is not intended to limit the scope of the application, as claimed, but is merely representative of selected embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be within the scope of the present application.
In order to enable one skilled in the art to use the present disclosure, the following embodiments are presented in connection with a specific application scenario "aeronautical technology field". It will be apparent to those having ordinary skill in the art that the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present application. Although the present application is described primarily in the context of "aeronautical technology," it should be understood that this is but one exemplary embodiment.
It should be noted that the term "comprising" will be used in the embodiments of the present application to indicate the presence of the features stated hereinafter, but not to exclude the addition of other features.
The following describes in detail the method for determining a flight guarantee node provided in an embodiment of the present application.
Referring to fig. 1, which shows a flowchart of a method for determining a flight guarantee node according to an embodiment of the present application, the specific implementation process is as follows:
s101, acquiring image data monitored at each airport.
S102, comparing the image data with each first vehicle picture in the first template library to obtain initial vehicle information corresponding to the image data.
And S103, if the initial vehicle information is empty, comparing the image data with a second vehicle picture corresponding to the position of the vehicle in a second template library to obtain target vehicle information corresponding to the image data.
And S104, if the initial vehicle information is not empty, taking the initial vehicle information as target vehicle information.
S105, determining a preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data.
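As a minimal sketch of the two-stage decision in S101 to S105 (not the patent's actual implementation; all function names here are hypothetical stand-ins for the matching and lookup steps described above):

```python
def determine_guarantee_node(image_data, first_library, second_library, stand_id,
                             match_first, match_second, node_for_vehicle):
    """Two-stage lookup sketch of S101-S105: try the general first template
    library; fall back to the stand-specific second library only when the
    first comparison yields no vehicle information."""
    initial = match_first(image_data, first_library)            # S102
    if not initial:                                             # S103: empty result
        target = match_second(image_data, second_library[stand_id])
    else:                                                       # S104: non-empty
        target = initial
    return node_for_vehicle(target)                             # S105
```

Injecting the matchers as callables keeps the control flow testable independently of the underlying image processing.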
Exemplary steps of embodiments of the present application are described below:
s101, acquiring image data monitored at each airport.
In the embodiment of the application, the camera equipment is arranged at all the sites of the airport, and a plurality of camera equipment can monitor the sites in each site. And acquiring a real-time machine position scene shot by the camera equipment, namely image data in real time.
S102, comparing the image data with each first vehicle picture in the first template library to obtain initial vehicle information corresponding to the image data.
In the embodiment of the application, the first template library is a pre-established template library storing first vehicle pictures of all kinds of special vehicles, where the special vehicles are vehicles used for aviation ground guarantee services, such as fuelling vehicles, passenger stairs vehicles, garbage trucks and luggage trailers. The image data is compared with each first vehicle picture, and the name of the special vehicle corresponding to any successfully matched first vehicle picture is taken as the initial vehicle information. For example, if the successfully matched first vehicle picture is a picture of a fuelling vehicle, the initial vehicle information is "fuelling vehicle". The initial vehicle information may contain the names of a plurality of special vehicles.
The following describes the step of comparing the image data with the first vehicle picture:
and I, extracting image features of an extraction area corresponding to the machine position in the image data.
In this embodiment of the present application, the extraction area is an extraction area corresponding to a machine location where image data is located, and refers to an area in the image data where image features need to be extracted, and different machine location scenes are different, so that the extraction areas corresponding to different machine locations are also different. The number of the extraction areas corresponding to each machine position can be one or more, and the extraction areas are determined according to actual conditions.
II, comparing the image features with the template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features.
In this embodiment of the present application, the difference value is used to characterize the difference between the image data corresponding to the image features and the first vehicle picture corresponding to the template features. The larger the difference value, the greater the probability that the vehicle in the image data differs from the vehicle in the first vehicle picture; the smaller the difference value, the greater the probability that they are identical.
The following describes the calculation of the difference value between the image features and the template features.
Here, the absolute value of the difference between the image features and the template features of each first vehicle picture in the first template library is calculated to obtain the difference value between the image features and the template features.
In the embodiment of the application, the image features and the template features of the first vehicle picture are subtracted to obtain a difference, and the absolute value of this difference is used as the difference value between the image features and the template features.
III, determining the vehicle information corresponding to template features whose difference value is smaller than the preset difference value as the initial vehicle information.
In the embodiment of the present application, the smaller the difference value, the greater the probability that the vehicle in the image data is identical to the vehicle in the first vehicle picture. When the difference value between the image features and the template features of a first vehicle picture is smaller than the preset difference value, the vehicle in the image data is considered identical to the vehicle in that first vehicle picture, so the vehicle information of the first vehicle picture corresponding to those template features is determined as the initial vehicle information. The initial vehicle information is the name of the vehicle contained in the first vehicle picture.
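Steps I to III amount to template matching under an absolute-difference threshold. A minimal NumPy sketch, assuming the features are fixed-length vectors (the patent does not fix a feature type, and all names here are hypothetical):

```python
import numpy as np

def initial_vehicle_info(image_feature, template_features, preset_difference):
    """Return the vehicle names whose template features differ from the image
    features by less than the preset difference value (steps I-III).
    `template_features` maps vehicle name -> feature vector; the difference
    value is taken here as the summed absolute feature difference."""
    matches = []
    for name, tpl in template_features.items():
        difference = np.abs(np.asarray(image_feature, float)
                            - np.asarray(tpl, float)).sum()
        if difference < preset_difference:
            matches.append(name)
    return matches
```

An empty return list corresponds to the "initial vehicle information is empty" branch that triggers the second-library comparison in S103.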
S103, if the initial vehicle information is empty, comparing the image data with the second vehicle pictures corresponding to the aircraft stand in the second template library to obtain target vehicle information corresponding to the image data.
In the embodiment of the present application, if the initial vehicle information is empty, it indicates that no special vehicle was found in the image data. However, this conclusion may be erroneous, and a special vehicle may in fact be present. Therefore, in the embodiment of the application, the image data is further compared with the second vehicle pictures corresponding to the aircraft stand in the second template library to judge whether a special vehicle exists in the image data and to determine the target vehicle information, which improves the accuracy of determining the target vehicle information and thereby the accuracy of determining the flight guarantee node.
The second template library is a pre-established template library storing second vehicle pictures of the special vehicles corresponding to each aircraft stand. Each stand corresponds to pictures of all kinds of special vehicles, which are vehicles used for aviation ground guarantee services, such as fuelling vehicles, passenger stairs vehicles, garbage trucks and luggage trailers. Because the positions of the stands differ, the shooting angles of the second vehicle pictures of the same kind of special vehicle corresponding to different stands also differ, which improves the accuracy of subsequently determining the target vehicle information. For example, both stand 1 and stand 2 correspond to a second vehicle picture of the fuelling vehicle, but the picture for stand 1 is taken at a different angle from the picture for stand 2.
In addition, the second template library also comprises, for each aircraft stand, second vehicle pictures without special vehicles and second vehicle pictures of the aircraft.
The image data is compared with the second vehicle pictures corresponding to the aircraft stand in the second template library as follows: extracting sub-image data of the same size as the second vehicle picture from the image data; counting the number of boundary points of the second vehicle picture; calculating the angle cosine of the direction vectors between each boundary point in the sub-image data and each boundary point in the second vehicle picture; calculating the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and all the angle cosines; and determining the vehicle information of a second vehicle picture whose matching score is greater than the preset score as the target vehicle information.
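A minimal NumPy sketch of this comparison, assuming grayscale arrays and using image-gradient unit vectors as the "direction vectors" of boundary points (the function names and the gradient choice are assumptions, not the patent's exact implementation):

```python
import numpy as np

def unit_gradients(image):
    """Unit direction vectors of the intensity gradient at every pixel."""
    gy, gx = np.gradient(image.astype(float))  # gy: along rows, gx: along columns
    mag = np.hypot(gx, gy)
    mag[mag == 0] = 1.0  # leave flat regions as zero vectors
    return gx / mag, gy / mag

def matching_score(sub_image, template, boundary_points):
    """Score = (sum of angle cosines between corresponding boundary-point
    direction vectors) / (number of template boundary points).
    `boundary_points` lists (row, col) coordinates of template boundary points."""
    sx, sy = unit_gradients(sub_image)
    tx, ty = unit_gradients(template)
    total = 0.0
    for r, c in boundary_points:
        # cosine of the angle between two unit vectors is their dot product
        total += sx[r, c] * tx[r, c] + sy[r, c] * ty[r, c]
    return total / len(boundary_points)
```

A perfectly aligned match scores 1.0 while unrelated content scores near 0, which is why thresholding against a preset score separates matches from non-matches.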
And S104, if the initial vehicle information is not empty, taking the initial vehicle information as target vehicle information.
In the embodiment of the present application, if the initial vehicle information is not empty, the type of special vehicle contained in the image data has already been determined, so no further judgment is required and the initial vehicle information is directly taken as the target vehicle information.
S105, determining a preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data.
In the embodiment of the application, different special vehicles correspond to different flight guarantee nodes. By way of example, the flight guarantee node corresponding to the fuelling vehicle is aircraft fuelling; the flight guarantee node corresponding to the luggage trailer is loading and unloading passenger luggage; and the flight guarantee node corresponding to target vehicle information matched to a second vehicle picture without any special vehicle is, for example, aircraft departure.
Referring to fig. 2, which shows a flowchart of another method for determining a flight guarantee node according to an embodiment of the present application, the exemplary steps are described below:
s201, extracting sub-image data of the same size as the second vehicle picture from the image data.
In this embodiment, the size of the sub-image data needs to be the same as the size of the second vehicle picture to facilitate the subsequent comparison. Sub-image data of the same size as the second vehicle picture is extracted, from top to bottom and from left to right, from the extraction area corresponding to the aircraft stand in the image data. The extraction area is the area of the image data from which sub-image data needs to be extracted, and corresponds to the aircraft stand where the image data was captured; since the scenes of different stands differ, the extraction areas corresponding to different stands also differ. Each stand may correspond to one or more extraction areas, determined according to the actual situation.
S202, counting the number of boundary points of the second vehicle picture.
In this embodiment of the present application, the number of boundary points of the second vehicle image corresponding to the machine position where the image data is located in the second template library is counted.
In this embodiment of the application, the number of boundary points of the second vehicle picture is counted by means of the Canny edge detection algorithm; the number of boundary points may also be counted in other ways, which is not limited here.
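Counting boundary points can be sketched as follows, assuming a binary edge map (such as the output of a Canny detector) is already available; the function names are illustrative, not from the patent.

```python
import numpy as np

def boundary_points(edge_map: np.ndarray):
    """Coordinates of every nonzero pixel in a binary edge map; each one
    is treated as a boundary point. Illustrative sketch only."""
    ys, xs = np.nonzero(edge_map)
    return list(zip(ys.tolist(), xs.tolist()))

def count_boundary_points(edge_map: np.ndarray) -> int:
    """Number of boundary points in the edge map."""
    return int(np.count_nonzero(edge_map))
```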
Here, the second template library is a pre-established template library that stores second vehicle pictures of the special vehicles corresponding to each machine position. Each machine position corresponds to pictures of all kinds of special vehicles, where the special vehicles include vehicles used for aviation ground support services, such as fuel trucks, passenger stairs vehicles, garbage trucks, and luggage trailers. Because the machine positions are located differently, the shooting angles of the second vehicle pictures of the same kind of special vehicle differ between machine positions, which improves the accuracy of subsequently determining the target vehicle information. For example, machine position 1 and machine position 2 both correspond to a second vehicle picture of the fuel truck, but the second vehicle picture of the fuel truck at machine position 1 is not taken at the same angle as that at machine position 2.
In addition, the second template library also includes second vehicle pictures of the non-special vehicles corresponding to each machine position, as well as second vehicle pictures of the aircraft.
S203, calculating an included angle cosine of a direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture.
In this embodiment, the cosine of the included angle of a direction vector is calculated in the same way as in the conventional algorithm for the cosine of an included angle, and is not described in detail here.

Specifically, the direction gradient of each boundary point in the sub-image data and in the second vehicle picture is determined separately.

In this embodiment of the application, the direction gradient of each boundary point in the sub-image data and in the second vehicle picture is determined by the Sobel operator. Here, the second vehicle picture refers to the second vehicle picture in the second template library corresponding to the machine position where the image data was captured.

Specifically, according to the direction gradients, the cosine of the included angle of the direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture is calculated.

In this embodiment, the cosine of the included angle is calculated in the same way as in the conventional algorithm and is not described in detail here.
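The gradient and included-angle-cosine computation can be sketched with hand-rolled 3x3 Sobel kernels as follows. The patent does not specify an implementation; the function names and per-pixel formulation here are assumptions.

```python
import numpy as np

# Standard 3x3 Sobel kernels for horizontal and vertical gradients.
SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def sobel_gradients(img: np.ndarray, y: int, x: int):
    """Direction gradient (gx, gy) at an interior pixel via Sobel kernels.
    Illustrative sketch only."""
    patch = img[y - 1:y + 2, x - 1:x + 2].astype(float)
    return float((patch * SOBEL_X).sum()), float((patch * SOBEL_Y).sum())

def angle_cosine(g1, g2, eps=1e-12):
    """Cosine of the included angle between two gradient direction vectors."""
    n1 = (g1[0] ** 2 + g1[1] ** 2) ** 0.5
    n2 = (g2[0] ** 2 + g2[1] ** 2) ** 0.5
    return (g1[0] * g2[0] + g1[1] * g2[1]) / max(n1 * n2, eps)
```

A cosine near 1 means the two boundary points have nearly the same edge orientation; a cosine near 0 means their orientations are roughly perpendicular.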
S204, calculating the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and cosine of all included angles.
In this embodiment of the application, the matching score is used to characterize the degree of matching between the sub-image data and a second vehicle picture. The greater the matching score, the greater the probability that the vehicle in the sub-image data is the same as the vehicle in the second vehicle picture; the smaller the matching score, the smaller that probability.

The matching score between the sub-image data and a second vehicle picture is calculated as follows.

In the first step, for each second vehicle picture, the sum of the cosines of the included angles of the direction vectors between each boundary point in the sub-image data and each boundary point in that second vehicle picture is calculated.

In this embodiment, for each second vehicle picture, the sum value is obtained by adding up the cosines of the included angles of the direction vectors between each boundary point in the sub-image data and each boundary point in that second vehicle picture.

In the second step, the ratio of the sum value to the number of boundary points is taken as the matching score of the sub-image data and the second vehicle picture.

In this embodiment of the application, the calculation formula of the matching score is: matching score = sum value / number of boundary points.
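The formula above can be sketched directly; this helper is illustrative and assumes the per-pair cosines have already been computed.

```python
def matching_score(angle_cosines, n_boundary_points: int) -> float:
    """Matching score per the formula in the text:
    score = (sum of included-angle cosines) / (number of boundary points
    of the second vehicle picture). Illustrative sketch only."""
    return sum(angle_cosines) / n_boundary_points
```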
S205, determining the vehicle information of the second vehicle picture whose matching score is greater than the preset score as the target vehicle information.
In this embodiment of the application, the larger the matching score, the greater the probability that the vehicle in the sub-image data is the same as the vehicle in the second vehicle picture. When the matching score of the sub-image data and a second vehicle picture is greater than the preset score, this indicates that the vehicle in the sub-image data is the same as the vehicle in that second vehicle picture, so the vehicle information of that second vehicle picture is determined as the target vehicle information. Here, the vehicle information refers to the name of the vehicle contained in the second vehicle picture.
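The thresholding in step S205 can be sketched as follows; the vehicle names and the preset score used here are hypothetical, introduced only for illustration.

```python
def select_target_vehicle(scores: dict, preset_score: float):
    """Return the vehicle information (names) of second vehicle pictures
    whose matching score exceeds the preset score. Names and threshold
    are hypothetical; illustrative sketch only."""
    return [name for name, score in scores.items() if score > preset_score]
```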
The embodiment of the application discloses another method for determining a flight guarantee node, including: extracting sub-image data with the same size as the second vehicle picture from the image data; counting the number of boundary points of the second vehicle picture; calculating the cosine of the included angle of the direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture; calculating the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and all the included-angle cosines; and determining the vehicle information of the second vehicle picture whose matching score is greater than the preset score as the target vehicle information. This method further judges whether a special vehicle is present in the picture data, which improves the accuracy of determining the target vehicle information and, in turn, the accuracy of determining the flight guarantee node.
Based on the same inventive concept, an embodiment of the present application further provides a device for determining a flight guarantee node corresponding to the above method. Since the principle by which the device solves the problem is similar to that of the method for determining a flight guarantee node, the implementation of the device may refer to the implementation of the method, and repeated description is omitted.
Referring to Fig. 3, a schematic diagram of a device for determining a flight guarantee node according to an embodiment of the present application is shown. The device for determining a flight guarantee node includes:
an acquisition module 301, configured to acquire image data monitored at each airport;
a comparison module 302, configured to compare the image data with each first vehicle picture in the first template library, so as to obtain initial vehicle information corresponding to the image data;
the comparison module 302 is further configured to compare the image data with a second vehicle picture corresponding to a machine position in the second template library if the initial vehicle information is empty, so as to obtain target vehicle information corresponding to the image data;
a determining module 303, configured to take the initial vehicle information as the target vehicle information if the initial vehicle information is not empty;
the determining module 303 is further configured to determine a preset flight protection node corresponding to the target vehicle information as a flight protection node corresponding to the image data.
In one possible implementation, the comparison module 302 is specifically configured to extract an image feature of an extraction area corresponding to the location in the image data; comparing the image characteristics with the template characteristics of each first vehicle picture in the first template library to obtain a difference value between the image characteristics and the template characteristics; and determining the vehicle information of the template features with the difference value smaller than the preset difference value as initial vehicle information.
In one possible implementation, the comparison module 302 is specifically configured to extract sub-image data of the same size as the second vehicle picture from the image data; counting the number of boundary points of the second vehicle picture; calculating an included angle cosine of a direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture; calculating the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and cosine of all included angles; and determining the vehicle information of the second vehicle picture with the matching score being larger than the preset score as target vehicle information.
In one possible implementation, the comparison module 302 is further configured to:
respectively determining the direction gradient of each boundary point in the sub-image data and the second vehicle picture;
and calculating an included angle cosine of the direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture according to the direction gradient.
In one possible implementation, the comparison module 302 is further configured to:
calculating the sum of the cosine of the included angle between each boundary point in the sub-image data and each boundary point in the second vehicle picture;
and taking the ratio of the sum value to the number of boundary points as the matching score of the sub-image data and the second vehicle picture.
In one possible implementation, the comparison module 302 is further configured to:
and calculating the absolute value of the difference value between the image characteristic and the template characteristic of each first vehicle picture in the first template library to obtain a difference value between the image characteristic and the template characteristic.
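The absolute-difference comparison can be sketched as follows, under the assumption (not stated in the patent) that the image feature and template features are numeric vectors; the function names, template names, and threshold are illustrative.

```python
import numpy as np

def feature_difference(image_feature: np.ndarray, template_feature: np.ndarray) -> float:
    """Absolute difference between the image feature and one template
    feature; a smaller value indicates a closer match. Illustrative sketch."""
    return float(np.abs(image_feature - template_feature).sum())

def initial_vehicle_info(image_feature, templates: dict, preset_diff: float):
    """Vehicle names whose template-feature difference is below the preset
    difference value; an empty list corresponds to 'initial vehicle
    information is empty' in the text. Names/threshold are hypothetical."""
    return [name for name, feat in templates.items()
            if feature_difference(image_feature, feat) < preset_diff]
```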
The embodiment of the application provides a device for determining a flight guarantee node, which performs the following: acquiring image data monitored at each airport; comparing the image data with each first vehicle picture in the first template library to obtain initial vehicle information corresponding to the image data; if the initial vehicle information is empty, comparing the image data with the second vehicle picture corresponding to the machine position in the second template library to obtain target vehicle information corresponding to the image data; if the initial vehicle information is non-empty, taking the initial vehicle information as the target vehicle information; and determining the preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data. In this way, the image data is compared with each first vehicle picture in the first template library to obtain the initial vehicle information, and the target vehicle information is then determined from the initial vehicle information, so that the current flight guarantee node can be determined, which improves both the accuracy and the efficiency of determining the flight guarantee node.
As shown in Fig. 4, an electronic device 400 provided in an embodiment of the present application includes: a processor 401, a memory 402, and a bus. The memory 402 stores machine-readable instructions executable by the processor 401; when the electronic device is running, the processor 401 communicates with the memory 402 through the bus, and the processor 401 executes the machine-readable instructions to perform the steps of the above method for determining a flight guarantee node.
Specifically, the memory 402 and the processor 401 may be a general-purpose memory and processor, which are not specifically limited here; the above method for determining a flight guarantee node can be performed when the processor 401 runs a computer program stored in the memory 402.
Corresponding to the above method for determining a flight guarantee node, an embodiment of the present application further provides a computer-readable storage medium on which a computer program is stored; when the computer program is executed by a processor, the steps of the above method for determining a flight guarantee node are performed.
It will be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working procedures of the system and apparatus described above may refer to the corresponding procedures in the method embodiments and are not described in detail in this application. In the several embodiments provided in this application, it should be understood that the disclosed systems, devices, and methods may be implemented in other manners. The above-described apparatus embodiments are merely illustrative; the division of the modules is merely a logical function division, and there may be other divisions in actual implementation. For example, multiple modules or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling, direct coupling, or communication connection shown or discussed may be realized through some communication interface, or through indirect coupling or communication connection between devices or modules, in electrical, mechanical, or other form.
The modules described as separate components may or may not be physically separate, and components shown as modules may or may not be physical units, may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the information processing method described in the embodiments of the present application. The aforementioned storage medium includes: a USB flash disk, a removable hard disk, a ROM, a RAM, a magnetic disk, an optical disk, or the like.
The foregoing is merely a specific embodiment of the present application, but the protection scope of the present application is not limited thereto, and any person skilled in the art can easily think about changes or substitutions within the technical scope of the present application, and the changes or substitutions are covered in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for determining a flight guarantee node, characterized by comprising the following steps:
acquiring image data monitored at each airport;
comparing the image data with each first vehicle picture in a first template library to obtain initial vehicle information corresponding to the image data;
if the initial vehicle information is empty, comparing the image data with a second vehicle picture corresponding to the machine position in a second template library to obtain target vehicle information corresponding to the image data;
if the initial vehicle information is non-empty, taking the initial vehicle information as target vehicle information;
and determining a preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data.
2. The method of claim 1, wherein the comparing the image data with each first vehicle picture in the first template library comprises:
extracting image features of an extraction area corresponding to the machine position in the image data;
comparing the image features with template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features;
and determining the vehicle information of the template features with the difference value smaller than the preset difference value as initial vehicle information.
3. The method for determining a flight guarantee node according to claim 1, wherein the comparing the image data with the second vehicle picture corresponding to the machine position in the second template library comprises:
extracting sub-image data of the same size as the second vehicle picture from the image data;
counting the number of boundary points of the second vehicle picture; calculating an included angle cosine of a direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture;
calculating the matching score of the sub-image data and each second vehicle picture according to the number of the boundary points and the cosine of all the included angles;
and determining the vehicle information of the second vehicle picture with the matching score larger than the preset score as target vehicle information.
4. The method for determining a flight guarantee node according to claim 3, wherein calculating the cosine of the included angle of the direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture comprises:
respectively determining the direction gradient of each boundary point in the sub-image data and the second vehicle picture;
and calculating an included angle cosine of a direction vector between each boundary point in the sub-image data and each boundary point in the second vehicle picture according to the direction gradient.
5. The method for determining a flight guarantee node according to claim 3, wherein calculating the matching score of the sub-image data and each second vehicle picture according to the number of boundary points and all the included-angle cosines comprises:
calculating the sum of cosine of an included angle between each boundary point in the sub-image data and a direction vector of each boundary point in the second vehicle image aiming at each second vehicle image;
and taking the ratio of the sum value to the number of the boundary points as a matching score of the sub-image data and the second vehicle picture.
6. The method for determining a flight guarantee node according to claim 2, wherein the comparing the image features with the template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features comprises:
and calculating the absolute value of the difference value between the image characteristic and the template characteristic of each first vehicle picture in the first template library to obtain a difference value between the image characteristic and the template characteristic.
7. A device for determining a flight guarantee node, characterized in that the device comprises:
the acquisition module is used for acquiring the image data monitored by each airport;
the comparison module is used for comparing the image data with each first vehicle picture in the first template library to obtain initial vehicle information corresponding to the image data;
the comparison module is further used for comparing the image data with a second vehicle picture corresponding to the position in a second template library if the initial vehicle information is empty, so as to obtain target vehicle information corresponding to the image data;
the determining module is used for taking the initial vehicle information as target vehicle information if the initial vehicle information is non-empty;
the determining module is further configured to determine a preset flight guarantee node corresponding to the target vehicle information as the flight guarantee node corresponding to the image data.
8. The device for determining a flight guarantee node according to claim 7, wherein the comparison module is specifically configured to:
extracting image features of an extraction area corresponding to the machine position in the image data;
comparing the image features with template features of each first vehicle picture in the first template library to obtain a difference value between the image features and the template features;
and determining the vehicle information of the template features with the difference value smaller than the preset difference value as initial vehicle information.
9. An electronic device, comprising: a processor, a storage medium, and a bus, the storage medium storing machine-readable instructions executable by the processor, the processor and the storage medium communicating over the bus when the electronic device is running, the processor executing the machine-readable instructions to perform the steps of the method for determining a flight guarantee node according to any one of claims 1 to 6.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the steps of the method for determining a flight guarantee node according to any one of claims 1 to 6 are performed.
CN202310082722.5A 2023-01-28 2023-01-28 Method and device for determining flight guarantee node, electronic equipment and storage medium Pending CN116189484A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310082722.5A CN116189484A (en) 2023-01-28 2023-01-28 Method and device for determining flight guarantee node, electronic equipment and storage medium


Publications (1)

Publication Number Publication Date
CN116189484A true CN116189484A (en) 2023-05-30

Family

ID=86445667



Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117745085A (en) * 2024-02-21 2024-03-22 中国民用航空总局第二研究所 Flight guarantee service adjustment method and device, electronic equipment and storage medium


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination