CN109063576A - Management method and device for flight movement node - Google Patents
- Publication number
- Publication number: CN109063576A (application CN201810732611.3A)
- Authority
- CN
- China
- Prior art keywords
- data
- flight
- interface
- server
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/214—Generating training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/94—Hardware or software architectures specially adapted for image or video understanding
- G06V10/95—Hardware or software architectures specially adapted for image or video understanding structured as a network, e.g. client-server architectures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Data Mining & Analysis (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Evolutionary Biology (AREA)
- Evolutionary Computation (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Engineering & Computer Science (AREA)
- Bioinformatics & Computational Biology (AREA)
- Artificial Intelligence (AREA)
- Software Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
Abstract
This application discloses a management method and device for flight movement nodes. The management method includes: acquiring image data of preset flight movement nodes; inputting the image data into an image recognition server; configuring a data interface between the image recognition server and a flight operations command center; and sending the data of the flight movement nodes to the flight operations command center through the data interface. The application addresses the technical problem of poor management of flight movement nodes. It achieves the following technical effects: key nodes are recorded dynamically and abnormal events are captured in real time; a management application platform is built on this basis, adapted to each airport's conditions, and fed back into the existing application systems, thereby improving flight operation efficiency, raising the level of safety management and improving the quality of professional services.
Description
Technical field
This application relates to smart airports and the field of image recognition, and in particular to a management method and device for flight movement nodes.
Background technique
In smart airport management and control, implementing intelligent management of flight movement nodes gives the airport control over the state of each flight and provides strong informational support for the construction of smart airports.

The inventors found the following problems with current flight movement node statistics: surveillance video is only reviewed retrospectively after an incident has occurred, and real-time dynamic processing of the video is not possible. Furthermore, to learn the times of occurrence of each movement node of an aircraft from landing to take-off, the only option is manual recording, which costs considerable labor. At the same time, an airport handles a large number of landings and take-offs every day, and the occurrence times of the individual movement nodes follow no simple rule, so with manual operation alone it is difficult to compile node-time statistics at this scale.

No effective solution has yet been proposed for the problem in the related art that flight movement nodes are poorly managed.
Summary of the invention
The main purpose of the application is to provide a management method and device for flight movement nodes, so as to solve the problem of poor management of flight movement nodes.

To achieve this goal, according to one aspect of the application, a management method for flight movement nodes is provided.

The management method for flight movement nodes according to the application includes:

acquiring image data of preset flight movement nodes;

inputting the image data into an image recognition server;

configuring a data interface between the image recognition server and a flight operations command center; and

sending the data of the flight movement nodes to the flight operations command center through the data interface.
Further, acquiring image data of preset flight movement nodes includes:

obtaining access rights to the image acquisition devices at preset positions in the airport;

accessing and obtaining, through the access rights, the image data collected by the image acquisition devices; and

dividing the image data into video image data and picture image data.
Further, inputting the image data into an image recognition server includes:

the image recognition server receiving marking data selected by the user;

the image recognition server obtaining training data according to a preset image training level entered by the user; and

the image recognition server generating, from the training data, an image recognition model for recognizing the expected recognition target.
Further, configuring the data interface between the image recognition server and the flight operations command center includes:

establishing a link between the image recognition server and an interface application server;

configuring the data format of the processing results in the image recognition server; and

configuring the data interface in the interface application server according to the preset interface specification of the flight operations command center.
Further, sending the data of the flight movement nodes to the flight operations command center through the data interface includes any one or more of the following:

sending, through the data interface, the time at which a flight movement node occurs;

sending, through the data interface, the content-preference flight action records configured for the flight movement nodes;

sending, through the data interface, the visual content of a flight movement node as it occurs.
Further, after sending the data of the flight movement nodes to the flight operations command center through the data interface, the method further includes:

connecting the flight movement node data, via a switch, to a management application program on a terminal;

outputting a visualization of the flight movement nodes through the management application program.
To achieve the above goal, according to another aspect of the application, a management device for flight movement nodes is provided.

The management device for flight movement nodes according to the application includes:

an acquisition module for acquiring image data of preset flight movement nodes;

an input module for inputting the image data into an image recognition server;

a configuration module for configuring the data interface between the image recognition server and a flight operations command center; and

a sending module for sending the data of the flight movement nodes to the flight operations command center through the data interface.
Further, the acquisition module includes:

an obtaining unit for obtaining access rights to the image acquisition devices at preset positions in the airport;

an access unit for accessing and obtaining, through the access rights, the image data collected by the image acquisition devices; and

a division unit for dividing the image data into video image data and picture image data.

Further, the input module includes:

a receiving unit for the image recognition server to receive the marking data selected by the user;

a training data unit for the image recognition server to obtain training data according to the preset image training level entered by the user; and

a generation unit for the image recognition server to generate, from the training data, an image recognition model for recognizing the expected recognition target.

Further, the configuration module includes:

an establishing unit for establishing a link between the image recognition server and the interface application server;

a configuration unit for configuring the data format of the processing results in the image recognition server; and

a docking unit for configuring the data interface in the interface application server according to the preset interface specification of the flight operations command center.
In the embodiments of the application, image data of preset flight movement nodes is acquired and recognized by a back-end image recognition server, achieving the goal of recognizing flight movement nodes, realizing the technical effect of managing flight movement nodes, and thereby solving the technical problem of poor management of flight movement nodes.
Detailed description of the invention
The accompanying drawings, which form part of this application, are provided to give a further understanding of the application, so that its other features, objects and advantages become more apparent. The illustrative embodiment drawings of the application and their descriptions serve to explain the application and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a schematic diagram of the management method for flight movement nodes according to the first embodiment of the application;

Fig. 2 is a schematic diagram of the management method for flight movement nodes according to the second embodiment of the application;

Fig. 3 is a schematic diagram of the management method for flight movement nodes according to the third embodiment of the application;

Fig. 4 is a schematic diagram of the management method for flight movement nodes according to the fourth embodiment of the application;

Fig. 5 is a schematic diagram of the management method for flight movement nodes according to the fifth embodiment of the application;

Fig. 6 is a schematic diagram of the management method for flight movement nodes according to the sixth embodiment of the application;

Fig. 7 is a schematic diagram of the management device for flight movement nodes according to the first embodiment of the application;

Fig. 8 is a schematic diagram of the management device for flight movement nodes according to the second embodiment of the application;

Fig. 9 is a schematic diagram of the management device for flight movement nodes according to the third embodiment of the application;

Fig. 10 is a schematic diagram of the management device for flight movement nodes according to the fourth embodiment of the application; and

Fig. 11 is a schematic diagram of the management device for flight movement nodes according to the fifth embodiment of the application.
Specific embodiment
To enable those skilled in the art to better understand the solution of the application, the technical solutions in the embodiments of the application are described below clearly and completely in conjunction with the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the application. Based on the embodiments in this application, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the scope of protection of this application.
It should be noted that the terms "first", "second" and the like in the description, claims and drawings of this application are used to distinguish similar objects, not to describe a particular order or sequence. It should be understood that data used in this way are interchangeable where appropriate, so that the embodiments described here can be implemented. In addition, the terms "include" and "have", and any variants of them, are intended to cover non-exclusive inclusion; for example, a process, method, system, product or device comprising a series of steps or units is not necessarily limited to the steps or units expressly listed, but may include other steps or units that are not expressly listed or that are inherent to the process, method, product or device.
In this application, terms such as "upper", "lower", "left", "right", "front", "rear", "top", "bottom", "inner", "outer", "middle", "vertical", "horizontal", "transverse" and "longitudinal" indicate orientations or positional relationships based on the drawings. These terms serve mainly to better describe the application and its embodiments; they are not intended to require that the indicated device, element or component have a particular orientation, or be constructed and operated in a particular orientation.

Moreover, besides indicating orientation or position, some of these terms may also carry other meanings; for example, "upper" may in some cases also indicate a relation of attachment or connection. For a person of ordinary skill in the art, the specific meaning of these terms in this application can be understood as the case may be.
In addition, the terms "mounted", "arranged", "provided with", "connected", "linked" and "socketed" are to be understood broadly. For example, a connection may be fixed, detachable or integral; it may be mechanical or electrical; it may be direct, indirect through an intermediary, or internal between two devices, elements or components. For a person of ordinary skill in the art, the specific meaning of the above terms in this application can be understood as the case may be.
It should be noted that, in the absence of conflict, the embodiments of this application and the features in them can be combined with each other. The application is described in detail below with reference to the drawings and in conjunction with the embodiments.
According to the embodiment of the present application, a kind of management method for flight movement node is provided.
As shown in Fig. 1, the method includes the following steps S102 to S108:
Step S102: acquire image data of preset flight movement nodes.

Preferably, the preset flight movement nodes may be: aircraft arrival at stand, wheel chocks placed, jet bridge docked, passenger door opened, cargo door opened, passengers disembarking, crew disembarking, crew boarding, passengers boarding, passenger door closed, cargo door closed, jet bridge retracted, wheel chocks removed, and aircraft pushback. The number of nodes can be adjusted, and new nodes added, according to the airport's needs.

Acquiring the image data of the preset flight movement nodes may consist of obtaining video and picture data through devices such as airport cameras.
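As an illustration of this acquisition step, the sketch below enumerates the camera streams that would be polled for each stand. The host name, stand identifiers and RTSP URL layout are hypothetical assumptions; the patent does not specify how the airport's video platform exposes its cameras.

```python
# Sketch: enumerating the camera streams to poll for preset flight
# movement nodes at each stand. All hosts, stand IDs and the URL layout
# are illustrative assumptions, not any real video-platform interface.

def stream_url(host: str, stand: str, channel: int) -> str:
    """Build a (hypothetical) RTSP URL for one camera channel at a stand."""
    return f"rtsp://{host}/stands/{stand}/ch{channel}"

def acquisition_plan(host: str, stands: list) -> dict:
    """Map every stand to the stream URLs that must be sampled for its nodes."""
    return {s: [stream_url(host, s, ch) for ch in (1, 2)] for s in stands}

plan = acquisition_plan("cam.example.airport", ["A12", "B03"])
```

A real deployment would feed each of these streams to the frame sampler of the image recognition server.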
Step S104: input the image data into the image recognition server.

Preferably, the image recognition server is used to recognize the flight movement nodes from the image data. Inputting the image data into the image recognition server may consist of uploading the video and picture data obtained by cameras and similar devices to a cloud image recognition server or a local image recognition server.
Step S106: configure the data interface between the image recognition server and the flight operations command center.

Preferably, the data interface may be an API data interface, a JSON data interface, etc.; the type of data interface does not limit the scope of protection here. Configuring the data interface may consist of connecting the image recognition server to the flight operations command center through the data interface.
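As an illustration only, a JSON message exchanged over such a data interface might take a shape like the following; every field name and value here is an assumption for the sake of example, not the actual interface specification of any flight operations command center.

```json
{
  "flight": "CA1234",
  "stand": "A12",
  "node": "passengers_boarding",
  "start_time": "2018-07-05T05:00:00+08:00",
  "end_time": "2018-07-05T05:15:00+08:00",
  "clip_url": "http://recognizer.example/clips/9921.mp4"
}
```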
Step S108: send the data of the flight movement nodes to the flight operations command center through the data interface.

Preferably, this may consist of the image recognition server transmitting the recognized flight movement nodes to the flight operations command center through the data interface.
As shown in Fig. 2, acquiring the image data of the preset flight movement nodes includes the following steps S202 to S206:
Step S202 obtains the access authority of the image collecting device on predeterminated position in airport;
Preferably, image collecting device can be the devices such as airport camera, sensor.
Obtain in airport that the access authority of image collecting device can be and be schemed by data-interface on predeterminated position
The image data or video data acquired as acquisition device.
Step S204: access and obtain, through the access rights, the image data collected by the image acquisition devices.

Preferably, this may consist of obtaining the image data or video data collected by the image acquisition devices through a data interface.
Step S206: divide the image data into video image data and picture image data.

Preferably, the server classifies the image data, which may be video image data or picture image data.
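A minimal sketch of this division step, assuming the acquired data arrive as files and that the video/picture distinction can be made by extension (a real system might instead inspect container headers or the source device type):

```python
# Sketch: dividing acquired files into video image data and picture image
# data by file extension. The extension sets are assumptions.
from pathlib import Path

PICTURE_EXTS = {".jpg", ".jpeg", ".png", ".bmp"}
VIDEO_EXTS = {".mp4", ".avi", ".ts", ".mkv"}

def divide_image_data(paths):
    """Return (video_files, picture_files); unknown extensions are ignored."""
    videos, pictures = [], []
    for p in map(Path, paths):
        ext = p.suffix.lower()
        if ext in VIDEO_EXTS:
            videos.append(str(p))
        elif ext in PICTURE_EXTS:
            pictures.append(str(p))
    return videos, pictures

videos, pictures = divide_image_data(["a12_ch1.mp4", "a12_door.JPG", "notes.txt"])
```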
As shown in Fig. 3, inputting the image data into the image recognition server includes the following steps S302 to S306:
Step S302: the image recognition server receives the marking data selected by the user.

Preferably, the marking data may be picture labels for picture image data or video image labels for video image data. Receiving the marking data selected by the user may consist of receiving the data types, files, labels, etc. that the user selects in the labelling system.
Step S304: the image recognition server obtains training data according to the preset image training level entered by the user.

Preferably, the preset image training level may be an identity the user selects according to their own skill level, for example beginner or expert.

Obtaining training data according to the preset image training level entered by the user may proceed as follows: the user selects an identity according to their own level (beginner or expert), and the relevant information passes through the database server. If the user selects the beginner identity, the system recommends a network for them; label engineering is added, labelling begins, and training starts. Alternatively, if the user selects the expert identity, the system recommends a network; label engineering is added, labelling begins, a network is selected, parameters are set, and training starts.
Step S306: the image recognition server generates, from the training data, an image recognition model for recognizing the expected recognition target.

Preferably, the expected recognition target may be picture image data to be recognized provided by the user, or video image data to be recognized provided by the user. Generating the image recognition model for the expected recognition target from the training data may proceed as follows: according to the selected network, the system recommends a deployment method; if the model needs optimization, another deployment method can be selected; the deployment is then completed.
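The level-dependent setup described in steps S304 and S306 can be sketched as follows; the model names and default parameters are illustrative assumptions rather than the patent's choices.

```python
# Sketch: a beginner gets a system-recommended network with fixed
# defaults, while an expert may pick the network and set parameters.
# Names and values here are illustrative assumptions.
def training_setup(level, network=None, params=None):
    if level == "beginner":
        # System recommends everything; the user only supplies labels.
        return {"network": "small_cnn", "params": {"epochs": 10, "lr": 1e-3}}
    if level == "expert":
        # The user selects the network and sets parameters before training.
        return {"network": network or "resnet_like", "params": params or {}}
    raise ValueError(f"unknown training level: {level}")

beginner = training_setup("beginner")
expert = training_setup("expert", network="deep_cnn", params={"epochs": 50})
```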
Specifically, in step S302 above, the marking data that the image recognition server receives from the user may be a data type, which may be picture image data or video image data; receiving the data type selected by the user may accordingly be receiving the picture image data or the video image data the user selects. If the data type is picture image data, the server receives the picture labels that the user selects for the objects in the picture image data; for example, a picture label may be "boarding". The server then determines the position mark the user gives the picture label. Preferably, selectable position tags are provided in the system, for example upper right, upper left, middle, lower right and lower left of the picture; the position mark of the picture label may be determined from the position tag the user selects.
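One possible shape for such a picture marking record, with the field names and tag set as illustrative assumptions:

```python
# Sketch: one marking-data record for a picture -- an object label plus
# the position tag the user picked from the fixed set described above.
POSITION_TAGS = {"upper_left", "upper_right", "middle", "lower_left", "lower_right"}

def picture_label(image_file, label, position):
    if position not in POSITION_TAGS:
        raise ValueError(f"unsupported position tag: {position}")
    return {"type": "picture", "file": image_file, "label": label, "position": position}

mark = picture_label("a12_door.jpg", "boarding", "upper_right")
```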
Specifically, in step S302 above, the marking data that the image recognition server receives from the user may again be a data type selected by the user; preferably, the data type may be video image data or picture image data. If the data type is video image data, the server receives the video image labels that the user selects for the action frames in the video image data. Preferably, a video image label may mark an action that occurs between action frames, for example the action of passengers boarding occurring between frame 5 and frame 15. Receiving the user's video image labels for the action frames in the video image data may include receiving the user's selection of multiple video image labels in the video image data. The server then determines the frame-segment action length of the video image label. Preferably, the frame-segment action length may be the duration over which a given action occurs; for example, determining the frame-segment action length of the video image label may be determining that the boarding action occurs between frame 5 and frame 15, or determining any other action occurring in the labelled video.
Specifically, in step S304 above, obtaining training data according to the preset image training level entered by the user may include obtaining the processing identity of a first user. Preferably, the first user enters the training system, logs in or registers and provides user information, with the relevant information passing through the database server; the user selects an identity according to their own level (beginner or expert), again through the database server; whichever identity the user selects, the system recommends a network for them. A network model is recommended to the first user according to the first user's processing identity; preferably, the network model is recommended according to the identity label the first user selects. For example, if the first user selects the beginner identity, a simpler network model is recommended; if the first user selects the expert identity, a more complex network model is recommended. The first marking data selected by the first user is then imported; for example, the first marking data may be picture labels for picture image data or video image labels for video image data, that is, the data types, files and labels the first user selected in the labelling system. Finally, the deployment method of the network model is determined from the network model and the first marking data; preferably, this may consist of feeding the imported first marking data into the recommended network model.
Specifically, in step S304 above, obtaining training data according to the preset image training level entered by the user may instead include obtaining the processing identity of a second user. Preferably, the second user enters the training system, logs in or registers and provides user information, with the relevant information passing through the database server; the user selects an identity according to their own level (beginner or expert), and the system recommends a network accordingly. A training data interface is opened to the second user according to the second user's processing identity; preferably, this may consist of providing the second user with an interface for starting model training. The second marking data selected by the second user is triggered by the second user's marking operations; preferably, the data labels the second user has selected trigger the second marking data, and the corresponding data labels are called. The selected network model and training parameters are entered through the training data interface, and the deployment method of the network model is determined from the network model, the training parameters and the second marking data. Preferably, the training and deployment method of the network model are determined from the recommended network model, the selected training parameters and the provided marking data. For example, the deployment method may fix how many layers the model has, or select deployment options such as the Bridge driver or the Overlay driver.
Specifically, in step S306 above, generating the image recognition model for recognizing the expected recognition target from the training data may include receiving the user's login information. Preferably, the login information may be an account name and password; from the account, password and similar information entered by the user, the user's membership and processing history are determined. After the user logs in, the expected recognition target the user has set is determined and the marking data is imported. Preferably, the expected recognition target may be a picture recognition target or a video image recognition target, and importing the marking data may mean importing the marking data the user selects or the user's historical marking data. The user's data generation operation instruction is then received. Preferably, a generate control button is provided; after the user has logged in, selected the expected recognition target and imported the marking data, clicking the control sends the operation instruction to the back end. According to the data generation operation instruction, the image recognition model is trained and generated from the expected recognition target and the marking data. Preferably, the back end generates the image recognition model from the information collected in the preceding steps and, by performing recognition, provides the user with a recognition result.
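The generate flow above can be sketched as a simple precondition check; the session fields and messages are illustrative assumptions:

```python
# Sketch: the back end only accepts the data-generation instruction once
# the user has logged in, set an expected recognition target and imported
# marking data. All names are illustrative assumptions.
def can_generate(session):
    required = ("logged_in", "expected_target", "marking_data")
    return all(session.get(k) for k in required)

def handle_generate_click(session):
    if not can_generate(session):
        return "rejected: log in, set a target and import marking data first"
    # In a real system this would queue back-end training and return a job id.
    return f"training model for target {session['expected_target']!r}"

session = {"logged_in": True, "expected_target": "passengers_boarding",
           "marking_data": ["a12_ch1.mp4#5-15"]}
result = handle_generate_click(session)
```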
As shown in Fig. 4, configuring the data interface between the image recognition server and the flight operations command center includes the following steps S402 to S406:
Step S402 establishes linking for described image identification server and interface applications server;
Step S404 configures the data format of processing result in image recognition server;
Preferably, the data of the processing result of described image identification server are arranged.
Such as the data that send over of server-side are obtained by ajax, according to the data that the data acquisition received needs,
It is object JSON.parse (d.template) by JSON format conversion, according to object acquisition data.
Step S406: configure the data interface in the interface application server according to the preset interface specification of the flight operations command center.
As shown in Fig. 5, sending the data of the flight movement nodes to the flight operations command center through the data interface includes any one or more of the following:

sending, through the data interface, the time at which a flight movement node occurs;

Preferably, the flight action time may be the specific time of occurrence of each flight movement node. For example, boarding takes place from 5:00 to 5:15.
sending, through the data interface, the content-preference flight action records configured for the flight movement nodes;

Preferably, a content-preference flight action record may be the set of flight movement nodes that the image recognition server is configured to record. For example, one or more of: aircraft arrival at stand, wheel chocks placed, jet bridge docked, passenger door opened, cargo door opened, passengers disembarking, crew disembarking, crew boarding, passengers boarding, passenger door closed, cargo door closed, jet bridge retracted, wheel chocks removed, and aircraft pushback.
sending, through the data interface, the visual content of a flight movement node as it occurs.

Preferably, the flight action visual content may be the picture image data of the flight movement node captured by the camera, or the action frame segment in the video image data. For example, a video clip of passengers boarding.
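The three kinds of messages listed above can be sketched as follows; the field names and the clip URL are illustrative assumptions, not the ops-center interface specification:

```python
# Sketch: building the three kinds of messages for the flight operations
# command center -- a node time, a content-preference record and a
# reference to visual content. All field names are assumptions.
def node_time_message(flight, node, start, end):
    return {"kind": "node_time", "flight": flight, "node": node,
            "start": start, "end": end}

def preference_record_message(flight, nodes):
    return {"kind": "preference_record", "flight": flight, "nodes": list(nodes)}

def visual_content_message(flight, node, clip_url):
    return {"kind": "visual_content", "flight": flight, "node": node,
            "clip": clip_url}

msgs = [
    node_time_message("CA1234", "passengers_boarding", "05:00", "05:15"),
    preference_record_message("CA1234", ["jet_bridge_docked", "passengers_boarding"]),
    visual_content_message("CA1234", "passengers_boarding",
                           "http://recognizer.example/clips/9921.mp4"),
]
```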
As shown in Fig. 6, after sending the data of the flight movement nodes to the flight operations command center through the data interface, the method further includes the following steps S602 to S604:

Step S602: connect the flight movement node data, via a switch, to the management application program on a terminal.

Preferably, the switch may be an Ethernet switch, a fibre-optic switch, etc. The management application program on the terminal may be the management software used by the flight operations command center.
Step S604: outputting a visualization of the flight action node through the management application program.
Preferably, outputting the visualization of the flight action node through the management application program may be outputting visualization results such as image data or video data of the flight action node.
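Step S604 amounts to rendering a received record for display. A toy sketch under assumed record keys; the patent does not specify the display format.

```python
# Minimal sketch of step S604: a hypothetical management application turning a
# received flight-action record into a human-readable visualization line.
def visualize(record):
    """Format one flight-action record for display on the terminal."""
    window = f'{record["start"]}-{record["end"]}'
    clip = record.get("clip", "no clip")  # visual content may be absent
    return f'[{window}] {record["node"]} ({clip})'

line = visualize({"node": "boarding", "start": "5:00", "end": "5:15",
                  "clip": "boarding.mp4"})
```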
It can be seen from the above description that the present application achieves the following technical effects: the application aims to revitalize existing video surveillance resources, putting the existing discrete, passive, static video surveillance resources to use in actual operational assurance, daily service, and security management processes. Cameras become sensors that dynamically record key nodes and capture abnormal information in real time, and a management application platform is built on this basis; the results are fed back into the airport's existing application systems, so as to improve flight operational efficiency, raise the level of safety management, and enhance professional service quality, laying a foundation for smart-airport construction and future airport planning.
It should be noted that the steps shown in the flowcharts of the accompanying drawings may be executed in a computer system such as a set of computer-executable instructions, and, although a logical order is shown in the flowcharts, in some cases the steps shown or described may be executed in an order different from that herein.
According to an embodiment of the present application, there is further provided a management device for implementing the above management method for flight action nodes. As shown in Fig. 7, the device includes:
an acquisition module 10, configured to acquire image data of preset flight action nodes;
an input module 20, configured to input the image data into an image recognition server;
a configuration module 30, configured to configure a data interface between the image recognition server and a flight operations command center; and
a sending module 40, configured to send data of the flight action node to the flight operations command center according to the data interface.
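The four modules form a pipeline: acquire, recognize, configure the interface, send. A minimal sketch of that chaining, where every class and method name is an assumption made for illustration, not the patent's implementation.

```python
# Sketch of how the four modules in Fig. 7 might be chained together.
class ManagementDevice:
    def __init__(self, acquire, recognize, configure_interface, send):
        self.acquire = acquire                  # acquisition module 10
        self.recognize = recognize              # input module 20
        self.interface = configure_interface()  # configuration module 30
        self.send = send                        # sending module 40

    def run(self, node):
        """Acquire images for one node, recognize them, and send the result."""
        images = self.acquire(node)
        result = self.recognize(images)
        return self.send(self.interface, result)

# Wire the pipeline with stand-in callables to show the data flow.
sent = []
device = ManagementDevice(
    acquire=lambda node: [f"frame-of-{node}"],
    recognize=lambda imgs: {"node": "boarding", "frames": len(imgs)},
    configure_interface=lambda: {"endpoint": "ops-center", "format": "json"},
    send=lambda iface, data: sent.append((iface["endpoint"], data)) or True,
)
ok = device.run("boarding")
```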
As shown in Fig. 8, the acquisition module 10 includes:
an acquiring unit 101, configured to obtain access rights to image acquisition devices at preset positions in the airport;
an access unit 102, configured to access, through the access rights, the image data acquired by the image acquisition devices; and
a division unit 103, configured to divide the image data into video image data and picture image data.
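The division unit's split into video image data and picture image data could be as simple as sorting files by type. A sketch under the assumption that the split is done by file extension (the extension sets are illustrative, not from the patent).

```python
# Sketch of division unit 103: splitting acquired files into video image data
# and picture image data by file extension (extension sets are assumptions).
VIDEO_EXTS = {".mp4", ".avi", ".ts"}
PICTURE_EXTS = {".jpg", ".jpeg", ".png", ".bmp"}

def divide_image_data(paths):
    """Partition acquired file paths into (video, picture) lists."""
    video, pictures = [], []
    for p in paths:
        ext = "." + p.rsplit(".", 1)[-1].lower() if "." in p else ""
        if ext in VIDEO_EXTS:
            video.append(p)
        elif ext in PICTURE_EXTS:
            pictures.append(p)
    return video, pictures

video, pictures = divide_image_data(["gate12.mp4", "stand7.jpg", "notes.txt"])
```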
As shown in Fig. 9, the input module 20 includes:
a receiving unit 201, configured for the image recognition server to receive flag data selected by a user;
a training data unit 202, configured for the image recognition server to obtain training data according to a preset image training degree input by the user; and
a generation unit 203, configured for the image recognition server to generate, according to the training data, an image recognition model for recognizing an expected recognition target.
As shown in Fig. 10, the configuration module 30 includes:
an establishing unit 301, configured to establish a link between the image recognition server and an interface application server;
a configuration unit 302, configured to configure the data format of processing results in the image recognition server; and
a docking unit 303, configured to configure the data interface in the interface application server according to a preset flight operations command center interface specification.
As shown in Fig. 11, the workflow of the management device for flight action nodes is as follows: video and image data are obtained through airport cameras; the obtained data is fed into a video analytics server equipped with intelligent flight-action-node management application software, where data analysis and processing are performed; the results of the analysis and processing are transmitted via an interface to the airport operations command center application docking server, the interface coordination following the operations command center interface technical specification; the software has a report generation function covering the time at which each action node occurs and other statistics and results of interest to the airport, and visualization can easily be realized on various devices.
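The report-generation function mentioned above might, at its simplest, tally per-node occurrence counts and the earliest and latest recorded times. A sketch with an assumed record layout.

```python
from collections import defaultdict

# Sketch of a report generator: per action node, count occurrences and find
# the earliest/latest recorded times. The record layout is assumed.
def generate_report(records):
    buckets = defaultdict(list)
    for rec in records:
        buckets[rec["node"]].append(rec["time"])
    return {node: {"count": len(times),
                   "first": min(times),
                   "last": max(times)}
            for node, times in buckets.items()}

report = generate_report([
    {"node": "boarding", "time": "05:00"},
    {"node": "boarding", "time": "05:15"},
    {"node": "pushback", "time": "05:40"},
])
```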
The present invention mainly has the following technical effects: 1) the software runs offline, providing clients with stronger security assurance; 2) careful modeling replaces the client's manual processing of massive data; 3) leading algorithms provide more application possibilities; 4) a report generation function makes it easy to visualize results.
Obviously, those skilled in the art should understand that the modules or steps of the present application described above may be implemented with a general-purpose computing device; they may be concentrated on a single computing device or distributed over a network formed by multiple computing devices; optionally, they may be implemented with program code executable by a computing device, so that they may be stored in a storage device and executed by the computing device; alternatively, they may be fabricated into individual integrated circuit modules, or multiple modules or steps among them may be fabricated into a single integrated circuit module. Thus, the present application is not limited to any specific combination of hardware and software.
The foregoing are merely preferred embodiments of the present application and are not intended to limit the application; for those skilled in the art, various modifications and changes to the application are possible. Any modification, equivalent replacement, improvement, and the like made within the spirit and principles of the present application shall be included within the scope of protection of the present application.
Claims (10)
1. A management method for flight action nodes, characterized by comprising:
acquiring image data of preset flight action nodes;
inputting the image data into an image recognition server;
configuring a data interface between the image recognition server and a flight operations command center; and
sending data of the flight action node to the flight operations command center according to the data interface.
2. The management method according to claim 1, characterized in that acquiring image data of preset flight action nodes comprises:
obtaining access rights to image acquisition devices at preset positions in the airport;
accessing, through the access rights, the image data acquired by the image acquisition devices; and
dividing the image data into video image data and picture image data.
3. The management method according to claim 1, characterized in that inputting the image data into an image recognition server comprises:
the image recognition server receiving flag data selected by a user;
the image recognition server obtaining training data according to a preset image training degree input by the user; and
the image recognition server generating, according to the training data, an image recognition model for recognizing an expected recognition target.
4. The management method according to claim 1, characterized in that configuring a data interface between the image recognition server and a flight operations command center comprises:
establishing a link between the image recognition server and an interface application server;
configuring the data format of processing results in the image recognition server; and
configuring the data interface in the interface application server according to a preset flight operations command center interface specification.
5. The management method according to claim 1, characterized in that sending data of the flight action node to the flight operations command center according to the data interface comprises any one or more of the following:
sending, to the flight operations command center according to the data interface, the flight action time at which the flight action node occurs;
sending, to the flight operations command center according to the data interface, a content-preference flight action record configured for the flight action node;
sending, to the flight operations command center according to the data interface, flight action visual content of the flight action node.
6. The management method according to claim 1, characterized in that, after sending the data of the flight action node to the flight operations command center according to the data interface, the method further comprises:
accessing, through a switch, the data of the flight action node into a management application program on a terminal; and
outputting a visualization of the flight action node through the management application program.
7. A management device for flight action nodes, characterized by comprising:
an acquisition module, configured to acquire image data of preset flight action nodes;
an input module, configured to input the image data into an image recognition server;
a configuration module, configured to configure a data interface between the image recognition server and a flight operations command center; and
a sending module, configured to send data of the flight action node to the flight operations command center according to the data interface.
8. The management device according to claim 7, characterized in that the acquisition module comprises:
an acquiring unit, configured to obtain access rights to image acquisition devices at preset positions in the airport;
an access unit, configured to access, through the access rights, the image data acquired by the image acquisition devices; and
a division unit, configured to divide the image data into video image data and picture image data.
9. The management device according to claim 7, characterized in that the input module comprises:
a receiving unit, configured for the image recognition server to receive flag data selected by a user;
a training data unit, configured for the image recognition server to obtain training data according to a preset image training degree input by the user; and
a generation unit, configured for the image recognition server to generate, according to the training data, an image recognition model for recognizing an expected recognition target.
10. The management device according to claim 7, characterized in that the configuration module comprises:
an establishing unit, configured to establish a link between the image recognition server and an interface application server;
a configuration unit, configured to configure the data format of processing results in the image recognition server; and
a docking unit, configured to configure the data interface in the interface application server according to a preset flight operations command center interface specification.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810732611.3A CN109063576B (en) | 2018-07-05 | 2018-07-05 | Management method and device for flight action nodes |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109063576A true CN109063576A (en) | 2018-12-21 |
CN109063576B CN109063576B (en) | 2021-12-17 |
Family
ID=64819492
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810732611.3A Active CN109063576B (en) | 2018-07-05 | 2018-07-05 | Management method and device for flight action nodes |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109063576B (en) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103426331A (en) * | 2013-08-22 | 2013-12-04 | 南京莱斯信息技术股份有限公司 | Multi-airport collaborative delivery system flight sorting and decision making method |
CN103679341A (en) * | 2013-11-18 | 2014-03-26 | 南京航空航天大学 | Flight ground operation support efficiency evaluation method |
CN106067028A (en) * | 2015-04-19 | 2016-11-02 | 北京典赞科技有限公司 | The modeling method of automatic machinery based on GPU study |
Non-Patent Citations (2)
Title |
---|
郑洪峰: "Thoughts on Innovation Methods in China's Civil Aviation and Airport Technology Innovation Practice", 《民航管理》 (Civil Aviation Management) *
黄鑑: "Exploration and Reflections on the Construction of A-CDM Systems", 《民航管理》 (Civil Aviation Management) *
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109887343A (en) * | 2019-04-04 | 2019-06-14 | 中国民航科学技术研究院 | It takes to a kind of flight and ensures node automatic collection monitoring system and method |
CN110097659A (en) * | 2019-05-16 | 2019-08-06 | 深圳市捷赛机电有限公司 | Catch, the time recording method for removing catch and Related product on a kind of aircraft |
CN110379209A (en) * | 2019-07-22 | 2019-10-25 | 捻果科技(深圳)有限公司 | A kind of flight work flow node specification monitoring alarm method |
CN110379209B (en) * | 2019-07-22 | 2021-11-09 | 捻果科技(深圳)有限公司 | Flight operation flow node specification monitoring and alarming method |
CN113096108A (en) * | 2021-04-21 | 2021-07-09 | 安徽声讯信息技术有限公司 | Accurate docking method for interface test |
CN115826464A (en) * | 2022-11-29 | 2023-03-21 | 航科院中宇(北京)新技术发展有限公司 | Remote machine position node acquisition system and acquisition method thereof |
CN115826464B (en) * | 2022-11-29 | 2024-03-22 | 航科院中宇(北京)新技术发展有限公司 | Acquisition method of remote site node acquisition system |
Also Published As
Publication number | Publication date |
---|---|
CN109063576B (en) | 2021-12-17 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||