CN108234195A - Method, apparatus, device, medium, and program for predicting network performance - Google Patents
Method, apparatus, device, medium, and program for predicting network performance
- Publication number
- CN108234195A (application CN201711306429.3A)
- Authority
- CN
- China
- Prior art keywords
- network
- layer
- structural
- parameter
- network layer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/04—Network management architectures or arrangements
- H04L41/044—Network management architectures or arrangements comprising hierarchical management structures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/14—Network analysis or design
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L41/00—Arrangements for maintenance, administration or management of data switching networks, e.g. of packet switching networks
- H04L41/14—Network analysis or design
- H04L41/147—Network analysis or design for predicting network behaviour
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L43/00—Arrangements for monitoring or testing data switching networks
- H04L43/08—Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters
- H04L43/0805—Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters by checking availability
- H04L43/0817—Monitoring or testing based on specific metrics, e.g. QoS, energy consumption or environmental parameters by checking availability by checking functioning
Landscapes
- Engineering & Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Environmental & Geological Engineering (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Data Exchanges In Wide-Area Networks (AREA)
Abstract
Embodiments of the present disclosure provide a method, apparatus, device, medium, and program for predicting network performance. The method includes: obtaining network parameters of a structural network to be predicted; determining a structure feature of the structural network based on the network parameters; and determining a network performance parameter of the structural network based on the structure feature. The method provided by the above embodiments of the disclosure predicts the performance of a structural network from its network parameters without training the structural network, which saves substantial time and improves the efficiency of network performance prediction.
Description
Technical field
The present disclosure relates to deep learning, and in particular to a method, apparatus, device, medium, and program for predicting network performance.
Background
Neural networks are the core technology of modern image recognition systems. A neural network is an end-to-end feature learner: through the neural network system, a feature representation for a particular task can be learned, and a classifier produces the final result. A neural network typically requires a period of training to reach satisfactory performance.
In recent years, each advance in neural networks has been driven by improvements in network structure. As neural networks grow increasingly complex, however, human ability to design complex network structures has hit a bottleneck. Various algorithms for automatically generating neural network structures have therefore been proposed, and these algorithms have been shown to generate structurally complex neural networks.
Summary
Embodiments of the present disclosure provide a technique for predicting network performance.
According to one aspect of the embodiments of the present disclosure, a method of predicting network performance is provided, implemented using a prediction network, including:
obtaining network parameters of a structural network to be predicted;
determining a structure feature of the structural network based on the network parameters of the structural network;
determining a network performance parameter of the structural network based on the structure feature.
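The three steps above can be sketched as follows. This is a minimal illustrative sketch only: the function names, the toy fixed-dimension feature, and the linear scorer are assumptions for exposition, not the patent's actual prediction network.

```python
# Illustrative sketch of the three-step prediction pipeline.

def get_network_params(structural_net):
    # Step 1: collect per-layer parameters (type, kernel size, channel ratio).
    return [(layer["type"], layer["kernel"], layer["out_ch"] / layer["in_ch"])
            for layer in structural_net]

def structure_feature(params):
    # Step 2: turn the variable-length list of layer parameters into a
    # fixed-dimension feature (here: naive counts and sums).
    types = {"conv": 0, "pool": 1}
    feat = [0.0, 0.0, 0.0, 0.0]
    for t, k, ratio in params:
        feat[types[t]] += 1.0       # layer-type counts
        feat[2] += k                # total kernel size
        feat[3] += ratio            # total channel expansion
    return feat

def predict_performance(feat, weights, bias=0.0):
    # Step 3: map the structure feature to a scalar performance estimate.
    return bias + sum(w * f for w, f in zip(weights, feat))

net = [{"type": "conv", "kernel": 3, "in_ch": 3, "out_ch": 16},
       {"type": "pool", "kernel": 2, "in_ch": 16, "out_ch": 16}]
feat = structure_feature(get_network_params(net))
score = predict_performance(feat, weights=[0.1, 0.05, 0.01, 0.02])
```

The point of the sketch is the data flow: no training of the structural network itself occurs anywhere in the three steps.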
In another embodiment based on the above method of the present invention, the network parameters include at least one of: the computation type of each network layer in at least one network layer of the structural network, the length of the layer's computation kernel, the width of the layer's computation kernel, the number of channels of the layer, and the ratio of the layer's number of output channels to its number of input channels.
In another embodiment based on the above method of the present invention, determining the structure feature of the structural network based on the network parameters of the structural network includes:
determining, based on the network parameters of the structural network, representation information of each network layer in at least one network layer of the structural network;
determining the structure feature of the structural network based on the representation information of each of the at least one network layer.
In another embodiment based on the above method of the present invention, the representation information includes a structure vector having a preset dimension.
In another embodiment based on the above method of the present invention, determining the representation information of each network layer in at least one network layer of the structural network based on the network parameters includes:
determining at least one identifier of each network layer based on the layer parameters of each of the at least one network layer of the structural network;
determining the representation information of each network layer based on the at least one identifier of each of the at least one network layer.
In another embodiment based on the above method of the present invention, determining the representation information of each network layer based on the at least one identifier of each of the at least one network layer includes:
mapping each identifier in the at least one identifier of each network layer to obtain a mapping result of each identifier;
obtaining the representation information of each network layer based on the mapping result of each of the at least one identifier.
In another embodiment based on the above method of the present invention, mapping each identifier in the at least one identifier of each network layer to obtain the mapping result of each identifier includes:
obtaining the mapping result of each identifier by looking up each identifier of each network layer in a first preset table.
In another embodiment based on the above method of the present invention, determining the structure feature of the structural network based on the representation information of each of the at least one network layer includes:
merging the representation information of each of the at least one network layer to obtain the structure feature of the structural network.
In another embodiment based on the above method of the present invention, merging the representation information of each of the at least one network layer to obtain the structure feature of the structural network includes:
merging the representation information of each of the at least one network layer using a neural network to obtain the structure feature of the structural network.
In another embodiment based on the above method of the present invention, merging the representation information of each of the at least one network layer using a neural network includes:
inputting the representation information of each of the at least one network layer into the neural network, which merges the at least one piece of representation information into a single structure feature by a recursive algorithm.
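The recursive merge can be sketched as an RNN-style fold over the per-layer vectors: a hidden state is updated once per layer, and the final state serves as the structure feature. The update rule and the fixed weights below are generic assumptions, not the patent's trained network:

```python
import math

def rnn_merge(layer_reps, dim):
    # Fixed illustrative weights; a real predictor would learn these.
    w_h, w_x = 0.5, 0.5
    h = [0.0] * dim                       # initial hidden state
    for x in layer_reps:                  # one recursive step per layer
        h = [math.tanh(w_h * hi + w_x * xi) for hi, xi in zip(h, x)]
    return h                              # final state = structure feature

feature = rnn_merge([[1.0, 0.0], [0.0, 1.0]], dim=2)
```

Because the fold consumes one vector per layer, networks of any depth collapse to a feature of the same preset dimension, which is what lets later stages treat all structural networks uniformly.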
In another embodiment based on the above method of the present invention, before determining the network performance parameter of the structural network based on the structure feature, the method further includes:
obtaining a time vector based on a preset time point corresponding to the structural network;
and determining the network performance parameter of the structural network based on the structure feature includes:
obtaining the network performance parameter of the structural network at the preset time point based on the structure feature and the time vector.
In another embodiment based on the above method of the present invention, obtaining the network performance parameter of the structural network at the preset time point based on the structure feature and the time vector includes:
merging the structure feature and the time vector to obtain a merged feature;
obtaining, using a multilayer perceptron, the network performance parameter of the structural network corresponding to the time vector based on the merged feature.
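A sketch of this merge-and-predict step, assuming concatenation as the merge and a tiny hand-weighted multilayer perceptron; the weights and sizes are illustrative only:

```python
def mlp_predict(structure_feat, time_vec):
    merged = structure_feat + time_vec    # merge by concatenation
    # One hidden layer with ReLU, then a scalar output (e.g. accuracy).
    w1 = [[0.1] * len(merged), [-0.1] * len(merged)]
    hidden = [max(0.0, sum(w * m for w, m in zip(row, merged))) for row in w1]
    w2 = [0.5, 0.5]
    return sum(w * h for w, h in zip(w2, hidden))

acc = mlp_predict([0.2, 0.4], [1.0])   # structure feature + time vector
```

Conditioning on the time vector is what lets one structure feature yield different performance estimates at different amounts of training.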
In another embodiment based on the above method of the present invention, obtaining the time vector based on the preset time point corresponding to the structural network includes:
obtaining the time vector by looking up the preset time point corresponding to the structural network in a second preset table.
In another embodiment based on the above method of the present invention, determining the network performance parameter of the structural network based on the structure feature includes:
determining the network performance parameter of the structural network based on the structure feature before the structural network is trained.
In another embodiment based on the above method of the present invention, before obtaining the network parameters of the structural network to be predicted, the method further includes:
training the prediction network using a plurality of sample structural networks, each sample structural network being labeled with its network performance parameter at each of at least one time point, where a time point corresponds to a number of training iterations of the sample structural network.
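Such supervised training can be sketched as fitting the predictor to (feature, time point, measured performance) triples. The toy least-squares update, sample data, and learning rate below are invented stand-ins for whatever optimizer and data the prediction network actually uses:

```python
def train_predictor(samples, epochs=200, lr=0.05):
    # samples: list of (feature_vector, time_point, measured_performance)
    w = [0.0, 0.0, 0.0]                   # two feature dims + one time dim
    for _ in range(epochs):
        for feat, t, target in samples:
            x = feat + [t]
            pred = sum(wi * xi for wi, xi in zip(w, x))
            err = pred - target
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
    return w

# Two sample networks, each labeled with measured accuracy at a time point.
samples = [([1.0, 0.0], 0.5, 0.6), ([0.0, 1.0], 1.0, 0.9)]
w = train_predictor(samples)
```

Once trained, the predictor estimates the accuracy of an unseen structural network at a given time point without ever training that network.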
In another embodiment based on the above method of the present invention, before training the prediction network using the plurality of sample structural networks, the method further includes:
sampling network layers from a plurality of preset structure networks to generate a network block, the network block including at least one network layer;
building the sample structural network based on the network block.
In another embodiment based on the above method of the present invention, sampling network layers from the plurality of preset structure networks to generate the network block includes:
sampling the plurality of preset structure networks based on a Markov chain to obtain at least one preset network layer;
connecting the at least one preset network layer in sequence to form the network block.
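The Markov-chain sampling can be sketched as drawing each layer type from a transition distribution conditioned on the previously sampled layer, then chaining the draws into a block. The transition table below is invented for illustration and is not the patent's actual chain:

```python
import random

# Invented transition distribution: next-layer type given current type.
TRANSITIONS = {
    "start":   [("conv", 1.0)],
    "conv":    [("conv", 0.6), ("maxpool", 0.4)],
    "maxpool": [("conv", 1.0)],
}

def sample_block(num_layers, seed=0):
    rng = random.Random(seed)
    state, block = "start", []
    for _ in range(num_layers):
        choices, weights = zip(*TRANSITIONS[state])
        state = rng.choices(choices, weights=weights)[0]
        block.append(state)           # connect sampled layers in sequence
    return block

block = sample_block(4)
```

Conditioning each draw on the previous layer keeps sampled blocks plausible, e.g. this table never places two pooling layers back to back.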
In another embodiment based on the above method of the present invention, the network layer includes any one or more of:
a convolutional layer, a max pooling layer, an average pooling layer, an activation layer, and a batch normalization layer.
In another embodiment based on the above method of the present invention, building the sample structural network based on the network block includes:
connecting a plurality of network blocks in sequence to obtain the sample structural network, where a first network block and a second network block among the plurality of network blocks correspond to different feature dimensions.
In another embodiment based on the above method of the present invention, a max pooling layer is connected between the first network block and the second network block.
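Chaining blocks with an interposed max pooling layer can be sketched as below; the list-of-strings network encoding is purely illustrative:

```python
def build_sample_network(blocks):
    net = []
    for i, block in enumerate(blocks):
        if i > 0:
            net.append("maxpool")     # downsample between consecutive blocks
        net.extend(block)
    return net

net = build_sample_network([["conv", "conv"], ["conv"]])
```

The pooling layer between blocks halves the spatial resolution, which is why consecutive blocks operate at different feature dimensions.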
In another embodiment based on the above method of the present invention, sampling the plurality of preset structure networks based on the Markov chain to obtain the at least one preset network layer includes:
sampling the plurality of preset structure networks based on the network parameters of the sampled i-th network layer to obtain the (i+1)-th network layer, where i is greater than or equal to 1 and less than the number of network layers included in the network block.
According to another aspect of the embodiments of the present disclosure, an apparatus for predicting network performance is provided, implemented using a prediction network, including:
a parameter obtaining unit configured to obtain network parameters of a structural network to be predicted;
a structure feature unit configured to determine a structure feature of the structural network based on the network parameters of the structural network;
a performance determination unit configured to determine a network performance parameter of the structural network based on the structure feature.
In another embodiment based on the above apparatus of the present invention, the network parameters include at least one of: the computation type of each network layer in at least one network layer of the structural network, the length of the layer's computation kernel, the width of the layer's computation kernel, the number of channels of the layer, and the ratio of the layer's number of output channels to its number of input channels.
In another embodiment based on the above apparatus of the present invention, the structure feature unit includes:
an information representation module configured to determine, based on the network parameters of the structural network, representation information of each network layer in at least one network layer of the structural network;
a feature determination module configured to determine the structure feature of the structural network based on the representation information of each of the at least one network layer.
In another embodiment based on the above apparatus of the present invention, the representation information includes a structure vector having a preset dimension.
In another embodiment based on the above apparatus of the present invention, the information representation module includes:
a symbol recognition module configured to determine at least one identifier of each network layer based on the layer parameters of each of the at least one network layer of the structural network;
an information module configured to determine the representation information of each network layer based on the at least one identifier of each of the at least one network layer.
In another embodiment based on the above apparatus of the present invention, the symbol recognition module is specifically configured to map each identifier in the at least one identifier of each network layer to obtain a mapping result of each identifier;
the information module is specifically configured to obtain the representation information of each network layer based on the mapping result of each of the at least one identifier.
In another embodiment based on the above apparatus of the present invention, the symbol recognition module is further configured to obtain the mapping result of each identifier by looking up each identifier of each network layer in a first preset table.
In another embodiment based on the above apparatus of the present invention, the feature determination module is specifically configured to merge the representation information of each of the at least one network layer to obtain the structure feature of the structural network.
In another embodiment based on the above apparatus of the present invention, the feature determination module is specifically configured to merge the representation information of each of the at least one network layer using a neural network to obtain the structure feature of the structural network.
In another embodiment based on the above apparatus of the present invention, the feature determination module is specifically configured to input the representation information of each of the at least one network layer into the neural network, which merges the at least one piece of representation information into a single structure feature by a recursive algorithm.
In another embodiment based on the above apparatus of the present invention, the apparatus further includes:
a time obtaining unit configured to obtain a time vector based on a preset time point corresponding to the structural network;
and the performance determination unit is specifically configured to obtain the network performance parameter of the structural network at the preset time point based on the structure feature and the time vector.
In another embodiment based on the above apparatus of the present invention, the performance determination unit includes:
a merging module configured to merge the structure feature and the time vector to obtain a merged feature;
a time performance module configured to obtain, using a multilayer perceptron, the network performance parameter of the structural network corresponding to the time vector based on the merged feature.
In another embodiment based on the above apparatus of the present invention, the time obtaining unit is specifically configured to obtain the time vector by looking up the preset time point corresponding to the structural network in a second preset table.
In another embodiment based on the above apparatus of the present invention, the performance determination unit is specifically configured to determine the network performance parameter of the structural network based on the structure feature before the structural network is trained.
In another embodiment based on the above apparatus of the present invention, the apparatus further includes:
a network training unit configured to train the prediction network using a plurality of sample structural networks, each sample structural network being labeled with its network performance parameter at each of at least one time point, where a time point corresponds to a number of training iterations of the sample structural network.
In another embodiment based on the above apparatus of the present invention, the apparatus further includes:
a network block unit configured to sample network layers from a plurality of preset structure networks to generate a network block, the network block including at least one network layer;
a network building unit configured to build the sample structural network based on the network block.
In another embodiment based on the above apparatus of the present invention, the network block unit includes:
a layer sampling module configured to sample the plurality of preset structure networks based on a Markov chain to obtain at least one preset network layer;
a block forming module configured to connect the at least one preset network layer in sequence to form the network block.
In another embodiment based on the above apparatus of the present invention, the network layer includes any one or more of:
a convolutional layer, a max pooling layer, an average pooling layer, an activation layer, and a batch normalization layer.
In another embodiment based on the above apparatus of the present invention, the network building unit is specifically configured to connect a plurality of network blocks in sequence to obtain the sample structural network, where a first network block and a second network block among the plurality of network blocks correspond to different feature dimensions.
In another embodiment based on the above apparatus of the present invention, a max pooling layer is connected between the first network block and the second network block.
In another embodiment based on the above apparatus of the present invention, the layer sampling module is specifically configured to sample the plurality of preset structure networks based on the network parameters of the sampled i-th network layer to obtain the (i+1)-th network layer, where i is greater than or equal to 1 and less than the number of network layers included in the network block.
According to another aspect of the embodiments of the present disclosure, an electronic device is provided, including a processor, where the processor includes the apparatus for predicting network performance as described above.
According to another aspect of the embodiments of the present disclosure, an electronic device is provided, including: a memory configured to store executable instructions;
and a processor configured to communicate with the memory to execute the executable instructions so as to perform the operations of the method of predicting network performance as described above.
According to another aspect of the embodiments of the present disclosure, a computer storage medium is provided, configured to store computer-readable instructions which, when executed, perform the operations of the method of predicting network performance as described above.
According to another aspect of the embodiments of the present disclosure, a computer program is provided, including computer-readable code which, when run on a device, causes a processor in the device to execute instructions for implementing the steps of the method of predicting network performance as described above.
Based on the method, apparatus, device, medium, and program for predicting network performance provided by the above embodiments of the present disclosure, the network parameters of a structural network to be predicted are obtained; a structure feature of the structural network is determined based on the network parameters; and a network performance parameter of the structural network is determined based on the structure feature. The performance of a structural network is thus predicted from its network parameters without training the structural network, which saves substantial time and improves the efficiency of network performance prediction.
The technical solutions of the present disclosure are described in further detail below with reference to the drawings and embodiments.
Description of the drawings
The accompanying drawings, which form a part of the specification, illustrate embodiments of the present disclosure and, together with the description, serve to explain the principles of the disclosure.
The present disclosure can be understood more clearly from the following detailed description with reference to the accompanying drawings, in which:
Fig. 1 is a schematic flowchart of the method of predicting network performance provided by an embodiment of the present disclosure.
Fig. 2 is a schematic diagram of obtaining a structure feature in an embodiment of the method of predicting network performance of the present disclosure.
Fig. 3 is a schematic structural diagram of a sample structural network generated in an embodiment of the present disclosure.
Fig. 4 is a schematic structural diagram of an example of predicting the network performance of a structural network using a prediction network, provided by an embodiment of the present disclosure.
Fig. 5 is a schematic structural diagram of the apparatus for predicting network performance provided by an embodiment of the present disclosure.
Fig. 6 is a schematic structural diagram of an electronic device suitable for implementing a terminal device or server of an embodiment of the present application.
Detailed description
Various exemplary embodiments of the present disclosure are now described in detail with reference to the accompanying drawings. It should be noted that, unless otherwise specified, the relative arrangement of components and steps, numerical expressions, and numerical values set forth in these embodiments do not limit the scope of the disclosure.
It should also be understood that, for ease of description, the sizes of the various parts shown in the drawings are not drawn to scale.
The following description of at least one exemplary embodiment is merely illustrative and in no way limits the present disclosure or its application or use.
Techniques, methods, and apparatus known to a person of ordinary skill in the relevant art may not be discussed in detail, but where appropriate, such techniques, methods, and apparatus should be considered part of the specification.
It should be noted that similar reference numerals and letters denote similar items in the following drawings; therefore, once an item is defined in one drawing, it need not be further discussed in subsequent drawings.
Embodiments of the present disclosure may be applied to a computer system/server, which can operate with numerous other general-purpose or special-purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations suitable for use with the computer system/server include, but are not limited to: personal computer systems, server computer systems, thin clients, thick clients, handheld or laptop devices, microprocessor-based systems, set-top boxes, programmable consumer electronics, network PCs, minicomputer systems, mainframe computer systems, distributed cloud computing environments including any of the above systems, and the like.
The computer system/server may be described in the general context of computer-system-executable instructions (such as program modules) executed by the computer system. Generally, program modules may include routines, programs, target programs, components, logic, data structures, and so on, which perform particular tasks or implement particular abstract data types. The computer system/server may be implemented in distributed cloud computing environments where tasks are performed by remote processing devices linked through a communication network. In a distributed cloud computing environment, program modules may be located on local or remote computing system storage media including storage devices.
In other algorithms that automatically generate neural network structures, each neural network structure generated during the learning process must be trained once before its performance can be known. Such a training run can take at least an hour. Meanwhile, for the generation algorithm to acquire good generative capability, it must generate on the order of tens of thousands of network structures and use the performance of these structures as feedback (i.e., train each network for at least an hour), which makes the learning process of the generation algorithm inefficient.
The present application provides a technical solution for predicting network performance, implemented using a prediction network, which can predict the performance of various types of network structures. The embodiments of the present disclosure can be applied to neural networks with a sequential structure, and to structural networks of various depths and/or network topologies; the embodiments of the present application do not limit this.
Fig. 1 is a schematic flowchart of the method of predicting network performance provided by an embodiment of the present disclosure. As shown in Fig. 1, the method includes:
S101: obtain the network parameters of a structural network to be predicted.
Specifically, the structural network may include a plurality of network layers, which may be connected in a certain order.
In one or more alternative embodiments, the network parameters of the structural network may include the layer parameters of at least one network layer of the structural network, where the at least one network layer may be some or all of the network layers of the structural network; for example, the layer parameters of every network layer of the structural network to be predicted may be obtained.
In at least one alternative embodiment, the layer parameters may include any one or any combination of a computation type, a computation kernel parameter, and a channel parameter, and may also include other types of layer parameters. Optionally, the kernel parameter may include the height and/or width of the computation kernel (also referred to as the length and width of the kernel), and the channel parameter may include the number of channels and/or the ratio of the number of output channels to the number of input channels; alternatively, the kernel parameter and the channel parameter may also include other types of parameters. The embodiments of the present application do not limit the specific implementation of the layer parameters.
In an optional example, the network parameters of the structural network include any one or any combination of the following: the computation type of each network layer in at least one network layer of the structural network, the length of each layer's computation kernel, the width of each layer's computation kernel, the number of channels of each layer, and the ratio of each layer's number of output channels to its number of input channels. Compared with the absolute number of channels of a layer, the ratio of output channels to input channels effectively limits the numerical range of the parameter, thereby reducing the complexity of subsequent processing.
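An illustrative encoding of one layer's parameters shows why the ratio bounds the value range; the dictionary encoding and field names here are assumptions for exposition:

```python
def encode_layer(layer_type, kernel_h, kernel_w, in_ch, out_ch):
    # Encode a layer by type, kernel size, and output/input channel ratio
    # rather than absolute channel counts.
    return {"type": layer_type,
            "kernel": (kernel_h, kernel_w),
            "channel_ratio": out_ch / in_ch}

wide = encode_layer("conv", 3, 3, 512, 1024)
narrow = encode_layer("conv", 3, 3, 16, 32)
# Both channel-doubling layers encode to the same ratio (2.0) despite very
# different absolute widths, keeping the parameter in a small range.
```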
S102: determine the structure feature of the structural network based on the network parameters of the structural network.
In at least one alternative embodiment, the structure feature may take a particular form; for example, it may have a preset dimension and/or structure, so that networks of different depths and topologies can be processed in a unified manner.
S103: determine the network performance parameter of the structural network based on the structure feature.
Optionally, the network performance parameter of the structural network may include the accuracy of the structural network. As an example, the network performance parameter may include the accuracy of the structural network corresponding to one or more numbers of training iterations, where the structural network corresponding to a given number of training iterations denotes the structural network obtained after training it for that number of iterations. Optionally, the network performance parameter may also include other parameters; the embodiments of the present application do not limit this.
According to the method for predicting network performance provided by the above embodiment of the present disclosure, the network parameters of a structural network to be predicted are obtained; the structure feature of the structural network is determined based on the network parameters; and the network performance parameter of the structural network is determined based on the structure feature. The performance of the structural network is thus predicted from its network parameters without the structural network having to be trained, which saves a large amount of time and improves the efficiency of predicting network performance.
Optionally, in a specific example of the above method embodiment for predicting network performance, S102 includes: determining, based on the network parameters of the structural network, the structure representation information of each of the at least one network layer of the structural network.
In at least one alternative embodiment, the structure representation information of a network layer may be determined based on the layer parameters of that network layer. In this way, the structure representation information of each network layer can be determined from the layer parameters of each of the at least one network layer of the structural network.
The structure representation information can characterize the structural features of a layer. Optionally, it may have a specific preset form and/or dimension; in one or more alternative embodiments, it may include a vector with a preset dimension. Distinguishing different layers through a unified representation in this way makes it possible to process a variety of different network structures uniformly.
In an optional example, the structure representation information of a network layer may include a vector corresponding to the calculation type of the layer, a vector corresponding to the height of its calculation kernel, a vector corresponding to the width of its calculation kernel, and a vector corresponding to the ratio of its output channel number to its input channel number. For example, the structure representation information may be obtained by concatenating or merging the above four vectors, but it may also be implemented in other ways, which the embodiments of the present application do not limit.
In one or more alternative embodiments, the structure representation information of a network layer may be determined by the following unified layer coding:
determining at least one identifier of the network layer based on the layer parameters of the network layer;
determining the structure representation information of the network layer based on the at least one identifier of the network layer.
The identifiers here may be integer identifiers. Optionally, at least one integer, i.e., at least one identifier, may be obtained by encoding the layer parameters of the network layer; for example, each parameter value in the layer parameters may be encoded as an integer, so that a tuple containing at least one integer is obtained for the network layer. In an optional example, the layer parameters of a network layer may include the calculation type of the layer, the width of its calculation kernel, the height of its calculation kernel, and the ratio between its output channel number and input channel number; correspondingly, each parameter value in the layer parameters may be encoded to obtain a quadruple (TY, KW, KH, CH), where TY denotes the identifier corresponding to the calculation type, KW and KH denote the width and height of the calculation kernel respectively, and CH denotes the identifier corresponding to the ratio of the output channel number to the input channel number, but the embodiments of the present application are not limited thereto.
In one or more alternative embodiments, correspondences between parameter values and identifiers may be preset, for example a correspondence between calculation types and integer identifiers, and one or more parameter values of a network layer are encoded based on this correspondence, but the embodiments of the present application are not limited thereto.
In one or more alternative embodiments, when encoding the ratio of the output channel number to the input channel number, the ratio may first be quantized to a preset value, and the identifier corresponding to the ratio is then determined based on the quantization result. For example, 8 bins may be preset with centers 0.25, 0.5, 0.75, 1.0, 1.5, 2.0, 2.5 and 3.0; the ratio of the output channel number to the input channel number of the network layer is quantized into one of these 8 bins, and the number or index of the determined bin is taken as the identifier corresponding to the ratio. Optionally, the identifier may also be determined in other ways, which the embodiments of the present application do not limit.
In the embodiments of the present application, after the at least one identifier of a network layer is obtained, the structure representation information of the network layer is determined based on the at least one identifier. In one or more alternative embodiments, each of the at least one identifier of the network layer may be mapped to obtain a mapping result of each identifier, and the structure representation information of the network layer is obtained based on the mapping results of the identifiers.
In an alternative embodiment, the mapping may be implemented by table lookup; for example, a first preset table may be searched based on an identifier of the network layer to obtain the mapping result of that identifier. Optionally, the first preset table may be obtained during the training of the prediction network; for example, the initial values in the first preset table may be random numbers, and the values in the table are updated by training the prediction network to finally obtain the first preset table, but the embodiments of the present application do not limit this.
Optionally, each identifier of a network layer may be mapped using a lookup table, where, as shown in Fig. 2, different tables may be used for the identifiers (or integer codes) corresponding to different parameter values. For example, the identifiers corresponding to the calculation type, kernel width, kernel height and output-to-input channel ratio may each be mapped with a different table, yielding a type vector, a kernel-width vector, a kernel-height vector and a channel vector. In this way, higher-dimensional vectors are obtained by table lookup, and these higher-dimensional vectors can express richer features of each layer.
In the embodiments of the present disclosure, after the mapping result of each identifier is obtained, the mapping results of the at least one identifier may be concatenated or merged to obtain the structure representation information of the network layer. For example, the type vector, kernel-width vector, kernel-height vector and channel vector of a network layer may be concatenated to serve as the vector representation of the layer; in this case the structure representation information is a vector of fixed dimension, but the embodiments of the present application are not limited thereto.
Optionally, S102 may further include: determining the structure feature of the structural network based on the structure representation information of each of the at least one network layer.
In one or more optional examples, the structure representation information of each of the at least one network layer may be merged or concatenated to obtain the structure feature of the structural network. For example, the structure representation information of each network layer among all the network layers of the structural network may be concatenated to obtain the structure feature of the structural network. Alternatively, the structure feature of the structural network may be determined in other ways, which the embodiments of the present application do not limit.
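The per-identifier table lookup and concatenation described above can be sketched as follows. This is a minimal stand-in: the table sizes and embedding dimension are illustrative assumptions, and random values take the place of the trained entries of the first preset table.

```python
import random

# Sketch of layer embedding: each integer identifier indexes its own lookup
# table (the "first preset table"); the four resulting vectors are
# concatenated into the layer's fixed-dimension structure representation,
# and per-layer representations are concatenated into a structure feature.
# In the described scheme the table values would be learned, not random.
random.seed(0)
EMB_DIM = 8

def make_table(num_ids, dim):
    """One lookup table per parameter type, randomly initialized."""
    return [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(num_ids)]

tables = {
    "TY": make_table(5, EMB_DIM),   # calculation types
    "KW": make_table(8, EMB_DIM),   # kernel widths
    "KH": make_table(8, EMB_DIM),   # kernel heights
    "CH": make_table(8, EMB_DIM),   # channel-ratio bins
}

def layer_representation(code):
    """Map a (TY, KW, KH, CH) quadruple to one concatenated vector."""
    ty, kw, kh, ch = code
    vecs = [tables["TY"][ty], tables["KW"][kw], tables["KH"][kh], tables["CH"][ch]]
    return [x for v in vecs for x in v]  # concatenation of the four vectors

def structure_feature(codes):
    """Concatenate the per-layer representations of all network layers."""
    feat = []
    for code in codes:
        feat.extend(layer_representation(code))
    return feat

rep = layer_representation((0, 3, 3, 5))
feat = structure_feature([(0, 3, 3, 5), (4, 1, 1, 3)])
```

Every layer maps to a vector of the same fixed dimension (4 x EMB_DIM here), which is what allows networks of different depths to be handled uniformly.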
Optionally, a neural network may be used to fuse or concatenate the structure representation information of each of the at least one network layer. Neural networks have unique advantages in information processing: for different tasks, targeted neural networks can be trained, and fusing the structure representation information with a trained neural network can achieve a better fusion effect.
In one or more alternative embodiments, using a neural network to fuse the structure representation information of each of the at least one network layer includes: inputting the structure representation information of each of the at least one network layer into the neural network for processing.
The neural network may use one or more algorithms to fuse the input structure representation information of the one or more network layers; for example, the structure representation information of the at least one network layer may be fused in the order of the at least one network layer. Optionally, the neural network may fuse the at least one piece of structure representation information into the structure feature through a recursive algorithm. For example, a Long Short-Term Memory (LSTM) network may be used to perform the above fusion or concatenation, but the embodiments of the present application do not limit the specific implementation of the neural network.
In one or more alternative embodiments, the LSTM network may include at least one LSTM unit, each LSTM unit corresponding to one network layer. Optionally, the LSTM may maintain a hidden state h_t and a memory cell c_t, and use an input gate i_t, an output gate o_t and a forget gate f_t to control the information flow. At each step, the unit takes an input x_t, determines the values of all gates, generates a candidate output u_t, and updates the hidden state h_t and memory cell c_t, as shown in formulas (1)-(6):
i_t = σ(W^(i) x_t + U^(i) h_{t-1} + b^(i))    formula (1)
o_t = σ(W^(o) x_t + U^(o) h_{t-1} + b^(o))    formula (2)
f_t = σ(W^(f) x_t + U^(f) h_{t-1} + b^(f))    formula (3)
u_t = tanh(W^(u) x_t + U^(u) h_{t-1} + b^(u))    formula (4)
c_t = i_t ⊙ u_t + f_t ⊙ c_{t-1}    formula (5)
h_t = o_t ⊙ tanh(c_t)    formula (6)
where σ denotes the sigmoid function and ⊙ denotes element-wise multiplication. Proceeding from low-level to high-level network layers, the LSTM gradually absorbs the per-layer information into its hidden state. At the final step, i.e., the layer before the final fully connected classification layer, the hidden state of the LSTM unit is extracted to represent the structure of the whole network, referred to as the structure feature.
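A pure-Python step of formulas (1)-(6) is sketched below: the per-layer representation vectors x_t are folded into the hidden state one layer at a time, and the final hidden state serves as the structure feature. The dimensions and random weights are placeholders for what training would learn.

```python
import math, random

random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def add(*vs):
    return [sum(t) for t in zip(*vs)]

def lstm_step(x_t, h_prev, c_prev, p):
    """One LSTM unit update per formulas (1)-(6)."""
    gate = lambda W, U, b, f: [f(z) for z in add(matvec(W, x_t), matvec(U, h_prev), b)]
    i_t = gate(p["Wi"], p["Ui"], p["bi"], sigmoid)    # formula (1)
    o_t = gate(p["Wo"], p["Uo"], p["bo"], sigmoid)    # formula (2)
    f_t = gate(p["Wf"], p["Uf"], p["bf"], sigmoid)    # formula (3)
    u_t = gate(p["Wu"], p["Uu"], p["bu"], math.tanh)  # formula (4)
    c_t = [i * u + f * c for i, u, f, c in zip(i_t, u_t, f_t, c_prev)]  # (5)
    h_t = [o * math.tanh(c) for o, c in zip(o_t, c_t)]                  # (6)
    return h_t, c_t

IN, HID = 4, 3
rnd = lambda r, c: [[random.uniform(-0.5, 0.5) for _ in range(c)] for _ in range(r)]
params = {}
for g in "iofu":
    params["W" + g] = rnd(HID, IN)   # input weights
    params["U" + g] = rnd(HID, HID)  # recurrent weights
    params["b" + g] = [0.0] * HID    # biases

h, c = [0.0] * HID, [0.0] * HID
for x in [[0.1, 0.2, 0.3, 0.4], [0.4, 0.3, 0.2, 0.1]]:  # two layer vectors
    h, c = lstm_step(x, h, c, params)
struct_feat = h  # hidden state after the last layer = structure feature
```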
In the embodiments of the present disclosure, optionally, the network performance parameter predicted in S103 may be the network performance parameter of the structural network after its training is completed, or it may be the network performance parameter of the structural network at some time point during training, where the time point may correspond to the completion of a certain training epoch. In one or more alternative embodiments, before S103, a time vector may also be obtained based on a preset time point corresponding to the structural network.
Specifically, the preset time point may be expressed as an Epoch ID, which represents the number of training iterations the structural network is to undergo, for example the number of times the structural network traverses the entire training set in normal training. Optionally, the time vector may be obtained by encoding or mapping the preset time point; for example, a second preset table may be queried with the preset time point to obtain the time vector. Optionally, the second preset table may be obtained by training the prediction network; reference may be made to the above description of the first preset table, and the embodiments of the present application do not limit its implementation.
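The epoch-embedding lookup can be sketched as below. The table size, vector dimension and function name are illustrative assumptions; as with the layer tables, the entries would be learned during training of the prediction network rather than random.

```python
import random

# Sketch of epoch embedding: the Epoch ID indexes a "second preset table"
# to give a time vector. Random values stand in for trained entries.
random.seed(2)
MAX_EPOCHS, TIME_DIM = 20, 8
second_preset_table = [[random.uniform(-1, 1) for _ in range(TIME_DIM)]
                       for _ in range(MAX_EPOCHS + 1)]

def epoch_vector(epoch_id):
    """Map a preset time point (Epoch ID) to its time vector by table lookup."""
    return second_preset_table[epoch_id]

t_vec = epoch_vector(10)
```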
In one or more alternative embodiments, after the time vector is obtained, in S103 the network performance parameter of the structural network at the preset time point may be obtained based on the structure feature and the time vector.
Optionally, the structure feature of the structural network and the time vector may be concatenated or fused to obtain a concatenation or fusion result, and the network performance parameter of the structural network at the preset time point is obtained based on that result. In one or more optional examples, a merged feature may be obtained by combining the structure feature and the time vector, where the merging may be implemented by concatenation. Optionally, the fusion result may be processed with a multilayer perceptron to obtain the network performance parameter of the structural network at the preset time point.
Specifically, the multilayer perceptron may be obtained by training, and the trained multilayer perceptron can derive the network performance parameter from the structure feature and the time vector.
In one or more alternative embodiments, the multilayer perceptron may include one or more of a fully connected layer, a Batch Normalization (BN) layer and an activation layer. For example, the multilayer perceptron may include three fully connected layers, a BN layer and a Rectified Linear Unit (ReLU) activation layer, but the embodiments of the present application do not limit the specific implementation of the multilayer perceptron.
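A minimal sketch of the prediction head follows: the structure feature and time vector are concatenated and passed through three fully connected layers with ReLU. The layer widths, random weights and the sigmoid squashing to a (0, 1) accuracy are assumptions; the text only specifies the fully connected / BN / ReLU building blocks.

```python
import math, random

random.seed(3)

def linear(x, W, b):
    return [sum(w * xi for w, xi in zip(row, x)) + bi for row, bi in zip(W, b)]

def relu(x):
    return [max(0.0, v) for v in x]

def mlp_predict(struct_feat, time_vector, layers):
    """Merge by concatenation, then apply fully connected + ReLU layers."""
    x = struct_feat + time_vector  # concatenation as the merge step
    for i, (W, b) in enumerate(layers):
        x = linear(x, W, b)
        if i < len(layers) - 1:
            x = relu(x)
    return 1.0 / (1.0 + math.exp(-x[0]))  # predicted accuracy in (0, 1)

def rand_layer(n_in, n_out):
    W = [[random.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(n_out)]
    return W, [0.0] * n_out

dims = [11, 16, 8, 1]  # 3-dim structure feature + 8-dim time vector as input
layers = [rand_layer(dims[i], dims[i + 1]) for i in range(3)]
acc = mlp_predict([0.2, -0.1, 0.5], [0.1] * 8, layers)
```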
In the embodiments of the present disclosure, optionally, the network performance parameter of a structural network can be determined through the above flow before the structural network is trained; the accuracy of the structural network at different time points can be obtained without training it, which greatly shortens the time required to predict network performance (it can be reduced to 0.01 second). This makes it possible to quickly predict the network performance of structural networks produced by, for example, automatic algorithm generation, and improves the probability of obtaining a neural network with excellent performance.
Optionally, before S101, the prediction network may also be trained with sample data, where the sample data includes multiple sample structural networks and the network performance parameter of each sample structural network at each of at least one time point. In this way, the prediction network can be trained with multiple sample structural networks annotated with network performance parameters (such as accuracy), yielding a trained network ready for use.
The time points here may correspond to the training counts of the sample structural networks. Optionally, during the training of the prediction network, the training counts of a sample structural network may cover every training iteration of that sample network, whereas when the prediction network is used to predict the network performance parameter of a structural network, only the network performance parameter at the completion of training may need to be obtained.
In the embodiments of the present invention, sample structural networks may be generated in a block-based manner. Specifically, a sample structural network may be built by stacking blocks of similar structure: each block is built first, and the blocks are then stacked according to a certain network architecture to form the sample structural network.
Optionally, before the prediction network is trained with the multiple sample structural networks, the method further includes: performing network layer sampling on multiple preset structural networks to generate network blocks. Specifically, the multiple sampled network layers are connected to form a network block, where the connection relation between network layers is determined by the network parameters of the layers; for example, the network parameters of the j-th network layer are obtained, and what kind of network layer the (j+1)-th network layer is is determined based on those network parameters. The network layers include any one or more of the following: a convolutional layer, a max pooling layer, an average pooling layer, an activation layer and a batch normalization layer.
Specifically, a network block includes at least one network layer. The preset structural networks may be obtained from the Internet or from a large database, and the obtained structural networks are usually neural networks; network layers are obtained by sampling these neural networks, and the network layers are combined into network blocks. Sample structural networks obtained by combining network blocks can overcome the drawbacks that arise in the prior art when a neural network is obtained by directly combining network layers: layers that cannot be combined may be combined, or meaningless network layers may be connected together, in which case the neural network obtained by the prior art may not be usable for training.
In one or more embodiments, a sample structural network is built from the network blocks constructed in any of the above embodiments.
In addition to network blocks, the composed sample structural network usually also includes an average pooling layer and a linear layer at the end. The input of a pooling layer typically comes from a convolutional layer; its main function is to provide strong robustness (for example, max-pooling takes the maximum value in a small region, so if the other values in the region change slightly or the image is slightly translated, the pooling result remains unchanged) and to reduce the number of parameters, preventing overfitting. Pooling layers generally have no parameters, so during backpropagation only the input needs to be differentiated and no weight update is required.
Specifically, the process of sampling the preset structural networks and generating network blocks includes:
sampling multiple preset structural networks based on a Markov chain to obtain at least one preset network layer;
connecting the at least one preset network layer in sequence to form a network block. A network block is usually composed of no more than 10 layers, and the first layer of a network block is usually a convolutional layer.
Optionally, building a sample structural network based on the network blocks includes:
connecting multiple network blocks in sequence, or stacking multiple network blocks, to obtain the sample structural network, where a first network block and a second network block among the multiple network blocks correspond to different feature dimensions, and no network block lies between the first network block and the second network block.
A max pooling layer is connected between the first network block and the second network block.
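The block-based assembly just described can be sketched as follows, following the layout given for the Fig. 3 example: regions of stacked blocks at increasing feature dimensions, a max pooling layer between regions, and average pooling plus a linear layer at the end. The list-of-strings network description and the function name are purely illustrative.

```python
# Sketch of block-based sample-network assembly under the stated assumptions.
def build_sample_network(block, feature_dims, blocks_per_region=2):
    net = []
    for r, dim in enumerate(feature_dims):
        for _ in range(blocks_per_region):
            net.extend(f"{layer}(dim={dim})" for layer in block)
        if r < len(feature_dims) - 1:
            net.append("max_pool")  # connects two adjacent regions
    net.extend(["avg_pool", "linear"])  # the usual tail of the sample network
    return net

block = ["conv", "bn", "relu"]              # one sampled network block
network = build_sample_network(block, [16, 32, 64])
```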
Optionally, sampling multiple preset structural networks with a Markov chain to obtain at least one preset network layer includes:
sampling the multiple preset structural networks based on the network parameters of the i-th sampled network layer to obtain the (i+1)-th network layer.
At each step of a Markov chain, the system may transition from one state to another according to a probability distribution, or may stay in the current state. A change of state is called a transition, and the probabilities associated with changing between different states are called transition probabilities. Specifically, in Markov-chain sampling, what kind of network layer can be connected next can be determined based on the network parameters of the previous network layer, so that network blocks can be formed quickly.
When a network block is obtained, the condition for determining the next network layer based on the previous one may be a preset probability over the types of transitions between layers, where the probabilities over these transition types are obtained from empirical estimates on real networks. For example, the layer following a convolutional layer has a very high probability of being a batch normalization layer or a nonlinear activation layer, and the layer following an activation layer connects to a convolutional layer or a pooling layer with higher probability.
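The Markov-chain layer sampling can be sketched as below. The transition probabilities here are illustrative stand-ins for the empirical estimates from real networks mentioned above (convolution very likely followed by batch normalization or activation, activation likely followed by convolution or pooling).

```python
import random

random.seed(4)

# Illustrative transition probabilities between layer types (assumptions).
TRANSITIONS = {
    "conv":     {"bn": 0.45, "relu": 0.45, "conv": 0.05, "max_pool": 0.05},
    "bn":       {"relu": 0.8, "conv": 0.2},
    "relu":     {"conv": 0.6, "max_pool": 0.2, "avg_pool": 0.2},
    "max_pool": {"conv": 1.0},
    "avg_pool": {"conv": 1.0},
}

def sample_block(max_layers=10):
    """Form a network block: first layer is a conv, at most 10 layers;
    each next layer type is drawn conditioned on the previous layer."""
    block = ["conv"]
    while len(block) < max_layers:
        dist = TRANSITIONS[block[-1]]
        types, probs = zip(*dist.items())
        block.append(random.choices(types, weights=probs)[0])
    return block

block = sample_block()
```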
Fig. 3 is a schematic structural diagram of a sample structural network generated in an embodiment of the present disclosure.
The sample structural network formed in Fig. 3 includes three regions corresponding to different feature dimensions; each region includes two network blocks, and one max pooling layer is connected between every two regions. Each region corresponds to a different feature dimension; in this embodiment, the feature dimensions corresponding to the three regions are 16, 32 and 64 respectively.
Specifically, Fig. 4 is a schematic structural diagram of an example, provided by an embodiment of the present disclosure, of predicting the network performance of a structural network using the prediction network. First, each layer of the block architecture (Block Architecture) of the structural network may be input into the prediction network, which encodes and maps (i.e., embeds) each layer of the block architecture separately. Specifically, the calculation type, calculation kernel width and height, and output-to-input channel ratio of each layer may first be encoded separately to obtain the corresponding integer codes, so that for each layer a tuple (TY, KW, KH, CH) of four integer codes is obtained (i.e., one row of the Block Layer Code). Then, layer embedding (Layer Embedding) may be performed on the tuple of each layer: specifically, each integer code in the tuple may be mapped to obtain a corresponding vector, and the vectors corresponding to the four integer codes are concatenated into a single vector, which can serve as the code vector of the layer. The vector corresponding to each layer may then be input into one LSTM unit of the LSTM, and the LSTM processes the multiple input vectors to obtain an overall representation of the whole structural network, i.e., the structure feature (Structural Feature) of the structural network. In addition, the epoch identifier (Epoch ID) corresponding to the time point to be predicted may be input into the prediction network, which obtains a time vector, or epoch vector (Epoch Vector), through epoch embedding (Epoch Embedding); the epoch embedding may be similar to the above layer embedding and implemented by table lookup. Finally, the structure feature and the epoch vector may be combined, and the combined result is input into a multilayer perceptron (Multi-Layer Perceptron, MLP), which outputs the network performance parameter of the structural network, for example the accuracy of the structural network predicted by the prediction network at that time point.
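The end-to-end flow of Fig. 4 can be condensed into the following toy pipeline. Every component is a simple stand-in for the trained modules described above (the recurrence and the prediction head in particular are drastically simplified), so only the shape of the data flow — codes, embeddings, fused feature, epoch vector, scalar accuracy — is meaningful.

```python
import math, random

random.seed(5)
DIM = 6
embed = lambda: [random.uniform(-1, 1) for _ in range(DIM)]
layer_tables = [dict() for _ in range(4)]   # TY, KW, KH, CH lookup tables
epoch_table = dict()

def lookup(table, key):
    """Table lookup with lazy random initialization (stand-in for training)."""
    if key not in table:
        table[key] = embed()
    return table[key]

def predict_accuracy(layer_codes, epoch_id):
    h = [0.0] * DIM
    for code in layer_codes:                     # rows of the Block Layer Code
        vec = [x for t, k in zip(layer_tables, code) for x in lookup(t, k)]
        h = [math.tanh(hi + sum(vec) / len(vec)) for hi in h]  # toy recurrence
    merged = h + lookup(epoch_table, epoch_id)   # Structural + Epoch Vector
    score = sum(merged) / len(merged)            # toy stand-in for the MLP
    return 1.0 / (1.0 + math.exp(-score))        # predicted accuracy

codes = [(0, 3, 3, 5), (4, 1, 1, 3), (3, 1, 1, 3)]  # e.g. conv, bn, relu
acc = predict_accuracy(codes, epoch_id=12)
```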
It should be understood that the example shown in Fig. 4 is only intended to help understand the technical solution of the embodiments of the present disclosure and should not be understood as a limitation on the embodiments of the present disclosure, which may also be implemented in other ways.
Those of ordinary skill in the art will appreciate that all or part of the steps of the above method embodiments may be completed by hardware related to program instructions; the aforementioned program may be stored in a computer-readable storage medium and, when executed, performs the steps of the above method embodiments; the aforementioned storage medium includes various media that can store program code, such as ROM, RAM, magnetic disks or optical disks.
Fig. 5 is a schematic structural diagram of a device for predicting network performance provided by an embodiment of the present disclosure. The device of this embodiment can be used to implement each of the above method embodiments of the present disclosure. As shown in Fig. 5, the device of this embodiment includes:
a parameter acquiring unit 51, configured to obtain the network parameters of a structural network to be predicted.
Specifically, the network parameters include at least one of the following: the calculation type of each of the at least one network layer of the structural network, the height of the calculation kernel of the network layer, the width of the calculation kernel of the network layer, the channel number of the network layer, and the ratio of the output channel number to the input channel number of the network layer.
a structure feature unit 52, configured to determine the structure feature of the structural network based on the network parameters of the structural network.
a performance determination unit 53, configured to determine the network performance parameter of the structural network based on the structure feature.
According to the device for predicting network performance provided by the above embodiment of the present disclosure, the network parameters of a structural network to be predicted are obtained; the structure feature of the structural network is determined based on the network parameters; and the network performance parameter of the structural network is determined based on the structure feature. The performance of the structural network is thus predicted from its network parameters without the structural network having to be trained, which saves a large amount of time and improves the efficiency of predicting network performance.
The structure feature unit 52 includes:
an information representation module, configured to determine, based on the network parameters of the structural network, the structure representation information of each of the at least one network layer of the structural network;
where, optionally, the structure representation information includes a structure vector with a preset dimension;
a feature determination module, configured to determine the structure feature of the structural network based on the structure representation information of each of the at least one network layer.
Specifically, the information representation module may include an identifier recognition module and an information module:
the identifier recognition module, configured to determine at least one identifier of each network layer based on the layer parameters of each of the at least one network layer of the structural network;
the information module, configured to determine the structure representation information of each network layer based on the at least one identifier of each of the at least one network layer.
Optionally, in a specific example, the identifier recognition module is specifically configured to map each of the at least one identifier of each network layer to obtain a mapping result of each identifier; the information module is specifically configured to obtain the structure representation information of each network layer based on the mapping results of each of the at least one identifier.
Specifically, the identifier recognition module obtains the mapping result of each identifier by searching the first preset table. The first preset table is determined during the training of the prediction network: the initial values in the first preset table are randomly generated, and the numerical information for the specific task is obtained through the training process.
Optionally, in a specific example, the feature determination module is specifically configured to fuse the structure representation information of each of the at least one network layer to obtain the structure feature of the structural network. Specifically, fusing the structure representation information may use a neural network to fuse the structure representation information of each of the at least one network layer into the structure feature of the structural network.
When a neural network is used to fuse the structure representation information, the structure representation information of each of the at least one network layer may be input into the neural network, and the neural network fuses the at least one piece of structure representation information into one structure feature through a recursive algorithm. A recursive algorithm is a procedure that directly or indirectly calls itself; in computer programming it is highly effective for a large class of problems, and it often makes the description of an algorithm concise and easy to understand.
Another embodiment of the device for predicting network performance of the present disclosure may, on the basis of the above embodiments, further include:
a time acquisition unit, configured to obtain a time vector based on a preset time point corresponding to the structural network.
Specifically, the preset time point, expressed as an Epoch ID, represents the number of times the structural network is to be trained; by definition, this time point expresses the number of times the structural network traverses the entire training set in normal training. A time vector is obtained from this number by a mapping table lookup; the time vector is a high-dimensional vector, and its mapping table is in turn obtained by training.
In this case, the performance determination unit 53 is specifically configured to obtain the network performance parameter of the structural network at the preset time point based on the structure feature and the time vector.
Specifically, after the structure feature and the time vector are merged, a multilayer perceptron is used for processing to obtain the network performance parameter, which is usually represented by the accuracy of the structural network.
The structure feature and the time vector are combined to obtain a merged feature; the merging of the structure feature and the time vector may be implemented by concatenation. The concatenated merged feature embodies the structure feature while associating it with the time point, so that the representation of the merged feature corresponds to the time point, and the accuracy obtained based on the merged feature is the accuracy at the time point corresponding to the time vector.
Optionally, in a specific example, the time acquisition unit is specifically configured to obtain the time vector by searching a second preset table based on the preset time point corresponding to the structural network.
Specifically, the method implemented by the device of the present disclosure is carried out before the structural network is trained. By obtaining the network performance parameter of the structural network (e.g., its accuracy) before training, the time required to predict network performance can be greatly reduced, improving the efficiency of performance prediction.
Another embodiment of the disclosed device for predicting network performance, on the basis of the embodiments described above, further includes:
a network training unit, configured to train the prediction network using multiple sample structure networks, each sample structure network being labeled with its network performance parameter at each of at least one time point, where a time point corresponds to a number of training iterations of the sample structure network.
Here, a time point corresponds to a number of training iterations of the sample structure network: during training, labels are provided for every iteration count needed to complete training, whereas in a concrete application of the prediction network, only the network performance parameter at the time point corresponding to the final iteration count needs to be obtained.
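The labelled data layout implied above can be sketched as follows: each sample structure network is annotated with its accuracy at every time point, so a single network yields one (feature, epoch, accuracy) training pair per iteration count. Function and variable names, and the numbers used, are invented for illustration.

```python
def make_training_pairs(labelled_networks):
    """labelled_networks: list of (structure_feature, {epoch: accuracy}).
    Expands each labelled sample structure network into one training pair
    per time point for the prediction network."""
    pairs = []
    for feature, acc_by_epoch in labelled_networks:
        for epoch, accuracy in sorted(acc_by_epoch.items()):
            pairs.append((feature, epoch, accuracy))
    return pairs
```

At deployment time only the pair for the final iteration count matters, but training on every intermediate time point is what lets the predictor interpolate a whole accuracy curve.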
Optionally, before the network training unit, the device further includes:
a network block unit, configured to sample network layers from multiple preset structure networks to generate network blocks.
Specifically, the network block unit may include:
a layer sampling module, configured to sample the multiple preset structure networks based on a Markov chain to obtain at least one preset network layer; and
a block forming module, configured to connect the at least one preset network layer in sequence to form a network block.
Here, a network block includes at least one network layer. The preset structure networks may be obtained from a network or from a large database and are typically neural networks. Network layers are obtained by sampling these neural networks and are combined into network blocks; sample structure networks built by combining network blocks overcome the drawbacks of prior-art neural networks obtained by directly combining network layers. Those drawbacks include combining network layers that cannot be combined, layers that cannot be connected, and connections that yield meaningless layer sequences; a neural network obtained in such a way may be unusable for training.
A network construction unit is configured to construct sample structure networks based on the network blocks. It is specifically configured to connect multiple network blocks in sequence to obtain a sample structure network, where a first network block and a second network block among the multiple network blocks correspond to different feature dimensions.
Specifically, a network layer includes any one or more of the following: a convolutional layer, a max pooling layer, an average pooling layer, an activation layer, and a batch normalization layer.
Optionally, a max pooling layer is connected between the first network block and the second network block. A network block is typically formed of no more than 10 network layers, and the first layer of a network block is typically a convolutional layer.
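The block assembly just described can be sketched as follows, assuming a max-pooling connector between consecutive blocks and a convolutional first layer per block. The layer names are symbolic placeholders, not a real framework API.

```python
def build_sample_network(blocks):
    """Connect network blocks in sequence, inserting a max-pooling layer
    between consecutive blocks so adjacent blocks correspond to different
    feature dimensions."""
    layers = []
    for i, block in enumerate(blocks):
        if i > 0:
            layers.append("max_pool")  # connector between adjacent blocks
        # a block has at most 10 layers and starts with a convolutional layer
        assert len(block) <= 10 and block[0] == "conv"
        layers.extend(block)
    return layers
```

The pooling connector is what changes the spatial feature dimension from one block to the next, which is why adjacent blocks never operate at the same resolution.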
Optionally, the layer sampling module is specifically configured to sample the multiple preset structure networks based on the network parameters of the i-th sampled network layer to obtain the (i+1)-th network layer, where i is greater than or equal to 1 and less than the number of network layers included in the network block.
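A hypothetical sketch of this Markov-chain sampling: the type of the (i+1)-th layer is drawn from a transition distribution conditioned on the i-th sampled layer, so forbidden layer pairs can never be emitted. The transition table below is invented for illustration and is not taken from the embodiment.

```python
import random

# Which layer types may follow each layer type (illustrative only).
TRANSITIONS = {
    "start":      ["conv"],  # a block begins with a convolutional layer
    "conv":       ["batch_norm", "activation", "max_pool"],
    "batch_norm": ["activation"],
    "activation": ["conv", "avg_pool", "max_pool"],
    "max_pool":   ["conv"],
    "avg_pool":   ["conv"],
}

def sample_block(n_layers, seed=0):
    """Sample a chain of layer types, each conditioned on its predecessor."""
    rng = random.Random(seed)
    layers, prev = [], "start"
    for _ in range(n_layers):
        prev = rng.choice(TRANSITIONS[prev])
        layers.append(prev)
    return layers
```

Because every draw consults only the previous layer, the chain property guarantees each adjacent pair in the sampled block is a permitted combination — the stated advantage over combining layers directly.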
According to another aspect of the embodiments of the present disclosure, an electronic device is provided, including a processor, where the processor includes the device for predicting network performance of any of the above embodiments of the disclosure.
According to another aspect of the embodiments of the present disclosure, an electronic device is provided, including: a memory for storing executable instructions; and a processor for communicating with the memory to execute the executable instructions so as to complete the operations of the method for predicting network performance of any of the above embodiments of the disclosure.
According to another aspect of the embodiments of the present disclosure, a computer storage medium is provided for storing computer-readable instructions which, when executed, perform the operations of the method for predicting network performance of any of the above embodiments of the disclosure.
According to another aspect of the embodiments of the present disclosure, a computer program is provided, including computer-readable code which, when run on a device, causes a processor in the device to execute instructions for each step of the method for predicting network performance of the disclosure.
According to a further aspect of the embodiments of the present disclosure, a computer program product is provided for storing computer-readable instructions which, when executed, cause a computer to perform the method for predicting network performance described in any of the possible implementations above.
In an optional embodiment, the computer program product is specifically a computer storage medium; in another optional embodiment, the computer program product is specifically a software product, such as an SDK.
In addition, an embodiment of the present disclosure further provides a computer storage medium for storing computer-readable instructions which, when executed, implement the method for predicting network performance of any of the above embodiments of the disclosure.
In addition, an embodiment of the present disclosure further provides a computer program, including computer-readable instructions which, when run in a device, cause a processor in the device to implement the method for predicting network performance of any of the above embodiments of the disclosure.
In an optional embodiment, the computer program is specifically a software product, such as a software development kit (Software Development Kit, SDK), etc.
In one or more optional embodiments, an embodiment of the present disclosure further provides a computer program product for storing computer-readable instructions which, when executed, cause a computer to perform the method for predicting network performance of any of the above embodiments.
The computer program product may be implemented in hardware, software, or a combination thereof. In one optional example, the computer program product is embodied as a computer storage medium; in another optional example, the computer program product is embodied as a software product, such as an SDK.
In one or more optional embodiments, an embodiment of the present disclosure further provides a method for predicting network performance and its corresponding device, electronic device, computer storage medium, computer program, and computer program product, where the method includes: a first device sending a network performance prediction instruction to a second device, the instruction causing the second device to perform the method for predicting network performance in any of the possible embodiments above; and the first device receiving the network performance parameter sent by the second device.
In some embodiments, the network performance prediction instruction may specifically be a call instruction: the first device may instruct the second device to perform network performance prediction by way of a call, and accordingly, in response to receiving the call instruction, the second device may perform the steps and/or flow of any embodiment of the above method for predicting network performance.
An embodiment of the present disclosure further provides an electronic device, which may be, for example, a mobile terminal, a personal computer (PC), a tablet computer, a server, or the like. Referring now to Fig. 6, which illustrates a structural diagram of an electronic device 600 suitable for implementing a terminal device or a server of embodiments of the present application: as shown in Fig. 6, the computer system 600 includes one or more processors, a communication unit, and the like. The one or more processors are, for example, one or more central processing units (CPUs) 601 and/or one or more graphics processors (GPUs) 613, and a processor may perform various appropriate actions and processing according to executable instructions stored in a read-only memory (ROM) 602 or loaded from a storage section 608 into a random access memory (RAM) 603. The communication unit 612 may include, but is not limited to, a network card, which may include, but is not limited to, an IB (InfiniBand) network card.
The processor may communicate with the ROM 602 and/or the RAM 603 to execute the executable instructions, connect to the communication unit 612 through a bus 604, and communicate with other target devices through the communication unit 612, so as to complete operations corresponding to any method provided by the embodiments of the present application, for example: obtaining network parameters of a structural network to be predicted; determining a structure feature of the structural network based on the network parameters of the structural network; and determining the network performance parameter of the structural network based on the structure feature.
In addition, the RAM 603 may also store various programs and data needed for the operation of the device. The CPU 601, the ROM 602, and the RAM 603 are connected to one another through the bus 604. When the RAM 603 is present, the ROM 602 is an optional module: the RAM 603 stores the executable instructions, or the executable instructions are written into the ROM 602 at runtime, and the executable instructions cause the processor 601 to perform the operations corresponding to the above method. An input/output (I/O) interface 605 is also connected to the bus 604. The communication unit 612 may be integrated, or may be provided with multiple sub-modules (e.g., multiple IB network cards) linked on the bus.
The I/O interface 605 is connected to the following components: an input section 606 including a keyboard, a mouse, etc.; an output section 607 including a cathode-ray tube (CRT), a liquid crystal display (LCD), a speaker, etc.; a storage section 608 including a hard disk, etc.; and a communication section 609 including a network interface card such as a LAN card, a modem, etc. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disc, a magneto-optical disk, a semiconductor memory, or the like, is mounted on the drive 610 as needed, so that a computer program read therefrom is installed into the storage section 608 as needed.
It should be noted that the architecture shown in Fig. 6 is only one optional implementation. In concrete practice, the number and types of the components in Fig. 6 may be selected, deleted, added, or replaced according to actual needs. Different functional components may also be arranged separately or integrally: for example, the GPU and the CPU may be arranged separately, or the GPU may be integrated on the CPU, and the communication unit may be arranged separately or integrated on the CPU or the GPU, etc. All such interchangeable embodiments fall within the scope of protection of the present disclosure.
In particular, according to embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program tangibly embodied on a machine-readable medium, the computer program containing program code for performing the methods shown in the flowcharts. The program code may include instructions corresponding to the method steps provided by the embodiments of the present application, for example: obtaining network parameters of a structural network to be predicted; determining a structure feature of the structural network based on the network parameters of the structural network; and determining the network performance parameter of the structural network based on the structure feature. In such embodiments, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above functions defined in the methods of the present application are performed.
The methods, devices, and equipment of the present disclosure may be implemented in many ways, for example, by software, hardware, firmware, or any combination of software, hardware, and firmware. The above order of the steps of the methods is merely for illustration, and the steps of the methods of the present disclosure are not limited to the order specifically described above unless otherwise stated. In addition, in some embodiments, the present disclosure may also be embodied as programs recorded in a recording medium, the programs including machine-readable instructions for implementing the methods according to the present disclosure. Thus, the present disclosure also covers a recording medium storing a program for performing the methods according to the present disclosure.
The description of the present disclosure is provided for the sake of example and description, and is not intended to be exhaustive or to limit the disclosure to the forms disclosed. Many modifications and variations will be obvious to those of ordinary skill in the art. The embodiments were chosen and described in order to better illustrate the principles and practical applications of the disclosure, and to enable those of ordinary skill in the art to understand the disclosure so as to design various embodiments, with various modifications, suited to particular uses.
Claims (10)
- 1. A method for predicting network performance, characterized by being implemented using a prediction network and comprising: obtaining network parameters of a structural network to be predicted; determining a structure feature of the structural network based on the network parameters of the structural network; and determining a network performance parameter of the structural network based on the structure feature.
- 2. The method according to claim 1, characterized in that the network parameters include at least one of the following for each network layer in at least one network layer of the structural network: the calculation type of the network layer, the length of the calculation kernel of the network layer, the width of the calculation kernel of the network layer, the number of channels of the network layer, and the ratio of the number of output channels to the number of input channels of the network layer.
- 3. The method according to claim 1 or 2, characterized in that determining the structure feature of the structural network based on the network parameters of the structural network includes: determining structure representation information of each network layer in at least one network layer of the structural network based on the network parameters of the structural network; and determining the structure feature of the structural network based on the structure representation information of each network layer in the at least one network layer.
- 4. The method according to claim 3, characterized in that the structure representation information includes a structure vector with a preset dimension.
- 5. The method according to claim 3 or 4, characterized in that determining the structure representation information of each network layer in the at least one network layer of the structural network based on the network parameters of the structural network includes: determining at least one identifier of each network layer in the at least one network layer based on the layer parameters of each network layer of the structural network; and determining the structure representation information of each network layer based on the at least one identifier of each network layer in the at least one network layer.
- 6. A device for predicting network performance, characterized by being implemented using a prediction network and comprising: a parameter acquisition unit, configured to obtain network parameters of a structural network to be predicted; a structure feature unit, configured to determine a structure feature of the structural network based on the network parameters of the structural network; and a capabilities determination unit, configured to determine a network performance parameter of the structural network based on the structure feature.
- 7. An electronic device, characterized by including a processor, where the processor includes the device as claimed in claim 6.
- 8. An electronic device, characterized by including: a memory for storing executable instructions; and a processor for communicating with the memory to execute the executable instructions so as to complete the operations of the method for predicting network performance as claimed in any one of claims 1 to 5.
- 9. A computer storage medium for storing computer-readable instructions, characterized in that when the instructions are executed, the operations of the method for predicting network performance as claimed in any one of claims 1 to 5 are performed.
- 10. A computer program, including computer-readable code, characterized in that when the computer-readable code runs on a device, a processor in the device executes instructions for implementing each step of the method for predicting network performance as claimed in any one of claims 1 to 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201711306429.3A CN108234195B (en) | 2017-12-08 | 2017-12-08 | Method, apparatus, device, medium for predicting network performance |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108234195A true CN108234195A (en) | 2018-06-29 |
CN108234195B CN108234195B (en) | 2021-08-31 |
Family
ID=62653467
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112561174A (en) * | 2020-12-18 | 2021-03-26 | 西南交通大学 | Method for predicting geothermal energy production based on LSTM and MLP superimposed neural network |
CN115396929A (en) * | 2022-08-15 | 2022-11-25 | 中国联合网络通信集团有限公司 | Performance data prediction method, device and storage medium |
CN115860055A (en) * | 2022-11-23 | 2023-03-28 | 北京百度网讯科技有限公司 | Performance determination method, performance optimization method, device, electronic equipment and medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101675648A (en) * | 2007-03-08 | 2010-03-17 | Lm爱立信电话有限公司 | Structure relevant and method with performance monitoring |
US20110061051A1 (en) * | 2009-09-10 | 2011-03-10 | International Business Machines Corporation | Dynamic Recommendation Framework for Information Technology Management |
CN102567786A (en) * | 2011-12-13 | 2012-07-11 | 北京交通大学 | Method for predicting derailment coefficients |
US20140171039A1 (en) * | 2012-10-04 | 2014-06-19 | Bernt Erik Bjontegard | Contextually intelligent communication systems and processes |
CN104091045A (en) * | 2014-06-16 | 2014-10-08 | 华南理工大学 | Predicting method for long-term performance of air conditioner based on BP neural network |
Also Published As
Publication number | Publication date |
---|---|
CN108234195B (en) | 2021-08-31 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||