CN112182848A - Modeling and simulation service quality measurement method for weapon equipment simulation - Google Patents
Modeling and simulation service quality measurement method for weapon equipment simulation
- Publication number: CN112182848A (application CN202010920457.XA)
- Authority: CN (China)
- Legal status: Granted
Classifications
- G06F30/20 — Computer-aided design [CAD]: design optimisation, verification or simulation
- G06Q10/06395 — Operations research: quality analysis or management
- G06Q50/26 — ICT adapted for business sectors: government or public services
- G06F2111/04 — Details of CAD techniques: constraint-based CAD
- G06F2111/10 — Details of CAD techniques: numerical modelling
- G06F2119/14 — Type or aim of analysis or optimisation: force analysis or force optimisation, e.g. static or dynamic forces
Abstract
The invention discloses a modeling and simulation service quality measurement method for weapon equipment simulation. The method establishes a modeling and simulation service quality measurement index system covering two aspects, service screening and service preference. Service screening evaluation is first performed on functions, interfaces, data infrastructure and the like, to ensure service availability. If several services pass the screening evaluation, service preference ranking evaluation is carried out on performance, network, user perception and the like, and the services are ranked from best to worst; if only a single service passes the screening evaluation, a service preference value evaluation is carried out using the preference indexes, and a comprehensive service quality metric value is given; otherwise, a message that no available service was found is returned. The method can effectively solve the problems of evaluating and selecting modeling and simulation service quality for weapon equipment simulation.
Description
Technical Field
The invention relates to the field of computer simulation and evaluation, in particular to a modeling and simulation service quality measurement method for weapon equipment simulation.
Background
After more than half a century of development, modeling and simulation techniques have been widely used in aerospace, electronics, marine, manufacturing, and other fields. To meet the urgent requirements of distributed simulation, the U.S. Department of Defense proposed the concept of advanced distributed simulation, and simulation frameworks have passed through the development stages of SIMNET, DIS, ALSP, HLA, TENA and so on. However, current modeling and simulation architectures struggle to meet increasingly complex and accurate weaponry simulation application requirements. On the one hand, modeling and simulation objects are more and more complex, so constructing a complex simulation system is very laborious, and existing simulation frameworks find it difficult to reuse and interoperate with the resources of other frameworks. On the other hand, how to realize rapid combined simulation, optimize a system design scheme or process, and respond to rapid iteration of requirements is the key to improving design quality. Existing simulation frameworks fall short in interconnection, intercommunication, extensibility and the like; an iterative upgrade of the modeling and simulation framework is needed to improve weapon equipment modeling and simulation capability.
Emerging information technologies such as service-oriented architectures and cloud computing make it possible to construct a generalized, extensible and reusable modeling and simulation architecture. The NATO Modelling and Simulation Group specialist team MSG-131 innovatively proposed the concept of MSaaS (Modeling and Simulation as a Service), and in 2014 the MSG-136 research group ("MSaaS: Rapid Deployment of Interoperable and Credible Simulation Environments") was established on the basis of the MSG-131 results, carrying out investigation, recommendation and evaluation work on MSaaS standards, protocols, system architectures, execution, and cost-effectiveness analysis. By the end of 2017, the group had completed MSaaS concept development and evaluation. In addition, on 29 November 2017, the Cohort company SEA published the results of the UK defence science and technology laboratory's research projects on simulation system architecture, interoperability and management; MSaaS in the fields of simulation, experimental evaluation, simulation-based acquisition and the like helps users save cost, improve efficiency and obtain a better overall solution, and is becoming an important method for simulation delivery strategy.
Quality of service (QoS) refers to "the overall effect of using services, which determines how satisfied a user is with those services". Under a new framework oriented to weapon equipment simulation, evaluating the quality of modeling and simulation services is in fact evaluating the ability of those services to meet user requirements. When multiple services are available for selection, how the user selects the best service matters greatly. In this regard, the literature contains related research on methods for evaluating cloud computing service quality. Hangzhou Star proposes seven important characteristics of cloud computing services from the two aspects of core services and support services; for Web services in cloud computing, Lekayman et al. propose service quality evaluation models covering availability, reliability, efficiency, readability and the like; Rohur et al. study an optimal pricing model for cloud computing services based on service quality guarantees. However, no literature has so far been found on how to evaluate and measure the quality of service of modeling and simulation oriented to weaponry simulation.
Disclosure of Invention
The invention aims to provide a modeling and simulation service quality measurement method for weapon equipment simulation, which can effectively solve the problems of evaluation and selection of modeling and simulation service quality.
The specific technical scheme of the invention is as follows:
A modeling and simulation service quality measurement method oriented to weapon equipment simulation comprises the following steps:
Step 1: establish a modeling and simulation service quality measurement index system according to the service requirements;
Step 2: perform service screening evaluation on the current service or services at the service-availability level;
Step 3: judge whether the current services are available; if so, go to step 4; otherwise, give a recommendation that the current services are unavailable and go to step 7;
Step 4: judge whether the number of current services is two or more; if so, go to step 5; otherwise, go to step 6;
Step 5: at the service preference level, based on the preference indexes, perform a preference ranking evaluation of the current service quality, integrate the service screening evaluation values, give a service quality ranking, and go to step 7;
Step 6: at the service preference level, based on the preference indexes, perform a service value evaluation of the current service quality, integrate the service screening evaluation value, and give the current service quality metric value;
Step 7: the quality of service measurement process ends.
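The branching logic of steps 1 to 7 can be sketched as follows; the function names and the evaluation routines are illustrative placeholders, not part of the patent:

```python
def measure_service_quality(services, screening_eval, ranking_eval, value_eval):
    """Sketch of the step 1-7 decision flow.

    services: list of candidate modeling-and-simulation services.
    screening_eval(s) -> (is_available, screening_value): steps 2-3.
    ranking_eval(available) -> ranked list: step-5 ranking evaluation.
    value_eval(s) -> float: step-6 single-service metric value.
    """
    # Steps 2-3: screening evaluation; keep only the available services.
    available = [(s, v) for s in services
                 for ok, v in [screening_eval(s)] if ok]
    if not available:
        return {"status": "no available service"}        # step 3, "otherwise"
    if len(available) >= 2:                              # step 4 branch
        return {"status": "ranked", "ranking": ranking_eval(available)}
    (s, v), = available                                  # exactly one service left
    return {"status": "single", "metric": value_eval(s)}  # step 6
```

A caller supplies the three evaluation routines; the ranking routine would typically be the TOPSIS procedure of step 5 and the value routine the weighted summation of step 6.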
The simulation modeling for weapon equipment is specifically as follows:
Weapon equipment combat simulation mainly involves sensors, aircraft and other equipment types. Here the aircraft class is used to model and simulate a missile, comprising a kinematics/dynamics model, a missile-target relative motion model, a guidance model and other parts, and involving position/velocity component solution, proportional navigation design and the like; the parameters of the missile-target relative motion model need to be solved. Specifically, taking a missile attacking a moving target as an example, a kinematics/dynamics model of the missile is established:
In the formula, x and y are the horizontal and vertical coordinates of the missile position, and dx/dt and dy/dt are their rates of change; V_x and V_y are the components of the missile speed V along the horizontal and vertical coordinate directions, and dV_x/dt and dV_y/dt are the rates of change of V_x and V_y; n is the missile overload perpendicular to the missile-target line of sight, q is the target line-of-sight azimuth angle, and θ is the missile flight-path angle.
Assume the target moves uniformly in the attack plane; then:
In the formula, x_T and y_T are the horizontal and vertical coordinates of the target position, and dx_T/dt and dy_T/dt are their rates of change; V_T is the target speed and σ_T is the target heading angle.
A missile-target relative motion model is established:
In the formula, r is the relative distance between the missile and the target, and dr/dt and dq/dt are the rates of change of r and q; η is the missile velocity-vector lead angle, η_T is the target velocity-vector lead angle, and σ_T is the target heading angle.
A proportional navigation law is selected, and the missile guidance model is constructed:
In the formula, K is the proportional navigation coefficient.
The above model requires parameters such as V_x, V_y, x_T, y_T and K to be solved, and the associated solving algorithms are regarded as one class of services.
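The model equations referred to above can be written, for readability, in their standard planar form; this is a textbook reconstruction consistent with the variables defined above (with the normal acceleration a = n·g taken perpendicular to the velocity vector, g being gravitational acceleration, an assumption on the patent's exact convention), not necessarily the patent's precise formulation:

```latex
% Missile kinematics/dynamics model (planar engagement), a = n g assumed:
\dot{x} = V_x, \qquad \dot{y} = V_y, \qquad
\dot{V}_x = -a\sin\theta, \qquad \dot{V}_y = a\cos\theta
% Target in uniform motion in the attack plane:
\dot{x}_T = V_T\cos\sigma_T, \qquad \dot{y}_T = V_T\sin\sigma_T
% Missile-target relative motion:
\dot{r} = V_T\cos\eta_T - V\cos\eta, \qquad
r\,\dot{q} = V\sin\eta - V_T\sin\eta_T
% Proportional navigation guidance law:
\dot{\theta} = K\,\dot{q}
```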
In step 1, the service requirements are the user's requirements on the functions, performance and interfaces of the modeling and simulation services. The modeling and simulation service quality measurement index system comprises two classes of measurement indexes, service screening and service preference. The service screening class measurement indexes comprise:
functional indexes, including availability, interoperability, extensibility and combinability, where availability is determined by response time, failure time and service time;
interface indexes, including reusability, standardization and ease of use;
data infrastructure indexes, including throughput, redundancy and fault tolerance.
The service preference class measurement indexes comprise:
performance indexes, including accuracy, reliability and stability;
network indexes, including security, time delay and packet loss rate;
user perception indexes, including service price, creditworthiness and success rate.
Step 2 comprises the following:
Let the current number of services be N1. For the different services, compute their service screening class index values I_S = [I_S1, I_S2, ..., I_SN1], where I_Si (i = 1, 2, ..., N1) is the vector of screening class index values of the ith service. The service screening indexes are calculated as follows:
(1) Functional index measurement method:
Availability: availability refers to the capability of the service to provide the required functions at a certain point in time or within a certain time period, and is determined by response time, failure time and service time; the measurement method is:
In the formula, FA_i is the availability evaluation value of the ith service; [ω_r, ω_f, ω_s] are the weight coefficients of response time, failure time and service time, with ω_r + ω_f + ω_s = 1; r_i, f_i and s_i (i = 1, 2, ..., N) are the average response time, average failure time and average service time of the ith service; r = [r_1, r_2, ..., r_N], f = [f_1, f_2, ..., f_N] and s = [s_1, s_2, ..., s_N] are the average response time, failure time and service time vectors of the N services of the same type;
Interoperability: interoperability refers to the ability of two or more services to share information, including the ability to invoke and to be invoked by other services; the measurement method is:
In the formula, FI_i is the interoperability evaluation value of the ith service; [ω_call, ω_called] are the weight coefficients for invoking other services and being invoked by other services, with ω_call + ω_called = 1; c_calli and c_calledi are the numbers of times the ith service calls other services and is called by other services, respectively; c_call = [c_call1, c_call2, ..., c_callN] and c_called = [c_called1, c_called2, ..., c_calledN] are the corresponding call-count vectors;
Extensibility: extensibility refers to the capability of a service to adapt to changing requirements and to generate new services by extension through function inheritance; the measurement method is:
In the formula, FE_i is the extensibility evaluation value of the ith service; e_i is the number of new services generated by extending the ith service; e = [e_1, e_2, ..., e_N] is the vector of numbers of new services generated by extension;
Combinability: combinability refers to the ability to form a new service by combining existing services; the measurement method is:
In the formula, FC_i is the combinability evaluation value of the ith service; c_i is the number of combined services generated by the ith service; c = [c_1, c_2, ..., c_N] is the vector of numbers of combined services;
(2) Interface index measurement method:
Reusability: reusability refers to the ability of a service interface to be reused with little or no modification; the measurement method is:
In the formula, IR_i is the reusability evaluation value of the ith service; r_i is the number of times the ith service is reused; r = [r_1, r_2, ..., r_N] is the reuse-count vector;
Standardization: standardization refers to the degree to which the content of the service interface is constrained and standardized; the measurement method is:
In the formula, IN_i is the standardization evaluation value of the ith service; n_i is the degree of standardization of the ith service; n = [n_1, n_2, ..., n_N] is the standardization-degree vector;
Ease of use: ease of use refers to how easily a user can understand, learn and master the service; the measurement method is:
In the formula, IU_i is the ease-of-use evaluation value of the ith service; u_i is the number of times the ith service is used; u = [u_1, u_2, ..., u_N] is the usage-count vector;
(3) Data infrastructure index measurement method:
Throughput: data throughput refers to the amount of data received by the modeling and simulation service per unit time; the measurement method is:
In the formula, DT_i is the throughput evaluation value of the ith service; t_i is the data throughput of the ith service; t = [t_1, t_2, ..., t_N] is the data throughput vector;
Redundancy: redundancy means that, to improve the robustness of modeling and simulation services, their data are backed up in the data infrastructure so as to reduce the influence of faults on the services; the measurement method is:
In the formula, DR_i is the redundancy evaluation value of the ith service; r_i is the number of data backups of the ith service; r = [r_1, r_2, ..., r_N] is the data-backup-count vector;
Fault tolerance: fault tolerance means that, when an error occurs, normal system capability is restored through data backups on different storage devices; the measurement method is:
In the formula, DF_i is the fault tolerance evaluation value of the ith service; f_bi and f_ri are the total number of faults and the number of fault recoveries, respectively;
Then, for each service, judge whether any index value is smaller than the service screening class measurement index threshold α; if so, the service is rejected; otherwise, it is retained. Let the number of services finally retained be N2.
Finally, an analytic hierarchy process (see Deng Xue, Li Jiaming, Zeng Haojian, Chen Junyang, Zhao Junfeng, "Analysis and application research on weight calculation methods of the analytic hierarchy process" [J], Mathematics in Practice and Theory, 2012, 42(7): 93-100) is used to integrate the service screening class measurement indexes and calculate the screening evaluation values E_S = [E_S1, E_S2, ..., E_SN2] of the N2 services, where E_Si (i = 1, 2, ..., N2) is the screening evaluation value of the ith service.
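The screening step just described, a per-index threshold α followed by weighted aggregation, can be sketched as follows; the weights are assumed to be already derived by the analytic hierarchy process (deriving them from a pairwise-comparison matrix is the AHP step the patent references), and the function is an illustration rather than the patent's implementation:

```python
import numpy as np

def screen_services(index_matrix, alpha, weights):
    """Step-2 screening sketch.

    index_matrix: (N1, m) array; row i holds the screening-class index
    values I_Si of service i, all scaled to [0, 1].
    alpha: screening threshold; a service with ANY index below alpha is
    rejected.
    weights: (m,) AHP-derived weight vector summing to 1 (assumed given).
    Returns (kept_indices, E_S), E_S being the screening evaluation values.
    """
    X = np.asarray(index_matrix, dtype=float)
    keep = np.where((X >= alpha).all(axis=1))[0]   # reject if any index < alpha
    E_S = X[keep] @ np.asarray(weights, dtype=float)
    return keep, E_S
```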
In step 3, a service is available if its screening evaluation value is greater than or equal to the threshold β; otherwise, it is unavailable.
Step 5 comprises the following: let the current number of services be N3. First, for the different services, calculate the service preference class index values I_O = [I_O1, I_O2, ..., I_ON3], where I_Oi (i = 1, 2, ..., N3) is the vector of preference index values of the ith service. The service preference indexes are calculated as follows:
(1) performance index measurement method
Accuracy: accuracy refers to the exactness of the modeling and simulation service; the measurement method is:
In the formula, PA_i is the accuracy evaluation value of the ith service; a_i = |ra_i − re_i| is the accuracy of the ith service, where ra_i and re_i are the actual and expected service results, respectively; a = [a_1, a_2, ..., a_N] is the accuracy vector;
Reliability: reliability refers to the ability of a service to maintain a normal, correct operating state when used in a given environment; the measurement method is:
In the formula, PR_i is the reliability evaluation value of the ith service; r_i is the mean time between failures of the ith service; r = [r_1, r_2, ..., r_N] is the mean-time-between-failures vector;
Stability: stability refers to the service's ability to continue normal operation when the operating environment changes; the measurement method is:
In the formula, PS_i is the stability evaluation value of the ith service; s_i is the normal operating time of the ith service after an environment change; s = [s_1, s_2, ..., s_N] is the vector of normal operating times after environment changes;
(2) Network index measurement method
Security: security means that the network and its system software and hardware are protected from accidental or malicious damage, alteration and leakage, so that reliable, normal and uninterrupted operation of the modeling and simulation service can be supported; the measurement method is:
In the formula, NS_i is the security evaluation value of the ith service; s_i is the service time of the ith service; s = [s_1, s_2, ..., s_N] is the service time vector;
Time delay: time delay refers to the time required to transmit a data packet from the network sender to the receiver; the measurement method is:
In the formula, NT_i is the delay evaluation value of the ith service; t_i is the delay of the ith service; t = [t_1, t_2, ..., t_N] is the service delay vector;
Packet loss rate: the packet loss rate refers to the proportion of data packets lost while the modeling and simulation service is being used; the measurement method is:
In the formula, NP_i is the packet loss rate evaluation value of a single simulation task of the ith service; p_li and p_ti are the total number of packets lost and the total number of packets transmitted during a single simulation task, respectively;
(3) User perception index measurement method
Service price: the service price measures the cost that the service requester must pay to invoke the operation; the measurement method is:
In the formula, UP_i is the service price evaluation value of the ith service; p_i is the price of the ith service; p = [p_1, p_2, ..., p_N] is the service price vector;
Creditworthiness: creditworthiness refers to the user's degree of satisfaction with and trust in the service; the measurement method is:
In the formula, UC_i is the creditworthiness evaluation value of the ith service; c_i is the user score of the ith service; c = [c_1, c_2, ..., c_N] is the service score vector;
Success rate: the success rate refers to the probability that the service is correctly responded to; the measurement method is:
In the formula, US_i is the success rate evaluation value of the ith service; sr_i and sc_i are the number of times the service is requested and the number of times it is correctly responded to, respectively;
then, the service quality evaluation results of the N3 screening type indexes are taken as an index to be included in the service preference type measurement indexes, and the occupied weight is k (0< k < 1);
Finally, the TOPSIS method (see Hu Yongmo, "An improvement of TOPSIS for comprehensive evaluation" [J], Mathematics in Practice and Theory, 2002, 32(4): 572-) is used to calculate the ranking evaluation values E_R = [E_R1, E_R2, ..., E_RN3] of the N3 services; the final quality ranking is determined by sorting the E_Ri values, and the process goes to step 7.
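A minimal sketch of the TOPSIS ranking used in step 5 follows. It assumes all columns have already been oriented as benefit-type (larger is better) and that the screening evaluation value, weighted by k, is simply one of the columns; this is an illustration of the general TOPSIS procedure, not the patent's exact computation:

```python
import numpy as np

def topsis_rank(decision_matrix, weights):
    """Return TOPSIS closeness coefficients E_R (higher = better).

    decision_matrix: (N3, m) array of preference-class index values,
    all columns benefit-type.
    weights: (m,) weight vector summing to 1.
    """
    X = np.asarray(decision_matrix, dtype=float)
    w = np.asarray(weights, dtype=float)
    V = w * X / np.linalg.norm(X, axis=0)          # weighted vector normalization
    best, worst = V.max(axis=0), V.min(axis=0)     # positive/negative ideal points
    d_best = np.linalg.norm(V - best, axis=1)      # distance to positive ideal
    d_worst = np.linalg.norm(V - worst, axis=1)    # distance to negative ideal
    return d_worst / (d_best + d_worst)            # relative closeness
```

Sorting the services by descending closeness coefficient yields the quality ranking of step 5.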
Step 6 comprises the following: first, for the current service, calculate its service preference class index value vector I_O;
then, determine the weights of the service preference class measurement indexes using the analytic hierarchy process;
finally, incorporate the service screening evaluation result as an additional index into the service preference class measurement indexes, perform a weighted summation, and calculate the evaluation value E_V of the current service.
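The weighted summation of step 6 can be sketched as follows; the convention of rescaling the AHP weights so that the total weight (including k) sums to 1 is an assumption for illustration, since the patent specifies only that the screening value enters with weight k:

```python
def single_service_value(I_O, weights, E_S, k):
    """Step-6 sketch: weighted sum of the preference-class index values
    I_O of the single remaining service, with its screening evaluation
    value E_S folded in as one more index of weight k (0 < k < 1).

    weights: AHP-derived weights for the preference indexes; here they
    are rescaled by (1 - k) so that the full weight vector sums to 1.
    """
    assert 0 < k < 1
    scaled = [(1 - k) * w / sum(weights) for w in weights]
    return sum(w * x for w, x in zip(scaled, I_O)) + k * E_S
```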
In particular, for cost-type indicators, such as the quality of service V_r that measures response time, a stricter form of the metric exists; the cost-type metric expressions in the invention are similar but have been simplified for ease of presentation. Similarly, for benefit-type indicators, such as the quality of service V_s that measures service time, a stricter form exists, and the benefit-type metric formulas are simplified in the same way.
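The stricter cost-type and benefit-type forms referred to above are, in standard practice, min-max normalizations; as a reconstruction under that assumption (not necessarily the patent's exact expressions):

```latex
% Cost-type index (smaller is better), e.g. response time r_i:
V_{r_i} = \frac{\max_j r_j - r_i}{\max_j r_j - \min_j r_j}
% Benefit-type index (larger is better), e.g. service time s_i:
V_{s_i} = \frac{s_i - \min_j s_j}{\max_j s_j - \min_j s_j}
```

Both forms map the best service of a type to 1 and the worst to 0; the simplified ratio-to-sum expressions used in the metric formulas preserve the same ordering.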
In particular, the thresholds α, β ∈ [0, 1] used in the invention are generally obtained from historical experience or set by experts in the relevant field.
The positive effects of the invention are as follows: starting from the two dimensions of service screening and service preference, the method establishes a service quality measurement index system covering the six aspects of functions, interfaces, data infrastructure, performance, network and user perception. The service quality evaluation process is further designed: services that do not meet the availability requirement are removed by screening evaluation, quality ranking of multiple services is realized by ranking evaluation, and user preference is supported. The method has complete indexes, reasonable index measurement methods and a highly operable evaluation process; it is easy for persons in the field to understand and accept, and provides an important decision reference for a user selecting among multiple services.
Drawings
FIG. 1 is a schematic diagram of modeling and simulation service quality measurement process oriented to weapon equipment simulation.
FIG. 2 is a schematic diagram of modeling and simulation QoS metric index system for weapon equipment simulation.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and detailed description.
As shown in FIG. 1, the measurement evaluation indexes of the modeling and simulation service quality measurement method for weapon equipment simulation fall into the two categories of service screening and service preference, covering the six aspects of functions, interfaces, data infrastructure, performance, network and user perception; service preference is carried out on the basis of ensuring service availability, and a comprehensive service quality metric value is given.
With reference to fig. 1, the modeling and simulation service quality measurement method for weapon equipment simulation of the present invention includes the following steps:
step 1: and establishing a modeling and simulation service quality measurement index system according to the service requirements.
As shown in FIG. 2, the metric index system is composed of service screening class indexes and service preference class indexes. The service screening class indexes include:
1) functional indexes, including availability, interoperability, extensibility, combinability and the like, where availability is determined by response time, failure time, service time and the like;
2) interface indexes, including reusability, standardization, ease of use and the like;
3) data infrastructure indexes, including throughput, redundancy, fault tolerance and the like.
The service preference class indexes include:
1) performance indexes, including accuracy, reliability, stability and the like;
2) network indexes, including security, time delay, packet loss rate and the like;
3) user perception indexes, including service price, creditworthiness, success rate and the like.
Step 2: and performing service screening evaluation on one or more current services from the service available layer.
Assume that the current number of services is N1. First, for the different services, respectively calculate the service screening class index values I_S = [I_S1, I_S2, ..., I_SN1], where I_Si (i = 1, 2, ..., N1) is a vector consisting of the screening class index values of the ith service. The service screening index calculation methods are as follows:
1) function index measuring method
a) Availability
Availability refers to the ability of a service to provide the required functions at a certain point in time or within a certain period of time, and is determined by response time, failure time, service time and other factors. The measurement method is

FA_i = ω_r · min(r^T)/r_i + ω_f · min(f^T)/f_i + ω_s · s_i/max(s^T)

In the formula, FA_i is the availability evaluation value of the ith service; [ω_r, ω_f, ω_s] are the weight coefficients of response time, failure time and service time, with ω_r + ω_f + ω_s = 1; r_i, f_i, s_i (i = 1, 2, ..., N) are respectively the average response time, average failure time and average service time of the ith service; r^T = [r_1, r_2, ..., r_N], f^T = [f_1, f_2, ..., f_N], s^T = [s_1, s_2, ..., s_N] are respectively the average response time, failure time and service time vectors of the same type of services, and N is the number of services of the same type. In particular, for cost-type indicators (the smaller the better), e.g. response time, the quality of service V_r is more strictly stated as

V_r = min(r^T)/r_i, i = 1, 2, ..., N

The metric expressions for the subsequent cost-type indicators are similar, but are simplified for brevity of expression. Similarly, for benefit-type indicators (the larger the better), e.g. service time, the quality of service V_s is more strictly stated as

V_s = s_i/max(s^T), i = 1, 2, ..., N

The same simplification is applied in the subsequent measurement formulas.
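The cost-type and benefit-type normalizations above can be sketched as follows (an illustrative sketch, not part of the patent text; function names are ours):

```python
def cost_score(x_i, values):
    """Cost-type indicator (smaller is better): min(values) / x_i."""
    return min(values) / x_i

def benefit_score(x_i, values):
    """Benefit-type indicator (larger is better): x_i / max(values)."""
    return x_i / max(values)

# Response time is cost-type, service time is benefit-type (values from Table 2).
r = [0.12, 0.15, 0.16, 0.10]
s = [302, 521, 396, 202]
v_r = cost_score(r[0], r)     # min(r)/r_1 = 0.10/0.12
v_s = benefit_score(s[0], s)  # s_1/max(s) = 302/521
```

Both scores fall in (0, 1], with 1 reached by the best service of the same type.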
b) Interoperability
Interoperability refers to the ability to share information between two or more services, including the ability to invoke other services and to be invoked by other services. The measurement method is

FI_i = ω_call · c_call,i/max(c_call) + ω_called · c_called,i/max(c_called)

In the formula, FI_i is the interoperability evaluation value of the ith service; [ω_call, ω_called] are the weight coefficients of invoking other services and being invoked by other services, with ω_call + ω_called = 1; c_call,i, c_called,i are respectively the numbers of times the ith service calls other services and is called by other services; c_call = [c_call,1, c_call,2, ..., c_call,N], c_called = [c_called,1, c_called,2, ..., c_called,N] are respectively the vectors of call counts to other services and call counts by other services, and N is the number of services of the same type.
c) Extensibility
Extensibility refers to the ability of a service to adapt to changing requirements and to be extended, through function inheritance, into new services. The measurement method is

FE_i = e_i/max(e)

In the formula, FE_i is the extensibility evaluation value of the ith service; e_i is the number of new services generated by extending the ith service; e = [e_1, e_2, ..., e_N] is the vector of the numbers of new services generated by extension, and N is the number of services of the same type.
d) Combinability
Combinability refers to the ability to form a new service by combining existing services. The measurement method is

FC_i = c_i/max(c)

In the formula, FC_i is the combinability evaluation value of the ith service; c_i is the number of combined services generated by the ith service; c = [c_1, c_2, ..., c_N] is the vector of the numbers of combined services, and N is the number of services of the same type.
2) Interface index measurement method
a) Reusability
Reusability refers to the ability of a service interface to be reused without or with little modification. The measurement method is

IR_i = r_i/max(r)

In the formula, IR_i is the reusability evaluation value of the ith service; r_i is the number of times the ith service is reused; r = [r_1, r_2, ..., r_N] is the vector of reuse counts, and N is the number of services of the same type.
b) Normative property
Normativity refers to the degree to which the content of the service interface is constrained and standardized. The interface design document is generally reviewed and scored against the scoring item standard, and the score is normalized as a benefit-type indicator. The measurement method is

IN_i = n_i/max(n)

In the formula, IN_i is the normativity evaluation value of the ith service; n_i is the normativity score of the ith service; n = [n_1, n_2, ..., n_N] is the vector of normativity scores, and N is the number of services of the same type.
c) Ease of use
Ease of use refers to how easily a service can be understood, learned and mastered by a user. The measurement method is

IU_i = u_i/max(u)

In the formula, IU_i is the ease-of-use evaluation value of the ith service; u_i is the number of times the ith service is used; u = [u_1, u_2, ..., u_N] is the vector of usage counts, and N is the number of services of the same type.
3) Data infrastructure index measuring method
a) Throughput capacity
Data throughput refers to the amount of data received by the modeling and simulation service per unit time, and can be measured in bits, bytes, data packets, etc. The measurement method is

DT_i = t_i/max(t)

In the formula, DT_i is the throughput evaluation value of the ith service; t_i is the data throughput of the ith service; t = [t_1, t_2, ..., t_N] is the data throughput vector, and N is the number of services of the same type.
b) Redundancy
Redundancy means that, to improve the robustness of the modeling and simulation service, its data are backed up in the data infrastructure so as to reduce the impact of faults on the service. The measurement method is

DR_i = r_i/max(r)

In the formula, DR_i is the redundancy evaluation value of the ith service; r_i is the number of data backups of the ith service; r = [r_1, r_2, ..., r_N] is the vector of data backup counts, and N is the number of services of the same type.
c) Fault tolerance
Fault tolerance refers to the ability of the system to restore normal operation through data backups on different storage devices when an error occurs. The measurement method is

DF_i = f_ri/f_bi

In the formula, DF_i is the fault tolerance evaluation value of the ith service; f_bi, f_ri are respectively the total number of faults and the number of faults recovered from.
Then, for each service, judge whether any index value is smaller than the service screening class measurement index threshold α; if so, reject the service, otherwise retain it. Assume the number of services finally retained is N2.
Finally, integrate the service screening class measurement indexes using the analytic hierarchy process, and calculate the screening evaluation values of the N2 services, E_S = [E_S1, E_S2, ..., E_SN2].
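The pre-screening and weighted aggregation of step 2 can be sketched as follows (an illustrative sketch; the two indicators and their weights are placeholders, and the index values are taken from Table 3, but the AHP weights themselves are not given in the patent):

```python
def screen_services(index_table, weights, alpha=0.6):
    """Pre-screen, then aggregate screening-class indexes.

    index_table: {service: {indicator: value in [0, 1]}}.
    A service is rejected if any indicator falls below alpha; the
    survivors get a weighted-sum score E_S (weights, e.g., from AHP).
    """
    scores = {}
    for svc, idx in index_table.items():
        if min(idx.values()) < alpha:
            continue  # rejected in pre-screening
        scores[svc] = sum(weights[name] * v for name, v in idx.items())
    return scores

# Two-indicator example: R15-C fails pre-screening (combinability 0.57 < 0.6).
table = {
    "R45-C": {"availability": 0.78, "combinability": 0.75},
    "R15-C": {"availability": 0.67, "combinability": 0.57},
}
weights = {"availability": 0.5, "combinability": 0.5}
es = screen_services(table, weights, alpha=0.6)
```

The returned dictionary corresponds to E_S restricted to the N2 retained services.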
Step 3: judge whether the current service is available; if so, go to step 4; otherwise, give a suggestion that the current service is unavailable and go to step 7.
Whether the current service is available is judged against a threshold β (0 < β < 1): if the screening evaluation result of a service satisfies E_Si ≥ β, the service is available; otherwise the service is unavailable. Assume the number of currently available services is N3.
Step 4: judge whether there are multiple current services; if so, go to step 5; otherwise, go to step 6.
And judging whether the currently available service number N3 is greater than 1, if so, turning to the step 5, and otherwise, turning to the step 6.
Step 5: from the service optimization level, perform preference ranking evaluation on the current service quality, and at the same time integrate the service screening evaluation values to give the service quality ranking.
Assume that the current number of services is N3. First, for the different services, respectively calculate the service preference class index values I_O = [I_O1, I_O2, ..., I_ON3], where I_Oi (i = 1, 2, ..., N3) is a vector consisting of the preference class index values of the ith service. The service preference index calculation methods are as follows:
1) performance index measurement method
a) Accuracy of
Accuracy refers to the exactness of the modeling and simulation service, measured by the degree of agreement between the service result and the expected result. The measurement method is

PA_i = min(a)/a_i

In the formula, PA_i is the accuracy evaluation value of the ith service; a_i = |r_ai − r_ei| is the error of the ith service, where r_ai, r_ei are respectively the actual and expected service results; a = [a_1, a_2, ..., a_N] is the error vector, and N is the number of services of the same type.
b) Reliability of
Reliability refers to the ability of a service to maintain a normal, correct operating state when used in a given environment, and can generally be measured by the normalized mean time between failures (MTBF). The measurement method is

PR_i = r_i/max(r)

In the formula, PR_i is the reliability evaluation value of the ith service; r_i is the mean time between failures of the ith service; r = [r_1, r_2, ..., r_N] is the MTBF vector, and N is the number of services of the same type.
c) Stability of
Stability refers to the ability of the service to continue normal operation when the operating environment changes. The measurement method is

PS_i = s_i/max(s)

In the formula, PS_i is the stability evaluation value of the ith service; s_i is the normal operation time of the ith service after the environment changes; s = [s_1, s_2, ..., s_N] is the vector of normal operation times after the environment changes, and N is the number of services of the same type.
2) Network index measuring method
a) Safety feature
Security means that the network and the software and hardware of its system are protected from damage, alteration and leakage due to accidental or malicious causes, so that reliable, normal and uninterrupted operation of the modeling and simulation service can be supported. The measurement method is

NS_i = s_i/max(s)

In the formula, NS_i is the security evaluation value of the ith service; s_i is the service time of the ith service; s = [s_1, s_2, ..., s_N] is the service time vector, and N is the number of services of the same type.
b) Time delay
Time delay refers to the time required to transmit a data packet from the network sender to the receiver, generally comprising transmission delay, propagation delay, processing delay, queuing delay, etc. The measurement method is

NT_i = min(t)/t_i

In the formula, NT_i is the delay evaluation value of the ith service; t_i is the delay of the ith service; t = [t_1, t_2, ..., t_N] is the service delay vector, and N is the number of services of the same type.
c) Packet loss rate
Packet loss rate refers to the proportion of data packets lost during the use of the modeling and simulation service relative to the total number of packets transmitted. The packet loss rate of a service is the average of the packet loss rates over multiple simulation tasks (each involving the modeling and simulation service). A comprehensive packet loss rate measurement method is given here:

NP_i = 1 − p_li/p_ti

In the formula, NP_i is the packet loss rate evaluation value of a single simulation task in the ith service; p_li, p_ti are respectively the total numbers of lost packets and transmitted packets during the single simulation task.
3) User perception index measuring method
a) Price of service
The service price is the fee that the service requester must pay to invoke the operation, and is generally set by the service provider. The measurement method is

UP_i = min(p)/p_i

In the formula, UP_i is the service price evaluation value of the ith service; p_i is the price of the ith service; p = [p_1, p_2, ..., p_N] is the service price vector, and N is the number of services of the same type.
b) Degree of credit
Credit degree refers to the user's degree of satisfaction with and trust in the service; users score the service against the scoring item standard based on their experience of using it. The measurement method is

UC_i = c_i/max(c)

In the formula, UC_i is the credit evaluation value of the ith service; c_i is the score of the ith service; c = [c_1, c_2, ..., c_N] is the service score vector, and N is the number of services of the same type.
c) Success rate
Success rate refers to the probability that a service is responded to correctly. The measurement method is

US_i = sc_i/sr_i

In the formula, US_i is the success rate evaluation value of the ith service; sr_i, sc_i are respectively the number of times the service is requested and the number of times it is responded to correctly.
Then, the service quality evaluation results of the screening class indexes of the N3 services are included as one indicator among the service preference class measurement indexes, with weight k (0 < k < 1).
Finally, the TOPSIS method is used to rank the services over these indexes, and the quality degree E_Ri of each service is calculated; the final quality ranking is determined by the magnitudes of the E_Ri values, and the process goes to step 7.
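The TOPSIS ranking used in step 5 can be sketched as follows (a minimal sketch assuming all indicator values are already normalized to benefit-type scores in [0, 1]; including the screening result as one extra weighted indicator is left to the caller):

```python
import math

def topsis(matrix, weights):
    """matrix[i][j]: normalized benefit-type score of service i on indicator j.
    Returns the closeness of each service to the ideal solution (higher = better).
    """
    n = len(weights)
    # weighted decision matrix
    v = [[weights[j] * row[j] for j in range(n)] for row in matrix]
    ideal = [max(col) for col in zip(*v)]  # positive ideal solution
    anti = [min(col) for col in zip(*v)]   # negative ideal solution
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)      # distance to the ideal
        d_neg = math.dist(row, anti)       # distance to the anti-ideal
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Example: three services on two equally weighted indicators
m = [[1.0, 1.0],   # best on both
     [0.0, 0.0],   # worst on both
     [0.5, 0.5]]   # in between
scores = topsis(m, [0.5, 0.5])
```

Sorting the services by descending score gives the quality ranking E_Ri of step 5.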
Step 6: from the service optimization level, perform service value evaluation on the current service quality, and at the same time integrate the service screening evaluation value to give the current service quality metric value.
First, for the current service, calculate the service preference class index values I_Ok;
then, determine the weights of the service preference class measurement indexes using the analytic hierarchy process;
finally, include the service quality evaluation result of the screening class indexes among the service preference measurement indexes, perform weighted summation, and calculate the evaluation value E_V of the current service.
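This weighted summation for a single service can be sketched as follows (the split, screening result with weight k and the preference indexes sharing the remaining 1 − k, is an assumed reading; the patent does not fix the exact combination, and all numbers below are illustrative):

```python
def service_value(e_s, pref, pref_weights, k=0.4):
    """Assumed sketch of E_V: screening evaluation e_s enters with weight k,
    the preference-class indexes share the remaining 1 - k."""
    total = sum(pref_weights.values())
    pref_score = sum(pref_weights[n] * v for n, v in pref.items()) / total
    return k * e_s + (1 - k) * pref_score

ev = service_value(
    0.91,                                    # screening evaluation E_S
    {"accuracy": 1.0, "reliability": 0.81},  # preference index values
    {"accuracy": 0.5, "reliability": 0.5},   # hypothetical AHP weights
)
```

A single available service is thus still given an absolute quality metric even when no ranking is possible.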
Step 7: the quality of service measurement process ends.
The present invention will be further described with reference to specific embodiments.
The weapon equipment combat simulation mainly comprises sensors, aircraft and other types of equipment. Among them, the aircraft are used for modeling and simulating missiles and comprise components such as a kinematics/dynamics model, a missile-target relative motion model and a guidance model, involving position/velocity component solution, proportional navigation design, etc. The missile parameters need to be solved, with differential equation solving as the main numerical simulation algorithm; the solving efficiency of the algorithm is affected by the programming language, packaging format, etc. adopted.
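The missile models mentioned above (kinematics, uniform target motion, missile-target relative motion, and proportional navigation) can be sketched as a small planar simulation (an illustrative sketch; the initial positions, speeds, time step and navigation constant K are arbitrary assumptions, not values from the patent):

```python
import math

def simulate_pn(K=4.0, V=300.0, VT=200.0, dt=0.001, t_max=60.0):
    """Planar pursuit with proportional navigation: theta_dot = K * q_dot."""
    x, y = 0.0, 0.0                      # missile position
    xT, yT = 5000.0, 3000.0              # target position
    sigmaT = math.pi                     # target course: straight line along -x
    theta = math.atan2(yT - y, xT - x)   # launch along the initial line of sight
    t, r_min, t_min = 0.0, float("inf"), 0.0
    while t < t_max:
        dx, dy = xT - x, yT - y
        r = math.hypot(dx, dy)           # missile-target range
        if r < r_min:
            r_min, t_min = r, t          # track closest approach
        if r < 1.0:
            break                        # intercept
        # line-of-sight rate q_dot from the relative velocity
        vxr = VT * math.cos(sigmaT) - V * math.cos(theta)
        vyr = VT * math.sin(sigmaT) - V * math.sin(theta)
        q_dot = (dx * vyr - dy * vxr) / (r * r)
        theta += K * q_dot * dt          # proportional navigation law
        x += V * math.cos(theta) * dt    # missile kinematics (Euler step)
        y += V * math.sin(theta) * dt
        xT += VT * math.cos(sigmaT) * dt # uniformly moving target
        yT += VT * math.sin(sigmaT) * dt
        t += dt
    return t_min, r_min
```

A run of this kind is exactly the workload whose differential-equation solving efficiency the embodiment evaluates below.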
For service quality evaluation in weapon equipment modeling and simulation, the efficiency of the differential equation solving algorithms involved needs to be the focus. Nine ordinary differential equation solving algorithm services are selected as analysis objects, and their service quality is evaluated to verify the effectiveness of the modeling and simulation service quality evaluation method proposed in this patent.
According to algorithm type, the nine algorithm services can be divided into three classes: the fourth/fifth-order Runge-Kutta algorithm RK45, the second/third-order Runge-Kutta algorithm RK23, and the first/fifth-order variable-step-size Runge-Kutta algorithm RK15. Each class of algorithm has three versions: a C++ function, a Matlab library function, and a C++ dynamic link library; a comparison is shown in Table 1. RK45 is a widely used method for solving non-stiff differential equations, with small calculation error but medium solving speed. RK23 is a non-stiff differential equation solving method with relatively fast calculation speed but larger error. RK15 is an algorithm for solving stiff differential equations and differential-algebraic equations.
TABLE 1 service comparison of nine ordinary differential equation solving algorithms
Step 1: continue to use the modeling and simulation service quality measurement index system in FIG. 2; no tailoring is needed.
Step 2: collect the raw data for each index calculation; the data used for availability evaluation are shown in Table 2.
TABLE 2 raw data for partial index calculation
R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-C | R15-M | R15-D | |
Response time ri/sec | 0.12 | 0.15 | 0.16 | 0.1 | 0.12 | 0.13 | 0.1 | 0.13 | 0.14 |
Time of failure fi/sec | 38 | 66 | 57 | 50 | 52 | 40 | 46 | 57 | 36 |
Service time si/sec | 302 | 521 | 396 | 202 | 241 | 168 | 209 | 262 | 169 |
Screening evaluation is carried out. Taking the functional availability index as an example, the weights are determined by the analytic hierarchy process from the data in Table 2 as ω_r = 0.2, ω_f = 0.4, ω_s = 0.4. According to the availability measurement method, the calculation process is as follows:
min(r^T) = 0.1, min(f^T) = 36, max(s^T) = 521
Taking the availability index calculation of R45-C as an example, r_1 = 0.12, f_1 = 38, s_1 = 302, giving FA_1 = 0.2 × (0.1/0.12) + 0.4 × (36/38) + 0.4 × (302/521) ≈ 0.78.
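This availability calculation can be checked numerically against Table 2 (a sketch; the weights and the min/max normalization follow the worked example above):

```python
# Availability of R45-C (i = 1) from the Table 2 data, weights 0.2 / 0.4 / 0.4.
w_r, w_f, w_s = 0.2, 0.4, 0.4
r = [0.12, 0.15, 0.16, 0.10, 0.12, 0.13, 0.10, 0.13, 0.14]  # response times
f = [38, 66, 57, 50, 52, 40, 46, 57, 36]                    # failure times
s = [302, 521, 396, 202, 241, 168, 209, 262, 169]           # service times

# cost-type terms use min(.)/x_i, the benefit-type term uses x_i/max(.)
fa_1 = w_r * min(r) / r[0] + w_f * min(f) / f[0] + w_s * s[0] / max(s)
print(round(fa_1, 2))  # 0.78, matching the availability entry for R45-C in Table 3
```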
Following the above steps, the screening class index values I_S = [I_S1, I_S2, ..., I_SN1] are calculated; the results are shown in Table 3. In this case, the screening class index threshold is α = 0.6; pre-screening finds that the combinability index of R15-C (0.57) and the fault tolerance index of R15-D (0.58) are below 0.6, and the other 7 services pass the pre-screening.
TABLE 3 calculation results of the filtered class bottom indicators
R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-C | R15-M | R15-D | |
Availability | 0.78 | 0.75 | 0.68 | 0.64 | 0.63 | 0.64 | 0.67 | 0.61 | 0.67 |
Interoperability | 0.8 | 0.75 | 0.68 | 0.76 | 0.77 | 0.65 | 0.72 | 0.68 | 0.63 |
Extensibility | 1 | 0.85 | 0.82 | 0.87 | 0.82 | 0.78 | 0.86 | 0.77 | 0.76 |
Combinability | 0.75 | 0.77 | 1 | 0.73 | 0.73 | 0.8 | 0.57 | 0.63 | 0.73 |
Reusability | 0.78 | 0.8 | 0.9 | 0.74 | 0.7 | 1 | 0.69 | 0.62 | 0.79 |
Normative property | 1 | 0.83 | 0.75 | 0.82 | 0.82 | 0.75 | 0.83 | 0.8 | 0.76 |
Ease of use | 1 | 0.85 | 0.77 | 0.84 | 0.84 | 0.77 | 0.85 | 0.82 | 0.78 |
Throughput capacity | 1 | 0.86 | 0.79 | 1 | 0.86 | 0.79 | 1 | 0.86 | 0.79 |
Redundancy | 1 | 0.84 | 0.75 | 1 | 0.84 | 0.75 | 1 | 0.84 | 0.75 |
Fault tolerance | 0.76 | 0.8 | 0.67 | 0.77 | 0.78 | 0.69 | 0.71 | 0.73 | 0.58 |
The analytic hierarchy process is used to comprehensively evaluate the service quality of the function, interface and data infrastructure indexes in turn, and then the overall screening class index service quality; the evaluation results are shown in Table 4.
TABLE 4 comprehensive evaluation results of screening indexes
R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-M | |
Function(s) | 0.84 | 0.78 | 0.81 | 0.77 | 0.75 | 0.73 | 0.68 |
Interface | 0.96 | 0.83 | 0.79 | 0.81 | 0.80 | 0.81 | 0.77 |
Data infrastructure | 0.92 | 0.83 | 0.74 | 0.92 | 0.83 | 0.74 | 0.81 |
Screening class index quality of service | 0.91 | 0.82 | 0.78 | 0.83 | 0.79 | 0.76 | 0.75 |
Step 3: according to the screening class index service quality evaluation results in Table 4, E_S = [E_S1, E_S2, ..., E_SN2] = [0.91, 0.82, 0.78, 0.83, 0.79, 0.76, 0.75]. All 7 current services exceed the threshold β = 0.6, so 7 services are available.
Step 4: judge the current number of services; since N3 = 7 > 1, the process goes to the preference evaluation step.
Step 5: preference evaluation is performed. First, the preference results are calculated according to each preference index measurement method, as shown in Table 5. Because the comprehensive mode of the TOPSIS method differs from that of the analytic hierarchy process, in order to take the screening results into account in the preference process, the screening class index service quality is included in the ranking with index weight k = 0.4, and the other preference indexes share a total weight of 0.6.
TABLE 5 evaluation results of preferred indices
R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-M | Weight of | |
Accuracy of | 1 | 1 | 1 | 0.78 | 0.78 | 0.78 | 0.85 | 0.08 |
Reliability of | 0.81 | 0.78 | 0.8 | 0.79 | 0.75 | 0.77 | 0.75 | 0.08 |
Stability of | 0.89 | 0.87 | 1 | 0.85 | 0.84 | 0.85 | 0.78 | 0.08 |
Safety feature | 0.83 | 0.78 | 1 | 0.83 | 0.78 | 1 | 0.78 | 0.04 |
Time delay | 1 | 0.88 | 0.9 | 0.89 | 0.85 | 0.87 | 0.85 | 0.04 |
Packet loss rate | 0.86 | 0.95 | 0.81 | 0.92 | 0.97 | 0.9 | 0.98 | 0.04 |
Price of service | 0.73 | 0.81 | 0.77 | 0.87 | 1 | 0.79 | 0.83 | 0.08 |
Degree of credit | 0.81 | 1 | 0.72 | 0.71 | 0.77 | 0.68 | 0.75 | 0.08 |
Success rate | 0.82 | 1 | 0.75 | 0.81 | 0.85 | 0.68 | 0.7 | 0.08 |
TOPSIS ranking calculations were performed; the evaluation and ranking results are shown in Table 6. According to the preference ranking result, the 7 differential equation solving services are ordered as R45-C, R45-M, R23-C, R23-M, R45-D, R15-M, R23-D, and users can select a solving service as required.
TABLE 6 preference evaluation and ranking results
R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-M | |
Evaluation results | 0.67 | 0.54 | 0.31 | 0.44 | 0.37 | 0.14 | 0.15 |
Sorting | 1 | 2 | 5 | 3 | 4 | 7 | 6 |
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed in a similar manner without departing from the spirit of the invention or exceeding the scope as defined in the appended claims.
Claims (7)
1. A modeling and simulation service quality measurement method for weapon equipment simulation is characterized by comprising the following steps:
step 1: carrying out simulation modeling for weapon equipment, and establishing a modeling and simulation service quality measurement index system according to service requirements;
step 2: performing service screening evaluation on one or more services currently from a service available layer;
step 3: judging whether the current service is available; if so, going to step 4, otherwise giving a suggestion that the current service is unavailable and going to step 7;
step 4: judging whether the number of current services is two or more; if so, going to step 5, otherwise going to step 6;
step 5: from the service optimization level, performing preference ranking evaluation on the current service quality based on the preference indexes, meanwhile integrating the service screening evaluation values, giving the service quality ranking, and going to step 7;
step 6: from the service optimization level, performing service value evaluation on the current service quality based on the preference indexes, meanwhile integrating the service screening evaluation value, and giving the current service quality metric value;
step 7: ending the quality of service measurement process.
2. The method of claim 1, wherein: in step 1, the simulation modeling for weapon equipment specifically comprises:
establishing a kinematic model of the missile:

dx/dt = V_x = V cos θ, dy/dt = V_y = V sin θ

dV_x/dt = −n g sin q, dV_y/dt = n g cos q

in the formula, x and y are respectively the abscissa and ordinate of the missile position, and dx/dt and dy/dt are their rates of change; V_x and V_y are the components of the missile speed V along the abscissa and ordinate directions, and dV_x/dt and dV_y/dt are their rates of change; n is the missile overload perpendicular to the missile-target line of sight, g is the gravitational acceleration, q is the target azimuth angle, and θ is the missile trajectory angle;
setting the target to move uniformly in the attack plane:

dx_T/dt = V_T cos σ_T, dy_T/dt = V_T sin σ_T

in the formula, x_T and y_T are respectively the abscissa and ordinate of the target position, and dx_T/dt and dy_T/dt are their rates of change; V_T is the target speed and σ_T is the target course angle;
establishing a missile-target relative motion model:

dr/dt = V_T cos η_T − V cos η

r · dq/dt = V sin η − V_T sin η_T

in the formula, r is the relative distance between the missile and the target, and dr/dt and dq/dt are respectively the rates of change of r and q; η is the missile velocity vector lead angle, η_T is the target velocity vector lead angle, and σ_T is the target course angle;
selecting the proportional navigation law and constructing the missile guidance model:

dθ/dt = K · dq/dt

in the formula, K is the proportional navigation coefficient.
3. The method of claim 2, wherein: in step 1, the service requirement refers to the requirement of a user on the aspects of functions, performances and interfaces of the modeling and simulation service; the modeling and simulation service quality measurement index system comprises two types of measurement indexes of service screening and service optimization, wherein the service screening type measurement indexes comprise:
functional indicators including availability, interoperability, extensibility, combinability, wherein availability is determined by response time, failure time, service time;
interface indexes including reusability, normalization and usability;
data infrastructure metrics including throughput, redundancy, fault tolerance;
the service preference class metric includes:
performance indicators, including accuracy, reliability, stability;
network indexes including security, time delay and packet loss rate;
and the user perception indexes comprise service price, credit degree and success rate.
4. The method of claim 3, wherein: the step 2 comprises the following steps:
setting the current number of services as N1, and calculating the service screening class index values I_S = [I_S1, I_S2, ..., I_SN1] for the different services, where I_Si (i = 1, 2, ..., N1) is a vector consisting of the screening class index values of the ith service; the service screening index calculation methods are as follows:
(1) the function index measuring method comprises the following steps:
availability: availability refers to the ability of the service to provide required functions at a certain point in time or within a certain period of time, and is determined by response time, failure time and service time; the measurement method is:

FA_i = ω_r · min(r^T)/r_i + ω_f · min(f^T)/f_i + ω_s · s_i/max(s^T)

in the formula, FA_i is the availability evaluation value of the ith service; [ω_r, ω_f, ω_s] are respectively the weight coefficients of response time, failure time and service time, with ω_r + ω_f + ω_s = 1; r_i, f_i, s_i are the average response time, average failure time and average service time of the ith service, i = 1, 2, ..., N, where N is the number of services of the same type; r^T = [r_1, r_2, ..., r_N], f^T = [f_1, f_2, ..., f_N], s^T = [s_1, s_2, ..., s_N] are respectively the average response time, failure time and service time vectors of the same type of services;
interoperability: interoperability refers to the ability of two or more services to share information, including the ability to invoke other services and to be invoked by other services; the measurement method is:

FI_i = ω_call · c_call,i/max(c_call) + ω_called · c_called,i/max(c_called)

in the formula, FI_i is the interoperability evaluation value of the ith service; [ω_call, ω_called] are respectively the weight coefficients of invoking other services and being invoked by other services, with ω_call + ω_called = 1; c_call,i, c_called,i are respectively the numbers of times the ith service calls other services and is called by other services; c_call = [c_call,1, c_call,2, ..., c_call,N], c_called = [c_called,1, c_called,2, ..., c_called,N] are respectively the vectors of call counts to other services and call counts by other services;
extensibility: extensibility refers to the ability of the service to adapt to changing requirements and to be extended, through function inheritance, into new services; the measurement method is:

FE_i = e_i/max(e)

in the formula, FE_i is the extensibility evaluation value of the ith service; e_i is the number of new services generated by extending the ith service; e = [e_1, e_2, ..., e_N] is the vector of the numbers of new services generated by extension;
combinability: combinability refers to the ability to form a new service by combining existing services; the measurement method is:

FC_i = c_i/max(c)

in the formula, FC_i is the combinability evaluation value of the ith service; c_i is the number of combined services generated by the ith service; c = [c_1, c_2, ..., c_N] is the vector of combined service counts;
(2) the interface index measurement method comprises the following steps:
reusability: reusability refers to the ability of a service interface to be reused without or with little modification; the measurement method is:

IR_i = r_i/max(r)

in the formula, IR_i is the reusability evaluation value of the ith service; r_i is the number of times the ith service is reused; r = [r_1, r_2, ..., r_N] is the vector of reuse counts;
normativity: normativity refers to the degree to which the content of the service interface is constrained and standardized; the measurement method is:

IN_i = n_i/max(n)

in the formula, IN_i is the normativity evaluation value of the ith service; n_i is the normativity score of the ith service; n = [n_1, n_2, ..., n_N] is the vector of normativity scores;
ease of use: ease of use refers to how easily the service can be understood, learned and mastered by the user; the measurement method is:

IU_i = u_i/max(u)

in the formula, IU_i is the ease-of-use evaluation value of the ith service; u_i is the number of times the ith service is used; u = [u_1, u_2, ..., u_N] is the vector of usage counts;
(3) the data infrastructure index measurement method comprises the following steps:
throughput: data throughput refers to the amount of data received by the modeling and simulation service per unit time; the measurement method is:

DT_i = t_i/max(t)

in the formula, DT_i is the throughput evaluation value of the ith service; t_i is the data throughput of the ith service; t = [t_1, t_2, ..., t_N] is the data throughput vector;
redundancy: redundancy means that, to improve the robustness of the modeling and simulation service, its data are backed up in the data infrastructure so as to reduce the impact of faults on the service; the measurement method is:

DR_i = r_i/max(r)

in the formula, DR_i is the redundancy evaluation value of the ith service; r_i is the number of data backups of the ith service; r = [r_1, r_2, ..., r_N] is the vector of data backup counts;
fault tolerance: fault tolerance refers to restoring the normal capability of the system through data backups on different storage devices when an error occurs; the measurement method is:

DF_i = f_ri/f_bi

in the formula, DF_i is the fault tolerance evaluation value of the ith service; f_bi, f_ri are respectively the total number of faults and the number of faults recovered from;
judging, for each service, whether any index value is smaller than the service screening class measurement index threshold α; if so, rejecting the service, otherwise retaining the service; the number of services finally retained is set to N2;
and finally, integrating the service screening class measurement indexes using the analytic hierarchy process, and calculating the screening evaluation values of the N2 services, E_S = [E_S1, E_S2, ..., E_SN2], where E_Si (i = 1, 2, ..., N2) is the screening evaluation value of the ith service.
5. The method of claim 4, wherein: in step 3, if a service screening evaluation result is greater than or equal to the threshold β, the service is available, otherwise the service is unavailable.
6. The method of claim 5, wherein: the step 5 comprises the following steps: setting the current number of services as N3; first, calculating the service preference class index values I_O = [I_O1, I_O2, ..., I_ON3] for the different services, where I_Oi (i = 1, 2, ..., N3) is a vector consisting of the preference class index values of the ith service; the service preference index calculation methods are as follows:
(1) performance index measurement method
accuracy: accuracy refers to the exactness of the modeling and simulation service; the measurement method is:

PA_i = min(a)/a_i

in the formula, PA_i is the accuracy evaluation value of the ith service; a_i = |r_ai − r_ei| is the error of the ith service, where r_ai, r_ei are respectively the actual and expected service results; a = [a_1, a_2, ..., a_N] is the error vector;
reliability: reliability refers to the ability of the service to maintain a normal, correct operating state when used in a given environment, measured by the normalized mean time between failures; the measurement method is:

PR_i = r_i/max(r)

in the formula, PR_i is the reliability evaluation value of the ith service; r_i is the mean time between failures of the ith service; r = [r_1, r_2, ..., r_N] is the mean time between failures vector;
stability: the stability refers to the capability of the service to continue normal operation when the operation environment changes, and the measurement method comprises the following steps:
In the formula, PSi is the stability evaluation value of the ith service; si is the time for which the ith service continues to operate normally after the environment changes; s = [s1, s2, …, sN] is the vector of normal operating times after an environment change;
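The evaluation values PAi, PRi, and PSi are defined only by formula drawings in the source. One plausible construction, sketched here purely as an assumption, is min-max normalization of the raw vectors: cost-type for the deviation ai (smaller is better) and benefit-type for ri and si (larger is better).

```python
# Hedged sketch: the patent's formulas are drawings; min-max scaling of the
# raw vectors is one plausible reading, used here only as an assumption.

def normalize_benefit(v):
    """Larger raw value -> evaluation closer to 1 (e.g. MTBF r_i, stability s_i)."""
    lo, hi = min(v), max(v)
    return [(x - lo) / (hi - lo) if hi > lo else 1.0 for x in v]

def normalize_cost(v):
    """Smaller raw value -> evaluation closer to 1 (e.g. deviation a_i = |r_ai - r_ei|)."""
    lo, hi = min(v), max(v)
    return [(hi - x) / (hi - lo) if hi > lo else 1.0 for x in v]

a = [0.02, 0.10, 0.05]    # accuracy deviations |r_ai - r_ei| (assumed sample values)
r = [120.0, 300.0, 80.0]  # mean time between failures, hours (assumed sample values)
PA = normalize_cost(a)    # smallest deviation maps to 1
PR = normalize_benefit(r) # largest MTBF maps to 1
```

Either way, the scaled values land in [0, 1], which is what the later AHP weighting and TOPSIS ranking steps require.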
(2) network index measuring method
Security: security means that the software and hardware of the network and its systems are protected from damage, alteration, and leakage caused by accidental or malicious actions, so that reliable, normal, and uninterrupted operation of the modeling and simulation service can be sustained; the measurement method is:
In the formula, NSi is the security evaluation value of the ith service; si is the service time of the ith service; s = [s1, s2, …, sN] is the service-time vector;
time delay: the time delay refers to the time required for transmitting a data packet from a network transmitting end to a receiving end, and the measurement method comprises the following steps:
In the formula, NTi is the delay evaluation value of the ith service; ti is the delay of the ith service; t = [t1, t2, …, tN] is the service-delay vector;
Packet loss rate: the packet loss rate refers to the proportion of data packets lost while the modeling and simulation service is in use; the measurement method is:
In the formula, NPi is the packet-loss-rate evaluation value of a single simulation task of the ith service; pli and pti are the number of packets lost and the number of packets transmitted during the single simulation task, respectively;
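The definitions above suggest NPi is the ratio of lost to transmitted packets in one simulation task; since the formula itself appears only as a drawing in the source, the ratio form below is an assumption.

```python
def packet_loss_rate(p_l, p_t):
    """NP_i for one simulation task: lost packets / transmitted packets.
    The ratio form is assumed; the source formula is a drawing."""
    return p_l / p_t if p_t else 0.0

# e.g. 12 packets lost out of 4000 transmitted during one task
NP = packet_loss_rate(12, 4000)
```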
(3) user perception index measuring method
Service price: the service price measures the cost that the service requester must pay when invoking the operation; the measurement method is:
In the formula, UPi is the service-price evaluation value of the ith service; pi is the price of the ith service; p = [p1, p2, …, pN] is the service-price vector;
Credibility: the credibility refers to the user's degree of satisfaction with and trust in the service; the measurement method is:
In the formula, UCi is the credibility evaluation value of the ith service; ci is the user score of the ith service; c = [c1, c2, …, cN] is the service-score vector;
Success rate: the success rate refers to the probability that a service invocation is correctly responded to; the measurement method is:
In the formula, USi is the success-rate evaluation value of the ith service; sri and sci are the number of requested service invocations and the number of correctly responded invocations, respectively;
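The three user-perception indexes can be sketched together. The normalizations here are assumptions (the source formulas are drawings): min-max cost scaling for price, scaling by an assumed 5-point maximum score for credibility, and the natural ratio sc_i / sr_i for success rate.

```python
# Sketch of the user-perception indexes; all normalization choices are assumptions.

def price_evaluation(p):
    """UP: cheaper service -> evaluation closer to 1 (min-max cost normalization)."""
    lo, hi = min(p), max(p)
    return [(hi - x) / (hi - lo) if hi > lo else 1.0 for x in p]

def credibility_evaluation(c, c_max=5.0):
    """UC: user score c_i scaled by the maximum possible score (5-point scale assumed)."""
    return [x / c_max for x in c]

def success_rate(sr, sc):
    """US_i: correctly responded invocations sc_i over requested invocations sr_i."""
    return sc / sr if sr else 0.0

UP = price_evaluation([10.0, 25.0, 40.0])     # assumed prices
UC = credibility_evaluation([4.5, 3.0, 5.0])  # assumed user scores
US = [success_rate(200, 190), success_rate(50, 49)]
```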
then, the screening-class service-quality evaluation results of the N3 services are taken as one additional index and included among the service-preference measurement indexes, with weight k, where 0 < k < 1;
and finally, ranking by the TOPSIS method and calculating the quality degree ERi of each service; the final quality ranking is determined by ordering the ERi values, and the process goes to step 7.
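The TOPSIS ranking step can be sketched as follows. The decision matrix, the weights, and the treatment of all criteria as benefit-type are illustrative assumptions; the patent does not fix these values.

```python
import math

def topsis(matrix, weights):
    """Closeness coefficient E_Ri of each alternative to the ideal solution.
    All criteria are treated as benefit-type (an assumption for this sketch)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each column, then apply the criterion weights
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    ideal = [max(v[i][j] for i in range(m)) for j in range(n)]   # best per criterion
    anti = [min(v[i][j] for i in range(m)) for j in range(n)]    # worst per criterion
    def dist(row, ref):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(row, ref)))
    # E_Ri in [0, 1]; larger means closer to the ideal service
    return [dist(v[i], anti) / (dist(v[i], ideal) + dist(v[i], anti)) for i in range(m)]

# three services x three preference indexes (values assumed for illustration)
scores = [[0.9, 0.7, 0.95],
          [0.6, 0.9, 0.80],
          [0.4, 0.5, 0.60]]
E_R = topsis(scores, [0.5, 0.3, 0.2])
ranking = sorted(range(len(E_R)), key=lambda i: E_R[i], reverse=True)
```

Sorting the services by descending E_R gives the final quality ordering described above.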
7. The method of claim 6, wherein step 6 comprises: first, calculating the service-preference index value IOk for the current service;
then, determining the weights of the service-preference measurement indexes by an analytic hierarchy process;
and finally, incorporating the service-quality evaluation result of the screening indexes into the service-preference measurement indexes, performing a weighted summation, and calculating the evaluation value EV of the current service.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010920457.XA CN112182848B (en) | 2020-09-04 | 2020-09-04 | Modeling and simulation service quality measurement method for weapon equipment simulation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112182848A true CN112182848A (en) | 2021-01-05 |
CN112182848B CN112182848B (en) | 2023-08-01 |
Family
ID=73925480
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010920457.XA Active CN112182848B (en) | 2020-09-04 | 2020-09-04 | Modeling and simulation service quality measurement method for weapon equipment simulation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112182848B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102136034A (en) * | 2011-03-18 | 2011-07-27 | 北京航空航天大学 | Military aircraft reliability quantitative requirement demonstration method |
CN105930645A (en) * | 2016-04-18 | 2016-09-07 | 中国人民解放军重庆通信学院 | Communication station equipment maintenance support capability assessment method based on principal component analysis |
CN108615122A (en) * | 2018-05-11 | 2018-10-02 | 北京航空航天大学 | A kind of air-defense anti-missile system combat capability assessment method |
CN109190143A (en) * | 2018-07-11 | 2019-01-11 | 北京晶品镜像科技有限公司 | A kind of network-enabled intelligent ammunition multi-scheme appraisal procedure based on operation l-G simulation test |
CN109472494A (en) * | 2018-11-12 | 2019-03-15 | 中国人民解放军火箭军工程大学 | A kind of command and control system service guarantee effectiveness assessment index quantitative model |
CN110069815A (en) * | 2019-03-14 | 2019-07-30 | 中科恒运股份有限公司 | Index system construction method, system and terminal device |
Non-Patent Citations (2)
Title |
---|
Li Shiyong et al., "Intelligent Adaptive Guidance Law for Intelligently Guided Missiles", 31 December 2011 *
Zhao Shenghui et al., "A Survey of QoS Research on SOA", Computer Science *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115208808A (en) * | 2022-09-14 | 2022-10-18 | 北京智芯微电子科技有限公司 | Service quality testing method and device, chip equipment and storage medium |
CN115208808B (en) * | 2022-09-14 | 2023-01-24 | 北京智芯微电子科技有限公司 | Service quality testing method and device, chip equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||