CN112182848B - Modeling and simulation service quality measurement method for weapon equipment simulation - Google Patents
- Publication number
- CN112182848B (application CN202010920457.XA)
- Authority
- CN
- China
- Prior art keywords
- service
- simulation
- services
- index
- ith
- Prior art date
- Legal status
- Active
Classifications
- G06F30/20 — Design optimisation, verification or simulation (under G06F30/00, Computer-aided design [CAD])
- G06Q10/06395 — Quality analysis or management (under G06Q10/0639, Performance analysis of enterprise or organisation operations)
- G06Q50/26 — Government or public services (under G06Q50/10, Services)
- G06F2111/04 — Constraint-based CAD
- G06F2111/10 — Numerical modelling
- G06F2119/14 — Force analysis or force optimisation, e.g. static or dynamic forces
Abstract
The invention discloses a modeling and simulation service quality measurement method for weapon equipment simulation. A quality metric index system is established on two levels, service screening and service optimization. Screening evaluation is first performed on aspects such as functions, interfaces, and data infrastructure to ensure service availability. If several services pass the screening evaluation, a preference ranking evaluation covering performance, network, and user-perception aspects is carried out, and a priority ranking of the services is produced; if only a single service passes the screening evaluation, a preference value evaluation based on the preference indexes is carried out to give a comprehensive service quality measurement value; otherwise, the method reports that no available service was found. The invention can effectively solve the problems of evaluating and selecting modeling and simulation services by quality for weapon equipment simulation.
Description
Technical Field
The invention relates to the field of computer simulation and evaluation, in particular to a modeling and simulation service quality measurement method for weapon equipment simulation.
Background
After more than half a century of development, modeling and simulation technology has been widely used in fields such as aerospace, electronics, shipbuilding, and manufacturing. To meet the urgent requirements of distributed simulation, the United States Department of Defense proposed the concept of advanced distributed simulation, and simulation architectures have gone through development stages such as SIMNET, DIS, ALSP, HLA, and TENA. However, current modeling and simulation architectures can hardly meet increasingly complex and accurate weapon equipment simulation application requirements. On the one hand, the modeled objects are more and more complex, so constructing a complex simulation system is very laborious, and existing simulation architectures make it difficult to reuse and interoperate existing architecture resources; on the other hand, realizing rapid composable simulation and optimizing the system design scheme or process to cope with rapidly iterating demands is the key to improving design quality. Existing simulation architectures have shortcomings in interconnection, extensibility, and the like, so iterative upgrading of the modeling and simulation architecture is needed to improve weapon equipment modeling and simulation capability.
Emerging information technologies such as service-oriented architecture and cloud computing make it possible to construct a generalized, extensible, and reusable modeling and simulation architecture. The NATO Modelling and Simulation Group MSG-131 innovatively proposed the concept of MSaaS (Modelling and Simulation as a Service); in 2014, MSG-136 (MSaaS: rapid deployment of interoperable and credible simulation environments) was established on the basis of the MSG-131 results, carrying out work on the research, proposal, and evaluation of MSaaS standards, protocols, architecture, execution, and cost-effectiveness analysis. By 2017, the group had completed MSaaS concept development and evaluation. In addition, on 29 November 2017, SEA (a Cohort company) released the results of the Defence Science and Technology Laboratory's simulation system architecture, interoperability, and management research project. MSaaS in fields such as simulation, test and evaluation, and simulation-based acquisition helps users save cost, improve efficiency, and obtain better overall solutions, and is becoming an important method in simulation delivery strategy.
Quality of service (QoS) refers to "the collective effect of service performance, which determines the degree of satisfaction of a user of the service". Under the new architecture oriented to weapon equipment simulation, evaluating the quality of a modeling and simulation service is, in fact, evaluating the ability of the service to meet the user's needs. When multiple services are available for selection, how the user selects the optimal service becomes important. In this regard, related research on cloud computing quality-of-service evaluation methods has been carried out in the literature: Hangzhong identified seven important characteristics of cloud computing services from the two angles of core services and support services; Li Kaiman et al. proposed a quality evaluation model covering service availability, reliability, efficiency, readability, and the like for Web services in cloud computing; Rogowski et al. studied an optimal pricing model for cloud computing services based on quality-of-service guarantees. However, no literature has yet been retrieved on how to measure the quality of service of modeling and simulation oriented to weapon equipment simulation.
Disclosure of Invention
The invention aims to provide a modeling and simulation service quality measurement method for weapon equipment simulation, which can effectively solve the problems of modeling and simulation service quality evaluation and selection.
The specific technical scheme of the invention is as follows:
a modeling and simulation service quality measurement method for weapon equipment simulation comprises the following steps:
step 1: establishing a modeling and simulation service quality measurement index system according to service requirements;
step 2: performing service screening evaluation on one or more current services from a service availability level;
step 3: judging whether the current service is available or not, if yes, turning to step 4, otherwise, giving a suggestion that the current service is unavailable, and turning to step 7;
step 4: judging whether more than one service is available; if so, turning to step 5, otherwise, turning to step 6;
step 5: from the service preference level, based on preference indexes, carrying out preference ranking evaluation on the current service quality, synthesizing service screening evaluation values, giving out service quality ranking, and turning to step 7;
step 6: from the service preference level, based on preference index, evaluating the service value of the current service quality, and simultaneously comprehensively screening the evaluation value to give the current service quality measurement value;
step 7: the quality of service measurement procedure ends.
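The seven steps above can be sketched end to end. The index names, thresholds, and the weighted aggregation rule in the sketch below are illustrative assumptions (the patent defines its exact formulas only later, and partly in figures), and the ranking and single-service valuation of steps 5 and 6 are collapsed into one scoring function for brevity:

```python
from dataclasses import dataclass
from typing import Dict, List, Union

# Hypothetical service record; all index values are assumed to be
# pre-normalised into [0, 1], higher is better.
@dataclass
class Service:
    name: str
    screening: Dict[str, float]   # e.g. availability, reusability, throughput
    preference: Dict[str, float]  # e.g. accuracy, delay, reputation

ALPHA = 0.3  # per-index screening threshold (illustrative)
BETA = 0.5   # aggregate screening threshold (illustrative)

def screen(svc: Service, w_s: Dict[str, float]) -> float:
    """Steps 2-3: reject if any screening index falls below ALPHA,
    otherwise return the weighted screening evaluation value."""
    if any(v < ALPHA for v in svc.screening.values()):
        return 0.0
    return sum(w_s[k] * v for k, v in svc.screening.items())

def measure(services: List[Service], w_s, w_p, k=0.4) -> Union[str, List[str]]:
    """Steps 4-6: rank the available services; the screening value is
    folded into the preference score with weight k (0 < k < 1)."""
    scores = {s.name: screen(s, w_s) for s in services}
    available = [s for s in services if scores[s.name] >= BETA]
    if not available:
        return "no available service"   # step 3, negative branch
    def quality(s: Service) -> float:
        pref = sum(w_p[i] * v for i, v in s.preference.items())
        return k * scores[s.name] + (1 - k) * pref
    return [s.name for s in sorted(available, key=quality, reverse=True)]
```

A single surviving service simply yields a one-element ranking, which plays the role of the step-6 comprehensive value.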
The modeling for weapon equipment simulation is specified as follows:
Weapon equipment combat simulation mainly involves equipment types such as sensors and aircraft, where an aircraft model comprises a kinematics/dynamics model, a missile-target relative motion model, a guidance model, and the like, whose parameters need to be solved, e.g. position/velocity component solving and proportional navigation design. Specifically, taking a missile attacking a moving target as an example, the kinematic/dynamic model of the missile is established:
dx/dt = V_x,  dy/dt = V_y,  dV_x/dt = −n·g·sin q,  dV_y/dt = n·g·cos q
where x and y are the abscissa and ordinate of the missile position and dx/dt, dy/dt their rates of change; V_x and V_y are the components of the missile speed V along the horizontal and vertical axes and dV_x/dt, dV_y/dt their rates of change; n is the missile overload perpendicular to the missile-target line of sight, g is the gravitational acceleration, q is the target line-of-sight azimuth angle, and θ is the missile trajectory angle.
The target is assumed to move uniformly in the attack plane:
dx_T/dt = V_T·cos σ_T,  dy_T/dt = V_T·sin σ_T
where x_T and y_T are the abscissa and ordinate of the target position and dx_T/dt, dy_T/dt their rates of change; V_T is the target speed and σ_T the target heading angle.
The missile-target relative motion model is established as:
dr/dt = V_T·cos η_T − V·cos η,  r·dq/dt = V·sin η − V_T·sin η_T
where r is the relative distance between the missile and the target and dr/dt, dq/dt the actual rates of change of r and q; η = q − θ is the missile velocity-vector lead angle and η_T = q − σ_T is the target velocity-vector lead angle, with σ_T the target heading angle.
A proportional navigation law is selected to construct the missile guidance model:
dθ/dt = K·dq/dt
where K is the proportional navigation coefficient.
The above models require solving for parameters such as V_x, V_y, x_T, y_T, and K; each related solving algorithm is regarded as a service.
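Treating the solving algorithms as services, the model can be exercised numerically. The sketch below assumes the standard planar proportional-navigation form of the equations above; the initial positions, speeds, and the gain K = 3 are illustrative, and plain Euler integration stands in for whatever solver a real service would use:

```python
import math

def simulate_pn(K: float = 3.0, dt: float = 0.001, t_max: float = 40.0) -> float:
    """Euler-step sketch of the planar engagement; returns the closest
    missile-target distance reached (the miss distance)."""
    x, y, V, theta = 0.0, 0.0, 600.0, 0.0                  # missile state
    xT, yT, VT, sigmaT = 10000.0, 3000.0, 300.0, math.pi   # target, uniform motion
    t, r_min = 0.0, float("inf")
    while t < t_max:
        rx, ry = xT - x, yT - y
        r = math.hypot(rx, ry)
        r_min = min(r_min, r)
        if r < 1.0:
            break                                  # effectively intercepted
        q = math.atan2(ry, rx)                     # line-of-sight azimuth angle
        eta, etaT = q - theta, q - sigmaT          # lead angles
        # relative motion model: r * dq/dt = V*sin(eta) - VT*sin(etaT)
        qdot = (V * math.sin(eta) - VT * math.sin(etaT)) / r
        theta += K * qdot * dt                     # guidance law: dtheta/dt = K*dq/dt
        x += V * math.cos(theta) * dt              # missile kinematics
        y += V * math.sin(theta) * dt
        xT += VT * math.cos(sigmaT) * dt           # target kinematics
        yT += VT * math.sin(sigmaT) * dt
        t += dt
    return r_min
```

The point of the sketch is only that the kinematic, relative-motion, and guidance models compose into one solvable loop, i.e. the parameter-solving services are mutually invocable.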
In step 1, the service requirement refers to the requirement of a user on functions, performances and interfaces of modeling and simulation services; the modeling and simulation service quality measurement index system comprises two kinds of measurement indexes of service screening and service optimization, wherein the service screening type measurement indexes comprise:
functional indicators including availability, interoperability, scalability, combinability, wherein availability is determined jointly by response time, failure time, service time;
interface indexes including reusability, standardability, and ease of use;
data infrastructure metrics including throughput, redundancy, fault tolerance;
the service preference class metrics include:
performance indexes including accuracy, reliability, stability;
network indexes including security, time delay and packet loss rate;
the user perceives indexes including service price, credibility and success rate.
The step 2 comprises the following steps:
setting the current service number as N1, and respectively calculating service screening class index values I of different services S =[I S1 ,I S2 ,...,I SN1 ]Wherein I Si A vector of index values for each screening class for the i-th service, i=1, 2, N1; the service screening class index calculation method comprises the following steps:
(1) The function index measuring method comprises the following steps:
Availability: availability refers to the ability of a service to provide the desired function at a certain point in time or during a period of time; it is determined by the response time, failure time, and service time, and the measurement method is as follows:
where F_Ai is the availability evaluation value of the i-th service; [ω_r, ω_f, ω_s] are the weight coefficients of response time, failure time, and service time, with ω_r + ω_f + ω_s = 1; r_i, f_i, s_i (i = 1, 2, ..., N) are the average response time, average failure time, and average service time of the i-th service; r^T = [r_1, r_2, ..., r_N], f^T = [f_1, f_2, ..., f_N], s^T = [s_1, s_2, ..., s_N] are the average response time, failure time, and service time vectors of services of the same type;
interoperability: interoperability refers to the ability to share information between two or more services, including the ability to invoke and be invoked by other services, and measures are:
where F_Ii is the interoperability evaluation value of the i-th service; [ω_call, ω_called] are the weight coefficients for calling other services and being called by other services, with ω_call + ω_called = 1; c_call,i and c_called,i are the numbers of times the i-th service calls other services and is called by other services, respectively; c_call = [c_call,1, c_call,2, ..., c_call,N] and c_called = [c_called,1, c_called,2, ..., c_called,N] are the corresponding count vectors;
scalability: the expandability refers to the capability of the service to adapt to the change of the demand, and the new service is generated by expanding through function inheritance, and the measurement method is as follows:
where F_Ei is the expandability evaluation value of the i-th service; e_i is the number of new services generated by extending the i-th service; e = [e_1, e_2, ..., e_N] is the vector of numbers of new services generated by extension;
combinability: combinability refers to the ability to form new services by combining existing services, and the measurement method is as follows:
where F_Ci is the combinability evaluation value of the i-th service; c_i is the number of combined services generated from the i-th service; c = [c_1, c_2, ..., c_N] is the vector of numbers of combined services;
(2) The interface index measurement method comprises the following steps:
reusability: reusability refers to the ability of a service interface to be reused without modification or with little modification, and is measured by:
where I_Ri is the reusability evaluation value of the i-th service; r_i is the number of times the i-th service has been reused; r = [r_1, r_2, ..., r_N] is the reuse-count vector;
Normalization: normalization refers to the degree to which the service interface content is constrained and standardized, and the measurement method is as follows:
where I_Ni is the normalization evaluation value of the i-th service; n_i is the normalization degree of the i-th service; n = [n_1, n_2, ..., n_N] is the normalization-degree vector;
ease of use: the usability refers to the ease with which the service can be understood, learned and mastered by the user, and the measurement method is as follows:
where I_Ui is the ease-of-use evaluation value of the i-th service; u_i is the number of times the i-th service has been used; u = [u_1, u_2, ..., u_N] is the usage-count vector;
(3) Data infrastructure index measurement method:
throughput: the data throughput refers to the quantity of data of modeling and simulation service received in unit time, and the measurement method is as follows:
where D_Ti is the throughput evaluation value of the i-th service; t_i is the data throughput of the i-th service; t = [t_1, t_2, ..., t_N] is the data-throughput vector;
redundancy: redundancy means that in order to improve the robustness of modeling and simulation services, the data of the modeling and simulation services are backed up in a data infrastructure, so that the influence of faults on the modeling and simulation services is reduced, and the measurement method is as follows:
where D_Ri is the redundancy evaluation value of the i-th service; r_i is the number of data backups of the i-th service; r = [r_1, r_2, ..., r_N] is the data-backup-count vector;
fault tolerance: fault tolerance is also called fault tolerance, and refers to the ability of recovering the system from data backup of different storage devices when an error occurs, and the measurement method is as follows:
D_Fi = f_ri / f_bi
where D_Fi is the fault-tolerance evaluation value of the i-th service; f_bi and f_ri are the total number of faults and the number of successful fault recoveries, respectively;
Judge, for each service, whether any index value is smaller than the service screening metric threshold α; if so, reject the service, otherwise retain it. Let the number of services meeting the standard be N2.
Finally, the analytic hierarchy process (see Deng Xue, Li Guming, Zeng Haojian, Chen Junyang, Zhao Junfeng. Analysis and application research of weight calculation methods in the analytic hierarchy process [J]. Mathematics in Practice and Theory, 2012, 42(7): 93-100.) is used to synthesize the service screening metric indexes, and N2 service screening evaluation values E_S = [E_S1, E_S2, ..., E_SN2] are computed, where E_Si (i = 1, 2, ..., N2) is the screening evaluation value of the i-th service.
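The analytic hierarchy process step can be sketched as follows. The geometric-mean (row) method is used here as a common approximation of the principal-eigenvector weights discussed in the cited paper, and the pairwise-comparison values are illustrative:

```python
import math

def ahp_weights(P):
    """Approximate AHP weights from a pairwise-comparison matrix P
    (P[i][j] = importance of index i relative to index j) via the
    geometric mean of each row, normalised to sum to 1."""
    n = len(P)
    gm = [math.prod(row) ** (1.0 / n) for row in P]
    s = sum(gm)
    return [g / s for g in gm]

def consistency_ratio(P, w):
    """CR = CI / RI, with lambda_max estimated from P.w and RI taken
    from Saaty's random-index table (3- to 6-index cases only)."""
    n = len(P)
    Pw = [sum(P[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Pw[i] / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    ri = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]
    return ci / ri
```

A comparison matrix is usually accepted only when the consistency ratio is below 0.1; a perfectly consistent matrix gives CR = 0.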
In step 3, if a service screening evaluation result is greater than or equal to the threshold β, the service is available, otherwise the service is not available.
Step 5 comprises: let the current number of services be N3. First, compute for each service the preference-class index value vector I_O = [I_O1, I_O2, ..., I_ON3], where I_Oi (i = 1, 2, ..., N3) is the vector of preference index values of the i-th service. The preference-class indexes are calculated as follows:
(1) Performance index measurement method
Accuracy: accuracy refers to the accuracy of modeling and simulation services, and the measurement method is as follows:
where P_Ai is the accuracy evaluation value of the i-th service; a_i = |r_ai − r_ei| is the accuracy of the i-th service, with r_ai and r_ei the actual and expected service results, respectively; a = [a_1, a_2, ..., a_N] is the accuracy vector;
reliability: reliability refers to the ability of a service to maintain a properly functioning state when used in a defined environment, and is measured by:
where P_Ri is the reliability evaluation value of the i-th service; r_i is the mean time between failures of the i-th service; r = [r_1, r_2, ..., r_N] is the mean-time-between-failures vector;
stability: stability refers to the ability of a service to continue to operate normally when the operating environment changes, and the measurement method is as follows:
where P_Si is the stability evaluation value of the i-th service; s_i is the normal running time of the i-th service after the environment changes; s = [s_1, s_2, ..., s_N] is the corresponding running-time vector;
(2) Network index measurement method
Safety: the security means that the software and hardware of the network and the system thereof are protected from damage, modification and leakage caused by accidental or malicious reasons, and can support reliable, normal and uninterrupted operation of modeling and simulation services, and the measurement method comprises the following steps:
where N_Si is the security evaluation value of the i-th service; s_i is the service time of the i-th service; s = [s_1, s_2, ..., s_N] is the service-time vector;
time delay: the time delay is the time required for transmitting a data packet from a network sending end to a receiving end, and the measurement method is as follows:
where N_Ti is the delay evaluation value of the i-th service; t_i is the delay of the i-th service; t = [t_1, t_2, ..., t_N] is the service-delay vector;
Packet loss rate: the packet loss rate refers to the proportion of lost data packets among the total data packets transmitted while the modeling and simulation service is in use, and the measurement method is as follows:
where N_Pi is the packet-loss-rate evaluation value of a single simulation task on the i-th service; p_li and p_ti are the number of lost packets and the total number of transmitted packets during a single simulation task, respectively;
(3) User perception index measurement method
Service price: the service price refers to the fee measurement method that the service requester must pay to call this operation, which is:
in U Pi A service price evaluation value for the i-th service; p is p i The ith service price; p= [ p ] 1 ,p 2 ,…,p N ]Is a service price vector;
reputation degree: the credibility refers to the satisfaction and the credibility of the user on the service, and the measurement method comprises the following steps:
where U_Ci is the reputation evaluation value of the i-th service; c_i is the user score of the i-th service; c = [c_1, c_2, ..., c_N] is the service-score vector;
Success rate: the success rate refers to the probability that a service request is responded to correctly, and the measurement method is as follows:
U_Si = s_ci / s_ri
where U_Si is the success-rate evaluation value of the i-th service; s_ri and s_ci are the number of times the service is requested (called) and the number of times it responds correctly, respectively;
Then, the screening-class service quality evaluation results of the N3 services are incorporated as one additional index into the service preference-class metric indexes, with weight k (0 < k < 1);
Finally, the TOPSIS method (see Hu Yonghong. Improvement of the TOPSIS method for comprehensive evaluation [J]. Mathematics in Practice and Theory, 2002, 32(4): 572-575.) is used to rank the services on the above indexes; the merit degree E_Ri of each service is calculated, the final ranking is determined by the order of the E_Ri values, and the process proceeds to step 7.
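A minimal TOPSIS sketch is given below, assuming vector normalisation and Euclidean distances to the ideal and anti-ideal solutions (the cited improvement paper may differ in details, and the index set, weights, and benefit/cost flags are illustrative):

```python
import math

def topsis(X, w, benefit):
    """Rank alternatives (rows of X) by relative closeness to the ideal.
    w[j]: weight of index j; benefit[j]: True if index j is benefit-type,
    False if cost-type. Returns (ranking, closeness scores)."""
    m, n = len(X), len(X[0])
    # vector normalisation per column, then weighting
    norm = [math.sqrt(sum(X[i][j] ** 2 for i in range(m))) for j in range(n)]
    V = [[w[j] * X[i][j] / norm[j] for j in range(n)] for i in range(m)]
    # ideal (best) and anti-ideal (worst) solutions per column
    best = [max(V[i][j] for i in range(m)) if benefit[j]
            else min(V[i][j] for i in range(m)) for j in range(n)]
    worst = [min(V[i][j] for i in range(m)) if benefit[j]
             else max(V[i][j] for i in range(m)) for j in range(n)]
    def dist(v, p):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(v, p)))
    scores = [dist(V[i], worst) / (dist(V[i], best) + dist(V[i], worst))
              for i in range(m)]
    order = sorted(range(m), key=lambda i: scores[i], reverse=True)
    return order, scores
```

A service that is best on every index gets closeness 1 and heads the ranking, which plays the role of E_Ri above.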
Step 6 comprises: first, for the current service, compute the preference-class index value I_Ok; then determine the weights of the preference-class metric indexes using the analytic hierarchy process; finally, incorporate the screening-class service quality evaluation result into the preference-class metric indexes, perform a weighted summation, and obtain the evaluation value E_V of the current service.
In particular, for a cost-type index, e.g. the quality-of-service value V_r measuring response time, the stricter expression is
V_r = (max(r) − r_i) / (max(r) − min(r))
The cost-type index measurement expressions in the invention are similar but are simplified for conciseness of presentation. Likewise, for a benefit-type index, e.g. the quality-of-service value V_s measuring service time, the stricter expression is
V_s = (s_i − min(s)) / (max(s) − min(s))
The benefit-type index measurement formulas in the invention are simplified in the same way.
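Assuming the stricter expressions above are the standard min-max normalisations (a reconstruction from the cost/benefit descriptions; the original formulas are in figures not reproduced here), they can be sketched directly:

```python
def cost_norm(x, i):
    """Strict cost-type normalisation: smaller raw values score higher.
    V = (max(x) - x_i) / (max(x) - min(x)); requires non-constant x."""
    return (max(x) - x[i]) / (max(x) - min(x))

def benefit_norm(x, i):
    """Strict benefit-type normalisation: larger raw values score higher.
    V = (x_i - min(x)) / (max(x) - min(x)); requires non-constant x."""
    return (x[i] - min(x)) / (max(x) - min(x))
```

Response time and failure time would use `cost_norm`, service time `benefit_norm`; both map each service onto [0, 1] relative to its same-type peers.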
In particular, the thresholds α, β ∈ [0, 1] involved in the invention are typically obtained from historical experience or assessed by experts in the relevant field.
The positive effects of the invention are as follows: starting from the two dimensions of service screening and service optimization, a service quality metric index system is established covering six aspects (functions, interfaces, data infrastructure, performance, network, and user perception); a service quality evaluation flow is further designed, in which services that do not meet the availability requirement are removed by screening evaluation and the relative quality of multiple services is established by ranking evaluation, supporting user preference. The indexes are complete, the index measurement methods are reasonable, and the evaluation process is highly operable and easy for practitioners in the field to understand and accept, providing an important decision reference for users selecting among multiple services.
Drawings
FIG. 1 is a schematic diagram of a quality of service measurement flow for modeling and simulation of weapon equipment simulation.
FIG. 2 is a schematic diagram of a modeling and simulation quality of service metrics architecture for weapon equipment simulation.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and detailed description.
As shown in FIG. 1, the modeling and simulation service quality measurement method for weapon equipment simulation of the invention is divided into two main categories, service screening and service optimization. The service quality metric and evaluation indexes cover six aspects: functions, interfaces, data infrastructure, performance, network, and user perception. Service optimization is performed on the basis of guaranteed service availability, and a comprehensive service quality measurement value is given.
With reference to fig. 1, the modeling and simulation service quality measurement method for weapon equipment simulation of the present invention includes the following steps:
step 1: and establishing a modeling and simulation service quality measurement index system according to the service requirements.
As shown in FIG. 2, the metric index system consists of service screening class indexes and service preference class indexes. The service screening class indexes are:
1) Functional indicators, including availability, interoperability, scalability, combinability, etc., wherein availability is determined jointly by factors such as response time, failure time, service time, etc.;
2) Interface indexes including reusability, standardability, ease of use, and the like;
3) Data infrastructure metrics, including throughput, redundancy, fault tolerance, and the like.
The service preference class measurement index comprises
1) Performance indicators including accuracy, reliability, stability, etc.;
2) Network indexes including security, time delay, packet loss rate and the like;
3) User perceived metrics including service price, reputation, success rate, etc.
Step 2: from the service availability level, a service screening evaluation is performed on the current one or more services.
Assume that the current number of services is N1. First, for the different services, compute the service screening class index values I_S = [I_S1, I_S2, ..., I_SN1], where I_Si (i = 1, 2, ..., N1) is the vector of screening-class index values of the i-th service. The screening-class indexes are calculated as follows:
1) Function index measuring method
a) Availability of
Availability refers to the ability of a service to provide the desired function at a given point in time or over a period of time. It is determined by response time, failure time and service time, and is measured as

F_Ai = ω_r·min(r_T)/r_i + ω_f·min(f_T)/f_i + ω_s·s_i/max(s_T)

where F_Ai is the availability evaluation value of the i-th service; [ω_r ω_f ω_s] are the weight coefficients of response time, failure time and service time, with ω_r + ω_f + ω_s = 1; r_i, f_i, s_i (i = 1, 2, ..., N) are the average response time, average failure time and average service time of the i-th service; r_T = [r_1, r_2, ..., r_N], f_T = [f_1, f_2, ..., f_N], s_T = [s_1, s_2, ..., s_N] are the average response time, failure time and service time vectors of services of the same type, and N is the number of services of the same type. In particular, for a cost-type index, such as the quality-of-service value V_r that measures response time, the stricter form is

V_r = min_{1≤j≤N}(r_j)/r_i

The measurement expressions of the later cost-type indexes are similar and are written in the simplified form for brevity. Likewise, for a benefit-type index, such as the quality-of-service value V_s that measures service time, the stricter form is

V_s = s_i/max_{1≤j≤N}(s_j)

The same simplification is applied in the later measurement formulas.
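The two normalization patterns above can be sketched as two small helpers (an illustrative sketch assuming the simplified min(x)/x_i and x_i/max(x) forms; the function names are ours, not from the patent):

```python
def cost_norm(values, i):
    # Cost-type index (smaller raw value is better, e.g. response time):
    # best peer value divided by the i-th service's value, so the best peer scores 1.
    return min(values) / values[i]

def benefit_norm(values, i):
    # Benefit-type index (larger raw value is better, e.g. service time):
    # the i-th service's value divided by the best peer value.
    return values[i] / max(values)
```

With either form, the best-performing service of a type always scores 1 and the others fall in (0, 1].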
b) Interoperability
Interoperability refers to the ability to share information between two or more services, including the ability to invoke other services and to be invoked by them. It is measured as

F_Ii = ω_call·c_calli/max(c_call) + ω_called·c_calledi/max(c_called)

where F_Ii is the interoperability evaluation value of the i-th service; [ω_call ω_called] are the weight coefficients for invoking other services and being invoked by other services, with ω_call + ω_called = 1; c_calli, c_calledi are the numbers of times the i-th service invokes other services and is invoked by other services; c_call = [c_call1, c_call2, ..., c_callN], c_called = [c_called1, c_called2, ..., c_calledN] are the corresponding invocation count vectors, and N is the number of services of the same type.
c) Extensibility and method for making same
The expandability refers to the capability of the service to adapt to the change of the demand, and the capability of generating new service is expanded through function inheritance, and the measurement method is as follows
Wherein F is Ei An expandability evaluation value for the i-th service; e, e i The number of new services generated for the ith service extension; e= [ e 1 ,e 2 ,…,e N ]For expansion, a new service number vector is generated, N being the number of the same type of service.
d) Combinability
Combinability refers to the ability to form new services by combining existing ones. It is measured as

F_Ci = c_i/max(c)

where F_Ci is the combinability evaluation value of the i-th service; c_i is the number of combined services generated from the i-th service; c = [c_1, c_2, ..., c_N] is the vector of combined-service counts, and N is the number of services of the same type.
2) Interface index measurement method
a) Reusability
Reusability refers to the ability of a service interface to be reused with little or no modification. It is measured as

I_Ri = r_i/max(r)

where I_Ri is the reusability evaluation value of the i-th service; r_i is the number of times the i-th service is reused; r = [r_1, r_2, ..., r_N] is the reuse-count vector, and N is the number of services of the same type.
b) Standardization
Standardization refers to how well the service interface content is constrained and standardized; it is generally assessed by auditing and scoring the interface design documents against itemized scoring criteria. It is treated as a benefit-type index and measured as

I_Ni = n_i/max(n)

where I_Ni is the standardization evaluation value of the i-th service; n_i is the standardization score of the i-th service; n = [n_1, n_2, ..., n_N] is the standardization-score vector, and N is the number of services of the same type.
c) Ease of use
Ease of use refers to how easily a user can understand, learn and master the service. It is measured as

I_Ui = u_i/max(u)

where I_Ui is the ease-of-use evaluation value of the i-th service; u_i is the number of times the i-th service is used; u = [u_1, u_2, ..., u_N] is the usage-count vector, and N is the number of services of the same type.
3) Data infrastructure index measurement method
a) Throughput
Data throughput refers to the amount of modeling and simulation service data received per unit time, measurable in bits, bytes, number of packets, etc. It is measured as

D_Ti = t_i/max(t)

where D_Ti is the throughput evaluation value of the i-th service; t_i is the data throughput of the i-th service; t = [t_1, t_2, ..., t_N] is the data throughput vector, and N is the number of services of the same type.
b) Redundancy
Redundancy means backing up the data of modeling and simulation services in the data infrastructure so as to improve their robustness and reduce the impact of faults on them. It is measured as

D_Ri = r_i/max(r)

where D_Ri is the redundancy evaluation value of the i-th service; r_i is the number of data backups of the i-th service; r = [r_1, r_2, ..., r_N] is the backup-count vector, and N is the number of services of the same type.
c) Fault tolerance
Fault tolerance refers to the ability to restore normal system operation, when an error occurs, through data backups on different storage devices. It is measured as

D_Fi = f_ri/f_bi

where D_Fi is the fault-tolerance evaluation value of the i-th service; f_bi, f_ri are the total number of faults and the number of faults recovered from, respectively.
Next, each index value of every service is compared with the screening-class measurement index threshold α: a service is rejected if any of its index values is below α, and retained otherwise. Assume that N2 services finally meet the standard.
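The rejection rule can be sketched as follows (a minimal sketch; the dictionary layout and function name are our assumptions, not the patent's):

```python
def screen_services(index_values, alpha):
    # index_values: {service name: {index name: value}}.
    # A service is kept only if every screening-class index meets the threshold alpha.
    return [name for name, indexes in index_values.items()
            if all(v >= alpha for v in indexes.values())]
```

Applied to the case study below, R15-C (combinability 0.57) would be rejected at alpha = 0.6.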
Finally, the screening-class measurement indexes are synthesized using the analytic hierarchy process, yielding N2 service screening evaluation values E_S = [E_S1, E_S2, ..., E_SN2].
Step 3: and judging whether the current service is available, if so, turning to the step 4, otherwise, giving a suggestion that the current service is unavailable, and turning to the step 7.
Whether a service is available is judged against a threshold β (0 < β < 1): if a service's screening evaluation value E_Si is not less than β, the service is available; otherwise it is unavailable. Assume that N3 services are finally available.
Step 4: and judging whether the number of the current services is multiple, if so, turning to the step 5, otherwise turning to the step 6.
And judging whether the number N3 of the currently available services is larger than 1, if so, turning to the step 5, otherwise turning to the step 6.
Step 5: and from the service preference level, carrying out preference ranking evaluation on the current service quality, and simultaneously synthesizing service screening evaluation values to give the service quality ranking.
Assume that the current number of services is N3. First, for the different services, compute the service preference class index values I_O = [I_O1, I_O2, ..., I_ON3], where I_Oi (i = 1, 2, ..., N3) is the vector of preference-class index values of the i-th service. The preference-class indexes are calculated as follows:
1) Performance index measurement method
a) Accuracy
Accuracy refers to the exactness of the modeling and simulation service, measured by the agreement between the service result and the expected result:

P_Ai = min(a)/a_i

where P_Ai is the accuracy evaluation value of the i-th service; a_i = |r_ai - r_ei| is the error of the i-th service, with r_ai, r_ei the actual and expected service results; a = [a_1, a_2, ..., a_N] is the error vector, and N is the number of services of the same type.
b) Reliability
Reliability refers to the ability of a service to maintain normal, correct operation when used in the specified environment, typically measured by the normalized mean failure-free time:

P_Ri = r_i/max(r)

where P_Ri is the reliability evaluation value of the i-th service; r_i is the mean failure-free time of the i-th service; r = [r_1, r_2, ..., r_N] is the mean failure-free time vector, and N is the number of services of the same type.
c) Stability
Stability refers to the ability of a service to keep running normally when the operating environment changes. It is measured as

P_Si = s_i/max(s)

where P_Si is the stability evaluation value of the i-th service; s_i is the normal running time of the i-th service after an environment change; s = [s_1, s_2, ..., s_N] is the corresponding running-time vector, and N is the number of services of the same type.
2) Network index measurement method
a) Security
Security means that the network and its system software and hardware are protected from accidental or malicious damage, alteration and leakage, so that modeling and simulation services can run reliably, normally and without interruption. It is measured as

N_Si = s_i/max(s)

where N_Si is the security evaluation value of the i-th service; s_i is the service time of the i-th service; s = [s_1, s_2, ..., s_N] is the service time vector, and N is the number of services of the same type.
b) Time delay
Delay is the time needed to transmit a data packet from the sending end to the receiving end of the network, generally including transmission delay, propagation delay, processing delay and queuing delay. It is measured as

N_Ti = min(t)/t_i

where N_Ti is the delay evaluation value of the i-th service; t_i is the delay of the i-th service; t = [t_1, t_2, ..., t_N] is the service delay vector, and N is the number of services of the same type.
c) Packet loss rate
The packet loss rate refers to the proportion of lost data packets to the total packets sent while using the modeling and simulation service. The packet loss rate of a service is the average over the several simulation tasks that use the service. The per-task measurement is

N_Pi = 1 - p_li/p_ti

where N_Pi is the packet-loss evaluation value of a single simulation task in the i-th service; p_li, p_ti are the number of lost packets and the total number of transmitted packets during the single simulation task, respectively.
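Assuming the per-task value is one minus the task's loss ratio, the service-level value (the average over the simulation tasks using the service) can be sketched as:

```python
def task_score(lost, sent):
    # Per-task packet-loss evaluation: fraction of packets that arrived.
    return 1.0 - lost / sent

def service_packet_loss_score(tasks):
    # tasks: list of (lost packets, transmitted packets) per simulation task;
    # the service-level value is the mean of the per-task values.
    return sum(task_score(lost, sent) for lost, sent in tasks) / len(tasks)
```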
3) User perception index measurement method
a) Service price
The service price is the fee a service requester must pay to invoke the operation, generally set by the service provider. It is measured as

U_Pi = min(p)/p_i

where U_Pi is the service price evaluation value of the i-th service; p_i is the price of the i-th service; p = [p_1, p_2, ..., p_N] is the service price vector, and N is the number of services of the same type.
b) Reputation degree
Reputation refers to the user's satisfaction with and trust in a service; users score the service against the scoring criteria according to their experience with it. It is measured as

U_Ci = c_i/max(c)

where U_Ci is the reputation evaluation value of the i-th service; c_i is the score of the i-th service; c = [c_1, c_2, ..., c_N] is the service score vector, and N is the number of services of the same type.
c) Success rate
Success rate refers to the probability that a service responds correctly. It is measured as

U_Si = s_ci/s_ri

where U_Si is the success-rate evaluation value of the i-th service; s_ri, s_ci are the number of times the service is requested and the number of times it responds correctly, respectively.
Then, the N3 screening-class service quality evaluation results are included as one additional index among the service preference class measurement indexes, with weight k (0 < k < 1);
finally, the indexes are ranked with the TOPSIS method, the quality degree E_Ri of each service is calculated, the final ranking is determined by the magnitude of the E_Ri values, and the process goes to step 7.
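A minimal TOPSIS sketch for the ranking step (assuming all columns are already normalized benefit-type scores, as in Table 5; the function name is ours):

```python
import math

def topsis_scores(matrix, weights):
    # matrix[i][j]: score of service i on criterion j; weights[j]: criterion weight.
    weighted = [[w * x for w, x in zip(weights, row)] for row in matrix]
    ideal = [max(col) for col in zip(*weighted)]  # positive ideal solution
    anti = [min(col) for col in zip(*weighted)]   # negative ideal solution
    scores = []
    for row in weighted:
        d_pos = math.dist(row, ideal)  # distance to the ideal solution
        d_neg = math.dist(row, anti)   # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))  # relative closeness E_Ri
    return scores
```

Services are then ranked by descending relative closeness.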
Step 6: and from the service preference level, carrying out service value evaluation on the current service quality, and simultaneously integrating service screening evaluation values to give the current service quality measurement value.
First, for the current service, compute the service preference class index vector I_Ok;
then, determine the weights of the preference-class measurement indexes using the analytic hierarchy process;
finally, include the screening-class service quality evaluation result among the preference-class measurement indexes, perform a weighted summation, and obtain the evaluation value E_V of the current service.
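One plausible reading of this weighted summation is sketched below (our assumption: k weights the screening result, the preference weights sum to 1, and the remainder 1 - k weights the preference part):

```python
def service_value(pref_values, pref_weights, screening_value, k):
    # E_V = k * screening-class evaluation + (1 - k) * AHP-weighted preference indexes.
    preference = sum(w * v for w, v in zip(pref_weights, pref_values))
    return k * screening_value + (1 - k) * preference
```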
Step 7: the quality of service measurement procedure ends.
The invention will be further described with reference to specific embodiments.
In weapon equipment combat simulation, the equipment mainly comprises sensors, aircraft and other types of equipment. The modeling and simulation of aircraft equipment such as guided missiles involves a kinematics/dynamics model, a missile-target relative motion model, a guidance model and other parts, covering position/velocity component solution, proportional navigation design, and so on. Solving the model parameters relies mainly on numerical integration of differential equations, and the solution efficiency is affected by the programming language, packaging format, etc. adopted by the service.
For service quality evaluation in weapon equipment modeling and simulation, the efficiency of the underlying differential-equation-solving algorithms deserves particular attention. Nine ordinary differential equation (ODE) solver services are selected as analysis objects, and their service quality is evaluated to verify the effectiveness of the modeling and simulation service quality evaluation method proposed in this patent.
By algorithm type, the nine services divide into the fourth/fifth-order Runge-Kutta algorithm RK45, the second/third-order Runge-Kutta algorithm RK23, and the first- to fifth-order variable-step algorithm RK15; each algorithm has three versions: C++ function, Matlab library function, and C++ dynamic link library, as compared in Table 1. RK45 is a widely used solver for non-stiff differential equations, with small error and moderate speed. RK23 is a non-stiff solver that is relatively faster but less accurate. RK15 is an algorithm for solving stiff differential equations and differential-algebraic equations.
Table 1 nine ordinary differential equation solving algorithm service comparisons
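To make the Runge-Kutta families in Table 1 concrete, here is a classical fixed-step fourth-order Runge-Kutta step (an illustrative sketch only; the RK45/RK23/RK15 services above are adaptive-step variants, not this exact code):

```python
def rk4_step(f, t, y, h):
    # One classical 4th-order Runge-Kutta step for y' = f(t, y).
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def rk4_solve(f, t0, y0, t1, steps):
    # Integrate y' = f(t, y) from t0 to t1 with a fixed number of steps.
    h = (t1 - t0) / steps
    t, y = t0, y0
    for _ in range(steps):
        y = rk4_step(f, t, y, h)
        t += h
    return y
```

For y' = -y with y(0) = 1, the result at t = 1 approaches e^-1 as the step count grows.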
Step 1, the modeling and simulation service quality measurement index system in fig. 2 is adopted, and no clipping is performed.
Step 2, collecting the raw data calculated by each index, wherein the data used for usability assessment are shown in table 2.
Table 2 raw data for partial index calculation
| | R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-C | R15-M | R15-D |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Response time r_i /s | 0.12 | 0.15 | 0.16 | 0.1 | 0.12 | 0.13 | 0.1 | 0.13 | 0.14 |
| Failure time f_i /s | 38 | 66 | 57 | 50 | 52 | 40 | 46 | 57 | 36 |
| Service time s_i /s | 302 | 521 | 396 | 202 | 241 | 168 | 209 | 262 | 169 |
Screening evaluation is performed. Taking the availability index as an example, from the data in Table 2 the weights are determined with the analytic hierarchy process as ω_r = 0.2, ω_f = 0.4, ω_s = 0.4. Following the availability measurement method, the calculation proceeds as follows:

min(r_T) = 0.1, min(f_T) = 36, max(s_T) = 521

Taking the availability index of R45-C as an example, r_1 = 0.12, f_1 = 38, s_1 = 302, so

F_A1 = 0.2 × (0.1/0.12) + 0.4 × (36/38) + 0.4 × (302/521) ≈ 0.78

Proceeding likewise, each screening-class index value I_S = [I_S1, I_S2, ..., I_SN1] is calculated; the results are listed in Table 3. In this case the screening-class index threshold α is 0.6. Pre-screening finds the combinability index of R15-C (0.57) and the fault-tolerance index of R15-D (0.58) below 0.6, so these two services are rejected and the other 7 services pass the pre-screening.
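The availability figure for R45-C can be checked directly against the Table 2 data (assuming the weighted min/x and x/max form used above):

```python
# Table 2 raw data, in service order R45-C ... R15-D.
r = [0.12, 0.15, 0.16, 0.10, 0.12, 0.13, 0.10, 0.13, 0.14]  # response time /s
f = [38, 66, 57, 50, 52, 40, 46, 57, 36]                     # failure time /s
s = [302, 521, 396, 202, 241, 168, 209, 262, 169]            # service time /s
w_r, w_f, w_s = 0.2, 0.4, 0.4                                # AHP weights

# Availability of R45-C (index 0): cost-type terms use min/x, benefit-type x/max.
F_A1 = w_r * min(r) / r[0] + w_f * min(f) / f[0] + w_s * s[0] / max(s)
print(round(F_A1, 2))  # 0.78, matching the first availability entry of Table 3
```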
Table 3 screening of bottom-like index calculation results
| | R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-C | R15-M | R15-D |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Availability | 0.78 | 0.75 | 0.68 | 0.64 | 0.63 | 0.64 | 0.67 | 0.61 | 0.67 |
| Interoperability | 0.8 | 0.75 | 0.68 | 0.76 | 0.77 | 0.65 | 0.72 | 0.68 | 0.63 |
| Extensibility | 1 | 0.85 | 0.82 | 0.87 | 0.82 | 0.78 | 0.86 | 0.77 | 0.76 |
| Combinability | 0.75 | 0.77 | 1 | 0.73 | 0.73 | 0.8 | 0.57 | 0.63 | 0.73 |
| Reusability | 0.78 | 0.8 | 0.9 | 0.74 | 0.7 | 1 | 0.69 | 0.62 | 0.79 |
| Standardization | 1 | 0.83 | 0.75 | 0.82 | 0.82 | 0.75 | 0.83 | 0.8 | 0.76 |
| Ease of use | 1 | 0.85 | 0.77 | 0.84 | 0.84 | 0.77 | 0.85 | 0.82 | 0.78 |
| Throughput | 1 | 0.86 | 0.79 | 1 | 0.86 | 0.79 | 1 | 0.86 | 0.79 |
| Redundancy | 1 | 0.84 | 0.75 | 1 | 0.84 | 0.75 | 1 | 0.84 | 0.75 |
| Fault tolerance | 0.76 | 0.8 | 0.67 | 0.77 | 0.78 | 0.69 | 0.71 | 0.73 | 0.58 |
The analytic hierarchy process is then used to synthesize, in turn, the function, interface, data infrastructure, and overall screening-class service quality indexes; the evaluation results are shown in Table 4.
Table 4 screening class index comprehensive evaluation results
| | R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-M |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Function | 0.84 | 0.78 | 0.81 | 0.77 | 0.75 | 0.73 | 0.68 |
| Interface | 0.96 | 0.83 | 0.79 | 0.81 | 0.80 | 0.81 | 0.77 |
| Data infrastructure | 0.92 | 0.83 | 0.74 | 0.92 | 0.83 | 0.74 | 0.81 |
| Screening-class service quality | 0.91 | 0.82 | 0.78 | 0.83 | 0.79 | 0.76 | 0.75 |
Step 3, from Table 4 the screening-class service quality evaluation results are E_S = [E_S1, E_S2, ..., E_SN2] = [0.91, 0.82, 0.78, 0.83, 0.79, 0.76, 0.75]; all seven values exceed the threshold β = 0.6, so all 7 services are available.
Step 4, the current number of services is judged: since N3 = 7 > 1, the process goes to the preference evaluation step.
Step 5, the preference evaluation is performed. First, each preference-class index value is calculated from its measurement method; the results are shown in Table 5. Because TOPSIS synthesizes indexes differently from the analytic hierarchy process, the screening-class service quality result is included in the ranking as an additional index with weight k = 0.4, so that the screening results are considered during optimization; the other preference indexes share a total weight of 0.6.
TABLE 5 evaluation results of preference index
| | R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-M | Weight |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Accuracy | 1 | 1 | 1 | 0.78 | 0.78 | 0.78 | 0.85 | 0.08 |
| Reliability | 0.81 | 0.78 | 0.8 | 0.79 | 0.75 | 0.77 | 0.75 | 0.08 |
| Stability | 0.89 | 0.87 | 1 | 0.85 | 0.84 | 0.85 | 0.78 | 0.08 |
| Security | 0.83 | 0.78 | 1 | 0.83 | 0.78 | 1 | 0.78 | 0.04 |
| Delay | 1 | 0.88 | 0.9 | 0.89 | 0.85 | 0.87 | 0.85 | 0.04 |
| Packet loss rate | 0.86 | 0.95 | 0.81 | 0.92 | 0.97 | 0.9 | 0.98 | 0.04 |
| Service price | 0.73 | 0.81 | 0.77 | 0.87 | 1 | 0.79 | 0.83 | 0.08 |
| Reputation | 0.81 | 1 | 0.72 | 0.71 | 0.77 | 0.68 | 0.75 | 0.08 |
| Success rate | 0.82 | 1 | 0.75 | 0.81 | 0.85 | 0.68 | 0.7 | 0.08 |
The TOPSIS ranking is then computed; the evaluation results and ranking are shown in Table 6. According to the preference ranking, the order of the 7 differential-equation-solving services is R45-C, R45-M, R23-C, R23-M, R45-D, R15-M, R23-D, and the user can select a solving service as required.
Table 6 TOPSIS evaluation results and ranking
| | R45-C | R45-M | R45-D | R23-C | R23-M | R23-D | R15-M |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Evaluation result | 0.67 | 0.54 | 0.31 | 0.44 | 0.37 | 0.14 | 0.15 |
| Ranking | 1 | 2 | 5 | 3 | 4 | 7 | 6 |
The specific embodiments described herein are offered by way of example only to illustrate the spirit of the invention. Various modifications or additions may be made to the described embodiments or in a similar manner without departing from the spirit of the invention or exceeding the scope of the invention as defined by the appended claims.
Claims (1)
1. The modeling and simulation service quality measurement method for weapon equipment simulation is characterized by comprising the following steps of:
step 1: simulation modeling is conducted towards weapon equipment, and a modeling and simulation service quality measurement index system is established according to service requirements;
step 2: performing service screening evaluation on one or more current services from a service availability level;
step 3: judging whether the current service is available or not, if yes, turning to step 4, otherwise, giving a suggestion that the current service is unavailable, and turning to step 7;
step 4: judging whether there are two or more current services, if so, turning to step 5, otherwise, turning to step 6;
step 5: from the service preference level, based on preference indexes, carrying out preference ranking evaluation on the current service quality, synthesizing service screening evaluation values, giving out service quality ranking, and turning to step 7;
step 6: from the service preference level, based on preference index, evaluating the service value of the current service quality, and simultaneously comprehensively screening the evaluation value to give the current service quality measurement value;
step 7: the quality of service measurement procedure ends;
in step 1, the simulation modeling for the weapon equipment specifically includes:
establishing a kinematic model of the missile:

dx/dt = V_x, dy/dt = V_y
dV_x/dt = -n·sin q, dV_y/dt = n·cos q

wherein x and y are the abscissa and ordinate of the missile position, and dx/dt and dy/dt are their rates of change; V_x and V_y are the components of the missile speed V along the horizontal and vertical coordinates; dV_x/dt and dV_y/dt are the rates of change of V_x and V_y; n is the missile overload perpendicular to the missile-target line of sight, q is the target azimuth angle, and θ is the missile trajectory angle;
the target is set to move uniformly in the attack plane:

dx_T/dt = V_T·cos σ_T, dy_T/dt = V_T·sin σ_T

wherein x_T and y_T are the abscissa and ordinate of the target position, and dx_T/dt and dy_T/dt are their rates of change; V_T is the target speed and σ_T is the target course angle;
establishing a missile-target relative motion model:

dr/dt = V_T·cos η_T - V·cos η
r·dq/dt = V·sin η - V_T·sin η_T

wherein r is the relative distance between the missile and the target, and dr/dt and dq/dt are the actual rates of change of r and q; η is the missile velocity vector lead angle, η_T is the target velocity vector lead angle, and σ_T is the target course angle;
selecting the proportional navigation law and constructing the missile guidance model:

dθ/dt = K·dq/dt

wherein K is the proportional navigation law coefficient;
in step 1, the service requirement refers to the requirement of a user on functions, performances and interfaces of modeling and simulation services; the modeling and simulation service quality measurement index system comprises two kinds of measurement indexes of service screening and service optimization, wherein the service screening type measurement indexes comprise:
functional indicators including availability, interoperability, scalability, combinability, wherein availability is determined jointly by response time, failure time, service time;
interface indexes including reusability, standardability, and ease of use;
data infrastructure metrics including throughput, redundancy, fault tolerance;
the service preference class metrics include:
performance indexes including accuracy, reliability, stability;
network indexes including security, time delay and packet loss rate;
user perception indexes including service price, credibility and success rate;
the step 2 comprises the following steps:
setting the current service number as N1, and respectively calculating the service screening class index values I_S = [I_S1, I_S2, ..., I_SN1] of the different services, wherein I_Si is the vector of screening-class index values of the i-th service, i = 1, 2, ..., N1; the service screening class indexes are calculated as follows:
(1) The function index measuring method comprises the following steps:
availability of: availability refers to the ability of a service to provide a desired function at a certain point in time or period of time, and is determined by the response time, the failure time and the service time, and the measurement method is as follows:
F_Ai = ω_r·min(r_T)/r_i + ω_f·min(f_T)/f_i + ω_s·s_i/max(s_T)

wherein F_Ai is the availability evaluation value of the i-th service; [ω_r ω_f ω_s] are the weight coefficients of response time, failure time and service time respectively, with ω_r + ω_f + ω_s = 1; r_i, f_i, s_i are the average response time, average failure time and average service time of the i-th service, i = 1, 2, ..., N; r_T = [r_1, r_2, ..., r_N], f_T = [f_1, f_2, ..., f_N], s_T = [s_1, s_2, ..., s_N] are the average response time, failure time and service time vectors of services of the same type respectively;
interoperability: interoperability refers to the ability to share information between two or more services, including the ability to invoke and be invoked by other services, and measures are:
F_Ii = ω_call·c_calli/max(c_call) + ω_called·c_calledi/max(c_called)

wherein F_Ii is the interoperability evaluation value of the i-th service; [ω_call ω_called] are the weight coefficients for invoking other services and being invoked by other services respectively, with ω_call + ω_called = 1; c_calli, c_calledi are the numbers of times the i-th service invokes other services and is invoked by other services respectively; c_call = [c_call1, c_call2, ..., c_callN], c_called = [c_called1, c_called2, ..., c_calledN] are the corresponding invocation count vectors;
scalability: the expandability refers to the capability of the service to adapt to the change of the demand, and the new service is generated by expanding through function inheritance, and the measurement method is as follows:
F_Ei = e_i/max(e)

wherein F_Ei is the extensibility evaluation value of the i-th service; e_i is the number of new services generated by extending the i-th service; e = [e_1, e_2, ..., e_N] is the vector of new-service counts generated by extension;
combinability: combinability refers to the ability to form new services by combining existing services, and the measurement method is as follows:
F_Ci = c_i/max(c)

wherein F_Ci is the combinability evaluation value of the i-th service; c_i is the number of combined services generated from the i-th service; c = [c_1, c_2, ..., c_N] is the vector of combined-service counts;
(2) The interface index measurement method comprises the following steps:
reusability: reusability refers to the ability of a service interface to be reused without modification or with little modification, and is measured by:
I_Ri = r_i/max(r)

wherein I_Ri is the reusability evaluation value of the i-th service; r_i is the number of times the i-th service is reused; r = [r_1, r_2, ..., r_N] is the reuse-count vector;
normalization: normalization refers to the feature of constraint and normalization of service interface content, and the measurement method is as follows:
I_Ni = n_i/max(n)

wherein I_Ni is the standardization evaluation value of the i-th service; n_i is the standardization score of the i-th service; n = [n_1, n_2, ..., n_N] is the standardization-score vector;
ease of use: the usability refers to the ease with which the service can be understood, learned and mastered by the user, and the measurement method is as follows:
I_Ui = u_i/max(u)

wherein I_Ui is the ease-of-use evaluation value of the i-th service; u_i is the number of times the i-th service is used; u = [u_1, u_2, ..., u_N] is the usage-count vector;
(3) Data infrastructure index measurement method:
throughput: the data throughput refers to the quantity of data of modeling and simulation service received in unit time, and the measurement method is as follows:
D_Ti = t_i/max(t)

wherein D_Ti is the throughput evaluation value of the i-th service; t_i is the data throughput of the i-th service; t = [t_1, t_2, ..., t_N] is the data throughput vector;
redundancy: redundancy means that in order to improve the robustness of modeling and simulation services, the data of the modeling and simulation services are backed up in a data infrastructure, so that the influence of faults on the modeling and simulation services is reduced, and the measurement method is as follows:
D_Ri = r_i/max(r)

wherein D_Ri is the redundancy evaluation value of the i-th service; r_i is the number of data backups of the i-th service; r = [r_1, r_2, ..., r_N] is the backup-count vector;
fault tolerance: fault tolerance is also called fault tolerance, and refers to the ability of recovering the system from data backup of different storage devices when an error occurs, and the measurement method is as follows:
D_Fi = f_ri/f_bi

wherein D_Fi is the fault-tolerance evaluation value of the i-th service; f_bi, f_ri are the total number of faults and the number of faults recovered from, respectively;
judging, for each service, whether any index value is smaller than the service screening class measurement index threshold α; if so, rejecting the service, otherwise retaining it; setting the number of finally qualified services as N2;
finally, synthesizing the service screening class measurement indexes by the analytic hierarchy process, and calculating the N2 service screening evaluation values E_S = [E_S1, E_S2, ..., E_SN2], wherein E_Si is the screening evaluation value of the i-th service, i = 1, 2, ..., N2;
in step 3, if a service screening evaluation result is greater than or equal to a threshold value beta, the service is available, otherwise, the service is not available;
the step 5 comprises the following steps: setting the current service quantity asN3, firstly, calculating service preference index value I of different services O =[I O1 ,I O2 ,...,I ON3 ]Wherein I Oi (i=1, 2,.,. N3) is a vector of index values for each preference class of the ith service; the service preference index calculating method comprises the following steps:
(1) Performance index measurement method
Accuracy: accuracy refers to the accuracy of modeling and simulation services, and the measurement method is as follows:
P_Ai = min(a)/a_i

wherein P_Ai is the accuracy evaluation value of the i-th service; a_i = |r_ai - r_ei| is the error of the i-th service, with r_ai, r_ei the actual and expected service results respectively; a = [a_1, a_2, ..., a_N] is the error vector;
reliability: reliability refers to the ability of a service to maintain a properly functioning state when used in a defined environment, and is measured by:
P_Ri = r_i/max(r)

wherein P_Ri is the reliability evaluation value of the i-th service; r_i is the mean failure-free time of the i-th service; r = [r_1, r_2, ..., r_N] is the mean failure-free time vector;
stability: stability refers to the ability of a service to continue to operate normally when the operating environment changes, and the measurement method is as follows:
P_Si = s_i/max(s)

wherein P_Si is the stability evaluation value of the i-th service; s_i is the normal running time of the i-th service after an environment change; s = [s_1, s_2, ..., s_N] is the corresponding running-time vector;
(2) Network index measurement method
Security: security means that the software and hardware of the network and its systems are protected from damage, modification, and leakage caused by accidental or malicious actions, so that modeling and simulation services can run reliably, normally, and without interruption; it is measured as follows:
where N_Si is the security evaluation value of the i-th service; s_i is the service time of the i-th service; s = [s_1, s_2, ..., s_N] is the service time vector;
time delay: the time delay is the time required for transmitting a data packet from a network sending end to a receiving end, and the measurement method is as follows:
wherein N is Ti Delay evaluation value for the ith service; t is t i Is the ith service delay; t= [ t ] 1 ,t 2 ,…,t N ]Is a service delay vector;
packet loss rate: the packet loss rate refers to the quantity of lost data packets accounting for the total quantity of the transmitted data packets in the using process of modeling and simulation service, and the measurement method of the packet loss rate is as follows
Wherein N is Pi Single simulation run for ith servicePacket loss rate evaluation value of the service; p is p li 、p ti The total amount of the lost information packets and the transmitted information packets during single simulation task are respectively calculated;
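The packet-loss definition above (lost packets p_li over total transmitted packets p_ti in one simulation run) is explicit enough to sketch; reporting 1 − loss ratio, so that larger values mean better quality, is an assumed convention, since the patent's formula image is not reproduced in the text.

```python
def packet_loss_evaluation(p_l, p_t):
    # N_P for one simulation run: 1 minus the fraction of packets lost,
    # so a higher value means better network quality (assumed convention).
    if p_t <= 0:
        raise ValueError("total packet count must be positive")
    return 1.0 - p_l / p_t
```

For example, a run that loses 50 of 1000 transmitted packets scores 0.95.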
(3) User perception index measurement method
Service price: the service price is the fee that the service requester must pay to invoke the operation, and is measured as follows:
where U_Pi is the service price evaluation value of the i-th service; p_i is the price of the i-th service; p = [p_1, p_2, ..., p_N] is the service price vector;
reputation degree: the credibility refers to the satisfaction and the credibility of the user on the service, and the measurement method comprises the following steps:
in U Ci A reputation evaluation value for the ith service; c i Scoring the ith service; c= [ c ] 1 ,c 2 ,…,c N ]Scoring the service vector;
success rate: the success rate refers to the probability that the service can be correctly responded, and the measurement method is as follows:
in U Si A success rate evaluation value for the i-th service; s is(s) ri 、s ci The number of times the service is called and the number of times the service is correctly responded is respectively requested;
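Two of the user-perception indexes can be sketched directly: the success-rate definition (correct responses s_ci over requested calls s_ri) is explicit in the text, while the price normalization shown here, in which the cheapest service scores 1, is an assumed cost-index convention, since the patent's formula image is absent.

```python
import numpy as np

def success_rate(s_r, s_c):
    # U_S: fraction of requested calls (s_r) answered correctly (s_c).
    return 0.0 if s_r == 0 else s_c / s_r

def price_evaluation(p):
    # U_P: price is a cost index; assumed min-based normalization,
    # so the cheapest service scores 1 and dearer ones score less.
    p = np.asarray(p, dtype=float)
    return p.min() / p
```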
Then, the N3 screening-class quality-of-service evaluation results are incorporated as an index into the service preference class measurement indexes, with weight k, where 0 < k < 1;
Finally, the indexes are ranked using the TOPSIS method, the quality degree E_Ri of each service is calculated, the final ranking from best to worst is determined by the order of the E_Ri values, and the process proceeds to step 7;
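The TOPSIS ranking of step 5 can be sketched as follows: vector-normalize the decision matrix, apply the index weights, and score each service by its relative closeness E_R to the ideal solution. The decision matrix and weights are hypothetical, and all indexes are assumed to be benefit-type (larger is better).

```python
import numpy as np

def topsis(decision, weights):
    # decision: n x m matrix of benefit-type index values (rows = services).
    # Returns the relative closeness E_R in [0, 1]; larger is better.
    X = np.asarray(decision, dtype=float)
    V = X / np.linalg.norm(X, axis=0) * np.asarray(weights, dtype=float)
    best, worst = V.max(axis=0), V.min(axis=0)        # ideal / anti-ideal points
    d_best = np.linalg.norm(V - best, axis=1)
    d_worst = np.linalg.norm(V - worst, axis=1)
    return d_worst / (d_best + d_worst)

scores = topsis([[0.9, 0.8, 0.7],
                 [0.5, 0.6, 0.4],
                 [0.8, 0.9, 0.6]], weights=[0.5, 0.3, 0.2])
order = np.argsort(scores)[::-1]  # indices of services, best first
```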
Step 6 comprises the following: first, for the current service, the service preference index value I_Ok is calculated;
then, the weights of the service preference class measurement indexes are determined using the analytic hierarchy process;
finally, the quality-of-service evaluation result of the screening-class indexes is incorporated into the service preference class measurement indexes, a weighted sum is computed, and the evaluation value E_V of the current service is obtained.
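Step 6's weighted summation can be sketched as follows, under the assumption that the screening-class result enters with weight k and the AHP weights of the preference indexes share the remaining 1 − k; the exact combination rule is not spelled out in the text, and all numbers are hypothetical.

```python
import numpy as np

def single_service_evaluation(e_screen, preference_values, pref_weights, k):
    # E_V for one service: the screening-class evaluation result e_screen
    # contributes weight k (0 < k < 1); the preference-class indexes share
    # the remaining 1 - k according to their (renormalized) AHP weights.
    w = np.asarray(pref_weights, dtype=float)
    w = w / w.sum()
    return k * e_screen + (1.0 - k) * float(np.dot(w, preference_values))

E_V = single_service_evaluation(0.8, [0.9, 0.6, 0.7], [0.5, 0.3, 0.2], k=0.3)
```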
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010920457.XA CN112182848B (en) | 2020-09-04 | 2020-09-04 | Modeling and simulation service quality measurement method for weapon equipment simulation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112182848A CN112182848A (en) | 2021-01-05 |
CN112182848B true CN112182848B (en) | 2023-08-01 |
Family
ID=73925480
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010920457.XA Active CN112182848B (en) | 2020-09-04 | 2020-09-04 | Modeling and simulation service quality measurement method for weapon equipment simulation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112182848B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115208808B (en) * | 2022-09-14 | 2023-01-24 | 北京智芯微电子科技有限公司 | Service quality testing method and device, chip equipment and storage medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102136034A (en) * | 2011-03-18 | 2011-07-27 | 北京航空航天大学 | Military aircraft reliability quantitative requirement demonstration method |
CN105930645A (en) * | 2016-04-18 | 2016-09-07 | 中国人民解放军重庆通信学院 | Communication station equipment maintenance support capability assessment method based on principal component analysis |
CN108615122A (en) * | 2018-05-11 | 2018-10-02 | 北京航空航天大学 | A kind of air-defense anti-missile system combat capability assessment method |
CN109190143A (en) * | 2018-07-11 | 2019-01-11 | 北京晶品镜像科技有限公司 | A kind of network-enabled intelligent ammunition multi-scheme assessment method based on combat simulation test
CN109472494A (en) * | 2018-11-12 | 2019-03-15 | 中国人民解放军火箭军工程大学 | A kind of command and control system service guarantee effectiveness assessment index quantitative model |
CN110069815A (en) * | 2019-03-14 | 2019-07-30 | 中科恒运股份有限公司 | Index system construction method, system and terminal device |
Non-Patent Citations (2)
Title |
---|
A Survey of QoS Research in SOA; Zhao Shenghui et al.; Computer Science; Apr. 2009; Vol. 36, No. 4; pp. 16-31 *
Li Shiyong et al.; 2.3.5 Target Kinematics Description; Intelligent Adaptive Guidance Laws for Intelligently Guided Missiles; 2011; pp. 25-28. *
Also Published As
Publication number | Publication date |
---|---|
CN112182848A (en) | 2021-01-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN105550323B (en) | Load balance prediction method and prediction analyzer for distributed database | |
Baukus et al. | Verification of parameterized protocols | |
CN104850727B (en) | Distributed big data system risk appraisal procedure based on Cloud focus theory | |
CN112182848B (en) | Modeling and simulation service quality measurement method for weapon equipment simulation | |
CN110348752A (en) | A kind of large scale industry system structure security assessment method considering environmental disturbances | |
US20080239983A1 (en) | Method for integrating downstream performance and resource usage statistics into load balancing weights | |
CN111708054A (en) | ARAIM vertical protection level optimization method based on particle swarm optimization algorithm | |
Lu et al. | Hybrid state estimation for aircraft engine anomaly detection and fault accommodation | |
CN112418341A (en) | Model fusion method, prediction method, device, equipment and storage medium | |
CN113542266B (en) | Virtual network element trust measurement method and system based on cloud model | |
Beynier et al. | A polynomial algorithm for decentralized Markov decision processes with temporal constraints | |
CN116519021A (en) | Inertial navigation system fault diagnosis method, system and equipment | |
CN110348540A (en) | Electrical power system transient angle stability Contingency screening method and device based on cluster | |
CN107888561B (en) | Civil aircraft-oriented safety service combination system | |
CN112199842A (en) | Task-oriented-based complex simulation system reliability evaluation method | |
Liu et al. | An Accelerated Safety Probability Estimation Method for Control Policies of Autonomous Vehicles in Open Environments | |
Zhang et al. | Multi-target threat assessment in air combat based on AHP and FVIKOR | |
CN115235463A (en) | Integrity risk demand distribution method for GNSS/INS integrated navigation system | |
CN114610615A (en) | Project test processing method, device, equipment and storage medium | |
Dzielski et al. | Implementing a Decision Framework in SysML Integrating MDAO Tools | |
CN113590841B (en) | Intelligent rapid examination and intelligent early warning system and method based on knowledge graph | |
CN117493213B (en) | Method, device, equipment and medium for detecting test coverage rate of financial business system | |
CN108599834A (en) | A kind of satellite communication network link utilization analysis method and system | |
CN113032259B (en) | Fuzzy-based networked software system reliability index distribution method | |
CN112416774B (en) | Software reliability testing method with added weight |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||