WO2011153657A1 - Comprehensive softswitch availability evaluation system - Google Patents

A comprehensive softswitch availability evaluation system

Info

Publication number
WO2011153657A1
Authority
WO
WIPO (PCT)
Prior art keywords
test
softswitch
comprehensive
protocol
evaluated
Prior art date
Application number
PCT/CN2010/000805
Other languages
English (en)
French (fr)
Inventor
吴宏建
张雪丽
崔泰相
李海花
吕军
Original Assignee
工业和信息化部电信研究院
韩国电子通信研究院
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 工业和信息化部电信研究院, 韩国电子通信研究院 filed Critical 工业和信息化部电信研究院
Priority to CN201080067273.8A priority Critical patent/CN103098417B/zh
Priority to PCT/CN2010/000805 priority patent/WO2011153657A1/zh
Publication of WO2011153657A1 publication Critical patent/WO2011153657A1/zh

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L49/00 Packet switching elements
    • H04L49/65 Re-configuration of fast packet switches
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00 Arrangements for monitoring or testing data switching networks
    • H04L43/50 Testing arrangements
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L49/00 Packet switching elements
    • H04L49/55 Prevention, detection or correction of errors
    • H04L49/555 Error detection

Definitions

  • the present invention relates to the field of softswitching, and in particular to a comprehensive softswitch availability evaluation system.
  • softswitch equipment based on packet switching technology is gradually replacing traditional switch equipment based on circuit switching.
  • softswitch devices involve many, rather complex protocols, and many vendors can provide softswitch devices; therefore, for most operators, purchasing and evaluating softswitch devices is a time-consuming and laborious matter. More importantly, since operation and maintenance personnel are generally not protocol experts themselves, it is difficult for them to form an objective and accurate evaluation of the evaluated device from simple test results that merely indicate pass or fail.
  • the evaluation of softswitches is usually achieved through two types of tests.
  • One is the interface protocol test, and the other is the performance test, that is, the high-traffic test.
  • the purpose of the interface protocol test is to check whether the interface protocols provided by the evaluated device conform to the requirements of international standards; only by adopting interface protocols that conform to international standards can the probability of incompatibility between the evaluated device and equipment from other manufacturers be minimized, thereby ensuring interoperability between devices.
  • Performance testing is mainly to examine the capacity and capabilities of the device to see if the device can serve a large number of users at the same time.
  • For these two types of tests, two kinds of instruments are available on the market: protocol analyzers and performance testers.
  • although existing protocol analyzers can test protocol conformance, they can only report how many items pass and how many fail, and they lack in-depth analysis of the test results.
  • in particular, test results from different devices cannot be compared with each other. For example, item 1 of the SIP protocol fails on vendor A's device, while item 10 of the BICC protocol fails on vendor B's device; counted by items, each vendor fails exactly one item, but this does not mean that the two devices have the same capability, because different protocols play different roles in the device and different items within the same protocol carry different weights. Therefore, the test results must never be reduced to a simple pass rate when measuring the capability of the evaluated device.
  • performance testers have the same comparability problem: a softswitch that reaches 300 CAPS with the SIP protocol is not necessarily superior in performance to a softswitch that reaches 200 CAPS with the BICC protocol. Even within SIP, factors such as the hold time of each call, whether the call carries media, the length of each message, whether reliable provisional responses are used, and whether preconditions are used will ultimately affect the measured CAPS of the evaluated device. Therefore, to make performance test results comparable, a benchmark must be provided in advance to guarantee that the call model, parameters, and mechanisms used in every test are exactly the same, a point that existing performance test instruments fail to consider.
  • an object of the present invention is to provide a comprehensive softswitch availability evaluation system for comprehensively evaluating different softswitch devices, with evaluation results that are intuitive, accurate, and comparable.
  • the present invention provides a comprehensive softswitch availability evaluation system, including: a protocol interface module for loading protocol stacks to implement the interfaces corresponding to the evaluated softswitch device; a test script module for providing protocol conformance test scripts and performance test scripts for the evaluated softswitch device;
  • a comprehensive analysis module, configured to comprehensively analyze each test of the evaluated softswitch device, including: a weight management unit configured to set a weight for each test, and a comprehensive evaluation unit configured to give a comprehensive assessment of the evaluated softswitch device according to the test results and the corresponding weight of each test;
  • a display module, configured for interaction between the comprehensive evaluation system and the user, including: a configuration management unit for providing the user with configuration management of the comprehensive evaluation system, and a display evaluation result unit for displaying the comprehensive assessment obtained by the comprehensive analysis module;
  • as a preference, the comprehensive evaluation unit is configured to calculate, from the test results and the corresponding weight of each test according to a predetermined algorithm, the score of the evaluated softswitch device in the protocol conformance test, its score in the performance test, and its comprehensive score.
  • the predetermined algorithm for the score in the protocol conformance test, the score in the performance test, and the comprehensive score is: S_sut = S_con · W_con + S_per · W_per, with S_con = Σ_i (S_proto_i · W_proto_i) and S_proto_i = (Σ_k b_k · W_k · 100) / (Σ_k W_k), where
  • S_sut denotes the comprehensive score of the evaluated softswitch device, S_con its score in the protocol conformance test, and S_per its score in the performance test
  • W_con denotes the weight of the interface protocol conformance test and W_per the weight of the performance test
  • S_proto_i denotes the score obtained by protocol i in the protocol conformance test and W_proto_i denotes the weight of protocol i
  • b_k denotes whether the k-th test passes, taking the value 1 if it passes and 0 otherwise, and W_k denotes the weight of the k-th test.
  • the comprehensive evaluation unit is preset with the performance-test score of the evaluated softswitch device as a function of the number of call attempts per second.
  • the display evaluation result unit is further configured to display a score of each test.
  • as a preference of the foregoing technical solution, the performance test scripts in the test script module adopt a two-stage test structure
  • the comprehensive analysis module further includes a database
  • the information preset in the database includes one or more of: the test purpose of each test, its importance level, the problems that may be caused when a test fails, suggestions on how to improve, and precautions for routine maintenance.
  • the display module further includes a display evaluation suggestion unit, configured to display, according to the test results, one or more of: the test purpose of each failed test item, its importance level, the problems it may cause, suggestions on how to improve, and precautions for routine maintenance.
  • the comprehensive softswitch availability evaluation system proposed by the present invention has the following characteristics: the evaluation is based on test methods from internationally recognized standards;
  • the present invention can at least have the following beneficial effects:
  • Comparability of evaluation results between different softswitch devices: the invention uses standard test methods, unified test cases, a unified test structure, and a unified evaluation algorithm to ensure the comparability of the final evaluation results.
  • the results of this evaluation apply not only to comparisons between softswitch devices produced by different manufacturers, but also to comparisons between different versions of the same softswitch device produced by the same manufacturer.
  • Simplified evaluation of softswitch equipment: previous evaluation tests of softswitch equipment were often carried out protocol by protocol, and different test cases had different requirements on the test structure and on the configuration data of the device under test, so the testing was complex and the test cycle long. In fact, the test cases specified in international standards are mainly intended for equipment development, so they examine the equipment in great detail, whereas users of the equipment are more concerned with basic functions and protocol compatibility. The present invention therefore streamlines the test cases and adopts a single test structure suitable for all test cases, so that the evaluated device only needs to be configured once to complete all the tests, which greatly improves the efficiency of the evaluation test and shortens the evaluation cycle.
  • FIG. 1 is a schematic view showing a first preferred embodiment of the integrated evaluation system proposed by the present invention
  • FIG. 2 is a schematic diagram of a SIP protocol performance testing process according to a first preferred embodiment of the present invention
  • FIG. 3 is a schematic diagram of a BICC protocol performance testing process according to a first preferred embodiment of the present invention
  • Figure 4 is a schematic view showing the test structure using the first preferred embodiment of the present invention
  • Figure 5 is a schematic illustration of a second preferred embodiment of the integrated evaluation system of the present invention.
  • MSC Server: Mobile Switching Center Server
  • IAM: Initial Address Message
  • Connect forward, Tunnel data: forward connection, tunnel data
  • APM: Application Transport Message
  • ACM: Address Complete Message
  • ANM: Answer Message
  • a first preferred embodiment of the comprehensive softswitch availability evaluation system includes: a protocol interface module 101 for loading protocol stacks to implement the interfaces corresponding to the evaluated softswitch device; and a test script module 102 for providing protocol conformance test scripts and performance test scripts for the evaluated softswitch device;
  • the comprehensive analysis module 103 is configured to perform comprehensive analysis on each test of the evaluated softswitch device;
  • the display module 104 is configured to perform interaction between the integrated evaluation system and the user;
  • the comprehensive analysis module 103 includes:
  • a weight management unit 1031 configured to set weights for each test
  • the comprehensive evaluation unit 1032 is configured to provide a comprehensive assessment of the evaluated softswitch device according to each test result and the corresponding weight of each test;
  • the display module 104 includes:
  • a configuration management unit 1041, configured to provide the user with configuration management of the comprehensive evaluation system; and a display evaluation result unit 1042, for displaying the comprehensive assessment obtained by the comprehensive analysis module.
  • the invention adopts standard test methods, unified test cases, unified test structure, and unified evaluation algorithm to ensure the comparability of the final evaluation results.
  • (1) Protocol interface module 101:
  • Softswitch usually has two modes, namely, fixed-line softswitch and softswitch of mobile network.
  • Fixed network softswitch is often used in VoIP networks.
  • the key protocols are SIP (Session Initiation Protocol) and H.248.
  • the softswitch of mobile network is also called MSC server.
  • its key protocols are BICC (Bearer Independent Call Control) and H.248, where the BICC protocol is carried on top of the IP layer via M3UA (Message Transfer Part level 3 User Adaptation Layer) and SCTP (Stream Control Transmission Protocol). Therefore, different application scenarios place different requirements on the interface protocols.
  • the protocol interface module 101 can load the corresponding protocol stack on the interface according to the configuration, and implement the corresponding interface function. Therefore, the integrated evaluation system of the present invention can support multiple protocols, laying a foundation for supporting testing and evaluation of various softswitch devices.
  • (2) Test script module 102:
  • For protocol conformance testing, international standards organizations such as ETSI (European Telecommunications Standards Institute) and the IETF (Internet Engineering Task Force) have already provided corresponding test methods for BICC, M3UA, SCTP, SIP, H.248, and other protocols; based on these international standards, the test script module 102 provides a complete test set for each protocol.
  • the test items given in the international standards serve development purposes, so the test sets are relatively complete and contain many items; considering that a main purpose of the present invention is to help the operator evaluate the compatibility of the softswitch device's interface protocols, the test sets can be streamlined on the basis of the international standards, which reduces the workload of the evaluation test and shortens the evaluation cycle.
  • the performance test mainly considers the processing capability of the SIP and BICC protocols; it is recommended that the SIP performance test script adopt the test flow shown in Figure 2, without media and without preconditions.
  • the BICC protocol adopts the test flow shown in Figure 3, using the forward fast setup mode and likewise without media; in the call model each call is held for 5 seconds and the total test time is 1 hour. Those skilled in the art will appreciate that other two-stage test flows can also be employed.
  • Both of these flows use a two-stage test structure, that is, two signaling legs are established: a call from the evaluation system to the evaluated device, and a call from the evaluated device back to the evaluation system.
  • the advantages of using a two-stage test structure are:
  • 1) It lowers the data configuration requirements on the evaluated device, which needs no terminal numbers or terminal IP addresses, only the routing data required to forward the originating call to the called side. 2) It simplifies the test method: because a unified, standard test flow is used, all test items, including the performance test items, can be run with a single test structure, namely the test structure of Figure 4. More importantly, this test structure and test flow apply to all softswitch devices; when testing a different softswitch device, it is only necessary to connect it according to the structure of Figure 4 and configure some basic office data, such as the IP address and signaling point code, before testing can begin. 3) It can verify some of the more important functions of the SIP and BICC protocols: in addition to the basic call control functions, it can verify SIP's support for reliable provisional responses and the SDP offer/answer exchange, as well as BICC's tunnel bearer establishment mode and continuity check function. These two flows can be regarded as the most classic among the many flows of the SIP and BICC protocols.
  • the comprehensive analysis module 103 includes a weight management unit 1031 and a comprehensive evaluation unit 1032, wherein the weight management unit 1031 handles the assignment and setting of weights; specifically, it can assign a weight to each type of test (protocol conformance or performance test), to each protocol, and even to each test case. All weight values are configurable, and as a further preference suggested values can also be given for each weight.
  • the weight Wi of each test item in the protocol conformance test can be assessed by an expert.
  • a five-point system is used, with 5 points indicating that the test item is of the highest importance, and 1 point means that the importance is the lowest, and the score is given by multiple experts.
  • the values are averaged as the weight of the test item.
  • the weight management unit 1031 assigns a determined weight to each test item.
  • the comprehensive evaluation unit 1032 gives the evaluated softswitch device a comprehensive score that can be used for side-by-side comparison, computed from the results of the individual tests according to a predetermined algorithm.
  • as a preference, the comprehensive score can be calculated as S_sut = S_con · W_con + S_per · W_per, with S_con = Σ_i (S_proto_i · W_proto_i) and S_proto_i = (Σ_k b_k · W_k · 100) / (Σ_k W_k), where
  • S_sut represents the comprehensive score of the evaluated softswitch device
  • S_con represents the score of the evaluated softswitch device in the protocol conformance test
  • S_per represents the score of the evaluated softswitch device in the performance test
  • W_con represents the weight of the interface protocol conformance test and W_per the weight of the performance test
  • S_proto_i represents the score obtained by protocol i in the protocol conformance test
  • W_proto_i represents the weight of protocol i; its value is the one set in the weight management unit 1031, and the suggested value may also be used
  • b_k indicates whether the k-th test passes, taking the value 1 if it passes and 0 otherwise
  • W_k represents the weight of the k-th test; it is suggested that this value be given by multiple experts, with the average taken as the suggested weight of the item.
  • the value of S_per can be obtained from the CAPS (Call Attempts Per Second) measured in the actual test according to a preset table
  • in that test, the SIP test uses the flow of Figure 2 and the BICC test uses the flow of Figure 3
  • it should be noted that CAPS here refers to the number of calls initiated by UA1 or MSC Server A, all of which are calls without media.
  • the display module 104 is the interface between the comprehensive evaluation system of the present invention and the user, and mainly includes a configuration management unit 1041 and a display evaluation result unit 1042.
  • the configuration management unit 1041 mainly provides the configuration management functions of the system, including selection of the application scenario of the evaluated softswitch device (fixed softswitch or mobile softswitch), configuration of the protocol stacks, definition and allocation of the weight parameters, and configuration of the IP address, port number, and so on.
  • the display evaluation result unit 1042 displays, after the evaluated softswitch device has been tested, the comprehensive score calculated by the comprehensive evaluation unit 1032 described above.
  • the display evaluation result unit 1042 can also display the scores of each test item, such as the score of the performance test, the score of the SIP protocol test, and even the test result of each test case.
  • the invention adopts standard test methods, unified test cases, unified test structure, and unified evaluation algorithm to ensure the comparability of the final evaluation results.
  • the results of this evaluation apply not only to comparisons between softswitch devices produced by different manufacturers, but also to comparisons between different versions of the same softswitch device produced by the same manufacturer.
  • Another preferred embodiment of the present invention, shown in Figure 5, adds a database 1033 to the comprehensive analysis module 103 on the basis of the embodiment described above.
  • the information preset in the database includes: test purpose of each test, problems that may be caused when the test fails, suggestions on how to improve, and one or more kinds of precautions in daily maintenance.
  • the display module 104 further includes a display evaluation suggesting unit 1043 for displaying one or more of the problems that may be caused by the failed test item, the suggestions for improvement, and the precautions in routine maintenance.
  • by establishing a database that associates an expert conclusion and suggestion with each test case, the invention realizes analysis of the test results and satisfies the user's needs.
  • because the database stores expert knowledge, after the evaluation test the evaluation system can easily give an expert-level evaluation conclusion for the device under test, together with suggestions for improvement and precautions.
  • the conclusion of the assessment is not a simple pass or fail, but a more in-depth analysis of the failed test items, indicating the problems that may exist.

Description

A comprehensive softswitch availability evaluation system

Technical Field

The present invention relates to the field of softswitching, and in particular to a comprehensive softswitch availability evaluation system.

Background

With the development of telecommunication technology, softswitch equipment based on packet switching technology is gradually replacing traditional switching equipment based on circuit switching. Because softswitch devices involve many, rather complex protocols, and many vendors can supply softswitch equipment, selecting and evaluating softswitch devices is a time-consuming and laborious task for the O&M staff of most operators. More importantly, since O&M personnel are generally not protocol experts themselves, it is difficult for them to form an objective and accurate evaluation of the device under evaluation from simple test results that merely indicate pass or fail.

At present, softswitches are usually evaluated through two types of tests. One is the interface protocol test, and the other is the performance test, that is, the high-traffic test.

The purpose of the interface protocol test is to check whether the interface protocols provided by the evaluated device conform to the requirements of international standards. Only by adopting interface protocols that conform to international standards can the probability of incompatibility between the evaluated device and equipment from other vendors be minimized, thereby guaranteeing interoperability between devices.

The performance test mainly examines the capacity and capability of the device, that is, whether the device can serve a large number of users simultaneously.

For these two types of tests, two kinds of instruments are available on the market: protocol analyzers and performance testers. Although existing protocol analyzers can test protocol conformance, they can only report how many items pass and how many fail, and they lack in-depth analysis of the test results. In particular, test results from different devices cannot be compared with each other. For example, item 1 of the SIP protocol fails on vendor A's device, while item 10 of the BICC protocol fails on vendor B's device; counted by items, each vendor fails exactly one item, but this does not mean that the two devices have the same capability, because different protocols play different roles in the device and different items within the same protocol carry different weights. Therefore, the test results must never be reduced to a simple pass rate when measuring the capability of the evaluated device.

Performance testers likewise suffer from incomparable results. For example, a softswitch that reaches 300 CAPS with the SIP protocol is not necessarily superior in performance to a softswitch that reaches 200 CAPS with the BICC protocol. Even within SIP, factors such as the hold time of each call, whether the call carries media, the length of each message, whether reliable provisional responses are used, and whether preconditions are used will ultimately affect the measured CAPS of the evaluated device. Therefore, to make performance test results comparable, a benchmark must be provided in advance so that the call model, parameters, and mechanisms used in every test are exactly the same, a point that existing performance test instruments fail to consider.

In general, current softswitch test instruments lack in-depth analysis of test results, the results lack comparability, and it is difficult to form an intuitive, accurate understanding of the softswitch device under test.

Summary of the Invention

In view of the defects and deficiencies in the prior art, the object of the present invention is to provide a comprehensive softswitch availability evaluation system that can comprehensively evaluate different softswitch devices and produce evaluation results that are intuitive, accurate, and comparable.
To achieve the above object, the present invention provides a comprehensive softswitch availability evaluation system, comprising:

a protocol interface module, for loading protocol stacks to implement the interfaces corresponding to the evaluated softswitch device;

a test script module, for providing protocol conformance test scripts and performance test scripts for the evaluated softswitch device;

a comprehensive analysis module, for comprehensively analyzing the tests of the evaluated softswitch device, comprising: a weight management unit for setting a weight for each test; and a comprehensive evaluation unit for giving a comprehensive assessment of the evaluated softswitch device according to the test results and the corresponding weight of each test;

a display module, for interaction between the comprehensive evaluation system and the user, comprising: a configuration management unit for providing the user with configuration management of the comprehensive evaluation system; and a display evaluation result unit for displaying the comprehensive assessment produced by the comprehensive analysis module.

Preferably, the comprehensive evaluation unit calculates, from the test results and the corresponding weight of each test according to a predetermined algorithm, the score of the evaluated softswitch device in the protocol conformance test, its score in the performance test, and its comprehensive score.

Preferably, the predetermined algorithm for the score in the protocol conformance test, the score in the performance test, and the comprehensive score is as follows:
S_sut = S_con · W_con + S_per · W_per

S_con = Σ_i (S_proto_i · W_proto_i)

S_proto_i = ( Σ_{k=1..n} b_k · W_k · 100 ) / ( Σ_{k=1..n} W_k )

where S_sut denotes the comprehensive score of the evaluated softswitch device, S_con denotes the score of the evaluated softswitch device in the protocol conformance test, S_per denotes the score of the evaluated softswitch device in the performance test, W_con denotes the weight of the interface protocol conformance test, W_per denotes the weight of the performance test, S_proto_i denotes the score obtained by protocol i in the protocol conformance test, W_proto_i denotes the weight of protocol i, b_k indicates whether the k-th test passes (taking the value 1 if it passes and 0 otherwise), and W_k denotes the weight of the k-th test.
Preferably, the comprehensive evaluation unit is preset with the performance-test score of the evaluated softswitch device as a function of the number of call attempts per second.

Preferably, the display evaluation result unit is further used to display the score of each test.

Preferably, the performance test scripts in the test script module adopt a two-stage test structure.

Preferably, the comprehensive analysis module further comprises a database, and the information preset in the database includes one or more of: the purpose of each test, its importance level, the problems that a failed test may cause, suggestions on how to improve, and precautions for routine maintenance.

Preferably, the display module further comprises a display evaluation suggestion unit, for displaying, according to the test results, one or more of: the purpose of each failed test item, its importance level, the problems it may cause, suggestions on how to improve, and precautions for routine maintenance.

The comprehensive softswitch availability evaluation system proposed by the present invention has the following characteristics: the evaluation is based on test methods from internationally recognized standards; the evaluation results are intuitive and can be used for comparisons between equipment from different suppliers and between different software versions; both the compatibility of the softswitch device's interface protocols and the performance of the device can be examined; and the workload of the operator's procurement and O&M personnel is reduced while the technical threshold of the evaluation work is lowered.

Therefore, the present invention can have at least the following beneficial effects:

Comparability of evaluation results between different softswitch devices: the invention uses standard test methods, unified test cases, a unified test structure, and a unified evaluation algorithm, which guarantees the comparability of the final evaluation results. The results are applicable not only to comparisons between softswitch devices produced by different manufacturers, but also to comparisons between different versions of the same softswitch device produced by the same manufacturer.

Simplified evaluation of softswitch equipment: previous evaluation tests of softswitch equipment were often carried out protocol by protocol, and different test cases had different requirements on the test structure and on the configuration data of the device under test, making the tests complex and the test cycle long. In fact, the test cases specified in international standards are mainly intended for equipment development, so they examine the equipment in great detail, whereas users of the equipment are more concerned with basic functions and protocol compatibility. The present invention therefore streamlines the test cases and adopts a single test structure suitable for all test cases, so that the evaluated device only needs to be configured once to complete all the tests, which greatly improves the efficiency of the evaluation test and shortens the evaluation cycle.
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings. The above and other objects, features and advantages of the present invention will become apparent to those skilled in the art from this detailed description.

Brief Description of the Drawings

Figure 1 is a schematic diagram of a first preferred embodiment of the comprehensive evaluation system proposed by the present invention;

Figure 2 is a schematic diagram of the SIP protocol performance test flow of the first preferred embodiment of the present invention;

Figure 3 is a schematic diagram of the BICC protocol performance test flow of the first preferred embodiment of the present invention;

Figure 4 is a schematic diagram of the test structure used with the first preferred embodiment of the present invention;

Figure 5 is a schematic diagram of a second preferred embodiment of the comprehensive evaluation system proposed by the present invention.
The English abbreviations used in the drawings stand for the following:

SUT: System Under Test
UA: User Agent
INVITE: invite request
SDP: Session Description Protocol
PRACK: Provisional Response Acknowledgement
ACK: Acknowledgement
MSC Server: Mobile Switching Center Server
IAM: Initial Address Message
Connect forward, Tunnel data: forward connection carrying tunnel data
Connect forward, plus notification: forward connection with notification required
APM: Application Transport Message
Action = Connected
COT: Continuity message
ACM: Address Complete Message
ANM: Answer Message
REL: Release message
RLC: Release Complete message

Detailed Description of the Embodiments
As shown in Figure 1, a first preferred embodiment of the comprehensive softswitch availability evaluation system comprises:

a protocol interface module 101, for loading protocol stacks to implement the interfaces corresponding to the evaluated softswitch device;

a test script module 102, for providing protocol conformance test scripts and performance test scripts for the evaluated softswitch device;

a comprehensive analysis module 103, for comprehensively analyzing the tests of the evaluated softswitch device; and

a display module 104, for interaction between the comprehensive evaluation system and the user;

wherein the comprehensive analysis module 103 comprises:

a weight management unit 1031, for setting a weight for each test; and

a comprehensive evaluation unit 1032, for giving a comprehensive assessment of the evaluated softswitch device according to the test results and the corresponding weight of each test;

and the display module 104 comprises:

a configuration management unit 1041, for display and for providing the user with configuration management of the comprehensive evaluation system; and

a display evaluation result unit 1042, for displaying the comprehensive assessment produced by the comprehensive analysis module.

The invention uses standard test methods, unified test cases, a unified test structure, and a unified evaluation algorithm, which guarantees the comparability of the final evaluation results.
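To make the module composition concrete, the following Python sketch shows one way the four modules of this embodiment could be wired together; all class, field, and method names are hypothetical and are not taken from the patent itself.

```python
from dataclasses import dataclass, field


@dataclass
class WeightManagementUnit:
    """Unit 1031: stores the configurable weight of every test."""
    weights: dict = field(default_factory=dict)

    def set_weight(self, test_id: str, value: float) -> None:
        self.weights[test_id] = value


@dataclass
class ComprehensiveEvaluationUnit:
    """Unit 1032: combines test results with their weights into one assessment."""

    def assess(self, results: dict, weights: dict) -> float:
        # Placeholder for the predetermined scoring algorithm described later.
        return sum(results.get(k, 0.0) * w for k, w in weights.items())


@dataclass
class ComprehensiveAnalysisModule:
    """Module 103: weight management plus comprehensive evaluation."""
    weight_unit: WeightManagementUnit = field(default_factory=WeightManagementUnit)
    evaluation_unit: ComprehensiveEvaluationUnit = field(default_factory=ComprehensiveEvaluationUnit)


@dataclass
class EvaluationSystem:
    """Top-level composition of modules 101-104 of the first embodiment."""
    protocol_interface: object   # module 101: loads the configured protocol stacks
    test_scripts: object         # module 102: conformance and performance scripts
    analysis: ComprehensiveAnalysisModule = field(default_factory=ComprehensiveAnalysisModule)
    display: object = None       # module 104: configuration management and result display
```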
The following examples describe the above modules and units and their beneficial effects in detail.

(1) Protocol interface module 101:

A softswitch usually comes in two modes, namely the fixed-network softswitch and the mobile-network softswitch. The fixed-network softswitch is commonly used in VoIP networks, and its key protocols are SIP (Session Initiation Protocol) and H.248. The mobile-network softswitch is also called the MSC server, and its key protocols are BICC (Bearer Independent Call Control) and H.248, where the BICC protocol is carried on top of the IP layer via M3UA (Message Transfer Part level 3 User Adaptation Layer) and SCTP (Stream Control Transmission Protocol). Therefore, different application scenarios place different requirements on the interface protocols. The protocol interface module 101 can load the corresponding protocol stack on an interface according to the configuration and implement the corresponding interface function. This allows the comprehensive evaluation system of the invention to support multiple protocols, laying the foundation for testing and evaluating all kinds of softswitch devices.
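A minimal sketch of the configuration-driven stack selection just described: the scenario split (fixed vs. mobile) and the BICC/M3UA/SCTP layering come from the text, while the function name, the dictionary layout, and the transport chosen for SIP and H.248 are illustrative assumptions.

```python
# Map each application scenario to the protocol stack layers it requires.
# BICC rides on M3UA and SCTP over IP, as stated in the description.
STACKS = {
    "fixed":  [["SIP", "UDP/TCP", "IP"], ["H.248", "UDP/TCP", "IP"]],
    "mobile": [["BICC", "M3UA", "SCTP", "IP"], ["H.248", "UDP/TCP", "IP"]],
}


def load_protocol_stacks(scenario: str) -> list[list[str]]:
    """Return the protocol stacks to load for the configured scenario."""
    if scenario not in STACKS:
        raise ValueError(f"unknown softswitch scenario: {scenario}")
    return STACKS[scenario]


# Example: configuring the evaluation system for a mobile softswitch (MSC server).
for stack in load_protocol_stacks("mobile"):
    print(" over ".join(stack))
```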
(2) Test script module 102:

For protocol conformance testing, international standards organizations such as ETSI (European Telecommunications Standards Institute) and the IETF (Internet Engineering Task Force) have already provided corresponding test methods for BICC, M3UA, SCTP, SIP, H.248, and other protocols. Based on these international standards, the test script module 102 provides a complete test set for each protocol.

Generally speaking, the test items given in international standards serve development purposes, so the test sets are fairly complete and contain a large number of items. Considering that a main purpose of the present invention is to help O&M personnel evaluate the compatibility of the softswitch device's interface protocols, the test sets can be streamlined on the basis of the international standards. This reduces the workload of the evaluation test and shortens the evaluation cycle.

For performance testing, international standards have not yet defined a unified test flow or call model. To make performance test results comparable, a unified test script and call model are essential.

Since the fixed softswitch and the mobile softswitch use the SIP protocol and the BICC protocol respectively as their call control protocol, the performance test mainly considers the processing capability of these two protocols. It is recommended that the SIP performance test script adopt the test flow shown in Figure 2, without media and without preconditions, and that the BICC test adopt the test flow shown in Figure 3, using the forward fast setup mode and also without media. In the call model each call is held for 5 seconds, and the total test time is 1 hour. Those skilled in the art will appreciate that other two-stage test flows may also be used.
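The comparability argument above hinges on freezing the call model before any measurement. The sketch below captures that benchmark as an immutable configuration object; the field and constant names are hypothetical, while the values (5-second hold, 1-hour run, no media, no preconditions, forward fast setup for BICC) come from the text.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class CallModel:
    """Benchmark call model that must be identical in every performance test."""
    protocol: str                     # "SIP" or "BICC"
    hold_time_s: int = 5              # each call is held for 5 seconds
    duration_s: int = 3600            # total test time: 1 hour
    with_media: bool = False          # calls carry no media
    use_preconditions: bool = False   # SIP only
    forward_fast_setup: bool = False  # BICC only


SIP_BENCHMARK = CallModel(protocol="SIP")
BICC_BENCHMARK = CallModel(protocol="BICC", forward_fast_setup=True)

# Because the dataclass is frozen, a test run cannot silently change the
# benchmark, so CAPS figures measured under it remain comparable.
```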
Both of these test flows (Figures 2 and 3) use a two-stage test structure, that is, two signaling legs are set up: a call from the evaluation system to the evaluated device, and a call from the evaluated device back to the evaluation system. The advantages of the two-stage test structure are:

1) It lowers the data configuration requirements on the evaluated device. The evaluated device does not need to be configured with terminal numbers and terminal IP addresses; as long as the corresponding routing data exist, it can forward the originating call to the called side. This advantage is particularly obvious in high-traffic tests, which often require thousands of terminals.

2) It simplifies the test method. Because a unified, standard test flow is used, all test items, including the performance test items, can be run with a single test structure, namely the test structure of Figure 4. More importantly, this test structure and test flow apply to all softswitch devices: when testing a different softswitch device, it is only necessary to connect it according to the structure of Figure 4 and configure some basic office data, such as the IP address and signaling point code, before testing can begin.

3) It can verify some of the more important functions of the SIP and BICC protocols. In addition to the basic call control functions, it can verify SIP's support for reliable provisional responses and the SDP offer/answer exchange, as well as BICC's tunnel bearer establishment mode and continuity check function. These two flows can be regarded as the most classic among the many flows of the SIP and BICC protocols.
(3) Comprehensive analysis module 103:

The comprehensive analysis module 103 comprises a weight management unit 1031 and a comprehensive evaluation unit 1032, where:

The weight management unit 1031 handles the assignment and setting of weights. Specifically, it can assign a weight to each type of test (protocol conformance or performance test), to each protocol, and even to each test case. All weight values are configurable; as a further preference, suggested values can also be given. The suggested values for each weight are as follows:

[Table of suggested weight values]

For example, the weight Wi of each test item in the protocol conformance test can be assessed by experts on a five-point scale, where 5 means the test item is of the highest importance and 1 means it is of the lowest importance; the scores given by multiple experts are averaged and used as the weight of the test item. Once the weight Wi of each test item has been determined, it is entered into the weight management unit 1031, which assigns the determined weight to each test item.
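A small sketch of the weight-assignment rule just described, averaging five-point expert ratings per test item; the function name and the sample data are hypothetical.

```python
from statistics import mean


def item_weight(expert_scores: list[int]) -> float:
    """Average several experts' 1-5 importance ratings into one item weight."""
    if not all(1 <= s <= 5 for s in expert_scores):
        raise ValueError("expert scores must be on the five-point scale (1-5)")
    return mean(expert_scores)


# Example: three experts rate two SIP conformance test items.
weights = {
    "SIP-item-1": item_weight([5, 4, 5]),   # -> 4.67
    "SIP-item-2": item_weight([2, 3, 2]),   # -> 2.33
}
```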
The comprehensive evaluation unit 1032 gives the evaluated softswitch device a comprehensive score that can be used for side-by-side comparison, computed from the results of the individual tests according to a predetermined algorithm. Preferably, the comprehensive score can be calculated as follows:

S_sut = S_con · W_con + S_per · W_per      (1-1)

S_con = Σ_i (S_proto_i · W_proto_i)      (1-2)

S_proto_i = ( Σ_{k=1..n} b_k · W_k · 100 ) / ( Σ_{k=1..n} W_k )      (1-3)

where:

S_sut denotes the comprehensive score of the evaluated softswitch device;

S_con denotes the score of the evaluated softswitch device in the protocol conformance test;

S_per denotes the score of the evaluated softswitch device in the performance test;

W_con denotes the weight of the interface protocol conformance test;

W_per denotes the weight of the performance test;

S_proto_i denotes the score obtained by protocol i in the protocol conformance test;

W_proto_i denotes the weight of protocol i, whose value is the one set in the weight management unit 1031; the suggested value may also be used;

b_k indicates whether the k-th test passes, taking the value 1 if it passes and 0 otherwise;

W_k denotes the weight of the k-th test; it is suggested that this value be given by multiple experts, with the average taken as the suggested weight of the item.
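A minimal sketch of the predetermined scoring algorithm, equations (1-1) to (1-3), assuming the normalized form of S_proto reconstructed above; the data structures, function names, and example weights are hypothetical.

```python
def protocol_score(items: list[tuple[bool, float]]) -> float:
    """Equation (1-3): weighted pass rate of one protocol, scaled to 0-100.

    Each item is (passed, weight); b_k is 1 when the item passed, else 0.
    """
    total_weight = sum(w for _, w in items)
    return sum((1 if passed else 0) * w * 100 for passed, w in items) / total_weight


def comprehensive_score(protocols: dict[str, tuple[list[tuple[bool, float]], float]],
                        s_per: float, w_con: float, w_per: float) -> float:
    """Equations (1-2) and (1-1): combine protocol scores and the performance score."""
    s_con = sum(protocol_score(items) * w_proto
                for items, w_proto in protocols.values())
    return s_con * w_con + s_per * w_per


# Example with two protocols and hypothetical weights:
protocols = {
    "SIP":  ([(True, 5), (True, 3), (False, 2)], 0.6),   # (test items, W_proto_i)
    "BICC": ([(True, 4), (True, 4)], 0.4),
}
s_sut = comprehensive_score(protocols, s_per=80.0, w_con=0.7, w_per=0.3)
```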
The value of S_per can be obtained from the CAPS (Call Attempts Per Second) measured in the actual test according to the table below, where the SIP test uses the flow of Figure 2 and the BICC test uses the flow of Figure 3. It should be noted that CAPS here refers to the number of calls initiated by UA1 or MSC Server A, all of which are calls without media.

Protocol   CAPS     S_per
SIP        > 650    100
SIP        500      80
SIP        400      60
SIP        300      40
SIP        200      20
SIP        < 50     0
BICC       ≥ 400    100
BICC       300      80
BICC       250      60
BICC       200      40
BICC       100      20
BICC       < 50     0
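The table above maps measured CAPS to S_per in steps. One possible reading, sketched below, awards the score of the highest threshold reached; the step behaviour between the listed values, the table layout, and the function name are assumptions.

```python
# (protocol -> ordered thresholds) giving the score of the highest threshold reached
SPER_TABLE = {
    "SIP":  [(650, 100), (500, 80), (400, 60), (300, 40), (200, 20), (50, 0)],
    "BICC": [(400, 100), (300, 80), (250, 60), (200, 40), (100, 20), (50, 0)],
}


def performance_score(protocol: str, measured_caps: float) -> float:
    """Look up S_per from the measured CAPS (calls without media, per the text)."""
    for threshold, score in SPER_TABLE[protocol]:
        if measured_caps >= threshold:
            return score
    return 0.0   # below the lowest threshold listed


print(performance_score("SIP", 520))   # -> 80
print(performance_score("BICC", 180))  # -> 20
```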
(4) Display module 104:

The display module 104 is the interface between the comprehensive evaluation system of the invention and the user, and mainly comprises the configuration management unit 1041 and the display evaluation result unit 1042.

The configuration management unit 1041 mainly provides the configuration management functions of the system, including selection of the application scenario of the evaluated softswitch device (fixed softswitch or mobile softswitch), configuration of the protocol stacks, definition and allocation of the weight parameters, configuration of the IP address and port number, and so on.

The display evaluation result unit 1042 displays, after the evaluated softswitch device has been tested, the comprehensive score calculated by the comprehensive evaluation unit 1032 described above. It can also display the score of each test item, such as the score of the performance test or the score of the SIP protocol test, and even the test result of every individual test case.

The invention uses standard test methods, unified test cases, a unified test structure, and a unified evaluation algorithm, which guarantees the comparability of the final evaluation results. The results are applicable not only to comparisons between softswitch devices produced by different manufacturers, but also to comparisons between different versions of the same softswitch device produced by the same manufacturer.
For softswitch O&M personnel, what matters is not only which test items the softswitch under test fails, but also what problems those failed items may cause. The latter is usually the greater concern, because O&M staff, whose professional background and experience are limited, would rather have an authoritative expert answer that second question explicitly.

Therefore, in another preferred embodiment of the present invention, shown in Figure 5, a database 1033 is added to the comprehensive analysis module 103 on the basis of the embodiment above. The information preset in the database includes one or more of: the purpose of each test, the problems that a failed test may cause, suggestions on how to improve, and precautions for routine maintenance.

Correspondingly, the display module 104 further comprises a display evaluation suggestion unit 1043, for displaying one or more of: the problems that a failed test item may cause, suggestions on how to improve, and precautions for routine maintenance.
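One way to realize the database of expert conclusions is a simple per-test-case record, as sketched below; the record fields mirror the categories listed in the text, while the class structure, test-case identifier, and sample entry are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class ExpertAdvice:
    """Expert knowledge preset for one test case (database 1033)."""
    purpose: str              # what the test case is meant to verify
    importance: int           # importance level, e.g. on the 1-5 scale
    possible_problems: str    # consequences if the test case fails
    improvement: str          # suggestion on how to improve
    maintenance_notes: str    # precautions for routine maintenance


KNOWLEDGE_BASE = {
    "SIP-PRACK-01": ExpertAdvice(
        purpose="Verify support for reliable provisional responses (PRACK)",
        importance=4,
        possible_problems="Calls relying on reliable provisional responses may fail to set up",
        improvement="Enable reliable provisional response support in the SIP stack configuration",
        maintenance_notes="Monitor the rate of failed provisional transactions in daily operation",
    ),
}


def advice_for_failures(failed_ids: list[str]) -> list[ExpertAdvice]:
    """What the display evaluation suggestion unit 1043 would show for failed items."""
    return [KNOWLEDGE_BASE[i] for i in failed_ids if i in KNOWLEDGE_BASE]
```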
It is precisely by establishing such a database and associating an expert conclusion and suggestion with every test case that the invention realizes the analysis of the test results and satisfies the users' needs.

Because the database stores expert knowledge, after an evaluation test finishes the evaluation system can easily give an expert-level evaluation conclusion for the device under test, together with suggestions for improvement, precautions, and other advice. The conclusion of the evaluation is not a simple pass or fail; instead, a deeper analysis of the failed test items is given, indicating the problems that may exist.

Although the present invention has been clearly described through the above embodiments and the accompanying drawings, those skilled in the art can make various corresponding changes and modifications according to the present invention without departing from the spirit and essence of the invention, and all such changes and modifications shall fall within the protection scope of the claims of the present invention.

Claims

Claims

1. A comprehensive softswitch availability evaluation system, characterized by comprising:

a protocol interface module, for loading protocol stacks to implement the interfaces corresponding to the evaluated softswitch device;

a test script module, for providing protocol conformance test scripts and performance test scripts for the evaluated softswitch device;

a comprehensive analysis module, for comprehensively analyzing the tests of the evaluated softswitch device, comprising: a weight management unit, for setting a weight for each test; and a comprehensive evaluation unit, for giving a comprehensive assessment of the evaluated softswitch device according to the test results and the corresponding weight of each test; and

a display module, for interaction between the comprehensive evaluation system and the user, comprising: a configuration management unit, for providing the user with configuration management of the comprehensive evaluation system; and a display evaluation result unit, for displaying the comprehensive assessment produced by the comprehensive analysis module.
2. The comprehensive softswitch availability evaluation system according to claim 1, characterized in that the comprehensive evaluation unit is configured to calculate, from the test results and the corresponding weight of each test according to a predetermined algorithm, the score of the evaluated softswitch device in the protocol conformance test, its score in the performance test, and its comprehensive score.
3. The comprehensive softswitch availability evaluation system according to claim 2, characterized in that the predetermined algorithm for the score in the protocol conformance test, the score in the performance test, and the comprehensive score is as follows:

S_sut = S_con · W_con + S_per · W_per

S_con = Σ_i (S_proto_i · W_proto_i)

S_proto_i = ( Σ_{k=1..n} b_k · W_k · 100 ) / ( Σ_{k=1..n} W_k )

where S_sut denotes the comprehensive score of the evaluated softswitch device, S_con denotes the score of the evaluated softswitch device in the protocol conformance test, S_per denotes the score of the evaluated softswitch device in the performance test, W_con denotes the weight of the interface protocol conformance test, W_per denotes the weight of the performance test, S_proto_i denotes the score obtained by protocol i in the protocol conformance test, W_proto_i denotes the weight of protocol i, b_k indicates whether the k-th test passes, taking the value 1 if it passes and 0 otherwise, and W_k denotes the weight of the k-th test.
4. The comprehensive softswitch availability evaluation system according to claim 3, characterized in that the comprehensive evaluation unit is preset with the performance-test score of the evaluated softswitch device as a function of the number of call attempts per second.

5. The comprehensive softswitch availability evaluation system according to any one of claims 2, 3 and 4, characterized in that the display evaluation result unit is further used to display the score of each test.

6. The comprehensive softswitch availability evaluation system according to claim 1, characterized in that the performance test scripts in the test script module adopt a two-stage test structure.

7. The comprehensive softswitch availability evaluation system according to claim 1, characterized in that the comprehensive analysis module further comprises a database, and the information preset in the database includes one or more of: the purpose of each test, its importance level, the problems that a failed test may cause, suggestions on how to improve, and precautions for routine maintenance.

8. The comprehensive softswitch availability evaluation system according to claim 7, characterized in that the display module further comprises a display evaluation suggestion unit, for displaying, according to the test results, one or more of: the purpose of each failed test item, its importance level, the problems it may cause, suggestions on how to improve, and precautions for routine maintenance.
PCT/CN2010/000805 2010-06-08 2010-06-08 Comprehensive softswitch availability evaluation system WO2011153657A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201080067273.8A CN103098417B (zh) 2010-06-08 2010-06-08 Comprehensive softswitch availability evaluation system
PCT/CN2010/000805 WO2011153657A1 (zh) 2010-06-08 2010-06-08 Comprehensive softswitch availability evaluation system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2010/000805 WO2011153657A1 (zh) 2010-06-08 2010-06-08 Comprehensive softswitch availability evaluation system

Publications (1)

Publication Number Publication Date
WO2011153657A1 true WO2011153657A1 (zh) 2011-12-15

Family

ID=45097431

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2010/000805 WO2011153657A1 (zh) Comprehensive softswitch availability evaluation system

Country Status (2)

Country Link
CN (1) CN103098417B (zh)
WO (1) WO2011153657A1 (zh)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1809003A (zh) * 2006-02-08 2006-07-26 信息产业部电信传输研究所 Test system and method for testing the performance of mobile softswitch equipment
CN1905497A (zh) * 2006-07-31 2007-01-31 西安西电捷通无线网络通信有限公司 Evaluation method and evaluation apparatus for an end-to-end service level agreement
CN1988483A (zh) * 2006-12-22 2007-06-27 武汉市中光通信公司 Softswitch equipment test system based on network element simulation

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN100369422C (zh) * 2004-09-28 2008-02-13 中兴通讯股份有限公司 Integrated tester and integrated test method for a media server

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1809003A (zh) * 2006-02-08 2006-07-26 信息产业部电信传输研究所 Test system and method for testing the performance of mobile softswitch equipment
CN1905497A (zh) * 2006-07-31 2007-01-31 西安西电捷通无线网络通信有限公司 Evaluation method and evaluation apparatus for an end-to-end service level agreement
CN1988483A (zh) * 2006-12-22 2007-06-27 武汉市中光通信公司 Softswitch equipment test system based on network element simulation

Also Published As

Publication number Publication date
CN103098417A (zh) 2013-05-08
CN103098417B (zh) 2016-04-06

Similar Documents

Publication Publication Date Title
US20110202645A1 (en) Methods and Apparatus to Test Network Elements
JP2015523755A (ja) 音声品質劣化推定を可能にするネットワーク通信のための障害シミュレーション
US7616740B2 (en) Method, system, and computer-readable medium for simulating a converged network with a single media gateway and media gateway controller
US11811844B2 (en) Product validation based on simulated enhanced calling or messaging communications services in telecommunications network
WO2006091820A2 (en) Voip call through tester
CN104486359B (zh) Ims网络的语音质量的测试方法及装置、监控方法及系统
US8213327B2 (en) Communication quality measurement system, device, management server and method thereof
WO2007056921A1 (fr) Dispositif de verification de protocole et procede de verification de protocole associe
US11849492B2 (en) Unified query tool for network function virtualization architecture
JP2006340348A (ja) Voipネットワークの特性付け
US20230081333A1 (en) Unified interface and tracing tool for network function virtualization architecture
CN105471680A (zh) 一种网络语音通信测试系统及方法
US9699674B2 (en) Technique for testing wireless network load produced by mobile app-carrying devices
WO2017054490A1 (zh) 一种会话业务测试管理方法和系统及管理端、测试端
TWI638548B (zh) 寬頻迴路障礙查測系統
US20100278049A1 (en) System and Method for Testing a Dynamic Communication Across a Network
US8184633B2 (en) Automated interoperability assessments based on iterative profling and emulation of SIP or T.38 fax-relay enabled devices
US8687502B2 (en) Method and apparatus for enabling auto-ticketing for endpoint devices
WO2011153657A1 (zh) 一种软交换可用性综合评估系统
WO2011072948A1 (en) Connection analysis in communication systems
US8406380B2 (en) Test phone using SIP
CN105282541B (zh) 一种应用于视频点播系统的测试装置及测试方法
US20170041210A1 (en) VoIP QUALITY TEST VIA MANUAL PHONE CALL INTO VoIP MONITORING SYSTEM
CN106411836B (zh) 通话连接方法及客户端
Zhang et al. Benchmarking the session initiation protocol (SIP)

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201080067273.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 10852650

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 10852650

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: "NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC, EPO FORM 1205A DATED 13.06.2013."

122 Ep: pct application non-entry in european phase

Ref document number: 10852650

Country of ref document: EP

Kind code of ref document: A1