ETSI TR 101 577 V1.1.1 (2011-12)

Methods for Testing and Specifications (MTS);
Performance Testing of Distributed Systems;
Concepts and Terminology

Technical Report

Reference
DTR/MTS-00120PerfTestDistSys

Keywords
performance, terminology, testing

ETSI
650 Route des Lucioles
F-06921 Sophia Antipolis Cedex - FRANCE
Tel.: +33 4 92 94 42 00   Fax: +33 4 93 65 47 16
Siret N° 348 623 562 00017 - NAF 742 C
Association à but non lucratif enregistrée à la Sous-Préfecture de Grasse (06) N° 7803/88

Important notice

Individual copies of the present document can be downloaded from: http://www.etsi.org

The present document may be made available in more than one electronic version or in print. In any case of existing or perceived difference in contents between such versions, the reference version is the Portable Document Format (PDF). In case of dispute, the reference shall be the printing on ETSI printers of the PDF version kept on a specific network drive within the ETSI Secretariat.

Users of the present document should be aware that the document may be subject to revision or change of status. Information on the current status of this and other ETSI documents is available at http://portal.etsi.org/tb/status/status.asp

If you find errors in the present document, please send your comment to one of the following services: http://portal.etsi.org/chaircor/ETSI_support.asp

Copyright Notification

No part may be reproduced except as authorized by written permission. The copyright and the foregoing restriction extend to reproduction in all media.

© European Telecommunications Standards Institute 2011. All rights reserved.

DECT™, PLUGTESTS™, UMTS™ and the ETSI logo are Trade Marks of ETSI registered for the benefit of its Members. 3GPP™ and LTE™ are Trade Marks of ETSI registered for the benefit of its Members and of the 3GPP Organizational Partners. GSM® and the GSM logo are Trade Marks registered and owned by the GSM Association.

Contents

Intellectual Property Rights
Foreword
Introduction
1 Scope
1.1 The organization of the present document
2 References
2.1 Normative references
2.2 Informative references
3 Definitions and abbreviations
3.1 Definitions
3.2 Abbreviations
4 Performance characteristics
4.1 Classifying performance characteristics into categories
4.2 Powerfulness characteristics
4.2.1 Responsiveness characteristics
4.2.2 Capacity characteristics
4.2.3 Scalability characteristics
4.3 Reliability characteristics
4.3.1 Quality-of-Service characteristics
4.3.2 Stability characteristics
4.3.3 Availability characteristics
4.3.4 Robustness characteristics
4.3.5 Recovery characteristics
4.3.6 Correctness characteristics
4.4 Efficiency characteristics
4.4.1 Service resource usage characteristics
4.4.2 Service resource linearity characteristics
4.4.3 Service resource scalability characteristics
4.4.4 Service resource bottleneck characteristics
4.4.5 Platform resource utilization characteristics
4.4.6 Platform resource distribution characteristics
4.4.7 Platform resource scalability characteristics
5 Measured objects
5.1 Measured services
5.2 Measured components
5.3 Service concepts
5.3.1 Service and component performance
5.3.2 Service topology and topology performance
5.4 Service characteristics
5.4.1 Service initiation characteristics
5.4.2 Service duration characteristics
5.4.3 Service resource and load characteristics
5.4.4 Service design characteristics
5.4.5 Service flow characteristics
5.5 Service Interfaces
5.5.1 Application Programming Interfaces (API)
5.5.2 Communication Protocol Interfaces
6 Performance measurement data objectives and attributes
6.1 Performance metric objectives
6.2 Measurement data attribute sets
6.3 Processing attributes or Metric types
6.3.1 Metrics based on raw performance data
6.3.2 Metrics based on normalized performance data
6.3.3 Metrics based on transformed performance data
6.3.4 Metrics based on composite performance data
6.4 Identification attributes or Metric identifiers
6.4.1 Measurement type
6.4.2 Measurement points
6.4.3 Measurement recording time
6.5 Unit attributes or Metric formats
6.6 Conditional attributes
6.6.1 Requested conditions
6.6.2 Actual conditions
7 Abstract performance metrics
7.1 Abstract performance metrics and performance categories
7.2 Abstract powerfulness metrics
7.2.1 Capacity metrics and related attributes
7.2.2 Responsiveness metrics and related attributes
7.2.3 Scalability metrics and related attributes
7.3 Abstract reliability metrics
7.3.1 Quality-of-Service metrics and related attributes
7.3.2 Stability metrics and related attributes
7.3.3 Availability metrics and related attributes
7.3.4 Robustness metrics and related attributes
7.3.5 Recovery metrics and related attributes
7.3.6 Correctness metrics and related attributes
7.4 Abstract efficiency metrics
7.4.1 Service resource usage metrics and related attributes
7.4.2 Service resource linearity metrics and related attributes
7.4.3 Service resource scalability characteristics
7.4.4 Platform resource utilization metrics and related attributes
7.4.5 Platform resource distribution metrics and related attributes
7.4.6 Platform resource scalability metrics and related attributes
8 Performance data processing
8.1 Steps in performance data processing
8.2 Time series of performance data
8.3 Collection and storage of raw performance data
8.4 Condensation and normalization of raw performance data
8.5 Performance data computations
8.5.1 Trend analysis
8.5.2 Comparisons of regression tests
8.5.3 Computations of composite performance metrics
8.6 Evaluation of performance data
8.7 Presentation of performance data
9 General performance test concepts
9.1 Performance tests
9.2 Performance tests and system life cycle phases
9.2.1 Pre-deployment performance test applications
9.2.2 Post-deployment performance test applications
9.3 Performance test objectives
9.3.1 Confirmative performance tests
9.3.2 Explorative performance tests
9.4 Performance objectives and performance requirements
9.5 Performance measurement conditions
9.5.1 External measurement conditions
9.5.2 Internal measurement conditions
9.5.3 Example
9.6 Performance targets
9.7 Performance measurements standards
9.8 Some performance test characteristics
9.8.1 Test coverage
9.8.2 Test purposes
9.8.3 Test cases
9.8.4 Test concurrency
9.8.5 Test resources
9.8.6 Test execution and test case
9.8.7 Test execution time
9.8.8 Recorded test data
9.8.9 Test data evaluation and test results
10 Performance test environment
10.1 Test environment concepts
10.1.1 Test Bed concepts
10.1.2 Test Site concepts
10.2 System Under Test concepts
10.2.1 System Under Test components
10.2.2 Borders of a System Under Test
10.2.3 System Under Test replacements
10.3 Test System concepts
10.3.1 Performance Test Tools
10.3.2 Service handling tools
10.3.3 Service Simulation Tools
10.3.4 Performance data recording tools
10.3.5 Performance test monitoring tools
10.3.6 Performance data processing tools
10.3.7 Performance evaluation tools
10.3.8 Performance presentation tools
11 Performance test specifications
11.1 Elements of performance test specifications
11.2 Test objectives
11.3 Test conditions
11.3.1 Test specification prerequisites
11.3.2 Test Execution Pre-conditions
11.3.3 Operational Measurement conditions
11.3.4 Test Execution Post-conditions
11.4 Test configurations
11.4.1 Workload specifications
11.4.2 Test bed specifications
11.4.3 Data collection specifications
11.5 Test Data Specifications
11.5.1 Test Data for service requests
11.5.2 Test Data for SUT operability
11.5.3 Test Data for performance evaluation
11.6 Test evaluation specifications
12 Workload concepts
12.1 Workload set or Traffic set
12.2 Workload content
12.2.1 User Session Scenarios
12.2.2 Requested Service Profile
12.2.3 Service scenarios
12.3 Workload volume
12.4 Load concepts
12.4.1 User session driven load
12.4.2 Traffic rate driven load
12.5 Workload time distribution
12.5.1 Load profiles
12.5.2 Load patterns
History

Intellectual Property Rights
IPRs essential or potentially essential to the present document may have been declared to ETSI. The information pertaining to these essential IPRs, if any, is publicly available for ETSI members and non-members, and can be found in ETSI SR 000 314: "Intellectual Property Rights (IPRs); Essential, or potentially Essential, IPRs notified to ETSI in respect of ETSI standards", which is available from the ETSI Secretariat. Latest updates are available on the ETSI Web server (http://ipr.etsi.org).

Pursuant to the ETSI IPR Policy, no investigation, including IPR searches, has been carried out by ETSI. No guarantee can be given as to the existence of other IPRs not referenced in ETSI SR 000 314 (or the updates on the ETSI Web server) which are, or may be, or may become, essential to the present document.

Foreword

This Technical Report (TR) has been produced by ETSI Technical Committee Methods for Testing and Specification (MTS).

Introduction

The background to the present document is that few standard specifications define any kind of performance characteristics or performance targets. A common explanation for this is that capacity issues are not a matter for standardization bodies, but for product vendors and product buyers to decide.
This opinion, however, overlooks the need to define and apply other performance characteristics, such as reliability, stability, efficiency and many others. Another, more fundamental, reason for writing the present document is that there are no strict definitions of which characteristics should be regarded as indicators of performance. In the absence of strict definitions in this area, we tend to talk about performance characteristics in terms of types of performance tests, such as robustness tests or availability tests. A consequence of this is that conformance testing as specified in ISO 9646 [i.1] does not explicitly cover performance tests.

A starting point is therefore a set of terminology and descriptions of concepts in performance testing that can be accepted as a common basis for discussions about performance and performance tests.
1 Scope

The present document describes terminology and concepts of performance testing, taking a generalized view of performance characteristics as its starting point. Which characteristics are indicators of a product's performance, and which measurement data are captured and processed to provide relevant figures on the requested performance, are in this view the kernel of performance testing. Methods for performance testing will consequently be guided by the requirements on the expected output. A set of subsequent documents will describe strategies, methodologies and techniques of performance testing.

1.1 The organization of the present document

The present document has two parts. Part one is about performance characteristics and performance metrics. It contains a general view of performance characteristics, followed by a general view of measured objects, or what is measured in performance tests.
Part one also contains general objectives and attributes of performance data and performance metrics, i.e. a conceptual view of performance metrics, followed by a catalogue of abstract performance metrics that covers the performance characteristics described earlier. At the end of part one is a conceptual view of performance data processing, i.e. the steps between captured performance data and presented performance results.

Part two is about performance testing concepts. It starts with a general view of performance testing, followed by a conceptual view of the test environment and, finally, a conceptual view of performance test specifications.

2 References

References are either specific (identified by date of publication and/or edition number or version number) or non-specific. For specific references, only the cited version applies.
For non-specific references, the latest version of the reference document (including any amendments) applies.

Referenced documents which are not found to be publicly available in the expected location might be found at http://docbox.etsi.org/Reference.

NOTE: While any hyperlinks included in this clause were valid at the time of publication, ETSI cannot guarantee their long term validity.

2.1 Normative references

The following referenced documents are necessary for the application of the present document.

Not applicable.

2.2 Informative references

The following referenced documents are not necessary for the application of the present document but they assist the user with regard to a particular subject area.

[i.1] ISO 9646: "Information technology - Open Systems Interconnection - Conformance testing methodology and framework".

3 Definitions and abbreviations

3.1 Definitions

For the purposes of the present document, the following terms and definitions apply:
absolute measurement time: time of measurement recording expressed as calendar time

actual measurement conditions: recorded conditions on the SUT and the TS when measurement data is recorded

actual external measurement conditions: recorded conditions on the TS when measurement data is recorded

actual internal measurement conditions: recorded conditions on the SUT when measurement data is recorded

application layer protocols: protocols that correspond to layer 7 in the OSI model, such as DHCP, HTTP, and SIP

application pre-conditions: requested conditions on the SUT before a performance test can start
architectural bottleneck: severe limitation of the throughput capacity of a system service, related to the system's architecture

artificial load: load generated by a Test System (TS) on a System Under Test (SUT)

availability characteristics: subcategory of reliability characteristics describing a system's ability to deliver its services without interruption

availability metrics: group of reliability metrics indicating a system's ability to uninterruptedly deliver its services

back-end borders: intersections between the SUT and Service responding tools of the TS for outgoing service requests from the SUT
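EXAMPLE: The measurement-related terms defined above (absolute measurement time, actual internal measurement conditions, actual external measurement conditions) and the notion of availability metrics can be illustrated with a small data-structure sketch. The Python fragment below is purely illustrative and not part of the present document: all class, field and function names are hypothetical, and the availability ratio shown is one common convention rather than a formula defined by this TR.

    # Illustrative sketch only; names are hypothetical and not taken from the TR.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class MeasurementRecord:
        # absolute measurement time: recording time expressed as calendar time
        absolute_measurement_time: datetime
        value: float  # raw measurement value, e.g. a response time in milliseconds
        # actual internal measurement conditions: recorded conditions on the SUT
        actual_internal_conditions: dict = field(default_factory=dict)
        # actual external measurement conditions: recorded conditions on the TS
        actual_external_conditions: dict = field(default_factory=dict)

    def availability_ratio(total_time_s: float, downtime_s: float) -> float:
        # One common way to express an availability metric: the fraction of the
        # observation period during which the service was delivered uninterruptedly.
        return (total_time_s - downtime_s) / total_time_s

    record = MeasurementRecord(
        absolute_measurement_time=datetime.now(timezone.utc),
        value=12.5,
        actual_internal_conditions={"sut_cpu_utilization": 0.63},
        actual_external_conditions={"ts_offered_load_rps": 200.0},
    )
    print(availability_ratio(total_time_s=3600.0, downtime_s=18.0))  # 0.995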