ISO/IEC 14756:1999 Information technology - Measurement and rating of performance of computer-based software systems


INTERNATIONAL STANDARD ISO/IEC 14756
First edition 1999-11-15

Information technology - Measurement and rating of performance of computer-based software systems

Technologies de l'information - Mesurage et gradation de la performance des systèmes de logiciels d'ordinateurs

This material is reproduced from ISO documents under International Organization for Standardization (ISO) Copyright License Number HIS/CC/1996. Not for resale. No part of these ISO documents may be reproduced in any form, electronic retrieval system or otherwise, except as allowed in the copyright law of the country of use, or with the prior written consent of ISO (Case postale 56, CH-1211 Geneva 20, Switzerland, Fax +41 22 734 10 79), IHS or the ISO Licensor's members.

Reference number ISO/IEC 14756:1999(E)

© ISO/IEC 1999. All rights reserved. Unless otherwise specified, no part of this publication may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying and microfilm, without permission in writing from the publisher. ISO/IEC Copyright Office, Case postale 56, CH-1211 Genève 20, Switzerland. Printed in Switzerland.

Contents

Foreword v
Introduction vi

Section 1: General 1
1 Scope 1
2 Conformance 3
3 Normative reference 3
4 Definitions 4
5 Abbreviations and symbols 7
5.1 Abbreviations 7
5.2 Symbols 8

Section 2: Principles of measurement and rating 10
6 The measurement 10
6.1 Configuration requirements 10
6.2 User emulation 10
6.2.1 Random user behaviour 10
6.2.2 Remote terminal emulator 10
6.2.3 Workload parameter set 11
6.2.4 Parameter set for proving the accuracy of the user emulation 11
6.3 The measurement procedure 12
6.3.1 The time phases of the measurement procedure 12
6.3.2 Writing a measurement logfile 13
6.3.3 Writing a computation result file 13
6.4 Proof of validity of the measurement 13
6.4.1 Proof of the CBSS's computational correctness 13
6.4.2 Proof of the remote terminal emulator's accuracy 13
6.4.3 Proof of the measurement results' statistical significance 13
7 Calculation of the performance values of the SUT 14
7.1 Mean execution time 14
7.2 Throughput 14
7.3 Timely throughput 14
8 Basic data for rating 14
8.1 User requirements 14
8.2 The reference environment for rating software efficiency 14
8.2.1 Reference environment for assessing application software efficiency 15
8.2.2 Reference environment for assessing system software efficiency 15
9 Rating the performance values 15
9.1 Computing the performance reference values 15
9.1.1 Mean execution time reference values 15
9.1.2 Throughput reference values 15
9.2 Computing the performance rating values 15
9.2.1 The mean execution time rating values 15
9.2.2 Throughput rating values 15
9.2.3 The timeliness rating values 16
9.3 Rating the overall performance of the SUT 16
9.4 Assessment of performance 17
9.4.1 The steps of assessment process 17
9.4.2 Weak reference environment 17

Section 3: Detailed procedure for measurement and rating 18
10 Input requirements 18
10.1 The SUT description 18
10.1.1 Specification of the hardware architecture and configuration 18
10.1.2 Specification of the system software configuration 18
10.1.3 The application programs 19
10.1.4 Additional software required for the measurement run 19
10.1.5 The stored data 19
10.1.6 Additional information for proof 19
10.2 The workload parameter set 19
10.2.1 The activity types 19
10.2.2 Activity input variation 20
10.2.3 The task types with timeliness function and task mode 20
10.2.4 The chain types and their frequencies 21
10.2.5 Preparation times mean values and their standard deviations 21
10.3 Input for measurement validation 22
10.3.1 Correct computation results 22
10.3.2 Variation of input data and its resulting output 22
10.3.3 Criteria for precision of working of the RTE 22
10.3.4 Criteria for statistical validity of results 22
11 The measurement 22
11.1 The measurement procedure 22
11.2 Individual rating interval 23
12 Output from measurement procedure 25
12.1 Measurement logfile 25
12.2 Computation result file 25
13 Validation of measurements 26
13.1 Validation of the computational correctness of the SUT 26
13.2 Validation of the accuracy of the RTE 26
13.2.1 Validity test by checking the relative chain frequencies 26
13.2.2 Validity test by checking the preparation times 26
13.3 Validation of the statistical significance of the measured mean execution time 27
14 Calculation of the performance values of the SUT 28
14.1 Mean execution time 28
14.2 Throughput 28
14.3 Timely throughput 28
15 Rating the measured performance values of the SUT 29
15.1 Specification of rating level 29
15.2 Computing performance reference values 29
15.2.1 Mean execution time reference values 29
15.2.2 Throughput reference values 29
15.3 Computing rating values 29
15.3.1 Computing mean execution time rating values 29
15.3.2 Computing throughput rating values 30
15.3.3 Computing timeliness rating values 30
15.4 Rating 30
15.4.1 Mean execution time rating 30
15.4.2 Throughput rating 31
15.4.3 Timeliness rating 31
15.4.4 Overall rating 31
Annex A (normative) Specification of the RTE's basic functions 32
Annex B (normative) Additional calculation formulas 33
Annex C (normative) Format of the workload description 41
Annex D (normative) Format of the logfile 45
Annex E (informative) Utility programs 46
Annex F (informative) Examples of workloads 48

Foreword

ISO (the International Organization for Standardization) and IEC (the International Electrotechnical Commission) form the specialized system for worldwide standardization. National bodies that are members of ISO or IEC participate in the development of International Standards through technical committees established by the respective organization to deal with particular fields of technical activity. ISO and IEC technical committees collaborate in fields of mutual interest. Other international organizations, governmental and non-governmental, in liaison with ISO and IEC, also take part in the work. In the field of information technology, ISO and IEC have established a joint technical committee, ISO/IEC JTC 1.

Draft International Standards adopted by the joint technical committee are circulated to national bodies for voting. Publication as an International Standard requires approval by at least 75 % of the national bodies casting a vote.

International Standard ISO/IEC 14756 was prepared by Joint Technical Committee ISO/IEC JTC 1, Information technology, Subcommittee SC 7, Software engineering.

Annexes A to D form an integral part of this International Standard. Annexes E and F are for information only.

Introduction

In both the planning and using of data processing systems, the speed of execution is a significant property. This property is influenced greatly by the efficiency of the software used in the system. Measuring the speed of the system as well as the influence of the efficiency of the software is of elementary interest. In order to measure the influence of software on the time behaviour of a data processing system it is necessary to measure the time behaviour of the whole system. Based on the metrics of the measurement procedure proposed in this standard it is possible to define and to compute the values of the time efficiency of the software.

It is important that time behaviour characteristics are estimated in a reproducible way. Therefore it is not possible to use human users in the experiment. One reason is that human users cannot reproduce longer phases of computer usage several times without deviations in characteristics of usage. Another reason is that it would be too expensive to carry out such experiments with human users if the job or task stream comes from many users. Therefore an emulator is used which emulates all users by use of a second data processing system. This means that measurement and rating of performance according to this International Standard needs a tool. This tool is the emulator which shall work according to the specifications of this standard. It has to be proven that the emulator used actually fulfils these specifications.
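As a non-normative illustration of this emulation idea, the following Python sketch shows a single emulated user that alternates randomly drawn preparation (think) times with task submissions and writes one timestamped record per completed task to a logfile. The CSV record layout, the placeholder submit_task() function and the chosen think-time model are assumptions made for this sketch only; the normative RTE requirements are specified in Annex A.

```python
# Illustrative sketch only: emulates one user of the system under test (SUT).
# The record format, think-time model and submit_task() transport are hypothetical.
import csv
import random
import time

def submit_task(task_type):
    """Placeholder for sending one task to the SUT and waiting for its completion."""
    time.sleep(random.uniform(0.05, 0.2))  # stands in for the real request/response round trip

def emulate_user(user_id, task_types, n_tasks, logfile, mean_prep=2.0, sd_prep=0.5):
    with open(logfile, "a", newline="") as f:
        log = csv.writer(f)
        for _ in range(n_tasks):
            # Preparation ("think") time with a given mean and standard deviation,
            # drawn here from a normal distribution truncated at zero.
            prep = max(0.0, random.gauss(mean_prep, sd_prep))
            time.sleep(prep)
            task_type = random.choice(task_types)
            start = time.time()
            submit_task(task_type)
            end = time.time()
            # One logfile record per completed task: who, what, start time, end time.
            log.writerow([user_id, task_type, f"{start:.6f}", f"{end:.6f}"])

if __name__ == "__main__":
    emulate_user(user_id=1, task_types=["enquiry", "update"], n_tasks=5,
                 logfile="measurement_log.csv")
```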

All relevant details of this experiment are recorded in a logfile by the user emulator. From this logfile the values which describe the time behaviour (for instance response times and throughput values) can be computed. From these performance values the software efficiency rating values will be computed.
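By way of illustration only, the sketch below derives two such time-behaviour values, a per-task-type mean execution time and an overall throughput, from a logfile in the hypothetical CSV layout used in the previous sketch. The normative definitions of these values are given in clauses 7 and 14.

```python
# Illustrative sketch only: derives time-behaviour values from the logfile written above.
# The record layout is the hypothetical CSV one; the normative definitions are in clauses 7 and 14.
import csv
from collections import defaultdict

def analyse_logfile(logfile):
    exec_times = defaultdict(list)          # task type -> list of execution times [s]
    t_first, t_last, completed = None, None, 0
    with open(logfile, newline="") as f:
        for user_id, task_type, start, end in csv.reader(f):
            start, end = float(start), float(end)
            exec_times[task_type].append(end - start)
            t_first = start if t_first is None else min(t_first, start)
            t_last = end if t_last is None else max(t_last, end)
            completed += 1
    for task_type, times in exec_times.items():
        print(f"mean execution time ({task_type}): {sum(times) / len(times):.3f} s")
    if t_first is not None and t_last > t_first:
        print(f"throughput: {completed / (t_last - t_first):.3f} tasks/s")

if __name__ == "__main__":
    analyse_logfile("measurement_log.csv")
```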

Not all of these values are always necessary to carry out a measurement and rating procedure. For instance, if a simple workload having only a few interactive task types or only a simple sequence of batch jobs is used, then only a small subset of all terms and values which are defined is required. This method also allows the measuring and rating of a large and complex computer-based software system (CBSS) processing a complex job or task stream which is generated by a large set of many different users.

As far as it is necessary the definitions include mathematical terms. This is in order to obtain an exact mathematical basis for the computations of performance and rating values and for checking the correctness of the measurement run and rating steps as well as for the (statistical) significance of the performance values and rating results.

The result of a measurement consists of the calculated performance values. These are throughput values and execution time values. The final result of performance assessment of a CBSS consists of the rating values. They are gained by comparing the calculated performance values with the users' requirements. In addition it is possible - if desired - to rate the performance values of the CBSS under test by comparing them with those of a reference CBSS (for instance having the same hardware configuration but another version of the application program with the same functionality). The result of the rating procedure is a set of values, each being greater than, less than or equal to 1. The rating values have the meaning of "better than", "worse than" or "equal to" the defined requirements (or the properties of a second system under test used as a reference). The final set of rating values assesses each of the task types which are defined separately in the workload.
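The following sketch illustrates only the "greater than, less than or equal to 1" idea; the normative rating formulas are specified in clause 15 and Annex B. The ratio orientation chosen here (reference value divided by measured mean execution time, so that a faster system rates above 1) and the numbers are assumptions for the example.

```python
# Illustrative sketch only of the ">1 / <1 / =1" meaning of rating values.
# The normative rating formulas are defined in clause 15 and Annex B; the ratio
# orientation used here is an assumption for illustration.

def rating_value(reference, measured):
    """Compare a measured mean execution time against a reference (required) one."""
    return reference / measured

def interpret(r):
    if r > 1.0:
        return "better than the requirement (or the reference system)"
    if r < 1.0:
        return "worse than the requirement (or the reference system)"
    return "equal to the requirement (or the reference system)"

# Example: the user requires a mean execution time of 2.0 s for a task type and the
# SUT achieved 1.6 s, so the rating value is 1.25 ("better than").
r = rating_value(reference=2.0, measured=1.6)
print(f"rating value: {r:.2f} -> {interpret(r)}")
```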

Annexes E and F contain software as well as special data that are not printable. Therefore they are delivered on the CD-ROM which constitutes this International Standard. A short overview is provided in both annexes.

INTERNATIONAL STANDARD ISO/IEC 14756:1999(E)

Information technology - Measurement and rating of performance of computer-based software systems

Section 1: General

1 Scope

This International Standard defines how user oriented performance of computer-based software systems (CBSS) may be measured and rated.

A CBSS is a data processing system as it is seen by its users, e.g. by users at various terminals, or as it is seen by operational users and business users at the data processing center. A CBSS includes hardware and all its software (system software and application software) which is needed to realize the data processing functions required by the users or which may influence the CBSS's time behaviour.

This International Standard is applicable for tests of all time constrained systems or system parts. Also a network may be part of a system or may be the main subject of a test. The method defined in this International Standard is not limited to special cases like classic batch or terminal-host systems; e.g. also included are client server systems or, with a broader comprehension of the definition of task, real time systems. But the practicability of tests may be limited by the expenditure required to test large environments.

This International Standard specifies the key figures of user oriented performance terms and specifies a method of measuring and rating these performance values. The specified performance values are those which describe the execution speed of user orders (tasks), namely the triple of:
- execution time,
- throughput,
- timeliness.

The user orders, subsequently called tasks, may be of simple or complex internal structure. A task may be a job, transaction, process or a more complex structure, but with a defined start and end depending on the needs of the evaluator. When evaluating the performance it is possible to use this International Standard for measuring the time behaviour with reference to business transaction completion times in addition to other individual response times. The rating is done with respect to users' requirements or by comparing two or more measured systems (types or versions).
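As an informal illustration of a task with a defined start and end, the sketch below models a task by its two timestamps and counts how many tasks finished within a per-type time limit, which is the intuition behind timely throughput. The normative timeliness function and the definitions of timely throughput are given in 10.2.3, 7.3 and 14.3; the fixed per-type limits used here are a simplification assumed for the example.

```python
# Illustrative sketch only: a task with a defined start and end, plus a simple
# timeliness check. The fixed per-type time limits are an assumed simplification.
from dataclasses import dataclass

@dataclass
class Task:
    task_type: str      # e.g. a job, a transaction or a business process step
    start: float        # submission time [s]
    end: float          # completion time [s]

    @property
    def execution_time(self):
        return self.end - self.start

def timely_throughput(tasks, limits, interval):
    """Number of tasks per second that finished within their per-type time limit."""
    timely = sum(1 for t in tasks if t.execution_time <= limits[t.task_type])
    return timely / interval

tasks = [Task("enquiry", 0.0, 1.2), Task("enquiry", 5.0, 7.5), Task("update", 9.0, 9.8)]
print(timely_throughput(tasks, limits={"enquiry": 2.0, "update": 1.0}, interval=10.0))
```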

Intentionally no proposals for measuring internal values, such as:
- utilisation values,
- mean instruction rates,
- path lengths,
- cache hit rates,
- queuing times,
- service times,
are given, because the definition of internal values depends on the architecture of the hardware and the software of the system under test. Contrary to this, the user oriented performance values which are defined in this International Standard are independent of architecture. The definition of internal performance values can be done independently from the definition of user oriented performance values. They may be used and can be measured in addition to the user oriented performance values. Also the definition of terms for the efficiency with which the user oriented values are produced can be done freely.

In addition this International Standard gives guidance on how to establish at a data processing system a stable and reproducible state of operation. This reproducible state may be used to measure other performance values such as the above mentioned internal values.

This International Standard focuses on:
- application software;
- system software;
- turn-key systems (i.e. systems consisting of an application software, the system software and the hardware for which it was designed);
- general data processing systems.

This International Standard specifies the requirements for an emulation (by a technical system, the so-called remote terminal emulator (RTE)) of user interactions with a data processing system. It is the guideline for precisely measuring and rating the user oriented performance values. It provides the guideline for estimating these values with the required accuracy and repeatability for CBSSs with deterministic as well as random behaviour of users. It is also a guidance for implementing an RTE or proving whether it works according to this International Standard. This International Standard provides the guideline to measure and rate the performance of a CBSS with random user behaviour when accuracy and repeatability are required. It specifies in deta
