ETSI TR 101 582 V1.1.1 (2014-06)

Methods for Testing and Specification (MTS);
Security Testing;
Case Study Experiences

Technical Report

Reference: DTR/MTS-101582 SecTestCase
Keywords: analysis, security, testing

ETSI
650 Route des Lucioles
F-06921 Sophia Antipolis Cedex - FRANCE
Tel.: +33 4 92 94 42 00   Fax: +33 4 93 65 47 16
Siret N° 348 623 562 00017 - NAF 742 C
Association à but non lucratif enregistrée à la Sous-Préfecture de Grasse (06) N° 7803/88

Important notice
The present document can be downloaded from: http://www.etsi.org
The present document may be made
available in electronic versions and/or in print. The content of any electronic and/or print versions of the present document shall not be modified without the prior written authorization of ETSI. In case of any existing or perceived difference in contents between such versions and/or in print, the only prevailing document is the print of the Portable Document Format (PDF) version kept on a specific network drive within ETSI Secretariat. Users of the present document should be aware that the document may be subject to revision or change of status. Information on the current status of this and
other ETSI documents is available at http://portal.etsi.org/tb/status/status.asp
If you find errors in the present document, please send your comment to one of the following services: http://portal.etsi.org/chaircor/ETSI_support.asp

Copyright Notification
No part may be reproduced or utilized in any form or by any means, electronic or mechanical, including photocopying and microfilm, except as authorized by written permission of ETSI. The content of the PDF version shall not be modified without the written authorization of ETSI. The copyright and the foregoing restriction extend to reproduction in all media.
© European Telecommunications Standards Institute 2014. All rights reserved.

DECT™, PLUGTESTS™, UMTS™ and the ETSI logo are Trade Marks of ETSI registered for the benefit of its Members. 3GPP™ and LTE™ are Trade Marks of ETSI registered for the benefit of its Members and of the 3GPP Organizational Partners. GSM® and the GSM logo are Trade Marks registered and owned by the GSM Association.

Contents
Intellectual Property Rights
Foreword
Modal verbs terminology
1 Scope
2 References
2.1 Normative references
2.2 Informative references
3 Definitions and abbreviations
3.1 Definitions
3.2 Abbreviations
4 Overview on case studies
5 Banknote processing case study results
5.1 Case study characterization
5.1.1 Background
5.1.2 System under test
5.1.3 Security risk assessment
5.2 Security testing approaches
5.2.1 Detection of vulnerability to injection attacks
5.2.1.1 Data Fuzzing with TTCN-3
5.2.1.2 TTCN-3
5.2.1.3 Data Fuzzing Library
5.2.2 Usage of unusual behaviour sequences
5.2.2.1 Behavioural fuzzing of UML sequence diagrams
5.2.2.2 Online model-based behavioural fuzzing
5.3 Results
5.3.1 Requirements coverage
5.3.2 Test results
5.4 Summary and conclusion
6 Banking case study results
6.1 Case study characterization
6.2 Security testing approaches
6.3 Results
6.4 Summary and conclusion
7 Radio case study results
7.1 Case study characterization
7.1.1 Context of Mobile ad-hoc networks
7.1.2 Status of the test of security testing at the beginning of the project
7.1.3 Security testing capabilities targeted
7.1.3.1 Frames analysis
7.1.3.2 Data alteration
7.1.3.3 Frames replay
7.1.3.4 Denial of service
7.1.3.5 Tampering, malicious code injection
7.1.3.6 Combination of threats
7.1.4 Description of the use-case
7.1.4.1 Specific application used as Use Case
7.1.4.2 Specific context of the application of security testing tools
7.1.4.3 Specific context of the initial validation framework
7.2 Security testing approaches
7.2.1 General principles of the security testing tools integration
7.2.1.1 Verification framework adaptation
7.2.1.2 Adaptation of the event driven simulation environment
7.2.2 Properties validated
7.2.3 Active testing
7.3 Results
7.4 Summary and conclusion
8 Automotive case study results
8.1 Case study characterization
8.2 Security testing approaches
8.2.1 Security risk assessment
8.2.2 Fuzzing
8.2.3 IOSTS-based passive testing approach
8.2.3.1 Experimentation results
8.2.3.2 Future works
8.2.4 Security monitoring
8.2.5 Framework
8.3 Results
8.4 Summary and conclusion
9 eHealth case study results
9.1 Case study characterization
9.1.1 Patient consent
9.1.2 Device pairing
9.1.3 New application features
9.2 Security testing approaches
9.2.1 Formalization
9.2.1.1 Entity overview
9.2.1.2 Environment and sessions
9.2.1.3 Messages
9.2.1.4 Goals
9.2.2 Analysis results using a model checker
9.2.3 Technical details
9.2.3.1 eHealth web front-end
9.2.3.2 Device management platform
9.2.3.3 Two-factor authentication service
9.2.4 Improvements of the security model
9.2.5 Considered security properties and vulnerabilities
9.2.5.1 Security properties
9.2.5.2 Vulnerabilities
9.3 Results by applying the VERA tool
9.3.1 Password brute force
9.3.2 File enumeration
9.3.3 CSRF token checking
9.3.4 SQL injection
9.3.5 XSS injection
9.3.6 Path traversal attack
9.3.7 Access control
9.4 Summary and conclusion
10 Document management system case study results
10.1 Case study characterization
10.2 Security testing approaches
10.2.1 Security risk assessment of the Infobase application scenario
10.2.1.1 Background
10.2.1.2 Scope and goal of the case study
10.2.1.3 Method walk-through
10.2.1.3.1 Describe general usage scenarios
10.2.1.3.2 List assets
10.2.1.3.3 Define security requirements
10.2.1.3.4 Identify relevant threats
10.2.1.3.5 Define or derive a Business Worst Case Scenario (BWCS)
10.2.1.3.6 Generate Security Overview
10.2.1.3.7 Map BWCS to Technical Threat Scenario (TTS)
10.2.1.3.8 Map TTSs to test types
10.2.1.4 Lessons learned
10.2.2 Improvements of the security model detecting Cross-Site Request Forgery at ASLan+ level
10.2.2.1 Description of CSRF in Infobase
10.2.2.2 Modeling CSRF in ASLan+
10.2.2.2.1 Client
10.2.2.2.2 Server
10.2.2.2.3 Goal
10.2.2.3 Result of the analysis of the Infobase model
10.2.3 Mutation-based test generation
10.2.4 Test automation
10.2.4.1 The ScenTest tool for scenario-based testing
10.2.4.2 General approach to test automation of AATs
10.2.4.3 Derived test case, test execution and test results
10.2.4.3.1 Test scenario 1
10.2.4.3.2 Test scenario 2
10.2.4.3.3 Test scenario 3
10.3 Results by applying the VERA Tool
10.3.1 Considered vulnerabilities
10.3.2 Cross-Site Scripting (XSS)
10.3.3 SQL injection
10.3.4 Password brute-forcing
10.3.5 Cross-Site Request Forgery (CSRF)
10.3.6 File enumeration
10.4 Summary and conclusions
11 Evaluation and assessment of case study results
11.1 Approach: Security Testing Improvements Profiling (STIP)
11.1.1 Security risk assessment
11.1.2 Security test identification
11.1.3 Automated generation of test models
11.1.4 Security test generation
11.1.5 Fuzzing
11.1.6 Security test execution automation
11.1.7 Security passive testing / security monitoring
11.1.8 Static security testing
11.1.9 Security test tool integration
11.2 Evaluation results: STIP evaluation of the Case Studies
11.2.1 Evaluation of the banknote processing machine case study
11.2.2 Evaluation of the banking case study
11.2.3 Evaluation of the radio protocol case study
11.2.4 Evaluation of the automotive case study
11.2.5 Evaluation of the eHealth case study
11.2.6 Evaluation of the document management case study
Annex A: Bibliography
History

Intellectual Property Rights
IPRs essential or potentially essential to the present document may have been declared to ETSI. The information pertaining to these essential IPRs, if any, is publicly available for ETSI members and non-members, and can be found in ETSI SR 000 314: "Intellectual Property Rights (IPRs); Essential, or potentially Essential, IPRs notified to ETSI in respect of ETSI standards", which is available from the ETSI Secretariat. Latest updates are available on the ETSI Web server (http://ipr.etsi.org).
Pursuant to the ETSI IPR Policy, no investigation, including IPR searches, has been carried out by ETSI. No guarantee can be given as to the existence of other IPRs not referenced in ETSI SR 000 314 (or the updates on the ETSI Web server) which are, or may be, or may become, essential to the present document.

Foreword
This Technical Report (TR) has been produced by ETSI Technical Committee Methods for Testing and Specification (MTS).

Modal verbs terminology
In the present document "shall", "shall not", "should", "should not", "may", "may not", "need", "need not", "will", "will not", "can" and "cannot" are to be interpreted as described in clause 3.2 of the ETSI Drafting Rules (Verbal forms for the expression of provisions).
"must" and "must not" are NOT allowed in ETSI deliverables except when used in direct citation.

1 Scope
The present document reports on the application of model-based security testing in different industrial domains. Relevant case studies and their results are described in terms of the system under test and the applied tool chain, together with an overview of the technical requirements. The case studies were conducted as part of the ITEA2 DIAMONDS project (http://www.itea2-diamonds.org/index.html) and the SPaCIoS project (http://www.spacios.eu/). The document concentrates on the results and conclusions from this work, giving an insight into how applicable such methods are today for testing and indicating the current strengths and weaknesses.

2 References
References are either specific (identified by date of publication and/or edition number or version number) or non-specific. For specific references, only the cited version applies. For non-specific references, the latest version of the referenced document (including any amendments) applies.
Referenced documents which are not found to be publicly available in the
 expected location might be found at http://docbox.etsi.org/Reference.
NOTE: While any hyperlinks included in this clause were valid at the time of publication, ETSI cannot guarantee their long term validity.

2.1 Normative references
The following referenced documents are necessary for the application of the present document.
Not applicable.

2.2 Informative references
The following referenced documents are not necessary for the application of the present document but they assist the user with regard to a particular subject area.

i.1 AVANTSSAR Deliverable 2.3 (update): "ASLan+ specification and
tutorial", 2011.
NOTE: Available at http://www.avantssar.eu.
i.2 ITEA2 DIAMONDS Deliverable D5.WP2: "Final Security-Testing Techniques", 2013.
i.3 ITEA2 DIAMONDS Deliverable D5.WP3: "Final Security Testing Tools", 2013.
i.4 ITEA2 DIAMONDS Deliverable D5.WP4: "DIAMONDS Security Testing Methodology", 2013.
i.5 SPaCIoS Deliverable 3.3: "SPaCIoS Methodology and technology for vulnerability-driven security testing", 2013.
i.6 SPaCIoS Deliverable 5.1: "Proof of Concept and Tool Assessment v.1", 2011.
i.7 SPaCIoS Deliverable 5.2: "Proof of Concept and Tool Assessment v.2", 2012.
i.8 SPaCIoS Deliverable 5.4: "Final Tool Assessment", 2013.
i.9 A. Ulrich, E.-H. Alikacem, H. Hallal and S. Boroday: "From scenarios to test implementations via Promela", Testing Software and Systems, pages 236-249, 2010.
i.10 J. Oudinet, A. Calvi and M. Büchler: "Evaluation of ASLan mutation operators", in Proceedings of the 7th International Conference on Tests and Proofs, Springer, June 2013. 20 pages.
i.11 OWASP Cross-Site Request Forgery, 2013.
NOTE: Available at https://www.owasp.org/index.php/Cross-Site_Request_Forgery_(CSRF).
i.12 Erik van Veenendaal: "Test Maturity Model integration".
NOTE: Available at http://www.tmmi.org/pdf/TMMi.Framework.pdf.
i.13 T. Koomen, M. Pool: "Test process improvement - A practical step-by-step guide to structured testing", Addison-Wesley, 1999.
i.14 Rik Marselis

EXAMPLE 1:
	fuzz function zf_RandomSelect(in template integer param1) return integer;

	template myType myData := {
		field1 := zf_UnicodeUtf8ThreeCharMutator(?),
		field2 := '12AB'O,
		field3 := zf_RandomSelect((1, 2, 3))
	}

The fuzz function instance may also be used instead of an inline template.
EXAMPLE 2:
	myPort.send(zf_FiniteRandomNumbersMutator(?));

To get one concrete value instance out of a
 fuzzed template, the valueof() operation can be used. At this time the fuzz function is called and the selected value is stored in the variable myVar.
EXAMPLE 3:
	var myType myVar := valueof(myData);

To allow repeatability of fuzzed test cases, an optional seed for the generation of the random numbers used to determine random selection will be used. There will be one seed per test component. Two predefined functions will be introduced in TTCN-3 to set the seed and to read the current seed value (which will progress each time a fuzz function instance is evaluated).
EXAMPLE 4:
	setseed(in float initialSeed) return float;
	getseed() return float;

Without a previous initialization, a random value will be used as the initial seed. The above declared fuzz functions are implemented as a runtime extension and will be triggered via the TTCN-3 Control Interface (TCI) instead of the TTCN-3 Runtime Interface (TRI), as external functions are,
in order to accelerate the generation by avoiding the encoding of the parameters and return values. More information about the TTCN-3 extension for data fuzzing can be found in the DIAMONDS project deliverable D5.WP3 [i.3].

5.2.1.3 Data Fuzzing Library
In order to retrieve a valuable set of fuzzed values, a fuzzing library was implemented. It provides fuzz testing values from well-established fuzzers. These tools work standalone and thus cannot be integrated into the existing test execution environment. The fuzzing library was therefore developed to allow integration into the test execution environment, either by using the XML interface it provides or by accessing the Java code directly. The integration of the fuzzing library into the test development and execution tool was done by implementing exter
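The seed mechanism of clause 5.2.1.2 can be illustrated outside TTCN-3. The following Python sketch is an illustration only, not part of the TTCN-3 extension or the fuzzing library; the class and method names are hypothetical. It mimics one seed per test component with setseed/getseed semantics: the same initial seed reproduces the same sequence of fuzzed selections, and the seed value progresses each time a fuzz function instance is evaluated.

```python
import random

class FuzzComponent:
    """Mimics a TTCN-3 test component holding one seed for all fuzz functions."""

    def __init__(self):
        self._rng = random.Random()  # unseeded: first values are non-reproducible
        self._seed = None

    def setseed(self, initial_seed: float) -> float:
        """Set the component's seed and return it (cf. setseed in EXAMPLE 4)."""
        self._seed = initial_seed
        self._rng.seed(initial_seed)
        return self._seed

    def getseed(self) -> float:
        """Read the current seed value; it progresses per fuzz-function call."""
        return self._seed

    def zf_random_select(self, values):
        """Pick one value from a list, then advance the component's seed."""
        choice = self._rng.choice(values)
        self._seed = self._rng.random()  # seed "progresses" after each evaluation
        self._rng.seed(self._seed)
        return choice

# Two components initialized with the same seed produce the same run.
comp_a, comp_b = FuzzComponent(), FuzzComponent()
comp_a.setseed(42.0)
comp_b.setseed(42.0)
run_a = [comp_a.zf_random_select([1, 2, 3]) for _ in range(5)]
run_b = [comp_b.zf_random_select([1, 2, 3]) for _ in range(5)]
assert run_a == run_b            # same initial seed -> repeatable test run
assert comp_a.getseed() != 42.0  # seed has progressed after evaluations
```

This sketch only captures the repeatability contract; the real extension additionally bypasses codec invocation by hooking fuzz functions into the TCI rather than the TRI.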