SAE SEB 6-A-1990 System Safety Engineering in Software Development (Formerly TechAmerica SEB6-A)

Uploaded by: dealItalian200 | Document No.: 1028344 | Upload date: 2019-03-24 | Format: PDF | Pages: 140 | Size: 6.02 MB

EIA BULLETIN

System Safety Engineering in Software Development

ELECTRONIC INDUSTRIES ASSOCIATION
ENGINEERING DEPARTMENT

NOTICE

EIA Engineering Standards and Publications are designed to serve the public interest through eliminating misunderstandings between manufacturers and purchasers, facilitating interchangeability and improvement of products, and assisting the purchaser in selecting and obtaining with minimum delay the proper product for his particular need. Existence of such Standards and Publications shall not in any respect preclude any member or non-member of EIA from manufacturing or selling products not conforming to such Standards and Publications, nor shall the existence of such Standards and Publications preclude their voluntary use by those other than EIA members, whether the standard is to be used either domestically or internationally.

Recommended Standards, Publications and Bulletins are adopted by EIA without regard to whether or not their adoption may involve patents on articles, materials, or processes. By such action, EIA does not assume any liability to any patent owner, nor does it assume any obligation whatsoever to parties adopting the Recommended Standard, Publication or Bulletin.

Technical Bulletins are distinguished from EIA Recommended Standards or Interim Standards in that they contain a compilation of engineering data or information useful to the technical community, and represent approaches to good engineering practices that are suggested by the formulating committee. This Bulletin is not intended to preclude or discourage other approaches that similarly represent good engineering practice, or that may be acceptable to, or have been accepted by, appropriate bodies. Parties who wish to bring other approaches to the attention of the formulating committee to be considered for inclusion in future revisions of this Publication are encouraged to do so. It is the intention of the formulating committee to revise and update this Publication from time to time as may be occasioned by changes in technology, industry practice, or government regulations, or for other appropriate reasons.

COPYRIGHT 1990
Published by ELECTRONIC INDUSTRIES ASSOCIATION, Engineering Department
2001 Pennsylvania Ave., N.W., Washington, D.C. 20006
(Temporary Headquarters: 1722 Eye St., N.W., Washington, D.C. 20006)
PRICE: $45.00
Printed in U.S.A. All rights reserved

EIA SAFETY ENGINEERING BULLETIN NO. 6-A
SYSTEM SAFETY ENGINEERING IN SOFTWARE DEVELOPMENT
Prepared by G-48 System Safety Engineering Committee

This bulletin supersedes SEB No. 6, "A Method of Software Safety Analysis," dated June 1984.

FOREWORD

This Safety Engineering Bulletin was prepared by the Software Safety Subcommittee of the EIA System Safety (G-48) Committee, one of the Committees of the Government (G) Panel of the Engineering Department. The G-48 Committee has as its area of interest the procedures, methodology and development of criteria for the application of system safety engineering to systems, subsystems and equipments. These Bulletins are guidelines which summarize analyses, reviews, assessments, reports, etc., and provide basic instructions, tools, and supporting data for use in performing such tasks and activities.

This Bulletin addresses the system safety involvement, support and evaluation of software developed for Department of Defense weapon systems in accordance with the process specified by DOD-STD-2167A, "Defense System Software Development." These system safety engineering activities will implement the requirements and intent of MIL-STD-882. Because software is generally unsuited to traditional hardware-oriented design hazard analysis techniques, system safety engineers must first ensure that safety requirements are properly included in the software specification documents. During preliminary and detailed design, system safety uses various tools, such as the system software hazardous effects analysis (SSHEA), to identify and document the results of software deficiencies and to ensure that adequate measures are taken to eliminate the deficiencies. The SSHEA, as a documentation and tracking tool, allows the system safety analyst to select the appropriate analysis techniques necessary to adequately identify and evaluate potential mishaps. ... background information on system safety tasks and activities (e.g., plans, ...

TABLE OF CONTENTS
                                                                  SHEET
         ACRONYMS .................................................. 6
1        INTRODUCTION .............................................. 8
1.1      Bulletin Purpose,
         Scope and Application ..................................... 8
1.1.1    Purpose ................................................... 8
1.1.2    Scope ..................................................... 8
1.1.3    Application ............................................... 8
1.2      Background ................................................ 9
1.3      Causes of Safety Concerns ................................ 11
1.4      Bulletin Content Outline ................................. 11
2        DEFINITIONS .............................................. 13
3        SYSTEM SAFETY SOFTWARE ANALYSIS TASKS .................... 17
3.1      Purpose and Timing of Safety Analysis of
         System Software .......................................... 17
3.2      General System Safety Approach ........................... 20
3.2.1    Establishment of Safety Requirements ..................... 20
3.2.2    Validation and Verification Planning ..................... 21
3.2.3    Validation and Verification of System Safety
         Requirements Implementation .............................. 22
3.3      General Considerations for Safety Analysis of
         System Software .......................................... 23
3.4      Request for Proposal (RFP) Response Phase ................ 24
3.5      System Safety Software Activities by
         DOD-STD-2167A Phases ..................................... 25
3.5.1    System Requirements Analysis/Design Phase ................ 29
3.5.2    Software Requirements Analysis Phase ..................... 30
3.5.3    Preliminary Design Phase ................................. 31
3.5.4    Detailed Design Phase .................................... 32
3.5.4.1  Quantitative Evaluation .................................. 33
3.5.4.2  Analysis Extension ....................................... 33
3.5.5    Coding and CSU Testing/CSC Integration and Testing/
         CSCI Testing/System Integration and Testing Phases ....... 34
4        SYSTEM SOFTWARE HAZARDOUS EFFECTS ANALYSIS (SSHEA) ....... 36
4.1      SSHEA Technique .......................................... 36
4.2      SSHEA Format ............................................. 37
4.3      Examples ................................................. 40
4.3.1    Example A ................................................ 40
4.3.2    Example B ................................................ 45
5        SOFTWARE SAFETY ANALYSIS PROCESS FLOWS AND DESCRIPTION ... 52

APPENDICES
                                                                  SHEET
Appendix A  Software System Safety Checklist .................... A-1
Appendix B  Request for Proposal Response Phase ................. B-1
Appendix C  System Requirements Analysis Phase .................. C-1
Appendix D  Software Requirements Analysis Phase ................ D-1
Appendix E  Preliminary Design Phase ............................ E-1
Appendix F  Detailed Design Phase ............................... F-1
Appendix G  CSC/System Integration and Testing Phase ............ G-1

LIST OF FIGURES
                                                                  SHEET
1   Process Flow for Software Consideration in System
    Safety Activities ............................................. 19
2   An Example of System Development Reviews and Audits ........... 26
3   System Safety Relationship to Software Design and
    Development ................................................... 27
4   Deliverable Products, Reviews, Audits and Baselines ........... 28
5   Software-Related Safety Problems .............................. 34
6   System Software Hazardous Effects Analysis (SSHEA) Format ..... 39
7   IPL Fault Tree ................................................ 41
8   Operational Ground Program SSHEA .............................. 43
9   Operational Ground Program SSHEA, Revised ..................... 44
10  R/S Shroud Release Fault Tree ................................. 47
11  Missile Test Routine, Original ................................ 48
12  SSHEA for Operational Failure #1 .............................. 49
13  Safety-Critical Path .......................................... 50
14  Missile Test Routine, Modified ................................ 51
15  RFP Response Flow ........................................... B-2/6
16  Contract Award to SDR Flow .................................. C-2/8
17  SDR to SSR Flow ............................................. D-2/8
18  SSR to PDR Flow ............................................ E-2/12
19  PDR to CDR Flow ............................................ F-2/18
20  CDR to FCA/PCA Flow ........................................ G-2/10

ACRONYMS

A/D, AMI, BDC, CDR, CDRT, CNG, C/R, CSC, CSCI, CSD, CSD/G, CSD/M, CSU, FCA, FHA, FMEA, FSD, FTA, GIDEP, GODT, HCR, H/W, HWCI, ICBM, IPL, IRS, IV ...

... AFR 122-10, "Systems Design and Evaluation Criteria for Nuclear Weapon Systems"; MIL-HDBK-255, "Safety, Design and Evaluation Criteria for Nuclear Weapon Systems," and requirements established by the various national and military services test ranges and bases.

1.1.3 Application

The software safety concerns, considerations, processes and methods discussed in this Bulletin apply to all programs/projects which are involved in the development
of safety-critical computer software components (SCCSCs) (generally referred to as "safety-critical" software in this document) throughout all phases of the software development process described in DOD-STD-2167A (i.e., from System Requirements Analysis/Design through System Integration and Testing). The processes and techniques discussed are guidelines only and must not be considered to be mandatory. Each software analysis program should be structured to be sensitive to, and able to react to, changes or improvements in analysis technology and methodology, and to program milestone scheduling.

1.2 Background

Many systems controlled and monitored by computer are safety-critical because of their potential for causing critical or catastrophic mishaps (e.g., nuclear weapon systems; command, control and communications systems; avionics; weapons release systems; space systems; and nuclear power plant systems). As computers assume a more controlling role in safety-critical systems, they and their software become a source of unique hazards, replacing, and in some instances adding to, the potentially hazardous effects of human operator errors.

Computers fail, just as other hardware, and operator response to computer-generated data can be erroneous; but those similarities to non-computer hardware systems are not the problem. The problem is the way computers work. Computers are intended to operate only as instructed by their software. Software failures in the sense of hardware failures do not occur, but errors in the design of the software and computer failure-induced perversions of software are possible.

Software tells a computer system what to do, how to do it, and when to do it, and then monitors the system's response(s). Information on the system state (input) is received from the system (including other computers) and from system operators. Using stored logic, computers then translate the input into commands to system elements, thereby triggering other changes in the system state. These changes present the software with a different set of conditions to which it must respond with new message outputs. Monitoring of those changed messages by the computer starts the cycle again. This constantly evolving interplay of input, evaluation, response, monitoring, and changing input is called "real time" control.

The burgeoning use of computer controls for safety-critical functions in military and space system applications is a good example of the situation in which a system can simultaneously be both safer and yet less assuredly safe because of scientific advances. The increasing complexity of systems has placed demands upon human operators which even the most advanced use of biotechnology cannot satisfy. The speed required to receive, understand, react to, and monitor complex and rapid changes in state exceeds human capability, but is necessary in order to maintain real-time control. The development of high-speed computers therefore provides an improved capability of safely maintaining control of highly complex safety-critical systems. However, this same technological advance results in a diminished credibility of mishap risk assessment in such systems, because the development of analytical and testing techniques to preclude computer-caused system mishaps has lagged the scientific advances in application of computer technology.

The reduced credibility of mishap risk analysis in computer-controlled systems results from the differences between the way systems without computers work, as opposed to those which use computers. In systems without computers, system safety analysts work with known, calculable, or postulated hardware failure rates and modes, failure effects, human errors, safety factors, and hardware interactions within a system and across system and facility interfaces. Although problems in conduct of system safety analyses in such systems have grown in complexity with the introduction of new technology (e.g., new materials), consensus methods exist which permit high levels of assurance that such systems can be safely fielded. This is because, failing all other methods to provide such assurance, hardware systems can be tested (to destruction if necessary) without penalty except for cost. Achieving the same level of assurance in computer-controlled systems is not yet, if it ever will be, feasible.

Systems with no potential for catastrophic or critical consequence are not safety-critical, and therefore software associated with such systems is also not safety-critical. For those systems determined to be inherently safety-critical, a determination must also be made that software used in the system either is or is not safety-critical. This document describes certain methods by which the software elements of systems can be assessed for mishap risk and thus to identify the software elements which are safety-critical. However, it must be remembered that using the prescribed methods will at best provide assurance that the computer software will instruct the computer to do what it should do to avoid system mishaps under each planned or foreseeable condition identified by the system safety analyst. No assurance is provided regarding those conditions which are not identified! Implementation of the system safety requirements so generated is verifiable both by analysis and by demonstration or test. Testing the software as part of system testing may also provide system safety analysts information on previously unanticipated system reactions which represent new hazards requiring control.

Without a feasible capability to test for all possible combinations of events and conditions that could trigger a hazardous system response, it is not possible to ascertain with assurance that a system will not do something unforeseen, or perform an unwanted function when triggered by a unique conjunction of events/conditions. With that shortcoming in mind, it is strongly recommended that safety-critical system functions with high-risk-level catastrophic or critical mishap potential be controlled by using redundant measures implemented by hardware in the system whenever it is practical to do so. However, the determination of whether to implement controls in hardware or software must ultimately be decided on a case-by-case judgment of practicality in close coordination with systems and design engineering. Even then, exhaustive evaluation of computer software safety must be assured, using methods s...
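The input-evaluation-response-monitoring cycle that the Background section calls "real time" control can be sketched as a minimal loop. This sketch is not from the bulletin; every name in it (control_cycle, read_state, evaluate, issue_commands, safe_shutdown) is hypothetical and purely illustrative of the cycle and of the bulletin's point that unanalyzed conditions need a fallback:

```python
# Illustrative sketch (not from the bulletin) of the "real time" control
# cycle: receive system state (input), evaluate it against stored logic,
# command system elements (changing the state), then monitor the changed
# state on the next pass. All names here are hypothetical.

def control_cycle(read_state, evaluate, issue_commands, safe_shutdown,
                  max_cycles=1000):
    """Run a bounded real-time control loop.

    read_state()      -> current system state (input from system/operators)
    evaluate(state)   -> list of commands, or None when the state falls
                         outside the analyzed envelope (the unidentified
                         conditions the bulletin warns about)
    issue_commands(c) -> sends commands to system elements, changing state
    safe_shutdown()   -> fallback control, per the bulletin's advice to back
                         up safety-critical software with redundant measures
    """
    for _ in range(max_cycles):
        state = read_state()           # input: information on system state
        commands = evaluate(state)     # stored logic -> response messages
        if commands is None:           # condition not analyzed in advance:
            safe_shutdown()            # revert to a known-safe state
            return "shutdown"
        issue_commands(commands)       # outputs change the system state,
                                       # which the next cycle monitors
    return "completed"

# Toy usage: a "temperature" the loop drives down, shutting down if it
# ever sees a state outside its analyzed envelope.
if __name__ == "__main__":
    system = {"temp": 120}
    def read_state():
        return dict(system)
    def evaluate(state):
        if state["temp"] > 500:        # unforeseen condition
            return None
        return [("cool", 10)] if state["temp"] > 20 else []
    def issue_commands(cmds):
        for name, amount in cmds:
            if name == "cool":
                system["temp"] -= amount
    print(control_cycle(read_state, evaluate, issue_commands,
                        safe_shutdown=lambda: None))  # -> completed
```

The `None` return models exactly the assurance gap the text describes: the loop can only react safely to conditions its designers identified, so the fallback path stands in for the recommended hardware-implemented redundant control.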
