EIA BULLETIN SEB6-A

System Safety Engineering in Software Development

ELECTRONIC INDUSTRIES ASSOCIATION ENGINEERING DEPARTMENT

… provide basic instructions, tools, and supporting data for use in performing such tasks and activities. This Bulletin addresses the system safety involvement, support, and evaluation of software developed for Department of Defense weapon systems in accordance with the process specified by DOD-STD-2167A, "Defense System Software Development." These system safety engineering activities will implement the requirements and intent of MIL-STD-882. Because software is generally unsuited to traditional hardware-oriented design hazard analysis techniques, system safety engineers must first ensure that safety requirements are properly included in the software specification documents. During preliminary and detailed design, system safety uses various tools, such as the system software hazardous effects analysis (SSHEA), to identify and document the effects of software deficiencies and to ensure that adequate measures are taken to eliminate those deficiencies. The SSHEA, as a documentation and tracking tool, allows the system safety analyst to select the appropriate analysis techniques necessary to adequately identify and evaluate potential mishaps.

… AFR 122-10, "Systems Design and Evaluation Criteria for Nuclear Weapon Systems"; MIL-HDBK-255, "Safety, Design and Evaluation Criteria for Nuclear Weapon Systems"; and requirements established by the various national and military service test ranges and bases.

1.1.3 Application
The software safety concerns, considerations, processes, and methods discussed in this Bulletin apply to all programs/projects involved in the development of safety-critical computer software components (SCCSCs), generally referred to as "safety-critical" software in this document, throughout all phases of the software development process described in DOD-STD-2167A (i.e., from System Requirements Analysis/Design through System Integration and Testing). The processes and techniques discussed are … command, control and communications systems; avionics; weapons release systems; space systems; and nuclear power plant systems).

As computers assume a more controlling role in safety-critical systems, they and their software become a source of unique hazards, replacing, and in some instances adding to, the potentially hazardous effects of human operator errors. Computers fail just as other hardware does, and operator response to computer-generated data can be erroneous; but those similarities to non-computer hardware systems are not the problem. The problem is the way computers work. Computers are intended to operate only as instructed by their software. Software failures in the sense of hardware failures do not occur, but errors in the design of the software, and computer failure-induced perversions of the software, are possible.

Software tells a computer system what to do, how to do it, and when to do it, and then monitors the system's response(s). Information on the system state (input) is received from the system (including other computers) and from system operators. Using stored logic, computers then translate the input into commands to system elements, thereby triggering other changes in the system state. These changes present the software with a different set of conditions to which it must respond with new message outputs. Monitoring of those changed messages by the computer starts the cycle again. This constantly evolving interplay of input, evaluation, response, monitoring, and changing input is called "real time" control.
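The cycle just described can be sketched in a few lines of code. The following C fragment is purely illustrative; the toy temperature-control names, thresholds, and plant response are assumptions made for the example, not drawn from this Bulletin.

    #include <stdio.h>

    /* Minimal sketch of the input/evaluate/respond/monitor cycle described
     * above. All names and the toy logic are hypothetical. */

    typedef struct { double temperature; } system_state_t;
    typedef struct { int heater_on; } command_t;

    /* Input: in a real system this would poll sensors, operator consoles,
     * and possibly other computers. */
    static system_state_t read_system_state(double plant_temperature) {
        system_state_t s = { plant_temperature };
        return s;
    }

    /* Stored logic: translate the observed state into a command. */
    static command_t evaluate(system_state_t s) {
        command_t c = { s.temperature < 70.0 };  /* toy threshold */
        return c;
    }

    /* Output: commanding the system changes its state, which becomes the
     * input observed on the next cycle ("real time" control). */
    static double issue_command(command_t c, double plant_temperature) {
        return plant_temperature + (c.heater_on ? 1.5 : -0.5);  /* toy plant */
    }

    int main(void) {
        double temperature = 65.0;
        for (int cycle = 0; cycle < 10; ++cycle) {
            system_state_t s = read_system_state(temperature);
            command_t c = evaluate(s);
            temperature = issue_command(c, temperature);
            printf("cycle %d: temp=%.1f heater=%d\n",
                   cycle, temperature, c.heater_on);
        }
        return 0;
    }

In a deployed real-time system the loop would be driven by a fixed-rate scheduler or interrupt rather than a free-running loop, but the input/evaluate/respond/monitor structure is the same.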
The burgeoning use of computer controls for safety-critical functions in military and space system applications is a good example of the situation in which a system can simultaneously be both safer and yet less assuredly safe because of scientific advances. The increasing complexity of systems has placed demands upon human operators which even the most advanced use of biotechnology cannot satisfy. The speed required to receive, understand, react to, and monitor complex and rapid changes in state exceeds human capability, but is necessary in order to maintain real time control. The development of high-speed computers therefore provides an improved capability of safely maintaining control of highly complex safety-critical systems. However, this same technological advance results in a diminished credibility of mishap risk assessment in such systems, because the development of analytical and testing techniques to preclude computer-caused system mishaps has lagged the scientific advances in the application of computer technology.

The reduced credibility of mishap risk analysis in computer-controlled systems results from the differences between the way systems without computers work, as opposed to those which use computers. In systems without computers, system safety analysts work with known, calculable, or postulated hardware failure rates and modes, failure effects, human errors, safety factors, and hardware interactions within a system and across system and facility interfaces. Although problems in the conduct of system safety analyses in such systems have grown in complexity with the introduction of new technology (e.g., new materials), consensus methods exist which permit high levels of assurance that …

… the one with a double asterisk (**) is from MIL-STD-882B.

* COMPUTER HARDWARE: Devices capable of accepting and storing computer data, executing a sequence of operations on computer data, or producing any output data (including control outputs).
* COMPUTER SOFTWARE (or SOFTWARE): A combination of associated computer instructions and computer data definitions required to enable the computer hardware to perform computational or control functions.

* COMPUTER SOFTWARE COMPONENT (CSC): A distinct part of a computer software configuration item (CSCI). CSCs may be further decomposed into other CSCs and Computer Software Units (CSUs).

* COMPUTER SOFTWARE CONFIGURATION ITEM (CSCI): Software that is designated by the procuring agency for configuration management.

* COMPUTER SOFTWARE UNIT (CSU): An element specified in the design of a Computer Software Component (CSC) that is separately testable.

* CONTRACTING AGENCY: The "contracting office" as defined in Federal Acquisition Regulation Subpart 2.1, or its designated representative. (In this document the term Managing Activity (MA) is used.)

FAULT INJECTION PROCESS: The process of deliberately inserting faults into a system (by manual or automatic methods) to test the ability of the system to safely handle the fault or to fail to a safe state. Usually, fault injection criteria are defined by system safety and are implemented by the software test engineering group to measure the system's ability to mitigate potential mishaps to an acceptable level of risk.
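As a concrete illustration of the definition above, the sketch below injects three faults into an invented valve-control routine and checks that each produces a fail-safe response. All names, limits, and fault cases are assumptions made for the example, not taken from this Bulletin.

    #include <stdio.h>

    /* Hypothetical safety-critical routine under test: it must drive the
     * valve to its safe (closed) position whenever the pressure reading
     * is flagged invalid or falls outside believable limits. */

    enum valve { VALVE_CLOSED = 0, VALVE_OPEN = 1 };

    static int command_valve(double pressure_psi, int sensor_ok) {
        if (!sensor_ok || pressure_psi < 0.0 || pressure_psi > 150.0)
            return VALVE_CLOSED;                  /* fail to a safe state */
        return pressure_psi > 100.0 ? VALVE_CLOSED : VALVE_OPEN;
    }

    int main(void) {
        /* Deliberately inserted faults. */
        struct { double psi; int sensor_ok; } fault[] = {
            {   50.0, 0 },   /* sensor failure flagged      */
            {  -12.0, 1 },   /* impossible (failed) reading */
            { 9999.0, 1 },   /* out-of-limit reading        */
        };
        int pass = 1;
        for (unsigned i = 0; i < sizeof fault / sizeof fault[0]; ++i) {
            if (command_valve(fault[i].psi, fault[i].sensor_ok) != VALVE_CLOSED) {
                printf("fault case %u: unsafe response\n", i);
                pass = 0;
            }
        }
        printf(pass ? "all injected faults handled safely\n"
                    : "fault handling deficiency found\n");
        return pass ? 0 : 1;
    }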
FIRMWARE: Software that resides in a nonvolatile medium that is read-only in nature, and is completely write-protected when functioning in its operational environment.

HAZARD: An inherent characteristic of a thing or situation which has the potential of causing a mishap. Whenever a hazard is present, the possibility of a mishap occurring, regardless of degree, exists.

HAZARDOUS OPERATION/CONDITION: An operation (activity) or condition (state) which introduces a hazard to an existing situation without adequate control of that hazard, or removes or reduces the effectiveness of existing controls over existing hazards, thereby increasing mishap probability or potential mishap severity, or both.

MISHAP: An unplanned event or series of events resulting in death; occupational illness; damage to or destruction of equipment, facilities, or property; or pollution of the environment.

MISHAP RISK ASSESSMENT: The process of determining the potential of a mishap in terms of severity and probability of occurrence, and the results of that determination.
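MIL-STD-882B expresses mishap risk as the combination of a hazard severity category (I Catastrophic through IV Negligible) and a hazard probability level (A Frequent through E Improbable). As a sketch of how such an assessment might be mechanized, the lookup below is illustrative only; the index values are invented for the example and are not quoted from the standard.

    #include <stdio.h>

    /* Severity categories and probability levels per MIL-STD-882B;
     * the numeric hazard risk index values below are illustrative. */

    enum severity   { CATASTROPHIC, CRITICAL, MARGINAL, NEGLIGIBLE };
    enum likelihood { FREQUENT, PROBABLE, OCCASIONAL, REMOTE, IMPROBABLE };

    static const int risk_index[4][5] = {
        /*           A   B   C   D   E  */
        /* I   */ {  1,  1,  2,  3,  8 },
        /* II  */ {  1,  2,  4,  8, 12 },
        /* III */ {  4,  6,  8, 10, 15 },
        /* IV  */ { 10, 12, 14, 17, 20 },
    };

    /* Lower index = higher risk, demanding management attention. */
    static int assess_mishap_risk(enum severity s, enum likelihood p) {
        return risk_index[s][p];
    }

    int main(void) {
        printf("hazard risk index (critical, occasional): %d\n",
               assess_mishap_risk(CRITICAL, OCCASIONAL));
        return 0;
    }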
NON-SAFETY-CRITICAL COMPUTER SOFTWARE COMPONENT: A computer software component (unit) which does not control safety-critical hardware systems, subsystems, or components, and does not provide safety-critical information.

PROBABILITY: The numerical likelihood that a mishap will occur given a defined set of circumstances. This term does not reflect the reliability of the software, which is the likelihood of a software component error rendering the software dysfunctional. Safety analyses assume a uniform software reliability. (See MIL-HDBK-217.)

PROCESS SECURITY: The assurance that the computer software programs and information cannot be modified or activated by unauthorized personnel, or inadvertently modified or activated by other software processes.
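By way of illustration only (the Bulletin does not prescribe a mechanism here), one common way to guard against inadvertent modification is to verify a checksum over a safety-critical program image before permitting activation. The routine names and the toy hash below are hypothetical; a real system would use a proper CRC or cryptographic digest.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Toy integrity hash; hypothetical stand-in for a real CRC/EDC. */
    static uint32_t checksum(const uint8_t *image, size_t len) {
        uint32_t sum = 0;
        for (size_t i = 0; i < len; ++i)
            sum = sum * 31u + image[i];
        return sum;
    }

    /* Refuse to activate a program image whose checksum has changed. */
    static int activate_if_unmodified(const uint8_t *image, size_t len,
                                      uint32_t expected) {
        if (checksum(image, len) != expected) {
            printf("image modified: activation refused\n");
            return 0;
        }
        printf("image verified: activation permitted\n");
        return 1;
    }

    int main(void) {
        uint8_t image[] = { 0x12, 0x34, 0x56 };
        uint32_t good = checksum(image, sizeof image);
        activate_if_unmodified(image, sizeof image, good);  /* permitted */
        image[1] ^= 0xFF;                       /* inadvertent modification */
        activate_if_unmodified(image, sizeof image, good);  /* refused */
        return 0;
    }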
PRODUCT BASELINE: Configuration item(s) whose design is frozen at established program milestones (SDR, PDR, CDR), and which are ultimately subjected to formal testing and configuration audits prior to delivery.

SAFETY-CRITICAL COMPUTER SOFTWARE COMPONENT (SCCSC): A computer software component (unit) whose inadvertent response to stimuli, failure to respond when required, or response out of sequence or in unplanned combination with others can result in a critical or catastrophic mishap, as defined in MIL-STD-882B.

SAFETY INTEGRITY: The ability of a control system to work safely (this includes shutting down safely if a fault occurs), which depends on the entire system, not just the computer.
… and to assess residual software-related mishap risk in the system.

SUPPORT SOFTWARE: All software used to aid the development, testing, and support of applications, systems, test, and maintenance software. Support software includes, but is not limited to:

O Compilers, assemblers, linkage editors, libraries, and loaders required to generate machine code and combine hierarchical components into executable computer programs.
O Debugging software.
O Stimulation and simulation software.
O Data extraction and data reduction software.
O Software used for management control, software configuration management, or documentation generation and control during development.
O Test software used in software development.
O Design aids, such as program design language tools and problem statement analysis tools.
O Test and maintenance software to assist in fault diagnosis and isolation, operational readiness verification, and system alignment checkout of the system or its components. It may be used to check out and certify equipment and the total system at installation, reinstallation, or after maintenance. It is also used in accordance with prescribed procedures to maintain the system throughout its operational life.

SYSTEM SAFETY: The application of engineering and management principles, criteria, and techniques to optimize safety within the constraints of operational effectiveness, time, and cost throughout all phases of the system life cycle.
… i.e., it can be used for end-item, subsystem, and system-level analysis. The closed-loop process used identifies software-related safety problems and the hazardous system conditions they create, the design requirements levied to eliminate or control those conditions, and the validation and verification of the design solution. The resultant documentation provides a clear trail from problem identification to verification of the problem solution, thereby facilitating system safety review by both contractor and customer management. The SSHEA technique is also usable on software development projects performed under standards/directions other than DOD-STD-2167A, but the functional steps described in this instruction must be appropriately tailored for such use.
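The closed-loop trail just described can be pictured as a single tracking record per identified problem. The C structure below is a hypothetical illustration of such a record; the field names are invented here, not taken from this Bulletin.

    /* Hypothetical SSHEA tracking record: one entry carries a software
     * problem from identification, through the hazardous condition it
     * creates and the design requirement levied to control it, to
     * verification of the solution. */

    enum sshea_status { OPEN, REQUIREMENT_LEVIED, VERIFIED, CLOSED };

    struct sshea_entry {
        int  id;                          /* tracking number                 */
        char problem[128];                /* software deficiency identified  */
        char hazardous_condition[128];    /* system condition it creates     */
        char design_requirement[128];     /* requirement levied to control   */
        char verification_method[64];     /* analysis, test, demonstration   */
        enum sshea_status status;         /* closed-loop state               */
    };

An entry is not closed until the verification field is satisfied, which is what makes the loop "closed" and gives contractor and customer management an auditable trail.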
… however, system safety shall assure that the MA system safety official receives, and provides concurrence with, system safety requirement implementation validation planning.

3.2.3 Validation and Verification of System Safety Requirements Implementation

The final objective of the software system safety effort is to examine the safety-critical routines to validate the actual design implementation of the safety requirements, and to verify that the design solutions are effective and properly documented. This effort starts following SSR and continues through software CDR. All design changes and modifications made following CDR must also be evaluated to determine their effect on system safety.
The analysis will contain recommended actions to eliminate the identified hazards or to minimize their associated mishap risks to an acceptable level. Specifically, this portion of the effort examines:

O The effects of hardware failures on software routines which could contribute to the frequency of occurrence or severity of identified system-level mishaps. This step, which is iterative until software CDR, includes:
  O Development of a system hardware/software-oriented flow diagram of each safety-critical hardware output/input, reflecting the command flow and the functions performed by the hardware, with identification of hardware inputs to the computer.
  O Determination of the ability of the software to detect hardware failures. (An FMEA might help.)
  O Identification of what action the software takes when a hardware failure or out-of-limit condition is detected (a minimal sketch of this pattern follows this list).
  O Identification of the elements within the safety-critical software routine which contribute to the identified mishap.
  O Identification of additional software safety requirements to eliminate or mitigate the hardware failure effects.
  O Identification of fail-safe features to determine the system redundancies required (hardware, software, or both) to mitigate the identified potential mishap(s).
O Software routines for design errors which could cause or contribute to an undesired event that affects safety.
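The sketch below illustrates the detect-and-respond pattern referenced in the list above: the software checks a safety-critical hardware input for failure and out-of-limit indications, and commands a safe state when either is detected. The names, structure, and limits are assumptions made for the example.

    #include <stdio.h>

    enum action { CONTINUE_OPERATION, SAFE_SHUTDOWN };

    struct hw_input {
        double reading;   /* e.g., actuator position feedback           */
        int    stale;     /* set when the input has not updated in time */
    };

    /* Command a safe state on a detected hardware failure (stale input)
     * or an out-of-limit reading; otherwise continue operation. */
    static enum action check_safety_critical_input(struct hw_input in,
                                                   double lo, double hi) {
        if (in.stale)
            return SAFE_SHUTDOWN;            /* hardware failure detected */
        if (in.reading < lo || in.reading > hi)
            return SAFE_SHUTDOWN;            /* out-of-limit condition    */
        return CONTINUE_OPERATION;
    }

    int main(void) {
        struct hw_input ok   = {  5.0, 0 };
        struct hw_input bad  = { 42.0, 0 };
        struct hw_input dead = {  5.0, 1 };
        printf("%d %d %d\n",
               check_safety_critical_input(ok,   0.0, 10.0),  /* 0: continue */
               check_safety_critical_input(bad,  0.0, 10.0),  /* 1: shutdown */
               check_safety_critical_input(dead, 0.0, 10.0)); /* 1: shutdown */
        return 0;
    }

Note that the safe response is the default on any detected anomaly; identifying which state is safe for each output is part of the fail-safe feature analysis named in the list.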
… or whether the software response will result in an action that leads to a hazardous system condition.

O Software is a set of computer instructions, not a physical entity. However, it can be taken apart and analyzed, routine by routine.
O Software does not wear out as a result of use, degrade over time, or "fail" in the sense that a hardware assembly fails.
O Like hardware, software faults in a computer usually occur because requirements were incorrectly or insufficiently specified; the system functions, interfaces, or requirements were not fully or properly understood by the designer; coding errors …