NOT MEASUREMENT SENSITIVE

MIL-HDBK-1823A
7 April 2009
SUPERSEDING
MIL-HDBK-1823
14 April 2004

DEPARTMENT OF DEFENSE HANDBOOK

NONDESTRUCTIVE EVALUATION SYSTEM RELIABILITY ASSESSMENT

This handbook is for guidance only. Do not cite this document as a requirement.

AMSC N/A    AREA NDTI

DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited.

FOREWORD

1. This handbook is approved for use by all Departments and Agencies of the Department of Defense.

2. This handbook is for guidance only and cannot be cited as a requirement. If it is, the contractor does not have to comply.

3. Comments, suggestions, or questions on this document should be addressed to ASC/ENRS, 2530 Loop Road West, Wright-Patterson AFB 45433-7101, or emailed to Engineering.Standards@wpafb.af.mil. Since contact information can change, you may want to verify the currency of this address information using the ASSIST Online database at http://assist.daps.dla.mil.

CONTENTS

1. SCOPE
1.1 Scope
1.2 Limitations
1.3 Classification
2. APPLICABLE DOCUMENTS
2.1 General
2.2 Non-Government publications
3. Nomenclature
4. GENERAL GUIDANCE
4.1 General
4.2 System definition and control
4.3 Calibration
4.4 Noise
4.5 Demonstration design
4.5.1 Experimental design
4.5.1.1 Test variables
4.5.1.2 Test matrix
4.5.2 Test specimens
4.5.2.1 Physical characteristics of the test specimens
4.5.2.2 Target sizes and number of "flawed" and "unflawed" inspection sites
4.5.2.3 Specimen maintenance
4.5.2.4 Specimen flaw response measurement
4.5.2.5 Multiple specimen sets
4.5.2.6 Use of actual hardware as specimens
4.5.3 Test procedures
4.5.4 False positives (false calls)
4.5.5 Demonstration process control
4.6 Demonstration tests
4.6.1 Inspection reports
4.6.2 Failure during the performance of the demonstration test program
4.6.3 Preliminary tests
4.7 Data analysis
4.7.1 Missing data
4.8 Presentation of results
4.8.1 Category I - NDE system
4.8.2 Category II - Experimental design
4.8.3 Category III - Individual test results
4.8.4 Category IV - Summary results
4.8.5 Summary report
4.8.5.1 Summary report documentation
4.9 Retesting
4.10 Process control plan
5. DETAILED GUIDANCE
5.1 General
6. NOTES
6.1 Intended use
6.2 Trade-offs between ideal and practical demonstrations
6.3 Model-Assisted POD
6.4 A common misconception about statistics and POD - "Repeated inspections improve POD"
6.5 Summary
6.6 Subject term (key word) listing
6.7 Changes from previous issue

Appendix A Eddy Current Test Systems (ET)
A.1 SCOPE
A.1.1 Scope
A.1.2 Limitations
A.1.3 Classification
A.2 APPLICABLE DOCUMENTS
A.3 DETAILED GUIDANCE
A.3.1 Demonstration design
A.3.1.1 Test parameters
A.3.1.2 Fixed process parameters
A.3.1.3 Calibration and standardization
A.3.2 Specimen fabrication and maintenance
A.3.2.1 Surface-connected targets
A.3.2.2 Crack sizing - crack length, or crack depth, or crack area
A.3.2.3 Specimen maintenance
A.3.3 Testing procedures
A.3.3.1 Test definition
A.3.3.2 Test environment
A.3.4 Presentation of results
A.3.4.1 Submission of data

Appendix B Fluorescent Penetrant Inspection Test Systems (PT)
B.1 SCOPE
B.1.1 Scope
B.1.2 Limitations
B.1.3 Classification
B.2 DETAILED GUIDANCE
B.2.1 Demonstration design
B.2.1.1 Variable test parameters
B.2.1.2 Fixed process parameters
B.2.2 Specimen fabrication and maintenance
B.2.3 Testing procedures
B.2.3.1 Test definition
B.2.3.2 Test environment
B.2.4 Presentation of results
B.2.4.1 Submission of data

Appendix C Ultrasonic Test Systems (UT)
C.1 SCOPE
C.1.1 Scope
C.1.2 Limitations
C.1.3 Classification
C.2 DETAILED GUIDANCE
C.2.1 Demonstration design
C.2.1.1 Test parameters
C.2.1.2 Fixed process parameters
C.2.2 Specimen fabrication and maintenance
C.2.2.1 Longitudinal and shear wave UT inspections
C.2.2.2 Defects in diffusion bonded specimens
C.2.2.3 Specimen maintenance
C.2.3 Testing procedures
C.2.3.1 Test definition
C.2.3.2 Test environment
C.2.4 Presentation of results
C.2.4.1 Submission of data

Appendix D Magnetic Particle Testing (MT)
D.1 SCOPE
D.1.1 Scope
D.1.2 Limitations
D.1.3 Classification
D.2 DETAILED GUIDANCE
D.2.1 Demonstration design
D.2.1.1 Variable test parameters
D.2.1.2 Fixed process parameters
D.2.2 Specimen fabrication and maintenance
D.2.3 Testing procedures
D.2.3.1 Test definition
D.2.3.2 Test environment
D.2.4 Presentation of results
D.2.5 Submission of data

Appendix E Test Program Guidelines
E.1 SCOPE
E.1.1 Scope
E.1.2 Limitations
E.1.3 Classification
E.2 APPLICABLE DOCUMENTS
E.3 EXPERIMENTS
E.3.1 DOX
E.3.2 Experimental design
E.3.2.1 Variable types
E.3.2.2 Nuisance variables
E.3.2.3 Objective of Experimental Design
E.3.2.4 Factorial experiments
E.3.2.5 Categorical variables
E.3.2.6 Noise - Probability of False Positive (PFP)
E.3.2.7 How to design an NDE experiment

Appendix F Specimen Design, Fabrication, Documentation, and Maintenance
F.1 SCOPE
F.1.1 Scope
F.1.2 Limitations
F.1.3 Classification
F.2 GUIDANCE
F.2.1 Design
F.2.1.1 Machining tolerances
F.2.1.2 Environmental conditioning
F.2.2 Fabrication
F.2.2.1 Processing of raw material
F.2.2.2 Establish machining parameters
F.2.2.3 Defect insertion
F.2.2.3.1 Internal targets
F.2.2.3.1.1 Simulated voids
F.2.2.3.1.2 Simulated inclusions
F.2.2.4 Target documentation
F.2.2.4.1 Final machining
F.2.2.5 Target verification
F.2.2.5.1 Specimen target response
F.2.2.5.2 Imbedded targets
F.2.3 Specimen maintenance
F.2.3.1 Handling
F.2.3.2 Cleaning
F.2.3.2.1 Specimen integrity
F.2.3.3 Shipping
F.2.3.4 Storage
F.2.3.5 Revalidation
F.2.3.6 Examples of NDE Specimens

Appendix G Statistical Analysis of NDE Data
G.1 SCOPE
G.1.1 Scope
G.1.2 Limitations
G.1.3 Classification
G.1.4 APPLICABLE DOCUMENTS
G.2 PROCEDURES
G.2.1 Background
G.3 â vs a DATA ANALYSIS
G.3.1 Plot the data
G.3.2 Four guidelines
G.3.3 Warning
G.3.4 How to analyze â vs a data
G.3.4.1 Wald method for building confidence bounds about a regression line
G.3.4.2 Understanding noise
G.3.4.3 How to go from â vs a to POD vs a - The Delta Method
G.3.4.4 The POD(a) curve
G.3.5 How to analyze noise
G.3.5.1 Definition of noise
G.3.5.2 Noise measurements
G.3.5.3 Choosing a probability density to describe the noise
G.3.6 Repeated measures, mh1823 POD software and â vs a users manual
G.3.6.1 mh1823 POD software overview
G.3.6.2 USERS MANUAL
G.3.6.2.1 Entering the data
G.3.6.2.2 Plotting the data
G.3.6.2.3 Beginning the analysis
G.3.6.2.4 Analyzing noise
G.3.6.2.4.1 False positive analysis
G.3.6.2.4.2 Noise analysis and the combined â vs a plot
G.3.6.2.5 The POD(a) curve
G.3.6.2.6 Miscellaneous algorithms
G.4 Binary (hit/miss) data
G.4.1 Generalized linear models
G.4.1.1 Link functions
G.4.2 USERS MANUAL (Hit/Miss)
G.4.2.1 Reading in and analyzing hit/miss data - simple example (EXAMPLE 3 hm.xls)
G.4.2.2 Constructing hit/miss confidence bounds
G.4.2.2.1 How the loglikelihood ratio criterion works
G.4.2.3 NTIAC data
G.4.2.4 Lessons learned
G.4.3 Choosing an asymmetric link function: EXAMPLE 4 hm cloglog.xls
G.4.3.1 Analysis
G.4.4 Analyzing repeated measures (multiple inspections of the same target set) - EXAMPLE 5 hm repeated measures.xls
G.4.4.1 Analysis
G.4.5 Analyzing disparate data correctly (EXAMPLE 6 hm DISPARATE disks.xls)
G.4.5.1 Analysis
G.4.6 Analyzing hit/miss noise
G.5 mh1823 POD algorithms

Appendix H Model-Assisted Determination of POD
H.1 SCOPE
H.1.1 Scope
H.1.2 Limitations
H.1.3 Classification
H.2 APPLICABLE DOCUMENTS
H.3 MAPOD
H.3.1 Protocol for model-assisted determination of POD
H.3.2 Protocol for determining influence of empirically assessed factors
H.3.2.1 Protocol for empirical â vs a model-building
H.3.2.2 Protocol for use of "physical" models to determine influence of model-assessed factors
H.3.3 Summary
H.4 Examples of successful applications of MAPOD
H.4.1 Eddy Current detection of fatigue cracks in complex engine geometries
H.4.2 Ultrasonic capability to detect FBHs in engine components made from a variety of nickel-based superalloys
H.4.3 Capability of advanced eddy current technique to detect fatigue cracks in wing lap joints

Appendix I Special Topics
I.1 Departures from underlying assumptions - crack sizing and POD analysis of images
I.1.1 Uncertainty in X
I.1.1.1 "Errors in variables"
I.1.1.2 Summary - uncertainty in X
I.1.2 Uncertainty in Y
I.1.2.1 Pre-processing - POD analysis of images
I.1.2.1.1 How to go from UT image to POD
I.1.2.1.2 Summary - POD analysis of images
I.1.3 References
I.2 False positives, Sensitivity and Specificity
I.2.1 Sensitivity, Specificity, positive predictive value, and negative predictive value
I.2.2 Sensitivity and PPV are not the same
I.2.3 Why Sensitivity and PPV are different
I.2.4 Why bother to inspect?
I.2.5 Result to remember
I.3 The misunderstood receiver operating characteristic (ROC) curve
I.3.1 The ROC curve
I.3.2 Two deficiencies
I.3.2.1 Prevalence matters
I.3.2.2 ROC cannot consider target size
I.3.3 Summary
I.4 Asymptotic POD functions
I.4.1 A three-parameter POD(a) function
I.5 A voluntary grading scheme for POD(a) studies
I.5.1 POD "grades"
I.5.1.1 All POD studies
I.5.1.2 Grade A
I.5.1.3 Grade B
I.5.1.4 Grade C

Appendix J Related Documents

FIGURES

FIGURE F-1. Typical FPI reliability demonstration specimen.
FIGURE F-2. Surface template for locating PT indications.
FIGURE F-3. Typical engine disk circular scallop specimen.
FIGURE F-4. Typical engine disk elongated scallop specimen.
FIGURE F-5. Typical engine disk broach slot specimen.
FIGURE F-6. UT internal target specimen.
FIGURE F-7. All targets on all rows are visible to interrogating sound paths.
FIGURE F-8. "Wedding Cake" UT specimen.
FIGURE F-9. Typical engine disk bolt hole specimen.
FIGURE G-1. A perfect inspection can discriminate the pernicious from the benign.
FIGURE G-2. Resolution in POD at the expense of resolution in size.
FIGURE G-3. Diagnostic â vs a plots show log(X), Cartesian(Y) is the best model.
FIGURE G-4. â vs log(a) showing the relationship of â scatter, noise scatter, and POD.
FIGURE G-5. The Delta Method.
FIGURE G-6. POD(a) curve for example 1 data (figure G-4), log x-axis.
FIGURE G-7. POD(a) curve for example 1 data (figure G-4), Cartesian x-axis.
FIGURE G-8. Scatterplot of signal, â, vs size, a, showing only a random relationship.
FIGURE G-9. Regression model of noise vs a showing an essentially zero slope.
FIGURE G-10. Four possible probability models for noise; Weibull, Exponential, Gaussian, and Lognormal.
FIGURE G-11. The noise is represented by a Gaussian probability model.
FIGURE G-12. Opening screen of mh1823 POD software.
FIGURE G-13. â vs a menu, item 1 - read â vs a data.
FIGURE G-14. â vs a menu, item 2 - build linear model.
FIGURE G-15. â vs a menu, item 3 - POD.
FIGURE G-16. â vs a menu, item 4 - noise analysis.
FIGURE G-17. â vs a menu, miscellaneous algorithms.
FIGURE G-18. The â vs a dialog box.
FIGURE G-19. EXAMPLE 2 â vs a repeated measures.xls data.
FIGURE G-20. â vs a POD setup.
FIGURE G-21. â vs a parameter dialog box.
FIGURE G-22. Diagnostic â vs a plots for repeated measures data.
FIGURE G-23. Example 2 data showing censoring values and âdecision.
FIGURE G-24. â vs a summary plot.
FIGURE G-25. Repeated measures noise.
FIGURE G-26. The Gaussian density represents the noise well.
FIGURE G-27. â vs a summary plot with superimposed noise density and POD vs a inset.
FIGURE G-28. Trade-off plot showing PFP, a90 and a90/95 as functions of âdecision.
FIGURE G-29. POD(a) for the example 2 repeated measures data, log x-axis.
FIGURE G-30. Dialog box to change x-axis plotting range.
FIGURE G-31. POD(a) for the example 2 repeated measures data, Cartesian x-axis.
FIGURE G-32. POD vs size, EXAMPLE 3 hm.xls.
FIGURE G-33. Hit/Miss menu, item 1 - read hit/miss data.
FIGURE G-34. Hit/Miss menu, item 2 - build generalized linear model.
FIGURE G-35. Hit/Miss menu, item 4 - input hit/miss noise.
FIGURE G-36. Hit/Miss menu, item 3 - POD plotting algorithms.
FIGURE G-37. Hit/Miss menu, miscellaneous algorithms.
FIGURE G-38. Hit/Miss setup dialog box.
FIGURE G-39. Hit/Miss GLM parameter box.
FIGURE G-40. Choosing the right link function and whether ...
