Analysis is necessary but far from sufficient
Jon Pincus, Reliability Group (PPRC), Microsoft Research

Why are so few successful real-world development and testing tools influenced by program analysis research?

Outline
- Provocation
- Successful tools
- Analysis in context
- Implications for analysis
- Conclusion

Success: a simple view
- A tool is successful if people use it
- Not if people think it's interesting but don't try it
- Not if people try it but don't use it
- Not if people buy it but don't use it ("Shelfware")

Some examples of success
- Purify
- BoundsChecker
- PREfix (2.X and later)
  - Especially interesting because 1.0 was unsuccessful

Why do people use a tool?
- If it helps them get their work done more efficiently than they would otherwise, without making them look (or feel) bad.
- Aside: look at organizational and personal goals. See Alan Cooper's books, e.g. About Face.

Value vs. Cost
- Value: the quantified benefit from the tool
- Cost: primarily time investment; licensing cost is typically much smaller
- (Value - Cost) must be positive, positive fairly quickly, and more positive than any alternatives
- Value and cost are difficult to estimate, and others' estimates are often questionable

An example
- Purify 1.0: virtually zero initial cost on most code bases
  - "trial" license, easy to integrate
  - immediate value
- Companies then invested to increase the value, e.g. by changing memory allocators to better match Purify's (sketched below), and by buying lots of licenses
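The allocator change is worth making concrete. Below is a minimal sketch, not Purify's actual interface or any company's real code: a hypothetical pool allocator (pool_alloc/pool_free are invented names) whose freelist reuse hides leaks and use-after-free bugs from an allocation-level checker, plus the kind of build-time switch teams added so instrumented runs go straight through malloc/free and every block becomes visible to the tool.

    /* Hypothetical pooled allocator with a build-time escape hatch for
     * heap-checker runs.  Freelist reuse hides leaks and use-after-free
     * from an allocation-level checker, so the instrumented build routes
     * every request straight to malloc/free instead. */
    #include <stdlib.h>
    #include <string.h>

    #ifdef HEAP_CHECKER                     /* defined for instrumented builds */
    void *pool_alloc(size_t n) { return malloc(n); }  /* checker sees each block */
    void  pool_free(void *p)   { free(p); }
    #else
    /* Fast path: recycle 64-byte blocks through a freelist. */
    static void *freelist;
    void *pool_alloc(size_t n) {
        if (n <= 64 && freelist) {                    /* pop a recycled block */
            void *p = freelist;
            memcpy(&freelist, p, sizeof freelist);
            return p;
        }
        return malloc(n < 64 ? 64 : n);
    }
    void pool_free(void *p) {                         /* push; never calls free() */
        memcpy(p, &freelist, sizeof freelist);
        freelist = p;
    }
    #endif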

Characteristics of successful tools
- Successful tools almost always address significant problems, on real code bases, give something for (almost) nothing, and are easy to use.

Significant problems
- Nobody fixes all the bugs. What are the key ones?
  - Often based on the most recent scars
  - Often based on development or business goals
- Examples:
  - Purify: memory leaks
  - BoundsChecker: bounds violations
  - Lint (back in K&R days): portability issues

Real code bases
- Large code bases in nasty languages (e.g., C/C++): 1M+ LOC is medium-sized; 10M+ LOC is large
- Or smaller code bases in different nasty languages (Perl, JScript, VBScript, HTML/DHTML, Tcl/Tk, SQL): 5,000+ LOC is medium; 50K+ is large

More reality...
- Most code bases involve multiple languages
- Extensions and incompatibilities, e.g.
  - GCC/G++, MS C++, Sun C++
  - ECMAScript/JScript/JavaScript
  - HTML versions
- People use all those nasty language features: casts between pointers and ints, unions, bit fields, gotos, ... (see the fragment below)
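For concreteness, here is a contrived but legal fragment, invented for this writeup rather than drawn from any real code base, that packs the listed features into a few lines. A tool that rejects or silently skips any of them loses either coverage or credibility on real code.

    /* Contrived C illustrating constructs real tools must model: unions,
     * bit fields, pointer<->integer casts, and goto-based control flow. */
    #include <stdint.h>
    #include <stdio.h>

    union reg {                     /* type punning through a union */
        uint32_t word;
        struct {                    /* bit fields, layout implementation-defined */
            uint32_t ready : 1;
            uint32_t error : 1;
            uint32_t count : 30;
        } bits;
    };

    int drain(uint32_t *base, int n) {
        uintptr_t addr = (uintptr_t)base;      /* pointer -> integer cast */
        union reg r;
        int drained = 0;
    retry:                                     /* goto-based loop */
        r.word = *(volatile uint32_t *)(addr + 4u * (uint32_t)drained);
        if (r.bits.error)
            goto done;
        if (r.bits.ready && ++drained < n)
            goto retry;
    done:
        printf("drained %d of %d\n", drained, n);
        return drained;
    }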

Something for (almost) nothing
- Engineering time is precious
- Engineers are skeptical, so they are unwilling to commit their valuable time
- Don't even think about requiring significant up-front investment: code modifications, process changes

Examples: something for (almost) nothing
- Purify for UNIX: just relink!
- BoundsChecker: you don't even need to relink!
- PREfix 2.X: point your web browser at a URL!
- A non-technology solution: "we'll do it for you"
  - Commercial variant: an initial benchmark for $X, preferably with money back if it isn't useful
  - In many cases, money is cheaper than engineering time...

"Revolutionary tools"
- People may be willing to do up-front work to enable something previously impossible, or to get order-of-magnitude improvements
- BUT: it must still be a significant problem on a real code base
- Need compelling evidence of a chance of success
- Any examples?

Outline
- What makes a tool successful?
- Successful tools
- Analysis in context
- Implications for analysis
- Conclusion

PREfix
- Analyzes C/C++ source code
- Identifies defects
- GUI to aid understanding and prioritization: viewing individual defects; sorting/filtering sets of defects
- Integrates smoothly into existing builds
- Stores results in a database

PREfix 2.X architecture (diagram): source code feeds a C/C++ parser; models are stored in a model database, defects in a defect database, and results are viewed through a web browser.

Counterintuitively...
- Actual analysis is only a small part of any "program analysis tool".
- In PREfix, 10% of the "code mass".

3 key non-analysis issues
- Parsing
- Integration: build process, defect tracking system, SCM system
- User interaction: information presentation, navigation, control

Parsing
- You can't parse better than anybody else, but you can parse worse
- Complexities: incompatibilities and extensions, full language complexity, language evolution
- Solution: don't. Alternatives: GCC, EDG, ...

Integration
- A tool is useless if people can't use it; implied: "use it in their existing environment"
- "Environment" includes configuration management (SCM), a build process (makefiles, scripts, ...), policies, and a defect tracking system
- People have invested hugely in their environment; they probably won't change it just for one tool

User interaction
- Engineers must be able to:
  - use the analysis results: understanding individual defects; prioritizing, sorting, and filtering sets of defects
  - interact with other engineers
  - influence the analysis
- Current tools are at best "okay" here; improvement is highly leveraged

Example: Noise
- Noise = "messages people don't care about"
- Noise can result from incorrect tool requirements, integration issues, usability issues (e.g., unclear messages), analysis inaccuracies, ...

Dealing with noise
- Improving the analysis is usually not sufficient; it may be vital, or it may not even be required
- Successful user interaction techniques: filtering, history (a small example follows), prioritization, improving presentation and navigation, providing more detail
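As one illustration of the "history" technique, the sketch below fingerprints each message by file and checker name and suppresses anything already present in the previous run's baseline, so engineers only see what is new. All names, message formats, and the fingerprinting choice are hypothetical, not taken from any particular tool.

    /* History-based noise suppression: show only messages whose fingerprint
     * was not present in the previous run's baseline. */
    #include <stdio.h>

    static unsigned long fingerprint(const char *file, const char *checker) {
        unsigned long h = 5381;                  /* djb2 over file + checker;    */
        for (const char *s = file;    *s; s++) h = h * 33u + (unsigned char)*s;
        for (const char *s = checker; *s; s++) h = h * 33u + (unsigned char)*s;
        return h;                                /* line numbers are left out so */
    }                                            /* edits don't "move" old bugs  */

    static int in_baseline(unsigned long fp, const unsigned long *base, int n) {
        for (int i = 0; i < n; i++)
            if (base[i] == fp) return 1;
        return 0;
    }

    int main(void) {
        unsigned long baseline[] = { fingerprint("io.c", "NULL_DEREF") };  /* last run */
        const char *msgs[][2] = { { "io.c", "NULL_DEREF" }, { "net.c", "LEAK" } };
        for (int i = 0; i < 2; i++)
            if (!in_baseline(fingerprint(msgs[i][0], msgs[i][1]), baseline, 1))
                printf("NEW: %s: %s\n", msgs[i][0], msgs[i][1]);   /* net.c only */
        return 0;
    }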

Outline
- What makes a tool successful?
- Characteristics of successful tools
- Analysis in context
- Implications for analysis
- Conclusion

Characteristics of useful analyses
- Scalable to a "large enough" system; typically implies incomplete, unsound, decomposable, and/or very simple
- "Accurate enough" for the task at hand
- Produce information usable by a typical engineer
  - E.g., if there's a defect: where? how? why?
  - Remember: half the engineers are below average
- Handle full language complexity (or degrade gracefully for unhandled constructs)
- Handle partial programs

Analyses are not useful if...
- They don't apply to the tool's "reality":
  - "For a subset of C, excluding pointers and structs..."
  - "We have tested our approach on programs up to several thousand lines of Scheme..."
- They assume up-front work for the end user (an example of the comment-annotation style follows):
  - "Once the programmer modifies the code to include calls to the appropriate functions..."
  - "The programmer simply inserts the annotations to be checked as conventional comments..."
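The last quotation refers to the style used by annotation checkers such as LCLint/Splint, where properties ride along in ordinary comments. A small illustrative example follows; the function, the call, and the exact annotation placement are invented here rather than copied from any tool's documentation.

    #include <stdio.h>
    #include <string.h>

    /* The comment-annotation declares that lookup may return NULL. */
    static /*@null@*/ const char *lookup(const char *key) {
        return strcmp(key, "host") == 0 ? "localhost" : NULL;
    }

    int main(void) {
        const char *v = lookup("host");
        printf("%zu\n", strlen(v));   /* a checker honoring the annotation flags
                                         this as a possible NULL dereference    */
        return 0;
    }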

Different tradeoffs from compilers
- Focus on information, not just results; compilers don't have to explain what they did and why
- Unsoundness is death for optimization, but may be okay for other purposes
- Intra-procedural analysis is often not enough

Types of analyses
- FCIA: flow- and context-insensitive
- FSA: flow-sensitive
- CSA: context-sensitive
- FCSA: flow- and context-sensitive
- PSA: path-sensitive (the fragment below shows where these classes give different answers)
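A small invented fragment makes the distinctions concrete. Whether the write through p below is safe depends on the correlation between ok and p: a path-sensitive analysis can prove it; a flow-sensitive but path-insensitive analysis merges the branches and reports a possible NULL dereference; a flow- and context-insensitive analysis only knows that p may be NULL somewhere in the function.

    #include <stdlib.h>

    int fill(int n) {
        int *p = NULL;
        int ok = 0;
        if (n > 0) {
            p = malloc(n * sizeof *p);
            ok = (p != NULL);
        }
        if (ok)           /* safe only because ok == 1 implies p != NULL on   */
            p[0] = n;     /* every path reaching here; a path-sensitive fact  */
        free(p);
        return ok;
    }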

27、s”) PREfast: fast, usable analysis results Local analyses, using PREfix models Flow-insensitive and flow-sensitive analyses Far less complete than PREfix,Jon Pincus (Microsoft Research),34,Aside: Techniques for scalability,Decompose the problem Use the existing structure (function, class, etc.) Summ

Aside: Techniques for scalability
- Decompose the problem: use the existing structure (function, class, etc.)
- Summarization, memoization (caveat: make sure you don't lose key info!)
- Give up completeness and soundness
  - Use three-valued logic with a "don't know" state (sketched below)
  - Track approximations to limit the damage
- Examine and re-examine tradeoffs!
- Optimize for significant special cases
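A minimal sketch of the three-valued idea, with invented names: every fact carries an explicit "don't know" value, and the merge at control-flow joins prefers admitting ignorance over guessing, which is what keeps the approximations trackable.

    /* Three-valued facts for a "don't know"-aware analysis. */
    typedef enum { KNOWN_NO, KNOWN_YES, DONT_KNOW } fact;

    /* Join where control-flow paths meet: agreement survives, otherwise unknown. */
    static fact join(fact a, fact b) {
        return (a == b) ? a : DONT_KNOW;
    }

    /* Logical AND lifted to three values, e.g. "both pointers are non-null". */
    static fact and3(fact a, fact b) {
        if (a == KNOWN_NO || b == KNOWN_NO)   return KNOWN_NO;
        if (a == KNOWN_YES && b == KNOWN_YES) return KNOWN_YES;
        return DONT_KNOW;
    }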

Outline
- What makes a tool successful?
- Characteristics of successful tools
- Analysis in context
- Implications for analysis
- Conclusion

Recap: successful tools
- People use tools to accomplish their tasks
- Successful tools must address real problems, on real code bases, give something for (almost) nothing, and be easy to use
- Analysis is only one piece of a tool
- Information is useless if it's not presented well

One person's opinion
- Why are so few successful real-world development and testing tools influenced by program analysis research?
- Several key areas are outside the traditional scope of program analysis research:
  - user interaction
  - visualization (of programs and analysis results)
  - integration

One person's opinion (cont.)
- Why are there so few successful real-world programming and testing tools based on academic research?
- Program analysis research in general:
  - is not directly focused on "key problems"
  - is not applicable to "real world" code bases
  - makes unrealistic assumptions about up-front work

One tool developer's mindset
- We have plenty of ideas already; we can't even implement all our pet projects!
- We are interested in new ideas but skeptical: the burden is on you to show relevance
- Remember, analysis is only part of our problem
- If we can't figure out how to present it, forget it

Making analysis influential
- Show how the analysis addresses a significant problem: synchronization, security, ...
- Convince us that it will work in our reality: avoid the obvious problems discussed above
- Demonstrate in our reality (perhaps by using real-world code bases), or persuade us that it will work

Some interesting questions...
- Which analyses are right for which problems?
- How do we get difficult analyses to scale well? Are there soundness/completeness tradeoffs?
- Are there opportunities to combine analyses?
  - Can we use a cheap flow-insensitive algorithm to focus a more expensive algorithm on juicy places?
  - Can we use expensive local path-sensitive algorithms to improve global flow-insensitive algorithms?

Beyond analysis
- Can visualization and user interaction for analysis tools become an interesting research area?
- How can analysis be used to refine visualization and user interaction?

Questions?
Analysis is necessary but far from sufficient
Jon Pincus, Reliability Group (PPRC), Microsoft Research
