Team Software Project (TSP), June 26, 2006: System Test


SE 652, 6/26/2007 (2007_6_26_TestResults_PSMp1.ppt)

Outline
- Remaining Session Plan & Discussion
- System Test Plan Discussion
- Mythical Man Month
- System Test Plan Recap
- Metrics Presentations
- More on Measurement
- Next Phases: Cycle 1 Test; Cycle 1 Post-Mortem & Presentations; Cycle 2 Plan & Strategy

Due Today
- Key Metrics Presentation (10-15 minutes)
- All Implementation Quality Records (LOGD, CCRs, etc.)
- Final code (source & executable)
- Updated Products (code components, SRS, HLD, User Documentation)
- Intermediate Products (e.g. Unit Test Plans)
- Configuration Management Plan
- Release CD: Application, User Guide, Release Letter
- No class on July 3

Project Performance Discussion

Remaining Lectures Plan/Discussion
- July 10: Cycle 1 Test Complete & Post-Mortem
  - Cycle 1 Results Presentation & Discussion
  - Cycle 1 Reports & Post-Mortem
  - Measurement
  - Team audit
- July 17: Cycle 2 Launch
  - Cycle 2 Launch, Project & Measurement Planning
  - Peopleware topics: Management, Teams, Open Kimono, Quality, Hiring/Morale
- July 24: Cycle 2 Requirements Complete
  - Cycle 2 Requirements
  - Death March Projects
- July 31: Cycle 2 Implementation Complete; System Test Plan Baselined
  - Cycle 2 Design & Implementation
  - Process topics: CMMI, TL-9000, ISO
- August 7: Cycle 2 Test Complete
  - Cycle 2 Post-Mortem Complete
- August 14: Course Review
  - Class exercise
  - Final

Remaining Course Topics Discussion

System Test Schedule
Note: assumes the system has already passed Integration Test.
- Full feature to system test and instructor by COB June 25, including:
  - Test environment
  - Executable
  - User documentation (note: CCRs can be filed against user documentation)
  - Source code
- Tester generates CCRs for all finds and fills out LOGTEST; email to the instructor when generated (see below)
- Development team updates LOGD, referencing CCRs
- Required turn-around times for fixes: 80% within 24 hours; 99% within 48 hours
- Required test coverage, short of blocking issues:
  - 80% first-pass test complete by June 28
  - 100% first-pass test complete by July 1
  - Regression test complete by July 3
- Daily test reports to the instructor detailing test cases executed, results & CCRs
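The fix turn-around targets above (80% within 24 hours, 99% within 48 hours) can be checked mechanically from logged fix times. A minimal sketch in Python; the sample hours are invented for illustration and are not taken from the course's LOGD/LOGTEST forms:

```python
# Hypothetical turn-around data: hours from CCR filing to fix delivery.
fix_turnaround_hours = [3, 10, 22, 18, 30, 6, 12, 47, 20, 9]

def sla_compliance(hours, limit):
    """Fraction of fixes delivered within `limit` hours."""
    within = sum(1 for h in hours if h <= limit)
    return within / len(hours)

within_24h = sla_compliance(fix_turnaround_hours, 24)
within_48h = sla_compliance(fix_turnaround_hours, 48)

# Required: 80% within 24 hours, 99% within 48 hours.
print(f"within 24h: {within_24h:.0%} (target 80%)")  # prints "within 24h: 80% (target 80%)"
print(f"within 48h: {within_48h:.0%} (target 99%)")  # prints "within 48h: 100% (target 99%)"
```

A daily report script like this could feed the required test reports to the instructor directly from the log data.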

System Test Plan Recap
- Areas to cover:
  - Installation
  - Start-up
  - All required functions available & working as specified
  - Diabolical cases (e.g. power failures, corner cases, incorrect handling)
  - Performance
  - Usability
- Includes:
  - Test cases you plan to run (numbered/named)
  - Expected results
  - Ordering of testing & dependencies
  - Supporting materials needed
  - Traceability to requirements

Release "Letters"
- Purpose
- What's in it?
  - Version information
  - Release contents. Examples:
    - All functionality defined in Change Counter Requirements v0.6 except GUI
    - Phase 1 features as defined in project plan x.y
    - Feature 1, Feature 2, Feature 3 as defined by ...
  - Known problems: change request IDs with a brief customer-oriented description
  - Fixed problems
  - Upgrade information
  - Other?

Implementation Status
- Implementation experience
- Unit/Integration experience
- Problems / Rework?
- PIP forms

Implementation & Test Discussion (Team Presentation)
- Sample topics:
  - Obstacles to success?
  - Things that went well?
  - Things to avoid?
  - Biggest surprises?
  - How did you do vs. plan?
  - Crises handled? Team dynamics in crisis?

Project Measurement
Source: Practical Software Measurement, John McGarry et al.

Measurement
"If you can't measure it, you can't manage it" (Tom DeMarco)

Fundamentals
- Don't try to measure everything
- Align measures with:
  - Project goals & risks (basic survival mode)
  - Process improvement areas (continual improvement mode)
- Define the measurement program up front
- Monitor continuously & take action where needed

Applications
- Improve accuracy of size & cost estimates
- Improve quality
- Understand project status
- Produce more predictable schedules
- Improve organizational communication
- Faster, better-informed management decisions
- Improve software processes

Basic In-Process Measurement Examples
- Schedule: Earned Value vs. Planned Value; Schedule Variance
- Development: task completion; actual code completed vs. planned
- Project end game: defect creation vs. closure (variations: severity)
- System test: % testing complete (variations: passed, failed, blocked); test time / defect; test coverage (vs. requirements, white-box code coverage)

Process Improvement Measurement Examples
- Quality: defect density; post-deployment defect density
- Inspection effectiveness: defects / inspection hour
- Estimation accuracy
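The schedule measures listed above (Earned Value vs. Planned Value, Schedule Variance) reduce to simple arithmetic over the task plan. A sketch with a made-up task list; note that TSP-style earned value typically credits only fully completed tasks, while this sketch allows partial credit:

```python
# Minimal earned-value sketch. Each task: (name, planned_value, fraction_complete).
# PV is the budgeted value of work scheduled to date; EV credits that value
# as tasks actually complete. All task names and values are illustrative.
tasks = [
    ("design review", 10.0, 1.0),   # fully complete
    ("module A code", 20.0, 1.0),
    ("module B code", 20.0, 0.5),   # half done
    ("unit tests",    15.0, 0.0),   # not started
]

planned_value = sum(pv for _, pv, _ in tasks)            # work scheduled to date
earned_value = sum(pv * done for _, pv, done in tasks)   # work actually completed
schedule_variance = earned_value - planned_value         # negative = behind schedule

print(f"PV={planned_value}, EV={earned_value}, SV={schedule_variance}")
# prints "PV=65.0, EV=40.0, SV=-25.0"
```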

Why Measure?
- Support short- & long-term decision making
- A mature software organization (CMMI level?) uses measurement to:
  - Plan & evaluate proposed projects
  - Objectively track actual performance against plan
  - Guide process improvement decisions
  - Assess business & technical performance
- Organizations need the right kind of information, at the right time, to make the right decisions

Measurement in the Software Lifecycle
- Plan
- Do: carry out the change
- Check: observe the effects of the change
- Act: decide on additional areas for improvement
- Repeat
- Considerations: cost, schedule, capability, quality

Measurement Psychological Effects
- Measurement as a measure of individual performance
- Hawthorne Effect
- Measurement errors:
  - Conscious: rounding, pencil whipping (i.e. false data entry)
  - Unintentional: inadvertent, technique (i.e. consistent)

Use of Measures
- Process measures: time-oriented; include defect levels, events & cost elements; used to improve the software development & maintenance process
- Product measures: deliverables & artifacts such as documents; include size, complexity, design features, performance & quality levels
- Project measures: project characteristics and execution; include # of developers, cost, schedule, productivity
- Resource measures: resource utilization; include training, costs, speed & ergonomic data

Measurement Uses
- Objective information to help:
  - Communicate effectively
  - Track specific project objectives
  - Identify & correct problems early
  - Make key trade-off decisions
  - Justify decisions

Glossary
- Entity: an object or event (e.g. personnel, materials, tools & methods)
- Attribute: a feature of an entity (e.g. # LOC inspected, # defects found, inspection time)
- Measurement: numbers and symbols assigned to attributes to describe them
- Measure: a quantitative assessment of a product/process attribute (e.g. defect density, test pass rate, cyclomatic complexity)
- Measurement reliability: consistency of measurements, assuming no change to method/subject
- Software validity: proof that the software is trouble-free & functions correctly (i.e. high quality)
- Predictive validity: accuracy of model estimates
- Measurement errors: systematic (associated with validity) & random (associated with reliability)
- Software metrics: an approach to measuring some attribute
- Defect: a product anomaly
- Failure: termination of a product's ability to perform a required function

PSM Measurement Process
- Measurement Plan
  - Information need, e.g.: What is the quality of the product? Are we on schedule? Are we within budget? How productive is the team?
  - Measurable concept: measured entities to satisfy the need (abstract level, e.g. productivity)
  - Measurement construct: What will be measured? How will data be combined? (e.g. size, effort)
  - Measurement procedure: defines the mechanics for collecting and organizing data
- Perform Measurement
- Evaluate Measurement

Measurement Construct
(diagram: decision criteria, analysis model, measurement function, measurement methods)
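The plan elements above (information need, measurable concept, construct, procedure) can be made concrete with a tiny end-to-end example: two base measures combined by a measurement function into a derived measure, then judged against a decision criterion. All names, data, and the threshold here are assumptions for illustration, not part of PSM itself:

```python
# Base measures: attribute quantifications on a scale.
effort_hours = [40, 38, 45, 50]  # effort per week (measurement method: timesheet count)
size_kloc = 4.2                  # size (measurement method: KLOC counter)

# Measurement function producing a derived measure.
def productivity(kloc, hours):
    """Derived measure: KLOC per developer-hour."""
    return kloc / hours

# Analysis model plus decision criteria yield the indicator.
derived = productivity(size_kloc, sum(effort_hours))
threshold = 0.02  # decision criterion (assumed): below this triggers investigation
indicator = "investigate" if derived < threshold else "on track"
print(f"productivity={derived:.4f} KLOC/hr -> {indicator}")
```

The point of the construct is that each layer is explicit: swapping the KLOC counter for a function-point count changes only the base measure, not the analysis model or the decision criteria.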

Attributes
- Attribute: a distinguishable property or characteristic of a software entity (entities: processes, products, projects and resources)
- Qualitative or quantitative

Base Measure
- A measure of an attribute (one-to-one relationship)
- Measurement method: attribute quantification with respect to a scale
- Method type: subjective (e.g. high, medium, low) or objective (e.g. KLOC)
- Scale: ratio, interval, ordinal, nominal
- Unit of measurement: e.g. hours, pages, KLOC

Derived Measure & Indicator
- Derived measure: a function of 2 or more base measures
- Measurement function: the algorithm for deriving the data (e.g. productivity = KLOC / developer-hours)
- Indicator: an estimate or evaluation
- Analysis model: an algorithm/calculation using 2 or more base and/or derived measures
- Decision criteria: numerical thresholds, targets, limits, etc. used to determine the need for action or further investigation

Measurement Construct Examples
- Productivity
  - Attributes: hours, KLOC
  - Base measures: effort (count total hours), size (KLOC counter)
  - Derived measure: size / effort = productivity
  - Analysis model: compute mean, compute standard deviation
  - Indicator: productivity mean with 2-sigma confidence limits
- Quality
  - Attributes: defects, KLOC
  - Base measures: # defects (count defects), size (KLOC counter)
  - Derived measure: # defects / size = defect rate
  - Indicator: defect rate control: baseline mean, control limits & measured defect rate

More Measurement Construct Examples
- Coding
  - Base measure: schedule (w.r.t. coded units)
  - Derived measure: planned units, actual units
  - Analysis model: subtract units completed from planned units
  - Indicator: planned versus actual units complete + variance

Class Measurement Construct Examples
- Coding
  - Base measure:
  - Derived measure:
  - Analysis model:
  - Indicator:

Measurement Planning
- Identify candidate information needs
  - Project objectives: cost, schedule, quality, capability
  - Risks: prioritize. One approach: probability of occurrence x project impact = project exposure (e.g. schedule, budget, reliability, dependencies, product volatility)
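The productivity construct above (analysis model: mean and standard deviation; indicator: mean with 2-sigma limits) can be sketched directly. The weekly samples and the latest observation are invented for illustration:

```python
import statistics

# Weekly productivity samples (KLOC per developer-hour); values are illustrative.
weekly_productivity = [0.020, 0.024, 0.019, 0.026, 0.022, 0.021]

# Analysis model: baseline mean and sample standard deviation.
mean = statistics.mean(weekly_productivity)
sigma = statistics.stdev(weekly_productivity)
lower, upper = mean - 2 * sigma, mean + 2 * sigma

# Indicator: compare the latest measurement against the 2-sigma control limits.
latest = 0.029
if not (lower <= latest <= upper):
    print(f"latest {latest} outside control limits [{lower:.4f}, {upper:.4f}]")
else:
    print(f"latest {latest} within control limits")
```

Here the latest sample falls above the upper limit, so the decision criterion flags it for investigation; an unusually *high* productivity reading can signal a counting problem just as easily as a real improvement.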

PSM Common Information Categories
- Schedule & Progress
- Resources & Cost
- Product Size & Stability
- Product Quality
- Process Performance
- Technology Effectiveness
- Customer Satisfaction

PSM Common Information Categories: Measurement Concepts
- Schedule & Progress: milestone dates/completion, EV/PV
- Resources & Cost: staff level, effort, budget, expenditures
- Product Size & Stability: KLOC/FP, # requirements, # interfaces
- Product Quality: defects, defect age, MTBF, complexity
- Process Performance: productivity, rework effort, yield
- Technology Effectiveness: requirements coverage
- Customer Satisfaction: customer feedback, satisfaction ratings, support requests, support time, willingness to repurchase

Select & Specify Measures
- Considerations:
  - Utilize existing data collection mechanisms
  - Be as invisible as possible
  - Limit categories & choices
  - Prefer automated methods over manual
  - Beware of accuracy issues (e.g. timecards)
  - Frequency needs to be sufficient to support ongoing decision making (alternative: gate processes)

Measurement Construct
(diagram slide)

Project Measurement Plan Template
(from PSM figure 3-10, p. 56)
- Introduction
- Project Description
- Measurement Roles, Responsibilities & Communications
- Description of Project Information Needs
- Measurement Specifications (i.e. constructs)
- Project Aggregation Structures
- Reporting Mechanisms & Periodicity

Team Project Postmortem
- Why?
  - Insanity
  - Continuous improvement
  - A mechanism to learn & improve
  - Improve by changing processes, or by better following current processes
- Tracking process improvements during the project: Process Improvement Proposals (PIPs)
- Post-Mortem areas to consider: better personal practices, improved tools, process changes

Cycle 2 Measurement Plan
- Identify cycle 2 risks & information needs
- Review & revise measures & create measurement constructs
- Document in a measurement plan

Postmortem Process
- Team discussion of project data
- Review & critique of roles

Postmortem Process (continued)
- Review process data
  - Review cycle data, including SUMP & SUMQ forms
  - Examine data on team & team member activities & accomplishments
  - Identify where the process worked & where it didn't
- Quality review
  - Analysis of the team's defect data
  - Actual performance vs. plan
- Lessons learned
  - Opportunities for improvement
  - Problems to be corrected in the future
  - PIP forms for all improvement suggestions
- Role evaluations
  - What worked? Problems? Improvement areas?
  - Improvement goals for the next cycle/project?

Cycle Report
- Table of contents
- Summary
- Role reports
  - Leadership: leadership perspective; motivational & commitment issues; meeting facilitation; required instructor support
  - Development: effectiveness of the development strategy; design & implementation issues
  - Planning: team's performance vs. plan; improvements to the planning process
  - Quality / Process: process discipline, adherence, documentation, PIPs & analysis, inspections; cross-team system testing planning & execution
  - Support: facilities; CM & change control; change activity data & change handling; ITL
- Engineer reports: individual assessments

Role Evaluations & PEER Forms
- Consider & fill out PEER forms
  - Ratings (1-5) on work, team & project performance, roles & team members
  - Additional role evaluation suggestions
- Constructive feedback: discuss behaviors or the product, not the person
- Team leaders fill out the TEAM EVALUATION form

Cycle 1 Project Notebook Update
- Updated requirements & design documents: Conceptual Design, SRS, SDS, System Test Plan, User Documentation*
- Updated process descriptions: baseline processes, continuous process improvement, CM
- Tracking forms: ITL, LOGD, Inspection forms, LOGTEST
- Planning & actual performance: Team Task, Schedule, SUMP, SUMQ, SUMS, SUMTASK, CCR*
