Systems/Software ICM Workshop
Acquisition and Process Issues Working Group
Rick Selby and Rich Turner
July 14-17, 2008, Washington DC

Process & Acquisition Participants
- Rick Selby, Northrop Grumman (co-chair)
- Rich Turner, Stevens Institute of Technology (co-chair)
- Steven Wong, Northrop Grumman
- Ernie Gonzalez, SAF
- Ray Madachy, USC
- Matt Rainey, Redstone Huntsville
- Dan Ingold, USC
- Dave Beshore, Aerospace
- Lindsay MacDonald, BAE Systems
- Blake Ireland, Raytheon
- Bill Bail, Mitre
- Barry Boehm, USC
- John Forbes, OSD/SSA
- Carlos Galdamez, Boeing
- Gail Haddock, Aerospace

Software/Systems Process and Acquisition Initiatives
- Start-up teams
- SW start-up teams
- Methods for driving behavior for SW risk reduction
- SW leadership meeting with Chief SW Engineers
- Prioritized SW requirements list with cut-points at PDR
- Independent SW risk team (non-contractor)
- SW design reviews are risk-centric, not function-centric
- 1st SW Build at PDR for key components for SW architecture evidence
- TRL framework for SW
- Ideal forms of SW evidence
- Engage right SW decision makers
- New SW acquisition approaches
- Architecturally significant SW requirements
- Enable parallel, open, competitive environment for SW acquisition
- SW invariants that you must have in order to adopt ICM
- Learn from CMMI appraisals for evidence-based reviews
- Develop approaches to minimize/prevent protests

Ideal Program PDR (pre-Milestone B)
- Attendees: “right decision-makers” attend and
will be involved prior to the PDR P 12, 3, 0 E 0, 1, 2
  - All success-critical stakeholders engaged, including technical warrant holders who will authorize ultimate deployment
- Focus of meeting and method of evaluation
  - Focus on risks (vs. functionality) of achieving the desired functionality within the proposed architecture
  - Evidence-based review
  - The decision making actually occurs before the milestone; how to empower the decision makers
- Technical knowledge
  - Level the playing field in terms of technical knowledge, such as embedding engineers in the contractor organization and making the acquisition personnel more knowledgeable about SW
  - Need to have government/FFRDC/UARC SW tiger teams that go into and help acquisitions and programs, such as from the Tri-Service Acquisition Initiative, including Start-Up Teams to help launch new programs
- Risks
  - Rather than listing the system “functions”, we list the “risks” at the review P 10, 4, 0 E 0, 3, 0
  - Independent groups (non-contractor) identify and investigate the risks P 9, 5, 0 E 0, 3, 0
- Architecture
  - At least one SW Build for each key (such as high-risk) software component (maybe CSCI) to demonstrate its functionality and integration, which demonstrates that the SW people have explored the preliminary design space P 10, 2, 0 E 3, 0, 0
  - Need to define architecturally significant requirements (by definition, these architecturally significant risks are addressed in the first release) and map these to risks P 10, 2, 0 E 0, 0, 3
  - Scalability, performance: if architecturally significant risks are not addressed, the system “will fail”
  - Somehow put the architectural baseline in place earlier
- Requirements
  - Prioritized list of requirements/capabilities/features from which the customer can select the “cut point” based on the degree of value, risk, budget, and other new information P
13, 0, 0 E 1, 1, 1
  - Incorporate some notion of how to change the requirements to reduce risk
  - Need to be able to assess whether requirements allocated to configurable items make sense

Comments on Draft 5000.2 Language SA 8, 4, SD 3
3.5.10. A System Preliminary Design Review(s) (PDR(s)) shall be conducted for the candidate design(s) to establish the allocated baseline (hardware, software, human/support systems) and underlying architectures and to define a high-confidence design. All system elements (hardware and software) shall be at a level of maturity commensurate with the PDR entry and exit criteria as defined in the Systems Engineering Plan. A successful PDR will provide independently validated? evidence that supports requirements trades decisions; substantiates design decisions; improves cost, schedule, and performance estimation; and identifies remaining design, integration, and manufacturing risks. The PDR shall be conducted at the system level and include user representatives, technical authority, and associated certification authorities. The PDR Report shall be provided to the MDA at Milestone B and include recommended requirements trades based upon an assessment of cost, schedule, and performance risks?
Synergy with ICM: Greater emphasis on risk-driven decisions, evidence, and high-confidence designs

Comments on Draft 5000.2 Language SA 7, 6, SD 1
3.5.10. A Preliminary Design Review (PDR) shall be conducted for the candidate design(s) to establish the allocated baseline (hardware, software, human/support systems) and underlying architectures and to define a high-confidence design. At PDR, evidence shall be provided that independently? validates that all system elements (hardware and software) are at a level of maturity commensurate with the PDR entry and exit criteria. A successful PDR will support requirements trades decisions; substantiate design decisions; improve cost, schedule, and performance estimates; and identify remaining design, integration, and manufacturing risks. The PDR shall be conducted at the system level and include user representatives, technical authority, and associated certification authorities. The PDR Report shall be provided to the MDA at Milestone B and include recommended requirements trades based upon an assessment of cost, schedule, and performance risks?
Synergy with ICM: Greater emphasis on
risk-driven decisions, evidence, and high-confidence designs

Some Quotes for Context Setting
- “The only way we will have large acquisition programs on schedule, within budget, and performing as expected, is for everyone, from Congress down to the suppliers, to all stop lying to each other at the same time.”
- “Software's just another specialty discipline and doesn't deserve special attention. Integrating software engineering into the development is the job of the chief system engineer.”
- “It takes so long for a program to reach deployment that we are essentially acquiring legacy systems.”
- “Spiral process is nothing more than the vee chart rolled up.”
- “There is no such thing as an emergent requirement.”
- “Evolutionary acquisition is just a ploy to excuse the software guys' incompetence and let programs spiral forever without having to deliver something.”

Some Topics for Discussion: Acquisition and Process
- Quality Factor Tradeoffs
  - Integrating hardware and software quality factor evidence planning and preparation guidelines
  - Coordinating single-quality-factor IPTs
- Cost and Risk
  - Budgeting for systems and software risk mitigation
  - Risk-driven earned value management
  - Translating shortfalls in feasibility evidence into next-increment risk management plans
- Requirements
  - Concurrently engineering vs. allocating system, hardware, software, and human factors requirements
  - Methods for dealing with requirements emergence and rapid change
- Competitive Prototyping
  - Supporting value-adding continuity of prototype development and evaluation teams
- Topic Specifics
  - Synchronizing different-length hardware and software increments
  - Early hardware-software integration: hardware surrogates
  - Contracting for 3-team developer/V&Ver/next-increment rebaseliner incremental development

[Figure: Incremental Commitment Life Cycle Process (USC-CSSE). Stage I: Definition; Stage II: Development and Operations]

Understanding ICM Model for Software
- Reconciling the milestones
  - Where are LCO/LCA/IOC and SRR/PDR/CDR?
  - When are the downselects: 3 to 2, 2 to 1?
- How to drive behavior
  - RFP language
  - Award fee
  - Large carrot (sole winner of major program
)
- How long does the competitive phase last (ends at Milestone B, ends later, etc.)?
- Create a “whole new contractor role” that gets awarded to the 2-to-1 downselect non-winner
  - External evaluators come into reviews (“air dropped”) and have a high entry barrier and limited context to achieve success
  - Loss of valuable expertise in the non-winner
  - Non-winner becomes the “evaluator” of evidence throughout the program
- What kinds of evidence/prototypes are needed for what kinds of risks?
- Funding
  - Who pays for pre vs. post 2-to-1 downselect (what color)?
- How do you use CP to do:
  - New approaches for model definition and validation
  - Quality attribute trades (non-functional)

Ranked Summary of Initiatives (High to Low)

Issues - 1
- John Young was seeing CP as a way to “get the HW right”
  - He did not expect CP to cause all this discussion about SE/SW
- What is the order of buying down risk?
- We currently do evidence-based reviews for CMMI appraisals? P 1, 4, 9 E 2, 0, 1
- How do we change the behavior of both the vendor and acquirer?
  - Reviewers now “tune out” when the SW architecture presentation is given because it is hard to “bring it to life”
  - ICM ties together goals of reviewers
- Navy currently has a six-gate review system
  - Has an emphasis similar to ICM, including both system and software
- ICM has “sufficient levels of vagueness”; it provides opportunity for tailoring, which is a positive (flexible)
- How can we figure out the HW-SW touchpoints?
- SW has the inherent value of changeability

Issues - 2
- What are the ideal forms of evidence? P 12, 0, 1 E 0, 1, 2
  - Demonstrating is not a complete answer? Needs to be a validated demo that addresses the risks
  - For example, on the early FCS reviews there were many dog-and-pony shows with no/little talk about risks
- There are already lots of gates and reviews in place now, but the Army had seven Nunn-McCurdys last year
- The decision makers are not attending the early reviews
  - These people are needed, not just surrogates “who just take notes”
  - When do you start addressing these issues and when do you push these issues up the chain?
  - The review attendees are “going for the show”, not “to do the review”
- The contractor overwhelms the reviewers in terms of technical knowledge
  - Somehow we need to level the playing field in terms of technical knowledge
  - Need some form of parallel teams
- Risk: PDRs are currently oriented around functions
  - Rather than listing the system “functions”, we list the “risks” at the review
  - This enables something that the reviewers can focus on
  - The “ranked risk list” becomes a first-class document that is at least as important as other design documents
- At the PEO/IWS (Navy), there is an emerging requirement that prior to System PDR, there will have been at least
one SW Build for each CSCI to demonstrate its functionality and integration
  - Demonstrate performance-critical functionality
- We should define the “invariants” that you must have in order to adopt ICM
- We need to make sure that the risks that are currently being presented are honest/accurate
  - The government reviewers somehow identify the risks, and can empower/contract some teams to address these risks
  - Independent groups (non-contractor) identify and investigate the risks
- Requirements organization and presentation
  - Need to define architecturally significant requirements and map these to risks
  - The first release (“indivisible build 1”) needs to address all architecturally significant risks
- Take a fraction of the predictable overruns (50-100%) and spend it up front to reduce risks

Issues - 3
- Need TRL framework for SW? P 7, 6, 0 E 0, 2, 1
  - Maybe do not call this framework “TRL” because of confusion with existing HW-centric TRLs
  - MDA has SWRLs (software readiness levels) now and it works pretty well
  - Navy has ratings for process and functionality, analogous to TRLs
  - Interface-level readiness too (from IDA workshop, April 2008)
  - SMC uses Key Decision Points (KDPs) as the major decision milestones, and their KDP-B occurs after the SW reviews now
- Need to change the attitude of the senior acquisition and policy decision makers (above chief engineer level)
  - SW illiteracy exists at the highest levels, such as arguments about whether to do SDPs
  - Need to re-instate the original language that was proposed for the DoD 5000 revision
- Need to think broadly about new acquisition approaches, such as moving away from fee-on-labor cost-plus contracting vehicles to “new incentive models” P 11, 2, 1 E 0, 0, 3
  - Navy sonar systems have a periodic re-bidding approach/process where contractors continually
re-bid on new capabilities (“technology refresh cycle”)
  - One inhibitor: how to protect IP that is a discriminator for contractor corporations, including the underlying methods for producing the products
- Boeing made middleware for UAVs open source, shared across military contractors
- ELV Atlas-V Ground Control System uses Linux libraries
- Naval open architecture for open systems (including shared code) initiative, including contract terms and licensing
- Domain-based “members-only” open source models
  - How about moving toward an open source model?
  - Development tools as well as systems
  - How to address assurance?
- Government has “open access” to a disk farm where all development artifacts (requirements, design, source code, test code, etc.) are stored/developed and can therefore be inspected/analyzed
- Have common test beds
  - Such as the original Ada validation suite
  - Define common test beds that also provide a meeting place and communities to interact
  - DDR&E has several development and test bed environments now that enable the SBIR teams to develop products
- How to address unknown unknowns
  - Making unknown unknowns known

Issues - 4
- Enable parallel, open, competitive environment P 6, 4, 3 E 0, 0, 3
- When acquiring a new system, adopt a members-only open source model:
  - Standard middleware (think RT Linux for the ELV Atlas-V ground system and now NASA Ares ground)
  - Apple-like App Store for developers to develop and sell applications; Google has gadgets (free), Microsoft has gadgets (free), Yahoo has widgets (free)
  - Must have some sequence of gates to ensure that SW is “reasonable” / do-no-harm
  - Members-only contributors
  - SW is “low cost” but need to pay for support
  - Multi-tier pricing scheme: execute-only, source for customers, source for all
  - Acquirers can select/purchase the applications that have the most value
- What are the
incentives for contractors to invest in developing these applications?
- Example: ACE/TAO is open source middleware that is being used on Navy SSDS large-deck combat systems
  - Enables new potential competitors because of externally known interfaces
- Architecture would need to be able to accommodate this “new thinking”

Issues - 5
- What is the earliest point to end Competitive Prototyping? Sometime between Milestones A and B
- What is the latest point to end Competitive Prototyping?
  - You can build 2 or more complete systems by keeping competition going throughout the lifecycle
  - You continue the competition until the decision makers (and success-critical stakeholders) have sufficient evidence that the risks are “acceptable enough” to go forward with one contractor
  - You can possibly re-open the competition later for some aspects of the program
- The current working assumption is that you can downselect to one contractor at Milestone B
- Can we gain knowledge over time?
- Once a winner is selected, you want to hit the ground running and not lose any time and talent
- How do we minimize/prevent protests? P 0, 5, 8 E 0, 0, 3
  - Will the early rounds of prototyping show you enough evidence to justify going with a sole-source award (and therefore avoid protests)?
- Milestone B brings on new requirements and formal briefings to Congress
  - Right now, the government declares the budget before Milestone B
  - It is very difficult to re-certify programs when you exceed 25% of the original budget
  - Most programs that Nunn-McCurdy once do it again because of staff loss, etc.
- Government wants cost and schedule realism
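
The prioritized-requirements “cut point” idea that recurs above (the customer funds down a value-ranked list until budget or risk says stop) can be sketched in a few lines. This is a hypothetical illustration, not anything presented at the workshop: the requirement names, costs, budget figure, and the simple greedy walk are all assumptions.

```python
def select_cut_point(requirements, budget):
    """Walk a list already sorted by priority and stop at the first
    requirement that would exceed the budget; that is the cut point.
    Everything below the cut point stays on the list for a later increment."""
    funded, spent = [], 0
    for req in requirements:
        if spent + req["cost"] > budget:
            break  # cut point reached
        funded.append(req["name"])
        spent += req["cost"]
    return funded, spent

# Illustrative requirements, highest priority first (names/costs invented)
reqs = [
    {"name": "track-management", "cost": 40},
    {"name": "sensor-fusion",    "cost": 35},
    {"name": "operator-display", "cost": 30},
]
funded, spent = select_cut_point(reqs, budget=80)
```

In practice the ordering itself would fold in risk and new information, not just stated value, and the cut point would be revisited at each review as the budget and risk picture change.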