Search past ICEAA Workshop Proceedings in the table below and click the title to access the downloadable files.
2007-2024 Workshop Proceedings are available online. For 2006 and earlier, please email us.
Title | Author(s) | Summary | Year | Track |
---|---|---|---|---|
Data-Driven Lifecycle Analysis to Optimize Cost, Risk, and Sustainability | George Bayer | Many government infrastructure investments adhere to a standard lifecycle to estimate program cost, plan replacement timing, and compare business cases to one another in a cost-benefit analysis. What if those lifecycle replacement timelines are inconsistent with system sustainability and are not cost-effective? Some infrastructure systems which are replaced according to an end-of-life schedule can be sustained more cost-effectively for longer periods of time via preventative maintenance. Our team examined multiple infrastructure program replacement timelines and analyzed operational effectiveness, cost/risk trade-offs, system redundancy, and sustainability, and we recommended lifecycle adjustments based on those considerations. We reduced overall program cost by extending replacement timelines, eliminating system redundancy without compromising sustainability, and reprioritizing maintenance portfolios on critical backlogs. We document a comprehensive process to customize program lifecycles to optimize cost, risk, and sustainability. | 2024 | Analytical Methods |
Triage the Sub-Projects: Calculating and Applying Portfolio Contingency | Stephen Koellner | Risk-adjusted cost estimates are needed to understand the potential range of actual costs through execution. Cost risk analysis produces uncertainty distributions which can be used to calculate an expected cost as well as contingency, which can be thought of as the difference between the expected cost and the cost at a higher confidence level chosen for planning purposes. In a portfolio of projects, allocating uncertainty at the portfolio level will result in a different risk-adjusted cost than applying the same allocation at the project level, and so it is unclear whether a portfolio should allocate and manage risk-informed contingency at the portfolio or project level. This topic will explore methods for calculating portfolio contingency, using a tangible example to demonstrate. | 2024 | Analytical Methods |
Things Forgotten Since College - Foundational Statistics | Jordan Harlacher | Statistical analysis is one of the foundations of cost estimating, but fundamentals are easy to overlook. This presentation will help ensure that is not the case for your next estimate, as we discuss how the data collection and organization processes can form the basis for your estimate. Once the relevant data has been collected and organized, the real fun begins, as the central tendencies and variability of the data can now be examined. The central tendencies and variability can be used to determine the most applicable distribution and assess the probability of different events occurring. We will examine the best ways to visualize different data sets, using charts and graphs to convey the information clearly to stakeholders, as visualizing the data can help reveal relationships between variables. Finally, we will touch on key statistics to look for in your regression analysis to ensure a meaningful relationship is defined. | 2024 | Analytical Methods |
Stretching Purchasing Power through Improved Escalation Methods | Amanda Schwark | Escalation methods ensure cost estimates adapt to economic changes and facilitate accuracy and reliability. The NNSA chartered the Programmatic Recapitalization Working Group (PRWG) to track mission-critical equipment directly supporting weapons activities across the NSE. The PRWG maintains a comprehensive database of equipment above the NNSA capital acquisition threshold of $500,000. The previous escalation methodology for equipment purchase price was limited to using a single equipment inflation index. Additional fidelity in price projections can be achieved by leveraging empirical price data and published indices to derive escalation rates specific to various equipment categories. This paper explores our approach to improving upon the previous escalation methodology to better inform planning and programming decisions. This approach can be leveraged when one broad escalation index is used to predict costs for many significantly differing data elements. | 2024 | Analytical Methods |
Spacecraft Design to a Cost Target: From CAIV to Cosmos | Ryan Sukley | Perfect performance of every system is critical for space missions. Identifying capable designs is a challenging process, and one that often comes at the expense of exceeding cost targets. The Cost as an Independent Variable (CAIV) approach helps mitigate this issue by treating cost as a primary consideration in the design or procurement of systems. Establishing a fixed cost target sets a ceiling for the cost versus performance trade-off and, in the case of NASA's in-house spacecraft, enables more cost-conscious decision making. This paper examines the application of CAIV to identify upper bounds for parameters (mass, power, quantity, etc.) early in the process of designing a spacecraft that satisfies mission requirements. It describes the process of developing, maintaining, and explaining the limitations of this capability, and addresses potential applications of the approach to other commodities. | 2024 | Analytical Methods |
Early-Stage Cost Growth CER Development | Gabriel Sandler | Capital acquisition projects at the National Nuclear Security Administration (NNSA) have experienced significant early-stage cost estimate growth, driven in part by early optimism and missed scope. To account for these potential scope changes, NNSA's Office of Programming, Analysis, and Evaluation (PA&E) developed a cost estimating relationship (CER) for construction projects which relates the actual total project cost (TPC) to its early-stage scope estimate. This methodology differs from usual CERs which model actual cost as a function of actual scope, but reflects the scope uncertainty NNSA projects have at early stages. Three cost drivers (gross square footage, hazard category, and equipment complexity) were selected as the variables to solve for the TPC. The results of the CER were compared to another PA&E CER built with actual scope and actual costs so that early-stage cost estimate growth at the NNSA for various types of capital acquisition projects could be quantified. | 2024 | Analytical Methods |
Market Dimensional Expansion, Collapse, Costs, and Viability | Douglas K. Howarth | Most government programs set out with cost caps and minimum force requirements. Commercial projects usually begin with a budget, sales targets, and specifications. All too often, in both cases, producers and customers give little thought to the changing market structures they face. When it comes to Demand, markets self-organize to form up to four boundaries each, including 1) Upper (price-limited), 2) Outer (saturation-limited), 3) Inner (efficiency-limited), and 4) Lower (margin-limited) Demand Frontiers. When new market segments appear as different product forms with enhanced functionality over existing options, as the new markets grow, the product groupings they replace may contract across one or more Demand Frontiers. This paper examines preparing for these inevitable eventualities in an N-dimensional framework. | 2024 | Analytical Methods |
Comparison of UMP in the Great Recession and the Covid-19 Recession | Nathan Gallacher | This piece reviews the Unconventional Monetary Policy (UMP) used in the Great Recession (2007-09) and the COVID-19 Recession, then compares the two recessions to show how unconventional monetary policy changed, which tools the Bank of England used, and the scale at which those tools were deployed. Notably, tools such as quantitative easing saw use in both recessions, suggesting similarities in the aims of the Bank of England during each. The main results show a significant increase in the use of unconventional monetary policy from the Great Recession to the COVID-19 Recession. At the same time, inflation outcomes were worse during the COVID-19 Recession. This suggests that the BoE's greater use of UMP in response to the COVID-19 Recession may not have been as effective in controlling inflation as it was in the Great Recession. | 2024 | Analytical Methods |
Explosive Analysis: Using Data to Hold Warfare Centers Accountable | Ryan Webster | The Joint Service Explosive Ordnance Procedure Publications program creates and maintains critical documents for the safe handling of ordnance. This effort is managed by Naval Warfare Centers. Historically, senior leadership has funded these efforts without the ability to evaluate the reasonableness of annual funding requests. Augur has recently obtained publications system data, enabling valuable analysis of historical efforts. This data is being leveraged to develop a planning calculator capable of estimating ranges of labor hours based on ordnance type, country of origin, and other complexity drivers derived through regression analysis and data visualization techniques. This tool and the accompanying insights will enable senior leadership to negotiate with warfare centers and more easily measure performance. | 2024 | Data Science & Machine Learning |
Maximizing Analysis of Minimalized Datasets | Taylor Fountain | Many techniques exist to determine parametric relationships within large datasets. While cost estimation relies heavily on identifying such relationships, a data-scarce environment, driven by factors such as vendor proprietary restrictions, security concerns, and the uncertainty of emergent technologies, is a common barrier in implementing these techniques. This topic will evaluate common methods to analyze minimalized datasets for developing defendable cost estimates, such as complexity factors and 3-point distribution fitting, and demonstrate the statistical impacts of their underlying assumptions. | 2024 | Data Science & Machine Learning |
Labor Mapping in Parametric Estimates | David Ferland | Contractors and Original Equipment Manufacturers (OEMs) alike often struggle to apply specific resources or labor categories to their parametric estimates. Many parametric modeling techniques produce hours by generic resources that still need to be translated into labor resources that have rates and other attributes before they can be useful for analysis. I will outline a tool development framework that fills this gap and allows the cost estimates to stay in sync with downstream tools like ProPricer that may compile the final estimate. This case study uses TruePlanning® as an input to the pipeline, but the framework is applicable to most parametric sources. In cases where Basis-of-Estimates (BOEs, as opposed to Realistic Cost Estimates or RCEs) using proposed resource hours are still required to justify parametric estimates, the traceability and justification of these pipelines is also an important consideration. | 2024 | Data Science & Machine Learning |
Data Cleaning in Python for Beginners | Alexis Somers | As cost estimators, we collect large amounts of data from many sources, and it's often messy. Cleaning and organizing the data often requires time-consuming manual effort before proper analysis can begin. Using Python to clean and manipulate data is one of the easiest ways to save time and maximize efficiency when working on cost or other data analyses. As a free, beginner-friendly, and versatile tool, Python is an excellent choice for processing and analyzing data. This session will cover how to get started using Python to create simple scripts that produce clean, organized data. We will use the pandas and NumPy libraries to clean datasets by correcting errors, reformatting data, handling missing values, adjusting for outliers, and more. The ability to create simple Python scripts can improve the quality of your cost estimates and other deliverables by improving accuracy and efficiency and saving time. | 2024 | Data Science & Machine Learning |
Going Beyond Count-Based Methodologies with Semantic Vector Embeddings | Trevor Lax | Machine Learning (ML) is a topic of persistent interest and a frequent buzz word because of the astounding capabilities it has shown across disparate fields. However, the complexity of ML combined with the overwhelming number of options can lead to decision fatigue and reduced understanding in new users. While much attention is duly focused on the data and machine, occasionally the basic components of ML, such as input data type, are not properly interrogated. Indeed, a frequently used Natural Language Processing method, Term Frequency - Inverse Document Frequency (TF-IDF), simply uses counts, which cannot encode syntactic or semantic information. An alternative to TF-IDF, Word-2-Vector, creates vector embeddings of the words in a corpus, instead of relying on sentence-level counts, and attempts to encode semantic information. Word-2-Vector has its own limitations, such as the need for a large corpus; however, it can allow for better performance and greatly improved flexibility. | 2024 | Data Science & Machine Learning |
Automation and Process Improvement in Cost Estimating | Anil Divvela | | 2024 | Data Science & Machine Learning |
AI and Machine Learning/Data Science Tools for Cost Analysis | Daniel Harper | AI and machine learning/data science tools such as ChatGPT have taken on an expanded presence in cost analysis. For example, NLP is used to automate functional software sizing in commercial models. Large Language Models (LLMs) may even have applications for cost and acquisition professionals. We will present an overview of modern uses of data science, to include machine learning, AI, and data visualization. We will also provide several use cases for applying these tools in cost estimation. | 2024 | Data Science & Machine Learning |
Costing Web App Development for Operations Research | Kyle Ferris | Commercial-off-the-shelf (COTS) web application development platforms empower analysts to leverage low-code environments to build comprehensive business tools. Therefore, understanding the lifecycle cost requirements to design, develop, deploy and maintain low-code web applications as both analytical and decision support tools for stakeholders is of interest to the cost community. We define web application lifecycle requirements as analogous to an overarching Data Operations Stack. The Data Operations Stack is a conceptual framework that describes data operations as a set of hierarchical requirements, from base-level IT infrastructure and tools to high-level business products. With this framework in mind, we describe web application lifecycle requirements through successive levels of the Data Operations Stack, elucidating the required personnel, tools, and capabilities integrated into each level. Finally, we discuss how an understanding of interconnected dependencies across the Data Operations Stack can be used to develop defensible cost estimates and manage resources for web application lifecycle requirements. | 2024 | Data Science & Machine Learning |
From a Man-Month to an AI-Minute, Myth or Reality? | Colin Hammond | In this session I will share some of our discoveries from using AI over the last five years that can help software cost estimators, along with our thoughts on how AI will change software development costs in the coming years. Back in 1975, Fred Brooks published The Mythical Man Month, a book of observations on software engineering, many of them counter-intuitive; we pay homage to his book title in this talk as we share some observations and quantifications of how AI is helping to improve early software estimation. I will also share our predictions on areas where AI will help accelerate software development and its impact on software costs over the next few years. | 2024 | Data Science & Machine Learning |
Implications of Generative AI (Artificial Intelligence) in Software Engineering | Arlene F. Minkiewicz | Generative AI is positioned to revolutionize software development, with potentially far-reaching implications for productivity. Generative AI applications leverage Large Language Models to understand language, imagery, and code, then use what they have learned to generate content: answering questions, organizing multimodal information, and writing text and code snippets. A 2023 McKinsey report finds that the software development landscape is quickly changing, as Generative AI applications such as ChatGPT and GitHub Copilot have the potential to help software engineers complete development tasks with as much as 2x the productivity of traditional development practices. Activities such as inception and planning, system design, coding, testing, and maintenance can all be aided through applications of Generative AI. This paper will include an introduction to Generative AI in the software engineering context, followed by a discussion of productivity impacts and guidance for incorporating them into software estimates. | 2024 | Data Science & Machine Learning |
Distribution Free Uncertainty for CERs | William King | For this presentation we intend to introduce and demonstrate the application of conformal prediction as a tool to specify prediction intervals for any machine learning algorithm. Conformal prediction intervals offer rigorous statistical coverage guarantees without distributional assumptions and only require the exchangeability of data (a weaker assumption than independence). Moreover, generating these prediction intervals is an easy consequence of retaining the sub-models trained during k-fold cross-validation. Specifically, we intend to summarize the "CV+ for K-fold cross-validation" method (and its locally weighted variant) from Predictive Inference with the Jackknife+ (Barber, Candes, Ramdas, Tibshirani, 2021, The Annals of Statistics), and show how conformal prediction enables distribution free uncertainty for CERs. Additionally, we plan to discuss how this technique can be applied to direct human-in-the-loop intervention when applying machine learning models. | 2024 | Data Science & Machine Learning |
Industry Leaders' Insights: Enhance Efficiency and Simplify Your Work Using AI | Karen Richey Mislick | The modern workplace is increasingly influenced by leaders who recognize the transformative power of data analytics and AI. This presentation delves into the practical experiences and insights gleaned from industry frontrunners effectively utilizing these technologies. These leaders have not only achieved significant operational efficiencies but have also mastered the art of simplification in complex business processes. Their lessons underline the importance of strategic integration, the value of data-driven decision-making, and the transformative potential of AI-driven automation. Attendees will gain a comprehensive understanding of how top enterprises are reducing costs, streamlining operations, and fostering innovation. Drawing from real-world case studies, this presentation aims to encourage cost analysts to tap into the immense potential of data analytics and AI, turning insights into actionable strategies for enhanced work efficiency. | 2024 | Data Science & Machine Learning |
Generative AI for Government | Conner Lawston | ChatGPT has been making massive waves across the world in the last year! This presentation gives an introduction to several 'Generative AI' models, and how they can create new images, code, data, and text, seemingly out of thin air. We will look at the process of how to build these models, including their training dataset sizes and costs. Examples will be shown of how to use ChatGPT to generate Python code for you, as well as R and Power BI. After the general overview, specific examples of applications to Government will be shown (including acqbot, an AI tool for generating proposals). There will also be a demo of the 'GURU' bot, which was trained on the Federal Acquisition Regulation (FAR) PDF and can answer questions about PPBE, EVM, and acquisition. We will summarize the pros, cons, and potential risks of Generative AI, as well as the future outlook. | 2024 | Data Science & Machine Learning |
The Cost-Risk Uncertainty Determination (CRED) Model – A New Approach | Cheryl L. Jones | The objective of this model is to improve the credibility of and trust in a cost estimate by: 1) identifying, characterizing, and accounting for different cost performance factors that may be sources of risk/uncertainty and can materially impact a software sustainment and maintenance cost estimate; 2) making visible the "knowledge gap" (if any) between "what should be known" and "what is known" about the system under assessment - this gap is an input used to assess a range of uncertainty associated with the estimate; and 3) fully documenting the key program issues and related performance factors that may influence the cost estimate and why. While this presentation focuses on the software domain, it is easily adaptable to other domains. | 2024 | Management, EVM & Risk |
Schedule Risk at Early Acquisition | Gabriella Magasic | It can be difficult to construct a realistic schedule early in the acquisition lifecycle due to the limited certainty of requirements, design decisions, and other key elements of program planning. Understanding risk and uncertainty in a schedule is essential, and the GAO Scheduling Guide includes "Conducting a Schedule Risk Analysis" as one of the 10 Best Practices. A Schedule Risk Analysis (SRA) can provide quantitative insight into potential areas of delay along with associated cost impacts. However, a well-formed SRA requires clear input and structured analysis of risk events and uncertainty. In this presentation, we will discuss how to address schedule risk in low maturity projects by investigating different risk modeling techniques, reviewing existing guidance on schedule risk, and analyzing how uncertainty analysis must be interpreted and applied early in the project lifecycle. | 2024 | Management, EVM & Risk |
Cost Estimation for Project Control | Ed Spriggs | Project control in software development is a critical responsibility of program managers and contracting officers. And although the job is a difficult one for most analysts, the inability to measure and control what is being created and tested can result in loss of stakeholder confidence and, in the worst case, a cancelled project/program. What got us here? You guessed it - agile development. The adoption of agile means less defined up-front scope and little to no requirements documentation. While that flexibility allows for more development freedom, it creates more variability in the features and functionality of the delivered product. This paper will describe the best new and existing practices for forecasting capabilities (features) that can be delivered within a certain timeframe given the fixed parameters of cost, schedule, and development team size. We will explore innovative techniques to measure software features, even in the absence of requirements, using function points and story points among others. | 2024 | Management, EVM & Risk |
Advancing EVM with a Modernized Framework | Aaron Everly | DoD's FY24 procurement budget is the largest in history. The cornerstone of this budget is the procurement of complex, technologically advanced systems. DoD programs require new technologies to meet end-user requirements; however, the challenges inherent in new technology often translate to significant cost growth. PMs utilize EVM analysis to make informed decisions and mitigate contract cost growth. The IPMDAR exemplifies DoD's recognition of the need for meaningful data by requiring a modernized data schema (machine-readable format providing near real-time cost performance). Likewise, Technomics implements a modern approach to EVM using data analytics software and BI tools applied through a framework that incorporates a comprehensive view of EVM. This paper describes Technomics' EVM Framework (EVMS Surveillance, Contract Startup, Data Aggregation, EV Analysis, and Program Management), which implements modern tools to not only reduce monthly reporting tasks but also perform powerful EV analysis that enables programmatic decisions. | 2024 | Management, EVM & Risk |
EVM Reviews – Surveillance Reviews vs. IBRs | Sam Kitchin | Successful Earned Value Management (EVM) implementation requires an effective Earned Value Management System (EVMS) and a well-planned performance measurement baseline. Meaningful insight into project performance can only be achieved with this combination of a compliant system with the active planning and management of project execution. A critical method to evaluate adherence to EVM best practices is to conduct reviews. Compliance reviews and surveillance reviews are used to evaluate the sufficiency of the EVMS, while integrated baseline reviews are used to assess the reasonableness of a project baseline. This presentation will compare and contrast surveillance reviews and integrated baseline reviews, demonstrating how and why they differ. Key terminology, stakeholders, artifacts, timelines, and intended results will be discussed. Real-life examples may be used. | 2024 | Management, EVM & Risk |
Advanced EVM Analysis using Time Series Forecasts | Anna B. Peters | The recent digitization of contractor EVM data affords cost analysts a newfound ability to execute robust statistical and data science techniques that better predict total project cost and schedule realism. Time series analysis, a well-established method in private sector finance, is one such technique. Autoregressive integrated moving average (ARIMA) models may capture the persistence and patterns in EVM data, as measured by CPI, SPI, and schedule execution metrics (SEMs). As a second option, macroeconomic regression models can measure the relationship between contract performance and external considerations, like unemployment and inflation, over time. Both techniques, moreover, may forecast future changes in EVM variables of interest, like the IEAC. This presentation will discuss how these types of time series models and forecasts are employed on real acquisition programs and their associated IPMDAR data using Python-based tools to raise program analysts' alertness to emergent acquisition risks and opportunities. | 2024 | Management, EVM & Risk |
Deriving Total Project Costs from WBS Elements' Probability Distributions | Rainald Kasprik | Studies on possible cost variances in major acquisition projects focus on total project costs in order to arrive at plausible project budgets with a confidence level of 80%. Different lognormal probability distributions have been worked out to represent different states of uncertainty. However, these models cannot be applied when using risk management software to derive total project costs from cost probability distributions for WBS elements. Due to limited processing capacity, risk management software demands a division of the underlying probability distributions into intervals. A simple discretization of the models developed to date is not possible, as these models contain unrealistic cost growth factors. Based on simulation studies, three lognormal probability distributions are presented that meet these challenges. Finally, some practical hints are given on the minimum number of intervals that still represents the curvature of a probability distribution and on how to interpret the joint CDF's undefined areas. | 2024 | Management, EVM & Risk |
Cascading Effects - Performance Impacts of Fragile Tasks | Tommie (Troy) Miller | The growing popularity of Joint Cost & Schedule Analysis has highlighted the need for quality Schedule Risk Assessments (SRA). Modeling schedule risk and uncertainty requires an understanding of schedule networks. Network Analytics (NA) has been furthered in recent years due to research in fields such as social networks, IT networks, and transportation networks. Key aspects of these advancements can be used in SRAs to improve our understanding of schedule risk and mature our modeling techniques. For example, epidemiologists study the propagation of diseases through a community. The techniques used to model this phenomenon can be applied to SRAs to model the propagation of task slips through schedules. This presentation integrates classical concerns in schedule analytics, principally Merge Bias, with NA processes, such as node centrality measures and edge properties, to uniquely identify fragile tasks and illustrate how delays in these tasks cascade through a schedule and ultimately affect program execution. | 2024 | Modeling |
Data-Driven Constellation Architecture Design Using Integrated Models | W. Allen Wautlet | The modern space mission landscape requires consideration of numerous trade variables to deliver optimal mission performance at low cost. Academic methods exist to address such challenges; however, practical deployment of these methods to constellation mission design remains uncommon. This paper presents a practical space mission constellation architecture approach that employs proven statistical, data science, and machine learning techniques on the products of an integrated cost and engineering modeling framework. When deployed at the early stages of constellation development, this integrated modeling framework and analysis approach provides stakeholders insight into key design parameters that drive mission performance and cost sensitivity. Furthermore, it enables the uncovering of promising design regions in large trade spaces that can be further examined and refined by technical subject matter experts. This approach leads to better decision making earlier in the acquisition timeline and increases the efficiency of design cycles. | 2024 | Modeling |
Mission Operations Cost Estimation Tool (MOCET) 2024 Status | Marc Hayhurst | The Mission Operations Cost Estimation Tool (MOCET) is a model developed by The Aerospace Corporation in partnership with NASA's Science Office for Mission Assessments (SOMA). MOCET provides the capability to generate cost estimates for the operational, or Phase E, portion of full NASA space science missions. MOCET is a widely accepted model in the NASA community, used in full mission Announcement of Opportunity competitions since 2015. MOCET was awarded the NASA Cost and Schedule Team award in 2017 and received an honorable mention in the 2021 NASA Software of the Year competition. The cost estimating relationships and documentation have been implemented as a standalone Excel tool that is available within NASA and publicly through software.nasa.gov. Extended mission and Level 2 work breakdown structure costing capabilities are continuing to be developed, and a status will be presented. | 2024 | Modeling |
A CASE for Estimate Analytics at the Enterprise Level | Josh Angeo | Are our estimates improving over time? What did this cost 2 years ago? When was the last time we reviewed this estimate? These questions, amongst many others, are why SSC FMC developed the Cost Analytics for SSC Estimates (CASE) tool. CASE includes over 175 cost estimates and 60 programs, and goes back as far as 2017. The tool creates comprehensive dashboards capable of analyzing programs individually and in aggregate. CASE utilizes various data sources and performs extensive data pre-processing to ready the data for Power BI. Data pre-processing steps utilize Python, DAX, and Power Query. Estimate data comes from a combination of POST reports, PDFs, and spreadsheets. Custom metadata tables were developed to enable parsing and other functions. Lastly, data sources comprising program actuals have recently been integrated. All of this results in a newfound capability to evaluate estimates using analytics. | 2024 | Modeling |
Modeling Electronic/IT System Deployment Projects | F. Gurney Thompson III | This presentation will discuss the development and application of cost models for electronic and IT system deployment projects. The deployment projects include various technical studies and preparation activities, site survey visits, and comprehensive installation efforts across many sites. The models consider size drivers such as the amount of hardware and software systems to be installed, number of sites, scope of activities, and number of different system configurations. Project complexity can be adjusted for many system/technology intricacies and site conditions. The models have been applied successfully, with validation against actuals, in estimating deployment costs for communication network deployment projects such as data centers, air traffic control towers, and military vehicle/ship/aircraft communication systems. Additionally, these models have been applied to weapon system and train signaling equipment deployments, with model validation relying on expert judgment. This presentation outlines the model's development, structure, and application. | 2024 | Modeling |
Recipe: Homemade Pizza (or Facility Estimate) | Kristen Marquette | Have you ever wanted to "wow" your guests with a homemade pizza, but didn't know where to start? This is how we felt when beginning our facilities estimates. This presentation will break down both recipes step by step, leaving everyone satisfied and writing rave reviews. Just as you need delicious dough, sauce, and toppings for great pizza, you need detailed structural, material, and recurring scope requirements for a great facilities estimate. We will take you through our experience with data collection spanning multiple facilities and serve up comprehensive methodologies with high fidelity. If you don't have time to create a homemade pizza or perform your own detailed facilities analysis, you can leverage the tools and methodologies provided (as to-go slices), to build your own facilities estimate based on your specific program requirements. | 2024 | Modeling |
Well, That Escalated Quickly – A Novel Approach to Forecasting Escalation | Sean Wells | Escalation rates are an important part of estimates and as such the provenance and derivation of indices should be regularly scrutinized, yet are rarely contemplated. This paper will compare a commonly used black-box escalation resource, IHS Global Insight, to a traceable, simplified forecasting method to determine if a purely mathematical model delivers an improved level of forecasting accuracy. Our model relies on a curated set of Bureau of Labor Statistics (BLS) indices to develop a moving average forecast. With access to over 15 years of IHS forecasts dating back to 2006, spanning 800+ indices, this study has the unique opportunity to quantify the accuracy of IHS and moving average forecasts against historical BLS indices. Our paper will establish and explore various measures of forecast accuracy for use in creating defensible estimates. The goal is to provide a quick, transparent, and flexible way to develop tailored escalation projections without sacrificing accuracy. | 2024 | Modeling |
Comparative Analysis of NASA Cost Estimating Methods | Camille Holly | NASA policy and customer expectations dictate use of various cost estimating tools depending on milestone and program maturity, regardless of level of effort or accuracy of results. This paper presents a case study of the tradeoffs of modeling the cost of an unmanned space mission using different NASA-approved parametric tools. The comparison addresses subsystem and component-level cost estimates, providing invaluable insight into the granularity of cost modeling for complex space missions and differences in results associated with more or less granular estimates. The study offers perspective on the challenges and opportunities associated with parametric cost modeling methodologies due to the varying levels of input detail, and of effort, needed to complete an estimate. It also aims to provide practical insights on the number and types of subjective decisions made when modeling costs using different approaches, and the impacts that these choices have on cost results. | 2024 | Modeling |
The Nuclear Option: Avoiding Critical Delays with Advanced Constraints Analysis | Hannah Hoag Lee | NNSA construction projects are often subject to funding constraints. The ripple effect of funding shortfalls can be severe; projects are forced into suboptimal execution profiles that produce costly schedule slips with drastic mission implications. This experience is not unique to NNSA construction projects. Funding constraints occur in most government sectors, negatively impacting many types of projects' progression, schedule, and mission. However, since inadequate funding is often unavoidable, it is imperative to use a data-driven methodology to predict schedule deviations and calculate ideal cost phasing to mitigate additional or unanticipated implications on project timeline. This paper demonstrates how a constrained phasing model uses historic project cost and schedule data to estimate a new project timeline based on a constrained funding profile. It also reveals how the model re-phases costs for the remainder of the project duration to generate a viable execution plan. | 2024 | Modeling |
Costing a Ballistic Schedule | Rob Carlos | Join us to explore an emerging solution addressing recurring concerns in the DoD involving cost overruns and schedule delays resulting from program practices and schedule dynamics. We'll address the power of Integrated Cost & Schedule Risk Analysis (ICSRA) and Joint Confidence Level (JCL) assessment from a DoD program office perspective, emphasizing its practicality. Such outputs yield more reasonable and quantifiable estimates by incorporating cost & schedule risk and uncertainty. We'll present a case study involving a DoD ACAT IB program, discussing the lessons learned during ICSRA implementation and JCL attainment. Our presentation illustrates the impact of ICSRA and JCL, facilitating improved forecasting, early risk identification, trade space analysis, and informed decision-making. The primary objective is to provide real-world insight, based on lessons learned, quantitative analysis, and creative problem solving, into the efficacy, utility, and power of ICSRA and JCL. | 2024 | Modeling |
Flavors of Commonality: Learning in a Multiple Variant Environment | Brent M. Johnstone | Commonality – the reuse of parts, designs, and tools across multiple aircraft models – is a popular strategy to reduce program costs in commercial and military applications. But its use poses unique challenges to learning curve practitioners. This paper examines five approaches to estimating multiple variant programs using different learning curve techniques. A notional dataset is created, and the accuracy of each method is measured to highlight the advantages and disadvantages of each. This presentation should be of interest to anyone doing learning curve analysis in their cost estimates. | 2024 | Modeling |
Installation Cost Analysis | Eric White | Navy IT program managers have been frustrated in recent years by increasing system installation costs. Large amounts of siloed but related installation cost data have previously proven difficult to analyze, making it hard to identify core problem areas. This paper describes an innovative new solution to this problem, utilizing data visualization tools to combine related data sources and illustrate the trends and relationships in visuals that are easy for program managers to consume and act upon. By dynamically expanding cost data, this visualization dashboard can express cost across time, product types, location, and more, while also offering the ability to quickly drill into the inherent cost makeups. Not only can this tool quickly identify significant variances, but it also offers an explanation for those cost variances. Once the historical cost data is understood, it is then used in a cost model that accounts for the time value of money across future years. | 2024 | Modeling |