Software Estimation Techniques
* adapted from Miranda, 2014
1. What Estimation Answers
Estimation helps us answer two core questions:
- How big is it? → Sizing the system (e.g., number of features, modules, function points)
- How much effort is needed? → Effort and cost estimation (staff months, budget)
Estimation converts scope into resources and time.
2. Two Basic Estimation Strategies
| Approach | Process | Strengths | Weaknesses | Example |
|---|---|---|---|---|
| Bottom-Up | Estimate cost/effort for each component, sum them, check total. | Accurate when WBS is detailed. | Time-consuming; depends on detailed scope. | Estimating each microservice and summing total developer hours. |
| Top-Down | Start with total project budget → allocate to major subsystems → refine. | Fast; useful early in planning. | Risk of unrealistic breakdowns. | Management fixes total budget ($500k) and allocates to phases. |
Both require sanity checks: overhead, contingency, and lifecycle coverage (e.g., testing, maintenance).
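A bottom-up pass can be sketched in a few lines. The component names, overhead rate, and contingency rate below are illustrative assumptions, not figures from the text:

```python
# Bottom-up estimation sketch: sum per-component effort estimates, then apply
# the sanity checks mentioned above (overhead and contingency).

def bottom_up_estimate(components, overhead_pct=0.10, contingency_pct=0.15):
    """Sum per-component effort (person-months), then add overhead and contingency."""
    base = sum(components.values())
    with_overhead = base * (1 + overhead_pct)
    return with_overhead * (1 + contingency_pct)

# Illustrative microservice breakdown (person-months per service)
services = {"auth-service": 4.0, "billing-service": 6.5, "reporting": 3.5}
total = bottom_up_estimate(services)
print(round(total, 1))  # 14.0 PM base -> 17.7 PM after overhead and contingency
```

A top-down pass would run the same arithmetic in reverse: start from a fixed total and allocate shares to the components.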
3. Categories of Estimation Approaches
(a) Judgment or Expert-Based Estimation
Estimation based on experience and intuition.
- Unaided: relies purely on expert intuition (e.g., senior engineer says “6 months”).
- Structured: uses formal elicitation and combination methods:
  - Wideband Delphi – iterative group consensus.
  - Planning Poker – agile team estimation.
  - Paired Comparison – experts rank or compare tasks pairwise.
Example: Estimating sprint backlog effort using Planning Poker.
Use When: Project has little data, early stage planning.
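One round of Planning Poker can be sketched as a small aggregation rule. The deck, the divergence threshold, and the consensus rule below are illustrative assumptions; real teams discuss outliers in person rather than delegating to code:

```python
# Minimal Planning Poker round: each member plays one card; a wide spread
# means the high and low estimators explain their reasoning and the team
# re-votes, otherwise the most common card is taken as the consensus.

FIBONACCI_CARDS = [1, 2, 3, 5, 8, 13, 21]  # modified-Fibonacci deck

def poker_round(votes):
    """Return (consensus_or_None, needs_discussion) for one voting round."""
    assert all(v in FIBONACCI_CARDS for v in votes), "votes must come from the deck"
    if max(votes) > 2 * min(votes):       # wide spread -> discuss and re-vote
        return None, True
    consensus = max(set(votes), key=votes.count)  # most common card
    return consensus, False

print(poker_round([5, 8, 5, 5]))   # (5, False)
print(poker_round([2, 13, 5, 3]))  # (None, True) -> discuss outliers, re-vote
```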
(b) Engineering Approaches
Estimate by analyzing what drives effort—build from known process characteristics.
Example: Training project
Effort = Number of users × Hours per class × Number of locations + Fixed setup costs.
Used when no data from past projects exists.
Variants:
- Process-based cost modeling (breakdown by activities such as design, coding, testing).
Use When: Understanding process mechanics is easier than collecting data.
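The training-project formula above translates directly into code. The specific inputs (class size, hours, setup effort) are illustrative assumptions:

```python
# Engineering-style estimate: build effort up from known process drivers
# rather than from historical project data.

def training_effort(users, hours_per_class, locations, fixed_setup):
    """Effort in instructor-hours: users × hours per class × locations + fixed setup."""
    return users * hours_per_class * locations + fixed_setup

# e.g., 20 users per location, a 3-hour class, 4 locations, 40 hours of setup
print(training_effort(20, 3, 4, 40))  # 280 instructor-hours
```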
(c) Counting Approaches
Quantify system size via measurable functional elements.
| Technique | Measure | Example |
|---|---|---|
| Function Points | User-visible inputs/outputs, files, interfaces | e.g., 10 inputs × 4 + 5 outputs × 5 = 65 unadjusted FP |
| COSMIC Points | Functional data movement | Used in modern distributed systems |
| Use Case Points | Number and complexity of use cases | UML-based |
| Web Object Points | Pages, scripts, media | Web apps |
| Test Points | Test cases and complexity | QA planning |
Use When: Early requirement specs are available.
(d) Industry Norms
Based on historical distributions across projects.
Typical averages:
- Requirements: 15%
- Design: 10%
- Coding: 35%
- Testing: 40%
Also includes Function Point Backfiring: converting between function points and LOC using historical, language-specific ratios.
Use When: Benchmarking effort allocation.
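Allocating a total estimate across phases with the percentages above is simple arithmetic; the 80 person-month total here is an illustrative input:

```python
# Industry-norm allocation: split a total effort estimate across lifecycle
# phases using the historical percentages listed above.

PHASE_SHARES = {"requirements": 0.15, "design": 0.10, "coding": 0.35, "testing": 0.40}

def allocate(total_pm):
    return {phase: round(total_pm * share, 1) for phase, share in PHASE_SHARES.items()}

print(allocate(80))
# {'requirements': 12.0, 'design': 8.0, 'coding': 28.0, 'testing': 32.0}
```

Reversing this is also a useful sanity check: if testing alone is projected at 32 PM, the norms imply a total around 80 PM.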
(e) Analogy-Based Estimation
Estimate by comparing with similar past projects.
- Find analog projects.
- Adjust based on similarity.
- Average or cluster results.
Variants:
- Proxy-Based Estimation (PROBE) in the Personal Software Process (Humphrey)
- Academic tools such as ESTOR, ACE, and ANGEL automate analog retrieval through similarity matching and clustering.
Example: A mobile app project estimated by comparing with a similar past app.
Use When: Reliable database of past projects exists.
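The find–adjust–average steps above can be sketched as a nearest-neighbor lookup. The project features, their scales, and the history data below are illustrative assumptions; real tools normalize features before measuring distance:

```python
# Analogy-based estimate: find the k most similar past projects by feature
# distance and average their actual efforts.

def estimate_by_analogy(new_project, history, k=2):
    """history: list of (features, actual_effort_pm); features are numeric tuples."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda h: distance(h[0], new_project))[:k]
    return sum(effort for _, effort in nearest) / k

# features: (screens, integrations, team experience 1-5)
past = [((12, 3, 4), 18.0), ((30, 8, 2), 55.0), ((14, 2, 4), 20.0)]
print(estimate_by_analogy((13, 3, 4), past))  # averages the two closest: 19.0 PM
```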
(f) Parametric Models
Statistical models linking effort to measurable drivers through a Cost Estimating Relationship (CER).
General Form:
Effort = a × (Size)^b × Π(cost drivers)
Famous examples:
- COCOMO II (Boehm): Effort = a × (KLOC)^b × EAF
  (EAF = effort adjustment factor combining cost drivers such as reliability and complexity)
- SLIM (Putnam) – uses a Rayleigh curve of manpower distribution.
Example:
If estimated size = 20 KLOC, classic COCOMO in organic mode predicts 2.4 × 20^1.05 ≈ 56 person-months before cost-driver adjustment.
Use When: Large dataset available; detailed drivers known.
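The general form above can be made concrete with the classic (1981) COCOMO organic-mode coefficients a = 2.4, b = 1.05; COCOMO II uses different, calibrated constants, so treat this as a sketch of the model shape:

```python
# Parametric estimate: Effort = a × KLOC^b × EAF, with classic COCOMO
# organic-mode coefficients and a nominal effort adjustment factor.

def cocomo_effort(kloc, a=2.4, b=1.05, eaf=1.0):
    """Effort in person-months."""
    return a * (kloc ** b) * eaf

print(round(cocomo_effort(20)))  # ~56 person-months at nominal drivers
```

Swapping in an EAF below 1.0 (a simple product, highly capable team) or above 1.0 (high required reliability) scales the result directly.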
4. Comparison Summary
| Category | Data Needed | Typical Accuracy | Example Methods | Best Used When |
|---|---|---|---|---|
| Judgment | Expert opinion | Low–Medium | Delphi, Planning Poker | Early stages |
| Engineering | Process decomposition | Medium | Activity-based | New, data-poor projects |
| Counting | Measurable features | Medium–High | Function Points | Defined requirements |
| Industry Norms | Historical ratios | Low | Effort % by phase | Benchmarking |
| Analogy | Similar projects | High | PROBE | Stable organization |
| Parametric | Statistical models | High | COCOMO, SLIM | Mature data sets |
5. Combining Techniques
In practice, estimators combine methods:
- Use expert judgment to calibrate parametric models.
- Use counting to size and COCOMO to predict effort.
- Apply analogy for cross-checks.
6. Example Integration
Suppose a company estimates a CRM system:
- Function points = 350 → size-based estimate.
- Apply COCOMO → 75 person-months.
- Experts adjust for complexity (+10%) → 83 PM.
- Add 15% contingency for uncertainty → final = 95 PM.
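The walk-through above is plain arithmetic; the rounding below (ceiling after the expert adjustment, nearest integer at the end) is chosen to reproduce the stated intermediate figures, and the 75 PM base is taken from the text rather than recomputed from the 350 FP:

```python
import math

base = 75.0                          # COCOMO estimate from 350 function points
adjusted = math.ceil(base * 1.10)    # experts add 10% for complexity
final = round(adjusted * 1.15)       # add 15% contingency for uncertainty
print(adjusted, final)               # 83 95
```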
Acknowledgments
This content is heavily inspired by and adapted from lectures by Eduardo Miranda and David Root on software project management. The structure, examples, and pedagogical approach reflect their teaching materials and frameworks.
Sources
- Miranda, Eduardo. Managing Software Development. Lecture materials, 2014.
Disclaimer: AI is used for text summarization, explaining and formatting. Authors have verified all facts and claims. In case of an error, feel free to file an issue or fix with a pull request.