This article was originally published in the Spring 2004 issue of Methods & Tools
Identifying your Organization's Best Practices
David Herron and David Garmus
The David Consulting Group, www.davidconsultinggroup.com
An organization's best practices can be defined as those software development practices that yield favorable results. Favorable results may be measured in terms of customer satisfaction, reduced time of delivery, decreased cost, better quality, and so on. This article explores opportunities for an organization to improve its ability to identify best practices and ultimately achieve more positive results.
We are most interested in the results that meet business goals and objectives. A basic set of business goals related to software application development and maintenance typically includes the following objectives:
- Decrease project costs
- Reduce time to market
- Minimize defects delivered
- Improve performance
Frequently, the strategy to achieve these goals is formulated around quick-fix approaches. Cost reduction often tops the list and can be the driving force behind the decision to outsource software development to an offshore provider. Time to market is often reduced by delivering fewer features to the end user, thus reducing the development workload. Defect minimization is too often ignored. However, regardless of the means, the good news is that all of these goals are measurable and can be achieved as we deliver and maintain software.
The key to successful performance management is performance measurement. As the software industry grows out of its infancy into a more mature set of practices, the inclusion of performance measurement to manage and direct decisions is becoming a mainstream practice. It is obviously important to establish proper goals and objectives; equally important, however, is that effective measurement provides the key to recognizing and realizing those goals and objectives.
A basic measurement model that is advanced by the Practical Software and Systems Measurement (PSM) program suggests that an organization follow these steps:
- Identify the needs of the organization.
- Select the measures.
- Integrate measurement into the process.
Building upon this basic measurement model, we will further examine the key elements necessary in selecting your appropriate measures. The selected measures must present a balanced view and, therefore, must consider both quantitative and qualitative information. Using the model depicted below, we can begin to see how the quantitative and qualitative measures can together provide a complete organizational view.
This model depicts the collection of quantitative data (e.g., deliverable project size, effort required, duration of the project, pre and post defects) along with the qualitative data (e.g., processes, methods, skill levels, development tools, management practices). Collected on a project basis, the quantitative data results in a measured profile that indicates how well a project is performing. The qualitative data results in a capability profile, which identifies the attributes that are contributing to high or low yields of performance.
These two elements (quantitative and qualitative) come together to form what is commonly viewed as an organization's baseline of performance. The baseline values are compiled from a selection of measured projects and represent the overall performance level of the organization.
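As a sketch of how such a baseline might be compiled, the following Python fragment averages the quantitative measures across a set of measured projects. The field names and sample values are illustrative assumptions, not the authors' actual data:

```python
# Hypothetical sketch of compiling a performance baseline from measured
# projects. Field names and sample values are assumptions for illustration.
from dataclasses import dataclass

@dataclass
class Project:
    size_fp: float          # delivered size in function points
    effort_pm: float        # effort in person-months
    duration_months: float  # requirements through customer acceptance
    defects: int            # defects found after delivery

def baseline(projects):
    """Average the quantitative measures across measured projects."""
    n = len(projects)
    return {
        "avg_size_fp": sum(p.size_fp for p in projects) / n,
        "avg_productivity_fp_pm": sum(p.size_fp / p.effort_pm for p in projects) / n,
        "avg_duration_months": sum(p.duration_months for p in projects) / n,
        "avg_defect_density": sum(p.defects / p.size_fp for p in projects) / n,
    }

projects = [Project(148, 6.7, 5.0, 12), Project(113, 12.6, 7.0, 27)]
print(baseline(projects))
```

In practice the baseline would be compiled from many more projects, and the qualitative attribute profile would be stored alongside each project's quantitative record.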
As you can well imagine, the results vary significantly. Some projects perform very well (i.e., they have low cost and high quality), and other projects do not perform nearly as well. The quantitative data gives us a picture of performance; the qualitative data provides us the opportunity to examine the attributes of the projects to determine why some projects have performed better than others. This can often lead to the identification of an organization's best practices and opportunities for improvement.
A few case studies follow that indicate how this data can be applied.
Case Study 1
A large organization wants a complete baseline performed consisting of both quantitative data collection and analysis and qualitative data collection and analysis. Data is collected on sixty-five completed projects, and productivity rates are then calculated based upon the findings. The results are divided into two categories (high performing projects and low performing projects), and an average is calculated for each category. See the note below regarding the measures.
| Measure | High Performers | Low Performers |
| --- | --- | --- |
| Average project size in function points (FPs) | 148 | 113 |
| Average duration (in calendar months) | 5.0 | 7.0 |
| Average rate of delivery in FPs/person month (PM) | 22 | 9 |
| Average number of resources | 2.4 | 1.8 |
The quantitative data demonstrated that high performing projects produced (on average) more functionality in a shorter timeframe with a modest increase in staffing levels.
Qualitative data (attributes of each project) was also collected so that performance profiles could be developed, identifying characteristics that were present in the high performing projects but absent from the lower performing ones. These sets of attributes were then considered to be the leading factors contributing to higher performing projects.
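The partitioning used in this case study can be sketched as follows; the threshold value and sample projects are assumptions chosen purely for illustration:

```python
# Illustrative sketch of splitting measured projects into high and low
# performers by rate of delivery (FP per person-month), as in Case Study 1.
# The threshold and sample data are assumptions, not the study's figures.
def split_by_productivity(projects, threshold_fp_pm=15.0):
    """Partition projects on function points delivered per person-month."""
    high = [p for p in projects if p["size_fp"] / p["effort_pm"] >= threshold_fp_pm]
    low = [p for p in projects if p["size_fp"] / p["effort_pm"] < threshold_fp_pm]
    return high, low

sample = [
    {"name": "A", "size_fp": 150, "effort_pm": 6.8},   # roughly 22 FP/PM
    {"name": "B", "size_fp": 110, "effort_pm": 12.0},  # roughly 9 FP/PM
]
high, low = split_by_productivity(sample)
print([p["name"] for p in high], [p["name"] for p in low])
```

Once partitioned, averaging each category's measures yields the comparison table above, and contrasting the attribute profiles of the two groups surfaces the candidate best practices.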
Case Study 2
The second case study involved an organization that wanted to compare its level of performance to industry benchmarks. Once again, the organization collected quantitative data and calculated performance indicators such as those in the first case study. After determining the current level of performance, a comparison was made to industry average and industry best practices benchmarks.
The results were as follows:
| Measure | Client | Ind. Avg. | Best Practices |
| --- | --- | --- | --- |
| FP Size | 567 | 567 | 567 |
| Productivity (FP/PM) | 6.9 | 7.26 | 22.68 |
| Duration (months) | 12 | 14 | 10 |
| Defects/FP | .24 | .12 | .02 |
As we examine these data points, we learn the following. For projects of equal size, the client's productivity rate was close to the industry average (6.9 vs. 7.26), but the best practices value indicates that there is a great deal of room for improvement. The client is actually delivering products (on average) in a shorter time frame than the industry average, but again not as quickly as best practices. Finally, the client's defect density (.24) is twice the industry average (.12), indicating a level of quality significantly below industry data points.
By looking at this picture, we can imagine an organization that is getting the product out the door quickly by increasing staffing levels and short-cutting quality practices.
About the measures
FP Size - represents the average function point size of the delivered product. Some organizations use this as a measure of value (functionality) being delivered to the end user.
Duration - represents the overall duration of the project from requirements through to customer acceptance.
Defect Density - represents the number of defects per function point. A lower number represents fewer defects in the delivered product.
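Using the figures from Case Study 2, a small sketch can compute the client's percentage gap to each benchmark; the function and variable names are our own, not part of the study:

```python
# Hedged sketch of comparing a client's measures to industry benchmarks,
# using the Case Study 2 figures (defect density = defects per FP).
client = {"productivity_fp_pm": 6.9, "duration_months": 12, "defect_density": 0.24}
industry_avg = {"productivity_fp_pm": 7.26, "duration_months": 14, "defect_density": 0.12}
best_practice = {"productivity_fp_pm": 22.68, "duration_months": 10, "defect_density": 0.02}

def gap_to(benchmark, measures):
    """Percentage gap of each client measure relative to the benchmark."""
    return {k: round(100 * (measures[k] - benchmark[k]) / benchmark[k], 1)
            for k in measures}

# Productivity slightly below average, duration shorter, but
# defect density twice the industry average.
print(gap_to(industry_avg, client))
print(gap_to(best_practice, client))
```

Note that the sign of a "good" gap depends on the measure: higher is better for productivity, while lower is better for duration and defect density.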
Case Study 3
The final case study involves an organization that wants to estimate the impact that an SEI CMM Level 3 process improvement initiative will have on its performance. In order to model this improvement, the organization must first determine its current baseline of performance and profile of performance.
Project data is collected and analyzed. Averages for size, productivity, duration, and quality are computed. Using the profile data, a mapping of current project attributes is completed and compared to a similar mapping of CMM Level 3 project attributes. A model is developed that will modify the current attribute profile to more closely reflect Level 3 practices. Using a modeling tool such as Predictor from DDB Software, a series of calculations are performed, and projected models of performance are produced. Let's look at the results:
| Measure | Client Performance | Level 3 Model | Change |
| --- | --- | --- | --- |
| FP Size | 456 | 456 | |
| Productivity | 6.9 | 10.6 | + 53% |
| Duration | 12 | 12 | -- |
| Defect Density | .12 | .06 | - 50% |
Comparing the client's current performance levels with the Level 3 model, we can see the impact of CMM Level 3 practices for products of similar size. Productivity is projected to increase by 53%, and defect density is projected to improve by 50%. This modeling technique helps organizations evaluate the potential benefits of process improvement programs or other strategic initiatives.
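The projection step can be illustrated with a minimal sketch. The improvement factors are taken from the table above, while the function name and structure are our own assumptions; in practice the factors would come from a modeling tool such as Predictor:

```python
# Illustrative sketch of applying projected improvement factors (from the
# Case Study 3 Level 3 model) to a current baseline. The function and its
# structure are assumptions; a real model would be produced by a tool.
def project_level3(baseline, productivity_gain=0.53, defect_reduction=0.50):
    """Apply modeled Level 3 improvement factors to current measures."""
    return {
        "productivity_fp_pm": baseline["productivity_fp_pm"] * (1 + productivity_gain),
        "duration_months": baseline["duration_months"],  # unchanged in this model
        "defect_density": baseline["defect_density"] * (1 - defect_reduction),
    }

current = {"productivity_fp_pm": 6.9, "duration_months": 12, "defect_density": 0.12}
print(project_level3(current))
```

Running this on the client's baseline reproduces the modeled values in the table (productivity of roughly 10.6 FP/PM and defect density of .06).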
In Summary
The three case studies above show a variety of ways in which measurement data may be used to learn more about:
- An organization's level of performance,
- Key factors that contribute to high or low yields of productivity,
- The level of performance as compared to industry data points, and
- The impact of strategic initiatives through the use of performance modeling.