Much of software development management is about identifying and reducing risk. There are many risks involved in developing software in teams of any significant size, most of them people- and process-related. Tom DeMarco, in his book Peopleware, says, "The major problems of our work are not so much technological as sociological in nature." Jeff De Luca (www.nebulon.com) says his first law of I.T. is, "Information Technology is 80% psychology and 20% technology". Few of us who have worked on troublesome projects would disagree.

Feature-Driven Development (FDD) is a set of five processes first defined by Jeff De Luca (www.nebulon.com) whilst working as the project manager and technical architect of a large Java project at a bank in Singapore. It combines a number of ideas from Peter Coad (www.togethersoft.com), the Chief Architect on the project, with other industry best practices, all blended into a cohesive whole.

FDD is targeted at the core problem of software construction; it fits very nicely between existing requirements efforts and formal system testing processes. We take a look at how FDD can help put a project back on track by examining its influence in each activity within the core of a software development project.


Requirements

Poor communication and misunderstanding of requirements are probably the biggest risks in most projects, especially those creating a new software system in a business area that has not previously been automated, or those dramatically changing an existing way of doing things.

At one extreme, we have the traditional heavyweight approach of a team of analysts working with user representatives to gather functional requirements, describing them all in detail in a large, complex set of documents. The user representatives usually sign off these documents as correct (often with a long list of qualifications and open issues), and the documents are then passed to the developers as input into their design work. Projects using this approach often fall foul of 'analysis paralysis', where the analysts cannot keep up with requirement discovery, changes and clarifications long enough to produce complete and consistent requirements documents. This can be especially true of teams trying to produce detailed use cases as a means of recording requirements. If developers start design work against the incomplete specifications, they find they repeatedly need to make drastic amounts of rework in their design specifications because the requirements have changed. Instead of helping speed up the process, the developers add to the problem by reporting inconsistencies and errors in the requirements, loading the requirements team even further. After a significant period of time in which the project has not delivered any code of substance, it is usually cancelled.

Other things that can add to the problem include:

  • Document/use case writers can find themselves concentrating more on formatting the results than on ensuring that the results are correct and communicated effectively.
  • Reading and writing large, detailed documents is mind-numbingly boring and it is not surprising that miscommunications and misunderstandings occur.

At the other extreme, user representatives have direct access to developers at all times, and communication of requirements is completely informal, with no control over or tracking of changes. Projects following this approach find that developers rarely deliver anything to schedule, and when they do eventually deliver, the code has been subject to so many changes that it is of very poor quality. Poor-quality code makes enhancements and changes very hard to implement, and many teams have to start from scratch to develop 'version 2'.

During the first process in FDD, 'Develop an Overall Model', domain and development members work together under the guidance of an experienced object modeler in the role of Chief Architect. Domain Experts perform a high-level walkthrough of the scope of the system and its context. They then perform detailed domain walkthroughs for each area of the domain that is to be modeled. After each domain walkthrough, small groups are formed with a mix of domain and development staff. Each small group composes its own model in support of the domain walkthrough and presents its results. One of the proposed models or a merge of the models is selected and becomes the model for that domain area. The domain area model is merged into the overall model, adjusting model shape as required.

This activity helps the lead developers quickly acquire a good knowledge of the problem domain. It also helps the domain experts and developers learn to work together, communicate effectively and make explicit many unspoken assumptions. In a project that has struggled with miscommunication and misunderstandings between domain experts and developers, this can be as important a result as the production of the object model.

The second process in FDD, 'Build a Features List', requires the developers from the modeling activity to go systematically through the notes they took during the modeling, and through any existing requirements documents, pulling out features. A feature is defined as a small, client-valued piece of functionality expressed in the form <action><result><object>. The features are grouped into sets, and the sets are grouped into areas. The result is a hierarchically organized list of functional requirements, each of which can be implemented in less than two weeks and, in many cases, in a matter of hours. The knowledge gained in the modeling enables the lead developers to perform this task very well, with only occasional input and sanity checks from the domain experts. This stops the process from getting bogged down in too much detail and discussion. The features list is used to plan, drive, track and report project progress.
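The shape of such a features list can be sketched in a few lines. The area, set and feature names below are purely hypothetical examples invented for illustration; the point is the three-level hierarchy and the <action><result><object> naming form:

```python
# A hypothetical sketch of an FDD features list: features named in the
# <action><result><object> form, grouped into feature sets, and sets
# grouped into areas. The domain names here are invented examples.
features_list = {
    "Product Sale Management": {                 # area
        "Selling Products": [                    # feature set
            "Calculate the total of a sale",     # features
            "Calculate the tax for a sale",
            "List the line items of a sale",
        ],
        "Shipping Products": [
            "Generate the packing list for a shipment",
        ],
    },
}

# Counting features gives the denominator used when planning and
# reporting progress across the whole project:
total_features = sum(
    len(features)
    for area in features_list.values()
    for features in area.values()
)
```

Because each feature is a short, client-valued sentence, the list doubles as a requirements inventory that domain experts can read and sanity-check directly.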


Object Modeling

Preparing an analysis object model to describe the structure of the problem domain is not unique to FDD; it has a place in any UML-based approach. However, despite many leading figures saying it should be done either before or concurrently with the development of use cases, object modeling is normally done after the production of use cases or some other form of functional requirements specification. The normal reason for this is that the use case or functional specification writers do not have the necessary object modeling skills.

Worse still, the use cases or functional specification are often used to drive the shape of the object model, resulting in brittle or overly complicated models full of controller and manager classes, data-heavy classes and function-heavy classes. A completely separate 'robustness' activity is often recommended to try to correct these deficiencies.

Even if the object model is built during the writing of the use cases, it is often produced by one or two architects and handed as a finished article to the developers. In this case, if the object model is consistent with the use cases, the developers can consider themselves lucky. Without comprehensive annotations, developers end up returning to the architects to ask why the model looks the way it does and how it was intended to represent a particular scenario described in the use cases.

The first process in FDD, 'Develop an Overall Model', solves this because the developers work through the creation of the model themselves with the domain experts, under the guidance of an experienced architect. The developers leave the modeling process with a sense of involvement in and ownership of the object model, a good understanding of its structure, and knowledge of the alternatives considered and rejected. In other words, the developers are in a fantastic position to start developing features within the context of the model.

During the modeling process, the evolving object model also becomes a framework within which requirements can be discussed more explicitly, avoiding many unspoken assumptions, miscommunications and misunderstandings.

Design and Implementation

We attack the development of a complex software system by breaking the complex problem down into simpler, readily solvable problems, assigning those problems to different developers or teams of developers so that they can produce the solutions concurrently and then integrating those solutions to form the solution of the overall problem. Good communication is paramount if results are to integrate well.

Where communication has been poor and developers have been unable to complete work, have had to rework the same area time and time again, and have failed time and time again to meet schedule deadlines, frustration and insecurity build up. This in turn leads to blame games and more 'politics'. Developer insecurity also makes effective design and code inspections or reviews very hard, because the developer whose work is being inspected is defensive and sensitive to criticism. In these circumstances, inspections become a negative influence rather than a positive one and are often abandoned as a waste of time.

In processes 4 and 5 of FDD, 'Design By Feature' and 'Build By Feature', a development team lead (Chief Programmer) selects a small handful of features from those that he or she has been assigned. He or she then forms a small development team (Feature Team) out of the developers who own the classes likely to be affected by the development of those features. This team produces the detailed design for the selected features and refines the overall object model as necessary. Once the design has been agreed and verified, each developer makes the changes to their own classes and unit tests them. Design inspections and code inspections provide verification and inform others in the project of any significant impact from the development of these features.

FDD's small, feature team-oriented iterations attack the frustration of not being able to complete work and developer insecurity head on. Features are small and completed in a matter of a few hours or days and at the very most, within two weeks. This means developers start using the word 'finished' frequently and a sense of accomplishment is built up. Working in feature teams led by an experienced team lead (Chief Programmer) means that design and code inspections put the whole feature team on the hot seat instead of one or two individuals; this immediately helps reduce the anxiety levels of those being inspected. Often the inspection is held within the feature team, with the team checking each other's work reducing the intimidation factor even more.


Testing

Without adequate quality controls throughout the development process, it is often not until a piece of code arrives in formal system testing that many problems are found. These problems are then much more expensive to fix than they would have been had they been found earlier.

FDD raises the initial quality of a system by working in small groups and teams, applying multiple minds to each problem. This is augmented by design and code inspections. Finally, unit testing and any appropriate end-to-end testing must be completed to the team lead's (Chief Programmer's) satisfaction. The intent is to make system testing a formality instead of a major and costly debugging activity.

The highly iterative, feature-driven nature of FDD means real features are completed early, so testers can get involved early. Starting testing early helps iron out problems in test cases and communication issues between testers and developers. Again, this means that by the end of the project, system testing should have become a formality.

Planning, Tracking and Reporting

Much of the planning of many software projects is based on estimates of the time needed to complete a task, made by the developer to whom the task has been assigned. Unless the task is very small, this estimate is hard to make because the developer has little idea of what is involved in completing the task and is not given sufficient time, or access to the right people or information, to find out.

Similarly, tracking progress usually requires developers to estimate what percentage of a task they have completed. Again, this can be very difficult to do, and stories of developers with tasks that are more than 90% complete for more than 90% of the tasks' duration are common.

Therefore estimates are often wildly inaccurate and provide poor-quality input into the planning, tracking and progress reporting of a project.

When it comes to reporting status and progress, the Gantt chart is normally the weapon of choice. However, for many of the managers and clients of a project, a Gantt chart does not communicate well. The representation, although capable of presenting a wealth of information in a concise format, is too complicated to quickly give an overall sense of project status to people who do not work with that format every day.

The result is that the leadership and sponsors trying to guide the project often end up doing the project management equivalent of driving a truck without being able to see precisely where they are or how fast they are traveling.

FDD tackles this problem by defining a set of sharp milestones in processes 4 and 5, 'Design By Feature' and 'Build By Feature'. Since features are small, reaching the next milestone in a feature's development is a very small task, which makes reasonable estimation of the time needed possible. To help with tracking progress, a percentage complete is assigned to each milestone. As a feature's development moves from one milestone to the next, the percentages are added, and the total forms the reported percentage complete for that feature. Values for each feature can be rolled up into set, area and project totals for reporting to different levels of the project.
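The arithmetic behind this scheme is simple enough to sketch. The milestone names and weightings below follow those commonly published for FDD, but they are shown here only as an illustration; teams adjust the weightings to suit their own balance of design and coding effort:

```python
# A sketch of FDD-style milestone tracking. The six milestones and their
# weightings are the commonly published FDD defaults (an assumption here,
# not a prescription); they sum to 100%.
MILESTONES = [
    ("Domain Walkthrough", 1),
    ("Design", 40),
    ("Design Inspection", 3),
    ("Code", 45),
    ("Code Inspection", 10),
    ("Promote to Build", 1),
]

def feature_percent_complete(milestones_reached):
    """Add up the weightings of the milestones a feature has passed."""
    weights = dict(MILESTONES)
    return sum(weights[name] for name in milestones_reached)

def rollup(feature_percentages):
    """Average feature percentages into a set, area or project figure."""
    if not feature_percentages:
        return 0
    return sum(feature_percentages) / len(feature_percentages)

# A feature that has been through design and design inspection is 44% complete:
p = feature_percent_complete(["Domain Walkthrough", "Design", "Design Inspection"])

# Rolling three features (one in design, one done, one not started) up
# into a feature-set figure:
set_progress = rollup([p, 100, 0])
```

Because the milestones are objective events rather than subjective guesses, the rolled-up numbers give management a progress picture that is hard to game with "90% done" claims.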


Conclusion

We have only scratched the surface in this article of how FDD can be used to help put a problematic project back on course. Naturally, no process is ever going to replace good people, and if a project does not have the right people to perform the Chief Architect and Chief Programmer roles, FDD is unlikely to provide a solution.


Further Reading

Jeff De Luca and Peter Coad first wrote about FDD in chapter 6 of Java Modeling in Color with UML, published by Prentice Hall in 1999. Mac Felsing and I have written a more in-depth book on the subject, A Practical Guide to Feature-Driven Development, also published by Prentice Hall. Java Modeling in Color with UML provides a very concise, if somewhat specialized, description of FDD. Both books contain suggestions for graphically reporting progress to the senior management levels of a project so that an immediate feel for project status is communicated. A Practical Guide to Feature-Driven Development also contains many other suggestions, tips and 'tricks of the trade' to help adapt and apply FDD.
