
Customer Interface Management

John D. McGregor, Clemson University and Luminary Software LLC, U.S.A.


Abstract

Many software products are produced as a commission from one company to another. Much of the software used by the United States government is commissioned from and maintained by commercial companies. Often the contracts that define the commission call for reviews to be held in which both the producers and consumers of the software products participate. In this issue of Strategic Software Engineering I will discuss some techniques for managing this strategic interface between parties.


1 INTRODUCTION

Customers are obviously of strategic importance. Poor management of interactions with customers on even a single project can jeopardize future business opportunities. Every company wants a good relationship with its customers, but this becomes harder as the customer becomes more tightly integrated into the producing company’s activities. Even companies with very mature processes have discussions that are best not heard by the customer. For example, a risk may become a problem that can be addressed internally without alerting the customer, or it may be mitigated before it has a negative impact on schedule or budget.

One of the interfaces between the producing company and the consuming company is the formal review. Requirements reviews, design reviews, test reviews, and schedule reviews are all points at which the customer and producer come together. These need to be managed as explicit project activities. There are several different types of producer/consumer relationships and each has its own characteristics and its own interface at review time.

The “shrink-wrapped” industry has a long-distance relationship with its customers until it sells enough of a product for a users’ group to form. Even then the users’ group has input into a product’s specification but rarely into the product’s implementation. The only joint reviews are presentations made at annual stockholders’ meetings or users’ group meetings. The types of users who attend these meetings are typically concerned with features and capabilities that apply to general use of the product.

The typical software product development firm produces software on order but maintains an arm’s-length relationship with the customer. Reviews with the customer are primarily targeted at developing mutual agreement on the requirements. The customer may include requirements for the producer to use certain technologies or even certain techniques, but their participation in a review is at a black-box level. The reviews will often include justifying the initial customization and long-term maintenance costs for the customer’s environment.

Large, complex systems are often developed jointly. Remember the wave of interest in Joint Application Development (JAD)? In this scenario a number of companies partner to share risks and resources. The arm’s-length relationship becomes a hug. And what about agile development, which advocates having a customer at the development site full time? There are several variations on these types of relationships.

  • In the Open Source community, large projects like Eclipse are developed by consortia, such as the Eclipse Foundation. The relationships are informal; Eclipse uses a “contribution” model with interactions but no joint reviews.
  • When the joint venture is commercial, the contracts often call for various types of reviews in which all parties participate. The consumer and producer are tightly coupled for the duration of the project. IBM has many joint ventures, including one recently announced with Nortel and continuing joint ventures with organizations such as Microsoft [IBM 05]. IBM has explicit policies that guide interactions, such as required reviews, between itself and development partners or subcontractors. The joint reviews may even examine other ventures in which either party is involved, since those ventures might divert resources from, or present conflicts of interest with, the joint project.
  • Government software acquisition is yet another variation on this type of venture. In this case the consumer, the government, serves as the domain expert but expects the producer to work out the details of the requirements as well as to provide design and implementation services. The relationship is not between equals; the government is in control. Reviews in this context become a matter of interactive presentations by the producer to the consumer, with the consumer probing for specific information. Often the consumer even specifies the outline of the content of the presentation. For large tasks the development and production costs of these presentations can be significant. There is inherent difficulty in accurately forecasting the costs associated with developing large presentations, and this can contribute to significant cost and schedule variance.

I think of a review presentation as telling a story. The story should have:

  • a purpose. The purpose may be to reach agreement or to persuade the consumer that goals have been met;
  • a small number of themes. Even if the presentation is divided by functional teams, the pieces should be coordinated to have a common flow addressing the same ideas in each section; and
  • a logical order. The story should flow from the beginning of the review to the end in an order that leads the consumer through the material.

I want to consider how reviews contribute to the relationship between the producer and consumer of software products. I will first talk about a way to characterize the techniques used to conduct reviews. Then I will discuss some guidelines for the review process. Finally I will briefly apply these guidelines to my continuing example, the Arcade Game Maker software product line organization.

2 PASSIVE VERSUS ACTIVE REVIEWS

The term review implies viewing something for a second time. A review is often an examination of progress made on a plan that has already been presented. The fastest way to re-view plans and project artifacts is to present information to a group and ask for their feedback. It is the fastest, but not necessarily the most effective, if the goal is to have the reviewers think critically about the material and make constructive suggestions. It is difficult for a group to passively absorb material.

Parnas proposed a technique he termed active reviews [Parnas 85]. In this approach the reviewers are actively engaged in the review by performing actions, such as filling in a questionnaire, that require them to read the material being reviewed. The assumption is that an actively involved reviewer is more likely to identify problems.
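
To make the idea concrete, here is a minimal sketch of how an active-review questionnaire might be represented as a data structure. The artifact, sections, and questions are hypothetical examples, not taken from Parnas’s paper; the point is that each question forces the reviewer to extract a fact, and its location, from the material itself.

```python
from dataclasses import dataclass, field

@dataclass
class ActiveQuestion:
    section: str        # the artifact section the reviewer must read
    prompt: str         # a question that cannot be answered without reading it
    answer: str = ""    # filled in by the reviewer
    evidence: str = ""  # where in the section the answer was found

@dataclass
class ActiveReview:
    artifact: str
    questions: list[ActiveQuestion] = field(default_factory=list)

    def unanswered(self) -> list[ActiveQuestion]:
        """Questions not yet grounded in the artifact itself."""
        return [q for q in self.questions if not (q.answer and q.evidence)]

# Hypothetical review of a module interface specification.
review = ActiveReview(
    artifact="Module interface specification",
    questions=[
        ActiveQuestion("3.2 Error handling",
                       "Which error codes can initialize() return?"),
        ActiveQuestion("3.4 Concurrency",
                       "Which operations may be called from multiple threads?"),
    ],
)
print(f"{len(review.unanswered())} questions still require reading the spec")
```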

I published a technique that is a hybrid of the passive and active approaches, termed guided inspection [McGregor 98]. In this technique the same sampling techniques used to identify effective test cases are used to select scenarios that “guide” the reviewers through the artifacts. This approach is very effective for directing reviewers to areas of critical importance and for finding defects in the material being reviewed. It actively involves the consumer in the selection of scenarios, but usually it is the producer who presents how each scenario is handled by the product under development.
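
A minimal sketch of the sampling idea follows, assuming scenarios are ranked the way test cases often are in operational-profile testing: by likelihood of use times cost of failure. The scenario names and weights are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    frequency: float    # estimated relative frequency of use, 0..1
    criticality: float  # estimated cost of a failure, 0..1

    @property
    def priority(self) -> float:
        # Risk-style weighting: likelihood times impact.
        return self.frequency * self.criticality

candidates = [
    Scenario("start a new game", 0.9, 0.6),
    Scenario("pause and resume play", 0.5, 0.3),
    Scenario("record a high score", 0.4, 0.8),
    Scenario("recover from a dropped connection", 0.1, 0.9),
]

# Guide the review through the highest-risk scenarios first.
for s in sorted(candidates, key=lambda s: s.priority, reverse=True)[:3]:
    print(f"{s.name}: priority {s.priority:.2f}")
```

In a guided inspection the top-ranked scenarios become the thread of the review: the producer walks each one through the artifact while the reviewers check that it is handled correctly.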

Of course, in some cases the producer’s primary goal for a review is to satisfy a contractual obligation, not to elicit constructive comments. Finding defects is not a priority and in fact may not even be desired. Producers with this goal would prefer a passive review in which they determine the content of the presentation; the consumers will probably not be very engaged in the activity.

The consumer typically wants to get a realistic view of the status of the development effort. Defects, when found, may diminish their confidence in the producer or at least their confidence in an on-time, to-specification delivery. These consumers want to actively participate in the review process by having some control over the content and guiding the review through their questions to cover the areas in which they are most interested.

3 SOME GUIDELINES

Over the years I have participated in a large number of reviews, as a producer and as a consumer, at arm’s length and in a bear hug. I have seen good and bad practices. I have distilled a few guidelines for the producing organization.

Planning

At first glance it would appear that reviews are sunk costs that do not contribute to progress toward a product. That does not have to be the case. With appropriate planning the review can identify defects early, saving the effort that would have been expended to track down the problem later in the life cycle.

As always, if we plan well there is less to do later so the list of planning guidelines is the longest of the three segments.

  • Use preparation for the review as a time to consolidate the understanding of the team. Develop materials that have continuing value beyond the day of the review.
  • Include time in the project schedule for preparation for the review, not just for the actual review sessions. The presentation is a time when the customer’s focus is on your company. Making a good impression is a productive use of time.
  • Place the review materials under revision control as they are developed. Some portion of the material may be used as introductory material for the next review or training material for new hires.
  • Plan the review to guide the reviewers through the material. For example, describing project activities in chronological order or the product’s actions in data-flow order helps the reviewers be clear about what they are being told. Using scenarios to make the logical organization clear is a useful device.
  • Present the minimum of material that will satisfy the goals of the review. I am not suggesting that anything be hidden from the consumer, but anything that is said becomes fair game for questions from the reviewers. Plan on using well-thought-out graphics so that you are presenting ideas as opposed to words.
  • Prepare the presenters through several well-organized and focused internal dry runs. Often those best qualified to present technical material are not good presenters. A dry run with only your own company’s employees present is a good way to prepare them. Even in joint projects each company has its own culture, its own goals, and its own relationships. Having only company employees in attendance provides an atmosphere where suggestions can be given more freely.
  • Assign teams to portions of the review. I am often involved with projects that involve both hardware and software, such as cellular phone manufacturing. Presentations of features require explaining both the hardware and the software. For the presentation of this material, the primary presenter is usually from one facet or the other. Team them with someone with complementary expertise. In this way, a presenter never has to respond, “I don’t know the answer to that question.” The appropriate response is, “I will let my partner answer that.” The teammate should be present during the review, ready to contribute quickly, but only when asked.
  • Treat the development of your presentation package as a proposal development effort. Appoint a review coordinator to manage the development and tuning of each section of the presentation and to manage consistency in terminology, references, graphics, themes, and takeaways; a sketch of a simple consistency check follows this list. The coordinator should lead a small team with technical and presentation expertise. The story must be technically correct and presented clearly. Organizations often provide templates for presentations but don’t review the final content put into the template by the individual teams.
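
As a concrete illustration of the coordinator’s consistency task, here is a hypothetical helper that flags discouraged term variants in the text of each presentation section. The glossary entries and section texts are invented for illustration; nothing here reflects a real toolchain.

```python
import re

# Preferred term -> discouraged variants the coordinator wants flagged.
GLOSSARY = {
    "product line": ["product family", "game family"],
    "variation point": ["variability hook", "customization point"],
}

def flag_variants(section_name: str, text: str) -> list[str]:
    """Report any discouraged variant found in a section's text."""
    findings = []
    for preferred, variants in GLOSSARY.items():
        for variant in variants:
            if re.search(re.escape(variant), text, re.IGNORECASE):
                findings.append(
                    f"{section_name}: uses '{variant}'; prefer '{preferred}'")
    return findings

# Text exported from each team's slides (hypothetical).
sections = {
    "Architecture overview": "Each game family member binds its variability hooks early.",
    "Test strategy": "Variation points are exercised by scenario tests.",
}

for name, text in sections.items():
    for finding in flag_variants(name, text):
        print(finding)
```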

During the Review

Reviews typically last for several days. It is easy to lose sight of the goal (or the end) of the review. The guidelines here are intended to keep the producers on the correct course.

  • Remember that there are two different roles participating in the review, and remember which role each person is in. The more tightly coupled the producers and consumers are in the venture (offices side by side), the harder this is to remember. The person you ate lunch with yesterday is today evaluating your work. Out of the review comes a critique of those in the producer role by those in the consumer role. Casual comments made to friends who happen to be in the opposite role may show up in the critique.
  • Answer the questions asked by the consumers. This sounds obvious, but very often I find that presenters feel they need to expand on a reviewer’s question even when the original question can be answered with a simple yes or no. This causes long disruptions in the story the review is trying to tell and may raise questions in the minds of the reviewers that would otherwise have escaped notice.
  • Think before answering. A common strategy is to repeat the question so that you buy time to think about the answer. Again, this sounds obvious, but no one likes silence in a meeting. Starting an answer before you have thought all the way to the end of it can lead to backtracking or wandering off course. This confuses the consumers and may lead to even more questions than it answers.

After the Review

Besides holding their breath waiting for the consumers’ comments, the producer team should debrief the review.

  • Follow up on mistakes noted in the presentation material. Often this material has come from source documents in the project. Whether or not the presentation is corrected, the source documents certainly should be.
  • Thank the team for their effort. The managers should do this before any feedback comes back, to show that their appreciation of the effort is unrelated to the outcome. I doubt I have ever seen a team prepare for a review entirely within normal working hours. There is almost always some personal sacrifice.
  • Place the review materials under revision control, if they are not already. A maturing organization learns from previous victories and defeats. The first step in preparing for the next review is to review the previous material and the comments that it generated.

4 CASE STUDY

AGM, the fictional product line company that I use as a continuing example in this column, has several types of clients [McGregor 05]. AGM is producing three computer games in each of three increments. The first increment comprises freeware games that are available for download from the company web site; the second comprises games sold to wireless device developers; and the third is an implementation with an increased number of variation points that allows customers to specify details related to their company, so that the games become free advertisements for their company.

The freeware games involved no customer interfacing, since the customers were random, after-the-fact encounters on the web. The third increment serves a sufficiently low-margin market that there is little time to interface with customers. The second increment, however, is in a domain accustomed to joint development and tightly coupled subcontractor relationships.

The initial customer for the products in the second increment required a series of reviews to ensure that schedule progress was on track and that product quality was adequate. AGM prepared carefully, since a problem in this review might cause other customers to look elsewhere. AGM organized the review by features, since its development was organized by features. Two-person presentation teams were formed for each feature set. One person was knowledgeable about the software while the other was a hardware engineer. The product line manager, lead system engineer, and lead software architect were designated as the core team, who reviewed each presentation for correctness and consistency and the total package for completeness.

The review with the customer was successful. Only in two cases did the primary presenter have to hand off a question to the other team member, but those two “saves” were worth the coordination effort. The producers did find several holes in the design as they prepared for the review, but each issue was resolved prior to the review.

5 SUMMARY

Reviews are one of the primary interfaces to the consumers of our products. While nothing replaces doing a technically excellent job on a project, a few simple techniques can be used to keep reviews focused and productive. I have related techniques that I have observed to be effective. Maintaining a positive reputation in the eyes of the consumers is of strategic importance to the producers and is worth considerable effort.

ACKNOWLEDGEMENTS

I want to thank Raj Ramlagan of Honeywell TSI for his valuable comments about the earlier version of this column.


REFERENCES

[IBM 05] IBM/Microsoft joint venture. http://www.1.ibm.com/servers/eserver/xseries/windows/index.html.

[McGregor 98] John D. McGregor. “Testing Models: The Requirements Model”, Journal of Object-Oriented Programming, June 1998.

[McGregor 05] John D. McGregor. A Pedagogical Product Line. http://www.cs.clemson.edu/~johnmc/productLines/example/frontPage.htm

[Parnas 85] David L. Parnas and David M. Weiss. “Active Design Reviews: Principles and Practices”, Proceedings of the 8th International Conference on Software Engineering, London, 1985.

About the author

Dr. John D. McGregor is an associate professor of computer science at Clemson University and a partner in Luminary Software, a software engineering consulting firm. His research interests include software product lines and component-based software engineering. His latest book is A Practical Guide to Testing Object-Oriented Software (Addison-Wesley 2001). Contact him at johnmc@lumsoft.com.


Cite this column as follows: John McGregor: “Customer Interface Management”, in Journal of Object Technology, vol. 4, no. 5, July-August 2005, pp. 19-25, http://www.jot.fm/issues/issue_2005_07/column2

