Durham University

Publication details

Budgen, David & Thomson, Mitchell (2003). CASE tool evaluation: experiences from an empirical study. Journal of Systems and Software 67(2): 55-75.

While research activity in software engineering often results in the development of software tools and solutions intended to demonstrate the feasibility of an idea or concept, any resulting conclusions about the degree of success attained are rarely substantiated by supporting experimental evidence. As part of the development of a prototype computer-assisted software engineering (CASE) tool intended to support opportunistic design practices, we sought to evaluate the use of the tool by both experienced and inexperienced software engineers. This work involved reviewing suitable techniques, and then designing and performing a set of experimental studies to obtain data that could be used to assess how well the CASE tool met its design goals. We provide an assessment of how effective the chosen evaluation process was, and conclude by identifying the need for an 'evaluation framework' to help guide such studies.
