5 CONCLUSION
The automatic collection, interpretation, and visualization of software measures is a complex task. The aim of this approach is to provide a lightweight tool that lets project managers assess their development projects and the applied software process. Anchoring software measures on elements of the process model enables the project manager to define software measures before a concrete project starts. The distribution ratios of effort and time spent on the activities can then be compared across several projects in order to assess the progress of the current project or the productivity of the individual projects.
We therefore believe that our approach of anchoring software measures on elements of the process model, and of identifying the entities to be measured by means of the work breakdown structure, is a practicable trade-off between the automatic collection of information and the additional effort demanded of the developers.
The developers have to record their activities by indicating the associated WBS code within the time recording system. Furthermore, these references to the work breakdown structure must also be maintained within the software configuration management and the bug tracking systems. This is, however, a common modus operandi (Selby, 2005).
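For illustration, the following Java sketch shows how such WBS references might be recovered from tagged commit messages and grouped per WBS code; the tag format "[WBS 1.2.3]" and all class names are hypothetical and not part of our tool.

import java.util.*;
import java.util.regex.*;

// Hypothetical sketch: groups commit messages by the WBS code they
// reference, assuming a tagging convention such as "[WBS 1.2.3] fix parser".
public class WbsCommitIndex {

    // Matches tags like "[WBS 1.2.3]" anywhere in a commit message.
    private static final Pattern WBS_TAG =
            Pattern.compile("\\[WBS\\s+([0-9]+(?:\\.[0-9]+)*)\\]");

    public static Map<String, List<String>> groupByWbsCode(List<String> commitMessages) {
        Map<String, List<String>> byCode = new HashMap<>();
        for (String message : commitMessages) {
            Matcher m = WBS_TAG.matcher(message);
            String code = m.find() ? m.group(1) : "unassigned";
            byCode.computeIfAbsent(code, k -> new ArrayList<>()).add(message);
        }
        return byCode;
    }

    public static void main(String[] args) {
        List<String> commits = Arrays.asList(
                "[WBS 1.2.3] implement input validation",
                "[WBS 1.4.1] add integration test",
                "fix typo in README"); // no WBS reference
        System.out.println(groupByWbsCode(commits));
    }
}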
With our approach, certain software measures can be collected in a standardized way that allows cross-project comparison of the measurement results, because the entities to be measured are defined on the basis of the process model before the individual projects start. The actual computation of the software measures is performed automatically at project runtime using Maven.
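As a sketch of the underlying computation (shown here independently of Maven; the activity names and effort figures are invented examples), the per-activity effort ratios that enable such a cross-project comparison can be derived as follows.

import java.util.*;

// Hypothetical sketch: computes each activity's share of the total
// recorded effort for one project, so that the resulting distribution
// ratios of several projects can be compared.
public class EffortDistribution {

    public static Map<String, Double> effortRatios(Map<String, Double> effortHoursByActivity) {
        double total = 0.0;
        for (double hours : effortHoursByActivity.values()) {
            total += hours;
        }
        Map<String, Double> ratios = new LinkedHashMap<>();
        for (Map.Entry<String, Double> e : effortHoursByActivity.entrySet()) {
            ratios.put(e.getKey(), total == 0.0 ? 0.0 : e.getValue() / total);
        }
        return ratios;
    }

    public static void main(String[] args) {
        Map<String, Double> projectA = new LinkedHashMap<>();
        projectA.put("Analysis", 120.0);
        projectA.put("Design", 200.0);
        projectA.put("Implementation", 480.0);
        projectA.put("Test", 200.0);
        // Prints {Analysis=0.12, Design=0.2, Implementation=0.48, Test=0.2}
        System.out.println(effortRatios(projectA));
    }
}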
REFERENCES
Ambler, S. (2002). Agile Modeling: Effective Practices for
eXtreme Programming and the Unified Process. John
Wiley & Sons, New York.
Auer, M., Graser, B., and Biffl, S. (2003). A survey on the
fitness of commercial software metric tools for service
in heterogeneous environments: Common pitfalls. In
Proceedings of the Ninth International Software Met-
rics Symposium (METRICS 2003). IEEE Computer
Society.
Basili, V. R., Caldiera, G., and Rombach, H. D. (1994).
Goal question metric paradigm. In Marciniak, J. J.,
editor, Encyclopedia of Software Engineering, vol-
ume 1, pages 528–532. John Wiley & Sons.
Basili, V. R. and Rombach, H. D. (1988). The TAME
Project: Towards Improvement-Oriented Software
Environments. IEEE Transactions on Software En-
gineering, 14(6):758–773.
Beck, K. (2004). Extreme Programming explained: Em-
brace Change. Addison Wesley, Upper Saddle River,
New Jersey, second edition.
Chrissis, M. B., Konrad, M., and Shrum, S. (2003). CMMI:
Guidelines for Process Integration and Product Im-
provement. Addison-Wesley, Boston.
Cockburn, A. (2001). Agile Software Development.
Addison-Wesley, Boston, MA.
Dami, S., Estublier, J., and Amiour, M. (1998). APEL: A
graphical yet executable formalism for process mod-
eling. Automated Software Engineering: An Interna-
tional Journal, 5(1):61–96.
Futrell, R. T., Shafer, D. F., and Shafer, L. I. (2002). Quality
Software Project Management. Prentice Hall, Upper
Saddle River, NJ.
Johnson, P. (2001). You can’t even ask them to push a
button: Toward ubiquitous, developer-centric, empiri-
cal software engineering. In The NSF Workshop for
New Visions for Software Design and Productivity:
Research and Applications, Nashville, TN.
Kempkens, R., Rösch, P., Scott, L., and Zettel, J. (2000).
Instrumenting measurement programs with tools. In
PROFES ’00: Proceedings of the Second Interna-
tional Conference on Product Focused Software Pro-
cess Improvement, pages 353–375, London. Springer.
Kruchten, P. (2003). The Rational Unified Process - An
Introduction. Addison Wesley, Boston, third edition.
Lott, C. M. (1996). Measurement-based Feedback in a
Process-centered Software Engineering Environment.
PhD thesis, University of Maryland.
McGarry, J., Card, D., Jones, C., Layman, B., Clark, E.,
Dean, J., and Hall, F. (2002). Practical Software Mea-
surement: Objective Information for Decision Mak-
ers. Addison-Wesley, Boston.
Münch, J. and Heidrich, J. (2004). Software project control
centers: Concepts and approaches. Journal of Systems
and Software, 70(1–2):3–19.
Project Management Institute (2001). Practice Standard for
Work Breakdown Structures. Project Management In-
stitute.
Putnam, L. H. and Myers, W. (2003). Five Core Metrics:
The Intelligence behind successful Software Manage-
ment. Dorset House Publishing Co., New York.
Russac, J. (2002). Cheaper, better, faster: A measurement
program that works. In International Function Point
Users Group, editor, IT Measurement: Practical Ad-
vice from the Experts. Addison-Wesley, Boston, Mas-
sachusetts.
Selby, R. W. (2005). Measurement-driven dashboards en-
able leading indicators for requirements and design of
large-scale systems. In 11th IEEE International Sym-
posium on Software Metrics (METRICS 2005). IEEE
Computer Society.