Showing papers by "Lars Lundberg published in 2006"


Book ChapterDOI
01 Jan 2006
TL;DR: This chapter takes up tailoring as a means of making software flexible and shows that the borders between maintenance and use become blurred, since tailorability allows maintenance by professional software engineers to be replaced by tailoring performed by advanced users.
Abstract: The need to change applications to suit the needs of users in different places and to facilitate development over time has long been a major challenge for software maintenance experts. In this chapter we take up tailoring as a means of making software flexible. Starting with two case studies, one taking up tailoring for different users and the other addressing changes over time, the chapter discusses problems related to both the use and the development of a tailorable application. Developing tailorable software presents new challenges: how do you create a user-friendly tailoring interface? How do you decide what should be tailorable, and how do you create a software architecture that permits this? How do you ensure that the tailorable system gives acceptable performance? Our experience shows that the borders between maintenance and use become blurred, since tailorability allows maintenance by professional software engineers to be replaced by tailoring performed by advanced users. Using our experience from the two selected cases, we identify and discuss five important issues to consider when designing and implementing tailorable systems in industrial settings.

20 citations


Journal ArticleDOI
TL;DR: A study assessing the impact of experience and maturity on productivity in software development on a specialized platform; based on the findings, a number of improvements and guidelines are suggested for the process of introducing a new technology.
Abstract: Introducing new and specialized technology is often seen as a way of meeting increasing non-functional requirements. An example of such a technology is a software platform that provides high performance and availability. The novelty of such a platform, and the lack of related experience and competence among the staff, may affect initial development productivity. These competence problems should disappear with time. In this paper, we present a study conducted at Ericsson. The purpose of the study was to assess the impact of experience and maturity on productivity in software development on the specialized platform. We quantify the impact by comparing the productivity of two projects. One represents an initial development stage, while the other represents a subsequent, and thus more mature, development stage. Both projects resulted in large commercial products. We reveal a factor of four difference in productivity. The difference was caused by a higher code delivery rate and a lower number of code lines per functionality in the latter project. We assess the impact of both these factors on productivity and explain their nature. Based on our findings, we suggest a number of improvements and guidelines for the process of introducing a new technology.
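The way the two reported factors combine into an overall productivity difference can be made concrete with a small worked example. The numbers below are hypothetical (the abstract reports only the combined factor of four), and the function name is ours:

```python
# Hypothetical illustration of how the two factors named in the abstract
# (code delivery rate and code size per functionality) combine into a
# productivity ratio. All numbers are invented for illustration.

def productivity(loc_per_week: float, loc_per_feature: float) -> float:
    """Functionality delivered per week = (LOC/week) / (LOC per unit of functionality)."""
    return loc_per_week / loc_per_feature

initial = productivity(loc_per_week=500, loc_per_feature=1000)   # 0.5 units/week
matured = productivity(loc_per_week=1000, loc_per_feature=500)   # 2.0 units/week

print(f"productivity ratio: {matured / initial:.1f}x")  # doubling both factors gives 4.0x
```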

13 citations


Proceedings ArticleDOI
27 Mar 2006
TL;DR: A study that empirically evaluates the accuracy of fault prediction offered by statistical models compared to expert estimations in a large telecommunication system; the statistical methods clearly outperform the expert estimations.
Abstract: Fault prediction models still seem to be more popular in academia than in industry. In industry, expert estimations of fault proneness are the most popular way of deciding where to focus fault detection efforts. In this paper, we present a study in which we empirically evaluate the accuracy of fault prediction offered by statistical models as compared to expert estimations. The study is industry based. It involves a large telecommunication system and experts who were involved in the development of this system. The expert estimations are compared to simple prediction models built on another large system, also from the telecommunication domain. We show that the statistical methods clearly outperform the expert estimations. We see the main reason for the superiority of the statistical models in their ability to cope with large datasets, which enables them to perform reliable predictions for a larger number of components in the system, as well as to perform predictions at a more fine-grained level, e.g., at the class level instead of the component level.
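As a rough idea of what such a "simple prediction model" can look like, the sketch below fits a univariate least-squares regression of fault counts against a size metric on one system and uses it to rank the components of another. The metric, the data, and the helper names are all hypothetical; the paper does not publish its exact models or data.

```python
# A minimal sketch of a simple statistical fault prediction model: ordinary
# least squares fitted on one system and used to rank components of another.
# The size metric (lines of code) and all data are invented for illustration.

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# "Training" system: (lines of code, faults found) per component -- invented data.
train = [(1200, 3), (4500, 11), (800, 1), (6100, 17), (2300, 6)]
a, b = fit_linear([s for s, _ in train], [f for _, f in train])

# Rank the components of another system by predicted fault count.
other_system = {"CompA": 5200, "CompB": 900, "CompC": 3100}
ranking = sorted(other_system, key=lambda c: a * other_system[c] + b, reverse=True)
print(ranking)  # components predicted most fault-prone come first
```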

10 citations


Proceedings ArticleDOI
24 Sep 2006
TL;DR: The authors' methods for predicting fault densities in modified classes early in the development process provide predictions that are of similar quality to the predictions based on metrics available after the code is implemented, and enable better planning of efficient fault prevention and fault detection activities.
Abstract: In this paper we suggest and evaluate a method for predicting fault densities in modified classes early in the development process, i.e., before the modifications are implemented. We start by establishing the methods that, according to the literature, are considered the best for predicting fault densities of modified classes. We find that these methods cannot be used until the system is implemented. We suggest our own methods, which are based on the same concept as the methods suggested in the literature, with the difference that our methods are applicable before coding has started. We evaluate our methods using three large telecommunication systems produced by Ericsson. We find that our methods provide predictions that are of similar quality to predictions based on metrics available after the code is implemented. Our predictions are, however, available much earlier in the development process. Therefore, they enable better planning of efficient fault prevention and fault detection activities.

8 citations


Journal IssueDOI
TL;DR: This paper presents a novel method, the application kernel approach, which adapts an existing uniprocessor kernel to multiprocessor hardware by treating it as a ‘black box’ to which no or only very small changes are made; an implementation of the approach for the Linux operating system shows that it can be realized with fairly small resources.
Abstract: The current trend of using multiprocessor computers for server applications requires operating system adaptations to take advantage of more powerful hardware. However, modifying large bodies of software is very costly and time consuming, and the cost of porting an operating system to a multiprocessor might not be motivated by the potential performance benefits. In this paper we present a novel method, the application kernel approach, for adaptation of an existing uniprocessor kernel to multiprocessor hardware. Our approach considers the existing uniprocessor kernel as a ‘black box’, to which no or only very small changes are made. Instead, the original kernel runs operating system services unmodified on one processor, whereas the other processors execute applications on top of a small custom kernel. We have implemented the application kernel for the Linux operating system, which illustrates that the approach can be realized with fairly small resources. We also present an evaluation of the performance and complexity of our approach, where we show that it is possible to achieve good performance while at the same time keeping the implementation complexity low. Copyright © 2006 John Wiley & Sons, Ltd.

6 citations


01 Jan 2006
TL;DR: In this article, the Stern-Brocot tree is used to decompose a rational number a/b into c ratios as evenly as possible while maintaining the sum of the numerators and the sum of the denominators.
Abstract: We consider a fundamental number-theoretic problem where practical applications abound. We decompose any rational number a/b into c ratios as evenly as possible while maintaining the sum of the numerators and the sum of the denominators. The minimum and maximum of the ratios give rational estimates of a/b from below and from above. The case c=b gives the usual floor and ceiling functions. We furthermore define the max-min difference, which is zero iff c≤GCD(a,b), quantifying the distance to relative primality. A main tool for investigating the properties of these quantities is the Stern-Brocot tree, where all positive rational numbers occur in lowest terms and in size order. We prove basic properties, such as that there is a unique decomposition that gives both the minimum and the maximum. It turns out that this decomposition contains at most three distinct ratios. The problem has arisen in a generalization of the 4/3-conjecture in computer science.
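The problem statement can be checked on small instances with a brute-force sketch like the one below, which enumerates all decompositions and picks the most even one. This is only an illustration of the definitions, not the paper's Stern-Brocot-based method, and the assumption that numerators are non-negative and denominators positive is ours.

```python
# Brute-force illustration of the decomposition problem: split a/b into c
# ratios n_i/d_i with sum(n_i) = a and sum(d_i) = b, choosing the ratios as
# evenly as possible (here: minimizing max - min). Assumes numerators >= 0
# and denominators >= 1; only practical for small a, b, c.

from fractions import Fraction

def compositions(total, parts, minimum):
    """Yield all tuples of `parts` integers >= `minimum` that sum to `total`."""
    if parts == 1:
        if total >= minimum:
            yield (total,)
        return
    for first in range(minimum, total - minimum * (parts - 1) + 1):
        for rest in compositions(total - first, parts - 1, minimum):
            yield (first,) + rest

def most_even_decomposition(a, b, c):
    best_spread, best_ratios = None, None
    for nums in compositions(a, c, 0):
        for dens in compositions(b, c, 1):
            ratios = [Fraction(n, d) for n, d in zip(nums, dens)]
            spread = max(ratios) - min(ratios)
            if best_spread is None or spread < best_spread:
                best_spread, best_ratios = spread, ratios
    return best_spread, best_ratios

spread, ratios = most_even_decomposition(7, 5, 3)
print(ratios, "max-min difference =", spread)

# The special case c = b reproduces the floor and ceiling of a/b:
_, ratios = most_even_decomposition(7, 5, 5)
print(sorted(set(ratios)))  # [Fraction(1, 1), Fraction(2, 1)], i.e. floor(7/5) and ceil(7/5)
```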

3 citations


Proceedings ArticleDOI
21 May 2006
TL;DR: An investigation of how to monitor the verification process as input to decisions such as improvement actions; the results show that the approach can be used for quantitative monitoring of process quality and as decision support for rapid improvement actions.
Abstract: In a competitive environment where time-to-market is crucial for success, software development companies initiate process improvement programs that can shorten the development time. They especially seek improvements in the verification activities, since rework commonly constitutes a significant part of the development cost. However, the success of process improvement initiatives depends on early and observable results, since a lack of feedback on the effect of improvements is a common cause of failure. This paper investigates how to monitor the verification process as input to decisions such as improvement actions. The suggested approach was applied to three industrial software products at Ericsson, and the results show that the approach can be used for quantitative monitoring of process quality and as decision support for rapid improvement actions.

1 citation