
Showing papers by "University of Sannio published in 2007"


Journal ArticleDOI
TL;DR: The effectiveness of the proposed MPC formulation is demonstrated by simulation and experimental tests up to 21 m/s on icy roads, and two approaches with different computational complexities are presented.
Abstract: In this paper, a model predictive control (MPC) approach for controlling an active front steering system in an autonomous vehicle is presented. At each time step, a trajectory is assumed to be known over a finite horizon, and an MPC controller computes the front steering angle in order to follow the trajectory on slippery roads at the highest possible entry speed. We present two approaches with different computational complexities. In the first approach, we formulate the MPC problem by using a nonlinear vehicle model. The second approach is based on successive online linearization of the vehicle model. Discussions on computational complexity and performance of the two schemes are presented. The effectiveness of the proposed MPC formulation is demonstrated by simulation and experimental tests up to 21 m/s on icy roads.
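As a rough sketch of the second (linearization-based) approach, a linear time-varying MPC problem of the following general form is solved at each time step; the weights, horizon, constraints, and linearized model below are schematic placeholders, not the exact formulation used in the paper:

\min_{\Delta\delta_{f,0},\dots,\Delta\delta_{f,N-1}} \; \sum_{k=1}^{N} \lVert y_k - y_k^{\mathrm{ref}} \rVert_Q^2 + \sum_{k=0}^{N-1} \lVert \Delta\delta_{f,k} \rVert_R^2
\quad \text{s.t.} \quad x_{k+1} = A_k x_k + B_k \delta_{f,k}, \quad y_k = C_k x_k, \quad \delta_{f,k} = \delta_{f,k-1} + \Delta\delta_{f,k}, \quad |\delta_{f,k}| \le \delta_{\max}, \quad |\Delta\delta_{f,k}| \le \Delta\delta_{\max},

where A_k, B_k, C_k come from linearizing the vehicle model around the current operating point, y_k^{\mathrm{ref}} collects the reference outputs (e.g., lateral position and yaw angle) along the known trajectory, and \delta_f is the front steering angle.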

1,184 citations


Journal ArticleDOI
TL;DR: In this article, the authors discussed the baseline geochemical maps of elements harmful to human health, using concentration values of 2389 stream sediment samples collected over the Campania region (Southern Italy).

191 citations


Journal ArticleDOI
B. P. Abbott, Richard J. Abbott, Rana X. Adhikari, Juri Agresti, +462 more authors (50 institutions)
TL;DR: In this paper, the authors presented upper limits on the gravitational wave emission from 78 radio pulsars based on data from the third and fourth science runs of the LIGO and GEO 600 gravitational wave detectors.
Abstract: We present upper limits on the gravitational wave emission from 78 radio pulsars based on data from the third and fourth science runs of the LIGO and GEO 600 gravitational wave detectors. The data from both runs have been combined coherently to maximize sensitivity. For the first time, pulsars within binary (or multiple) systems have been included in the search by taking into account the signal modulation due to their orbits. Our upper limits are therefore the first measured for 56 of these pulsars. For the remaining 22, our results improve on previous upper limits by up to a factor of 10. For example, our tightest upper limit on the gravitational strain is 2.6×10^(−25) for PSR J1603-7202, and the equatorial ellipticity of PSR J2124–3358 is less than 10^(−6). Furthermore, our strain upper limit for the Crab pulsar is only 2.2 times greater than the fiducial spin-down limit.

170 citations


Journal ArticleDOI
TL;DR: Key ingredients of the algorithm are a delayed column-and-row generation technique, exploiting the special structure of the formulation, to solve the LP-relaxation, and cutting planes to strengthen the formulation and limit the size of the enumeration tree.
Abstract: Given a directed graph G(V,A), the p-Median problem consists of determining p nodes (the median nodes) minimizing the total distance from the other nodes of the graph. We present a Branch-and-Cut-and-Price algorithm yielding provably good solutions for instances with |V|≤3795. Key ingredients of the algorithm are a delayed column-and-row generation technique, exploiting the special structure of the formulation, to solve the LP-relaxation, and cutting planes to strengthen the formulation and limit the size of the enumeration tree.
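For reference, the classical integer programming formulation of the p-Median problem, whose LP-relaxation is the starting point for delayed column-and-row generation schemes of this kind, reads as follows (this is the textbook formulation; the specially structured variant exploited by the authors is not reproduced here):

\min \sum_{i \in V} \sum_{j \in V} d_{ij} x_{ij} \quad \text{s.t.} \quad \sum_{j \in V} x_{ij} = 1 \;\; \forall i \in V, \quad x_{ij} \le y_j \;\; \forall i,j \in V, \quad \sum_{j \in V} y_j = p, \quad x_{ij}, y_j \in \{0,1\},

where y_j = 1 if node j is selected as a median, x_{ij} = 1 if node i is assigned to median j, and d_{ij} is the distance from i to j.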

169 citations


Journal ArticleDOI
TL;DR: In this article, the opportunities provided by the use of Fiber Reinforced Polymer (FRP) composites for the shear strengthening of tuff masonry structures were assessed on full-scale panels subjected to in-plane shear-compression tests at the ENEL HYDRO S.p.A. laboratory, Italy.
Abstract: The opportunities provided by the use of Fiber Reinforced Polymer (FRP) composites for the shear strengthening of tuff masonry structures were assessed on full-scale panels subjected to in-plane shear-compression tests at the ENEL HYDRO S.p.A. laboratory, Italy. Tuff masonry specimens were arranged in order to simulate both the mechanical and textural properties typical of buildings located in South-Central Italian historical centres. In this paper, the outcomes of the experimental tests are presented. The monotonic shear-compression tests were performed under displacement control, and the experimental data have provided information about the in-plane behaviour of as-built and FRP-strengthened tuff masonry walls. Failure modes, shear strength, displacement capacity and post-peak performance are discussed.

167 citations


Journal ArticleDOI
B. P. Abbott, Richard J. Abbott, Rana X. Adhikari, Juri Agresti, +481 more authors (49 institutions)
TL;DR: In this article, the authors present the first broadband wide-parameter-space upper limits on periodic gravitational waves obtained with coherent search techniques, demonstrate the data analysis method on a real data set, and present their results as upper limits over large volumes of the parameter space.
Abstract: We carry out two searches for periodic gravitational waves using the most sensitive few hours of data from the second LIGO science run. Both searches exploit fully coherent matched filtering and cover wide areas of parameter space, an innovation over previous analyses which requires considerable algorithm development and computational power. The first search is targeted at isolated, previously unknown neutron stars, covers the entire sky in the frequency band 160–728.8 Hz, and assumes a frequency derivative of less than 4×10^(−10) Hz/s. The second search targets the accreting neutron star in the low-mass x-ray binary Scorpius X-1 and covers the frequency bands 464–484 Hz and 604–624 Hz as well as the two relevant binary orbit parameters. Because of the high computational cost of these searches we limit the analyses to the most sensitive 10 hours and 6 hours of data, respectively. Given the limited sensitivity and duration of the analyzed data set, we do not attempt deep follow-up studies. Rather we concentrate on demonstrating the data analysis method on a real data set and present our results as upper limits over large volumes of the parameter space. In order to achieve this, we look for coincidences in parameter space between the Livingston and Hanford 4-km interferometers. For isolated neutron stars our 95% confidence level upper limits on the gravitational wave strain amplitude range from 6.6×10^(−23) to 1×10^(−21) across the frequency band; for Scorpius X-1 they range from 1.7×10^(−22) to 1.3×10^(−21) across the two 20-Hz frequency bands. The upper limits presented in this paper are the first broadband wide parameter space upper limits on periodic gravitational waves from coherent search techniques. The methods developed here lay the foundations for upcoming hierarchical searches of more sensitive data which may detect astrophysical signals.

157 citations


Journal ArticleDOI
TL;DR: In this article, the analysis of the bond between fiber-reinforced polymer (FRP) rebars and concrete was performed referring to different kinds of FRP rebars.
Abstract: The structural performance of reinforced concrete elements is related to the interface behavior between rebars and concrete. In the last decade several research works were carried out to investigate the bond between fiber-reinforced polymer (FRP) rebars and concrete; however, some aspects need further study in order to obtain reliable design indications. In this paper the analysis of bond was performed with reference to different kinds of FRP rebars and several influential parameters (surface treatment, kind of fibers, and kind of test). The results obtained show the role of the investigated parameters on the bond stress law; in particular, the surface treatments involve different transfer mechanisms, passing from simple chemical adhesion and friction, for sanded rebars, to a relevant contribution of mechanical interlocking for deformed rebars. The kind of test utilized influences the most significant parameters of the bond stress–slip law, in different ways depending on the kind of rebar. Finally the kind of fibe...

132 citations


Journal ArticleDOI
B. P. Abbott, R. Abbott, Rana X. Adhikari, Juri Agresti, +446 more authors (43 institutions)
TL;DR: In this paper, the authors report a search for short-duration gravitational-wave bursts with arbitrary waveform in the 64-1600 Hz frequency range appearing in all three LIGO interferometers.
Abstract: The fourth science run of the LIGO and GEO 600 gravitational-wave detectors, carried out in early 2005, collected data with significantly lower noise than previous science runs. We report on a search for short-duration gravitational-wave bursts with arbitrary waveform in the 64–1600 Hz frequency range appearing in all three LIGO interferometers. Signal consistency tests, data quality cuts and auxiliary-channel vetoes are applied to reduce the rate of spurious triggers. No gravitational-wave signals are detected in 15.5 days of live observation time; we set a frequentist upper limit of 0.15 day−1 (at 90% confidence level) on the rate of bursts with large enough amplitudes to be detected reliably. The amplitude sensitivity of the search, characterized using Monte Carlo simulations, is several times better than that of previous searches. We also provide rough estimates of the distances at which representative supernova and binary black hole merger signals could be detected with 50% efficiency by this analysis.

109 citations


Proceedings ArticleDOI
20 May 2007
TL;DR: This paper shows how the evolution of changes at source code line level can be inferred from CVS repositories, by combining information retrieval techniques and the Levenshtein edit distance.
Abstract: Observing the evolution of software systems at different levels of granularity has been a key issue for a number of studies, aiming at predicting defects or at studying certain phenomena, such as the presence of clones or of crosscutting concerns. Versioning systems such as CVS and SVN, however, only provide information about lines added or deleted by a contributor: any change is shown as a sequence of additions and deletions. This provides an erroneous estimate of the amount of code changed. This paper shows how the evolution of changes at source code line level can be inferred from CVS repositories, by combining information retrieval techniques and the Levenshtein edit distance. The application of the proposed approach to the ArgoUML case study indicates a high precision and recall.
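To illustrate the core idea, a minimal Python sketch of line matching by normalized Levenshtein distance is given below; the pairing strategy, threshold, and the information retrieval step used in the paper are simplified or omitted, so all names and parameters here are purely illustrative.

# Hypothetical sketch: pair deleted and added lines from a commit diff by
# normalized Levenshtein distance, so that a "delete + add" of similar lines
# is counted as a modification rather than as two independent changes.
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance between two strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def match_changed_lines(deleted, added, max_norm_dist=0.4):
    """Greedily pair each deleted line with the most similar added line."""
    pairs, remaining = [], list(added)
    for old in deleted:
        if not remaining:
            break
        scored = [(levenshtein(old, new) / max(len(old), len(new), 1), new)
                  for new in remaining]
        dist, best = min(scored, key=lambda t: t[0])
        if dist <= max_norm_dist:        # similar enough: treat as an evolved line
            pairs.append((old, best))
            remaining.remove(best)
    return pairs

if __name__ == "__main__":
    old_lines = ["int count = 0;", "return count;"]
    new_lines = ["int count = 1;", "log.debug(count);", "return count;"]
    print(match_changed_lines(old_lines, new_lines))

In such a scheme, a deleted line paired with a sufficiently similar added line is counted as a modified line instead of a deletion plus an addition, which corrects the overestimate produced by raw CVS diffs.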

102 citations


Journal ArticleDOI
B. P. Abbott, Richard J. Abbott, Rana X. Adhikari, Juri Agresti, +449 more authors (48 institutions)
TL;DR: In this article, the authors search for an anisotropic background of gravitational waves using data from the LIGO S4 science run and a method that is optimized for point sources.
Abstract: We searched for an anisotropic background of gravitational waves using data from the LIGO S4 science run and a method that is optimized for point sources. This is appropriate if, for example, the gravitational wave background is dominated by a small number of distinct astrophysical sources. No signal was seen. Upper limit maps were produced assuming two different power laws for the source strain power spectrum. For an f^(−3) power law and using the 50 Hz to 1.8 kHz band the upper limits on the source strain power spectrum vary between 1.2×10^(−48) Hz^(−1) (100 Hz/f)^3 and 1.2×10^(−47) Hz^(−1) (100 Hz/f)^3, depending on the position in the sky. Similarly, in the case of constant strain power spectrum, the upper limits vary between 8.5×10^(−49) Hz^(−1) and 6.1×10^(−48) Hz^(−1). As a side product a limit on an isotropic background of gravitational waves was also obtained. All limits are at the 90% confidence level. Finally, as an application, we focused on the direction of Sco-X1, the brightest low-mass x-ray binary. We compare the upper limit on strain amplitude obtained by this method to expectations based on the x-ray flux from Sco-X1.

101 citations


Journal ArticleDOI
TL;DR: In this paper, the present status of research on thermodynamic aspects of ion exchange in Italian natural zeolites, namely phillipsite, chabazite and clinoptilolite, is reported.

Proceedings ArticleDOI
07 Sep 2007
TL;DR: Results show how patterns more suited to support the application purpose tend to change more frequently, and that different kinds of changes have a different impact on co-changed classes and a different capability of making the system resilient to changes.
Abstract: Design patterns are solutions to recurring design problems, conceived to increase benefits in terms of reuse, code quality and, above all, maintainability and resilience to changes. This paper presents results from an empirical study aimed at understanding the evolution of design patterns in three open source systems, namely JHotDraw, ArgoUML, and Eclipse-JDT. Specifically, the study analyzes how frequently patterns are modified, what kinds of changes they undergo and what classes co-change with the patterns. Results show how patterns more suited to support the application purpose tend to change more frequently, and that different kinds of changes have a different impact on co-changed classes and a different capability of making the system resilient to changes.

Journal ArticleDOI
TL;DR: Results obtained from a controlled experiment and a replica support the idea that useful prediction models for class diagrams understandability and modifiability can be built on the basis of early measures, in particular, measures that capture structural complexity through associations and generalizations.
Abstract: The usefulness of measures for the analysis and design of object oriented (OO) software is increasingly being recognized in the field of software engineering research. In particular, recognition of the need for early indicators of external quality attributes is increasing. We investigate through experimentation whether a collection of UML class diagram measures could be good predictors of two main subcharacteristics of the maintainability of class diagrams: understandability and modifiability. Results obtained from a controlled experiment and a replica support the idea that useful prediction models for class diagrams understandability and modifiability can be built on the basis of early measures, in particular, measures that capture structural complexity through associations and generalizations. Moreover, these measures seem to be correlated with the subjective perception of the subjects about the complexity of the diagrams. This fact shows, to some extent, that the objective measures capture the same aspects as the subjective ones. However, despite our encouraging findings, further empirical studies, especially using data taken from real projects performed in industrial settings, are needed. Such further study will yield a comprehensive body of knowledge and experience about building prediction models for understandability and modifiability.

Book ChapterDOI
17 Sep 2007
TL;DR: The architecture supports the enactment of various negotiation processes and offers a search-based algorithm to assist the negotiating parties in reaching an agreement.
Abstract: Software systems built by composing existing services are increasingly capturing the interest of researchers and practitioners. The envisaged long-term scenario is that services offered by competing providers are chosen by consumers and used for their own purposes, possibly in conjunction with other services. If the consumer is no longer satisfied with the performance of some service, he can try to replace it with some other service. This implies the creation of a global market of services and poses new requirements concerning validation of exploited services, security of transactions engaged with services, trustworthiness, and creation and negotiation of Service Level Agreements with these services. In this paper we focus on the last aspect and present our approach for the negotiation of Service Level Agreements. Our architecture supports the enactment of various negotiation processes and offers a search-based algorithm to assist the negotiating parties in reaching an agreement.

Proceedings ArticleDOI
07 Jul 2007
TL;DR: This paper proposes the use of Genetic Algorithms to generate inputs and configurations for service-oriented systems that cause SLA violations; the approach has been implemented in a tool and applied to an audio processing workflow and to a service for chart generation.
Abstract: The diffusion of service oriented architectures introduces the need for novel testing approaches. On the one side, testing must be able to identify failures in the functionality provided by a service. On the other side, it needs to identify cases in which the Service Level Agreement (SLA) negotiated between the service provider and the service consumer is not met. This would allow the developer to improve service performance, where needed, and the provider to avoid promising Quality of Service (QoS) levels that cannot be guaranteed. This paper proposes the use of Genetic Algorithms to generate inputs and configurations for service-oriented systems that cause SLA violations. The approach has been implemented in a tool and applied to an audio processing workflow and to a service for chart generation. In both cases, the approach was able to produce test data that violate some QoS constraints.
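As a toy illustration of this search-based idea, the Python sketch below evolves service inputs so that a measured QoS value exceeds an assumed SLA bound; invoke_service, the input encoding, and all GA parameters are hypothetical stand-ins, not the paper's tool or settings.

# Minimal genetic-algorithm sketch: evolve service inputs so that the measured
# QoS (here, response time) violates an SLA bound. Everything below is
# illustrative; `invoke_service` is a hypothetical stub.
import random

SLA_MAX_RESPONSE_TIME = 2.0   # seconds, assumed SLA threshold

def invoke_service(params):
    """Hypothetical stand-in for invoking the service under test and timing it."""
    size, quality = params
    return 0.001 * size * (1 + quality / 10.0)   # fake response-time model

def fitness(params):
    return invoke_service(params)      # longer response time = fitter individual

def evolve(pop_size=20, generations=50):
    pop = [(random.randint(1, 1000), random.randint(1, 10)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) > SLA_MAX_RESPONSE_TIME:
            return pop[0]              # SLA-violating input found
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a[0], b[1])                       # one-point crossover
            if random.random() < 0.2:                  # mutation on the size gene
                child = (child[0] + random.randint(-50, 50), child[1])
            children.append((max(1, child[0]), child[1]))
        pop = parents + children
    return None

print(evolve())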

Journal ArticleDOI
TL;DR: The rationale of this paper is to evaluate the distribution of the sum of SR random variables, for both independent and correlated LOS components, and to carry out an extensive performance analysis of the maximal ratio combining (MRC) detection scheme on SR fading channels.
Abstract: The distribution of the sum of non-negative random variables plays an essential role in the performance analysis of diversity schemes for wireless communications over fading channels. While for common fading models such as the Rayleigh, Rice, and Nakagami, the performance of diversity systems is well understood, less attention has been devoted to the shadowed-Rice (SR) case, namely a Rice fading channel with a fluctuating (i.e. random) Line of Sight (LOS) component. Indeed, the analytical performance evaluation of diversity systems on SR fading channels requires the availability of handy expressions for the distribution of the combined received power. To this end, the rationale of this paper is twofold: first, to evaluate the distribution of the sum of SR random variables, both for independent and for correlated LOS components, and then to carry out an extensive performance analysis of the maximal ratio combining (MRC) detection scheme on SR fading channels.
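A quick Monte Carlo cross-check of such analytical results can be sketched as follows (Python); here the fluctuating LOS amplitude is assumed to be Nakagami-m distributed, one common shadowed-Rice model, and all parameter values are illustrative rather than taken from the paper.

# Monte Carlo sketch of MRC over independent shadowed-Rice branches.
# Assumption (not from the abstract): the LOS amplitude follows a Nakagami-m law.
import numpy as np

rng = np.random.default_rng(0)

def shadowed_rice_power(n, m=2.0, omega=1.0, b0=0.25):
    """Per-branch received power |LOS + scatter|^2 with Nakagami-m LOS amplitude."""
    los_amp = np.sqrt(rng.gamma(shape=m, scale=omega / m, size=n))
    los_phase = rng.uniform(0, 2 * np.pi, size=n)
    scatter = rng.normal(0, np.sqrt(b0), size=n) + 1j * rng.normal(0, np.sqrt(b0), size=n)
    return np.abs(los_amp * np.exp(1j * los_phase) + scatter) ** 2

def mrc_outage(branches=3, threshold=1.0, trials=200_000):
    """Outage probability of the MRC-combined power (sum over the branches)."""
    total = sum(shadowed_rice_power(trials) for _ in range(branches))
    return np.mean(total < threshold)

print(mrc_outage())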

Journal ArticleDOI
TL;DR: The results of the experiment suggest that pair programming slows down the task yet improves quality; when compared with those of a previous exploratory experiment involving students, the outcomes exhibit very similar trends.

Proceedings ArticleDOI
19 Jun 2007
TL;DR: The paper presents the onQoS ontology, an openly available OWL ontology for QoS, evaluates it in a QoS-aware matching environment, and shows that exploiting QoS knowledge significantly improves matching recall without deteriorating precision.
Abstract: The evolution of the Web towards a global computing environment is promoting new research efforts aimed at the formal characterization of Web Services QoS. Reasoning on QoS is a key to improving the matching process during the discovery of desired services and a step towards the transformation of applications into collections of loosely coupled services virtually connected by semantic similarities. The paper presents the onQoS ontology, an openly available OWL ontology for QoS, and evaluates it in a QoS-aware matching environment. The ontology can be used to express functions of QoS metrics useful to improve the recall tied to the matching of a template request with target Web Services. To this end, the ontology introduces the concept of derivation in the matching process. This gives the possibility of matching a QoS template with published Web Services by deriving different QoS parameters when a one-to-one matching fails. The proposed matching algorithm utilizes a reasoner that exploits the ontology to avoid apparent mismatches. An experimental evaluation shows that exploiting QoS knowledge significantly improves matching recall without deteriorating precision.

Journal ArticleDOI
B. P. Abbott, Richard J. Abbott, Rana X. Adhikari, Juri Agresti, +452 more authors (48 institutions)
TL;DR: In this article, the authors set an upper bound of 4.5×10^(−22) strain Hz^(−1/2) (root-sum-squared amplitude, 90% confidence) on the GW waveform strength in the detectable polarization state reaching the Hanford (WA) 4 km detector.
Abstract: We have searched for gravitational waves (GWs) associated with the SGR 1806−20 hyperflare of 27 December 2004. This event, originating from a Galactic neutron star, displayed exceptional energetics. Recent investigations of the x-ray light curve’s pulsating tail revealed the presence of quasiperiodic oscillations (QPOs) in the 30–2000 Hz frequency range, most of which coincides with the bandwidth of the LIGO detectors. These QPOs, with well-characterized frequencies, can plausibly be attributed to seismic modes of the neutron star which could emit GWs. Our search targeted potential quasimonochromatic GWs lasting for tens of seconds and emitted at the QPO frequencies. We have observed no candidate signals above a predetermined threshold, and our lowest upper limit was set by the 92.5 Hz QPO observed in the interval from 150 s to 260 s after the start of the flare. This bound corresponds to a (90% confidence) root-sum-squared amplitude h^(90%)_(rss-det) =4.5×10^(−22) strain Hz^(−1/2) on the GW waveform strength in the detectable polarization state reaching our Hanford (WA) 4 km detector. We illustrate the astrophysical significance of the result via an estimated characteristic energy in GW emission that we would expect to be able to detect. The above result corresponds to 7.7×10^(46) erg (=4.3×10^(−8) M_⊙c^2), which is of the same order as the total (isotropic) energy emitted in the electromagnetic spectrum. This result provides a means to probe the energy reservoir of the source with the best upper limit on the GW waveform strength published and represents the first broadband asteroseismology measurement using a GW detector.

Journal ArticleDOI
TL;DR: The Capacity Formulation of the Network Loading Problem is studied, introducing the new class of Tight Metric Inequalities, which completely characterize the convex hull of the integer feasible solutions of the problem.

Proceedings ArticleDOI
24 May 2007
TL;DR: The effectiveness of UML stereotypes for Web design in support of comprehension tasks is assessed; the different behaviors observed for users with different degrees of ability and experience suggest alternative comprehension strategies of (and tool support for) different categories of users.
Abstract: Proponents of design notations tailored for specific application domains or reference architectures, often available in the form of UML stereotypes, motivate them by improved understandability and modifiability. However, empirical studies that tested such claims report contradictory results, where the most intuitive notations are not always the best performing ones. This indicates the possible existence of relevant influencing factors other than the design notation itself. In this work we report the results of a family of three experiments performed at different locations and with different subjects, in which we assessed the effectiveness of UML stereotypes for Web design in support of comprehension tasks. Replications with different subjects allowed us to investigate whether subjects' ability and experience play any role in the comprehension of stereotyped diagrams. We observed different behaviors of users with different degrees of ability and experience, which suggests alternative comprehension strategies of (and tool support for) different categories of users.

Journal ArticleDOI
TL;DR: A modification of the Kuramoto model is proposed to account for the effective change in the coupling constant among the oscillators, as suggested by some experiments on Josephson junctions, laser arrays, and mechanical systems, where the active elements are turned on one by one.
Abstract: We propose a modification of the Kuramoto model to account for the effective change in the coupling constant among the oscillators, as suggested by some experiments on Josephson junctions, laser arrays, and mechanical systems, where the active elements are turned on one by one. The resulting model is analytically tractable and predicts that both first and second order phase transitions are possible, depending upon the value of a new parameter that tunes the coupling among the oscillators. Numerical simulations of the model are in accordance with the analytical estimates, and in qualitative agreement with the behavior of Josephson junctions coupled via a cavity.
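For context, the baseline dynamics being modified are those of the classic Kuramoto model, dθ_i/dt = ω_i + (K/N) Σ_j sin(θ_j − θ_i). The short Python sketch below integrates this standard model and reports the order parameter r; the paper's specific coupling modification is not reproduced, and all parameter values are illustrative.

# Numerical sketch of the classic (unmodified) Kuramoto model, mean-field form.
import numpy as np

rng = np.random.default_rng(1)

def kuramoto_order_parameter(n=500, coupling=1.5, dt=0.01, steps=5000):
    theta = rng.uniform(0, 2 * np.pi, n)          # initial phases
    omega = rng.standard_cauchy(n) * 0.5          # Lorentzian natural frequencies
    for _ in range(steps):
        mean_field = np.mean(np.exp(1j * theta))  # r * exp(i*psi)
        theta += dt * (omega + coupling * np.abs(mean_field)
                       * np.sin(np.angle(mean_field) - theta))
    return np.abs(np.mean(np.exp(1j * theta)))    # synchronization level r

for K in (0.5, 1.0, 1.5, 2.0):
    print(K, round(kuramoto_order_parameter(coupling=K), 3))

Sweeping the coupling K past its critical value shows the onset of synchronization (r growing from near zero), which is the kind of transition whose order the modified model changes.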

Journal ArticleDOI
TL;DR: In this paper, a Model Predictive Control (MPC) approach for combined braking and steering systems in autonomous vehicles is presented, where the control objective is to best follow a given path by controlling the front steering angle and the brakes at the four wheels independently, while fulfilling various physical and design constraints.

Proceedings ArticleDOI
03 Sep 2007
TL;DR: This paper uses a technique that identifies bug-introducing changes to train a model able to predict whether a new change introduces a bug; software changes are represented as elements of an n-dimensional vector space whose coordinates are terms extracted from source code snapshots.
Abstract: A version control system, such as CVS/SVN, can provide the history of software changes performed during the evolution of a software project. Among all the changes performed, some cause the introduction of bugs, which are often resolved later by other changes. In this paper we use a technique to identify bug-introducing changes in order to train a model that can be used to predict whether a new change introduces a bug. We represent software changes as elements of an n-dimensional vector space whose coordinates are terms extracted from source code snapshots. The evaluation of various learning algorithms on a set of open source projects looks very promising, in particular for KNN (the K-Nearest Neighbor algorithm), where a good tradeoff between precision and recall has been obtained.
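A minimal sketch of this classification setup is shown below (Python, scikit-learn); the toy changes, labels, and parameters are placeholders, and the paper's actual corpus, term extraction, and tuning are not reproduced.

# Illustrative sketch: represent changes as term vectors and classify them
# with K-Nearest Neighbors. The training data below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

# Each "document" is the source code text touched by a change (toy examples).
changes = [
    "if (ptr != null) ptr.close();",
    "return list.size();",
    "for (int i = 0; i <= n; i++) sum += a[i];",
    "log.info(\"request received\");",
]
introduced_bug = [0, 0, 1, 0]   # 1 = change later linked to a bug fix

vectorizer = TfidfVectorizer(token_pattern=r"\w+")
X = vectorizer.fit_transform(changes)

knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X, introduced_bug)

new_change = ["for (int j = 0; j <= m; j++) total += b[j];"]
print(knn.predict(vectorizer.transform(new_change)))   # likely [1]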

Proceedings ArticleDOI
26 Jun 2007
TL;DR: Researchers and practitioners working in the area of program comprehension are encouraged to join forces to design and carry out studies related to program comprehension, including observational studies, controlled experiments, case studies, surveys, and contests, and to develop standards for describing and carrying out such studies in a way that facilitates replication of data and aggregation of the results of related studies.
Abstract: The field of program comprehension is characterized by both the continuing development of new tools and techniques and the adaptation of existing techniques to address program comprehension needs for new software development and maintenance scenarios. The adoption of these techniques and tools in industry requires proper experimentation to assess the advantages and disadvantages of each technique or tool and to let the practitioners choose the most suitable approach for a specific problem. The objective of this working session is to encourage researchers and practitioners working in the area of program comprehension to join forces to design and carry out studies related to program comprehension, including observational studies, controlled experiments, case studies, surveys, and contests, and to develop standards for describing and carrying out such studies in a way that facilitates replication of data and aggregation of the results of related studies.

Proceedings ArticleDOI
22 Oct 2007
TL;DR: Evidence suggests that a large amount of information is lost, and it is conjectured that to benefit from CVS and problem reporting systems, more systematic issue classification and more reliable traceability mechanisms are needed.
Abstract: Information obtained by merging data extracted from problem reporting systems - such as Bugzilla - and versioning systems - such as the Concurrent Version System (CVS) - is widely used in quality assessment approaches. This paper attempts to shed some light on threats and difficulties faced when trying to integrate information extracted from the Mozilla CVS and bug repositories. Indeed, the heterogeneity of Mozilla bug reports, which often deal with non-defect issues and lack traceable information, may undermine the validity of quality assessment approaches relying on repository integration. In the reported Mozilla case study, we observed that available integration heuristics are unable to recover thousands of traceability links. Furthermore, Bugzilla classification mechanisms do not enforce a distinction between different kinds of maintenance activities. The obtained evidence suggests that a large amount of information is lost; we conjecture that to benefit from CVS and problem reporting systems, more systematic issue classification and more reliable traceability mechanisms are needed.
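To make the integration step concrete, the Python sketch below implements a typical linking heuristic of the kind such studies evaluate: scan commit messages for bug-ID patterns and keep only IDs that actually exist in the bug tracker. The regular expressions and data shown are illustrative, not the exact heuristics examined in the paper.

# Sketch of a commit-message-to-bug linking heuristic (illustrative patterns).
import re

BUG_ID_PATTERNS = [
    re.compile(r"bug[\s#:]*([0-9]{4,7})", re.IGNORECASE),
    re.compile(r"(?:fix(?:es|ed)?|patch)\s+(?:for\s+)?#?([0-9]{4,7})", re.IGNORECASE),
]

def link_commit_to_bugs(commit_message, known_bug_ids):
    """Return the set of plausible bug IDs referenced by a commit message."""
    candidates = set()
    for pattern in BUG_ID_PATTERNS:
        candidates.update(int(m) for m in pattern.findall(commit_message))
    return candidates & known_bug_ids        # keep only IDs present in the tracker

known = {367891, 401234}
print(link_commit_to_bugs("Fix for bug 367891: null check in nsHttpChannel", known))
print(link_commit_to_bugs("Update copyright headers", known))   # no link -> empty set

Commits whose messages carry no recognizable bug reference remain unlinked, which is precisely the source of the lost information discussed in the paper.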

Journal ArticleDOI
B. P. Abbott, Richard J. Abbott, Rana X. Adhikari, Juri Agresti, +466 more authors (48 institutions)
TL;DR: In this paper, the LIGO Livingston interferometer and the ALLEGRO resonant-bar detector were examined for cross correlations indicative of a stochastic gravitational-wave background in the frequency range 850-950 Hz, with most of the sensitivity arising between 905 and 925 Hz.
Abstract: Data from the LIGO Livingston interferometer and the ALLEGRO resonant-bar detector, taken during LIGO’s fourth science run, were examined for cross correlations indicative of a stochastic gravitational-wave background in the frequency range 850–950 Hz, with most of the sensitivity arising between 905 and 925 Hz. ALLEGRO was operated in three different orientations during the experiment to modulate the relative sign of gravitational-wave and environmental correlations. No statistically significant correlations were seen in any of the orientations, and the results were used to set a Bayesian 90% confidence level upper limit of Ω_(gw)(f)≤1.02, which corresponds to a gravitational-wave strain at 915 Hz of 1.5×10^(−23) Hz^(−1/2). In the traditional units of h^2_(100)Ω_(gw)(f), this is a limit of 0.53, 2 orders of magnitude better than the previous direct limit at these frequencies. The method was also validated with successful extraction of simulated signals injected in hardware and software.

Book ChapterDOI
01 Jan 2007
TL;DR: Service-oriented Architectures (SOA) introduce a major shift of perspective in software engineering: in contrast to components, services are used instead of being physically integrated, which leaves the user with no control over changes that can happen in the service itself.
Abstract: Service-oriented Architectures (SOA) introduce a major shift of perspective in software engineering: in contrast to components, services are used instead of being physically integrated. This leaves the user with no control over changes that can happen in the service itself. When the service evolves, the user may not be aware of the changes, and this can entail unexpected system failures.

Journal ArticleDOI
TL;DR: In this article, the authors analyze the effects produced by a suction/liquid heat exchanger installed in a refrigerating cycle, showing that its use can improve or degrade the system performance depending on the operating conditions.

Book ChapterDOI
16 Jul 2007
TL;DR: This paper presents a model-driven approach to the development of web applications based on the Ubiquitous Web Application (UWA) design framework, the Model-View-Controller (MVC) architectural pattern and the JavaServer Faces technology.
Abstract: This paper presents a model-driven approach to the development of web applications based on the Ubiquitous Web Application (UWA) design framework, the Model-View-Controller (MVC) architectural pattern and the JavaServer Faces technology. The approach combines a complete and robust methodology for the user-centered conceptual design of web applications with the MVC metaphor, which improves separation of business logic and data presentation. The proposed approach, by combining the advantages of Model-Driven Development (MDD) and user-centered design, produces Web applications which are of high quality from the user's point of view and easier to maintain and evolve.