
Showing papers in "International Journal of Services Sciences in 2008"


Journal ArticleDOI
TL;DR: The Analytic Hierarchy Process (AHP) is a theory of measurement through pairwise comparisons and relies on the judgements of experts to derive priority scales that measure intangibles in relative terms.
Abstract: Decisions involve many intangibles that need to be traded off. To do that, they have to be measured alongside tangibles whose measurements must also be evaluated as to how well they serve the objectives of the decision maker. The Analytic Hierarchy Process (AHP) is a theory of measurement through pairwise comparisons that relies on the judgements of experts to derive priority scales. It is these scales that measure intangibles in relative terms. The comparisons are made using a scale of absolute judgements that represents how much more one element dominates another with respect to a given attribute. The judgements may be inconsistent; how to measure inconsistency and improve the judgements, when possible, to obtain better consistency is a concern of the AHP. The derived priority scales are synthesised by multiplying them by the priority of their parent nodes and adding across all such nodes. An illustration is included.
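
As a concrete illustration of the mechanics described above (not taken from the paper itself), the following Python sketch derives a priority scale from a hypothetical 3x3 pairwise comparison matrix via the principal eigenvector, checks Saaty's consistency ratio, and synthesises global priorities by weighting invented local priorities by the criteria weights. All numbers are illustrative assumptions.

```python
# Minimal AHP sketch: hypothetical judgements, not the paper's example.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],   # pairwise judgements on Saaty's 1-9 scale
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                  # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                 # normalised priority scale

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)         # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]          # Saaty's random index
print("priorities:", w, "consistency ratio:", ci / ri)

# Synthesis: weight each alternative's local priority by its parent's priority.
local = np.array([[0.6, 0.3, 0.1],           # invented alternative priorities
                  [0.2, 0.5, 0.3],           # (rows: alternatives,
                  [0.2, 0.2, 0.6]])          #  columns: criteria)
print("global priorities:", local @ w)
```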

6,787 citations


Journal ArticleDOI
TL;DR: In this article, the authors argue that the leagile approach is not a universal solution in the fashion and textile business and that the lean approach is more adequate for some companies.
Abstract: In fashion and textile business, demand changes rapidly due to fashion trends and a volatile market situation. This demand is unpredictable and can vary or change completely in a short time, creating great difficulty for the supply chain. Creating a leagile (lean and agile) supply chain is one observed way for a fashion and textile retailing company to optimise its performance and remain competitive. One good example is the fashion retailer Zara, which has adopted the leagile approach and combined it with key success factors for fashion retailing. However, in this paper, we argue that the leagile approach is not a universal solution in the fashion and textile business. For some fashion and textile companies, the lean approach is more adequate. Case study findings and simulation results reveal that the lean and leagile approaches can coexist as different strategic alternatives: simulation results favour the leagile strategy, while a five-year profitability analysis shows the lean apparel retailer H&M to have higher profitability than Zara.

42 citations


Journal ArticleDOI
TL;DR: In this paper, a new inventory classification algorithm based on association rules is presented; using the Support-Confidence framework, the cross-selling effect is introduced to generate a new criterion that is then used to rank inventory items.
Abstract: Today, engineering science and information technologies have given us powerful computational tools to support production planning and inventory control. Historically, ABC classification has usually been used to aggregate inventory items because the number of inventory items is so large that it is not computationally feasible to set stock and service control guidelines for each individual item. A fundamental principle of ABC classification is to rank all inventory items with respect to a notion of profit based on historical transactions. The difficulty is that the profit of one item comes not only from its own sales but also from its influence on the sales of other items, that is, the 'cross-selling effect'. In this paper, a new inventory classification algorithm based on association rules is presented. Using the Support-Confidence framework, the cross-selling effect is incorporated to generate a new criterion that is then used to rank inventory items. A numerical example is used to explain the new algorithm, and empirical experiments are conducted to evaluate its effectiveness and utility.
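
The abstract does not give the exact ranking criterion, but the Support-Confidence mechanics it names can be sketched as follows. The toy transactions and the own-support-plus-confidence score are illustrative assumptions, not the paper's algorithm.

```python
# Support and confidence over a toy transaction set; the final score is
# a simplified stand-in for the paper's cross-selling-aware criterion.
from itertools import combinations
from collections import Counter

transactions = [{"A", "B"}, {"A", "C"}, {"A", "B", "C"}, {"B"}, {"A"}]
n = len(transactions)

support = Counter()
for t in transactions:
    for item in t:
        support[frozenset([item])] += 1
    for pair in combinations(sorted(t), 2):
        support[frozenset(pair)] += 1

def confidence(x, y):
    """Confidence of rule {x} -> {y}: P(y in basket | x in basket)."""
    return support[frozenset([x, y])] / support[frozenset([x])]

# Rank items by own support plus cross-selling pull on other items.
items = {i for t in transactions for i in t}
score = {i: support[frozenset([i])] / n
            + sum(confidence(i, j) for j in items if j != i)
         for i in items}
print(sorted(score.items(), key=lambda kv: -kv[1]))
```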

19 citations


Journal ArticleDOI
TL;DR: This paper compares Enterprise Service (ES) construction with the construction of conventional business applications in order to assess the (re)usability of business application construction techniques for ES construction.
Abstract: The ever-growing complexity of Information System (IS) landscapes is a transparency and simplification challenge in its own right. When business requirements frequently change or when technical innovations are frequently implemented, IS agility has to be aimed at in addition to transparency and simplicity. Service orientation claims to support agility. An integrated methodology for service construction is needed, one based on an appropriate Enterprise Architecture (EA) framework and reflecting the different life cycles on the various architectural layers. As a contribution towards a service construction methodology, this paper compares Enterprise Service (ES) construction with the construction of conventional business applications in order to assess the (re)usability of business application construction techniques for ES construction. Based on an analysis of traditional business application design techniques, hypotheses for the design of ESs are developed. Five case studies are presented and analysed which represent ES design practices in large companies with heterogeneous IS landscapes. On the one hand, there is evidence that ESs are designed along the same guidelines which proved useful for traditional application development. On the other hand, there are indicators that the specific properties of ESs lead to adjusted design guidelines.

16 citations


Journal ArticleDOI
TL;DR: In this paper, the authors employ alternate techniques to examine whether passage of the Sarbanes-Oxley (SOX) Act has had positive effects on the efficiency of public accounting firms.
Abstract: In this paper, we employ alternate techniques to examine whether passage of the Sarbanes-Oxley (SOX) Act has had positive effects on the efficiency of public accounting firms. These alternate techniques range from the non-parametric, 'frontier'-oriented method of Data Envelopment Analysis (DEA) to more traditional regression-based approaches using central tendency estimates. Using data from 58 of the 100 largest accounting firms in the USA, we find that efficiency increased at high levels of statistical significance, and that this result is consistent across all of the different methods, frontier and central-tendency alike, used in this paper. We also find that this result is not affected by the inclusion or exclusion of the Big 4 firms. All results are found to be robust as well as consistent.
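
For readers unfamiliar with DEA, here is a minimal sketch of the standard CCR (constant-returns) multiplier model, solved as one linear program per firm with scipy. The input/output figures are invented, and the paper's actual model specification may differ.

```python
# CCR multiplier model: for each firm o, choose non-negative output
# weights u and input weights v maximising u.y_o, with v.x_o = 1 and no
# firm scoring above 1 under the same weights.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs per firm (toy)
Y = np.array([[1.0], [1.0], [1.0]])                  # outputs per firm (toy)
n, m = X.shape
s = Y.shape[1]

for o in range(n):
    c = np.concatenate([-Y[o], np.zeros(m)])         # maximise u . y_o
    A_ub = np.hstack([Y, -X])                        # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]   # v . x_o = 1
    b_eq = [1.0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    print(f"firm {o}: efficiency = {-res.fun:.3f}")
```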

16 citations


Journal ArticleDOI
TL;DR: In this article, the benefits of investing in these asset classes are analysed by applying models that recognise higher-order moments or the whole return distribution, such as the power-utility, Omega and Score-value models.
Abstract: During the past few years, institutional interest in investments in hedge funds and Real Estate Investment Trusts (REITs) has grown considerably. In this paper, the benefits of investing in these asset classes are analysed by applying models that recognise higher-order moments or the whole return distribution, such as the power-utility, Omega and Score-value models. To obtain more general results than those available from historical data alone, we modelled the asset returns by Markov switching processes and conducted a Monte Carlo study. Within this design, we analysed the optimal allocations to hedge funds and REITs statically and with monthly reallocations, based on data from Asian markets. Our main findings are that in the static case the utility model and the Score model are dominant, whereas the mean-variance model appears to be the model of first choice in the dynamic case. In both settings, hedge funds are the dominant asset in the optimal portfolios. REITs are mainly used for diversification and are added at comparatively lower rates.
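
Two ingredients the abstract names, Markov switching return simulation and the Omega measure, can be sketched as below. All regime parameters are invented; the paper's calibration is not reproduced.

```python
# Simulate returns from a two-state Markov switching process, then
# compute the Omega ratio at threshold tau (gains above tau over losses
# below it). Parameters are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.95, 0.05],               # regime transition probabilities
              [0.10, 0.90]])
mu, sigma = [0.01, -0.02], [0.02, 0.06]   # per-regime mean / volatility

state, returns = 0, []
for _ in range(10_000):
    state = rng.choice(2, p=P[state])
    returns.append(rng.normal(mu[state], sigma[state]))
returns = np.array(returns)

def omega(r, tau=0.0):
    """Omega ratio: expected gains above tau over expected losses below."""
    return np.mean(np.maximum(r - tau, 0)) / np.mean(np.maximum(tau - r, 0))

print("Omega(0):", omega(returns))
```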

11 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a method for analysing a vendor's exposure to uncertainties in IT services by measuring the worst expected loss over a time horizon under normal market conditions at a given confidence level.
Abstract: We present a method for analysing a vendor's exposure to uncertainties in IT services. These uncertainties arise in project portfolios with projects whose value covaries with managerial decisions about firm strategy, technology standards and platforms to be used. Past research involving information systems (IS) decision-making under uncertainty has focused on real options methods, which inform a priori investment decisions. We explore a methodology from asset valuation theory in financial economics called value-at-risk (VaR), which measures the worst expected loss over a time horizon under normal market conditions at a given confidence level. We show how VaR analysis can help management leverage existing capabilities and risk exposures to inform the IT sourcing decision and ongoing IT services risk management. We explore information requirements, outcomes and limitations for VaR. We also present an evaluative framework to help a senior manager bring these concepts into use in a real-world organisation.
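
To make the VaR definition above concrete, here is a minimal historical-simulation sketch in Python. The simulated value changes are a stand-in for a real project portfolio's, and all figures are illustrative.

```python
# Historical value-at-risk: the loss not exceeded with the given
# confidence, read off the empirical distribution of value changes.
import numpy as np

rng = np.random.default_rng(1)
pnl = rng.normal(loc=0.0, scale=100_000, size=5_000)  # simulated daily P&L

def value_at_risk(pnl, confidence=0.95):
    """Worst expected loss not exceeded with the given confidence."""
    return -np.quantile(pnl, 1.0 - confidence)

print(f"95% one-day VaR: {value_at_risk(pnl):,.0f}")
```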

11 citations


Journal ArticleDOI
TL;DR: The goal is to analyse the predictive power of network intrusion classification models trained with data of varying quality, using two different classification algorithms to build predictive models capable of distinguishing between 'bad' TCP/IP connections, called intrusion attacks, and 'good' normal TCP/IP connections.
Abstract: In this paper, we present our research in applying statistical machine learning methods to network intrusion detection. With the advent of online distributed services, the issue of preventing network intrusion and other forms of information security failure is gaining prominence. In this work, we use two different classification algorithms (decision trees and a naive Bayes classifier) to build predictive models capable of distinguishing between 'bad' TCP/IP connections, called intrusion attacks, and 'good' normal TCP/IP connections. We investigate the effect of training the models using both clean and dirty data. The goal is to analyse the predictive power of network intrusion classification models trained with data of varying quality. The classifiers are contrasted with a clustering-based approach for comparison purposes.
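
A minimal scikit-learn sketch of the two classifiers the abstract names might look like this; the synthetic features stand in for the paper's connection records (which are not reproduced here), and the label-flipping step mimics training on 'dirty' data.

```python
# Decision tree vs. naive Bayes on synthetic stand-in data, trained on
# clean and artificially corrupted ('dirty') labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

# Synthetic 'connection records': features stand in for durations, byte
# counts, flag indicators, etc.; y = 0 (normal) or 1 (intrusion).
X, y = make_classification(n_samples=2_000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# 'Dirty' training labels: flip 10% at random.
rng = np.random.default_rng(0)
y_dirty = np.where(rng.random(len(y_tr)) < 0.10, 1 - y_tr, y_tr)

for labels, tag in ((y_tr, "clean"), (y_dirty, "dirty")):
    for model in (DecisionTreeClassifier(random_state=0), GaussianNB()):
        model.fit(X_tr, labels)
        print(f"{type(model).__name__} ({tag}): {model.score(X_te, y_te):.3f}")
```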

10 citations


Journal ArticleDOI
TL;DR: This paper elaborates on the content of web service contracts from a legal perspective, derives a set of legal requirements, and proposes an ontology-based representation of contract clauses as well as monitoring information, so that it can be automatically evaluated whether a service execution meets the requirements expressed in a contract.
Abstract: Service-oriented computing as a concept for providing interoperability and flexibility within heterogeneous environments has gained much attention within the last few years. Dynamically integrating external web services into enterprise applications requires automatic contracting between service requestors and providers and automatic contract monitoring. This paper suggests a semi-automatic approach since in the current legal environment full automation is not feasible. We elaborate on the content of web service contracts from a legal perspective and derive a set of legal requirements. Based on these requirements we propose an ontology-based representation of contract clauses as well as monitoring information. We can thus automatically evaluate whether a service execution meets the requirements expressed in a contract.
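
As a highly simplified stand-in for the ontology-based representation the paper proposes, contract clauses can be pictured as machine-checkable (property, operator, bound) triples evaluated against monitoring records. The clause names and bounds below are hypothetical, and a real implementation would use an ontology/RDF store rather than plain Python structures.

```python
# Toy contract-monitoring check: each clause constrains one monitored
# property; violations are flagged by comparing observations to bounds.
import operator

clauses = [("response_time_ms", operator.le, 500),   # hypothetical SLA bounds
           ("availability_pct", operator.ge, 99.5)]

monitoring = {"response_time_ms": 420, "availability_pct": 99.1}

for prop, op, bound in clauses:
    ok = op(monitoring[prop], bound)
    print(f"{prop}: {'met' if ok else 'VIOLATED'} "
          f"(observed {monitoring[prop]}, bound {bound})")
```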

8 citations


Journal ArticleDOI
TL;DR: The SOSR methodology is demonstrated by the modernisation of two different legacy systems and shows that this methodology can be used by software developers and system integrators to reengineer tightly coupled legacy information systems into loosely coupled, agile, service-oriented information systems.
Abstract: In this paper, a Service-Oriented Software Reengineering (SOSR) methodology is proposed for reengineering a legacy system into a service-oriented system. Although Service-Oriented Computing (SOC) enables a software developer to design loosely coupled software components and integrate them with other software systems, most components in a legacy system were not developed as services. The SOSR methodology is based on a set of best practices that are architecture-centric, service-oriented, role-specific and model-driven. The SOSR methodology is demonstrated by the modernisation of two different legacy systems: a Business-to-Business (B2B) system and a Business-to-Consumer (B2C) system. The resulting service-oriented systems, and the evaluation of the methodology in terms of non-functional system requirements such as interoperability, show that this methodology can be used by software developers and system integrators to reengineer tightly coupled legacy information systems into loosely coupled, agile, service-oriented information systems.

7 citations


Journal ArticleDOI
TL;DR: In this paper, the authors employed the modified Theory of Planned Behaviour (TPB) model as a conceptual framework to assess the effects of attitude, subjective norm and behavioural control on e-government service acceptance and to explain adopters' behavioural intention.
Abstract: This paper employed the modified Theory of Planned Behaviour (TPB) model as a conceptual framework to assess the effects of the three key beliefs of TPB (attitude, subjective norm and behavioural control) on e-government service acceptance and to explain adopters' behavioural intention. Structural Equation Modelling (SEM) was applied to analyse a set of empirical data, and the results show that attitude and behavioural control have significant effects on adopters' behavioural intention, consistent with past research on both IS products and e-commerce services. Although we found a significant indirect effect of subjective norm on intention from a competing model that includes interdependency terms between these three beliefs, the importance of subjective norm in the model requires further research to assess. This finding may partially clarify the important role of subjective norm in the modified TPB model in explaining e-government acceptance behaviour.

Journal ArticleDOI
TL;DR: This study, by adapting a capacitated lot sizing approach, demonstrates an interesting application of inventory models in a service area where time is the commodity being managed.
Abstract: Efficient allocation of e-mail processing time will have a major impact on an organisation's productivity. When a Knowledge Worker (KW) is processing an e-mail, should she process all the e-mail in her inbox, or allocate time for e-mail and other activities? The KW's time may have varying values in different periods depending on other activities required of her in each period. Not answering e-mails quickly or rescheduling daily tasks to answer e-mails may incur a cost. On the other hand, valuable time spent on e-mail also results in a cost. These two types of costs are analogous to backorder/holding and production costs in inventory models. The objective is to find an optimal e-mail time allocation policy that minimises the total cost of processing all e-mails in a timely manner. This study, by adapting a capacitated lot sizing approach, demonstrates an interesting application of inventory models in a service area where time is the commodity being managed.
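
The lot-sizing analogy can be made concrete with a small linear program: x_t is the number of e-mails processed in period t (bounded by capacity), B_t the backlog carried over (the 'backorder'), and the objective trades off the value of the KW's time against the delay cost. All demands, capacities and costs below are invented for illustration.

```python
# Capacitated lot-sizing sketch for e-mail time allocation: balance
# equations B_t = B_{t-1} + d_t - x_t, capacity bounds on x_t, and a
# zero-backlog requirement at the horizon.
import numpy as np
from scipy.optimize import linprog

d   = [30, 10, 50, 20]          # e-mails arriving per period
cap = [40, 40, 40, 40]          # processing capacity per period
c   = [1.0, 3.0, 1.0, 2.0]      # value of the KW's time per period
h   = 0.5                       # delay cost per e-mail per period
T = len(d)

# variables: x_0..x_{T-1}, then B_0..B_{T-1}
cost = np.concatenate([c, np.full(T, h)])
A_eq = np.zeros((T, 2 * T)); b_eq = np.array(d, float)
for t in range(T):
    A_eq[t, t] = 1.0                 # + x_t
    A_eq[t, T + t] = 1.0             # + B_t
    if t > 0:
        A_eq[t, T + t - 1] = -1.0    # - B_{t-1}
bounds = ([(0, cap[t]) for t in range(T)]
          + [(0, None)] * (T - 1) + [(0, 0)])   # all mail done by the end
res = linprog(cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
print("processing plan:", np.round(res.x[:T], 1), "total cost:", res.fun)
```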

Journal ArticleDOI
TL;DR: In this paper, the authors propose a methodology for service system design based on symbiosis concepts, and the four-category service systems are characterised by the benefit-exchange properties of symbiosis (among the interactions and relationships between providers and customers during value coproduction).
Abstract: This paper begins by presenting a service classification with two dimensions, continuity of coproduction and mutual adaptability, which define four categories of service systems that serve as a new reference architecture for service system design. The purpose of iDesign is to propose semi-automated value coproduction and systematic service innovation through system awareness of ecological symbiosis. Namely, this paper proposes a methodology for service system design based on symbiosis concepts, and the four categories of service systems are characterised by the benefit-exchange properties of symbiosis (among the interactions and relationships between providers and customers during value coproduction). Meanwhile, this paper outlines a service performance evaluation model (in consideration of service productivity and customer satisfaction) and subsequently specifies the goal performance criteria required for the four categories of service systems. Finally, three artwork design service systems are used to exemplify the ideas behind the methodology of symbiosis-based service system design.

Book ChapterDOI
TL;DR: In this paper, a historical account of the evolution of mathematics and risk management over the last 20 years is presented, focusing primarily on present credit market developments and giving an account of some new credit derivatives: collateralised fund obligations.
Abstract: In this paper we present a historical account of the evolution of mathematics and risk management over the last 20 years. In it, we will focus primarily on present credit market developments and we give an account of some new credit derivatives: collateralised fund obligations.

Journal ArticleDOI
TL;DR: In this article, the authors examine the historical behaviour of interest rate movements in seven major currencies (AUD, CAD, CHF, EUR, GBP, JPY and USD) and apply principal components analysis and hierarchical cluster analysis to illustrate, understand and model the past collective movements of yield curves.
Abstract: We examine the historical behaviour of interest rate movements in seven major currencies: AUD, CAD, CHF, EUR, GBP, JPY and USD. We apply principal components analysis and hierarchical cluster analysis to illustrate, understand and model the past collective movements of yield curves. We show that simple correlations are not able to capture the complex behaviour observed in the data set. In order to model risk factors that are intimately connected, we propose so-called archetypes of collective movements as building blocks. Thus, we start from collective movements that are coherent from a historical perspective. A set of risk factor forecasts is then generated by adapting an archetype rather than building single risk factor forecasts from scratch. This approach opens the door to integrated, coherent forecasts created from complex building blocks. The methods may be applied within scenario simulations, forecasting, filtering techniques and technical analysis.
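
A minimal sketch of the two techniques named, PCA on simulated yield curve moves followed by hierarchical clustering of the daily observations, is shown below. The simulated shift/twist data is an assumption standing in for the paper's seven-currency data set.

```python
# PCA typically recovers 'level' and 'slope' factors from curve moves;
# Ward clustering then groups similar collective movements.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
tenors = np.array([1, 2, 5, 10, 30])
# Simulated daily curve changes: parallel shift + slope (twist) + noise.
shift = rng.normal(0, 0.05, (500, 1))
twist = rng.normal(0, 0.02, (500, 1)) * (tenors - tenors.mean())
moves = shift + twist + rng.normal(0, 0.005, (500, 5))

pca = PCA(n_components=3).fit(moves)
print("explained variance:", pca.explained_variance_ratio_.round(3))

Z = linkage(moves, method="ward")            # hierarchical clustering
labels = fcluster(Z, t=4, criterion="maxclust")
print("cluster sizes:", np.bincount(labels)[1:])
```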

Journal ArticleDOI
TL;DR: The results of this study indicate that several socially-oriented theories may be relevant to the IT SLA negotiation process, and represent a starting point for the identification of context-specific IT SLAs negotiation support systems.
Abstract: Continuing exponential growth in the Information Technology (IT) outsourcing market implies a need to understand the negotiated Service Level Agreements (SLAs) that underlie the majority of those sourcing relationships. Knowledge of the negotiation processes that are associated with the development of IT SLAs is a necessary precondition for designing and developing Negotiation Support Systems (NSSs) intended to support those processes. To gain such knowledge, it is first necessary to identify theoretical perspectives that may be relevant to the IT SLA negotiation process, postulate reasonable propositions therefrom, and then evaluate those propositions in a practical, exploratory fashion. Accordingly, the current paper draws on socially-oriented perspectives to develop theory-driven propositions which are then evaluated in an experimental setting. The results of this study indicate that several socially-oriented theories may be relevant to the IT SLA negotiation process, and represent a starting point for the identification of context-specific IT SLA negotiation support systems.

Journal ArticleDOI
TL;DR: In this paper, the authors model a sailor's sequence of jobs as a supply chain and measure the value of assigning a sailor to a particular job early enough to encourage this investment while recognising the inputs of all relevant stakeholders.
Abstract: Numerous studies have attempted to provide technological solutions to the problem of sailor distribution and assignment, but they have failed to model the roles of several stakeholders. This paper aims to provide timely technical support and relevant information to the stakeholders. The approach is to model a sailor's sequence of jobs as a supply chain. The contribution of this effort is to introduce supply chain management as a way of thinking about the problem. Each command receives a sailor's services, but that command also invests in the sailor by providing training and experience. It is this investment activity that makes the sailor more valuable to commands later in the supply chain. An optimal system will encourage commands to invest in sailor attributes that are desired by other commands with jobs later in the supply chain. The model outlined in this paper measures the value of assigning a sailor to a particular job early enough to encourage this investment while recognising the inputs of all relevant stakeholders.
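
The supply chain model itself cannot be reconstructed from the abstract, but the final matching step can be pictured as a classic assignment problem in which the value matrix folds in the downstream training investment the paper describes. The sailors, jobs and values below are hypothetical.

```python
# Assignment sketch: maximise total value of sailor-to-job matches.
import numpy as np
from scipy.optimize import linear_sum_assignment

# value[i, j] = value of assigning sailor i to job j, including the
# future value of skills the job's command would invest in the sailor.
value = np.array([[9.0, 6.0, 3.0],
                  [7.0, 8.0, 2.0],
                  [4.0, 5.0, 8.0]])

rows, cols = linear_sum_assignment(value, maximize=True)
for i, j in zip(rows, cols):
    print(f"sailor {i} -> job {j} (value {value[i, j]})")
print("total value:", value[rows, cols].sum())
```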