
Showing papers in "Journal of the Operational Research Society in 2008"


Journal ArticleDOI
TL;DR: The meaning of conceptual modelling and the requirements of a conceptual model are discussed, and, owing to a paucity of advice on how to design a conceptual model, a conceptual modelling framework is proposed.
Abstract: Conceptual modelling is probably the most important aspect of a simulation study. It is also the most difficult and least understood. Over 40 years of simulation research and practice have provided only limited information on how to go about designing a simulation conceptual model. This paper, the first of two, discusses the meaning of conceptual modelling and the requirements of a conceptual model. Founded on existing literature, a definition of a conceptual model is provided. Four requirements of a conceptual model are described: validity, credibility, utility and feasibility. The need to develop the simplest model possible is also discussed. Owing to a paucity of advice on how to design a conceptual model, the need for a conceptual modelling framework is proposed. Built on the foundations laid in this paper, a conceptual modelling framework is described in the paper that follows.

442 citations


Journal ArticleDOI
TL;DR: Simulation results indicate that there is no evacuation strategy that can be considered as the best strategy across different road network structures, and the performance of the strategies depends on both road network structure and population density.
Abstract: This study investigates the effectiveness of simultaneous and staged evacuation strategies using agent-based simulation. In the simultaneous strategy, all residents are informed to evacuate simultaneously, whereas in the staged evacuation strategy, residents in different zones are organized to evacuate in an order based on different sequences of the zones within the affected area. This study uses an agent-based technique to model traffic flows at the level of individual vehicles and investigates the collective behaviours of evacuating vehicles. We conducted simulations using a microscopic simulation system called Paramics on three types of road network structures under different population densities. The three types of road network structures include a grid road structure, a ring road structure, and a real road structure from the City of San Marcos, Texas. Default rules in Paramics were used for trip generation, destination choice, and route choice. Simulation results indicate that (1) there is no evacuation strategy that can be considered as the best strategy across different road network structures, and the performance of the strategies depends on both road network structure and population density; (2) if the population density in the affected area is high and the underlying road network structure is a grid structure, then a staged evacuation strategy that alternates non-adjacent zones in the affected area is effective in reducing the overall evacuation time.

345 citations


Journal ArticleDOI
TL;DR: In this paper, the authors combine features of routing and scheduling problems and cooperative game theory to analyze the profit margins resulting from horizontal cooperation among freight carriers in order to balance their request portfolios.
Abstract: In modern transportation systems, the potential for further decreasing the costs of fulfilling customer requests is severely limited while market competition is constantly reducing revenues. However, increased competitiveness through cost reductions can be achieved if freight carriers cooperate in order to balance their request portfolios. Participation in such coalitions can benefit the entire coalition, as well as each participant individually, thus reinforcing the market position of the partners. The work presented in this paper uniquely combines features of routing and scheduling problems and of cooperative game theory. In the first part, the profit margins resulting from horizontal cooperation among freight carriers are analysed. It is assumed that the structure of customer requests corresponds to that of a pickup and delivery problem with time windows for each freight carrier. In the second part, the possibilities of sharing these profit margins fairly among the partners are discussed. The Shapley value can be used to determine a fair allocation. Numerical results for real-life and artificial instances are presented.

306 citations
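
As a rough illustration of the Shapley value allocation mentioned in the abstract above (the coalition profits below are invented; the paper derives them from the underlying pickup-and-delivery problems), each carrier's share is its marginal contribution averaged over all orders in which the coalition could form:

```python
from itertools import permutations
from math import factorial

def shapley_values(players, value):
    """Shapley value of each player in a characteristic-function game.

    players: sequence of player labels.
    value:   dict mapping a frozenset of players to that coalition's profit.
    """
    shapley = {p: 0.0 for p in players}
    # Average each player's marginal contribution over all join orders.
    for order in permutations(players):
        coalition = frozenset()
        for p in order:
            shapley[p] += value[coalition | {p}] - value[coalition]
            coalition = coalition | {p}
    n_orders = factorial(len(players))
    return {p: s / n_orders for p, s in shapley.items()}

# Hypothetical stand-alone and joint profit margins for three carriers A, B, C.
v = {frozenset(): 0.0,
     frozenset("A"): 10.0, frozenset("B"): 12.0, frozenset("C"): 8.0,
     frozenset("AB"): 30.0, frozenset("AC"): 24.0, frozenset("BC"): 26.0,
     frozenset("ABC"): 45.0}

print(shapley_values("ABC", v))  # shares sum to the grand-coalition profit of 45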


Journal ArticleDOI
TL;DR: It is argued that the unique contribution that OR can continue to make to forecasting is through developing models that link the effectiveness of new forecasting methods to the organizational context in which the models will be applied.
Abstract: From its foundation, operational research (OR) has made many substantial contributions to practical forecasting in organizations. Equally, researchers in other disciplines have influenced forecasting practice. Since the last survey articles in JORS, forecasting has developed as a discipline with its own journals. While the effect of this increased specialization has been a narrowing of the scope of OR's interest in forecasting, research from an OR perspective remains vigorous. OR has been more receptive than other disciplines to the specialist research published in the forecasting journals, capitalizing on some of their key findings. In this paper, we identify the particular topics of OR interest over the past 25 years. After a brief summary of the current research in forecasting methods, we examine those topic areas that have grabbed the attention of OR researchers: computationally intensive methods and applications in operations and marketing. Applications in operations have proved particularly important, including the management of inventories and the effects of sharing forecast information across the supply chain. The second area of application is marketing, including customer relationship management using data mining and computer-intensive methods. The paper concludes by arguing that the unique contribution that OR can continue to make to forecasting is through developing models that link the effectiveness of new forecasting methods to the organizational context in which the models will be applied. The benefits of examining the system rather than its separate components are likely to be substantial.

272 citations


Journal ArticleDOI
TL;DR: In this article, uncertainty and sensitivity analysis are used to assess the robustness of the final outcome and to analyse how much each source of uncertainty contributes to the output variance, using the Technology Achievement Index as an illustration.
Abstract: Composite indicators (CIs) are often used for benchmarking countries' performance, but they frequently stir controversies about the unavoidable subjectivity in their construction. Data Envelopment Analysis helps to overcome some key limitations, as it does not need any prior information on either the normalization of sub-indicators or on an agreed unique set of weights. Still, subjective decisions remain, and such modelling uncertainty propagates onto countries' CI scores and rankings. Uncertainty and sensitivity analysis are therefore needed to assess the robustness of the final outcome and to analyse how much each source of uncertainty contributes to the output variance. The current paper reports on these issues, using the Technology Achievement Index as an illustration.

248 citations
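
A minimal sketch of the kind of uncertainty analysis described above, assuming for illustration that the only uncertain modelling choice is the weight vector used to aggregate already normalized sub-indicators; Monte Carlo sampling of the weights shows how far each country's rank can move (the data and the single uncertainty source are invented, and the paper additionally apportions the output variance to each source):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized sub-indicator scores (rows: countries, cols: sub-indicators).
scores = np.array([[0.9, 0.4, 0.7],
                   [0.6, 0.8, 0.5],
                   [0.5, 0.6, 0.9],
                   [0.7, 0.7, 0.6]])
countries = ["A", "B", "C", "D"]

ranks = []
for _ in range(5000):
    # One source of uncertainty: random aggregation weights (Dirichlet => sum to 1).
    w = rng.dirichlet(np.ones(scores.shape[1]))
    ci = scores @ w                      # composite indicator score
    order = np.argsort(-ci)              # best first
    rank = np.empty_like(order)
    rank[order] = np.arange(1, len(countries) + 1)
    ranks.append(rank)

ranks = np.array(ranks)
for i, c in enumerate(countries):
    print(c, "median rank:", int(np.median(ranks[:, i])),
          "rank range:", (int(ranks[:, i].min()), int(ranks[:, i].max())))
```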


Journal ArticleDOI
TL;DR: This paper proposes that the zero sum gains DEA (ZSG-DEA) models look especially suitable for treating equilibrium models, where the sum of the quantities produced by all decision-making units can be set as the upper admissible bound.
Abstract: Data envelopment analysis (DEA) literature has proposed alternative models for performance assessment in the presence of undesirable outputs, such as pollutant emissions, where increased outputs imply reduced performance. However, the case where global equilibrium of outputs should be imposed has not yet been considered. We propose that the zero sum gains DEA (ZSG-DEA) models look especially suitable for treating equilibrium models, where the sum of the quantities produced by all decision-making units can be set as the upper admissible bound. This paper uses ZSG-DEA models to evaluate the carbon dioxide emission case study, which can be considered part of the Kyoto Protocol statement.

244 citations
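
For background only, the building block behind such models is the standard DEA efficiency score. The sketch below solves the classic input-oriented CCR envelopment LP with SciPy on made-up data; it is not the zero sum gains reformulation, which additionally redistributes the fixed total of the (undesirable) output across units:

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: columns are DMUs, rows are inputs / outputs.
X = np.array([[2.0, 3.0, 6.0, 4.0],   # input 1
              [5.0, 4.0, 8.0, 3.0]])  # input 2
Y = np.array([[4.0, 5.0, 7.0, 3.0]])  # single output

n = X.shape[1]

def ccr_efficiency(o):
    """Input-oriented CCR efficiency of DMU o:
       min theta  s.t.  X @ lam <= theta * X[:, o],  Y @ lam >= Y[:, o],  lam >= 0."""
    # Decision variables: [theta, lam_1, ..., lam_n]
    c = np.zeros(n + 1)
    c[0] = 1.0                                           # minimize theta
    A_in = np.hstack([-X[:, [o]], X])                    # X lam - theta * x_o <= 0
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])   # -Y lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([np.zeros(X.shape[0]), -Y[:, o]])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```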


Journal ArticleDOI
TL;DR: This paper investigates repetitive purchase decisions of perishable items in the face of uncertain demand (the newsvendor problem) and shows that in all cases both learning and convergence occur and are affected by the mean demand.
Abstract: This paper investigates repetitive purchase decisions of perishable items in the face of uncertain demand (the newsvendor problem). The experimental design includes high or low profit levels and uniform or normal demand distributions. The results show that in all cases both learning and convergence occur and are affected by: (1) the mean demand; (2) the order size that maximizes the expected profit; and (3) the demand level of the immediately preceding round. In all cases of the experimental design, the purchase order converges to a value between the mean demand and the quantity that maximizes the expected profit.

184 citations
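
The benchmark order quantity against which the experimental behaviour is compared is the classic newsvendor solution, the critical fractile of the demand distribution. A short sketch with assumed prices and a normal demand (the experiment's actual parameter values are not given here):

```python
from scipy.stats import norm

# Hypothetical parameters: unit cost, selling price, normally distributed demand.
cost, price = 3.0, 10.0
mean_demand, sd_demand = 100.0, 20.0

# Critical fractile: underage cost / (underage + overage cost).
cu = price - cost          # profit forgone per unit of unmet demand
co = cost                  # loss per unsold unit (no salvage value assumed)
critical_ratio = cu / (cu + co)

q_star = norm.ppf(critical_ratio, loc=mean_demand, scale=sd_demand)
print(f"critical ratio = {critical_ratio:.2f}, optimal order = {q_star:.1f}")
```

With these assumed numbers the critical ratio is 0.7, so the optimal order lies above the mean demand of 100; the experiments reported above find actual orders converging to a value between the mean and this optimum.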


Journal ArticleDOI
TL;DR: A framework for conceptual modelling consists of five iterative activities: understanding the problem situation, determining the modelling and general project objectives, identifying the model outputs, identifying the model inputs, and determining the model content.
Abstract: Following on from the definition of a conceptual model and its requirements laid out in a previous paper, a framework for conceptual modelling is described. The framework consists of five iterative activities: understanding the problem situation, determining the modelling and general project objectives, identifying the model outputs, identifying the model inputs, and determining the model content. The framework is demonstrated with a modelling application at a Ford Motor Company engine assembly plant. The paper concludes with a discussion on identifying data requirements from the conceptual model and on the assessment of the conceptual model.

179 citations


Journal ArticleDOI
TL;DR: The stock control implications of a theoretically coherent demand categorization scheme, originally developed for forecasting purposes only, are assessed by experimentation on an inventory system developed by a UK-based software manufacturer.
Abstract: Different stock keeping units (SKUs) are associated with different underlying demand structures, which in turn require different methods for forecasting and stock control. Consequently, there is a need to categorize SKUs and apply the most appropriate methods in each category. The way this task is performed has significant implications in terms of stock and customer satisfaction. Therefore, categorization rules constitute a vital element of intelligent inventory management systems. Very little work has been conducted in this area and, from the limited research to date, it is not clear how managers should classify demand patterns for forecasting and inventory management. A previous research project was concerned with the development of a theoretically coherent demand categorization scheme for forecasting only. In this paper, the stock control implications of such an approach are assessed by experimentation on an inventory system developed by a UK-based software manufacturer. The experimental database consists of the individual demand histories of almost 16 000 SKUs. The empirical results from this study demonstrate considerable scope for improving real-world systems.

168 citations
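
The abstract does not spell out the categorization scheme, but such rules are commonly expressed in terms of the average inter-demand interval (ADI) and the squared coefficient of variation of demand sizes (CV²). A hedged sketch of a rule of this type follows; the cut-off values 1.32 and 0.49 are the ones widely quoted in the intermittent-demand literature and are not necessarily those used in the paper:

```python
def categorize_demand(adi, cv2, adi_cut=1.32, cv2_cut=0.49):
    """Classify an SKU's demand pattern from its average inter-demand interval
    (ADI) and squared coefficient of variation of demand sizes (CV2).
    Cut-off values are assumptions taken from the intermittent-demand literature."""
    if adi <= adi_cut and cv2 <= cv2_cut:
        return "smooth"
    if adi <= adi_cut:
        return "erratic"
    if cv2 <= cv2_cut:
        return "intermittent"
    return "lumpy"

# Example SKUs: (ADI, CV^2) values are invented.
for sku, (adi, cv2) in {"widget": (1.1, 0.2), "spare-part": (2.5, 1.4)}.items():
    print(sku, "->", categorize_demand(adi, cv2))
```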


Journal ArticleDOI
TL;DR: This paper aims to show how Compromise Programming, linked with some results connecting this approach with classic utility optimization, can become a useful analytical tool for designing and assessing macroeconomic policies.
Abstract: This paper aims to show how Compromise Programming, linked with some results connecting this approach with classic utility optimization, can become a useful analytical tool for designing and assessing macroeconomic policies. The functioning of the method is illustrated through an application to the Spanish economy. In this way, starting from a Computable General Equilibrium Model, a frontier of growth–inflation combinations is determined. After that, several Pareto-efficient policies that represent compromises between economic growth and inflation rate are established and interpreted in economic terms.

152 citations
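
Compromise Programming selects, from an efficient frontier, the point closest in a weighted L_p metric to the ideal point at which every objective takes its best value. A toy sketch under assumed data, with a handful of growth–inflation combinations standing in for the frontier derived from the computable general equilibrium model:

```python
import numpy as np

# Hypothetical efficient frontier: (GDP growth %, inflation %) combinations.
frontier = np.array([[1.0, 1.5], [2.0, 2.2], [3.0, 3.5], [4.0, 5.5]])

growth, inflation = frontier[:, 0], frontier[:, 1]
ideal = np.array([growth.max(), inflation.min()])        # best value of each objective
anti_ideal = np.array([growth.min(), inflation.max()])   # worst value, for normalization

# Normalized distance from the ideal for each objective (0 = best, 1 = worst).
d = np.column_stack([(ideal[0] - growth) / (ideal[0] - anti_ideal[0]),
                     (inflation - ideal[1]) / (anti_ideal[1] - ideal[1])])

weights = np.array([0.5, 0.5])   # assumed preference weights
for p in (1, 2, np.inf):
    if np.isinf(p):
        dist = np.max(weights * d, axis=1)               # Chebyshev metric
    else:
        dist = np.sum((weights * d) ** p, axis=1) ** (1 / p)
    best = int(np.argmin(dist))
    print(f"p={p}: compromise point = {frontier[best]}")
```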


Journal ArticleDOI
TL;DR: The results of the study show that many of the SMEs are not aware of Six Sigma and do not have the resources to implement Six Sigma projects, and that Lean Sigma was not generally popular among SMEs.
Abstract: Approaches to business improvement have evolved and grown since the early 1900s, and today the process-focused, statistically driven Six Sigma methodology has been widely used by companies such as GE, Motorola, Honeywell, Bombardier, ABB, Sony, DuPont, American Express, Ford and many others to improve business performance and optimize bottom-line benefits. Although the Six Sigma business management strategy has been exploited by many world-class organizations, there is still little documented evidence of its implementation in small and medium-sized enterprises (SMEs). This paper reports the key findings of a Six Sigma pilot survey in UK manufacturing SMEs. The results of the study are based primarily on descriptive statistics. They show that many of the SMEs are not aware of Six Sigma and do not have the resources to implement Six Sigma projects. It was also found that Lean Sigma was not generally popular among SMEs. Management involvement and participation, and linking Six Sigma to customers and to business strategy, are the most critical factors for the successful deployment of Six Sigma in SMEs.

Journal ArticleDOI
TL;DR: This paper introduces newsvendor variants that account for demand uncertainty as well as the uncertainty surrounding the occurrence of an extreme event; the optimal inventory level is determined and compared to the classic newsvendor solution.
Abstract: Government agencies, not-for-profit organizations, and private corporations often assume leading roles in the delivery of supplies, equipment, and manpower to support initial response operations after a disaster strikes. These organizations are faced with challenging logistics decisions to ensure that the right supplies (including equipment and personnel) are in the right places, at the right times, and in the right quantities. Such logistics planning decisions are further complicated by the uncertainties associated with predicting whether or not a potential threat will materialize into an emergency situation. This paper introduces newsvendor variants that account for demand uncertainty as well as the uncertainty surrounding the occurrence of an extreme event. The optimal inventory level is determined and compared to the classic newsvendor solution and the difference is interpreted as the insurance premium associated with proactive disaster-relief planning. The insurance policy framework represents a practical approach for decision makers to quantify the risks and benefits associated with stocking decisions related to preparing for disaster relief efforts or supply chain disruptions.
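
One simple way to read "uncertainty surrounding the occurrence of an extreme event" is a two-point mixture: with probability q the event occurs and relief demand follows some distribution, otherwise demand is zero. Under that assumption, which is an illustration rather than the paper's exact formulation, the critical-fractile logic extends directly, and comparing the two order quantities gives the flavour of the insurance-premium interpretation mentioned above:

```python
from scipy.stats import norm

cu, co = 8.0, 2.0            # assumed underage / overage costs per unit
q_event = 0.3                # assumed probability that the extreme event occurs
mu, sigma = 1000.0, 250.0    # assumed relief demand distribution given the event

ratio = cu / (cu + co)

# Order if the event (and hence demand ~ Normal(mu, sigma)) were certain.
q_certain = norm.ppf(ratio, loc=mu, scale=sigma)

# Event-aware variant: demand is 0 with probability (1 - q_event), else ~ F.
# The mixed CDF at Q >= 0 is (1 - q_event) + q_event * F(Q), so the optimum
# solves F(Q*) = (ratio - (1 - q_event)) / q_event, with Q* = 0 if that is negative.
adj = (ratio - (1.0 - q_event)) / q_event
q_event_aware = norm.ppf(adj, loc=mu, scale=sigma) if adj > 0 else 0.0

print(f"order if the event were certain: {q_certain:.0f}")
print(f"order with event probability {q_event}: {q_event_aware:.0f}")
```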

Journal ArticleDOI
TL;DR: A tool for multi-criteria decision aid to be referred to as a Reasoning Map is proposed, motivated by a desire to provide an integrated approach to problem structuring and evaluation, and in particular, to make the transition between these two processes a natural and seamless progression.
Abstract: This paper proposes a tool for multi-criteria decision aid to be referred to as a Reasoning Map. It is motivated by a desire to provide an integrated approach to problem structuring and evaluation, and in particular, to make the transition between these two processes a natural and seamless progression. The approach has two phases. In the first one, the building of a Reasoning Map supports problem structuring, capturing a decision maker's reasoning as a network of means and ends concepts. In the second phase, this map is enhanced, employing a user-defined qualitative scale to measure both performances of decision options and strengths of influence for each means–end link. This latter phase supports the decision maker in evaluating the positive and negative impacts of an action through synthesis of the qualitative information. A case study, which investigates the use of the method in practice, is also presented.

Journal ArticleDOI
TL;DR: An algorithm based on tabu search is presented for the site-dependent multi-trip periodic vehicle routing problem, and computational results are reported on randomly generated test problems that are made publicly available.
Abstract: In this paper, we consider a periodic vehicle routing problem that includes, in addition to the classical constraints, the possibility of a vehicle doing more than one route per day, as long as the maximum daily operation time for the vehicle is not exceeded. In addition, some constraints relating to accessibility of the vehicles to the customers, in the sense that not every vehicle can visit every customer, must be observed. We refer to the problem we consider here as the site-dependent multi-trip periodic vehicle routing problem. An algorithm based on tabu search is presented for the problem and computational results presented on randomly generated test problems that are made publicly available. Our algorithm is also tested on a number of routing problems from the literature that constitute particular cases of the proposed problem. Specifically we consider the periodic vehicle routing problem; the site-dependent vehicle routing problem; the multi-trip vehicle routing problem; and the classical vehicle routing problem. Computational results for our tabu search algorithm on test problems taken from the literature for all of these problems are presented.
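
The abstract names tabu search as the core of the algorithm. Purely to show the mechanics (candidate moves, tabu tenure, aspiration), here is a heavily simplified skeleton applied to a single-route relaxation with swap moves; the real algorithm's neighbourhoods, visit-day patterns, accessibility constraints and route evaluation are far richer:

```python
import math

# Hypothetical customer coordinates; index 0 is the depot.
pts = [(0, 0), (2, 6), (5, 1), (7, 7), (1, 3), (6, 4), (3, 8)]

def tour_length(tour):
    """Length of the closed route depot -> customers -> depot."""
    full = [0] + tour + [0]
    return sum(math.dist(pts[a], pts[b]) for a, b in zip(full, full[1:]))

def tabu_search(iterations=200, tenure=7):
    current = list(range(1, len(pts)))      # initial visiting order
    best = current[:]
    tabu = {}                               # move -> iteration until which it is tabu
    for it in range(iterations):
        best_move, best_cand, best_cost = None, None, float("inf")
        # Neighbourhood: swap two customers in the visiting order.
        for i in range(len(current)):
            for j in range(i + 1, len(current)):
                cand = current[:]
                cand[i], cand[j] = cand[j], cand[i]
                cost = tour_length(cand)
                move = (min(cand[i], cand[j]), max(cand[i], cand[j]))
                # Skip tabu moves unless they improve the best solution (aspiration).
                if tabu.get(move, -1) >= it and cost >= tour_length(best):
                    continue
                if cost < best_cost:
                    best_move, best_cand, best_cost = move, cand, cost
        if best_cand is None:
            break
        current = best_cand
        tabu[best_move] = it + tenure
        if best_cost < tour_length(best):
            best = current[:]
    return best, tour_length(best)

print(tabu_search())
```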

Journal ArticleDOI
TL;DR: It is shown that solving an approximation to this robust formulation of the NDP can be done efficiently for a network with a single origin and destination per commodity and general uncertainty in transportation costs and demand that are independent of each other.
Abstract: In many applications, the network design problem (NDP) faces significant uncertainty in transportation costs and demand, as it can be difficult to estimate current (and future) values of these quantities. In this paper, we present a robust optimization-based formulation for the NDP under transportation cost and demand uncertainty. We show that solving an approximation to this robust formulation of the NDP can be done efficiently for a network with a single origin and destination per commodity and general uncertainty in transportation costs and demand that are independent of each other. For a network with path constraints, we propose an efficient column generation procedure to solve the linear programming relaxation. We also present computational results that show that the approximate robust solution found provides significant savings in the worst case while incurring only minor sub-optimality for specific instances of the uncertainty.

Journal ArticleDOI
TL;DR: The aim was to assess the usefulness of system dynamics (SD) in a healthcare context and to elicit proposals concerning ways of improving patient experience; since time restrictions excluded simulation modelling, a hybrid approach using stock/flow symbols from SD was created.
Abstract: Department of Health staff wished to use systems modelling to discuss acute patient flows with groups of NHS staff. The aim was to assess the usefulness of system dynamics (SD) in a healthcare context and to elicit proposals concerning ways of improving patient experience. Since time restrictions excluded simulation modelling, a hybrid approach using stock/flow symbols from SD was created. Initial interviews and hospital site visits generated a series of stock/flow maps. A ‘Conceptual Framework’ was then created to introduce the mapping symbols and to generate a series of questions about different patient paths and what might speed or slow patient flows. These materials formed the centre of three workshops for NHS staff. The participants were able to propose ideas for improving patient flows and the elicited data was subsequently employed to create a finalized suite of maps of a general acute hospital. The maps and ideas were communicated back to the Department of Health and subsequently assisted the work of the Modernization Agency.

Journal ArticleDOI
TL;DR: An alternative procedure proposed in this paper uses the directional distance function and the resulting Nerlove–Luenberger measure of super-efficiency, which generally leads to a complete ranking of the observations and is easily interpreted.
Abstract: In a recent paper published in this Journal, Lovell and Rouse (LR) proposed a modification of the standard data envelopment analysis (DEA) model that overcomes the infeasibility problem often encountered in computing super-efficiency. In the LR procedure, one appropriately scales up the observed input vector (or scales down the output vector) of the relevant super-efficient firm, thereby usually creating its inefficient surrogate. By contrast, Chen suggested a different procedure that replaces input–output bundles that are found to be inefficient in standard DEA by their efficient projections. An alternative procedure proposed in this paper uses the directional distance function and the resulting Nerlove–Luenberger measure of super-efficiency. Because the directional distance function combines, by definition, features of both an input-oriented and an output-oriented model, the resulting measure generally leads to a complete ranking of the observations and is easily interpreted. A dataset on international airlines is utilized in an illustrative empirical application.

Journal ArticleDOI
TL;DR: This paper investigates vehicle-routing problems in which the travel times are random variables, and deliveries are made subject to soft time-window constraints, and model the travel time using a shifted gamma distribution.
Abstract: This paper investigates vehicle-routing problems in which the travel times are random variables, and deliveries are made subject to soft time-window constraints. In particular, we model the travel time using a shifted gamma distribution.

Journal ArticleDOI
TL;DR: In this paper, an exact algorithm is proposed which decomposes the petrol station replenishment problem into a truck loading problem and a routing problem; the routing problem is handled using two different strategies, based either on a matching approach or on a column generation scheme.
Abstract: In the petrol station replenishment problem (PSRP), the aim is to deliver petroleum products to petrol stations by means of an unlimited heterogeneous fleet of compartmented tank trucks. The problem consists of jointly determining the quantities to deliver within a given interval, allocating products to tank truck compartments and designing delivery routes to stations. This article describes an exact algorithm which decomposes the PSRP into a truck loading problem and a routing problem. An algorithm which makes use of assignment, optimality tests and, possibly, a standard ILP algorithm is proposed to solve the loading problem. The routing problem is handled using two different strategies, based either on a matching approach or on a column generation scheme. This algorithm was extensively tested on randomly generated data and on a real-life case arising in Eastern Quebec.

Journal ArticleDOI
TL;DR: An optimization model is developed for spares provisioning under a multi-item, multi-echelon scenario and the objective is to maximize the profit to the supplier under a PBL contract.
Abstract: Performance-based logistics (PBL) is emerging as a preferred logistic support strategy within the public sector, especially the Department of Defence. Under a PBL strategy, the customer buys performance, such as operational availability, mission readiness and operational reliability, instead of contracting for a specified collection of resources defining the underlying support infrastructure. The literature on PBL is still in its infancy and additional research is required to optimize logistic resources such as spare parts, equipment, facilities, labour etc within a PBL context. In this paper, an optimization model is developed for spares provisioning under a multi-item, multi-echelon scenario. The objective of the optimization model is to maximize the profit to the supplier under a PBL contract.

Journal ArticleDOI
TL;DR: Improved model formulations for hub covering problems are proposed, including non-increasing quantity-dependent transport time functions for transport links, and computational results show that the new solution approaches for problems with quantity-independent transport times clearly outperform previous work on this topic.
Abstract: Improved model formulations for hub covering problems are proposed. We discuss multiple and single allocation problems, including non-increasing quantity-dependent transport time functions for transport links for the latter case. Computational results are presented, which show that the new solution approaches for problems with quantity-independent transport times clearly outperform previous work on this topic.

Journal ArticleDOI
TL;DR: The results reveal that although, in general, private schools obtain better academic results than public schools in absolute terms, this is not the consequence of comparatively more effective management but rather of having pupils with a more favourable background for the educational process.
Abstract: The purpose of this paper is to compare the efficiency of a set of Spanish public and private high schools using data envelopment analysis (hereafter DEA). In view of the usual difficulties of obtaining reliable budget figures on private schools, we have used a restrictive efficiency notion which focuses on the relation between the academic results obtained by each school and the socio-economic background and academic profile of its pupils. In this study, special emphasis is placed upon decomposing the overall inefficiencies of each school into managerial (due to individual performance) and programme (due to structural differences between management models) components. Our results reveal that although, in general, private schools obtain better academic results than public schools in absolute terms, this is not the consequence of comparatively more effective management but rather of having pupils with a more favourable background for the educational process.

Journal ArticleDOI
TL;DR: This paper analyses the single-machine makespan scheduling problem with two different aging effect models and provides a polynomial time algorithm to solve the problem.
Abstract: In this paper, we study a single-machine scheduling problem with the cyclic process of an aging effect. This phenomenon appears in many realistic production processes. Thus, it is important to consider the phenomenon in scheduling problems. We analyse the single-machine makespan scheduling problem with two different aging effect models and provide a polynomial time algorithm to solve the problem.
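
The abstract does not specify the two aging models, so as background only, the sketch below uses the simplest position-based aging model from this literature, in which a job placed in position r takes p·r^a time with a > 0, and checks by brute force that sequencing longer jobs earlier minimizes the makespan under it:

```python
from itertools import permutations

# Hypothetical normal processing times and an aging index a > 0: the actual
# processing time of a job placed in position r is p * r**a (one common
# position-based aging model; not necessarily either model in the paper).
p = [4.0, 2.0, 7.0, 3.0]
a = 0.3

def makespan(seq):
    return sum(p[j] * (r + 1) ** a for r, j in enumerate(seq))

best = min(permutations(range(len(p))), key=makespan)
print("best sequence:", best, "makespan:", round(makespan(best), 2))

# By the rearrangement argument, the longest jobs go into the earliest
# (smallest-multiplier) positions, so a simple sort reproduces the optimum.
lpt = sorted(range(len(p)), key=lambda j: -p[j])
print("LPT order:   ", tuple(lpt), "makespan:", round(makespan(lpt), 2))
```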

Journal ArticleDOI
TL;DR: A new metaheuristic is described for the VRPPC, which uses a perturbation procedure in the construction and improvement phases and also performs exchanges between the sets of customers served by the private fleet and the common carrier.
Abstract: The purpose of this article is to propose a perturbation metaheuristic for the vehicle routing problem with private fleet and common carrier (VRPPC). This problem consists of serving all customers in such a way that (1) each customer is served exactly once either by a private fleet vehicle or by a common carrier vehicle, (2) all routes associated with the private fleet start and end at the depot, (3) each private fleet vehicle performs only one route, (4) the total demand of any route does not exceed the capacity of the vehicle assigned to it, and (5) the total cost is minimized. This article describes a new metaheuristic for the VRPPC, which uses a perturbation procedure in the construction and improvement phases and also performs exchanges between the sets of customers served by the private fleet and the common carrier. Extensive computational results show the superiority of the proposed metaheuristic over previous methods.

Journal ArticleDOI
TL;DR: This paper describes a plot, analogous to the standard ROC, for displaying the performance trace of an algorithm as the relative costs of the two different kinds of misclassification—classing a fraudulent transaction as legitimate or vice versa—are varied.
Abstract: In predictive data mining, algorithms will be both optimized and compared using a measure of predictive performance. Different measures will yield different results, and it follows that it is crucial to match the measure to the true objectives. In this paper, we explore the desirable characteristics of measures for constructing and evaluating tools for mining plastic card data to detect fraud. We define two measures, one based on minimizing the overall cost to the card company, and the other based on minimizing the amount of fraud given the maximum number of investigations the card company can afford to make. We also describe a plot, analogous to the standard ROC, for displaying the performance trace of an algorithm as the relative costs of the two different kinds of misclassification—classing a fraudulent transaction as legitimate or vice versa—are varied.
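
A minimal sketch of the cost-based performance trace described above: for each assumed ratio between the cost of missing a fraud and the cost of a false alarm, choose the score threshold that minimizes total cost and record that cost. Scores and labels are simulated here, and the paper's second measure (fraud caught under an investigation budget) and its plot are more refined:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated fraud scores: higher score = more suspicious; 2% of transactions are fraud.
n = 20000
is_fraud = rng.random(n) < 0.02
scores = np.where(is_fraud, rng.normal(0.7, 0.15, n), rng.normal(0.3, 0.15, n))

def min_total_cost(cost_fraud_missed, cost_false_alarm):
    """Best achievable total cost over all score thresholds."""
    costs = []
    for t in np.linspace(0, 1, 201):
        flagged = scores >= t
        fn = np.sum(is_fraud & ~flagged)      # fraud passed as legitimate
        fp = np.sum(~is_fraud & flagged)      # legitimate flagged as fraud
        costs.append(fn * cost_fraud_missed + fp * cost_false_alarm)
    return min(costs)

# Performance trace: vary the relative cost of the two kinds of misclassification.
for ratio in (1, 5, 10, 50, 100):
    print(f"cost ratio {ratio:>3}: minimum total cost = {min_total_cost(ratio, 1):.0f}")
```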

Journal ArticleDOI
TL;DR: Experimental results show that these greedy heuristics are much more efficient and provide competitive results when compared to those of a multi-start generalized reduced gradient algorithm.
Abstract: The allocation of fresh produce to shelf space represents a new decision support research area which is motivated by the desire of many retailers to improve their service due to the increasing demand for fresh food. However, automated decision making for fresh produce allocation is challenging because of the very short lifetime of fresh products. This paper considers a recently proposed practical model for the problem which is motivated by our collaboration with Tesco. Moreover, the paper investigates heuristic and meta-heuristic approaches as alternatives for the generalized reduced gradient algorithm, which becomes inefficient when the problem size becomes larger. A simpler single-item inventory problem is firstly studied and solved by a polynomial time bounded procedure. Several dynamic greedy heuristics are then developed for the multi-item problem based on the procedure for the single-item inventory problem. Experimental results show that these greedy heuristics are much more efficient and provide competitive results when compared to those of a multi-start generalized reduced gradient algorithm. In order to further improve the solution, we investigated simulated annealing, a greedy randomized adaptive search procedure and three types of hyper-heuristics. Their performance is tested and compared on a set of problem instances which are made publicly available for the research community.
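
Among the improvement methods listed above is simulated annealing. A compact, self-contained sketch of how it might be applied to a toy version of the allocation problem, with an invented diminishing-returns revenue response to shelf facings and a fixed shelf capacity (the paper's model, neighbourhoods and acceptance schedule are more elaborate):

```python
import math, random

random.seed(0)

# Invented data: per-unit profit and a diminishing-returns demand response
# (square root of the number of facings) for four fresh items.
profit = [2.0, 1.5, 3.0, 1.0]
capacity = 12                          # total shelf facings available

def revenue(alloc):
    return sum(p * math.sqrt(f) for p, f in zip(profit, alloc))

def neighbour(alloc):
    """Move one facing from a randomly chosen item to another."""
    a = alloc[:]
    src = random.choice([i for i, f in enumerate(a) if f > 0])
    dst = random.randrange(len(a))
    a[src] -= 1
    a[dst] += 1
    return a

def simulated_annealing(start, iters=5000, temp=1.0, cooling=0.999):
    current, best = start[:], start[:]
    for _ in range(iters):
        cand = neighbour(current)
        delta = revenue(cand) - revenue(current)
        # Always accept improvements; accept worse moves with a probability
        # that shrinks as the temperature cools.
        if delta >= 0 or random.random() < math.exp(delta / temp):
            current = cand
            if revenue(current) > revenue(best):
                best = current[:]
        temp *= cooling
    return best

best = simulated_annealing([capacity // len(profit)] * len(profit))
print("facings per item:", best, "revenue:", round(revenue(best), 2))
```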

Journal ArticleDOI
TL;DR: A new hub location and network design formulation is presented that considers the fixed costs of establishing the hubs and the arcs in the network and the variable costs associated with the demands on the arcs, and a dual-based heuristic is developed that exploits the multi-commodity flow problem structure embedded in the formulation.
Abstract: Many air, less-than-truck load and intermodal transportation and telecommunication networks incorporate hubs in an effort to reduce total cost. These hubs function as make bulk/break bulk or consolidation/deconsolidation centres. In this paper, a new hub location and network design formulation is presented that considers the fixed costs of establishing the hubs and the arcs in the network, and the variable costs associated with the demands on the arcs. The problem is formulated as a mixed integer programming problem embedding a multi-commodity flow model. The formulation can be transformed into some previously modelled hub network design problems. We develop a dual-based heuristic that exploits the multi-commodity flow problem structure embedded in the formulation. The test results indicate that the heuristic is an effective way to solve this computationally complex problem.

Journal ArticleDOI
TL;DR: The results suggest that perceived usefulness, ease of use, security, convenience and responsiveness to service requests significantly explain the variation in customer interactions.
Abstract: This paper empirically explores the major considerations associated with Internet-enabled e-banking systems and systematically measures the determinants of customer interactions with e-banking services. The results suggest that perceived usefulness, ease of use, security, convenience and responsiveness to service requests significantly explain the variation in customer interactions. Exploratory factor analysis and reliability test indicate that these constructs are relevant and reliable. Confirmatory factor analysis confirms that they possess significant convergent and discriminatory validities. Both perceived usefulness and perceived ease of use have significant impacts on customer interactions with Internet e-banking services. Perceived security, responsiveness and convenience also represent the primary avenues influencing customer interactions. In particular, stringent security control is critical to Internet e-banking operations. Prompt responses to service requests can encourage customers to use Internet e-banking services. The findings have managerial implications for enhancing extant Internet e-banking operations and developing viable Internet e-banking services.

Journal ArticleDOI
TL;DR: The role of an individual forgetting context on customer capital in today's organization is identified through an empirical study of 229 sellers (front-line contact people) in the Spanish optometry industry.
Abstract: Customer capital is a result of interaction between an organization and its customers. Customers change their characteristics, including addresses, behaviour and preferences; but as customer requirements change, basic beliefs or processes, that is, things that individuals take for granted at an implicit and an explicit level of knowledge, must also change. This paper aims to identify the role of an individual forgetting context in customer capital in today's organization through an empirical study of 229 sellers (front-line contact people) in the Spanish optometry industry. Two structural equation models have been used, resulting in the conclusion that before obtaining an up-to-date memory, it is necessary to identify new ways of doing and interpreting things, which in turn result in a shift in relations that favour the customers.

Journal ArticleDOI
TL;DR: An approach to combine financial statement data using Data Envelopment Analysis to determine a relative financial strength (RFS) indicator that captures a firm's fundamental strength or competitiveness in comparison to all other firms in the industry/market segment is proposed.
Abstract: Fundamental analysis is an approach for evaluating a firm for its investment-worthiness whereby the firm's financial statements are subject to detailed investigation to predict future stock price p...