
Showing papers by "HEC Montréal" published in 2003


Journal Article
Line Dubé, Guy Paré
TL;DR: The level of methodological rigor in positivist IS case research conducted over the past decade has experienced modest progress with respect to some specific attributes, but the overall assessed rigor is somewhat equivocal and there are still significant areas for improvement.
Abstract: Case research has commanded respect in the information systems (IS) discipline for at least a decade. Notwithstanding the relevance and potential value of case studies, this methodological approach was once considered to be one of the least systematic. Toward the end of the 1980s, the issue of whether IS case research was rigorously conducted was first raised. Researchers from our field (e.g., Benbasat et al. 1987; Lee 1989) and from other disciplines (e.g., Eisenhardt 1989; Yin 1994) called for more rigor in case research and, through their recommendations, contributed to the advancement of the case study methodology. Considering these contributions, the present study seeks to determine the extent to which the field of IS has advanced in its operational use of the case study method. More precisely, it investigates the level of methodological rigor in positivist IS case research conducted over the past decade. To fulfill this objective, we identified and coded 183 case articles from seven major IS journals. Evaluation attributes or criteria considered in the present review focus on three main areas, namely, design issues, data collection, and data analysis. While the level of methodological rigor has experienced modest progress with respect to some specific attributes, the overall assessed rigor is somewhat equivocal and there are still significant areas for improvement. One of the keys is to include better documentation, particularly regarding issues related to the data collection and analysis processes.

1,472 citations


Journal ArticleDOI
TL;DR: A tabu search heuristic is described for the dial-a-ride problem in which users specify transportation requests between origins and destinations, and side constraints relate to vehicle capacity, route duration and the maximum ride time of any user.
Abstract: This article describes a tabu search heuristic for the dial-a-ride problem with the following characteristics. Users specify transportation requests between origins and destinations. They may provide a time window on their desired departure or arrival time. Transportation is supplied by a fleet of vehicles based at a common depot. The aim is to design a set of least cost vehicle routes capable of accommodating all requests. Side constraints relate to vehicle capacity, route duration and the maximum ride time of any user. Extensive computational results are reported on randomly generated and real-life data sets.
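To make the side constraints concrete, here is a rough, hypothetical sketch (not the authors' implementation) of the feasibility check that such a heuristic applies to a candidate route: a single forward pass verifies time windows, vehicle capacity, maximum route duration and each user's maximum ride time. The `Stop` structure and the toy route are invented for illustration.

```python
# Illustrative sketch (not the authors' code): checking the side constraints
# that a candidate dial-a-ride route must satisfy -- vehicle capacity, time
# windows, route duration, and each user's maximum ride time.
from dataclasses import dataclass

@dataclass
class Stop:
    request: int              # request id served at this stop
    is_pickup: bool           # True for pickup, False for drop-off
    travel_from_prev: float   # travel time from the previous stop
    service: float            # service (dwell) time at the stop
    tw_early: float           # earliest allowed start of service
    tw_late: float            # latest allowed start of service
    load_change: int          # +passengers at a pickup, -passengers at a drop-off

def route_is_feasible(stops, capacity, max_route_duration, max_ride_time):
    """Forward time sweep along one vehicle route (return to depot omitted for brevity)."""
    time, load = 0.0, 0
    pickup_departure = {}                 # departure time from each pickup
    for s in stops:
        time += s.travel_from_prev
        time = max(time, s.tw_early)      # wait if the vehicle arrives early
        if time > s.tw_late:              # time-window violation
            return False
        load += s.load_change
        if load > capacity:               # capacity violation
            return False
        if s.is_pickup:
            pickup_departure[s.request] = time + s.service
        elif time - pickup_departure[s.request] > max_ride_time:
            return False                  # ride-time violation
        time += s.service
    return time <= max_route_duration     # route-duration constraint

# Toy route: pick up and drop off a single request.
route = [Stop(1, True, 10, 2, 0, 30, 1), Stop(1, False, 15, 2, 0, 60, -1)]
print(route_is_feasible(route, capacity=3, max_route_duration=120, max_ride_time=40))
```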

558 citations


Posted Content
TL;DR: In this paper, it was shown that using as few as 40 pre-screened series often yields satisfactory or even better results than using all 147 series, and that weighting the data by their properties when constructing the factors also leads to improved forecasts.
Abstract: Factors estimated from large macroeconomic panels are being used in an increasing number of applications. However, little is known about how the size and the composition of the data affect the factor estimates. In this paper, we ask whether using more series to extract the factors can yield factors that are less useful for forecasting; the answer is yes. Such a problem tends to arise when the idiosyncratic errors are cross-correlated. It can also arise if forecasting power is provided by a factor that is dominant in a small dataset but is a dominated factor in a larger dataset. In a real-time forecasting exercise, we find that factors extracted from as few as 40 pre-screened series often yield satisfactory or even better results than using all 147 series. Weighting the data by their properties when constructing the factors also leads to improved forecasts. Our simulation analysis is unique in that special attention is paid to cross-correlated idiosyncratic errors, and we also allow the factors to have stronger loadings on some groups of series than on others. It thus allows us to better understand the properties of the principal components estimator in empirical applications.
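As a point of reference, the principal components estimator mentioned in the abstract can be sketched in a few lines. The snippet below is an illustration on simulated data, not the authors' code, and it omits the pre-screening and weighting schemes that drive the paper's results; the panel, the target series and the number of factors are all assumed.

```python
# Minimal sketch of the standard two-step procedure: extract factors from a
# standardized macro panel by principal components, then use them in a
# one-step-ahead forecasting regression.
import numpy as np

rng = np.random.default_rng(0)
T, N, r = 200, 60, 2                       # periods, series, number of factors
F_true = rng.normal(size=(T, r))           # latent factors
X = F_true @ rng.normal(size=(r, N)) + rng.normal(size=(T, N))  # simulated panel

Z = (X - X.mean(0)) / X.std(0)             # standardize each series
# Principal-components estimator: factors are the scaled left singular vectors.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
F_hat = np.sqrt(T) * U[:, :r]              # estimated factors (T x r)

# Forecasting regression: y_{t+1} = a + b'F_t + e_{t+1}.
y = X[:, 0]                                # pretend the first series is the target
A = np.column_stack([np.ones(T - 1), F_hat[:-1]])
coef, *_ = np.linalg.lstsq(A, y[1:], rcond=None)
forecast = np.r_[1.0, F_hat[-1]] @ coef
print("one-step-ahead forecast:", round(forecast, 3))
```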

520 citations


Journal ArticleDOI
TL;DR: The literature on real-time vehicle routing is still disorganized; this paper highlights some issues that have not received attention so far, with particular emphasis on parallel computing strategies.

332 citations


Journal ArticleDOI
TL;DR: The main features of the problem are described and classified, some modeling issues are discussed, and a summary of the most important algorithms is provided.
Abstract: The Dial-a-Ride Problem (DARP) consists of designing vehicle routes and schedules for n users who specify pick-up and drop-off requests between origins and destinations. The aim is to plan a set of m minimum cost vehicle routes capable of accommodating as many users as possible, under a set of constraints. The most common example arises in door-to-door transportation for elderly or disabled people. The purpose of this article is to review the scientific literature on the DARP. The main features of the problem are described and classified and some modeling issues are discussed. A summary of the most important algorithms is provided.

331 citations


Journal ArticleDOI
TL;DR: A scenario-based conceptualization of the IT outsourcing risk is proposed, wherein risk is defined as a quadruplet comprising a scenario, the likelihood of that scenario, its consequences and the risk mitigation mechanisms that can attenuate or help avoid the occurrence of a scenario.
Abstract: Many firms have adopted outsourcing in recent years as a means of governing their information technology (IT) operations. While outsourcing is associated with significant benefits, it can also be a risky endeavour. This paper proposes a scenario-based conceptualization of the IT outsourcing risk, wherein risk is defined as a quadruplet comprising a scenario, the likelihood of that scenario, its consequences and the risk mitigation mechanisms that can attenuate or help avoid the occurrence of a scenario. This definition draws on and extends a risk assessment framework that is widely used in engineering. The proposed conceptualization of risk is then applied to the specific context of IT outsourcing using previous research on IT outsourcing as well as transaction cost and agency theory as a point of departure.
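A minimal sketch of the quadruplet definition, with invented field names and an illustrative scenario (vendor lock-in is a commonly cited IT outsourcing risk, not an example taken from the paper):

```python
# Hypothetical data structure for the paper's quadruplet view of an IT
# outsourcing risk exposure; names and the example are for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class RiskScenario:
    scenario: str                 # the undesirable outcome
    likelihood: float             # assessed probability of the scenario
    consequences: str             # impact if the scenario materializes
    mitigation: List[str] = field(default_factory=list)  # mechanisms that attenuate or avoid it

lock_in = RiskScenario(
    scenario="Vendor lock-in after contract signing",
    likelihood=0.2,
    consequences="Escalating fees and loss of bargaining power",
    mitigation=["multi-vendor sourcing", "detailed reversibility clauses"],
)
print(lock_in)
```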

328 citations


Journal ArticleDOI
TL;DR: In this article, the authors empirically verify ability and integrity as antecedents of trust formation in virtual teams and find that effective team performance is independent of the formation of trust.
Abstract: Trust has been deemed to be critical in ensuring the efficient operation of virtual teams and organizations. This study empirically verifies ability and integrity as being antecedents of trust formation in virtual teams. However, effective team performance was found to be independent of the formation of trust. Further analysis suggests that information symmetry and good communication distinguish high performance teams from low performance teams.

294 citations


Journal ArticleDOI
TL;DR: This work proposes a formulation in which the various constraints of political districting problems are integrated into a single multicriteria function, which is then optimized by means of a tabu search and adaptive memory heuristic.

267 citations


Journal ArticleDOI
TL;DR: An epidemiological study on two large cohorts, namely users and non-users of cell phones, verifies whether an association exists between cell phone use and road crashes, with crashes involving injuries analyzed separately.

235 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed a scale for measuring store personality and assessed its psychometric properties, including the stability of the factorial structure of the 34-item store-personality scale and the reliability of each composite dimension.
Abstract: The objective of this research study was to develop a scale for measuring store personality and to assess its psychometric properties. A preliminary study showed that store personality comprised five dimensions, termed sophistication, solidity, genuineness, enthusiasm, and unpleasantness. A follow-up survey with 226 adult consumers confirmed the stability of the factorial structure of the 34-item store-personality scale as well as the reliability of each composite dimension. Some empirical evidence was gathered with respect to the scale's construct validity, because the proposed store-personality scale was shown to behave in a manner consistent with self-image congruence theory. Additional analyses revealed that a reduced scale including 20 items exhibited factorial stability and resulted in reliable measures of the five store-personality dimensions. Finally, some empirical support was obtained in favor of using the proposed scale across different retail settings. © 2003 Wiley Periodicals, Inc.
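For illustration only (this is not the authors' analysis), the reliability of a composite dimension is typically summarized by a coefficient such as Cronbach's alpha, which can be computed as follows on simulated item responses:

```python
# Cronbach's alpha for one composite dimension; data are simulated.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for a single dimension."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
latent = rng.normal(size=(226, 1))                 # one underlying trait, n = 226 as in the survey
items = latent + 0.8 * rng.normal(size=(226, 4))   # four noisy items measuring it
print(round(cronbach_alpha(items), 2))
```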

235 citations


Posted Content
TL;DR: The authors evaluate the usefulness of alternative univariate and multivariate estimates of the output gap for predicting inflation and conclude that the relative usefulness of real-time output gap estimates diminishes further when compared to simple bivariate forecasting models which use past inflation and output growth.
Abstract: A stable predictive relationship between inflation and the output gap, often referred to as a Phillips curve, provides the basis for countercyclical monetary policy in many models. In this paper, we evaluate the usefulness of alternative univariate and multivariate estimates of the output gap for predicting inflation. Many of the ex post output gap measures we examine appear to be quite useful for predicting inflation. However, forecasts using real-time estimates of the same measures do not perform nearly as well. The relative usefulness of real-time output gap estimates diminishes further when compared to simple bivariate forecasting models which use past inflation and output growth. Forecast performance also appears to be unstable over time, with models often performing differently over periods of high and low inflation. These results call into question the practical usefulness of the output gap concept for forecasting inflation.
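The comparison described in the abstract boils down to two small forecasting regressions. The sketch below, on simulated data with invented parameters, contrasts an output-gap (Phillips curve) specification with the bivariate alternative in past inflation and output growth; it is illustrative only and does not reproduce the paper's real-time data handling.

```python
# Two one-step-ahead inflation forecasts estimated by OLS on simulated data.
import numpy as np

rng = np.random.default_rng(2)
T = 160
gap = rng.normal(size=T)                          # stand-in output-gap estimate
growth = rng.normal(size=T)                       # output growth
infl = np.zeros(T)
for t in range(1, T):                             # toy inflation process
    infl[t] = 0.5 * infl[t - 1] + 0.3 * gap[t - 1] + rng.normal(scale=0.5)

def ols_forecast(y, regressors):
    """Regress y[1:] on a constant plus lagged regressors, then forecast the next period."""
    X = np.column_stack([np.ones(len(y) - 1)] + [r[:-1] for r in regressors])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    x_last = np.r_[1.0, [r[-1] for r in regressors]]
    return x_last @ beta

print("output-gap model:", round(ols_forecast(infl, [infl, gap]), 3))
print("bivariate model :", round(ols_forecast(infl, [infl, growth]), 3))
```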

Journal ArticleDOI
TL;DR: The authors argue that it is time to take strategy seriously in three senses: undertaking systematic research on the field itself; developing appropriate responses to recent failures in the field; and building more heedful interrelationships between actors within the field, particularly between business schools and practitioners.
Abstract: Strategy is a pervasive and consequential practice in most Western societies. We respond to strategy's importance by drawing an initial map of strategy as an organizational field that embraces not just firms, but consultancies, business schools, the state and financial institutions. Using the example of Enron, we show how the strategy field is prone to manipulations in which other actors in the field can easily become entrapped, with grave consequences. Given these consequences, we argue that it is time to take strategy seriously in three senses: undertaking systematic research on the field itself; developing appropriate responses to recent failures in the field; and building more heedful interrelationships between actors within the field, particularly between business schools and practitioners.

Journal ArticleDOI
01 Aug 2003 - Networks
TL;DR: In this article, a basic Variable Neighborhood Search and two Tabu Search heuristics are presented for the p-Center problem without the triangle inequality; the 1-interchange neighborhood can be used even more efficiently than for solving the p-Median problem.
Abstract: The p-Center problem consists of locating p facilities and assigning clients to them in order to minimize the maximum distance between a client and the facility to which he or she is allocated. In this paper, we present a basic Variable Neighborhood Search and two Tabu Search heuristics for the p-Center problem without the triangle inequality. Both proposed methods use the 1-interchange (or vertex substitution) neighborhood structure. We show how this neighborhood can be used even more efficiently than for solving the p-Median problem. Multistart 1-interchange, Variable Neighborhood Search, Tabu Search, and a few early heuristics are compared on small- and large-scale test problems from the literature. © 2003 Wiley Periodicals, Inc.
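For intuition, the sketch below (not the authors' implementation) shows the p-Center objective and a brute-force pass over the 1-interchange neighborhood; the paper's contribution includes evaluating this neighborhood far more efficiently than the naive enumeration used here. Distances, p and the starting solution are invented.

```python
# p-Center objective and one brute-force 1-interchange (vertex substitution) pass.
import numpy as np

def p_center_cost(dist, centers):
    """Maximum distance from any client to its closest open facility."""
    return dist[:, centers].min(axis=1).max()

def best_1_interchange(dist, centers):
    """Try swapping each open facility with each closed one; keep the best solution found."""
    n = dist.shape[1]
    best = (p_center_cost(dist, centers), list(centers))
    for out in centers:
        for inn in range(n):
            if inn in centers:
                continue
            cand = [inn if c == out else c for c in centers]
            cost = p_center_cost(dist, cand)
            if cost < best[0]:
                best = (cost, cand)
    return best

rng = np.random.default_rng(3)
pts = rng.random((30, 2))                         # clients double as candidate sites
dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
centers = [0, 1, 2]                               # arbitrary starting solution, p = 3
print("initial cost:", round(p_center_cost(dist, centers), 3))
print("after one 1-interchange pass:", round(best_1_interchange(dist, centers)[0], 3))
```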

Journal ArticleDOI
TL;DR: In this paper, the authors consider a channel of distribution with a single manufacturer M and a retailer R where M advertises in national media to build up the image for one of his brands.

Journal ArticleDOI
TL;DR: The authors investigated the role of ownership structure and investor protection in postprivatization corporate governance and found that the government relinquishes control over time, mainly to the benefit of local institutions and foreign investors.
Abstract: We investigate the role of ownership structure and investor protection in postprivatization corporate governance. We find that the government relinquishes control over time, mainly to the benefit of local institutions and foreign investors. We also show that private ownership tends to concentrate over time. In addition to firm-level variables, investor protection, political and social stability explain the cross-firm differences in ownership concentration. We find that the positive effect of ownership concentration on firm performance matters more in countries with weak investor protection and that private domestic ownership leads to higher performance.

Journal ArticleDOI
TL;DR: A districting study undertaken for the Côte-des-Neiges local community health clinic in Montreal is described; the problem is solved by means of a tabu search technique that iteratively moves a basic unit to an adjacent district or swaps two basic units between adjacent districts.
Abstract: This article describes a districting study undertaken for the Côte-des-Neiges local community health clinic in Montreal. A territory must be partitioned into six districts by suitably grouping territorial basic units. Five districting criteria must be respected: indivisibility of basic units, respect for borough boundaries, connectivity, visiting personnel mobility, and workload equilibrium. The last two criteria are combined into a single objective function and the problem is solved by means of a tabu search technique that iteratively moves a basic unit to an adjacent district or swaps two basic units between adjacent districts. The clinic management confirmed its satisfaction with the solution after a 2-year implementation period.
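As an illustration of the basic move (a toy example, not the study's data or code, and it ignores the connectivity and borough-boundary criteria), the tabu search repeatedly evaluates transfers of a basic unit between districts against a workload-balance measure:

```python
# Evaluate a single "move one basic unit" step against workload balance.
workload = {"u1": 14.0, "u2": 6.0, "u3": 9.0, "u4": 5.0, "u5": 7.0, "u6": 6.0}
districts = {"A": {"u1", "u2"}, "B": {"u3", "u4"}, "C": {"u5", "u6"}}

def imbalance(districts):
    """Spread between the heaviest and lightest district workloads."""
    loads = [sum(workload[u] for u in units) for units in districts.values()]
    return max(loads) - min(loads)

def move_unit(districts, unit, src, dst):
    """Return a copy of the partition with `unit` moved from `src` to `dst`."""
    new = {k: set(v) for k, v in districts.items()}
    new[src].remove(unit)
    new[dst].add(unit)
    return new

print("current imbalance:", imbalance(districts))
candidate = move_unit(districts, "u2", "A", "C")      # candidate move considered by the search
print("after moving u2 to C:", imbalance(candidate))
```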

Journal ArticleDOI
TL;DR: In this paper, the authors identify the critical factors affecting the development of purchasing groups and show that the importance and nature of these factors change depending on the purchasing group's phase of development, as illustrated by their application of these factors to the American healthcare sector.

Journal ArticleDOI
TL;DR: In this article, the authors investigate the amount and quality of information that would be voluntarily delivered to some stakeholder by a potential polluter and find that information may be hazier when the stakeholder is confident (or naive) a priori, the cost of analyzing the received reports increases little with their complexity, or a polluter's net expected payoff from undertaking an industrial activity that would turn out to be unsafe is small.

Journal ArticleDOI
TL;DR: In this article, the authors investigated the relationship between banks' capital, securitization and risk in the context of the rapid growth of off-balance-sheet activities in the Canadian financial sector.
Abstract: This paper is the first attempt to empirically investigate the relationship between banks' capital, securitization and risk in the context of the rapid growth of off-balance-sheet activities in the Canadian financial sector. The evidence over the 1988-1998 period indicates that a) securitization has negative effects on both Tier 1 and Total risk-based capital ratios, and b) there exists a positive statistical link between securitization and banks' risk. These results seem to accord with Kim and Santomero (1988), who concluded that banks might be induced to shift to more risky assets under the current capital requirements for credit risk.

Journal ArticleDOI
TL;DR: In this paper, the authors report the results of an experimental study in which four characteristics of premium-based sales promotions were manipulated in the context of a computer purchase: the attractiveness of the premium, the extent to which it fits the product category, the reception delay of the premium, and the mention of its value.
Abstract: This paper reports the results of an experimental study where four characteristics of premium-based sales promotions were manipulated in the context of a computer purchase: the attractiveness of the premium, the extent to which it fits the product category, the reception delay of the premium, and the mention of its value. The results show that these factors had interactive effects on consumer reactions. Thus, although the attractiveness of the premium generally had a positive impact on consumer appreciation of the promotional offer, a promotion including an unattractive premium was nevertheless positively evaluated if the premium was a good fit to the product category. Sales promotions including a premium that fits the product category well were less likely to be perceived as manipulative. However, if the product-premium fit was poor and the premium was not attractive, mentioning the value of the premium helped to reduce perceptions of manipulation intent. It is concluded that more research is needed on this managerially relevant topic in light of the complex dynamics that appear to underlie the relationships between the characteristics of premium-based promotions and consumer reactions.

Journal ArticleDOI
TL;DR: An unbiased forecast of the terminal value of a portfolio requires compounding its initial value at its arithmetic mean return for the length of the investment period; compounding at the arithmetic average historical return, however, results in an upwardly biased forecast.
Abstract: An unbiased forecast of the terminal value of a portfolio requires compounding of its initial value at its arithmetic mean return for the length of the investment period. Compounding at the arithmetic average historical return, however, results in an upwardly biased forecast. This bias does not necessarily disappear even if the sample average return is itself an unbiased estimator of the true mean, the average is computed from a long data series, and returns are generated according to a stable distribution. In contrast, forecasts obtained by compounding at the geometric average will generally be biased downward. The biases are empirically significant. For investment horizons of 40 years, the difference in forecasts of cumulative performance can easily exceed a factor of 2. And the percentage difference in forecasts grows with the investment horizon, as well as with the imprecision in the estimate of the mean return. For typical investment horizons, the proper compounding rate is in between the arithmetic and geometric means.
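The bias can be seen in a short Monte Carlo sketch (illustrative only; the return parameters, history length and horizon are assumed, and returns are drawn log-normally): compounding $1 at the sample arithmetic mean overshoots the true expected terminal value, while compounding at the geometric mean undershoots it, consistent with the proper rate lying between the two averages.

```python
# Monte Carlo illustration of the compounding bias described in the abstract.
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 0.08, 0.20          # assumed annual log-return mean and volatility
n_hist, horizon = 75, 40        # 75 years of history, 40-year investment horizon
trials = 20_000

true_terminal = np.exp(horizon * (mu + sigma**2 / 2))   # E[(1+R)^T] under log-normality
arith_fc, geom_fc = [], []
for _ in range(trials):
    gross = np.exp(rng.normal(mu, sigma, n_hist))            # simulated historical gross returns
    arith_fc.append(gross.mean() ** horizon)                 # compound at the arithmetic average
    geom_fc.append(np.exp(np.log(gross).mean() * horizon))   # compound at the geometric average

print("true expected terminal value :", round(true_terminal, 2))
print("mean arithmetic-mean forecast:", round(np.mean(arith_fc), 2))
print("mean geometric-mean forecast :", round(np.mean(geom_fc), 2))
```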

Journal ArticleDOI
Georges Dionne, M. Garand
TL;DR: In this article, the authors isolate the significant determinants that affect the decision of non-financial firms to hedge their risks and show that several factors related to maximizing the firm's value significantly affect their decision to hedge the price of gold.

Journal ArticleDOI
TL;DR: In this article, the authors suggest that diversity and debate may not be enough; a powerful CEO's emotional reactions, rooted in character, may short-circuit the presumed linkages between diversity, decision-making processes, and performance.
Abstract: Faced with confusing and sometimes contradictory research results linking team composition to performance, recent research on top management teams (TMTs) has begun to investigate hitherto unexplored variables that might influence the hypothesized relationships. Increasing attention is being paid to the nature and quality of TMT strategic decision-making processes, with scholars arguing that diversity per se will not affect performance outcomes unless that diversity is allowed to make itself felt through systematic debate. The findings presented here suggest that diversity and debate may not be enough; a powerful CEO's emotional reactions, rooted in character, may short-circuit the presumed linkages between diversity, decision-making processes, and performance. This has important theoretical and methodological implications for this research stream, helping to explain why existing large-sample research in this area has failed to produce consistent and robust results. Suggestions are made for ways to improve...

Posted Content
TL;DR: In this article, the authors investigate the impact of monetary policy on the evolution of the U.S. economic environment over the last 40 years and find that monetary policy has been more stabilizing in the recent past, both because of the way it has responded to shocks and because it has ruled out non-fundamental fluctuations.
Abstract: Recent research provides evidence of important changes in the U.S. economic environment over the last 40 years. This appears to be associated with an alteration of the monetary transmission mechanism. In this paper we investigate the implications for the evolution of monetary policy effectiveness. Using an identified VAR over the pre- and post-1980 periods, we first provide evidence of a reduction in the effect of monetary policy shocks in the latter period. We then present and estimate a fully specified model that replicates well the dynamic response of output, inflation, and the federal funds rate to monetary policy shocks in both periods. Using the estimated structural model, we perform counterfactual experiments to determine the source of the observed change in the monetary transmission mechanism, as well as in the economy's response to supply and demand shocks. The main finding is that monetary policy has been more stabilizing in the recent past, as a result both of the way it has responded to shocks and of its ruling out non-fundamental fluctuations.
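The empirical starting point, an identified VAR and the responses to a monetary policy shock, can be sketched as follows on simulated data. This is not the paper's specification: the variable ordering, lag length and data-generating process are assumed, and the recursive (Cholesky) identification is only one of several possibilities.

```python
# Estimate a small recursively identified VAR on simulated data and inspect the
# response of output to a federal-funds-rate shock.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(5)
T = 200
shocks = rng.normal(size=(T, 3))
data = np.zeros((T, 3))                       # columns: output, inflation, funds rate
A1 = np.array([[0.7, 0.0, -0.1],
               [0.1, 0.6,  0.0],
               [0.2, 0.3,  0.5]])
for t in range(1, T):                         # stable toy VAR(1) process
    data[t] = A1 @ data[t - 1] + shocks[t]

df = pd.DataFrame(data, columns=["output", "inflation", "ffr"])
res = VAR(df).fit(maxlags=4, ic="aic")
irf = res.irf(16)                             # impulse responses over 16 periods
# Orthogonalized (Cholesky) response of output to a one-s.d. funds-rate shock:
print(irf.orth_irfs[:, df.columns.get_loc("output"), df.columns.get_loc("ffr")][:5])
```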

01 Mar 2003
TL;DR: In this paper, the authors study the use of dual-optimal inequalities to accelerate and stabilize the whole convergence process of column generation and propose two methods for recovering primal feasibility and optimality, depending on the type of inequalities that are used.
Abstract: Column generation is one of the most successful approaches for solving large-scale linear programming problems. However, degeneracy difficulties and long-tail effects are known to occur as problems become larger. In recent years, several stabilization techniques of the dual variables have proven to be effective. We study the use of two types of dual-optimal inequalities to accelerate and stabilize the whole convergence process. Added to the dual formulation, these constraints are satisfied by all or a subset of the dual-optimal solutions. Therefore, the optimal objective function value of the augmented dual problem is identical to the original one. Adding constraints to the dual problem leads to adding columns to the primal problem, and feasibility of the solution may be lost. We propose two methods for recovering primal feasibility and optimality, depending on the type of inequalities that are used. Our computational experiments on the binary and the classical cutting-stock problems, and more specifically on the so-called triplet instances, show that the use of relevant dual information has a tremendous effect on the reduction of the number of column generation iterations.
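For context, a bare-bones column generation loop for the classical cutting-stock problem looks as follows. This sketch shows only the unstabilized baseline, not the dual-optimal inequalities studied in the paper, the instance data are invented, and it assumes a recent SciPy whose HiGHS-based linprog exposes dual values via ineqlin.marginals.

```python
# Column generation for the classical cutting-stock problem (no stabilization).
import numpy as np
from scipy.optimize import linprog

W = 100                                   # roll width
widths = np.array([45, 36, 31, 14])       # item widths
demand = np.array([97, 610, 395, 211])    # item demands

def price_out(duals):
    """Unbounded integer knapsack: most profitable cutting pattern for the given duals."""
    best_val = np.zeros(W + 1)
    choice = np.full(W + 1, -1)
    for cap in range(1, W + 1):
        for i, w in enumerate(widths):
            if w <= cap and best_val[cap - w] + duals[i] > best_val[cap]:
                best_val[cap] = best_val[cap - w] + duals[i]
                choice[cap] = i
    pattern, cap = np.zeros(len(widths), dtype=int), W
    while cap > 0 and choice[cap] >= 0:
        pattern[choice[cap]] += 1
        cap -= widths[choice[cap]]
    return pattern, best_val[W]

patterns = [np.eye(len(widths), dtype=int)[i] for i in range(len(widths))]  # trivial start
while True:
    A = np.column_stack(patterns)
    # Restricted master LP: minimize the number of rolls s.t. A x >= demand, x >= 0.
    res = linprog(c=np.ones(A.shape[1]), A_ub=-A, b_ub=-demand,
                  bounds=(0, None), method="highs")
    duals = -res.ineqlin.marginals        # dual prices of the demand constraints
    pattern, value = price_out(duals)
    if value <= 1 + 1e-9:                 # no column with negative reduced cost remains
        break
    patterns.append(pattern)

print("LP bound on rolls:", round(res.fun, 2), "using", len(patterns), "patterns")
```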

Journal ArticleDOI
TL;DR: This paper introduces a mathematical function, called the image function, which allows the calculation of the value of the logical parameter associated with a logical variable depending on the state of the system, and shows how all steady states can be derived as solutions of a system of steady-state equations.

Journal ArticleDOI
TL;DR: A noncooperative equilibrium of a differential game played with Markovian strategies is identified, along with the solution of a cooperative game in which the players make coordinated marketing decisions, and the question of whether the manufacturer can design an incentive strategy such that the retailers will stick to their parts of the agreed solution is addressed.

Journal ArticleDOI
L. M. Farrell
TL;DR: In this paper, the authors illustrate a potential agency risk problem, due to asset substitution, for suppliers of project-based financing; the failure of accounting standards to adapt to the explosion of new property rights, as in the case of the collapse of Enron Corp. in the United States, would tend to increase project agency risk.

Journal ArticleDOI
TL;DR: This article finds that low readability significantly reduces the effects of argument strength under both low and high involvement, indicating that readers' linguistic ability, rather than their motivation, is the dominant factor moderating the effects of readability on cognitive responses and attitudes.
Abstract: The present study focuses on testing rival hypotheses regarding the effects of advertising readability: Are the effects of readability on cognitive responses and attitudes moderated by the readers' motivation or by their linguistic ability? A two (low/high involvement) by two (strong/weak arguments) by two (low/high readability) factorial design was used to test the hypotheses. The findings support the hypothesis that readers' linguistic ability is the dominant influence factor, because low readability significantly reduces the effects of argument strength under both low and high involvement. Psycholinguistic theory provides explanation for the findings. The implications for advertising practice relate to consumers' levels of literacy. © 2003 Wiley Periodicals, Inc.

Journal ArticleDOI
TL;DR: In this paper, Duan et al. present a new Markov chain technology for pricing barrier options that readily handles time-varying volatility, moving barriers, and discrete monitoring; out-and-in options can be valued within their framework even when volatility follows a GARCH process and a discretely monitored time-varying barrier is present.
Abstract: Barrier options have become commonplace in the option market, and a variety of other financial contracts may also be thought of in terms of barrier options. But the existence of a price barrier can significantly complicate the option valuation problem when volatility is time-varying, or the barrier itself moves over time, or the barrier is only monitored at discrete intervals. In this article, Duan et al. present a new Markov chain technology for pricing barrier options that readily handles all of these problems. Out-and-in options can be valued within their framework even when volatility follows a GARCH process and a discretely monitored time-varying barrier is present.
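A stripped-down sketch of the Markov chain idea, under constant-volatility Black-Scholes dynamics rather than the GARCH process treated in the article; the grid size, parameters and the down-and-out payoff are assumed for illustration. The log-price is discretized, a one-step transition matrix is built, and the knock-out condition is applied at each discrete monitoring date during backward induction.

```python
# Markov chain valuation of a discretely monitored down-and-out call (toy version).
import numpy as np
from scipy.stats import norm

S0, K, H = 100.0, 100.0, 90.0          # spot, strike, down-and-out barrier
r, sigma, T, n_steps = 0.05, 0.2, 1.0, 50
dt = T / n_steps

# Log-price grid wide enough to cover the terminal distribution.
m = 401
x = np.linspace(np.log(S0) - 5 * sigma * np.sqrt(T), np.log(S0) + 5 * sigma * np.sqrt(T), m)
h = x[1] - x[0]
S = np.exp(x)

# One-step transition matrix: normal log-price increment with drift (r - sigma^2/2) dt.
drift = (r - 0.5 * sigma**2) * dt
sd = sigma * np.sqrt(dt)
edges = np.r_[-np.inf, x[:-1] + h / 2, np.inf]            # bin edges around grid points
P = norm.cdf((edges[None, 1:] - (x[:, None] + drift)) / sd) \
    - norm.cdf((edges[None, :-1] - (x[:, None] + drift)) / sd)

alive = S > H                                             # states not knocked out
V = np.where(alive, np.maximum(S - K, 0.0), 0.0)          # value at maturity
for _ in range(n_steps):                                  # backward induction, monitoring every step
    V = np.exp(-r * dt) * (P @ V)
    V = np.where(alive, V, 0.0)

price = np.interp(np.log(S0), x, V)
print("down-and-out call (Markov chain):", round(price, 3))
```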