
Showing papers in "Decision Sciences in 2016"


Journal ArticleDOI
TL;DR: The conditions under which increased remanufacturing due to take-back legislation causes an increase in total environmental impact are characterized, and the impact of legislation on consumer surplus and manufacturer profits is modeled to identify when total welfare goes down because of legislation.
Abstract: In the last two decades, many countries have enacted product take-back legislation that holds manufacturers responsible for the collection and environmentally sound treatment of end-of-use products. In an industry regulated by such legislation, we consider a manufacturer that also sells remanufactured products under its brand name. Using a stylized model, we consider three levels of legislation: no take-back legislation, legislation with collection targets, and legislation with collection and reuse targets. We characterize the optimal solution for the manufacturer and analyze how various levels of legislation affect manufacturing, remanufacturing, and collection decisions. First, we explore whether legislation with only collection targets causes an increase in remanufacturing levels, which is argued to be an environmentally friendlier option for end-of-use treatment than other options such as recycling. While increased remanufacturing alone is usually perceived as a favorable environmental outcome, if one considers the overall environmental impact of new and remanufactured products, this might not be the case. To study this issue, we model the environmental impact of the product following a life cycle analysis–based approach. We characterize the conditions under which increased remanufacturing due to take-back legislation causes an increase in total environmental impact. Finally, we model the impact of legislation on consumer surplus and manufacturer profits and identify when total welfare goes down because of legislation.

127 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the strategic impact of the agency model by examining a digital goods supply chain with one supplier and two competing retailers and find that the agency model can coordinate the competing retailers by dividing the coordinated profits into a prenegotiated revenue-sharing proportion.
Abstract: While digital goods industries such as entertainment, software, and publishing are growing at a rapid pace, traditional supply chain contract models have failed to evolve with the new digital economy. To illustrate, the agency model utilized by the e-book publishing industry has recently received much negative attention brought by the U.S. Department of Justice's lawsuit against Apple, Inc. The emerging agency model in the e-book industry works as follows: the publisher sets the price of the digital goods and the retailers who serve as agents retain a percentage of the revenue associated with a consumer purchase. The regulators claim that the agency model is hurting this industry as well as consumers' welfare because e-book prices have increased after the introduction of the agency model. We investigate the strategic impact of the agency model by examining a digital goods supply chain with one supplier and two competing retailers. In comparison to the benchmark wholesale model, we find that the agency model can coordinate the competing retailers by dividing the coordinated profits into a prenegotiated revenue-sharing proportion. Further, we also identify the Pareto-improving region whereby both the supplier and the retailers prefer the agency model to the wholesale model. Our main qualitative insight regarding the agency model still holds even when we consider the presence of printed books in the marketplace. Thus, contrary to current press reports presaging the negative impact of the agency model on the e-book industry, we find the agency model to be superior to traditional wholesale contracts for publishers, retailers, and consumers in this digital goods industry.

101 citations


Journal ArticleDOI
TL;DR: This article uses reliability indices and develops analytical formulations that model the impact of the upstream supply chain on individual entities’ reliability in order to quantify the total reliability of a network.
Abstract: Risk management in supply chains has been receiving increased attention in the past few years. In this article, we present formulations for the strategic supply chain network design problem with dual objectives, which usually conflict with each other: minimizing cost and maximizing reliability. Quantifying the total reliability of a network design is not as straightforward as total cost calculation. We use reliability indices and develop analytical formulations that model the impact of the upstream supply chain on individual entities’ reliability to quantify the total reliability of a network. The resulting multiobjective nonlinear model is solved using a novel hybrid algorithm that utilizes a genetic algorithm for network design and linear programming for network flow optimization. We demonstrate the application of our approach through illustrative examples in establishing trade-offs between cost and reliability in network design and present managerial implications.
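
As an illustration of the hybrid idea, the sketch below uses made-up data and a deliberately crude reliability proxy (the article's reliability indices and formulations are not reproduced): a genetic algorithm searches over which facilities to open, and a linear program optimizes flows for each candidate design.

```python
# Hybrid GA + LP sketch for a two-echelon network design (illustrative only).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n_sites, n_markets = 5, 4
fixed_cost = rng.uniform(50, 100, n_sites)     # cost of opening each site
site_rel = rng.uniform(0.90, 0.99, n_sites)    # stand-alone reliability index
ship_cost = rng.uniform(1, 5, (n_sites, n_markets))
capacity = np.full(n_sites, 60.0)
demand = np.full(n_markets, 40.0)

def evaluate(open_mask, w_cost=1.0, w_rel=200.0):
    """Scalarized fitness: total cost minus a weighted reliability bonus."""
    idx = np.flatnonzero(open_mask)
    if idx.size == 0 or capacity[idx].sum() < demand.sum():
        return 1e9                              # infeasible design
    c = ship_cost[idx].ravel()                  # flow vars x[k, j], k over open sites
    A_eq = np.zeros((n_markets, idx.size * n_markets))
    for j in range(n_markets):
        A_eq[j, j::n_markets] = 1.0             # meet each market's demand
    A_ub = np.zeros((idx.size, idx.size * n_markets))
    for k in range(idx.size):
        A_ub[k, k * n_markets:(k + 1) * n_markets] = 1.0   # respect capacity
    res = linprog(c, A_ub=A_ub, b_ub=capacity[idx],
                  A_eq=A_eq, b_eq=demand, bounds=(0, None))
    rel = site_rel[idx].mean()                  # crude network reliability proxy
    return w_cost * (res.fun + fixed_cost[idx].sum()) - w_rel * rel

def ga(pop=30, gens=40, pm=0.1):
    P = rng.integers(0, 2, (pop, n_sites))
    for _ in range(gens):
        P = P[np.argsort([evaluate(ind) for ind in P])]    # rank by fitness
        children = []
        while len(children) < pop // 2:
            a, b = P[rng.integers(0, pop // 2, 2)]         # parents from elite half
            cut = rng.integers(1, n_sites)
            child = np.concatenate([a[:cut], b[cut:]])     # one-point crossover
            child[rng.random(n_sites) < pm] ^= 1           # bit-flip mutation
            children.append(child)
        P = np.vstack([P[:pop - len(children)], children])
    fits = [evaluate(ind) for ind in P]
    return P[int(np.argmin(fits))], min(fits)

best, score = ga()
print("open sites:", np.flatnonzero(best), " weighted objective:", round(score, 1))
```

Sweeping the weight on reliability would trace out the cost-reliability trade-off curve that the article explores.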

75 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a contrasting view: Competing OEMs without remanufacturing capacities sometimes benefit from the entry of third-party remanufacturers (TPRs), and they investigate how the number of TPRs affects the OEMs' profits.
Abstract: Researchers and managers broadly agree that the entry of third-party remanufacturers (TPRs) hurts original equipment manufacturers (OEMs) because of the cannibalization problem. Thus, OEMs should always try to deter the entry of TPRs. In this article, we present a contrasting view: Competing OEMs without remanufacturing capacities sometimes benefit from the entry of TPRs. The key feature of our model is that there exists a group of newness-conscious consumers in the market who do not buy the remanufactured product regardless of the price, whereas a group of functionality-oriented consumers (FOCs) may buy a remanufactured one at a low price. In a steady-state-period setting, we investigate how the number of TPRs affects the OEMs’ profits. We find that, from the perspective of two competing OEMs: (i) The entry of one or many TPRs may lead to a higher profit; (ii) The entry of many TPRs may be better than the entry of one TPR; and (iii) The impact of the entry of one or many TPRs may be reversed as FOCs’ willingness-to-pay for the remanufactured product increases.

71 citations


Journal ArticleDOI
TL;DR: The results indicate a negative relationship between environmental uncertainty and SCF, indicating that as the level of environmental uncertainty increases, it becomes increasingly challenging to match a product's supply and demand characteristics with its supply chain design characteristics.
Abstract: The main purpose of this research was to build on Fisher's (1997) seminal article. First, it expands on the Fisher framework by empirically establishing the link between the firm's environmental characteristics and the firm's level of supply chain fit (SCF). The results indicate a negative relationship between environmental uncertainty and SCF, indicating that as the level of environmental uncertainty increases, it becomes increasingly challenging to match a product's supply and demand characteristics with its supply chain design characteristics. Second, the key contribution of this research is the introduction of supply chain agility as a capability that helps mitigate the negative relationship between aspects of environmental uncertainty and SCF. Finally, this research contributes to theory development and managerial practice by exploring the complex relationship between SCF, supply chain agility, and financial performance. Developing high levels of supply chain agility and SCF requires deployment of resources by the focal firm. This manuscript provides a better understanding of how such expenditures can generate financial benefits for organizations.

69 citations


Journal ArticleDOI
TL;DR: It is proposed that future research should use alternative theories to incorporate overlooked aspects of decision-making, integrate different theories to account for the interdependencies between decisions, and adopt a portfolio perspective that considers each decision as part of an overall offshoring strategy.
Abstract: Mirroring the growing trend for firms to support their operations by locating activities abroad, research on the practice of offshoring has increased considerably in recent years. However, despite the mounting research, understanding of the key factors influencing decision-making for offshoring remains surprisingly limited due to fragmentation. In this study, we synthesize and integrate insights from different research domains in order to develop a comprehensive decisional framework for key offshoring decisions. The integrative decisional framework is based on a systematic review of offshoring research published in the most influential management and business journals in the past 25 years. In addition to providing a snapshot of the state of research on decision-making for offshoring, this study aims to stimulate future research by identifying promising research opportunities. In particular, we propose that future research should use alternative theories to incorporate overlooked aspects of decision-making, integrate different theories to account for the interdependencies between decisions, and adopt a portfolio perspective that considers each decision as part of an overall offshoring strategy.

59 citations


Journal ArticleDOI
TL;DR: Several measures of environmental efficiency are developed and compared, and it is concluded that emissions per revenue can serve as the best proxy metric for measuring overall environmental stewardship.
Abstract: Despite documented benefits of remanufacturing, many manufacturers have yet to embrace the idea of tapping into remanufactured-goods markets. In this article, we explore this dichotomy and analyze the effect of remanufacturable product design on market segmentation and product and trade-in prices by studying a two-stage profit-maximization problem in which a price-setting manufacturer can choose whether or not to open a remanufactured-goods market for its product. Our results suggest that it is optimal for a manufacturer to design a remanufacturable product when the value-added from remanufacturing is relatively high but product durability is relatively low and innovation is nominal. In addition, we find that entering a remanufactured-goods market in and of itself does not necessarily translate into environmental friendliness. On the one hand, the optimal trade-in program could result in low return and/or remanufacturing rates. On the other hand, a low price for remanufactured products could attract higher demand and thereby potentially result in more damage to the environment. Meanwhile, external restrictions imposed on total greenhouse gas emissions draw criticism in their own right because they risk stifling growth or reducing overall consumer welfare. Given these trade-offs, we therefore develop and compare several measures of environmental efficiency and conclude that emissions per revenue can serve as the best proxy for emissions as a metric for measuring overall environmental stewardship.
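
A toy calculation with hypothetical figures shows why the choice of measure matters: a portfolio can look worse on total emissions yet better on emissions per revenue, the metric the article ultimately recommends.

```python
# Hypothetical portfolios: total emissions and emissions per revenue can disagree.
portfolios = {
    "new products only":    {"revenue": 100_000, "emissions": 50_000},
    "new + remanufactured": {"revenue": 120_000, "emissions": 55_000},
}
for name, s in portfolios.items():
    print(f"{name:22s} total = {s['emissions']:6d}   "
          f"per revenue = {s['emissions'] / s['revenue']:.3f}")
```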

43 citations


Journal ArticleDOI
TL;DR: The analysis shows that capabilities in improvement, innovation, sensing weak signals, and responsiveness all help sustain high-quality performance, and suggests that what it takes to achieve high-quality performance differs, in part, from what it takes to sustain it.
Abstract: Many organizations that were once quality leaders have had challenges sustaining high-quality performance. Although research has examined frameworks and concepts that lead to high-quality performance, few studies examine how to sustain high-quality performance. Sustaining performance may require additional capabilities from what it takes to achieve it. Drawing on quality management literature, organizational resilience literature, and the theory of dynamic capabilities in the strategy literature, this study empirically investigates the effects of four capabilities that help sustain high-quality performance. The analysis shows that capabilities in improvement, innovation, sensing weak signals, and responsiveness all help sustain high-quality performance. This suggests that what it takes to achieve high-quality performance is different, in part, from what it takes to sustain it. The data comes from a survey of 147 manufacturing business units. The analysis shows that the relative benefits of these capabilities may depend on the level of competitive intensity and environmental uncertainty. The findings provide empirical support for a theoretical model and practical guidance for sustaining quality performance. [web URL: http://onlinelibrary.wiley.com/doi/10.1111/deci.12210/full]

43 citations


Journal ArticleDOI
TL;DR: The main determinant of the equilibrium in mature industries is responding well to the actions of the competing chain rather than directly maximizing each chain's profit, which means that the equilibrium does not necessarily maximize the profit of the entire industry.
Abstract: Our main objective is to investigate the influence of the bargaining power within a chain on its industry. As a building block, we first discuss the implications of bargaining within a single chain by considering an asymmetric Nash bargaining over the wholesale price (BW). We show that both Manufacturer Stackelberg (MS) and vertical integration (VI) strategies are special cases of the BW contract. We then develop the Nash equilibrium in an industry with two supply chains that use BW. We identify the profit-maximizing (coordinating) bargaining power within this industry. We show that when a chain is not monopolistic, VI does not coordinate the chain and that the MS contract, where the manufacturer has all the bargaining power, is coordinating when competition is intense. We find that the main determinant of the equilibrium in mature industries is to respond well to the actions of the competing chain rather than to directly maximize the profit of each chain. That is, the equilibrium does not necessarily maximize the profit of the entire industry. While a coordination of the industry could then increase the profitability of both chains, such a coordination is likely against antitrust law. Moreover, if one chain cannot change its actions, the other chain may unilaterally improve its profitability by deviating from the equilibrium. Our results lead to several predictions supported by empirical findings, such as that in competitive industries chains will work “close to” the MS contract.
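
The special cases noted in the abstract can be reproduced numerically. The sketch below assumes a simple linear-demand chain with zero production cost (not the article's model) and grid-searches the wholesale price that maximizes the asymmetric Nash product; bargaining power 0 recovers the coordinated (VI) outcome, and power 1 recovers MS with its double marginalization.

```python
# Asymmetric Nash bargaining over the wholesale price in one chain (toy model).
import numpy as np

def profits(w):
    p = (1 + w) / 2                          # retailer's best response, demand q = 1 - p
    return w * (1 - p), (p - w) * (1 - p)    # (manufacturer, retailer)

ws = np.linspace(1e-6, 1 - 1e-6, 10_000)
pm, pr = np.vectorize(profits)(ws)
for lam in (0.0, 0.5, 1.0):                  # manufacturer's bargaining power
    w_star = ws[np.argmax(lam * np.log(pm) + (1 - lam) * np.log(pr))]
    m, r = profits(w_star)
    print(f"power = {lam:.1f}: w* = {w_star:.3f}, chain profit = {m + r:.4f}")
```

Power 0 yields a wholesale price at cost and the coordinated chain profit (0.25 here); power 1 yields the MS wholesale price and the familiar double-marginalization loss.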

42 citations


Journal ArticleDOI
TL;DR: A comprehensive analysis of 31 priority rules (PRs) on 18,480 portfolios containing 55,440 iterative projects finds that the best PRs for iterative project portfolios differ significantly from those for acyclical ones, and that thebest PRs at the project level differ from those at the portfolio level.
Abstract: Managers of product development (PD) project portfolios face difficult decisions in allocating limited resources to minimize project or portfolio delay. Although PD projects are highly iterative (cyclical), almost all of the vast literature on project scheduling assumes that projects are acyclical. This article addresses this gap with a comprehensive analysis of 31 priority rules (PRs) on 18,480 portfolios containing 55,440 iterative projects. We find that the best PRs for iterative project portfolios differ significantly from those for acyclical ones, and that the best PRs at the project level differ from those at the portfolio level. The best PR depends on project and portfolio characteristics such as network density, iteration intensity, resource loading profile, and amount of resource contention. In particular, by amplifying the effects of iteration, high-density networks hold dramatically different implications for iterative projects. Moreover, the best PR also differs depending on whether the objective is to minimize the average delay to all projects or to minimize delay to the overall portfolio. Thus, a project or portfolio manager who uses the same PR on all occasions will exhibit unnecessarily poor performance in most cases.
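
A minimal sketch of how such rules are compared, on an acyclic toy portfolio with made-up tasks (the article's iterative projects and 31 rules are far richer): the same greedy list scheduler is run under two classic rules, and neither dominates on both the portfolio makespan and the average completion time, echoing the project-versus-portfolio distinction above.

```python
# Toy list scheduler comparing two priority rules on one shared resource.
import heapq

# task id -> (duration, resource need, successors)
tasks = {
    "A": (6, 1, []), "B": (2, 1, ["E"]), "C": (2, 1, []),
    "D": (4, 1, []), "E": (1, 1, []),
}
CAPACITY = 2

def schedule(priority):  # priority(task_id) -> sortable key; lower starts first
    preds = {t: sum(t in s for _, _, s in tasks.values()) for t in tasks}
    ready = [t for t in tasks if preds[t] == 0]
    running, clock, used, finish = [], 0, 0, {}
    while ready or running:
        for t in sorted(ready, key=priority):          # greedily start what fits
            dur, need, _ = tasks[t]
            if used + need <= CAPACITY:
                heapq.heappush(running, (clock + dur, t))
                used += need
                ready.remove(t)
        clock, t = heapq.heappop(running)              # advance to next completion
        used -= tasks[t][1]
        finish[t] = clock
        for s in tasks[t][2]:                          # release successors
            preds[s] -= 1
            if preds[s] == 0:
                ready.append(s)
    return max(finish.values()), sum(finish.values()) / len(finish)

for name, rule in [("SPT", lambda t: tasks[t][0]), ("LPT", lambda t: -tasks[t][0])]:
    mk, avg = schedule(rule)
    print(f"{name}: portfolio makespan = {mk}, mean completion time = {avg:.1f}")
```

On this instance SPT wins on mean completion time while LPT wins on makespan, so the "best" rule already depends on the objective, as the article finds at much larger scale.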

42 citations


Journal ArticleDOI
TL;DR: This study adopts an organizational dependence view to examine how three types of intergroup structures—administrative (formalization and centralization), task (task interdependence), and physical—influence project performance and buyer learning in NPD projects.
Abstract: Buying and supplying organizations rely on each other for developing better products in an efficient manner, which explains the popularity of involving suppliers in new product development (NPD). However, such involvement is not always successful, partially due to the challenges of structuring a buyer–supplier team to manage joint dependence and dependence asymmetry. This study adopts an organizational dependence view to examine how three types of intergroup structures—administrative (formalization and centralization), task (task interdependence), and physical (colocation)—influence project performance and buyer learning in NPD projects. Furthermore, adopting a contingency theory perspective, we study whether the national context moderates the effects of intergroup structures on project outcomes. We adopt a two-group structural equation modeling approach to test hypotheses with survey responses from a sample of NPD projects in the United States (US) and China. Results show different ways in which intergroup structures influence project performance and buyer learning in the two culturally, economically, and institutionally distinct countries. We discuss the implications of these new findings and present directions for future research.

Journal ArticleDOI
Marc Reimann
TL;DR: This article includes a new reactive capability, namely the utilization of refurbished consumer returns from early sales to react to demand later in the selling season, in a newsvendor-type model and provides both analytical and numerical insights into the optimal anticipative and reactive decisions.
Abstract: Centering around anticipative and reactive capabilities of firms, accurate response is an important supply-side strategy to deal with demand uncertainty. Clearly, the structure of the possible reaction will crucially influence the optimal anticipative decision making. In this article, we extend the existing literature in this area by including a new reactive capability, namely the utilization of refurbished consumer returns from early sales to react to demand later in the selling season. Because consumer returns depend on previous sales, there is also a direct link to the anticipative supply decision. We capture this effect in a newsvendor-type model and provide both analytical and numerical insights into the optimal anticipative and reactive decisions as well as the value of refurbishing in terms of the retailer's expected profitability.
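
A Monte Carlo sketch of the core trade-off, with invented parameters rather than the article's newsvendor formulation: returns from early sales are refunded either way, but refurbishing turns them into reactive supply for late-season demand, which shifts the optimal initial order and profit.

```python
# Monte Carlo sketch: refurbished returns as reactive supply (made-up numbers).
import numpy as np

rng = np.random.default_rng(5)
price, cost, salvage = 10.0, 6.0, 2.0
ret_rate, refurb_cost = 0.15, 1.0   # returned share of early sales; refurb cost/unit

def expected_profit(q, refurbish, n=20_000):
    d1 = rng.poisson(60, n)                      # early-season demand
    d2 = rng.poisson(40, n)                      # late-season demand
    s1 = np.minimum(d1, q)
    returns = np.floor(ret_rate * s1)            # refunded either way
    inv2 = q - s1 + (returns if refurbish else 0)
    s2 = np.minimum(d2, inv2)
    profit = (price * (s1 + s2) - price * returns - cost * q
              + salvage * (inv2 - s2)
              - (refurb_cost * returns if refurbish else 0))
    return profit.mean()

qs = np.arange(70, 131)
for refurbish in (False, True):
    q_star = qs[np.argmax([expected_profit(q, refurbish) for q in qs])]
    print(f"refurbish={refurbish}: Q* = {q_star}, "
          f"expected profit = {expected_profit(q_star, refurbish):.1f}")
```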

Journal ArticleDOI
TL;DR: The late onset gaze bias, commonly found in eye movements during choice, is predicted by models even when attention is entirely random and independent of the choice process, showing that the LOB is not evidence of a feedback loop between evidence accumulation and attention.
Abstract: We use computational modelling to examine the ability of evidence accumulation models to produce the reaction time distributions and attentional biases found in behavioural and eye-tracking research. We focus upon simulating reaction times and attention in binary choice, with particular emphasis upon whether different models can predict the late onset bias (LOB) commonly found in eye movements during choice (sometimes called the gaze cascade). The first finding is that this bias is predicted by models even when attention is entirely random and independent of the choice process. This shows that the LOB is not evidence of a feedback loop between evidence accumulation and attention. Second, we examine models with a relative evidence decision rule and an absolute evidence rule. In the relative models, a decision is made once the difference in evidence accumulated for two items reaches a threshold. In the absolute models, a decision is made once one item accumulates a certain amount of evidence, independently of how much is accumulated for a competitor. Our core result is simple: the existence of the late onset gaze bias to the option ultimately chosen, together with a positively skewed reaction time distribution, means that the stopping rule must be relative, not absolute. A large-scale grid search of parameter space shows that absolute threshold models struggle to predict these phenomena even when incorporating evidence decay and assumptions of either mutual inhibition or feed-forward inhibition.
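
The two stopping rules are easy to simulate on a plain two-accumulator random walk; the sketch below uses arbitrary parameters and omits the gaze/attention component, showing only how each rule generates the reaction-time distributions compared in the article.

```python
# Two-accumulator race: relative vs. absolute stopping rules (arbitrary parameters).
import numpy as np

rng = np.random.default_rng(1)

def simulate(rule, n=5_000, drift=(0.06, 0.04), noise=0.5, thresh=8.0, tmax=5_000):
    rts = np.empty(n, dtype=int)
    for i in range(n):
        e = np.zeros(2)                      # evidence for the two options
        for t in range(1, tmax + 1):
            e += drift + noise * rng.standard_normal(2)
            stop = (abs(e[0] - e[1]) >= thresh if rule == "relative"
                    else e.max() >= thresh)  # absolute: one accumulator alone
            if stop:
                break
        rts[i] = t
    skew = ((rts - rts.mean()) ** 3).mean() / rts.std() ** 3
    return rts.mean(), skew

for rule in ("relative", "absolute"):
    mean_rt, skew = simulate(rule)
    print(f"{rule:8s}: mean RT = {mean_rt:7.1f}, RT skewness = {skew:.2f}")
```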

Journal ArticleDOI
TL;DR: In this paper, the performance of the traditional newsvendor implementation versus a dynamic forecast-based implementation is examined under stationary AR(1) demand, and it is shown that under certain conditions it is best to ignore the correlation, opt out of forecasting, and simply implement the traditional newsvendor model.
Abstract: The classic newsvendor model was developed under the assumption that period-to-period demand is independent over time. In real-life applications, the notion of independent demand is often challenged. In this article, we examine the newsvendor model in the presence of correlated demands. Specifically, under a stationary AR(1) demand, we study the performance of the traditional newsvendor implementation versus a dynamic forecast-based implementation. We demonstrate theoretically that implementing a minimum mean square error (MSE) forecast model will always improve performance relative to the traditional implementation in terms of cost savings. In light of the widespread usage of all-purpose models like the moving-average method and the exponential smoothing method, we compare the performance of these popular alternative forecasting methods against both the MSE-optimal implementation and the traditional newsvendor implementation. If only alternative forecasting methods are being considered, we find that under certain conditions it is best to ignore the correlation, opt out of forecasting, and simply implement the traditional newsvendor model.
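
The comparison can be reproduced in a few lines. The sketch below assumes normal AR(1) noise and made-up costs, ordering at the critical fractile under either the unconditional demand distribution (traditional) or the conditional one-step MSE forecast.

```python
# Traditional vs. MSE-forecast newsvendor under AR(1) demand (illustrative).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
mu, phi, sigma = 100.0, 0.7, 10.0    # D_t = mu + phi*(D_{t-1} - mu) + eps
cu, co = 4.0, 1.0                    # underage / overage cost per unit
z = norm.ppf(cu / (cu + co))         # critical-fractile quantile

T = 100_000
d = np.empty(T)
d[0] = mu
for t in range(1, T):
    d[t] = mu + phi * (d[t - 1] - mu) + sigma * rng.standard_normal()

def avg_cost(order):                 # order may be a scalar or per-period array
    return (cu * np.maximum(d[1:] - order, 0)
            + co * np.maximum(order - d[1:], 0)).mean()

q_static = mu + z * sigma / np.sqrt(1 - phi**2)     # unconditional distribution
q_dynamic = mu + phi * (d[:-1] - mu) + z * sigma    # conditional MSE forecast
print(f"traditional cost/period:  {avg_cost(q_static):.2f}")
print(f"MSE-forecast cost/period: {avg_cost(q_dynamic):.2f}")
```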

Journal ArticleDOI
TL;DR: This work hypothesizes that CPOE effectiveness depends on the prevalence of patient safety culture within a hospital and empirically tests this proposition using data from 268 hospitals and multiple data sources, showing that while CPOE complements the patient safety dimensions of handoffs and transitions, feedback and communication about error, and organizational learning, CPOE substitutes for the dimension of management support for safety, in the context of the authors' dependent variable.
Abstract: The U.S. government recommends that hospitals adopt Computerized Provider Order Entry (CPOE) systems to improve the quality problems that plague U.S. hospitals. However, CPOE studies show mixed results. We hypothesize that CPOE effectiveness depends on the prevalence of patient safety culture within a hospital. Using organizational information processing theory, we describe how patient safety culture and CPOE enable healthcare organizations to better process information. Specifically, we posit that CPOE complements some aspects of patient safety culture and substitutes for others. Using ridge regression, we empirically test this proposition using data from 268 hospitals and multiple data sources. Results show that while CPOE complements the patient safety dimensions of handoffs and transitions, feedback and communication about error, and organizational learning, CPOE substitutes for the dimension of management support for safety, in the context of our dependent variable. As organizations work to implement new systems, this research can help decision-makers understand how culture impacts such initiatives and account for culture when anticipating effects. [web URL: http://onlinelibrary.wiley.com/doi/10.1111/deci.12199/full]
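
The complement/substitute logic can be illustrated with interaction terms in a ridge regression. The sketch below uses synthetic data, not the study's hospital data: a positive CPOE-by-handoffs interaction stands in for a complement, and a negative CPOE-by-management-support interaction stands in for a substitute.

```python
# Ridge regression with interaction terms on synthetic hospital-like data.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(6)
n = 268
cpoe = rng.integers(0, 2, n)                    # CPOE adopted?
handoffs = rng.normal(0, 1, n)                  # two safety-culture dimensions
mgmt_support = rng.normal(0, 1, n)
# synthetic outcome: complement (+) and substitute (-) interactions built in
y = (0.5 * cpoe + 0.3 * handoffs + 0.3 * mgmt_support
     + 0.4 * cpoe * handoffs - 0.4 * cpoe * mgmt_support
     + rng.normal(0, 0.5, n))

X = np.column_stack([cpoe, handoffs, mgmt_support,
                     cpoe * handoffs, cpoe * mgmt_support])
model = Ridge(alpha=1.0).fit(X, y)
for name, b in zip(["CPOE", "handoffs", "mgmt support",
                    "CPOE x handoffs", "CPOE x mgmt support"], model.coef_):
    print(f"{name:20s} {b:+.3f}")
```

A positive interaction coefficient indicates the culture dimension amplifies CPOE's effect (complement); a negative one indicates overlap (substitute).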

Journal ArticleDOI
TL;DR: The research results indicate that trust, relationship orientation, knowledge sharing self-efficacy, and relative autonomous motivation regarding KSBs are the key factors influencing professionals' KSBs.
Abstract: By incorporating the perspectives of social cognitive theory and relative autonomous motivations, this study examines a model that depicts the influence of personal and environmental factors on employees’ knowledge sharing behaviors (KSBs). Data that were collected from 294 professionals in the industry were analyzed using component-based structural equation modeling to examine the proposed model. The research results indicate that trust, relationship orientation, knowledge sharing self-efficacy, and relative autonomous motivation regarding KSBs are the key influencing factors of KSBs of professionals. A key implication is that managers must consider the impact of the level of employee-perceived autonomous motivation when they seek to facilitate KSBs. Finally, the theoretical and practical contributions are discussed, followed by the suggestions for future research directions.

Journal ArticleDOI
TL;DR: It is argued that investments in product and process innovativeness are either fostered or hindered contingent on the form of hostile pressures from the external operating environment, suggesting that manufacturers should apply these capabilities contingent upon the type of hostile climate they face.
Abstract: Is focusing on innovativeness the appropriate organizational response in hostile environments? This study addresses four distinct forms of hostile environments: market decline, restrictiveness, competition, and resource scarcity. The research draws on contingency theory to explain how these forms of hostility affect product innovativeness, process innovativeness, and firm performance. While the extant literature has investigated the effects of hostile environments on performance, little has been done to distinguish between different forms of hostility and, in turn, their potential effect on product and process innovativeness. We argue that investments in product and process innovativeness are either fostered or hindered, contingent on the form of hostile pressures from the external operating environment. These firm responses should in turn suppress the negative effects of hostility on performance. A survey using newly developed measurement scales for hostile environments was used to collect data from 148 small and medium-sized manufacturing plants. The results provide evidence that generally supports our hypotheses. More specifically, the direct effects of the four forms of hostile environments impact product and process innovativeness differently. Likewise, the suppression and consistent mediation effects of product and process innovativeness differ depending upon the type of hostile environment, suggesting that manufacturers consider applying these capabilities contingent upon the type of hostile climate they face. Therefore, to understand how firms leverage product and process innovativeness, hostile environments are best differentiated into categories rather than being aggregated. [web URL: http://onlinelibrary.wiley.com/doi/10.1111/deci.12196/full]

Journal ArticleDOI
TL;DR: This research is one of the first to focus on the strategies of crowdsourcing platforms; it shows that the linear fee schedule widely used in practice is not optimal and that a platform is better off lowering the fee rate for contests with high prizes.
Abstract: Crowdsourcing platforms specialize in hosting open contests and usually charge a percentage of the prizes as service fees. While prior research has studied the design of contests and the behavior of contestants, the strategy of a crowdsourcing platform has remained largely unexplored. We develop a game-theoretic model of crowdsourcing services and find the optimal fee structure of a platform. We prove for the case of a single contest that the service fees should be an increasing concave function of task prizes and show that this also holds true for the case of multiple contests. We further find that for a platform with many users and tasks, there is an optimal ratio of the number of contestants and contests. Our research is one of the first to focus on the strategies of crowdsourcing platforms and our results have interesting managerial implications. We show that the linear fee schedule widely used in practice is not optimal and that a platform is better off lowering the fee rate for contests with high prizes. It is also in the best interests of a platform to develop both sides of the crowdsourcing market proportionally and keep the ratio of contestants and contests at the optimal level. [web URL: http://onlinelibrary.wiley.com/doi/10.1111/deci.12201/full]

Journal ArticleDOI
TL;DR: An experimental study of the price-setting newsvendor problem is presented, which extends the traditional framework by allowing the decision maker to determine both the selling price and the order quantity of a given item.
Abstract: We present an experimental study of the price-setting newsvendor problem, which extends the traditional framework by allowing the decision maker to determine both the selling price and the order quantity of a given item. We compare behavior under this model with two benchmark conditions where subjects have a single decision to make (price or quantity). We observe that subjects deviate from the theoretical benchmarks when they are tasked with a single decision. They also exhibit anchoring behavior, where their anchor is the expected demand when quantity is the decision variable and is the initial inventory level when price is the decision variable. When decision makers set quantity and price concurrently, we observe no significant difference between the normative (i.e., expected profit-maximizing) prices and the decision makers’ price choices. Quantity decisions move further from the normative benchmarks (compared to when subjects have a single decision to make) when the ratio of cost to price is less than half. When this ratio is reversed, there is no significant difference between order levels in single- and multi-task settings. In the multidecision framework, we also observe a tendency to match orders and expected demand levels, which subjects can control using prices.
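
For reference, the normative benchmark that subjects are compared against can be computed by grid search. The sketch below assumes an additive linear demand model with normal noise and made-up parameters, not the experiment's actual setup.

```python
# Grid-search sketch of the joint price-quantity newsvendor optimum.
import numpy as np
from scipy.stats import norm

a, b, c, sigma = 100.0, 2.0, 10.0, 15.0     # D(p) = a - b*p + eps, unit cost c

def expected_profit(p, q):
    mu = a - b * p
    z = (q - mu) / sigma
    # E[min(D, q)] via the standard normal loss function
    expected_sales = mu - sigma * (norm.pdf(z) - z * (1 - norm.cdf(z)))
    return p * expected_sales - c * q

best = max((expected_profit(p, (a - b * p) + z * sigma), p, z)
           for p in np.linspace(c, a / b, 200)
           for z in np.linspace(-3, 3, 121))
profit, p_star, z_star = best
print(f"normative price = {p_star:.2f}, safety factor z = {z_star:.2f}, "
      f"expected profit = {profit:.1f}")
```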

Journal ArticleDOI
TL;DR: This article demonstrates how the principles of design of experiments can be applied in a system dynamics model to find the auction parameter values that substantially reduce the effect of collusion in government procurement auctions.
Abstract: Government departments are increasingly turning to auctions to procure goods and services. Collusion among bidders, however, reduces competition and raises winning bid prices. Since conventional collusion control measures based on the redesign of auction mechanisms are less effective in government procurement auctions, there is a need to devise control measures that decrease the effect of collusion. This article demonstrates how the principles of design of experiments can be applied in a system dynamics model to find the auction parameter values that substantially reduce the effect of collusion in government procurement auctions. This research makes a number of contributions. First, it develops a feedback-based dynamic mechanism of collusion in government procurement auctions. The mechanism proposes that the winning bid price is determined not by the total number of bidders but by the number of independent bidders, defining each cartel as one independent bidder regardless of the number of bidders in the cartel. Second, the mechanism is tested by developing a system dynamics model of government auctions for procuring roadwork project contracts in India. Third, the principles of experimental design are applied to find the auction parameter values that ensure high bid participation and low winning price-to-reserve price ratios.
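
A minimal sketch of the experimental-design idea on a toy auction simulation (standing in for the article's system dynamics model): each cartel counts as a single independent bidder, and a small factorial design shows how the winning-bid-to-reserve ratio responds to the factor levels.

```python
# Factorial design over a toy collusion model of procurement auctions.
import itertools
import numpy as np

rng = np.random.default_rng(7)

def mean_win_ratio(n_independent, spread, n_sims=5_000):
    """Winning-bid-to-reserve ratio; a cartel counts as one independent bidder."""
    bids = rng.uniform(1 - spread, 1.0, (n_sims, n_independent))  # fraction of reserve
    return bids.min(axis=1).mean()

factors = {"n_independent": [1, 3, 6], "spread": [0.2, 0.5]}
for combo in itertools.product(*factors.values()):
    params = dict(zip(factors, combo))
    print(params, "-> win/reserve =", round(mean_win_ratio(**params), 3))
```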

Journal ArticleDOI
TL;DR: A new, robust, and effective machine learning algorithm is presented for newsvendor problems with demand shocks but without any demand distribution information; it outperforms traditional approaches in a variety of situations, including large and frequent shocks to the demand mean.
Abstract: In today's competitive market, demand volume and even the underlying demand distribution can change quickly for a newsvendor seller. We refer to sudden changes in demand distribution as demand shocks. When a newsvendor seller has limited demand distribution information and also experiences underlying demand shocks, the majority of existing methods for newsvendor problems may not work well, since they either require demand distribution information or assume a stationary demand distribution. We present a new, robust, and effective machine learning algorithm for newsvendor problems with demand shocks but without any demand distribution information. The algorithm needs only an approximate estimate of the lower and upper bounds of the demand range; no other knowledge such as demand mean, variance, or distribution type is necessary. We establish the theoretical bounds that determine this machine learning algorithm's performance in handling demand shocks. Computational experiments show that this algorithm outperforms the traditional approaches in a variety of situations, including large and frequent shocks to the demand mean. The method can also be used as a meta-algorithm by incorporating other traditional approaches as experts. Working together, the original algorithm and the extended meta-algorithm can help manufacturers and retailers better adapt their production and inventory control decisions in dynamic environments where demand information is limited and demand shocks are frequent.
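
In the same spirit, though not the article's algorithm, a distribution-free newsvendor can be run as an exponentially weighted average over "expert" order quantities spanning only the assumed demand bounds. The sketch below includes a mid-run shock to the demand mean.

```python
# Distribution-free newsvendor via exponentially weighted experts (illustrative).
import numpy as np

rng = np.random.default_rng(3)
lo, hi = 0.0, 200.0                 # assumed demand bounds (the only required input)
cu, co = 3.0, 1.0                   # underage / overage costs
experts = np.linspace(lo, hi, 41)   # candidate order quantities
eta = 0.05                          # learning rate
w = np.ones_like(experts)

def loss(q, d):
    return cu * max(d - q, 0) + co * max(q - d, 0)

total = 0.0
for t in range(2_000):
    mean = 60.0 if t < 1_000 else 140.0           # demand shock at t = 1000
    d = float(np.clip(rng.normal(mean, 15.0), lo, hi))
    q = float(np.dot(w, experts) / w.sum())        # weighted-average order
    total += loss(q, d)
    # downweight experts in proportion to their (normalized) loss this period
    w *= np.exp(-eta * np.array([loss(e, d) for e in experts]) / (cu * (hi - lo)))

print("average cost per period:", round(total / 2_000, 2))
```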

Journal ArticleDOI
TL;DR: This work identifies the Pareto set of the push and/or pull contracts in a local supplier–retailer supply chain in the presence of an outside market, and draws managerial implications.
Abstract: Wholesale price contracts are widely studied in a single supplier-single retailer supply chain, but without considering an outside market where the supplier may sell if he gets a high enough price and the retailer may buy if the price is low enough. We fill this gap in the literature by studying push and pull contracts in a local supplier–retailer supply chain with the presence of an outside market. Taking the local supplier's maximum production capacity and the outside market barriers into account, we identify the Pareto set of the push and/or pull contracts and draw managerial implications. The main results include the following. First, the most inefficient point of the pull Pareto set cannot always be removed by considering both the push and pull contracts. Second, the supplier's production capacity plays a significant role in the presence of an outside market; it affects the supplier's negotiating power with the retailer and the coordination of the supply chain can be accomplished only with a large enough capacity. Third, the import and export barriers influence the supply chain significantly: (i) an export barrier in the local market and the supplier's production capacity influence the supplier's export strategy; (ii) a low import (resp., export) barrier in the local market can improve the local supply chain's efficiency by use of a push (resp., pull) contract; and (iii) a high import (resp., export) barrier in the local market encourages the supplier (resp., retailer) to bear more inventory risk.

Journal ArticleDOI
TL;DR: A mathematical model of a self-sustaining response supply chain is developed that yields insights about the relationships and interactions among self-Sustainment, speed of disaster onset, dispersion of impact, and the cost of the relief efforts.
Abstract: Governmental organizations play a major role in disaster relief operations. Supply chains set up to respond to disasters differ dramatically in many dimensions that affect the cost of relief efforts. One factor that has been described recently is self-sustainment, which occurs when supplies consumed by intermediate stages of a supply chain must be provided via the chain itself because they are not locally available. This article applies the concept of self-sustainment to response supply chains. A mathematical model of a self-sustaining response supply chain is developed. Analysis of this model yields insights about the relationships and interactions among self-sustainment, speed of disaster onset, dispersion of impact, and the cost of the relief efforts.

Journal ArticleDOI
TL;DR: This work considers a manufacturer facing an unreliable supplier whose initial reliability may be of high or low type, derives optimal procurement contracts for both improvement mechanisms, and finds that moral hazard does not necessarily generate more profit for the high-type supplier.
Abstract: We consider a manufacturer facing an unreliable supplier whose initial reliability may be of high or low type. The private reliability can be enhanced through process improvement initiated by either the manufacturer (manufacturer-initiated improvement, MI) or the supplier (supplier-initiated improvement, SI). We derive optimal procurement contracts for both mechanisms and find that moral hazard does not necessarily generate more profit for the high-type supplier. Furthermore, information asymmetry causes a greater possibility of not ordering from the low type in SI than in MI. For the low type, when an upward effort distortion appears in both mechanisms, a decreased (increased) unit penalty should be imposed in MI (SI) compared with the symmetric-information case. Although possibly efficient effort from the supplier could yield greater channel profit in SI, several scenarios violate this expectation. However, the manufacturer's expected profit in MI is no less than that in SI. When MI is extended to MSI, where both the manufacturer and the supplier can exert effort, the expected profits of the two parties are equal to those in SI. We further extend SI to SID, where both process improvement and dual-sourcing are available. The manufacturer considers the trade-off between the benefit from diversification and the loss from dual information rent in deciding between SID and MI. By comparing SID with pure dual-sourcing, we find that the supplier's process improvement could either accelerate or retard the exercise of dual-sourcing.

Journal ArticleDOI
TL;DR: The development and analysis of an online choice survey to understand consumer preferences among three types of online distribution channels are outlined, and a Multinomial Logit model is employed to analyze the data and measure the consumer trade-offs between price and other attributes of the product.
Abstract: The selling of perishable services (e.g., hotel rooms, airline seats, and rental cars) online is increasingly popular with both retailers and consumers. Among the innovative approaches to online sales is opaque selling. First popularized by Priceline.com's name-your-own-price model, opaque selling hides some attributes of the service (notably, brand and specific location) until after the purchase decision, in exchange for a discounted price. This means that a branded “product” is being sold as somewhat of a commodity, but the brand “name” is protected by the opaque model. The attraction of this model for retailers is that they are presumably able to increase their revenue stream, albeit at a lower rate, by selling rooms that otherwise would remain in inventory. In this article, we outline the development and analysis of an online choice survey to understand consumer preferences among three types of online distribution channels: regular full-information sales channels, and opaque sales channels with or without consumer bidding. A Multinomial Logit model is employed to analyze the data and measure the consumer trade-offs between price and other attributes of the product. We use the estimated model to calculate the incremental demand and revenue created by using an opaque channel simultaneously with regular full-information channels. On balance, we find that correctly priced opaque channels can add to hotels' revenue streams without undue cannibalization of regular room sales.
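
A minimal sketch of how such an estimated model is used, with hypothetical part-worths rather than the survey's estimates: given channel intercepts and a price coefficient, multinomial logit probabilities yield each channel's expected demand share and revenue.

```python
# Multinomial logit choice among three hypothetical channel alternatives.
import numpy as np

beta_price = -0.04                  # hypothetical price sensitivity
intercept = {"full_info": 1.2, "opaque_posted": 0.4, "opaque_bidding": 0.0}
price = {"full_info": 120.0, "opaque_posted": 85.0, "opaque_bidding": 70.0}

utility = {k: intercept[k] + beta_price * price[k] for k in intercept}
expu = {k: np.exp(u) for k, u in utility.items()}
denom = 1.0 + sum(expu.values())    # the 1.0 is the no-purchase alternative
prob = {k: v / denom for k, v in expu.items()}

for k, p in prob.items():
    print(f"P({k}) = {p:.3f}, expected revenue = {p * price[k]:.2f}")
print(f"P(no purchase) = {1 / denom:.3f}")
```

Summing expected revenue with and without the opaque alternatives in the choice set gives the incremental-revenue comparison described above.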

Journal ArticleDOI
TL;DR: A threshold is found for the cost of the low-quality product below which it is optimal to add it to the firm's portfolio, and it is shown that while a cost advantage is necessary to make the lower quality offering profitable under linear or convex relative utility functions, market segmentation alone can justify the addition of the lower quality product under concave relative utility functions.
Abstract: We investigate the profitability of adding a lower quality or remanufactured product to the product portfolio of a monopoly firm, both in single-period and steady-state settings. Consumer behavior is characterized by a deterministic utility function for the original product and a nonlinear relative utility function for the lower quality product. We find a threshold for the cost of the low-quality product below which it is optimal to add it to the firm's portfolio, and show that while a cost advantage is necessary to make the lower quality offering profitable under linear or convex relative utility functions, market segmentation alone can justify the addition of the lower quality product under concave relative utility functions. In particular, we characterize (i) the new product cost under which it is optimal to offer a lower quality version of the product even if it is as costly to produce as the original product; and (ii) the weighted average of new and remanufactured product costs in the steady state under which it becomes cost effective to offer new products under the remanufactured label. Finally, we also identify the maximum possible profits from customer segmentation and the form of the relative utility function that achieves them. We discuss the implications for the common marketing practices of branding and generics.
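
The cost threshold is visible in a toy version of the linear relative-utility case, with made-up numbers: as the low-quality unit cost rises toward the new-product cost, the two-product optimum collapses back to the single-product profit.

```python
# Toy vertical-segmentation model with linear relative utility (v vs. alpha*v).
import numpy as np

alpha, c_new = 0.6, 0.2      # relative quality of the low-end product; cost of new

def profit_two(p_n, p_l, c_low):
    v_hi = (p_n - p_l) / (1 - alpha)       # indifferent: new vs. low quality
    v_lo = p_l / alpha                     # indifferent: low quality vs. nothing
    if v_lo > v_hi:                        # low-quality version is priced out
        return (p_n - c_new) * max(0.0, 1 - p_n)
    hi = min(v_hi, 1.0)                    # valuations v ~ Uniform[0, 1]
    return (p_n - c_new) * (1 - hi) + (p_l - c_low) * max(0.0, hi - v_lo)

grid = np.linspace(0.01, 0.99, 99)
single = max((p - c_new) * (1 - p) for p in grid)
for c_low in (0.05, 0.12, 0.20):
    both = max(profit_two(pn, pl, c_low) for pn in grid for pl in grid)
    print(f"c_low = {c_low:.2f}: two products {both:.4f} vs. single {single:.4f}")
```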

Journal ArticleDOI
TL;DR: This study develops and tests an integrated theoretical framework for modeling an individual's public transportation decision-making process using four independent variables (Perceived Public Transportation Security, Knowledge, Price, and Convenience), with the associated items developed and refined using confirmatory factor analysis.
Abstract: Understanding the decision-making factors associated with public transportation is essential in the strategic development of public transportation to improve acceptance and utilization of mass transit systems. This research analyzes factors affecting attitudes toward public transportation and the choice of transportation mode by investigating the public transportation decision-making process of working professionals using a survey methodology. The objectives of this research are to model the transportation decision-making process of public transportation users in a metropolitan area and to determine key factors that affect the public transportation choices made by potential public transportation users. This study contributes to the literature by developing and testing an integrated theoretical framework for modeling an individual's public transportation decision-making process using four independent variables: Perceived Public Transportation Security, Knowledge, Price, and Convenience. Based on the Theory of Reasoned Action, the Theory of Planned Behavior, and utility theory, we develop the factors and refine the associated items using confirmatory factor analysis. We then develop the proposed theoretical framework based upon the extant literature and test it using partial least squares structural equation modeling (PLS-SEM).

Journal ArticleDOI
TL;DR: Data on software exploits is analyzed to identify factors associated with the duration between a vulnerability discovery date and the date when an exploit is publicly available, a time window for patching before exploit attack levels may escalate.
Abstract: Enterprises experience opportunistic exploits targeted at vulnerable technology. Vulnerabilities in software-based applications, service systems, enterprise platforms, and supply chains are discovered and disclosed on an alarmingly regular basis. A necessary enterprise risk management task concerns identifying and patching vulnerabilities. Yet it is a costly affair to develop and deploy patches to alleviate risk and prevent damage from exploit attacks. Given the limited resources available, technology producers and users must identify priorities for such tasks. When not overlooked, vulnerability-patching tasks often are prioritized based on vulnerability disclosure dates, thus vulnerabilities disclosed earlier usually have patches developed and deployed earlier. We suggest priorities also should focus on time-dependent likelihoods of exploits getting published. We analyze data on software exploits to identify factors associated with the duration between a vulnerability discovery date and the date when an exploit is publicly available, a time window for patching before exploit attack levels may escalate. Actively prioritizing vulnerability patching based on likelihoods of exploit publication may help lessen losses due to exploit attacks. Technology managers might apply the insights to better estimate relative risk levels, and better prioritize protection efforts toward vulnerabilities having higher risk of earlier exploitation.
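
One way to estimate such a patching window is a survival model of disclosure-to-exploit durations. The sketch below fits a Kaplan-Meier curve to synthetic, partly censored data; the article's dataset and covariate analysis are not reproduced.

```python
# Kaplan-Meier sketch of the disclosure-to-exploit window (synthetic data).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(8)
n = 400
days_to_exploit = rng.weibull(1.5, n) * 90     # synthetic durations (days)
observed = rng.random(n) < 0.7                 # ~30% have no public exploit yet

kmf = KaplanMeierFitter()
kmf.fit(days_to_exploit, event_observed=observed)
for d in (7, 30, 90):
    print(f"P(no public exploit after {d:3d} days) = "
          f"{kmf.survival_function_at_times(d).iloc[0]:.2f}")
```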

Journal ArticleDOI
TL;DR: A firm that acquires and remanufactures cores of multiple quality conditions to satisfy demand is investigated, and it is found that sorting can be completely useless to the RMTS system and thus should never be adopted, regardless of the sorting cost.
Abstract: Variation in core condition and uncertainty in market demand pose great challenges for remanufacturers to match supply with demand. This article investigates a firm that acquires and remanufactures cores of multiple quality conditions to satisfy demand. Both remanufacturing-to-stock (RMTS) and remanufacturing-to-order (RMTO) systems are considered. In each system, a sorting operation that resolves the core conditions before remanufacturing may or may not be adopted, leading to four possible sorting/remanufacturing strategies: (1) no sorting in RMTS; (2) sorting in RMTS; (3) no sorting in RMTO; and (4) sorting in RMTO. Under each strategy, we derive the optimal decisions on the acquisition and remanufacturing quantities in two scenarios: (i) all acquired cores are remanufacturable and (ii) some cores are non-remanufacturable. We find that sorting can be completely useless to the RMTS system, and thus should never be adopted regardless of the sorting cost; we provide the analytical condition under which this ineffectiveness of sorting occurs. Nevertheless, sorting is always useful to the RMTO system and should be adopted when the sorting cost is below a threshold value. We also conduct an extensive numerical study and show that the effects of sorting on the RMTO system are more significant than those on the RMTS system.

Journal ArticleDOI
TL;DR: A methodology is proposed for firms purchasing spare parts to manage end-of-supply risk by utilizing proportional hazard models based on the supply chain conditions of the parts, demonstrated using data on about 2,000 spare parts collected from a maintenance repair organization in the aviation industry.
Abstract: Operators of long field-life systems like airplanes are faced with hazards in the supply of spare parts. If the original manufacturers or suppliers of parts end their supply, this may have large impacts on operating costs of firms needing these parts. Existing end-of-supply evaluation methods are focused mostly on the downstream supply chain, which is of interest mainly to spare part manufacturers. Firms that purchase spare parts have limited information on parts sales, and indicators of end-of-supply risk can also be found in the upstream supply chain. This article proposes a methodology for firms purchasing spare parts to manage end-of-supply risk by utilizing proportional hazard models in terms of supply chain conditions of the parts. The considered risk indicators fall into four main categories, of which two are related to supply (price and lead time) and two others are related to demand (cycle time and throughput). The methodology is demonstrated using data on about 2,000 spare parts collected from a maintenance repair organization in the aviation industry. Cross-validation results and out-of-sample risk assessments show good performance of the method to identify spare parts with high end-of-supply risk. Further validation is provided by survey results obtained from the maintenance repair organization, which show strong agreement between the firm's and the model's identification of high-risk spare parts.
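
A sketch of the modeling approach on synthetic data (the firm's data are proprietary): a Cox proportional-hazards model relates time to end-of-supply to the four indicator categories named above, here via the lifelines package.

```python
# Cox proportional-hazards sketch for end-of-supply risk (synthetic data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({
    "price": rng.lognormal(3, 1, n),        # supply-side indicators
    "lead_time": rng.gamma(2, 10, n),
    "cycle_time": rng.gamma(2, 30, n),      # demand-side indicators
    "throughput": rng.poisson(5, n),
})
# synthetic time to end-of-supply: here longer lead times raise the hazard,
# higher throughput lowers it; price and cycle time are pure noise
hazard = 0.01 * np.exp(0.02 * df["lead_time"] - 0.05 * df["throughput"])
df["duration"] = rng.exponential(1 / hazard)
df["observed"] = (df["duration"] < 120).astype(int)   # right-censor at 120 months
df.loc[df["observed"] == 0, "duration"] = 120

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="observed")
print(cph.summary[["coef", "exp(coef)", "p"]])
```

Ranking parts by their fitted partial hazards would reproduce the kind of high-risk shortlist that the article validates against the firm's own assessments.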