
Showing papers in "Management Science in 2001"


Journal ArticleDOI
Lee Fleming
TL;DR: It is proposed that purely technological uncertainty derives from inventors' search processes with unfamiliar components and component combinations; such search yields less useful inventions on average but increases variability, which can result in both failure and breakthrough.
Abstract: While the course of technological change is widely accepted to be highly uncertain and unpredictable, little work has identified or studied the ultimate sources and causes of that uncertainty. This paper proposes that purely technological uncertainty derives from inventors' search processes with unfamiliar components and component combinations. Experimentation with new components and new combinations leads to less useful inventions on average, but it also implies an increase in the variability that can result in both failure and breakthrough. Negative binomial count and dispersion models with patent citation data demonstrate that new combinations are indeed more variable. In contrast to predictions, however, the reuse of components has a nonmonotonic and eventually positive effect on variability.

1,965 citations


Journal ArticleDOI
TL;DR: This paper looks inside the "black box" of product development at the fundamental decisions that are made by intention or default, adopting the perspective of product development as a deliberate business process involving hundreds of decisions, many of which can be usefully supported by knowledge and tools.
Abstract: This paper is a review of research in product development, which we define as the transformation of a market opportunity into a product available for sale. Our review is broad, encompassing work in the academic fields of marketing, operations management, and engineering design. The value of this breadth is in conveying the shape of the entire research landscape. We focus on product development projects within a single firm. We also devote our attention to the development of physical goods, although much of the work we describe applies to products of all kinds. We look inside the "black box" of product development at the fundamental decisions that are made by intention or default. In doing so, we adopt the perspective of product development as a deliberate business process involving hundreds of decisions, many of which can be usefully supported by knowledge and tools. We contrast this approach to prior reviews of the literature, which tend to examine the importance of environmental and contextual variables, such as market growth rate, the competitive environment, or the level of top-management support.

1,725 citations


Journal ArticleDOI
TL;DR: An algorithm is developed that both clears the financial system in a computationally efficient fashion and provides information on the systemic risk faced by the individual system firms; qualitative comparative statics for financial systems are also produced.
Abstract: We consider default by firms that are part of a single clearing mechanism. The obligations of all firms within the system are determined simultaneously in a fashion consistent with the priority of debt claims and the limited liability of equity. We first show, via a fixed-point argument, that there always exists a "clearing payment vector" that clears the obligations of the members of the clearing system; under mild regularity conditions, this clearing vector is unique. Next, we develop an algorithm that both clears the financial system in a computationally efficient fashion and provides information on the systemic risk faced by the individual system firms. Finally, we produce qualitative comparative statics for financial systems. These comparative statics imply that, in contrast to single-firm results, even unsystematic, nondissipative shocks to the system will lower the total value of the system and may lower the value of the equity of some of the individual system firms.
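The fixed-point construction in this abstract lends itself to a short sketch. The iteration below repeatedly applies p ← min(p̄, e + Πᵀp), starting from full payment; the three-firm ring of liabilities and the cash-flow numbers are invented for illustration, not taken from the paper.

```python
# Sketch of a clearing payment vector computed by fixed-point iteration.
# p_bar[i]: total nominal obligations of firm i
# pi[i][j]: fraction of firm i's payments that is owed to firm j
# e[i]:     operating cash flow (outside assets) of firm i
def clearing_vector(p_bar, pi, e, tol=1e-10, max_iter=1000):
    """Iterate p <- min(p_bar, e + Pi^T p) until it stabilizes."""
    n = len(p_bar)
    p = list(p_bar)  # start from full payment
    for _ in range(max_iter):
        inflow = [e[i] + sum(pi[j][i] * p[j] for j in range(n)) for i in range(n)]
        new_p = [min(p_bar[i], inflow[i]) for i in range(n)]
        if max(abs(a - b) for a, b in zip(new_p, p)) < tol:
            return new_p
        p = new_p
    return p

# Invented example: a ring where firm 0 pays firm 1, firm 1 pays firm 2,
# and firm 2 pays firm 0.  Shortfalls transmit around the ring.
p_bar = [10.0, 5.0, 10.0]
pi = [[0.0, 1.0, 0.0],
      [0.0, 0.0, 1.0],
      [1.0, 0.0, 0.0]]
e = [1.0, 0.0, 0.0]
p = clearing_vector(p_bar, pi, e)  # converges to [6.0, 5.0, 5.0]
```

With these numbers firms 0 and 2 default (paying 6 of 10 and 5 of 10) while firm 1 pays in full; the clearing vector thus records how a shortfall at one node is transmitted through the liability structure, which is the sense in which it carries systemic-risk information.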

1,261 citations


Journal ArticleDOI
TL;DR: Contracts that allow the supply chain to share demand forecasts credibly under either compliance regime are studied.
Abstract: Forecast sharing is studied in a supply chain with a manufacturer that faces stochastic demand for a single product and a supplier that is the sole source for a critical component. The following sequence of events occurs: the manufacturer provides her initial forecast to the supplier along with a contract, the supplier constructs capacity (if he accepts the contract), the manufacturer receives an updated forecast and submits a final order. Two contract compliance regimes are considered. If the supplier accepts the contract under forced compliance then he has little flexibility with respect to his capacity choice; under voluntary compliance, however, he maintains substantial flexibility. Optimal supply chain performance requires the manufacturer to share her initial forecast truthfully, but she has an incentive to inflate her forecast to induce the supplier to build more capacity. The supplier is aware of this bias, and so may not trust the manufacturer's forecast, harming supply chain performance. We study contracts that allow the supply chain to share demand forecasts credibly under either compliance regime.

999 citations


Journal ArticleDOI
TL;DR: This two-step research uses a combination of qualitative and quantitative methods and two data sets to suggest a conceptual, two-dimensional construct model for the classification of technical projects and for the investigation of project contingencies.
Abstract: Not many authors have attempted to classify projects according to any specific scheme, and those who have tried rarely offered extensive empirical evidence. From a theoretical perspective, a traditional distinction between radical and incremental innovation has often been used in the literature of innovation, and has created the basis for many classical contingency studies. Similar concepts, however, did not become standard in the literature of projects, and it seems that theory development in project management is still in its early years. As a result, most project management literature still assumes that all projects are fundamentally similar and that "one size fits all." The purpose of this exploratory research is to show how different types of projects are managed in different ways, and to explore the domain of traditional contingency theory in the more modern world of projects. This two-step research uses a combination of qualitative and quantitative methods and two data sets to suggest a conceptual, two-dimensional construct model for the classification of technical projects and for the investigation of project contingencies. Within this framework, projects are classified into four levels of technological uncertainty, and into three levels of system complexity, according to a hierarchy of systems and subsystems. The study provides two types of implications. For project leadership it shows why and how management should adopt a more project-specific style. For theory development, it offers a collection of insights that seem relevant to the world of projects as temporary organizations, but are, at times, different from classical structural contingency theory paradigms in enduring organizations. While still exploratory in nature, this study attempts to suggest new inroads to the future study of modern project domains.

793 citations


Journal ArticleDOI
TL;DR: A demand-based view of technology evolution that is focused on the interaction between technology development and the demand environment in which the technology is ultimately evaluated, and reveals that demand heterogeneity offers an alternative to supply-side explanations of the technology life cycle.
Abstract: The evolution of technology has been a central issue in the strategy and organizations literature. However, the focus of much of this work has been on what is essentially the "supply side" of technical change--the evolution of firm capabilities. We present a demand-based view of technology evolution that is focused on the interaction between technology development and the demand environment in which the technology is ultimately evaluated. We develop a formal computer simulation model that explicitly considers the influence of heterogeneity in market demand--the presence of consumers with different needs and requirements--on firms' innovation choices. The model is used to examine the dynamics of product and process innovation (Utterback and Abernathy 1975). The analysis reveals that demand heterogeneity offers an alternative to supply-side explanations of the technology life cycle. Further, by considering the implications of decreasing marginal utility from performance improvements, the model highlights the role of "technologically satisfied" consumers in shaping innovation incentives, and suggests a rationale for a new stage in the technology life cycle characterized by increasing performance at a stable price. This stage has not yet been treated formally in the literature, but is widely observed, most prominently in digital and information-based technologies.

761 citations


Journal ArticleDOI
TL;DR: This paper presents a method based on nonlinear programming that can be used to generate a limited number of discrete outcomes that satisfy specified statistical properties, and argues that which properties are relevant will be problem dependent.
Abstract: In models of decision making under uncertainty we often are faced with the problem of representing the uncertainties in a form suitable for quantitative models. If the uncertainties are expressed in terms of multivariate continuous distributions, or a discrete distribution with far too many outcomes, we normally face two possibilities: either creating a decision model with internal sampling, or trying to find a simple discrete approximation of the given distribution that serves as input to the model. This paper presents a method based on nonlinear programming that can be used to generate a limited number of discrete outcomes that satisfy specified statistical properties. Users are free to specify any statistical properties they find relevant, and the method can handle inconsistencies in the specifications. The basic idea is to minimize some measure of distance between the statistical properties of the generated outcomes and the specified properties. We illustrate the method by single- and multiple-period problems. The results are encouraging in that a limited number of generated outcomes indeed have statistical properties that are close to or equal to the specifications. We discuss how to verify that the relevant statistical properties are captured in these specifications, and argue that which properties are relevant will depend on the problem.
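The idea of minimizing a distance between the statistical properties of generated outcomes and the specified targets can be sketched in miniature. The example below matches only a mean and a variance for five equally likely outcomes, and uses plain numerical gradient descent in place of a nonlinear-programming solver; the targets, step size, and iteration count are all illustrative assumptions.

```python
# Minimal moment-matching sketch: pick a few discrete outcomes whose
# mean and variance are close to specified targets by minimizing a
# squared distance between generated and specified properties.
import random

def moments(xs):
    m = sum(xs) / len(xs)                        # mean (equal probabilities)
    v = sum((x - m) ** 2 for x in xs) / len(xs)  # variance
    return m, v

def distance(xs, target_mean, target_var):
    m, v = moments(xs)
    return (m - target_mean) ** 2 + (v - target_var) ** 2

def generate_outcomes(n, target_mean, target_var, steps=5000, lr=0.01, seed=0):
    rng = random.Random(seed)
    xs = [target_mean + rng.uniform(-1.0, 1.0) for _ in range(n)]
    h = 1e-6
    for _ in range(steps):
        base = distance(xs, target_mean, target_var)
        grad = []
        for i in range(n):  # forward-difference gradient
            bumped = xs[:]
            bumped[i] += h
            grad.append((distance(bumped, target_mean, target_var) - base) / h)
        xs = [x - lr * g for x, g in zip(xs, grad)]
    return sorted(xs)

outcomes = generate_outcomes(5, target_mean=100.0, target_var=25.0)
```

A real implementation in the paper's spirit would add higher moments and correlations to the distance measure, weight inconsistent specifications, and hand the minimization to a proper NLP solver rather than hand-rolled gradient descent.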

698 citations


Journal ArticleDOI
TL;DR: The study shows that the probability that an invention will be commercialized through firm formation is influenced by its importance, radicalness, and patent scope.
Abstract: Research on the creation of new high-technology companies has typically focused either on industry-level factors such as market structure and technology regime or on individual-level factors such as the work experience of entrepreneurs. This study complements these approaches by examining the effect of technological opportunities on firm formation. In particular, the study shows that the probability that an invention will be commercialized through firm formation is influenced by its importance, radicalness, and patent scope.

665 citations


Journal ArticleDOI
TL;DR: It is argued that complexity in product design and vertical integration of production are complements: that in-house production is more attractive when product complexity is high, as firms seek to capture the benefits of their investment in the skills needed to coordinate development of complex designs.
Abstract: This paper focuses on the connection between product complexity and vertical integration using original empirical evidence from the auto industry. A rich literature has addressed the choice between internal production and external sourcing of components in the auto industry. More recent literature has developed the concept of product architecture as another choice variable that may be one of the important contributors to product complexity. In this paper, we connect these two important decisions and study them jointly. We use the property rights approach to argue that complexity in product design and vertical integration of production are complements: that in-house production is more attractive when product complexity is high, as firms seek to capture the benefits of their investment in the skills needed to coordinate development of complex designs. We test this hypothesis with a simultaneous equations model applied to data from the luxury-performance segment of the auto industry. We find a significant and positive relationship between product complexity and vertical integration. This has implications for optimal incentive structures within firms, as well as for interpreting firm performance.

645 citations


Journal ArticleDOI
TL;DR: In this article, the authors adopt a multidisciplinary view of innovation by integrating operations and marketing perspectives of product development and show that the organizational process factors studied are associated with achievement of operational outcome targets for product quality, unit cost, and time-to-market.
Abstract: This paper adopts a multidisciplinary view of innovation by integrating operations and marketing perspectives of product development. The conceptual framework builds on the resource-based view of the firm and organizational information-processing theory to characterize relationships among organizational process factors, product development capabilities, critical uncertainties, and operational/market performance in product development projects. Data from a cross-sectional sample of 120 completed development projects for assembled goods is analyzed via a two-stage hierarchical moderated regression approach. The findings show that: (1) the organizational process factors studied are associated with achievement of operational outcome targets for product quality, unit cost, and time-to-market; (2) achievement of operational outcomes aids the achievement of market outcomes, in turn suggesting that development capabilities are indeed valuable firm resources; and (3) these relationships are robust under conditions of technological, market, and environmental uncertainty. This article provides practical insight into how product development projects can be better managed for operational and market success. Additionally, this article sets a theoretical and empirical basis for future research on the influence of organizational process factors and capabilities on diverse product-innovation outcomes.

621 citations


Journal ArticleDOI
TL;DR: A quality-based model for analyzing the strategic and policy issues concerning the development of products with conflicting traditional and environmental attributes is developed and two major findings show that green product development and stricter environmental standards might not necessarily benefit the environment.
Abstract: Green product development, which addresses environmental issues through product design and innovation as opposed to the traditional end-of-pipe-control approach, is receiving significant attention from customers, industries, and governments around the world. In this paper we develop a quality-based model for analyzing the strategic and policy issues concerning the development of products with conflicting traditional and environmental attributes. On the demand side of the problem, we use the framework of conjoint analysis to structure the preferences of the ordinary and green customers. On the supply side, we apply the theories in optimal product design and market segmentation to analyze the producer's strategic decisions regarding the number of products introduced and their prices and qualities. On the policy side, we evaluate the effects of environmental standards on the economic and environmental consequences of green product development. By jointly considering the interactions among the customers' preferences, the producer's product strategies, and the environmental standards imposed by governments, we present some interesting findings that can be used to manage and regulate the development of green products. Two major findings show that green product development and stricter environmental standards might not necessarily benefit the environment.

Journal ArticleDOI
TL;DR: It is proposed that learning curves can vary across organizations engaged in the same "learning task," due to organizational learning effects, and further revealed that the slope of the learning curve varies significantly across organizations.
Abstract: This paper examines learning curves in the health care setting to determine whether organizations achieve performance improvements from cumulative experience at different rates. Although extensive research has shown that cumulative experience leads to performance improvement across numerous contexts, the question of how much of this improvement is due to mere experience and how much is due to collective learning processes has received little attention. We argue that organizational learning processes may allow some organizations to benefit more than others from equivalent levels of experience. We thus propose that learning curves can vary across organizations engaged in the same "learning task," due to organizational learning effects. To investigate this proposition, we study cardiac surgery departments implementing a new technology for minimally invasive cardiac surgery. Data on operative procedure times from a sample of 660 patients who underwent the new operation at 16 different institutions are analyzed. The results confirm that cumulative experience is a significant predictor of learning, and further reveal that the slope of the learning curve varies significantly across organizations. Theoretical and practical implications of the work are discussed.
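The varying-slope learning curve in this abstract is commonly written as time = a·n^(−b), where n is cumulative experience and b is the organization-specific learning rate. The sketch below fits that curve by ordinary least squares in log-log space; the two "departments" and their slopes are synthetic illustrations, not the study's data.

```python
# Fit a power-law learning curve, time = a * n^(-b), by log-log OLS.
import math

def fit_learning_curve(times):
    """OLS fit of log(time) = log(a) - b*log(n) over cases n = 1..N."""
    n = len(times)
    xs = [math.log(i + 1) for i in range(n)]
    ys = [math.log(t) for t in times]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - slope * mx)
    return a, -slope  # return b as the (positive) learning exponent

# Two hypothetical departments with equal experience but different slopes:
fast = [300.0 * (i + 1) ** -0.30 for i in range(40)]
slow = [300.0 * (i + 1) ** -0.10 for i in range(40)]
a_f, b_f = fit_learning_curve(fast)  # b_f ~ 0.30
a_s, b_s = fit_learning_curve(slow)  # b_s ~ 0.10
```

Comparing b across organizations performing the same task is one simple way to operationalize the abstract's claim that equivalent experience yields different improvement rates.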

Journal ArticleDOI
TL;DR: It is shown that in this industry, constructs that support a more flexible development process are associated with better-performing projects and investments in architectural design play a dual role in a flexible process.
Abstract: Uncertain and dynamic environments present fundamental challenges to managers of the new product development process. Between successive product generations, significant evolutions can occur in both the customer needs a product must address and the technologies it employs to satisfy these needs. Even within a single development project, firms must respond to new information, or risk developing a product that is obsolete the day it is launched. This paper examines the characteristics of an effective development process in one such environment: the Internet software industry. Using data on 29 completed development projects, we show that in this industry, constructs that support a more flexible development process are associated with better-performing projects. This flexible process is characterized by the ability to generate and respond to new information for a longer proportion of a development cycle. The constructs that support such a process are greater investments in architectural design, earlier feedback on product performance from the market, and the use of a development team with greater amounts of "generational" experience. Our results suggest that investments in architectural design play a dual role in a flexible process: First, through the need to select an architecture that maximizes product performance and, second, through the need to select an architecture that facilitates development process flexibility. We provide examples from our fieldwork to support this view.

Journal ArticleDOI
TL;DR: A cooperative, two-stage supply chain consisting of two members: a retailer and a supplier is considered, and each member updates the forecasts of future demands periodically, and is able to integrate the adjusted forecasts into his replenishment process.
Abstract: We consider a cooperative, two-stage supply chain consisting of two members: a retailer and a supplier. In our first model, called local forecasting, each member updates the forecasts of future demands periodically, and is able to integrate the adjusted forecasts into his replenishment process. Forecast adjustments made at both levels of the supply chain can be correlated. The supply chain has a decentralized information structure, so that day-to-day inventory and forecast information are known locally only. In our second model, named collaborative forecasting, the supply chain members jointly maintain and update a single forecasting process in the system. Hence, forecasting information becomes centralized. Finally, we consider as a benchmark the special case in which forecasts are not integrated into the replenishment processes at all. We propose a unified framework that allows us to study and compare the three types of settings. This study comes at a time when various types of collaborative forecasting partnerships are being experimented with in industry, and when the drivers for success or failure of such initiatives are not yet fully understood. In addition to providing some managerial insights into questions that arise in this context, our set of models is tailored to serve as building blocks for future work in this emerging area of research.

Journal ArticleDOI
TL;DR: It is found that if uncertainty is resolved or costs/revenues occur after all decisions have been made, more variability may "smear out" contingencies and thus reduce the value of flexibility, which runs counter to established option pricing theory intuition.
Abstract: Managerial flexibility has value in the context of uncertain R&D projects, as management can repeatedly gather information about uncertain project and market characteristics and, based on this information, change its course of action. This value is now well accepted and referred to as "real option value." We introduce, in addition to the familiar real option of abandonment, the option of corrective action that management can take during the project. The intuition from options pricing theory is that higher uncertainty in project payoffs increases the real option value of managerial decision flexibility. However, R&D managers face uncertainty not only in payoffs, but also from many other sources. We identify five example types of R&D uncertainty, in market payoffs, project budgets, product performance, market requirements, and project schedules. How do they influence the value from managerial flexibility? We find that if uncertainty is resolved or costs/revenues occur after all decisions have been made, more variability may "smear out" contingencies and thus reduce the value of flexibility. In addition, variability may reduce the probability of flexibility ever being exercised, which also reduces its value. This result runs counter to established option pricing theory intuition and contributes to better risk management in R&D projects. Our model builds intuition for R&D managers as to when it is and when it is not worthwhile to delay commitments--for example, by postponing a design freeze, thus maintaining flexibility in R&D projects.
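The abstract's central contrast, that variability helps only when it is resolved before a decision, can be illustrated with a deliberately tiny two-stage project. All costs and payoffs below are invented numbers, not the paper's model.

```python
# Toy two-stage R&D project (illustrative numbers only): pay stage-1
# cost C1, then decide whether to pay stage-2 cost C2 and launch.
C1, C2 = 20.0, 50.0

def project_value(payoffs, resolved_before_decision):
    """Expected value with an abandonment option between the stages.

    payoffs: equally likely market payoffs.
    resolved_before_decision: True if the manager observes the payoff
    before committing to stage 2 (the classic real-option case).
    """
    if resolved_before_decision:
        # decide scenario by scenario: continue only if payoff covers C2
        cont = [max(0.0, p - C2) for p in payoffs]
        return -C1 + sum(cont) / len(cont)
    # uncertainty resolves only after the decision: decide on the mean
    mean = sum(payoffs) / len(payoffs)
    return -C1 + max(0.0, mean - C2)

low_var = [60.0, 80.0]    # mean 70
high_var = [10.0, 130.0]  # same mean, more variability

v_low = project_value(low_var, True)    # -20 + (10 + 30)/2 = 0
v_high = project_value(high_var, True)  # -20 + (0 + 80)/2 = 20
w_low = project_value(low_var, False)   # -20 + max(0, 20) = 0
w_high = project_value(high_var, False) # -20 + max(0, 20) = 0
```

When the payoff is observed before the continue/abandon decision, widening the spread raises the project value from 0 to 20; when it is resolved only afterward, the manager can act only on the mean and the extra variability adds nothing, echoing the paper's point that the timing of resolution, not variability alone, drives the value of flexibility.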

Journal ArticleDOI
TL;DR: It is shown that no traditional discount scheme, based on order quantities only, suffices to optimize channelwide profits when there are multiple nonidentical retailers, and an optimal strategy is characterized, maximizing total systemwide profits in a centralized system.
Abstract: We address a fundamental two-echelon distribution system in which the sales volumes of the retailers are endogenously determined on the basis of known demand functions. Specifically, this paper studies a distribution channel where a supplier distributes a single product to retailers, who in turn sell the product to consumers. The demand in each retail market arrives continuously at a constant rate that is a general decreasing function of the retail price in the market. We have characterized an optimal strategy, maximizing total systemwide profits in a centralized system. We have also shown that the same optimum level of channelwide profits can be achieved in a decentralized system, but only if coordination is achieved via periodically charged, fixed fees, and a nontraditional discount pricing scheme under which the discount given to a retailer is the sum of three discount components based on the retailer's (i) annual sales volume, (ii) order quantity, and (iii) order frequency, respectively. Moreover, we show that no traditional discount scheme, based on order quantities only, suffices to optimize channelwide profits when there are multiple nonidentical retailers. The paper also considers a scenario where the channel members fail to coordinate their decisions and provides numerical examples that illustrate the value of coordination. We extend our results to settings in which the retailers' holding cost rates depend on the wholesale price.

Journal ArticleDOI
TL;DR: A formal model is developed that integrates the structural elements of service delivery and finds that temporary imbalances between service capacity and demand interact with decision rules for effort allocation, capacity management, overtime, and quality aspirations to yield permanent erosion of the service standards and loss of revenue.
Abstract: The erosion of service quality throughout the economy is a frequent concern in the popular press. The American Customer Satisfaction Index for services fell in 2000 to 69.4%, down 5 percentage points from 1994. We hypothesize that the characteristics of services--inseparability, intangibility, and labor intensity--interact with management practices to bias service providers toward reducing the level of service they deliver, often locking entire industries into a vicious cycle of eroding service standards. To explore this proposition we develop a formal model that integrates the structural elements of service delivery. We use econometric estimation, interviews, observations, and archival data to calibrate the model for a consumer-lending service center in a major bank in the United Kingdom. We find that temporary imbalances between service capacity and demand interact with decision rules for effort allocation, capacity management, overtime, and quality aspirations to yield permanent erosion of the service standards and loss of revenue. We explore policies to improve performance and implications for organizational design in the service sector.

Journal ArticleDOI
TL;DR: In this paper, the long-run stock price performance of firms with effective total quality management (TQM) programs is evaluated, with the winning of quality awards used as a proxy for effective TQM implementation.
Abstract: This paper documents the long-run stock price performance of firms with effective Total Quality Management (TQM) programs. The winning of quality awards is used as a proxy for effective TQM implementation. We compare stock price performance of award winners against various matched control groups for a five-year implementation period and a five-year postimplementation period. During the implementation period there is no difference in the stock price performance, but during the postimplementation period award winners significantly outperform firms in the various control groups. Depending on the control group used, the mean outperformance ranges from 38% to 46%. Our results clearly indicate that effective implementation of TQM principles and philosophies leads to significant wealth creation. Furthermore, our results should alleviate many of the concerns regarding the value of quality award systems. Overall, these systems are valuable in terms of recognizing TQM firms and promoting awareness of TQM.

Journal ArticleDOI
TL;DR: The results indicate that platforms are not appropriate for extreme levels of market diversity or high levels of nonplatform scale economies, and a firm's product positioning and introduction sequence decisions made during the product-planning phase are significantly impacted by the presence of platforms.
Abstract: In their quest to manage the complexity of offering greater product variety, firms in many industries are considering platform-based product development. Product platforms, which are component and subsystem assets shared across a product-family, enable a firm to better leverage investments in product design and development. While the platform approach offers a number of benefits, it also imposes certain additional costs that have not received adequate research attention. In this paper, we use an industrial example both to illustrate some of the costs and benefits of platform-based product development and to motivate the development of a mathematical model. The model is formulated to better understand the appropriateness of product platforms and their impact on product-planning decisions. Our results indicate that platforms are not appropriate for extreme levels of market diversity or high levels of nonplatform scale economies. Also, a firm's product positioning and introduction sequence decisions made during the product-planning phase are significantly impacted by the presence of platforms. Specifically, a platform increases the separation among products and offers a multitude of product introduction strategies. We translate our model findings into a managerial framework.

Journal ArticleDOI
TL;DR: This work finds transshipment prices which induce the locations to choose inventory levels consistent with joint-profit maximization, if each location aims to maximize its own profits.
Abstract: In situations where a seller has surplus stock and another seller is stocked out, it may be desirable to transfer surplus stock from the former to the latter. We examine how the possibility of such transshipments between two independent locations affects the optimal inventory orders at each location. If each location aims to maximize its own profits--we call this local decision making--their inventory choices will not, in general, maximize joint profits. We find transshipment prices which induce the locations to choose inventory levels consistent with joint-profit maximization.
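A stripped-down version of the two-location setting can make the joint-profit objective concrete. The prices, costs, and uniform demand distribution below are invented, and transshipment is assumed costless, so only the coordination logic survives from the paper.

```python
# Toy two-location model with transshipment (illustrative numbers):
# each unit sells for R and costs C; surplus stock at one location can
# cover a stockout at the other.
R, C = 10.0, 6.0
demands = [0, 1, 2, 3]  # equally likely, independent across locations

def joint_profit(q1, q2):
    """Expected joint profit of order quantities (q1, q2)."""
    total = 0.0
    for d1 in demands:
        for d2 in demands:
            s1, s2 = min(q1, d1), min(q2, d2)    # local sales
            surplus = (q1 - s1) + (q2 - s2)
            shortage = (d1 - s1) + (d2 - s2)
            trans = min(surplus, shortage)        # units transshipped
            total += R * (s1 + s2 + trans) - C * (q1 + q2)
    return total / (len(demands) ** 2)

# Brute-force the jointly optimal order quantities
best = max((joint_profit(q1, q2), q1, q2)
           for q1 in range(5) for q2 in range(5))
```

Because costless transshipment fully pools the two markets here, only the total order q1 + q2 matters at the joint optimum (it equals 3 with these numbers); the paper's transshipment prices are what align each location's locally optimal inventory choice with this joint-profit-maximizing solution.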

Journal ArticleDOI
TL;DR: A framework for early analysis based on the success potential embodied in the product-idea itself and the circumstances of its emergence is proposed, identifying several determinants, such as how the ideas originated, their specific configurations, and the level of technology required for their implementation, that significantly distinguish successful from unsuccessful new products in the marketplace.
Abstract: In view of the distressingly low rate of success in new product introduction, it is important to identify predictive guidelines early in the new product development process so that better choices can be made and unnecessary costs avoided. In this paper, we propose a framework for early analysis based on the success potential embodied in the product-idea itself and the circumstances of its emergence. Based on two studies reporting actual introductions, we identified several determinants, such as how the ideas originated, their specific configurations, and the level of technology required for their implementation, that significantly distinguish successful from unsuccessful new products in the marketplace. We suggest that these factors, together with already known factors of success/failure, may aid in the estimation of the potential of a concept early in its development.

Journal ArticleDOI
TL;DR: Using data on patents assigned to the Massachusetts Institute of Technology during the 1980-1996 period, it is shown that four hypothesized dimensions of the technology regime--the age of the technical field, the tendency of the market toward segmentation, the effectiveness of patents, and the importance of complementary assets in marketing and distribution--influence the likelihood that new technology will be exploited through firm formation.
Abstract: At least since Schumpeter (1934 and 1942), researchers have been interested in identifying the dimensions of technology regimes that facilitate new firm formation as a mode of technology exploitation. Using data on 1,397 patents assigned to the Massachusetts Institute of Technology during the 1980-1996 period, I show that four hypothesized dimensions of the technology regime--the age of the technical field, the tendency of the market toward segmentation, the effectiveness of patents, and the importance of complementary assets in marketing and distribution--influence the likelihood that new technology will be exploited through firm formation.

Journal ArticleDOI
TL;DR: This paper develops a model of information-acquisition decisions by firms that are competing in a "strategic factor market" to purchase a scarce resource whose value is unknown and differs across firms.
Abstract: This paper develops a model of information-acquisition decisions by firms that are competing in a "strategic factor market" (Barney 1986) to purchase a scarce resource whose value is unknown and differs across firms. The model builds on the argument that more accurate expectations about the firm-specific value of resources are, other than luck, the only way for firms to obtain the specific resources required for competitive advantage. We address the more specific question of what types of information firms should gather to accomplish this goal. The model generates a series of testable hypotheses about how a firm's optimal mix of different types of information is affected by a number of factors, including the level of uncertainty about the value of the resource being acquired; the rarity, imitability, and nonsubstitutability of that resource; the level of inscrutability of firms' pre-existing stocks of resources; and firms' information-gathering and information-processing capacities.

Journal ArticleDOI
TL;DR: It is demonstrated that the prices of options, which depend on extrema, can be much more sensitive to the specification of the underlying price process than standard call and put options and show that a financial institution that uses the standard geometric Brownian motion assumption is exposed to significant pricing and hedging errors when dealing in path-dependent options.
Abstract: Much of the work on path-dependent options assumes that the underlying asset price follows geometric Brownian motion with constant volatility. This paper uses a more general assumption for the asset price process that provides a better fit to the empirical observations. We use the so-called constant elasticity of variance (CEV) diffusion model, where the volatility is a function of the underlying asset price. We derive analytical formulae for the prices of important types of path-dependent options under this assumption. We demonstrate that the prices of options that depend on extrema, such as barrier and lookback options, can be much more sensitive to the specification of the underlying price process than standard call and put options, and show that a financial institution that uses the standard geometric Brownian motion assumption is exposed to significant pricing and hedging errors when dealing in path-dependent options.
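The sensitivity of barrier prices to the process specification can be sketched with an Euler-discretized Monte Carlo model, dS = r*S*dt + sigma*S^beta*dW, where beta = 1 recovers geometric Brownian motion. The paper itself derives analytical formulae rather than simulating, and all parameter values below are illustrative:

```python
import math
import random

def barrier_call_mc(S0=100.0, K=100.0, B=90.0, r=0.05, T=1.0,
                    sigma=0.2, beta=1.0, n_steps=100, n_paths=5000, seed=1):
    """Monte Carlo price of a down-and-out call under the CEV diffusion
    dS = r*S*dt + sigma*S**beta*dW (beta=1 is GBM), via an Euler scheme.
    The knock-out check at B > 0 also keeps the scheme away from S <= 0.
    All parameter values are illustrative assumptions."""
    rng = random.Random(seed)
    dt = T / n_steps
    payoff_sum = 0.0
    for _ in range(n_paths):
        S, knocked_out = S0, False
        for _ in range(n_steps):
            dW = rng.gauss(0.0, math.sqrt(dt))
            S += r * S * dt + sigma * S**beta * dW
            if S <= B:                 # barrier breached: option dies
                knocked_out = True
                break
        if not knocked_out:
            payoff_sum += max(S - K, 0.0)
    return math.exp(-r * T) * payoff_sum / n_paths
```

To compare models on an equal footing, match the local volatility at the spot: e.g. `barrier_call_mc(beta=0.5, sigma=0.2 * 100.0**0.5)` has the same instantaneous volatility at S0 = 100 as `barrier_call_mc(beta=1.0, sigma=0.2)`, yet the barrier prices can diverge noticeably, which is the paper's point.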

Journal ArticleDOI
TL;DR: This paper speculates on the biases and their sizes by using the quantitative assessments of probability transformation and loss aversion suggested by prospect theory and presents quantitative corrections for the probability and certainty equivalence methods.
Abstract: This paper proposes a quantitative modification of standard utility elicitation procedures, such as the probability and certainty equivalence methods, to correct for commonly observed violations of expected utility. Traditionally, decision analysis assumes expected utility not only for the prescriptive purpose of calculating optimal decisions but also for the descriptive purpose of eliciting utilities. However, descriptive violations of expected utility bias utility elicitations. That such biases are effective became clear when systematic discrepancies were found between different utility elicitation methods that, under expected utility, should have yielded identical utilities. As it is not clear how to correct for these biases without further knowledge of their size or nature, most utility elicitations still calculate utilities by means of the expected utility formula. This paper speculates on the biases and their sizes by using the quantitative assessments of probability transformation and loss aversion suggested by prospect theory. It presents quantitative corrections for the probability and certainty equivalence methods. If interactive sessions to correct for biases are not possible, then the authors propose to use the corrected utilities rather than the uncorrected ones in prescriptions of optimal decisions. In an experiment, the discrepancies between the probability and certainty equivalence methods are removed by the authors' proposal.
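The flavor of the correction can be sketched for the certainty equivalence method. The weighting function and parameter below follow the Tversky-Kahneman (1992) parameterization with gamma = 0.61; the paper's exact quantitative assessments may differ:

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function, which
    overweights small probabilities and underweights moderate-to-large
    ones. gamma = 0.61 is their estimate for gains."""
    return p**gamma / (p**gamma + (1 - p)**gamma)**(1 / gamma)

def ce_utilities(ce_points, gamma=0.61):
    """For certainty-equivalent elicitations of gambles paying the best
    outcome with probability p and the worst otherwise (normalizing
    U(worst)=0, U(best)=1), classical expected utility assigns
    U(CE) = p, while the prospect-theory correction assigns
    U(CE) = w(p). Returns {CE: (uncorrected, corrected)}. Sketch only."""
    return {ce: (p, tk_weight(p, gamma)) for ce, p in ce_points.items()}
```

For example, a certainty equivalent elicited at p = 0.5 receives corrected utility w(0.5) ≈ 0.42 rather than 0.5, pulling the inferred utility curve toward what an unbiased expected-utility elicitation would have produced.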

Journal ArticleDOI
TL;DR: Empirical results suggest that firms which match supply chain structure to the type of product variety they offer outperform firms which fail to match such choices.
Abstract: Using data from the U.S. bicycle industry, we examine the relation among product variety, supply chain structure, and firm performance. Variety imposes two types of costs on a supply chain: production costs and market mediation costs. Production costs include, among other costs, the incremental fixed investments associated with providing additional product variants. Market mediation costs arise because of uncertainty in product demand created by variety. In the presence of demand uncertainty, precisely matching supply with demand is difficult. Market mediation costs include the variety-related inventory holding costs, product mark-down costs occurring when supply exceeds demand, and the costs of lost sales occurring when demand exceeds supply. We analyze product variety at the product attribute level, noting that the relative impact of variety on production and market mediation costs depends to a large extent on the attribute underlying the variety. That is, some types of variety incur high production costs and some types of variety incur high market mediation costs. We characterize supply chain structure by the degree to which production facilities are scale-efficient and by the distance of the production facility from the target market. We hypothesize that firms with scale-efficient production (i.e., high-volume firms) will offer types of variety associated with high production costs, and firms with local production will offer types of variety associated with high market mediation costs. This hypothesis implies that there is a coherent way to match product variety with supply chain structure. Empirical results suggest that firms which match supply chain structure to the type of product variety they offer outperform firms which fail to match such choices.
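The market mediation cost for a single variant can be sketched with the standard normal loss function, combining expected markdown cost on leftover units with margin lost on the expected shortfall. The demand parameters and unit costs in the example are illustrative, not drawn from the bicycle data:

```python
import math

def mediation_cost(mean, sd, order_q, markdown, lost_margin):
    """Expected market mediation cost of one variant under normally
    distributed demand: markdown cost on expected leftovers plus margin
    lost on the expected shortfall, via the standard normal loss
    function L(z) = phi(z) - z*(1 - Phi(z)). Illustrative sketch."""
    z = (order_q - mean) / sd
    phi = math.exp(-z * z / 2) / math.sqrt(2 * math.pi)   # N(0,1) density
    Phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))           # N(0,1) cdf
    exp_short = sd * (phi - z * (1 - Phi))   # E[(D - q)+], expected shortfall
    exp_left = exp_short + order_q - mean    # E[(q - D)+], expected leftover
    return markdown * exp_left + lost_margin * exp_short
```

Holding the order quantity at mean demand, the cost scales linearly with the demand standard deviation, which is why variety that fragments demand into more uncertain pieces drives mediation costs up.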

Journal ArticleDOI
TL;DR: This work explicitly investigates the marketing-manufacturing trade-off and derives analytical implications for three possible design configurations (unique, premium-common, and basic-common) to develop an index that can rank order components in terms of their attractiveness for commonality.
Abstract: Product design decisions substantially affect the cost and revenue drivers. A design configuration with commonality can lower manufacturing cost. However, such a design may hinder the ability to extract price premiums through product differentiation. We explicitly investigate the marketing-manufacturing trade-off and derive analytical implications for three possible design configurations: unique, premium-common, and basic-common. Our model considers two distinct segments of consumers. Some of the implications of our analysis are not readily apparent. For example, when the high-quality component is made common, the average quality of the products offered to the two segments increases. One may infer that with higher average quality, higher prices or higher total revenues might ensue. However, this may not be the case, as detailed in the paper. Finally, our analysis provides a useful framework to develop an index that can rank order components in terms of their attractiveness for commonality.

Journal ArticleDOI
TL;DR: Analytically and through simulation, it is shown that the manufacturer's benefit is insignificant when the parameters of the AR(1) process are known to both parties, as in Lee, So, and Tang (LST).
Abstract: In a recent paper, Lee, So, and Tang (2000) showed that in a two-level supply chain with non-stationary AR(1) end demand, the manufacturer benefits significantly when the retailer shares point-of-sale (POS) demand data. We show in this paper, analytically and through simulation, that the manufacturer's benefit is insignificant when the parameters of the AR(1) process are known to both parties, as in Lee, So, and Tang (LST). The key reason for the difference between our results and those of LST is that LST assume that the manufacturer also uses an AR(1) process to forecast the retailer order quantity. However, the manufacturer can reduce the variance of its forecast further by using the entire order history to which it has access. Thus, when intelligent use of already available internal information (order history) suffices, there is no need to invest in interorganizational systems for information sharing.
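The inversion argument can be sketched directly. Assuming, purely for illustration, that the retailer's order-up-to level is linear in current demand with a coefficient b known to both parties, the manufacturer can recover the entire demand series from its own order history, so shared POS data add nothing:

```python
import random

def simulate(n=200, mu=10.0, rho=0.6, sigma=1.0, b=1.5, seed=2):
    """Retailer faces AR(1) demand d_t = mu + rho*d_{t-1} + eps_t and
    follows an order-up-to policy S_t = a + b*d_t, so its order is
    q_t = d_t + b*(d_t - d_{t-1}). All constants are illustrative."""
    rng = random.Random(seed)
    d = mu / (1 - rho)              # start at the stationary mean
    demands, orders = [], []
    for _ in range(n):
        d_new = mu + rho * d + rng.gauss(0.0, sigma)
        orders.append(d_new + b * (d_new - d))
        demands.append(d_new)
        d = d_new
    return demands, orders

def infer_demand(orders, d0, b=1.5):
    """Manufacturer inverts q_t = (1+b)*d_t - b*d_{t-1} recursively from
    a known starting demand d0: with the policy parameters known, the
    order history reveals the demand history exactly."""
    d_prev, recovered = d0, []
    for q in orders:
        d_t = (q + b * d_prev) / (1 + b)
        recovered.append(d_t)
        d_prev = d_t
    return recovered
```

Running `simulate()` and then `infer_demand(orders, d0=25.0)` reproduces the demand path to within floating-point error, which is the paper's core observation in miniature.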

Journal ArticleDOI
TL;DR: It is shown that state-dependent (s, S) and base-stock policies are optimal for stochastic inventory systems with and without fixed costs, and conditions under which advance demand information has no operational value are determined.
Abstract: There is a growing consensus that a portfolio of customers with different demand lead times can lead to higher, more regular revenues and better capacity utilization. Customers with positive demand lead times place orders in advance of their needs, resulting in advance demand information. This gives rise to the problem of finding effective inventory control policies under advance demand information. We show that state-dependent (s, S) and base-stock policies are optimal for stochastic inventory systems with and without fixed costs. The state of the system reflects our knowledge of advance demand information. We also determine conditions under which advance demand information has no operational value. A numerical study allows us to obtain additional insights and to evaluate strategies to induce advance demand information.
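A minimal sketch of how advance demand information can enter a base-stock rule: orders already booked for future periods are netted out of the inventory position, so known future demand is replenished for directly rather than buffered by safety stock. The function and its arguments are hypothetical illustrations, not the paper's policy:

```python
def order_quantity(on_hand, pipeline, advance_orders, base_stock):
    """Modified base-stock rule under advance demand information:
    the inventory position nets out booked future orders, and the
    replenishment order raises it back to the base-stock level.
    Illustrative sketch; advance_orders is a list of committed
    future-period quantities."""
    position = on_hand + pipeline - sum(advance_orders)
    return max(base_stock - position, 0)
```

For example, with 10 units on hand, 5 in the pipeline, booked advance orders of 3 and 2, and a base-stock level of 20, the rule orders 10 units: the 5 committed units are covered explicitly instead of eating into the safety stock held against uncertain demand.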

Journal ArticleDOI
TL;DR: The impact of fixed and variable costs on the structure and competitiveness of supply chains with a serial structure and price-sensitive linear deterministic demand and the effects of vertical integration in the two-tier case are examined.
Abstract: Supply chains often consist of several tiers, with different numbers of firms competing at each tier. A major determinant of the structure of supply chains is the cost structure associated with the underlying manufacturing process. In this paper, we examine the impact of fixed and variable costs on the structure and competitiveness of supply chains with a serial structure and price-sensitive linear deterministic demand. The entry stage is modeled as a simultaneous game, where the players take the outcomes of the subsequent post-entry Cournot competition into account in making their entry decisions. We derive expressions for prices and production quantities as functions of the number of entrants at each tier of a multitier chain. We characterize viability and stability of supply-chain structures and show, using lattice arguments, that there is always an equilibrium structure in pure strategies in the entry game. Finally, we examine the effects of vertical integration in the two-tier case. Altogether, the paper provides a framework for comparing a variety of supply-chain structures and for studying how they are affected by cost structures and by the number of entrants throughout the chain.
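For the linear-demand case, successive Cournot competition has a convenient closed form: each tier with n firms contracts equilibrium output by the factor n/(n+1), a standard result for linear demand. A short function illustrates the comparative statics; fixed costs and the entry stage of the paper's model are omitted here:

```python
def chain_output(a, b, c, tiers):
    """Equilibrium total output of a serial supply chain with Cournot
    competition at each tier, linear demand P = a - b*Q, and total unit
    variable cost c summed across tiers. With n firms at a tier, that
    tier contracts output by n/(n+1); tiers=[1] is the integrated
    monopolist. Fixed costs and entry decisions are omitted (sketch)."""
    q = (a - c) / b                 # competitive (zero-markup) output
    for n in tiers:
        q *= n / (n + 1)            # per-tier Cournot contraction
    return q
```

Comparing `chain_output(100, 1, 20, [1, 1])` with `chain_output(100, 1, 20, [1])` reproduces the double-marginalization loss that motivates the paper's analysis of vertical integration in the two-tier case.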