
Showing papers in "Decision Sciences in 2005"


Journal ArticleDOI
TL;DR: Whether uncertainty, equivocality, and platform development strategy change the relationships among internal integration, external integration, and competitive capabilities is considered.
Abstract: Effective product development requires firms to unify internal and external participants. As companies attempt to create this integrated environment, two important questions emerge. Does a high level of internal integration lead to a higher level of external integration? In the context of product development, this study considers whether internal integration in the form of concurrent engineering practices affects the level of external integration as manifested by customer integration, supplier product integration, and supplier process integration. External integration, in turn, may influence competitive capabilities, namely product innovation performance and quality performance. Second, using contingency theory, do certain contextual variables moderate the linkages between integration strategy (external and internal) and performance? Specifically, this study considers whether uncertainty, equivocality, and platform development strategy change the relationships among internal integration, external integration, and competitive capabilities. Data collected from 244 manufacturing firms across several industries were used to test these research questions. The results indicate that both internal and external integration positively influence product innovation and quality and ultimately, profitability. With respect to contingency effects, the results indicate that equivocality moderates the relationships between integration and performance.

801 citations


Journal ArticleDOI
TL;DR: The results suggest that strategy integration plays a strong, central role in the creation of manufacturing cost efficiency and new product flexibility capabilities and mediate the influence of strategy integration on market-based performance.
Abstract: Manufacturing plant managers have sought performance improvements through implementing best practices discussed in World Class Manufacturing literature. However, our collective understanding of linkages between practices and performance remains incomplete. This study seeks a more complete theory, advancing the idea that strategy integration and enhanced manufacturing capabilities such as cost efficiency and flexibility serve as intermediaries by which practices affect performance. Hypotheses related to this thesis are tested using data from 57 North American manufacturing plants that are past winners and finalists in Industry Week's "America's Best" competition (Drickhamer, 2001). The results suggest that strategy integration plays a strong, central role in the creation of manufacturing cost efficiency and new product flexibility capabilities. Furthermore, strategy integration moderates the influences of product-process development, supplier relationship management, workforce development, just-in-time flow, and process quality management practices on certain manufacturing capabilities. In turn, manufacturing cost efficiency and new product flexibility capabilities mediate the influence of strategy integration on market-based performance. These findings have implications for practice and for future research.

361 citations


Journal ArticleDOI
TL;DR: The study shows how specific IOS decisions allow manufacturing firms to better manage their dependence on the supplier for resources and thereby select system functionalities that are consistent with their own operating environments and the desired supply chain design.
Abstract: Manufacturing firms are increasingly seeking cost and other competitive advantages by tightly coupling and managing their relationship with suppliers. Among other mechanisms, interorganizational systems (IOS) that facilitate boundary-spanning activities of a firm enable them to effectively manage different types of buyer–supplier relationships. This study integrates literature from the operations and information systems fields to create a joint perspective in understanding the linkages between the nature of the IOS, buyer–supplier relationships, and manufacturing performance at the dyadic level. External integration, breadth, and initiation are used to capture IOS functionality, and their effect on process efficiency and sourcing leverage is examined. The study also explores the differences in how manufacturing firms use IOS when operating under varying levels of competitive intensity and product standardization. In order to test the research models and related hypotheses, empirical data on buyer–supplier dyads are collected from manufacturing firms. The results show that only higher levels of external integration that go beyond simple procurement systems, as well as who initiates the IOS, allow manufacturing firms to enhance process efficiency. In contrast, IOS breadth and IOS initiation enable manufacturing firms to enhance sourcing leverage over their suppliers. In addition, firms making standardized products in highly competitive environments tend to achieve higher process efficiencies and have higher levels of external integration. The study shows how specific IOS decisions allow manufacturing firms to better manage their dependence on the supplier for resources and thereby select system functionalities that are consistent with their own operating environments and the desired supply chain design.

359 citations


Journal ArticleDOI
TL;DR: An alternative approach is proposed that explicitly addresses the modeling of control structures and is meant to contribute to improved decision making in terms of recognizing and understanding opportunities for improved supply chain design.
Abstract: Owing to its inherent modeling flexibility, simulation is often regarded as the proper means for supporting decision making on supply chain design. The ultimate success of supply chain simulation, however, is determined by a combination of the analyst's skills, the chain members' involvement, and the modeling capabilities of the simulation tool. This combination should provide the basis for a realistic simulation model, which is both transparent and complete. The need for transparency is especially strong for supply chains as they involve (semi)autonomous parties each having their own objectives. Mutual trust and model effectiveness are strongly influenced by the degree of completeness of each party's insight into the key decision variables. Ideally, visual interactive simulation models present an important communicative means for realizing the required overview and insight. Unfortunately, most models strongly focus on physical transactions, leaving key decision variables implicit for some or all of the parties involved. This especially applies to control structures, that is, the managers or systems responsible for control, their activities, and the mutual coordination of these activities. Control elements are, for example, dispersed over the model, are not visualized, or form part of the time-indexed scheduling of events. In this article, we propose an alternative approach that explicitly addresses the modeling of control structures. First, we will conduct a literature survey with the aim of listing simulation model qualities essential for supporting successful decision making on supply chain design. Next, we use this insight to define an object-oriented modeling framework that facilitates supply chain simulation in a more realistic manner. This framework is meant to contribute to improved decision making in terms of recognizing and understanding opportunities for improved supply chain design. Finally, the use of the framework is illustrated by a case example concerning a supply chain for chilled salads.

238 citations


Journal ArticleDOI
TL;DR: This article employs a contingency approach, arguing that the KM announcement would have a positive short-term impact on firm value in some conditions but not in others, and provides empirical support for the theory-based arguments, and helps develop a contingency framework of the effectiveness of KM efforts.
Abstract: The importance of knowledge management (KM) processes for organizational performance is now well recognized. Seeking to better understand the short-term impact of KM on firm value, this article focuses on public announcements of information technology (IT)-based KM efforts, and uses cumulative abnormal return (CAR) associated with an announcement as the dependent variable. This article employs a contingency approach, arguing that the KM announcement would have a positive short-term impact on firm value in some conditions but not in others. Thus, it pursues the following research question: What are the effects of contextual factors on the CAR associated with the announcement of an IT-based KM effort? Specific hypotheses are proposed based on information-processing theory, organizational learning theory, the knowledge-based theory of the firm, and the theory of knowledge creation. These hypotheses link CARs to alignment between industry innovativeness and the KM process, alignment between firm efficiency and the KM process, firm-specific instability, and firm diversification. The empirical study utilizes secondary data on 89 KM announcements from 1995 to 2002. The results largely support the hypotheses. Overall, this article provides empirical support for the theory-based arguments, and helps develop a contingency framework of the effectiveness of KM efforts.

227 citations
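The event-study machinery behind the cumulative abnormal return (CAR) dependent variable is standard; a minimal market-model sketch (illustrative numbers, not the article's sample of 89 announcements):

```python
def abnormal_returns(firm, market, alpha, beta):
    """Market-model abnormal returns: AR_t = r_t - (alpha + beta * r_m_t)."""
    return [r - (alpha + beta * rm) for r, rm in zip(firm, market)]

def car(firm, market, alpha, beta):
    """Cumulative abnormal return over the event window."""
    return sum(abnormal_returns(firm, market, alpha, beta))

# Hypothetical three-day event window; alpha and beta would be estimated
# on a clean period before the KM announcement.
firm_returns = [0.020, 0.010, -0.005]
market_returns = [0.010, 0.000, -0.005]
print(round(car(firm_returns, market_returns, alpha=0.0, beta=1.0), 4))
```

A positive CAR over the announcement window is read as a short-term gain in firm value attributable to the event.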


Journal ArticleDOI
TL;DR: The results show that the EUCS is a valid and robust instrument in the Web environment but that one of the subfactors, timeliness, will need further refinement in the future.
Abstract: The purpose of this study is to revise and revalidate the End-User Computing Satisfaction (EUCS) instrument to measure satisfaction with a Web site from a usability perspective. This study is especially important given the increased significance of the Web and the uniqueness of the Web as a computing environment. A total of 176 students participated in a lab simulation that involved a usability evaluation of the Lands' End Web site (http://www.landsend.com). Students were asked to complete a set of tasks, record their answers, and then complete the EUCS instrument. Confirmatory factor analysis and invariance analyses were conducted to test the reliability, validity, and generalizability of the revised EUCS. The results show that the EUCS is a valid and robust instrument in the Web environment but that one of the subfactors, timeliness, will need further refinement in the future. Usability practitioners can use the EUCS to measure end-user satisfaction with a Web site and use the feedback for improving Web-site design. We describe a case study of an actual usability application that utilized the revised EUCS effectively to support the design of building supply Web sites involving two types of end users, homeowners and contractors. We also propose a typology that researchers can use as a starting point to judge when it is necessary to revalidate an instrument like the EUCS. Finally, we discuss the limitations of our study and present avenues for future research.

170 citations


Journal ArticleDOI
TL;DR: This article studies the use of intervals in the simple multiattribute rating technique (SMART) and SWING weighting methods, and generalizes the methods by allowing the reference attribute to be any attribute, not just the most or the least important one.
Abstract: Interval judgments are a way of handling preferential and informational imprecision in multicriteria decision analysis (MCDA). In this article, we study the use of intervals in the simple multiattribute rating technique (SMART) and SWING weighting methods. We generalize the methods by allowing the reference attribute to be any attribute, not just the most or the least important one, and by allowing the decision maker to reply with intervals to the weight ratio questions to account for his/her judgmental imprecision. We also study the practical and procedural implications of using imprecision intervals in these methods. These include, for example, how to select the reference attribute to identify as many dominated alternatives as possible. Based on the results of a simulation study, we suggest guidelines for how to carry out the weighting process in practice. Computer support can be used to make the process visual and interactive. We describe the WINPRE software for interval SMART/SWING, preference assessment by imprecise ratio statements (PAIRS), and preference programming. The use of interval SMART/SWING is illustrated by a job selection example.

163 citations
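The dominance check at the heart of interval SMART/SWING can be sketched: with weight-ratio intervals, the overall-value difference between two alternatives is a ratio of linear functions of the ratios, so its extrema over the interval box occur at vertices. A small sketch with hypothetical numbers (not the WINPRE implementation):

```python
from itertools import product

def value_diff_bounds(ratio_intervals, scores_a, scores_b):
    """Min/max of V(A) - V(B) over all feasible weights, where weights are
    w_i = r_i / sum(r) and each ratio r_i lies in its interval.
    The difference is monotone in each r_i, so vertex enumeration suffices."""
    diffs = []
    for r in product(*ratio_intervals):
        total = sum(r)
        diffs.append(sum(ri * (a - b)
                         for ri, a, b in zip(r, scores_a, scores_b)) / total)
    return min(diffs), max(diffs)

def dominates(ratio_intervals, scores_a, scores_b):
    """A dominates B if V(A) >= V(B) for every feasible weight vector."""
    lo, hi = value_diff_bounds(ratio_intervals, scores_a, scores_b)
    return lo >= 0 and hi > 0

# Two attributes: the reference ratio is fixed at 1, the other judged in [1, 2].
ratios = [(1.0, 1.0), (1.0, 2.0)]
print(dominates(ratios, [0.0, 1.0], [1.0, 0.0]))
```

Identifying dominated alternatives this way is what motivates the article's guideline on choosing the reference attribute carefully.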


Journal ArticleDOI
TL;DR: The research answers a call for rigorous research in the area of predictive marketing, an area in which many companies are excelling but where there is a scarcity of detailed knowledge regarding application of such models.
Abstract: This research presents the development of behavioral scoring models to predict future customer purchases in an online ordering application. Internet retailing lowers many barriers for customers switching between retailers for repeat purchases; thus, retaining existing customers is a key challenge for achieving profitability. Survey data were collected from 1,089 online customers of two companies. The subjective survey data were then used to predict purchases over the ensuing 12 months based on data from the company databases. The analysis illustrates the general applicability of predictive models of future customer purchases while also demonstrating the need to develop specific models tailored for an individual company’s operating and marketing environment. The models provide insight on how companies can target marketing dollars more effectively and allocate investment across multiple operational areas for maximum return. The research answers a call for rigorous research in the area of predictive marketing, an area in which many companies are excelling but where there is a scarcity of detailed knowledge regarding application of such models.

133 citations


Journal ArticleDOI
TL;DR: A procedure for measuring the competencies that can be developed in association with a Quality Management (QM) initiative is put forward and the reliability and validity of the resulting scale is analyzed.
Abstract: Despite the important contributions made by the Competency-Based Perspective (CBP) to strategic thought, certain issues on the operational definition of the theoretical concepts that characterize this approach remain unresolved, thus limiting its empirical application. In addressing this issue, the present study puts forward a procedure for measuring the competencies that can be developed in association with a Quality Management (QM) initiative and analyzes the reliability and validity of the resulting scale. This procedure could be transferred to studies that aim to carry out an empirical analysis based on the theoretical position of the CBP.

101 citations


Journal ArticleDOI
TL;DR: The results show that past use as measured by computer-recorded log data can significantly enhance the ability to predict system usage, and suggest that an accurate prediction of system usage requires a more rigorous approach than that often applied in information systems research.
Abstract: The objective of this study is to provide insights into how the predictive power for computer-recorded system usage can be improved. Based on 386 responses from actual users of an information system, we examine the predictive power for system usage according to the scales of the predictors used, namely, intention and past use. First, we show that the predictive power of intention can be significantly improved with the choice of an appropriate measure. However, even the desirable intention measure failed to explain two-thirds of the variance in system usage. Second, the results show that past use as measured by computer-recorded log data can significantly enhance our ability to predict system usage. Finally, when both intention and past use are controlled for, the explained variance in system usage is shown to vary widely from 20% to 73%, depending on the predictors' scales. Overall, our findings suggest that an accurate prediction of system usage requires a more rigorous approach than that often applied in information systems research.

84 citations


Journal ArticleDOI
TL;DR: The effectiveness of a tactical demand-capacity management policy to guide operational decisions in order-driven production systems is investigated via a heuristic that attempts to maximize revenue by selectively accepting or rejecting customer orders for multiple product classes when demand exceeds capacity constantly over the short term.
Abstract: This article investigates the effectiveness of a tactical demand-capacity management policy to guide operational decisions in order-driven production systems. The policy is implemented via a heuristic that attempts to maximize revenue by selectively accepting or rejecting customer orders for multiple product classes when demand exceeds capacity constantly over the short term. The performance of the heuristic is evaluated in terms of its ability to generate a higher profit compared to a first-come-first-served (FCFS) policy. The policies are compared over a wide range of conditions characterized by variations in both internal (firm) and external (market) factors. The heuristic, when used with a Whole Lot order-processing approach, produces higher profit compared to FCFS when profit margins of products are substantially different from each other and demand exceeds capacity by a large amount. In other cases it is better to use the heuristic in conjunction with the Split Lot order-processing approach.
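The principle of selective acceptance when demand exceeds capacity can be illustrated with a simple margin-per-unit-of-capacity greedy rule, contrasted with FCFS (an illustration of the idea only, not the article's heuristic; the order book below is hypothetical):

```python
def accept_greedy(orders, capacity):
    """Accept orders in decreasing profit-per-hour order while capacity lasts."""
    profit = 0
    for p, hours in sorted(orders, key=lambda o: o[0] / o[1], reverse=True):
        if hours <= capacity:
            capacity -= hours
            profit += p
    return profit

def accept_fcfs(orders, capacity):
    """First-come-first-served: accept each arriving order if it still fits."""
    profit = 0
    for p, hours in orders:
        if hours <= capacity:
            capacity -= hours
            profit += p
    return profit

# (profit, processing hours) in arrival order; 4 hours of capacity available.
orders = [(30, 1), (90, 3), (100, 2)]
print(accept_fcfs(orders, 4), accept_greedy(orders, 4))
```

When margins differ sharply and demand outstrips capacity, selectivity beats arrival order, which mirrors the conditions under which the article finds the heuristic outperforms FCFS.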

Journal ArticleDOI
TL;DR: Analytical and numerical results clearly show that, compared to the riskless-demand benchmark, dealing with uncertainty through a stochastic demand leads to lower (higher) retail prices under additive (multiplicative) error, and to higher claw backs in both error structures wherever applicable.
Abstract: This article considers the joint development of the optimal pricing and ordering policies of a profit-maximizing retailer, faced with (i) a manufacturer trade incentive in the form of a price discount for itself or a rebate directly to the end customer; (ii) a stochastic consumer demand dependent upon the magnitude of the selling price and of the trade incentive, that is contrasted with a riskless demand, which is the expected value of the stochastic demand; and (iii) a single-period newsvendor-type framework. Additional analysis includes the development of equal profit policies in either form of trade incentive, an assessment of the conditions under which a one-dollar discount is more profitable than a one-dollar rebate, and an evaluation of the impact upon the retailer-expected profits of changes in either incentive or in the degree of demand uncertainty. A numerical example highlights the main features of the model. The analytical and numerical results clearly show that, as compared to the results for the riskless demand, dealing with uncertainty through a stochastic demand leads to (i) (lower) higher retail prices if additive (multiplicative) error, (ii) lower (higher) pass throughs if additive (multiplicative) error, (iii) higher claw backs in both error structures wherever applicable, and (iv) higher rebates to achieve equivalent profits in both error structures.
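The additive-error case can be illustrated numerically: with demand D = a - bp + e, e uniform on [-u, u], a grid search over price and order quantity recovers the known result that the optimal stochastic price sits below the riskless price. All parameters below are hypothetical (single period, unsold units worthless, shortages lost):

```python
def expected_sales(q, d, u):
    """E[min(q, D)] for D uniform on [d - u, d + u]."""
    lo, hi = d - u, d + u
    if q >= hi:
        return d
    if q <= lo:
        return q
    # Partial coverage: integrate x over [lo, q], plus q times P(D > q).
    return (q * q - lo * lo) / (4 * u) + q * (hi - q) / (2 * u)

a, b, c, u = 100.0, 2.0, 10.0, 20.0
riskless_price = (a + b * c) / (2 * b)  # argmax of (p - c)(a - b p)

best = max(
    ((p / 10, q / 2) for p in range(200, 380) for q in range(0, 200)),
    key=lambda pq: pq[0] * expected_sales(pq[1], a - b * pq[0], u) - c * pq[1],
)
print(riskless_price, best[0])
```

The grid optimum prices slightly below the riskless 30, matching the "(lower) retail prices if additive error" finding.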

Journal ArticleDOI
TL;DR: A systematic approach that incorporates fuzzy set theory in conjunction with portfolio matrices to assist managers in reaching a better understanding of the overall competitiveness of their business portfolios is proposed.
Abstract: We propose a systematic approach that incorporates fuzzy set theory in conjunction with portfolio matrices to assist managers in reaching a better understanding of the overall competitiveness of their business portfolios. Integer linear programming is also accommodated in the proposed integrated approach to help select strategic plans by using the results derived from the previous portfolio analysis and other financial data. The proposed integrated approach is designed from a strategy-oriented perspective for portfolio management at the corporate level. It has the advantage of handling decision makers' uncertainty in evaluation, providing a technique that reflects their varying levels of confidence and optimism. Furthermore, integer linear programming is used because it offers an effective quantitative method for managers to allocate constrained resources optimally among proposed strategies. An illustration from a real-world situation demonstrates the integrated approach. Although a particular portfolio matrix model has been adopted in our research, the procedure proposed here can be modified to incorporate other portfolio matrices.
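The integer-programming step, choosing strategies under a resource constraint, amounts to a 0-1 selection problem. A brute-force sketch for clarity (a solver would be used at scale; the scores, costs, and budget are hypothetical):

```python
from itertools import combinations

def best_portfolio(scores, costs, budget):
    """Pick the subset of strategies maximizing total score within budget."""
    n = len(scores)
    best_value, best_set = 0, set()
    for k in range(n + 1):
        for subset in combinations(range(n), k):
            cost = sum(costs[i] for i in subset)
            value = sum(scores[i] for i in subset)
            if cost <= budget and value > best_value:
                best_value, best_set = value, set(subset)
    return best_value, best_set

# Three candidate strategies scored by the portfolio analysis.
print(best_portfolio(scores=[6, 5, 4], costs=[3, 2, 2], budget=4))
```

Note how the budget makes two cheaper strategies jointly preferable to the single highest-scoring one, the kind of trade-off the ILP formulation resolves.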

Journal ArticleDOI
TL;DR: In this article, the authors model optimal responses to unplanned employee absences in multi-server queueing systems that provide discrete, pay-per-use services for impatient customers, and assess the performance of alternate absence recovery strategies under various staffing and scheduling regimes.
Abstract: The U.S. service sector loses 2.3% of all scheduled labor hours to unplanned absences, but in some industries, the total cost of unplanned absences approaches 20% of payroll expense. The principal reasons for unscheduled absences (personal illness and family issues) are unlikely to abate anytime soon. Despite this, most labor scheduling systems continue to assume perfect attendance. This oversight masks an important but rarely addressed issue in services management: how to recover from short-notice, short-term reductions in planned capacity. In this article, we model optimal responses to unplanned employee absences in multi-server queueing systems that provide discrete, pay-per-use services for impatient customers. Our goal is to assess the performance of alternate absence recovery strategies under various staffing and scheduling regimes. We accomplish this by first developing optimal labor schedules for hypothetical service environments with unreliable workers. We then simulate unplanned employee absences, apply an absence recovery model, and compute system profits. Our absence recovery model utilizes recovery strategies such as holdover overtime, call-ins, and temporary workers. We find that holdover overtime is an effective absence recovery strategy provided sufficient reserve capacity (maximum allowable work hours minus scheduled hours) exists. Otherwise, less precise and more costly absence recovery methods such as call-ins and temporary help service workers may be needed. We also find that choices for initial staffing and scheduling policies, such as planned overtime and absence anticipation, significantly influence the likelihood of successful absence recovery. To predict the effectiveness of absence recovery policies under alternate staffing/scheduling strategies and operating environments, we propose an index based on initial capacity reserves.
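The multi-server queueing setting invoked here rests on standard results, for instance the Erlang C probability that an arriving customer must wait given c servers and offered load a = lambda/mu. A textbook formula (background for the setting, not the article's simulation model):

```python
def erlang_c(servers, offered_load):
    """P(wait) in an M/M/c queue, via the numerically stable Erlang B recursion."""
    b = 1.0  # Erlang B blocking probability with 0 servers
    for n in range(1, servers + 1):
        b = offered_load * b / (n + offered_load * b)
    return servers * b / (servers - offered_load * (1 - b))

print(erlang_c(2, 1.0))
```

An unplanned absence reduces `servers` by one, which is exactly the short-notice capacity loss the recovery strategies must offset.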

Journal ArticleDOI
TL;DR: The Economic Payout Model for Service Guarantees (EPMSG) is defined that provides an optimal service guarantee economic payout under certain conditions and its objective function considers customer revenue over the short- and long-term, the cost of creating and providing the service, the cost of recovery, the probability of a service failure, and the likelihood of customer retention as a function of economic payout.
Abstract: Service guarantees consist of a promise to a customer (marketing), the delivery of a service to the customer (operations), and actions to appease the customer when service failures happen (recovery). A part of recovery involves offering the customer an economic and/or noneconomic payout when things go wrong. When the economic payout is too high or low, the impact on the organization and the customer is usually negative. Therefore, determining the size of the economic payout is of critical strategic and tactical importance in businesses. Yet, no systematic quantitative methods are found in the literature to help managers determine the economic payout for service failures. The current ways an economic payout is determined are management judgment, the consensus of customer focus groups, competitive benchmarking, and the use of simple expected value methods. In this article, we define the Economic Payout Model for Service Guarantees (EPMSG) that provides an optimal service guarantee economic payout under certain conditions. The EPMSG and its objective function consider customer revenue over the short- and long-term, the cost of creating and providing the service, the cost of recovery, the probability of a service failure, and the probability of customer retention as a function of economic payout. A numerical example is provided of how the EPMSG works. Customer retention probability distributions are examined assuming normal and gamma distributions. We end the article by describing the theoretical contributions, model limitations, managerial implications, and opportunities for future research.
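The trade-off the EPMSG captures, payout now versus retained future revenue, can be sketched with an assumed retention curve. The exponential retention form, the customer value, and all numbers below are illustrative assumptions, not the authors' model:

```python
import math

def expected_value_after_failure(payout, customer_value, k):
    """Retention is assumed to rise with the payout as 1 - exp(-k * payout);
    the firm trades the payout paid now against retained future revenue."""
    retention = 1.0 - math.exp(-k * payout)
    return retention * customer_value - payout

V, k = 200.0, 0.05  # hypothetical retained customer value and sensitivity
best_payout = max((x / 10 for x in range(0, 1000)),
                  key=lambda x: expected_value_after_failure(x, V, k))
print(best_payout)
```

Too small a payout forfeits retention; too large a payout costs more than the revenue it saves, so an interior optimum exists, here near 46.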

Journal ArticleDOI
TL;DR: Surprisingly, counter-intuitive findings reveal that the unilateral application of e-procurement technology by the buyer may lower his purchasing costs, but increase the seller's and system's costs.
Abstract: This research investigates the impact of electronic replenishment strategy on the operational activities and performance of a two-stage make-to-order supply chain. We develop simulation-based rolling schedule procedures that link the replenishment processes of the channel members and apply them in an experimental analysis to study manual, semi-automated, and fully automated e-replenishment strategies in decentralized and coordinated decision-making supply chain structures. The average operational cost reductions for moving from a manual-based system to a fully automated system are 19.6, 29.5, and 12.5%, respectively, for traditional decentralized, decentralized with information sharing, and coordinated supply chain structures. The savings are neither equally distributed among participants, nor consistent across supply chain structures. As expected, for the fully coordinated system, total costs monotonically decrease with higher levels of automation. However, for the two decentralized structures, under which most firms operate today, counter-intuitive findings reveal that the unilateral application of e-procurement technology by the buyer may lower his purchasing costs, but increase the seller's and system's costs. The exact nature of the relationship is determined by the channel's operational flexibility. Broader results indicate that while the potential economic benefit of e-replenishment in a decentralized system is substantial, greater operational improvements may be possible through supply chain coordination.

Journal ArticleDOI
TL;DR: This study shows that market heterogeneity presents an effective discriminating factor for the supplier to segment customers in the design of a coordination mechanism and extends traditional quantity discounts to discount policies based on both buyers' individual order size and their annual volume.
Abstract: A challenge of supply chain management is to align the objectives, and hence coordinate the activities, of independent supply chain members. In this study, we approach this problem in a simple way by extending traditional quantity discounts that are based solely on buyers' individual order size to discount policies that are based on both buyers' individual order size and their annual volume. We show that discount policies are able to achieve nearly optimal system profit and, hence, provide effective coordination, for a decentralized two-echelon distribution system, whereby a supplier sells a product to a group of heterogeneous and independent retailers, each facing a demand curve that slopes downward in its retail price. When buyers are heterogeneous, a critical issue of coordination is to motivate different customers to increase their demand and lot size according to their potential so as to improve profits. We show that market heterogeneity presents an effective discriminating factor for the supplier to segment customers in the design of a coordination mechanism.
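On the buyer's side, the classical response to an all-units quantity discount is the textbook EOQ procedure, which the article's order-size-plus-annual-volume scheme generalizes. A sketch with hypothetical data:

```python
import math

def best_order_quantity(demand, order_cost, tiers, holding_rate):
    """tiers: list of (min_qty, unit_price), ascending in min_qty.
    For each tier use its EOQ if it falls in the tier, else the tier's
    lower break point; return the (quantity, annual cost) of the cheapest."""
    best = None
    for i, (min_qty, price) in enumerate(tiers):
        h = holding_rate * price
        q = max(math.sqrt(2 * demand * order_cost / h), min_qty)
        next_break = tiers[i + 1][0] if i + 1 < len(tiers) else float("inf")
        if q >= next_break:  # the next (cheaper) tier dominates this quantity
            continue
        cost = demand / q * order_cost + q / 2 * h + demand * price
        if best is None or cost < best[1]:
            best = (q, cost)
    return best

# Hypothetical: 1,000 units/yr, $10 per order, 20% holding, 4% discount at 200.
q, cost = best_order_quantity(1000, 10, [(0, 5.0), (200, 4.8)], 0.2)
print(q, cost)
```

Here the buyer orders up to the 200-unit break point even though the undiscounted EOQ is smaller, which is precisely the lever a supplier-designed discount pulls to enlarge lot sizes.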

Journal ArticleDOI
TL;DR: An operationalization of a strategy to identify the sequence of particular types of actions that work teams should pursue over time to apply process knowledge for reducing yield variation is proposed.
Abstract: The widespread recognition of the detrimental effects of high yield variation in advanced manufacturing technology settings, both in terms of cost and management of production processes, underscores the need to develop effective strategies for reducing yield variation. In this article, we report the findings of a longitudinal field study in an electromechanical motor assembly plant where we examined how the application of process knowledge by production work teams can reduce yield variation. We propose and provide an operationalization of a strategy to identify the sequence of particular types of actions—actions to control the mean followed by actions to control the variance—that work teams should pursue over time to apply process knowledge for reducing yield variation. The results of our empirical analysis show that yield variation was significantly reduced on three of the four production lines at the manufacturing plant that served as our research site. Differences in strategies for applying process knowledge help explain the different results on each of the production lines.

Journal ArticleDOI
TL;DR: An enhancement of an existing perturbation technique, General Additive Data Perturbation, is developed that minimizes the risk of disclosure while ensuring that the results of commonly performed statistical analyses are identical for both the original and the perturbed data.
Abstract: As modern organizations gather, analyze, and share large quantities of data, issues of privacy and confidentiality are becoming increasingly important. Perturbation methods are used to protect confidentiality when confidential, numerical data are shared or disseminated for analysis. Unfortunately, existing perturbation methods are not suitable for protecting small data sets. With small data sets, existing perturbation methods result in reduced protection against disclosure risk due to sampling error. Sampling error may also produce different results from the analysis of perturbed data compared to the original data, reducing data utility. In this study, we develop an enhancement of an existing perturbation technique, General Additive Data Perturbation, that can be used to effectively mask both large and small data sets. The proposed enhancement minimizes the risk of disclosure while ensuring that the results of commonly performed statistical analyses are identical for both the original and the perturbed data.
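The flavor of exact-moment perturbation can be shown in one dimension: replace the confidential values with independent noise rescaled so the sample mean and variance match the original exactly, rather than only in expectation. This is a minimal sketch of the idea, not the authors' GADP enhancement (which also preserves relationships with non-confidential attributes):

```python
import random
import statistics

def perturb_exact_moments(values, seed=7):
    """Return synthetic values whose sample mean and standard deviation
    equal the original's exactly, with no one-to-one link to the data."""
    rng = random.Random(seed)
    noise = [rng.gauss(0, 1) for _ in values]
    m, s = statistics.mean(noise), statistics.stdev(noise)
    target_m, target_s = statistics.mean(values), statistics.stdev(values)
    # Standardize the noise, then rescale to the confidential moments.
    return [target_m + target_s * (z - m) / s for z in noise]

confidential = [31.0, 42.5, 55.0, 38.2, 47.9]
masked = perturb_exact_moments(confidential)
print(round(statistics.mean(masked), 6), round(statistics.stdev(masked), 6))
```

Because the matching is exact rather than in expectation, the approach does not degrade on small data sets, which is the failure mode of sampling-based perturbation that motivates the article.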

Journal ArticleDOI
TL;DR: This work provides a framework to analyze the value proposition of options to potential sellers, option-holder behavior implications on auction processes, and seller strategies to write and price options that maximize potential revenues.
Abstract: The scenario of established business sellers utilizing online auction markets to reach consumers and sell new products is becoming increasingly common. We propose a class of risk management tools, loosely based on the concept of financial options that can be employed by such sellers. While conceptually similar to options in financial markets, we empirically demonstrate that option instruments within auction markets cannot be developed employing similar methodologies, because the fundamental tenets of extant option pricing models do not hold within online auction markets. We provide a framework to analyze the value proposition of options to potential sellers, option-holder behavior implications on auction processes, and seller strategies to write and price options that maximize potential revenues. We then develop an approach that enables a seller to assess the demand for options under different option price and volume scenarios. We compare option prices derived from our approach with those derived from the Black-Scholes model (Black & Scholes, 1973) and discuss the implications of the price differences. Experiments based on actual auction data suggest that options can provide significant benefits under a variety of option-holder behavioral patterns.
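The benchmark the authors compare against is the standard Black-Scholes call price; a self-contained version of the textbook formula (the comparison baseline, not the authors' auction-specific pricing):

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes_call(spot, strike, rate, sigma, t):
    """European call price under Black-Scholes assumptions
    (lognormal prices, continuous trading, constant volatility)."""
    d1 = (math.log(spot / strike)
          + (rate + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return spot * norm_cdf(d1) - strike * math.exp(-rate * t) * norm_cdf(d2)

print(round(black_scholes_call(100, 100, 0.05, 0.2, 1.0), 4))
```

The article's point is that these assumptions fail inside auction markets, so prices derived this way diverge from what sellers should actually charge for auction options.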

Journal ArticleDOI
TL;DR: An adaptive decision process based on case-based decision theory (CBDT) for the price-capacity problem is developed and it is shown that a CBDT DM in this setting eventually finds the optimal solution.
Abstract: The subject of this article is the simultaneous choice of product price and manufacturing capacity if demand is stochastic and service-level sensitive. In this setting, capacity as well as price have an impact on demand because several aspects of service level depend on capacity. For example, delivery time will be reduced if capacity is increased given a constant demand rate. We illustrate the relationship between service level, capacity, and demand reaction by a stylized application problem from the after-sales services industry. The reaction of customers to variations in service level and price is represented by a kinked price-demand-rate function. We first derive the optimal price-capacity combination for the resulting decision problem under full information. Subsequently, we focus on a decision maker (DM) who lacks complete knowledge of the demand function. Hence the DM is unable to anticipate the service level and consequently cannot identify the optimal solution. However, the DM will acquire additional information during the sales process and use it in subsequent revisions of the price-capacity decision. Thus, this decision making is adaptive and based on experience. In contrast to the literature, which assumes certain repetitive procedures somewhat ad hoc, we develop an adaptive decision process based on case-based decision theory (CBDT) for the price-capacity problem. Finally, we show that a CBDT DM in our setting eventually finds the optimal solution, if the DM sets the price based on absorption costs and adequately adjusts the capacity with respect to the observed demand.
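The adaptive scheme described above can be caricatured in a few lines. This toy loop is not the paper's CBDT procedure: the kinked demand function, the absorption-cost price, and all parameters are assumptions chosen only to show a DM who prices at cost and nudges capacity toward observed demand converging to a fixed point:

```python
def adaptive_price_capacity(cap0=200.0, a=100.0, b=2.0,
                            c_var=10.0, c_cap=5.0,
                            alpha=0.5, n_rounds=60):
    """Toy adaptive price-capacity loop (illustrative assumptions only).

    Price is set once by absorption costing (variable plus capacity
    cost per unit). Each round, observed demand is a kinked function
    of the service level (capacity relative to potential demand), and
    capacity is adjusted toward observed demand."""
    price = c_var + c_cap                         # absorption-cost price
    cap = cap0
    for _ in range(n_rounds):
        d_pot = max(0.0, a - b * price)           # demand at full service
        service = min(1.0, cap / d_pot) if d_pot else 0.0
        d_obs = d_pot * (0.5 + 0.5 * service)     # kinked in service level
        cap += alpha * (d_obs - cap)              # adjust toward demand
    return price, cap
```

With these assumed numbers, potential demand at the cost-based price of 15 is 70, and the capacity adjustment converges geometrically to that level from any starting capacity above it.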

Journal ArticleDOI
TL;DR: A stochastic model for a high-level abstraction of a revenue management system that allows us to understand the potential of incorporating auctions in revenue management in the presence of forecast errors associated with key parameters is developed.
Abstract: The Internet is providing an opportunity to revenue management practitioners to exploit the potential of auctions as a new price distribution channel. We develop a stochastic model for a high-level abstraction of a revenue management system (RMS) that allows us to understand the potential of incorporating auctions in revenue management in the presence of forecast errors associated with key parameters. Our abstraction is for an environment where two market segments book in sequence and revenue management approaches consider auctions in none, one, or both segments. Key insights from our robust results are (i) limited auctions are best employed closest to the final sale date, (ii) counterbalancing forecast errors associated with overall traffic intensity and the proportion of customer arrivals in a segment is more important if an auction is adopted in that segment, and (iii) it is critically important not to err on the side of overestimating market willingness to pay.
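A tiny Monte Carlo sketch conveys the flavor of comparing revenue management with and without an auction in the later segment. This is not the paper's stochastic model: capacity, arrival counts, the uniform valuation distribution, and the uniform-price auction rule are all assumptions for illustration:

```python
import random

def expected_revenue(auction_late, capacity=50, posted_price=100.0,
                     n_trials=2000, seed=7):
    """Average revenue over simulated sales seasons (toy model).

    Segment 1 books first at a posted price; segment 2, arriving close
    to the final sale date, either books at the posted price or bids in
    a uniform-price auction for the leftover capacity."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_trials):
        remaining = capacity
        # Segment 1: 40 arrivals, valuations ~ U(50, 150).
        for _ in range(40):
            if remaining > 0 and rng.uniform(50, 150) >= posted_price:
                remaining -= 1
                total += posted_price
        # Segment 2: 30 arrivals near the final sale date.
        bids = sorted((rng.uniform(50, 150) for _ in range(30)), reverse=True)
        if auction_late:
            winners = bids[:remaining]
            if winners:
                total += winners[-1] * len(winners)  # uniform clearing price
        else:
            for b in bids:
                if remaining > 0 and b >= posted_price:
                    remaining -= 1
                    total += posted_price
    return total / n_trials
```

Varying the assumed arrival rates and valuation spread in such a simulation is one informal way to see why forecast errors in traffic intensity and willingness to pay matter more for the segment running the auction.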

Journal ArticleDOI
TL;DR: This article formulate models for the acquisition of bandwidth from a buyer's perspective with a model that allows varying contract durations under deterministic demand and without allowing shortages or overlapping contracts and a simpler model, which restricts contract lengths over the planning horizon to be equal.
Abstract: Significant advances in information technology have brought about increased demand for bandwidth. Buyers of bandwidth often encounter bandwidth prices that are decreasing over time. Additionally, bandwidth prices at any point in time are decreasing in total bandwidth purchased and length of contracts. Therefore, buyers face complex decisions in terms of the number of contracts to buy, their bandwidth, and their lengths. In this article, we formulate models for the acquisition of bandwidth from a buyer's perspective. We begin with a model that allows varying contract durations under deterministic demand and without allowing shortages or overlapping contracts. We then formulate a simpler model, which restricts contract lengths over the planning horizon to be equal. We also solve the problem under probabilistic demand and allowing for shortages, which are satisfied by buying additional bandwidth at a premium. We perform numerical sensitivity analysis to compare the results of the models and illustrate the results with numerical examples. The numerical analyses illustrate that using relatively simple equal-length contracts produces approximately the same results as the more complicated unequal-length contract strategy.
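The deterministic equal-versus-unequal contract comparison can be sketched with a small enumeration. The price schedule below (declining over time and in contract length) and all parameters are assumptions, not the paper's model; the point is only to show how the two strategies are compared over a short horizon:

```python
def rate(t, L, base=100.0, decay=0.97, disc=0.05):
    """Assumed price per unit bandwidth per period for a contract of
    length L starting at period t: declining over time and in length."""
    return base * decay ** t / (1.0 + disc * L)

def plan_cost(lengths):
    """Cost of covering the horizon with back-to-back contracts."""
    t, cost = 0, 0.0
    for L in lengths:
        cost += L * rate(t, L)
        t += L
    return cost

def best_unequal(T):
    """Cheapest covering of a T-period horizon by contracts of any
    lengths (exhaustive over compositions; fine for small T)."""
    best = [float("inf")]
    def rec(remaining, lengths):
        if remaining == 0:
            best[0] = min(best[0], plan_cost(lengths))
            return
        for L in range(1, remaining + 1):
            rec(remaining - L, lengths + [L])
    rec(T, [])
    return best[0]

def best_equal(T):
    """Cheapest covering using equal-length contracts only."""
    return min(plan_cost([L] * (T // L))
               for L in range(1, T + 1) if T % L == 0)
```

Under this assumed schedule the equal-length optimum matches the unrestricted optimum over a 12-period horizon, echoing the abstract's numerical finding that simple equal-length contracts perform about as well as unequal-length strategies.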