scispace - formally typeset
Author

Sasanka Sekhar Chanda

Bio: Sasanka Sekhar Chanda is an academic researcher at the Indian Institute of Management Indore. The author has contributed to research in the topics of Organizational learning and Computer science. The author has an h-index of 4, having co-authored 15 publications receiving 43 citations.

Papers
Journal ArticleDOI
TL;DR: This study investigates organizational outcomes when predictions based on historical knowledge are not possible and managers intentionally focus their firm on either exploratory or exploitative innovation, finding that multiple exploration–exploitation combinations lead to equivalent, maximum organizational knowledge.
Abstract: Extant research has provided ambiguous answers to the question of what constitutes an ideal balance between exploration and exploitation, in stable and turbulent environments. Much of the literature emphasizes controlling organizational actions by means of predictions based on historical knowledge. In our study, we investigate organizational outcomes when such predictions are not possible and managers intentionally focus their firm on either exploratory or exploitative innovation. Using March's iconic computational simulation model, we find that multiple exploration–exploitation combinations lead to equivalent, maximum organizational knowledge, establishing a rational basis for managerial intentionality toward exploratory or exploitative innovation. We further find that the onset of environmental turbulence impacts an organization focusing on exploratory innovation differently from one focusing on exploitative innovation. The former is enabled to carry out an increasing level of its core activity, exploration. The latter is required to dial down its core activity, exploitation. Our findings suggest a resolution to the conflicting prescriptions, endemic in the literature, regarding the appropriate response to the onset of environmental turbulence.
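Several papers in this list build on March's (1991) mutual-learning simulation of exploration and exploitation. The following Python sketch illustrates the core mechanism under assumed parameters; the parameter values, the fixed iteration count, and the simplified code-learning rule are illustrative assumptions, not a reproduction of the authors' actual implementation.

```python
import random

def march_model(m=30, n=50, p1=0.1, p2=0.9, seed=0):
    """Minimal sketch of a March-style mutual-learning model.

    reality: m dimensions, each +1 or -1.
    Each of n individuals and the organizational code hold beliefs
    in {-1, 0, +1} about each dimension (0 = no belief).
    p1: probability an individual adopts the code's belief (socialization).
    p2: probability the code adopts the dominant belief of individuals
        who currently know more than the code does (codification).
    Returns the code's final knowledge: the fraction of dimensions
    on which the code's belief matches reality.
    """
    rng = random.Random(seed)
    reality = [rng.choice([-1, 1]) for _ in range(m)]
    individuals = [[rng.choice([-1, 0, 1]) for _ in range(m)]
                   for _ in range(n)]
    code = [0] * m

    def knowledge(beliefs):
        return sum(b == r for b, r in zip(beliefs, reality)) / m

    for _ in range(200):  # iterate toward equilibrium
        # Individuals learn from the code where the code has a belief.
        for ind in individuals:
            for j in range(m):
                if code[j] != 0 and ind[j] != code[j] and rng.random() < p1:
                    ind[j] = code[j]
        # The code learns from the "superior group": individuals whose
        # beliefs match reality on more dimensions than the code's do.
        k_code = knowledge(code)
        superior = [ind for ind in individuals if knowledge(ind) > k_code]
        if superior:
            for j in range(m):
                votes = sum(ind[j] for ind in superior)
                majority = (votes > 0) - (votes < 0)
                if majority != 0 and code[j] != majority and rng.random() < p2:
                    code[j] = majority
    return knowledge(code)
```

In the original model, low socialization rates (p1) preserve belief heterogeneity longer and tend to yield higher equilibrium knowledge, which is the trade-off the papers above manipulate.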

13 citations

Journal ArticleDOI
TL;DR: The author argues that the probability of correctly fashioning the subset of key elements in the intermediate output may be a good measure of the likelihood of organizational success, and uses March's iconic computational simulation model to demonstrate this principle.
Abstract: The difficulty in definitively linking outcomes of managerial action to organizational outcomes has been a festering issue in organizational research. The problem arises because it is not easy to separate the distinctive contributions of managers at intermediate stages, as well as the contribution of external factors beyond the control of managers. Specifically, certain managerial actions focusing on exploratory or exploitative innovation produce an intermediate output, organizational knowledge. From this base of organizational knowledge, further management actions craft the final output that eventually faces the market test. Drawing from complexity concepts, I argue that the probability of correctly fashioning the subset of key elements in the intermediate output may be a good measure of the probability of organizational success. I use March's iconic computational simulation model to demonstrate the merits of this principle. I model the effect of complexity on managerial intentionality towards exploratory and exploitative innovation. I elicit important insights for research and practice by comparing organizational knowledge outcomes with the outcomes for probability of organizational success, in stable and moderately turbulent environments.

9 citations

Journal ArticleDOI
TL;DR: In this article, the authors study a multinational corporation's (MNC) failure in implementing a firm-wide information technology system (ITS) project, finding that because the headquarters (HQ) personnel lacked a nuanced understanding of the micro issues in the subsidiaries, their design efforts turned out to be inadequate.
Abstract: We study a multinational corporation's (MNC) failure in implementing a firm-wide information technology system (ITS) project. To counter heightened competitive pressures, the MNC sought to improve its supply chain responsiveness by implementing the ITS project. However, since the headquarters (HQ) personnel lacked nuanced understanding of the micro issues in the subsidiaries, their design efforts turned out to be inadequate. Organizational practices that restrict member behavior to recipes from past knowledge served to amplify the problem—by disfavoring cooperation. Our study suggests a need to rethink the notion of the HQ as the design place. Further, there is a case for subsidiary personnel being afforded a greater say in design of changes to their work processes, given their higher exposure to process variety. This may offset the HQ-subsidiary power imbalance noted in prior literature. © 2015 Wiley Periodicals, Inc.

7 citations

Journal ArticleDOI
TL;DR: A two-step method of model verification is suggested—beginning with replicating the model from the published description, then turning to the program code of the original implemented model to account for divergent results.
Abstract: To validate prior agent-based models, public disclosure of model code is necessary, but not sufficient. Conceptual model replication, involving independent reproduction of a model without referring...

7 citations

Journal ArticleDOI
TL;DR: It is demonstrated that the continuum conception concerns leveraging an organization’s internal knowledge heterogeneity where managers use their prior knowledge and experiences to formulate actions to attain the maximum possible extent of organizational knowledge at equilibrium.
Abstract: Extant research is vertically divided on the question whether exploration and exploitation constitute two ends of a continuum or whether they are orthogonal activities. We suggest that both characterizations are admissible, albeit under different sets of assumptions. Using March’s iconic model, we demonstrate that the continuum conception concerns leveraging an organization’s internal knowledge heterogeneity where managers use their prior knowledge and experiences to formulate actions to attain the maximum possible extent of organizational knowledge at equilibrium. In contrast, the orthogonal conception mainly concerns assimilating heterogeneous knowledge from sources outside the organization through risky experimentation, leading to order-creation in systems operating in far-from-equilibrium conditions. We further demonstrate that the change in outcome obtained by switching from low to high rate of exploitation is larger—and therefore easier to detect—for the continuum conception. We speculate that many managers and researchers favor conceptualizing exploration–exploitation in the continuum sense, for this reason. Moreover, companies obtain far higher organizational knowledge by functioning in the orthogonal mode, than what is attainable by functioning in the continuum mode. Organizations should, therefore, strive to create conditions that foster cultivation of outside knowledge through autonomous actions of employees.

3 citations


Cited by
01 Jan 2008
TL;DR: In this article, the authors argue that rational actors make their organizations increasingly similar as they try to change them, and describe three isomorphic processes: coercive, mimetic, and normative.
Abstract: What makes organizations so similar? We contend that the engine of rationalization and bureaucratization has moved from the competitive marketplace to the state and the professions. Once a set of organizations emerges as a field, a paradox arises: rational actors make their organizations increasingly similar as they try to change them. We describe three isomorphic processes (coercive, mimetic, and normative) leading to this outcome. We then specify hypotheses about the impact of resource centralization and dependency, goal ambiguity and technical uncertainty, and professionalization and structuration on isomorphic change. Finally, we suggest implications for theories of organizations and social change.

2,134 citations

Journal Article
TL;DR: This work develops a framework for understanding the robustness of interacting networks subject to cascading failures and presents exact analytical solutions for the critical fraction of nodes that, on removal, will lead to a failure cascade and to a complete fragmentation of two interdependent networks.
Abstract: Complex networks have been studied intensively for a decade, but research still focuses on the limited case of a single, non-interacting network. Modern systems are coupled together and therefore should be modelled as interdependent networks. A fundamental property of interdependent networks is that failure of nodes in one network may lead to failure of dependent nodes in other networks. This may happen recursively and can lead to a cascade of failures. In fact, a failure of a very small fraction of nodes in one network may lead to the complete fragmentation of a system of several interdependent networks. A dramatic real-world example of a cascade of failures (‘concurrent malfunction’) is the electrical blackout that affected much of Italy on 28 September 2003: the shutdown of power stations directly led to the failure of nodes in the Internet communication network, which in turn caused further breakdown of power stations. Here we develop a framework for understanding the robustness of interacting networks subject to such cascading failures. We present exact analytical solutions for the critical fraction of nodes that, on removal, will lead to a failure cascade and to a complete fragmentation of two interdependent networks. Surprisingly, a broader degree distribution increases the vulnerability of interdependent networks to random failure, which is opposite to how a single network behaves. Our findings highlight the need to consider interdependent network properties in designing robust networks.
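The cascade mechanism this abstract describes (failures alternating between two mutually dependent networks until a stable mutual giant component remains) can be illustrated with a small Monte Carlo sketch. The code below is a toy illustration under assumed settings (Erdős–Rényi networks, one-to-one dependency links, an assumed mean degree and attack fraction); it is not the paper's analytical framework.

```python
import random

def giant_component(nodes, edges):
    """Largest connected component among `nodes`, as a set."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        if u in adj and v in adj:
            adj[u].add(v)
            adj[v].add(u)
    seen, best = set(), set()
    for s in nodes:
        if s in seen:
            continue
        comp, stack = {s}, [s]
        seen.add(s)
        while stack:  # iterative DFS
            u = stack.pop()
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    comp.add(w)
                    stack.append(w)
        if len(comp) > len(best):
            best = comp
    return best

def cascade(n=500, k=4, attack_frac=0.3, seed=1):
    """Cascading failure on two interdependent Erdos-Renyi networks.

    Node i in network A depends on node i in network B and vice versa.
    A node survives only if it lies in the giant component of its own
    network AND its counterpart survives. Returns the fraction of
    mutually surviving nodes once the cascade settles.
    """
    rng = random.Random(seed)
    p = k / (n - 1)  # edge probability giving mean degree ~k

    def er_edges():
        return [(i, j) for i in range(n) for j in range(i + 1, n)
                if rng.random() < p]

    edges_a, edges_b = er_edges(), er_edges()
    # Initial attack: remove a random fraction of nodes.
    alive = set(range(n)) - set(rng.sample(range(n), int(attack_frac * n)))
    while True:
        ga = giant_component(alive, edges_a)
        gb = giant_component(alive, edges_b)
        nxt = ga & gb  # mutual giant component
        if nxt == alive:
            return len(nxt) / n
        alive = nxt
```

Sweeping `attack_frac` in such a sketch reproduces the paper's qualitative point: coupled networks collapse abruptly at a smaller removed fraction than either network would on its own.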

132 citations

Journal ArticleDOI
01 Aug 1997
TL;DR: In this article, the authors examine both incremental learning curves and "revolutionary" learning and innovation in the area of product quality, and conclude that knowledge gained from both incremental and revolutionary learning can be used to improve product quality.
Abstract: This study examines both incremental learning curves as well as "revolutionary" learning and innovation in the area of product quality. One implication of the findings is that knowledge gained from...

93 citations

Journal ArticleDOI
TL;DR: I like to consider search procedures as those processes that are employed to locate relevant information in time, in space, and in kind, often against apparently insurmountable odds.
Abstract: Particularly because of the great amount of information that a social memory can contain, relevant information is quite often difficult to find, and these difficulties become even larger when the needed information is dispersed widely in time and in space or when special barriers are erected to protect such information from being discovered. I like to consider search procedures as those processes that are employed to locate relevant information in time, in space, and in kind, often against apparently insurmountable odds. Research and development problems provide the most prototypical examples of situations in which efficient search procedures are decisive in bridging an existing information gap. In order to incorporate into a design as much information as possible, it is quite common that a very large number of reports may have to be read, most being irrelevant in fact. Finding a solution then depends not so much on the retrieval of information as on the efficiency of the search procedure available. Because research reports tend to be somewhat more standardized (at least clearly distinct and written in the same medium), computers have already provided useful selection aids for literature references. Another problem of search is criminal detection. This may involve identifying one out of thousands of widely dispersed and highly mobile suspects. Among the many clues that may become available during an investigation, most are likely to be unproductive, and the successful conclusion of a case presumably depends on following the right clues early enough and without being sidetracked. Related to such situations is the problem which many intelligence departments face when trying to obtain information that someone else deliberately hides or encodes into an unrecognizable cipher. Less extreme though socially probably more

48 citations

01 Jan 2011
TL;DR: This paper reviews the 2009 Nobel Prize in Economics, jointly awarded to Oliver Williamson for his work on governance in organizations and the boundaries of the firm, and to Elinor Ostrom for her work on the governance of common pool resources.
Abstract: This paper reviews the 2009 Nobel Prize in Economics, jointly awarded to Oliver Williamson for his work on governance in organizations and the boundaries of the firm, and to Elinor Ostrom for her work on the governance of common pool resources. We review the careers and the research contributions of Williamson and Ostrom to the theory and analysis of economic institutions of governance. This Prize for 'economic governance' is thoroughly deserved by both winners, yet, as with the Hayek–Myrdal Prize of 1974, their respective approaches, methods and findings are almost diametrically opposed. Williamson offers a top-down, contracts-based solution to the incentive problems of opportunism in corporate governance, whereas Ostrom offers a bottom-up, communication-based solution to the governance opportunities of community resources. We offer some critical comments on Williamson's analytic work and a discussion of the potential for further application of Ostrom's case-study-based experimental methodology. We conclude with a suggested third nominee who makes better sense of how these two great scholars' works fit together, namely George Richardson.

33 citations