Sasanka Sekhar Chanda
Bio: Sasanka Sekhar Chanda is an academic researcher from the Indian Institute of Management Indore. The author has contributed to research in topics including organizational learning and scientific realism, has an h-index of 4, and has co-authored 15 publications receiving 43 citations.
Topics: Organizational learning, Scientific realism, Organizational effectiveness, Subsidiary, Organization development
TL;DR: This study investigates organizational outcomes when predictions based on historical knowledge are not possible and managers intentionally focus their firm on either exploratory or exploitative innovation, and finds that multiple exploration-exploitation combinations lead to equivalent, maximum organizational knowledge.
Abstract: Extant research has provided ambiguous answers to the question of what constitutes an ideal balance between exploration and exploitation in stable and turbulent environments. Much of the literature emphasizes controlling organizational actions by means of predictions based on historical knowledge. In our study, we investigate organizational outcomes when such predictions are not possible and managers intentionally focus their firm on either exploratory or exploitative innovation. Using March's iconic computational simulation model, we find that multiple exploration-exploitation combinations lead to equivalent, maximum organizational knowledge, establishing a rational basis for managerial intentionality toward exploratory or exploitative innovation. We further find that the onset of environmental turbulence affects an organization focusing on exploratory innovation differently from one focusing on exploitative innovation: the former is enabled to carry out an increasing level of its core activity, exploration, whereas the latter is required to dial down its core activity, exploitation. Our findings suggest a resolution to the conflicting prescriptions in the literature regarding the appropriate response to the onset of environmental turbulence.
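The abstract builds on March's (1991) mutual-learning simulation. A minimal stdlib-only sketch of that model is given below; the parameter names p1 (socialization rate), p2 (code learning rate), and p3 (turbulence rate) follow March's paper, but all other implementation details here are illustrative assumptions, not the authors' exact model:

```python
import random

def knowledge(beliefs, reality):
    # Fraction of dimensions where a belief matches reality (0 = no belief).
    return sum(b == r for b, r in zip(beliefs, reality)) / len(reality)

def simulate(m=30, n=50, p1=0.1, p2=0.9, p3=0.0, periods=100, seed=0):
    rng = random.Random(seed)
    reality = [rng.choice([-1, 1]) for _ in range(m)]       # external reality
    code = [0] * m                                          # organizational code
    people = [[rng.choice([-1, 0, 1]) for _ in range(m)] for _ in range(n)]
    for _ in range(periods):
        # Socialization: individuals adopt the code's belief with prob p1.
        for person in people:
            for d in range(m):
                if code[d] != 0 and person[d] != code[d] and rng.random() < p1:
                    person[d] = code[d]
        # The code learns from the 'superior group': individuals who outperform it.
        k_code = knowledge(code, reality)
        superior = [p for p in people if knowledge(p, reality) > k_code]
        if superior:
            for d in range(m):
                votes = sum(p[d] for p in superior if p[d] != 0)  # majority margin
                if votes == 0:
                    continue
                # Adopt the majority view with prob 1 - (1 - p2)^|margin|.
                if rng.random() < 1 - (1 - p2) ** abs(votes):
                    code[d] = 1 if votes > 0 else -1
        # Environmental turbulence: each reality dimension flips with prob p3.
        for d in range(m):
            if rng.random() < p3:
                reality[d] = -reality[d]
    return knowledge(code, reality)
```

In this setup, low p1 preserves belief diversity (exploration) while high p1 homogenizes beliefs quickly (exploitation), which is the lever the study's exploration-exploitation combinations manipulate.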
Sasanka Sekhar Chanda
TL;DR: It is argued that the probability of correctly fashioning the subset of key elements in the intermediate output may be a good measure of the likelihood of organizational success; March's iconic computational simulation model is used to demonstrate this principle.
Abstract: The difficulty in definitively linking outcomes of managerial action to organizational outcomes has been a festering issue in organizational research. The problem arises because it is not easy to separate the distinctive contributions of managers at intermediate stages, as well as the contribution of external factors beyond the control of managers. Specifically, certain managerial actions focusing on exploratory or exploitative innovation produce an intermediate output, organizational knowledge. From this base of organizational knowledge, further management actions craft the final output that eventually faces the market test. Drawing on complexity concepts, I argue that the probability of correctly fashioning the subset of key elements in the intermediate output may be a good measure of the probability of organizational success. I use March's iconic computational simulation model to demonstrate the merits of this principle. I model the effect of complexity on managerial intentionality toward exploratory and exploitative innovation. I elicit important insights for research and practice by comparing organizational knowledge outcomes with the outcomes for probability of organizational success in stable and moderately turbulent environments.
01 Nov 2016-Thunderbird International Business Review
Abstract: We study a multinational corporation's (MNC) failure in implementing a firm-wide information technology system (ITS) project. To counter heightened competitive pressures, the MNC sought to improve its supply chain responsiveness by implementing the ITS project. However, since the headquarters (HQ) personnel lacked nuanced understanding of the micro issues in the subsidiaries, their design efforts turned out to be inadequate. Organizational practices that restrict member behavior to recipes from past knowledge served to amplify the problem—by disfavoring cooperation. Our study suggests a need to rethink the notion of the HQ as the design place. Further, there is a case for subsidiary personnel being afforded a greater say in design of changes to their work processes, given their higher exposure to process variety. This may offset the HQ-subsidiary power imbalance noted in prior literature. © 2015 Wiley Periodicals, Inc.
01 Nov 2019-Strategic Organization
TL;DR: A two-step method of model verification is suggested—beginning with replicating the model from the published description, then turning to the program code of the original implemented model to account for divergent results.
Abstract: To validate prior agent-based models, public disclosure of model code is necessary, but not sufficient. Conceptual model replication, involving independent reproduction of a model without referring...
27 Mar 2015-Social Science Research Network
Abstract: Cognitive considerations arising from the functions of the Chief Executive Officer appear to have been ignored in strategy research. Taking a cue from Freeman’s stakeholder theory, I discuss the constituencies addressed by the CEO in performing his/her job function. I present a set of propositions based on extant research. I endeavor to commence a conversation that ultimately shapes the lines of inquiry regarding the contribution of managerial cognition to strategy.
01 Jan 2008-
Abstract: What makes organizations so similar? We contend that the engine of rationalization and bureaucratization has moved from the competitive marketplace to the state and the professions. Once a set of organizations emerges as a field, a paradox arises: rational actors make their organizations increasingly similar as they try to change them. We describe three isomorphic processes (coercive, mimetic, and normative) leading to this outcome. We then specify hypotheses about the impact of resource centralization and dependency, goal ambiguity and technical uncertainty, and professionalization and structuration on isomorphic change. Finally, we suggest implications for theories of organizations and social change.
15 Mar 2010-Bulletin of the American Physical Society
TL;DR: This work develops a framework for understanding the robustness of interacting networks subject to cascading failures and presents exact analytical solutions for the critical fraction of nodes that, on removal, will lead to a failure cascade and to a complete fragmentation of two interdependent networks.
Abstract: Complex networks have been studied intensively for a decade, but research still focuses on the limited case of a single, non-interacting network. Modern systems are coupled together and therefore should be modelled as interdependent networks. A fundamental property of interdependent networks is that failure of nodes in one network may lead to failure of dependent nodes in other networks. This may happen recursively and can lead to a cascade of failures. In fact, a failure of a very small fraction of nodes in one network may lead to the complete fragmentation of a system of several interdependent networks. A dramatic real-world example of a cascade of failures (‘concurrent malfunction’) is the electrical blackout that affected much of Italy on 28 September 2003: the shutdown of power stations directly led to the failure of nodes in the Internet communication network, which in turn caused further breakdown of power stations. Here we develop a framework for understanding the robustness of interacting networks subject to such cascading failures. We present exact analytical solutions for the critical fraction of nodes that, on removal, will lead to a failure cascade and to a complete fragmentation of two interdependent networks. Surprisingly, a broader degree distribution increases the vulnerability of interdependent networks to random failure, which is opposite to how a single network behaves. Our findings highlight the need to consider interdependent network properties in designing robust networks.
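The cascade mechanics described in this abstract can be illustrated with a minimal stdlib-only sketch: two random networks with one-to-one interdependency (node i in A depends on node i in B), where a node functions only if it lies in its network's largest connected component and its partner still functions. All names, parameters, and the random-graph construction here are illustrative assumptions, not the paper's exact model:

```python
import random
from collections import deque

def giant_component(nodes, adj):
    # Largest connected component among the currently alive nodes.
    alive, best, seen = set(nodes), set(), set()
    for start in alive:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v in alive and v not in seen:
                    seen.add(v); comp.add(v); queue.append(v)
        if len(comp) > len(best):
            best = comp
    return best

def random_graph(n, avg_degree, rng):
    # Simple Erdos-Renyi-style graph with roughly the requested mean degree.
    adj = {i: set() for i in range(n)}
    for _ in range(int(n * avg_degree / 2)):
        u, v = rng.randrange(n), rng.randrange(n)
        if u != v:
            adj[u].add(v); adj[v].add(u)
    return adj

def cascade(n=500, avg_degree=4.0, p=0.7, seed=0):
    rng = random.Random(seed)
    A = random_graph(n, avg_degree, rng)
    B = random_graph(n, avg_degree, rng)
    # Initial random failure: keep a fraction p of the nodes.
    alive = set(rng.sample(range(n), int(p * n)))
    while True:
        # Nodes must sit in the giant component of A, then of B, recursively.
        alive_a = giant_component(alive, A)
        alive_b = giant_component(alive_a, B)
        if alive_b == alive:          # fixed point: cascade has stopped
            return len(alive) / n     # surviving mutual giant component
        alive = alive_b
```

Sweeping p downward in this sketch reproduces the qualitative result in the abstract: below a critical fraction the surviving mutual component collapses abruptly rather than shrinking gradually.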
Daniel Z. Levin
01 Aug 1997-
Abstract: This study examines both incremental learning curves as well as "revolutionary" learning and innovation in the area of product quality. One implication of the findings is that knowledge gained from...
01 Jan 1978-Communications
TL;DR: I like to consider search procedures as those processes that are employed to locate relevant information in time, in space, and in kind, often against apparently insurmountable odds.
Abstract: Particularly because of the great amount of information that a social memory can contain, relevant information is quite often difficult to find, and these difficulties become even larger when the needed information is dispersed widely in time and in space or when special barriers are erected to protect such information from being discovered. I like to consider search procedures as those processes that are employed to locate relevant information in time, in space, and in kind, often against apparently insurmountable odds. Research and development problems provide the most prototypical examples of situations in which efficient search procedures are decisive in bridging an existing information gap. In order to incorporate into a design as much information as possible, it is quite common that a very large number of reports may have to be read, most being irrelevant in fact. Finding a solution then depends not so much on the retrieval of information as on the efficiency of the search procedure available. Because research reports tend to be somewhat more standardized (at least clearly distinct and written in the same medium), computers have already provided useful selection aids for literature references. Another problem of search is criminal detection. This may involve identifying one out of thousands of widely dispersed and highly mobile suspects. Among the many clues that may become available during an investigation, most are likely to be unproductive, and the successful conclusion of a case presumably depends on following the right clues early enough and without being sidetracked. Related to such situations is the problem which many intelligence departments face when trying to obtain information that someone else deliberately hides or encodes into an unrecognizable cipher. Less extreme though socially probably more
01 Jan 2011-
Abstract: This paper reviews the 2009 Nobel Prize in Economics jointly awarded to Oliver Williamson for his work on governance in organizations and the boundaries of the firm, and to Elinor Ostrom for her work on the governance of common pool resources. We review the careers and the research contributions of Williamson and Ostrom to the theory and analysis of economic institutions of governance. Both awards of this Prize for 'economic governance' are thoroughly deserved, yet, as with the Hayek-Myrdal Prize of 1974, the winners' respective approaches, methods and findings are almost diametrically opposed. Williamson offers a top-down contracts-based solution to the incentive problems of opportunism in corporate governance, whereas Ostrom offers a bottom-up communication-based solution to the governance opportunities of community resources. We offer some critical comments on Williamson's analytic work and discussion of the potential for further application of Ostrom's case-study based experimental methodology. We conclude with a suggested third nominee to make better sense of how these two great scholars' works fit together, namely George Richardson.