Author
Bernard P. Zeigler
Other affiliations: University of Michigan, AmeriCorps VISTA, George Mason University
Bio: Bernard P. Zeigler is an academic researcher from the University of Arizona. He has contributed to research on DEVS and discrete event simulation, has an h-index of 47, and has co-authored 406 publications receiving 13,318 citations. His previous affiliations include the University of Michigan and AmeriCorps VISTA.
Papers
01 Jan 1976
TL;DR: The authors present a rigorous mathematical foundation for modeling and simulation and provide a comprehensive framework for integrating the various simulation approaches employed in practice, including cellular automata, chaotic systems, hierarchical block diagrams, and Petri nets.
Abstract: From the Publisher:
Although twenty-five years have passed since the first edition of this classic text, and the world has seen many advances in modeling and simulation, the need for a widely accepted framework and theoretical foundation is even more pressing today. Methods of modeling and simulation are fragmented across disciplines, making it difficult to reuse ideas from other disciplines and to work collaboratively in multidisciplinary teams. Model building and simulation have been made easier and faster by riding piggyback on advances in software and hardware. However, difficult and fundamental issues such as model credibility and interoperation have received less attention. These issues are now front and center under the impetus of the High Level Architecture (HLA) standard mandated by the U.S. DoD for all contractors and agencies.
This book concentrates on integrating the continuous and discrete paradigms for modeling and simulation. A second major theme is that of distributed simulation and its potential to support the co-existence of multiple formalisms in multiple model components. Prominent throughout are the fundamental concepts of modular and hierarchical model composition.
This edition presents a rigorous mathematical foundation for modeling and simulation and now provides a comprehensive framework for integrating the various simulation approaches employed in practice, including such popular modeling methods as cellular automata, chaotic systems, hierarchical block diagrams, and Petri nets. A unifying concept, called the DEVS Bus, enables models, as expressed in their native formalisms, to be transparently mapped into the Discrete Event System Specification (DEVS). The book shows how to construct computationally efficient, object-oriented simulations of DEVS models in parallel and distributed environments. If you are doing integrative simulations, whether or not they are HLA compliant, this is the only book available that provides the foundation to understand, simplify, and successfully accomplish your task.
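The DEVS formalism mentioned above specifies an atomic model by its states, a time-advance function, internal and external transition functions, and an output function. As a hedged illustration (a minimal ad-hoc sketch under assumed conventions, not code from the book or from any DEVS library), the following models a single processor that serves one job at a time:

```python
# Minimal sketch of an atomic DEVS model plus a toy event-driven simulator.
# The Processor example and its parameters are illustrative only.

class Processor:
    """Serves one job at a time; inputs arriving while busy are ignored."""
    def __init__(self, service_time=2.0):
        self.service_time = service_time
        self.sigma = float("inf")  # time remaining until next internal event
        self.job = None

    def time_advance(self):
        # ta(s): how long the model rests in its current state
        return self.sigma

    def output(self):
        # lambda(s): emitted just before the internal transition fires
        return self.job

    def internal_transition(self):
        # delta_int: service complete, return to the passive (idle) state
        self.sigma, self.job = float("inf"), None

    def external_transition(self, elapsed, job):
        # delta_ext: react to an input after `elapsed` time in this state
        if self.job is None:
            self.sigma, self.job = self.service_time, job
        else:
            self.sigma -= elapsed  # stay busy; keep counting down

def simulate(model, inputs, horizon):
    """Drive one atomic model with a list of (time, value) input events."""
    last, outputs = 0.0, []
    inputs = sorted(inputs)
    while True:
        t_int = last + model.time_advance()
        t_ext = inputs[0][0] if inputs else float("inf")
        t = min(t_int, t_ext)
        if t > horizon:
            break
        if t_int <= t_ext:
            outputs.append((t, model.output()))
            model.internal_transition()
        else:
            _, value = inputs.pop(0)
            model.external_transition(t - last, value)
        last = t
    return outputs

print(simulate(Processor(), [(1.0, "job1"), (1.5, "job2")], 10.0))
# -> [(3.0, 'job1')]  ("job2" arrived while busy and was discarded)
```

A coupled DEVS model would wire the outputs of such components to the inputs of others; the DEVS Bus idea is that models in other formalisms can be wrapped so that they present exactly this event interface.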
Herbert Praehofer is an Assistant Professor at the Johannes Kepler University in Linz, Austria. He has over 50 publications in international journals and conference proceedings on Modeling and Computer Simulation, Systems Theory, and Software Engineering.
Tag Gon Kim is a Professor of Electrical Engineering at the Korea Advanced Institute of Science and Technology (KAIST), Taejon, Korea. His research interests include discrete event systems modeling/simulation, computer/communication systems analysis, and object-oriented simulation engineering. He is a senior member of IEEE and SCS, and a member of ACM.
* Provides a comprehensive framework for continuous and discrete event modeling and simulation
* Explores the mathematical foundation of simulation modeling
* Discusses system morphisms for model abstraction and simplification
* Presents a new approach to discrete event simulation of continuous processes
* Includes parallel and distributed simulation of discrete event models
* Presents the DEVS Bus, a concept for achieving simulator interoperability
* Provides the complete coverage necessary for compliance with High Level Architecture (HLA) standards
Bernard P. Zeigler is a Professor of Electrical & Computer Engineering at the University of Arizona, where he heads the Artificial Intelligence Simulation Research Group. He is the author of numerous books and publications, and he is the Editor-in-Chief of the Transactions of the Society for Computer Simulation International.
2,569 citations
18 Jan 2000
Abstract: Part I: Basics. Introduction to Systems Modeling Concepts. Framework for Modeling and Simulation. Modeling Formalisms and Their Simulators. Introduction to Discrete Event System Specifications (DEVS). Hierarchy of System Specifications. Part II: Modeling Formalisms and Simulation Algorithms. Basic Formalisms: DEVS, DTSS, DESS. Basic Formalisms: Coupled Multicomponent Systems. Simulators for Basic Formalisms. Multiformalism Modeling and Simulation. DEVS-Based Extended Formalisms. Parallel and Distributed Discrete Event Simulation. Part III: System Morphisms: Abstraction, Representation, Approximation. Hierarchy of System Morphisms. Abstraction: Constructing Model Families. Verification, Validation, Approximate Morphisms: Living with Error. DEVS and DEVS-like Systems: Universality and Uniqueness. DEVS Representation of Systems. Part IV: System Design and Modeling and Simulation Environments. DEVS-Based Design Methodology. System Entity Structure/Model Base Framework. Collaboration and the Future.
1,169 citations
01 Mar 1984
727 citations
Cited by
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, handwriting recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules.
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
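The mail-filtering scenario described above can be made concrete with a small sketch. This is purely illustrative (the article describes the idea, not this code): a naive Bayes classifier with Laplace smoothing learns keep/reject rules from a handful of hypothetical labeled messages.

```python
# Illustrative sketch (not from the article): learning a keep/reject
# mail filter from examples via naive Bayes word counts.
from collections import Counter
import math

def train(examples):
    """examples: list of (text, label) pairs, label 'keep' or 'reject'."""
    counts = {"keep": Counter(), "reject": Counter()}
    totals = Counter()
    for text, label in examples:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    n = sum(totals.values())
    vocab = set(counts["keep"]) | set(counts["reject"])
    best, best_lp = None, -math.inf
    for label in counts:
        lp = math.log(totals[label] / n)  # log of the class prior
        denom = sum(counts[label].values()) + len(vocab)
        for w in text.lower().split():
            # Laplace smoothing: unseen words get a pseudo-count of 1
            lp += math.log((counts[label][w] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

examples = [
    ("cheap pills buy now", "reject"),
    ("win money now", "reject"),
    ("meeting agenda attached", "keep"),
    ("lunch tomorrow with the team", "keep"),
]
counts, totals = train(examples)
print(classify("buy cheap pills", counts, totals))  # prints 'reject'
```

In a deployed filter, each message the user keeps or rejects becomes a fresh training example, so the learned rules track the user's preferences without anyone hand-writing them.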
13,246 citations
••
TL;DR: This paper studies the control of a class of discrete event processes: processes that are discrete, asynchronous, and possibly nondeterministic. The controlled process is described as the generator of a formal language, while the supervisor is constructed from the grammar of a specified target language that incorporates the desired closed-loop system behavior; the existence problem for a supervisor is reduced to finding the largest controllable language contained in a given legal language.
Abstract: This paper studies the control of a class of discrete event processes, i.e., processes that are discrete, asynchronous, and possibly nondeterministic. The controlled process is described as the generator of a formal language, while the controller, or supervisor, is constructed from the grammar of a specified target language that incorporates the desired closed-loop system behavior. The existence problem for a supervisor is reduced to finding the largest controllable language contained in a given legal language. Two examples are provided.
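The reduction described above can be illustrated with a state-based sketch. The following toy computation (an assumption-laden simplification of supervisor synthesis, not code from the paper) finds the largest set of plant states that avoids forbidden states and cannot be forced out of itself by uncontrollable events, which is the state-level analogue of the largest controllable language:

```python
# Minimal illustrative sketch of the fixed point behind supervisor
# synthesis. The toy plant below is hypothetical.

def safe_states(states, trans, bad, uncontrollable):
    """trans: dict mapping (state, event) -> next state. Returns the
    largest set of states that avoids `bad` and is closed under
    uncontrollable events."""
    safe = set(states) - set(bad)
    changed = True
    while changed:
        changed = False
        for (s, e), t in trans.items():
            # The supervisor cannot disable an uncontrollable event, so a
            # state leading via one outside `safe` must be removed as well.
            if s in safe and e in uncontrollable and t not in safe:
                safe.discard(s)
                changed = True
    return safe

# Toy plant: a machine that can start a job (controllable) and then
# either finish or break down (both uncontrollable).
states = {"idle", "working", "down"}
trans = {
    ("idle", "start"): "working",
    ("working", "finish"): "idle",
    ("working", "break"): "down",
}
uncontrollable = {"finish", "break"}

print(sorted(safe_states(states, trans, {"down"}, uncontrollable)))
# -> ['idle']
```

A supervisor then disables any controllable event that would leave the safe set; here "working" is unsafe because "break" cannot be prevented once a job starts, so the (trivially conservative) supervisor disables "start" from "idle".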
3,432 citations
TL;DR: This article considers the empirical data and then reviews the main approaches to modeling pedestrian and vehicle traffic, including microscopic (particle-based), mesoscopic (gas-kinetic), and macroscopic (fluid-dynamic) models.
Abstract: Since the subject of traffic dynamics has captured the interest of physicists, many surprising effects have been revealed and explained. Some of the questions now understood are the following: Why are vehicles sometimes stopped by ``phantom traffic jams'' even though drivers all like to drive fast? What are the mechanisms behind stop-and-go traffic? Why are there several different kinds of congestion, and how are they related? Why do most traffic jams occur considerably before the road capacity is reached? Can a temporary reduction in the volume of traffic cause a lasting traffic jam? Under which conditions can speed limits speed up traffic? Why do pedestrians moving in opposite directions normally organize into lanes, while similar systems ``freeze by heating''? All of these questions have been answered by applying and extending methods from statistical physics and nonlinear dynamics to self-driven many-particle systems. This article considers the empirical data and then reviews the main approaches to modeling pedestrian and vehicle traffic. These include microscopic (particle-based), mesoscopic (gas-kinetic), and macroscopic (fluid-dynamic) models. Attention is also paid to the formulation of a micro-macro link, to aspects of universality, and to other unifying concepts, such as a general modeling framework for self-driven many-particle systems, including spin systems. While the primary focus is upon vehicle and pedestrian traffic, applications to biological or socio-economic systems such as bacterial colonies, flocks of birds, panics, and stock market dynamics are touched upon as well.
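As a hedged illustration of the microscopic (particle-based) model class the review surveys, here is a minimal sketch of an optimal-velocity car-following model on a ring road; the parameters and setup are assumptions chosen for the example, not values from the article.

```python
# Minimal sketch of a microscopic car-following model (optimal-velocity
# type) on a ring road. All parameters are illustrative.
import math

def step(pos, vel, road_len, dt=0.1, sens=1.0, v_max=2.0, d0=2.0):
    """One Euler step: each car relaxes toward a headway-dependent
    optimal velocity, then positions advance (modulo the ring)."""
    n = len(pos)
    new_vel = []
    for i in range(n):
        headway = (pos[(i + 1) % n] - pos[i]) % road_len
        # Optimal velocity rises toward v_max once the gap exceeds d0
        v_opt = v_max * (math.tanh(headway - d0) + math.tanh(d0)) / (1 + math.tanh(d0))
        new_vel.append(vel[i] + sens * (v_opt - vel[i]) * dt)
    new_pos = [(p + v * dt) % road_len for p, v in zip(pos, new_vel)]
    return new_pos, new_vel

# Ten cars on a ring, one slightly displaced to perturb the uniform flow.
n, road_len = 10, 30.0
pos = [i * road_len / n for i in range(n)]
pos[0] += 0.5
vel = [0.0] * n
for _ in range(500):
    pos, vel = step(pos, vel, road_len)
print([round(v, 2) for v in vel])
```

In larger versions of such simulations, with more cars and suitable parameters, the uniform-flow solution can lose stability and perturbations grow into the stop-and-go waves the review discusses; the sketch above only shows the model mechanics.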
3,117 citations
TL;DR: A proposed standard protocol for describing IBMs and ABMs, developed and tested by 28 modellers covering a wide range of fields within ecology, and considered a first step toward establishing a more detailed common format for the description of IBMs and ABMs.
2,633 citations