
Showing papers on "Network traffic simulation published in 1981"


Journal ArticleDOI
P. Pawlita1
TL;DR: The main parts focus on measurement objects, results, and implications of recent user traffic measurements in four conversational teleprocessing systems, mostly with medium-speed terminals like cathode ray tube (CRT) display terminals.
Abstract: Traffic statistics are a necessary base for network modeling, performance analysis, and network traffic control. This paper deals with the results of recent traffic measurements in teleprocessing systems and their implications. The introductory sections survey the range of traffic measurement problems in present networks and the published literature on measurement results. The main parts focus on the measurement objects, results, and implications of recent user traffic measurements in four conversational teleprocessing systems, mostly with medium-speed terminals such as cathode ray tube (CRT) display terminals. The first two moments, distributions, and regressions of random variables characterizing the structure of dialog cycles are presented. Important observed properties include extreme traffic burstiness, frequently high coefficients of variation, and characteristics of CRT terminals that differ from those of teletypewriters. It is observed that new measurement tasks are arising and improved measurement systems are needed.
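The coefficient of variation mentioned in the abstract is the standard deviation of a random variable (for example, inter-arrival or think times) divided by its mean; a value well above 1 is a common indicator of bursty traffic. A minimal sketch, using hypothetical sample values rather than the paper's data:

```python
import statistics

def coefficient_of_variation(samples):
    """CV = standard deviation / mean; CV > 1 suggests burstier-than-Poisson traffic."""
    return statistics.stdev(samples) / statistics.mean(samples)

# Hypothetical user think times (seconds) in a dialog cycle: mostly short,
# with occasional long pauses -- the bursty pattern the paper reports.
think_times = [1.2, 0.8, 1.5, 0.9, 42.0, 1.1, 0.7, 55.0, 1.3, 1.0]
print(f"CV = {coefficient_of_variation(think_times):.2f}")
```

For these sample values the CV is close to 2, well above the value of 1 that an exponential (Poisson-arrival) model would give.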

39 citations


Book
01 Jun 1981
Computer Simulation Applications: Discrete Event Simulation for Synthesis and Analysis of Complex Systems

23 citations



01 Jan 1981
TL;DR: A method of making long-range computer system workload predictions by identifying assumptions and considering the effect of a change on individual users, illustrated by an example involving message traffic in a large computer network is presented.
Abstract: A method of making long-range computer system workload predictions is presented. The method quantifies the effect of qualitative changes in computing by identifying assumptions and by considering the effect of a change on individual users. The method is illustrated by an example involving message traffic in a large computer network.
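The abstract's idea of quantifying a qualitative change by stating an explicit assumption about its effect on individual users, then aggregating over the user population, can be sketched as follows. The user classes, rates, and multiplier are illustrative assumptions, not figures from the paper:

```python
def predict_workload(users, effect):
    """Aggregate predicted message traffic after a qualitative change.

    users  : list of (user_count, messages_per_day) pairs, one per user class
    effect : assumed multiplier on per-user traffic, derived from an explicit,
             stated assumption about how the change alters user behaviour
    """
    return sum(count * rate * effect for count, rate in users)

# Hypothetical population: 200 CRT users at 50 messages/day, 50 teletype
# users at 20/day; assume the change raises per-user traffic by 30%.
baseline = [(200, 50), (50, 20)]
print(predict_workload(baseline, 1.0))   # current load: 11000
print(predict_workload(baseline, 1.3))   # predicted load under the assumption
```

Making the multiplier an explicit parameter mirrors the paper's emphasis on identifying assumptions: the prediction can be re-run as each assumption is revised.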

3 citations


Journal Article
TL;DR: This article discusses developments in traffic simulation techniques, their relationship with developments in the computer field, and a new traffic simulation system called TRAF.
Abstract: This article discusses developments in traffic simulation techniques, their relationship with developments in the computer field, and a new traffic simulation system called TRAF.

2 citations


Journal Article
TL;DR: A simple application of the TEXAS computer model by a traffic engineer in a small city is described and the effects of a change in one of the three parameters (traffic flow, intersection geometry, and intersection control) with only two runs are ascertained.
Abstract: This paper describes a simple application of the TEXAS computer model by a traffic engineer in a small city. (TEXAS is a microscopic model for simulation of traffic at a single intersection. It is currently available from the Texas State Department of Highways and Public Transportation.) TEXAS allows traffic engineers to evaluate changes in intersection parameters (traffic flow, intersection geometry, and intersection control) and to see what effect those changes have on the performance of the vehicles and the intersection. TEXAS comprises three separate computer programs: GEOPRO, DVPRO, and SIMPRO (see Figure 1). GEOPRO takes geometric information about the intersection system (approach lengths, number of lanes per approach, lane geometry and type, and location of any sight-distance restrictions) in Cartesian coordinates; it produces a list of possible paths along which vehicles will travel. This path information is used as input to SIMPRO. DVPRO also produces input for SIMPRO: this driver-vehicle processor takes volume and headway distribution information and creates a time-ordered list of vehicles. Three types of drivers and 16 classes of vehicles are used. SIMPRO takes these two inputs and a third, which contains the description of intersection control (from unsigned to signed to signalized) and the duration of simulation. Vehicles are "stepped through" the system, and speed and delay statistics are gathered for each time increment for each vehicle. At the end of the simulation run, the statistics are summarized for the total intersection, for each approach, and for each turn movement in each approach. During a typical time increment, each car examines the vehicle in front, the adjacent lane(s), and the traffic control at the intersection. Then it makes a deterministic decision whether to speed up, slow down, start, stop, or change lanes.
Because of the deterministic nature of the model, the traffic engineer is able to ascertain the effects of a change in one of the three parameters (traffic flow, intersection geometry, and intersection control) with only two runs: "before" and "after". The following is a description of how I used the model in just this way and was able to make comparisons between two runs. Richardson is a Dallas suburb with a population of 80 000. Its 53 traffic signals are located at arterial intersections on a suburban grid and are, for the most part, noninterconnected and fully actuated. When these signals were installed, multiphase, fully actuated operation was the state of the practice. At many of the locations left-turn phasing was provided, even though during the peak period only three to five vehicles made the left turns each cycle. It had been observed that those three left-turning vehicles were causing unnecessary delays to the opposing through movement. With the increased emphasis today on reducing overall delay and fuel consumption, about 10 locations were targeted for protected left-turn removal in one or both directions. On January 10, 1981, left-turn green arrows
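The per-increment decision SIMPRO makes for each vehicle (speed up, slow down, or hold, based on the gap ahead) can be sketched in a few lines. This is not TEXAS's actual logic; the thresholds and the simple braking rule are illustrative assumptions:

```python
def step(speed, gap, limit, accel=2.0, decel=3.0, dt=1.0):
    """One deterministic SIMPRO-style time increment for a single vehicle.

    speed : current speed (m/s)
    gap   : distance to the vehicle ahead or to the stop line (m)
    limit : desired/legal speed (m/s)
    Returns the vehicle's new speed. Parameters are illustrative, not TEXAS's own.
    """
    stopping_distance = speed * speed / (2 * decel)
    if gap <= stopping_distance:           # cannot stop in time: brake now
        return max(0.0, speed - decel * dt)
    if speed < limit:                      # road is clear: accelerate toward limit
        return min(limit, speed + accel * dt)
    return speed                           # at the limit: hold speed

# Hypothetical vehicle at 15 m/s with a 20 m gap: it needs 37.5 m to stop, so it brakes.
print(step(15.0, 20.0, 16.0))  # 12.0
```

Because every decision is a pure function of the current state, two runs that differ in exactly one input parameter isolate that parameter's effect, which is the "before"/"after" comparison the paper exploits.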

1 citation


Journal ArticleDOI
Annie Passeron1, André Spizzichino1
TL;DR: It is shown that, under some hypotheses, a network built around a transit centre may be decomposed into elementary triangular networks for which traffic routing and circuit routing are treated at the same time.
Abstract: This paper concerns the definition of general rules for planning and decision making in the digitalization of a long distance telecommunications network. It is shown that, under some hypotheses, a network built around a transit centre may be decomposed into elementary triangular networks for which traffic routing and circuit routing are treated at the same time. In the proposed method the evolution in time of the existing network is optimized by minimizing the discounted cost and taking into account modularity constraints. In particular, using simple examples, the dependence of the transit threshold on the immediate environment is studied. Finally we show how the model may be used for setting up a network configurator.
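Minimizing the discounted cost of a digitalization plan, as the abstract describes, amounts to comparing candidate investment schedules by their net present cost. A minimal sketch with hypothetical cash flows (the plans and discount rate are illustrative, not from the paper):

```python
def discounted_cost(yearly_costs, rate):
    """Net present cost of a plan: yearly costs discounted at the given rate."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(yearly_costs))

# Hypothetical plans for digitalizing one triangular sub-network:
# digitize now (large cost up front) vs. defer two years (costs arrive later).
now = [100.0, 10.0, 10.0, 10.0]
defer = [20.0, 20.0, 110.0, 10.0]
rate = 0.10
best = min((now, defer), key=lambda plan: discounted_cost(plan, rate))
print(f"now: {discounted_cost(now, rate):.1f}, defer: {discounted_cost(defer, rate):.1f}")
```

Modularity constraints (equipment only available in discrete sizes) would restrict which cash-flow schedules are feasible; the comparison over the feasible set is otherwise the same.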

Journal ArticleDOI
TL;DR: In this article, the authors examine appropriate methods of network structuring to protect against breakdowns due to cuts in transmission lines and failures in transit switching centres, such that a disturbance affects no more than 5% of the traffic and does not hinder priority traffic at all.
Abstract: After describing the Paris-area telephone network, the authors examine appropriate methods of network structuring to protect against breakdowns due to cuts in transmission lines and failures in transit switching centres, such that a disturbance affects no more than 5% of the traffic and does not hinder priority traffic at all. The setting-up of a last-choice network, based on a specialized transit centre, to deal with traffic overloads in normal conditions and with call routing in case of breakdown is recommended. The network design should provide means for directing communications towards the above-mentioned centre.