
Showing papers on "Stochastic process published in 1994"


Book
01 Jan 1994

2,169 citations


Journal ArticleDOI
TL;DR: In this paper, a model combining a multivariate linear model for the underlying response with a logistic regression model was proposed for continuous longitudinal data with non-ignorable or informative dropout.
Abstract: A model is proposed for continuous longitudinal data with non-ignorable or informative drop-out (ID). The model combines a multivariate linear model for the underlying response with a logistic regression model for the drop-out process. The latter incorporates dependence of the probability of drop-out on unobserved, or missing, observations. Parameters in the model are estimated by using maximum likelihood (ML) and inferences drawn through conventional likelihood procedures. In particular, likelihood ratio tests can be used to assess the informativeness of the drop-out process through comparison of the full model with reduced models corresponding to random drop-out (RD) and completely random processes.

1,034 citations


Journal ArticleDOI
TL;DR: A well-defined crossover is found between a Lévy and a Gaussian regime, and the crossover carries information about the relevant parameters of the underlying stochastic process.
Abstract: We introduce a class of stochastic processes, the truncated Lévy flight (TLF), in which the arbitrarily large steps of a Lévy flight are eliminated. We find that the convergence of the sum of n independent TLFs to a Gaussian process can require a remarkably large value of n, typically n ≈ 10^4, in contrast to n ≈ 10 for common distributions. We find a well-defined crossover between a Lévy and a Gaussian regime, and that the crossover carries information about the relevant parameters of the underlying stochastic process.
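The slow convergence can be illustrated with a toy simulation (a sketch, not the paper's construction: here the truncation is a hard cutoff on Pareto-tailed steps, and the tail index, cutoff, and sample sizes are all illustrative choices):

```python
import random

def tlf_step(alpha=1.5, cutoff=10.0, rng=random):
    """One truncated-Levy-flight step: Pareto(alpha)-tailed magnitude,
    resampled whenever it exceeds the cutoff, with a random sign."""
    while True:
        u = 1.0 - rng.random()          # u in (0, 1]
        step = u ** (-1.0 / alpha)      # heavy-tailed magnitude >= 1
        if step <= cutoff:
            return step if rng.random() < 0.5 else -step

def excess_kurtosis(samples):
    """Zero for a Gaussian; positive for heavy-tailed distributions."""
    n = len(samples)
    m = sum(samples) / n
    var = sum((x - m) ** 2 for x in samples) / n
    m4 = sum((x - m) ** 4 for x in samples) / n
    return m4 / var ** 2 - 3.0

random.seed(0)
k_small = excess_kurtosis([sum(tlf_step() for _ in range(2)) for _ in range(5000)])
k_large = excess_kurtosis([sum(tlf_step() for _ in range(500)) for _ in range(5000)])
print(k_small, k_large)   # kurtosis of the sum decays toward the Gaussian value 0
```

With a larger cutoff the heavy tail survives longer and the required n grows, which is the crossover the paper quantifies.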

799 citations


Journal ArticleDOI
TL;DR: The theory of linear mean square estimation for complex signals exhibits some connections with circularity, and it is shown that without this assumption, the estimation theory must be reformulated.
Abstract: Circularity is an assumption that was originally introduced for the definition of the probability distribution function of complex normal vectors. However, this concept can be extended in various ways for nonnormal vectors. The first purpose of the paper is to introduce and compare some possible definitions of circularity. From these definitions, it is also possible to introduce the concept of circular signals and to study whether or not the spectral representation of stationary signals introduces circular components. Therefore, the relationships between circularity and stationarity are analyzed in detail. Finally, the theory of linear mean square estimation for complex signals exhibits some connections with circularity, and it is shown that without this assumption, the estimation theory must be reformulated.
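The role of the pseudo-covariance E[z^2], which vanishes for circular signals while the ordinary covariance E[|z|^2] does not, can be checked on synthetic data (our illustration, not the paper's examples):

```python
import random

rng = random.Random(11)
n = 200000
# Circular complex noise: i.i.d. real and imaginary parts.
circ = [complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(n)]
# A non-circular signal: real and imaginary parts fully correlated.
noncirc = [complex(z.real, z.real) for z in circ]

power = lambda zs: sum(abs(z) ** 2 for z in zs) / len(zs)   # covariance E[|z|^2]
pseudo = lambda zs: sum(z * z for z in zs) / len(zs)        # pseudo-covariance E[z^2]

print(power(circ), abs(pseudo(circ)))        # ~2 and ~0: circular
print(power(noncirc), abs(pseudo(noncirc)))  # ~2 and ~2: circularity fails
```

Linear estimators that ignore the nonzero pseudo-covariance of the second signal discard usable second-order information, which is why the estimation theory must be reformulated in the non-circular case.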

541 citations


Journal ArticleDOI
TL;DR: This paper describes a method for statistical testing based on a Markov chain model of software usage that allows test input sequences to be generated from multiple probability distributions, making it more general than many existing techniques.
Abstract: Statistical testing of software establishes a basis for statistical inference about a software system's expected field quality. This paper describes a method for statistical testing based on a Markov chain model of software usage. The significance of the Markov chain is twofold. First, it allows test input sequences to be generated from multiple probability distributions, making it more general than many existing techniques. Analytical results associated with Markov chains facilitate informative analysis of the sequences before they are generated, indicating how the test is likely to unfold. Second, the test input sequences generated from the chain and applied to the software are themselves a stochastic model and are used to create a second Markov chain to encapsulate the history of the test, including any observed failure information. The influence of the failures is assessed through analytical computations on this chain. We also derive a stopping criterion for the testing process based on a comparison of the sequence generating properties of the two chains.
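A usage-model Markov chain of this kind can be sketched as a labeled transition table from which statistically typical test sequences are drawn; the states, input labels, and probabilities below are hypothetical, not from the paper:

```python
import random

# Hypothetical usage model for a small menu-driven application; each arc is
# (input label, next state, probability of that input in the current state).
usage_chain = {
    "Start":  [("login", "Menu", 1.0)],
    "Menu":   [("browse", "List", 0.6), ("quit", "Exit", 0.4)],
    "List":   [("select", "Detail", 0.7), ("back", "Menu", 0.3)],
    "Detail": [("back", "List", 1.0)],
}

def generate_test_case(chain, start="Start", end="Exit", rng=random):
    """One random walk through the usage chain = one statistically typical
    test input sequence, drawn from the usage distribution."""
    state, inputs = start, []
    while state != end:
        r, acc = rng.random(), 0.0
        for label, nxt, p in chain[state]:
            acc += p
            if r < acc:
                inputs.append(label)
                state = nxt
                break
    return inputs

random.seed(1)
case = generate_test_case(usage_chain)
print(case)
```

The paper's second chain would then be built from such generated sequences plus any observed failures, so that test history and reliability inferences live in the same formalism.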

433 citations


Journal ArticleDOI
TL;DR: A new realization of stochastic resonance is described, applicable to a broad class of systems, based on an underlying excitable dynamics with deterministic reinjection; the theory is compared with analog simulations and with experimental data from stimulated sensory neurons in the crayfish.
Abstract: We describe a new realization of stochastic resonance, applicable to a broad class of systems, based on an underlying excitable dynamics with deterministic reinjection. A simple but general theory of such ``single-trigger'' systems is compared with analog simulations of the Fitzhugh-Nagumo model, as well as experimental data obtained from stimulated sensory neurons in the crayfish.
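The "single-trigger" idea can be sketched with a leaky integrator standing in for the FitzHugh-Nagumo excitable dynamics (all parameters are illustrative): a subthreshold periodic drive alone never fires, while added noise produces threshold crossings, each followed by deterministic reinjection (a reset):

```python
import math, random

def lif_spike_count(noise_sigma, T=2000.0, dt=0.05, seed=2):
    """Leaky integrator driven by a subthreshold sinusoid plus white noise.
    Crossing the threshold counts as a firing, followed by deterministic
    reinjection (reset to rest): a toy 'single-trigger' excitable system."""
    rng = random.Random(seed)
    v, spikes, threshold = 0.0, 0, 1.0
    for i in range(int(T / dt)):
        drive = 0.8 * math.sin(0.1 * i * dt)       # subthreshold on its own
        v += dt * (-v + drive) + noise_sigma * math.sqrt(dt) * rng.gauss(0, 1)
        if v >= threshold:
            spikes += 1
            v = 0.0                                # deterministic reinjection
    return spikes

print(lif_spike_count(0.0), lif_spike_count(0.3))
```

As in stochastic resonance generally, the firing is noise-assisted: with no noise there are no events at all, while moderate noise lets the weak periodic drive express itself in the firing times.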

383 citations


Book
01 Jul 1994
TL;DR: Shows how the functional laws of small numbers, von Mises conditions, and the peaks over threshold method can be applied in extreme value theory, covering both the IID case and non-IID observations, with extreme value analysis carried out using the XTREMES software.
Abstract: Part 1, The IID case - functional laws of small numbers: functional laws of small numbers (bounds for the functional laws of small numbers, applications); extreme value theory (von Mises conditions, the peaks over threshold method, initial estimation of the class index); estimation of conditional curves (Poisson process approach, the nonparametric case, the semiparametric case, extension to several points, a nearest neighbour alternative, optimal accuracy of estimators); multivariate maxima (limiting distributions, representations and dependence functions, max-stable stochastic processes); multivariate extremes (strong approximation of exceedances, further concepts of extremes); extreme value analysis with XTREMES (artificial data, real grouped data, real continuous data). Part 2, Non-IID observations: introduction to the non-IID case (definitions; stationary, independent, and nonstationary random sequences); extremes of random sequences (introduction and general theory; stationary, independent, and nonstationary sequences); extremes of Gaussian processes (stationary and nonstationary Gaussian processes, empirical characteristic functions); extensions for rare events (rare events of random sequences, the point process of exceedances, application to peaks over threshold, application to rare events, triangular arrays of rare events, multivariate extremes of non-IID sequences); statistics of extremes (application to ecological data, frost data). Appendix: user's guide to XTREMES (getting started, preparations, becoming acquainted with XTREMES, hot keys, window options and help menus); XTREMES in action (implemented distributions, generating and reading data, nonparametric estimation, estimation in the alpha-mode, estimation in the beta-mode, simulations, important facilities and advice).

370 citations


Journal ArticleDOI
TL;DR: The main point of the analysis is a formulation in terms of coupled Langevin equations, which allows in a natural way for the inclusion of external force fields; the resulting long-time behavior is found to be independent of the presence of weak quenched disorder.
Abstract: We consider the combined effects of a power law Lévy step distribution characterized by the step index f and a power law waiting time distribution characterized by the time index g on the long time behavior of a random walker. The main point of our analysis is a formulation in terms of coupled Langevin equations which allows in a natural way for the inclusion of external force fields. In the anomalous case, for f < 2 and g < 1, the dynamic exponent z locks onto the ratio f/g. Drawing on recent results on Lévy flights in the presence of a random force field we also find that this result is independent of the presence of weak quenched disorder. For d below the critical dimension d_c = 2f - 2 the disorder is relevant, corresponding to a nontrivial fixed point for the force correlation function.

280 citations


Journal ArticleDOI
J.P. Bonnet, D. R. Cole, J. Delville, Mark Glauser, Lawrence Ukeiley
TL;DR: In this article, the root mean square (RMS) velocities are computed from the estimated and original velocity fields and comparisons are made, in order to quantitatively assess the technique, and the results show that the complementary technique, which combines LSE and POD, allows one to obtain time dependent information from the POD while reducing the amount of instantaneous data required.
Abstract: The Proper Orthogonal Decomposition (POD) as introduced by Lumley and the Linear Stochastic Estimation (LSE) as introduced by Adrian are used to identify structure in the axisymmetric jet shear layer and the 2-D mixing layer. In this paper we will briefly discuss the application of each method, then focus on a novel technique which employs the strengths of each. This complementary technique consists of projecting the estimated velocity field obtained from application of LSE onto the POD eigenfunctions to obtain estimated random coefficients. These estimated random coefficients are then used in conjunction with the POD eigenfunctions to reconstruct the estimated random velocity field. A qualitative comparison between the first POD mode representation of the estimated random velocity field and that obtained utilizing the original measured field indicates that the two are remarkably similar, in both flows. In order to quantitatively assess the technique, the root mean square (RMS) velocities are computed from the estimated and original velocity fields and comparisons are made. In both flows the RMS velocities captured using the first POD mode of the estimated field are very close to those obtained from the first POD mode of the unestimated original field. These results show that the complementary technique, which combines LSE and POD, allows one to obtain time dependent information from the POD while greatly reducing the amount of instantaneous data required. Hence, it may not be necessary to measure the instantaneous velocity field at all points in space simultaneously to obtain the phase of the structures, but only at a few select spatial positions. Moreover, this type of an approach can possibly be used to verify or check low dimensional dynamical systems models for the POD coefficients (for the first POD mode) which are currently being developed for both of these flows.

273 citations


Journal ArticleDOI
TL;DR: In this paper, current work on statistical estimation of the extremal index is reviewed and an optimality criterion based on a bias-variance trade-off is proposed; the extremal index is a parameter in the interval [0, 1] that measures the degree of clustering of extremes in a process.
Abstract: The extremal index is an important parameter measuring the degree of clustering of extremes in a process. The extremal index, a parameter in the interval [0, 1], is the reciprocal of the mean cluster size. Apart from being of interest in its own right, it is a crucial parameter for determining the limiting distribution of extreme values from the process. In this paper we review current work on statistical estimation of the extremal index and consider an optimality criterion based on a bias-variance trade-off. Theoretical results are developed for a simple doubly stochastic process, and it is argued that the main formula obtained is valid for a much wider class of processes. The practical implications are examined through simulations and a real data example.
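A minimal sketch of extremal-index estimation: the runs estimator applied to a synthetic max-autoregressive series whose true extremal index is known to be theta (here 0.5); the threshold, run length, and seed are illustrative choices, not the paper's optimality criterion:

```python
import math, random

def runs_estimator(x, u, r=5):
    """Runs estimator: exceedances of u separated by more than r
    non-exceedances start a new cluster; the estimate is
    (# clusters) / (# exceedances) = 1 / (mean cluster size)."""
    exceed = [i for i, v in enumerate(x) if v > u]
    clusters = 1 + sum(1 for a, b in zip(exceed, exceed[1:]) if b - a > r)
    return clusters / len(exceed)

# Max-AR(1) with unit-Frechet innovations: X_t = max((1-theta) X_{t-1}, theta Z_t)
# is stationary with extremal index theta, i.e. mean cluster size 1/theta.
rng = random.Random(3)
theta, x = 0.5, [1.0]
for _ in range(200000):
    z = -1.0 / math.log(max(rng.random(), 1e-12))   # unit Frechet variate
    x.append(max((1 - theta) * x[-1], theta * z))

u = sorted(x)[int(0.99 * len(x))]   # 99th-percentile threshold
theta_hat = runs_estimator(x, u)
print(round(theta_hat, 2))
```

The bias-variance trade-off the paper studies shows up directly here: raising the threshold u reduces bias but leaves fewer exceedances, inflating the variance of the estimate.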

269 citations


Book
20 Jun 1994
TL;DR: In this article, the authors propose a deterministic LQ regulation based on Riccati-based solution via Polynomial Equations (PE) to solve the problem of linear systems.
Abstract: 1. Introduction. I. BASIC DETERMINISTIC THEORY OF LQ AND PREDICTIVE CONTROL. 2. Deterministic LQ Regulation - I: Riccati-Based Solution. 3. I/O Descriptions and Feedback Systems. 4. Deterministic LQ Regulation - II: Solution via Polynomial Equations. 5. Deterministic Receding Horizon Control. II. STATE ESTIMATION, SYSTEM IDENTIFICATION, LQ AND PREDICTIVE STOCHASTIC CONTROL. 6. Recursive State Filtering and System Identification. 7. LQ and Predictive Stochastic Control. III. ADAPTIVE CONTROL. 8. Single-Step-Ahead Self-Tuning Control. 9. Adaptive Predictive Control. APPENDICES. A. Some Results from Linear Systems Theory. B. Some Results of Polynomial Matrix Theory. C. Some Results on Linear Diophantine Equations. D. Probability Theory and Stochastic Processes. References. Some Often Used Abbreviations. Index.

Journal ArticleDOI
TL;DR: In this article, a detailed examination of the relationship between stochastic Lagrangian models and second-moment closures is performed, in terms of the second-order tensor that defines a stochastically Lagrangians model.
Abstract: A detailed examination is performed of the relationship between stochastic Lagrangian models—used in PDF methods—and second‐moment closures. To every stochastic Lagrangian model there is a unique corresponding second‐moment closure. In terms of the second‐order tensor that defines a stochastic Lagrangian model, corresponding models are obtained for the pressure‐rate‐of‐strain and the triple‐velocity correlations (that appear in the Reynolds‐stress equation), and for the pressure‐scrambling term in the scalar flux equation. There is an advantage in obtaining second‐moment closures via this route, because the resulting models automatically guarantee realizability. Some new stochastic Lagrangian models are presented that correspond (either exactly or approximately) to popular Reynolds‐stress models.

Journal ArticleDOI
TL;DR: In this article, the problem of finding the optimal sequence of starting and stopping times of a multi-activity production process, given the costs of opening, running, and closing the activities and assuming that the economic system is a stochastic process, is formulated as an extended impulse control problem and solved using stochochastic calculus.
Abstract: This paper considers the problem of finding the optimal sequence of opening (starting) and closing (stopping) times of a multi- activity production process, given the costs of opening, running, and closing the activities and assuming that the state of the economic system is a stochastic process. The problem is formulated as an extended impulse control problem and solved using stochastic calculus. As an application, the optimal starting and stopping strategy are explicitly found for a resource extraction when the price of the resource is following a geometric Brownian motion.
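The price dynamics assumed in the application are easy to simulate; the sketch below generates a geometric Brownian motion path and applies a naive fixed-trigger run/idle rule as a stand-in for the paper's impulse-control solution, which, unlike this rule, accounts for opening and closing costs (all parameter values are arbitrary):

```python
import math, random

def gbm_path(s0, mu, sigma, dt, n, rng):
    """Geometric Brownian motion sampled exactly at the grid points:
    S_{t+dt} = S_t * exp((mu - sigma^2/2) dt + sigma sqrt(dt) N(0,1))."""
    s, path = s0, [s0]
    for _ in range(n):
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0, 1))
        path.append(s)
    return path

# Naive fixed-trigger rule: extract only while the price is above 1.1.
rng = random.Random(4)
path = gbm_path(s0=1.0, mu=0.05, sigma=0.2, dt=1 / 52, n=520, rng=rng)
weeks_running = sum(1 for p in path if p > 1.1)
print(weeks_running, "of", len(path), "weeks above the trigger")
```

With positive switching costs a single trigger is suboptimal: the optimal policy opens at a higher price than it closes, creating a hysteresis band, which is what the impulse-control formulation captures.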

Journal ArticleDOI
TL;DR: In this article, a detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out, highlighting important similarities and differences between the data and the random cascade theory.
Abstract: Under the theory of independent and identically distributed random cascades, the probability distribution of the cascade generator determines the spatial and the ensemble properties of spatial rainfall. Three sets of radar-derived rainfall data in space and time are analyzed to estimate the probability distribution of the generator. A detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out. This comparison highlights important similarities and differences between the data and the random cascade theory. Differences are quantified and measured for the three datasets. Evidence is presented to show that the scaling properties of the rainfall can be captured to the first order by a random cascade with a single parameter. The dependence of this parameter on forcing by the large-scale meteorological conditions, as measured by the large-scale spatial average rain rate, is investigated for these three datasets.
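A one-parameter random cascade of the kind invoked here can be sketched as the "beta model": at each subdivision, mass survives with probability p and is rescaled by 1/p so the generator has unit mean (the parameter value and the 1-D geometry are illustrative simplifications of the 2-D rainfall cascades in the paper):

```python
import random

def cascade_1d(levels, p, rng):
    """1-D 'beta model' cascade: each interval splits in two; each half keeps
    mass * W, where W = 1/p with probability p and 0 otherwise (E[W] = 1)."""
    field = [1.0]
    for _ in range(levels):
        nxt = []
        for mass in field:
            for _ in range(2):
                w = (1.0 / p) if rng.random() < p else 0.0
                nxt.append(mass * w)
        field = nxt
    return field

rng = random.Random(5)
field = cascade_1d(12, 0.7, rng)
wet = sum(1 for f in field if f > 0)
print(len(field), "cells,", wet, "wet")
```

The single parameter p controls how strongly mass concentrates as the cascade proceeds, the kind of first-order moment-scaling behavior the paper compares against radar data.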

Book
01 Jan 1994
TL;DR: In this article, the authors focus on a class of processes with continuous sample paths that possess the Markov property and present an exposition based on the theory of stochastic analysis.
Abstract: Focusing on one of the major branches of probability theory, this book treats the large class of processes with continuous sample paths that possess the Markov property. The exposition is based on the theory of stochastic analysis, which uses such notions as stochastic differentials and stochastic integrals.

Journal ArticleDOI
TL;DR: Statistical properties of mobile-to-mobile land communication channels have been developed, including the level-crossing rate and duration of fades of the envelope, the probability distribution of random FM, and the expected number of crossings of the random phase and random FM of the channel.
Abstract: Statistical properties of mobile-to-mobile land communication channels have been developed. In particular, the level-crossing rate and duration of fades of the envelope, the probability distribution of random FM, the expected number of crossings of the random phase and random FM of the channel, and the power spectrum of random FM of the channel have been considered.

Journal ArticleDOI
TL;DR: In this paper, the random field is represented by a series of orthogonal functions, and is incorporated directly in the finite-element formulation and first-order reliability analysis, and its relationship with the Karhunen-Loeve expansion used in recent stochastic finite element studies is examined.
Abstract: A new approach for first‐order reliability analysis of structures with material parameters modeled as random fields is presented. The random field is represented by a series of orthogonal functions, and is incorporated directly in the finite‐element formulation and first‐order reliability analysis. This method avoids the difficulty of selecting a suitable mesh for discretizing the random field. A general continuous orthogonal series expansion of the random field is derived, and its relationship with the Karhunen‐Loeve expansion used in recent stochastic finite‐element studies is examined. The method is illustrated for a fixed‐end beam with bending rigidity modeled as a random field. A set of Legendre polynomials is used as the orthogonal base to represent the random field. Two types of correlation models are considered. The Karhunen‐Loeve expansion leads to a lower truncation error than does the Legendre expansion for a given number of terms, but one or two additional terms in the Legendre expansion yield...
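The Legendre-series idea can be sketched directly: generate the polynomials by their three-term recurrence, verify orthogonality on [-1, 1] numerically, and draw one realization of a random field as a truncated series with independent coefficients (the decaying coefficient variances are an illustrative choice, not the paper's calibrated expansion):

```python
import random

def legendre(n, x):
    """Legendre polynomial P_n(x) via the three-term recurrence."""
    p0, p1 = 1.0, x
    if n == 0:
        return p0
    for k in range(1, n):
        p0, p1 = p1, ((2 * k + 1) * x * p1 - k * p0) / (k + 1)
    return p1

def inner(m, n, steps=20000):
    """Midpoint-rule approximation of the integral of P_m * P_n over [-1, 1]."""
    h = 2.0 / steps
    return sum(legendre(m, -1 + (i + 0.5) * h) * legendre(n, -1 + (i + 0.5) * h) * h
               for i in range(steps))

# One realization of a random field: truncated orthogonal series with
# independent Gaussian coefficients of decaying variance.
rng = random.Random(6)
coeffs = [rng.gauss(0, 1.0 / (k + 1)) for k in range(6)]
field_at = lambda x: sum(c * legendre(k, x) for k, c in enumerate(coeffs))
print(round(inner(2, 3), 6), round(field_at(0.25), 4))
```

In the paper the coefficient statistics are chosen so that the truncated series reproduces a target correlation model of the material property; the Karhunen-Loeve expansion is the choice that minimizes truncation error for a given number of terms.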

Journal ArticleDOI
TL;DR: In this article, the authors considered the problem of price risk in the presence of an uncertain "random" market and proposed a simple and powerful formalism which allows them to generalize the analysis to a large class of stochastic processes, such as ARCH, jump or Levy processes.
Abstract: The ability to price risks and devise optimal investment strategies in the presence of an uncertain "random" market is the cornerstone of modern finance theory. We first consider the simplest such problem of a so-called "European call option", initially solved by Black and Scholes using Ito stochastic calculus for markets modelled by a log-Brownian stochastic process. A simple and powerful formalism is presented which allows us to generalize the analysis to a large class of stochastic processes, such as ARCH, jump or Levy processes. We also address the case of correlated Gaussian processes, which is shown to be a good description of three different market indices (MATIF, CAC40, FTSE100). Our main result is the introduction of the concept of an optimal strategy in the sense of (functional) minimization of the risk with respect to the portfolio. If the risk may be made to vanish for particular continuous uncorrelated 'quasi-Gaussian' stochastic processes (including the Black and Scholes model), this is no longer the case for more general stochastic processes. The value of the residual risk is obtained and suggests the concept of risk-corrected option prices. In the presence of very large deviations such as in Levy processes, new criteria for rational fixing of the option prices are discussed. We also apply our method to other types of options, 'Asian', 'American', and discuss new possibilities ('double-decker', ...). The inclusion of transaction costs leads to the appearance of a natural characteristic trading time scale.
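The zero-risk benchmark mentioned, the Black-Scholes limit for a log-Brownian market, can be checked numerically: a risk-neutral Monte Carlo average of the call payoff should reproduce the closed-form price (parameter values are arbitrary):

```python
import math, random

def bs_call(s, k, t, r, sigma):
    """Closed-form Black-Scholes price of a European call."""
    d1 = (math.log(s / k) + (r + sigma ** 2 / 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    ncdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return s * ncdf(d1) - k * math.exp(-r * t) * ncdf(d2)

def mc_call(s, k, t, r, sigma, n=200000, seed=7):
    """Risk-neutral Monte Carlo: average the discounted payoff over GBM endpoints."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        st = s * math.exp((r - sigma ** 2 / 2) * t
                          + sigma * math.sqrt(t) * rng.gauss(0, 1))
        total += max(st - k, 0.0)
    return math.exp(-r * t) * total / n

bs_val = bs_call(100, 100, 1.0, 0.05, 0.2)
mc_val = mc_call(100, 100, 1.0, 0.05, 0.2)
print(round(bs_val, 3), round(mc_val, 3))
```

For the more general processes the paper treats (ARCH, jumps, Levy flights) no such zero-risk price exists, and the residual-risk minimization over hedging strategies takes the place of this closed form.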

Journal ArticleDOI
TL;DR: Recursive identification algorithms based on the nonlinear Wiener model are presented; Gauss-Newton and stochastic gradient algorithms are suggested in the single-input/single-output case, and for FIR (finite impulse response) models it is proved that global convergence of the schemes is tied to sector conditions on the static nonlinearity.
Abstract: Recursive identification algorithms, based on the nonlinear Wiener model, are presented. A recursive identification algorithm is first derived from a general parameterization of the Wiener model, using a stochastic approximation framework. Local and global convergence of this algorithm can be tied to the stability properties of an associated differential equation. Since inversion is not utilized, noninvertible static nonlinearities can be handled, which allows a treatment of, for example, saturating sensors and blind adaptation problems. Gauss-Newton and stochastic gradient algorithms for the situation where the static nonlinearity is known are then suggested in the single-input/single-output case. The proposed methods can outperform conventional linearizing inversion of the nonlinearity when measurement disturbances affect the output signal. For FIR (finite impulse response) models, it is also proved that global convergence of the schemes is tied to sector conditions on the static nonlinearity. In particular, global convergence of the stochastic gradient method is obtained, provided that the nonlinearity is strictly monotone. The local analysis, performed for IIR (infinite impulse response) models, illustrates the importance of the amplitude contents of the exciting signals.

Posted Content
TL;DR: A survey of recent developments in the rapidly expanding field of asymptotic distribution theory, with a special emphasis on the problems of time dependence and heterogeneity is given in this paper.
Abstract: This is a survey of the recent developments in the rapidly expanding field of asymptotic distribution theory, with a special emphasis on the problems of time dependence and heterogeneity. The book is designed to be useful on two levels. First, as a textbook and reference work, giving definitions of the relevant mathematical concepts, statements and proofs of the important results from the probability literature, and numerous examples; and second, as an account of recent work in the field of particular interest to econometricians, including a number of important new results. It is virtually self-contained, with all but the most basic technical prerequisites being explained in their context; mathematical topics include measure theory, integration, metric spaces, and topology, with applications to random variables, and an extended treatment of conditional probability. Other subjects treated include: stochastic processes, mixing processes, martingales, mixingales, and near-epoch dependence; the weak and strong laws of large numbers; weak convergence; and central limit theorems for nonstationary and dependent processes. The functional central limit theorem and its ramifications are covered in detail, including an account of the theoretical underpinnings (the weak convergence of measures on metric spaces), Brownian motion, the multivariate invariance principle, and convergence to stochastic integrals. This material is of special relevance to the theory of cointegration.

Book ChapterDOI
20 Jun 1994
TL;DR: Builds on Superposed Stochastic Automata, a class of Stochastic Petri Nets (SPN) whose solution can be efficiently computed since it never requires the construction of the complete Markov chain of the underlying Markovian process.
Abstract: In a previous paper we have defined Superposed Stochastic Automata (SSA) [13], a class of Stochastic Petri Nets (SPN) whose solution can be efficiently computed since it never requires the construction of the complete Markov chain of the underlying Markovian process. The efficient solution of SSA is based on a method proposed by Plateau in [23] for the analysis of stochastic processes generated by the composition of stochastic automata. Efficient analysis is there achieved (both in terms of space and time) with a technique based on Kronecker (tensor) algebra for matrices.

Journal ArticleDOI
TL;DR: A computer program founded upon several fast, robust numerical procedures based on a number of statistical-estimation methods is presented, and it is found that the least-square minimization method provided better quality fits in general, compared to the other two approaches.
Abstract: Construction operations are subject to a wide variety of fluctuations and interruptions. Varying weather conditions, learning development on repetitive operations, equipment breakdowns, management interference, and other external factors may impact the production process in construction. As a result of such interferences, the behavior of construction processes becomes subject to random variations. This necessitates modeling construction operations as random processes during simulation. Random processes in simulation include activity and processing times, arrival processes (e.g. weather patterns) and disruptions. In the context of construction simulation studies, modeling a random input process is usually performed by selecting and fitting a sufficiently flexible probability distribution to that process based on sample data. To fit a generalized beta distribution in this context, a computer program founded upon several fast, robust numerical procedures based on a number of statistical-estimation methods is presented. In particular, the following methods were derived and implemented: moment matching, maximum likelihood, and least-square minimization. It was found that the least-square minimization method provided better quality fits in general, compared to the other two approaches. The adopted fitting procedures have been implemented in BetaFit, an interactive, microcomputer-based software package, which is in the public domain. The operation of BetaFit is discussed, and some applications of this package to the simulation of construction projects are presented.
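Of the three implemented methods, moment matching is the simplest to sketch; for a standard beta on [0, 1] (BetaFit's generalized beta adds location and scale parameters) the first two sample moments determine the two shape parameters:

```python
import random

def fit_beta_moments(samples):
    """Moment matching for Beta(a, b) on [0, 1]: invert
    E[X] = a/(a+b) and Var[X] = ab / ((a+b)^2 (a+b+1))."""
    n = len(samples)
    m = sum(samples) / n
    v = sum((x - m) ** 2 for x in samples) / (n - 1)
    common = m * (1.0 - m) / v - 1.0
    return m * common, (1.0 - m) * common

rng = random.Random(8)
data = [rng.betavariate(2.0, 5.0) for _ in range(50000)]
a, b = fit_beta_moments(data)
print(round(a, 2), round(b, 2))   # should recover roughly (2, 5)
```

Moment matching is fast but can be badly biased for small or skewed samples, which is consistent with the paper's finding that least-square minimization gave better-quality fits overall.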

Journal ArticleDOI
TL;DR: In this paper, two random sparse array geometries and a sparse array with a Mills cross receive pattern were simulated and compared to a fully sampled aperture with the same overall dimensions.

Journal ArticleDOI
TL;DR: In this paper, the authors consider systems of particles hopping stochastically on d-dimensional lattices with space-dependent probabilities and derive duality relations, expressing the time evolution of a given initial configuration in terms of correlation functions of simpler dual processes.
Abstract: We consider systems of particles hopping stochastically on d-dimensional lattices with space-dependent probabilities. We map the master equation onto an evolution equation in a Fock space where the dynamics are given by a quantum Hamiltonian (continuous time) or a transfer matrix (discrete time). Using non-Abelian symmetries of these operators we derive duality relations, expressing the time evolution of a given initial configuration in terms of correlation functions of simpler dual processes. Particularly simple results are obtained for the time evolution of the density profile. As a special case we show that for any SU(2) symmetric system the two-point and three-point density correlation functions in the N-particle steady state can be computed from the probability distribution of a single particle moving in the same environment. We apply our results to various models, among them partial exclusion, a simple diffusion-reaction system, and the two-dimensional six-vertex model with space-dependent vertex weights. For a random distribution of the vertex weights one obtains a version of the random-barrier model describing diffusion of particles in disordered media. We derive exact expressions for the averaged two-point density correlation functions in the presence of weak, correlated disorder.

Book
30 Oct 1994
TL;DR: In this article, the authors apply the random vibration theory to the analysis and design of a wide range of structural and mechanical systems and operating environments, including the modeling and simulation of random processes, fatigue in random vibration, design in a random vibration environment and the response of aerospace structures to atmospheric turbulence.
Abstract: This publication covers applications of the random vibration theory to the analysis and design of a wide range of structural and mechanical systems and operating environments. These include the modelling and simulation of random processes, fatigue in random vibration, design in a random vibration environment and the response of aerospace structures to atmospheric turbulence. Also covered is the response of structures to earthquakes, wind and ocean waves, and statistical energy analysis. The authors concentrate on engineering and design aspects, including the use of approximations to develop practical design problems. They demonstrate that a wide range of problems is amenable to a unified treatment for earthquakes, wind and ocean waves.

Journal ArticleDOI
TL;DR: In this paper, the authors consider the problem of identifying the probability distributions of real-valued random variables based on some functions of them, with applications to stochastic processes, econometric models, and models for reliability and survival analysis.
Abstract: Introduction. Identifiability of Probability Distributions of Real-Valued Random Variables Based on Some Functions of Them. Identifiability of Probability Measures on Abstract Spaces. Identifiability for Some Types of Stochastic Processes. Generalized Convolutions. Identifiability in Some Econometric Models. Identifiability in Some Models for Reliability and Survival Analysis. Identifiability for Mixtures of Distributions. Chapter References. Index.

Journal ArticleDOI
TL;DR: In this paper, a Z2 noise is introduced for the stochastic estimation of matrix inversion, and its superiority over other noises, including the Gaussian noise, is discussed; the method is applied to the calculation of quark loops in lattice quantum chromodynamics.

Journal ArticleDOI
TL;DR: A unified numerical solution framework for stochastic Petri nets in which transition firing is immediate, exponentially distributed, or generally distributed is presented.

Journal ArticleDOI
01 Feb 1994
TL;DR: In this paper, a variety of constructions is presented to free engineering models from "white or normal" limitations embodied in many current simulations, including nonstationary, nonnormal signal and interference waveforms, nonhomogeneous random scenes, nonhomogeneous volumetric clutter realizations, and snapshots of randomly evolving, nonhomogeneous scenes.
Abstract: This paper reviews how to construct sets of random numbers with particular amplitude distributions and correlations among values. These constructions support both high-fidelity Monte Carlo simulation and analytic design studies. A variety of constructions is presented to free engineering models from "white or normal" limitations embodied in many current simulations. The methods support constructions of conventional stationary and normally distributed processes, nonstationary, nonnormal signal and interference waveforms, nonhomogeneous random scenes, nonhomogeneous volumetric clutter realizations, and snapshots of randomly evolving, nonhomogeneous scenes. Each case will have specified amplitude statistics, e.g., normal, log-normal, uniform, Weibull, or discrete amplitude statistics; and selected correlation, e.g., white, pink, or patchy statistics, clouds, or speckles. Sets of random numbers with correlation, nonstationarities, various tails for the amplitude distributions, and multimodal distributions can be constructed. The paper emphasizes aspects of probability theory necessary to engineering modeling.
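One standard recipe of this kind, imposing correlation in the Gaussian domain and then mapping through a non-normal marginal (a Gaussian copula), can be sketched as follows; the marginal, correlation level, and sample size are arbitrary choices, not the paper's cases:

```python
import math, random

def correlated_pair(rho, rng):
    """Two standard normals with correlation rho (2x2 Cholesky factor)."""
    z1, z2 = rng.gauss(0, 1), rng.gauss(0, 1)
    return z1, rho * z1 + math.sqrt(1 - rho ** 2) * z2

def corr(u, v):
    """Sample Pearson correlation."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    su = math.sqrt(sum((x - mu) ** 2 for x in u) / n)
    sv = math.sqrt(sum((y - mv) ** 2 for y in v) / n)
    return sum((x - mu) * (y - mv) for x, y in zip(u, v)) / (n * su * sv)

# Correlate in the normal domain, then push through a log-normal marginal.
rng = random.Random(9)
xs, ys = [], []
for _ in range(100000):
    a, b = correlated_pair(0.8, rng)
    xs.append(math.exp(0.25 * a))
    ys.append(math.exp(0.25 * b))

rho_hat = corr(xs, ys)
print(round(rho_hat, 3))   # close to, but slightly below, the normal-domain 0.8
```

The slight shrinkage of the output correlation relative to the normal-domain value is a known feature of such transform methods, and constructions that target a specified output correlation must compensate for it.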

Journal ArticleDOI
TL;DR: This paper studies the use of gradient estimators in stochastic approximation algorithms to perform so-called "single-run optimizations" of steady-state systems, and obtains some properties of the derivative estimators that could be of independent interest.
Abstract: Approaches like finite differences with common random numbers, infinitesimal perturbation analysis, and the likelihood ratio method have drawn a great deal of attention recently as ways of estimating the gradient of a performance measure with respect to continuous parameters in a dynamic stochastic system. In this paper, we study the use of such estimators in stochastic approximation algorithms, to perform so-called "single-run optimizations" of steady-state systems. Under mild conditions, for an objective function that involves the mean system time in a GI/G/1 queue, we prove that many variants of these algorithms converge to the minimizer. In most cases, however, the simulation length must be increased from iteration to iteration, otherwise the algorithm may converge to the wrong value. One exception is a particular implementation of infinitesimal perturbation analysis, for which the single-run optimization converges to the optimum even with a fixed and small number of ends of service per iteration. As a by-product of our convergence proofs, we obtain some properties of the derivative estimators that could be of independent interest. Our analysis exploits the regenerative structure of the system, but our derivative estimation and optimization algorithms do not always take advantage of that regenerative structure. In a companion paper, we report numerical experiments with an M/M/1 queue, which illustrate the basic convergence properties and possible pitfalls of the various techniques.
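The core recursion of such stochastic approximation schemes can be sketched on a toy quadratic objective observed through noisy gradients (a generic Robbins-Monro illustration, not the paper's GI/G/1 setting; step sizes, noise level, and seed are arbitrary):

```python
import random

def robbins_monro(grad_est, theta0, n_iter=5000, a=1.0, seed=10):
    """Stochastic approximation: theta_{k+1} = theta_k - (a/k) * (noisy gradient).
    The decreasing a/k step sizes average out the estimation noise."""
    rng = random.Random(seed)
    theta = theta0
    for k in range(1, n_iter + 1):
        theta -= (a / k) * grad_est(theta, rng)
    return theta

# Toy objective f(theta) = (theta - 3)^2, observed only through noisy gradients.
noisy_grad = lambda th, rng: 2.0 * (th - 3.0) + rng.gauss(0, 1)

theta_hat = robbins_monro(noisy_grad, theta0=0.0)
print(round(theta_hat, 2))
```

In the paper's setting the noisy gradient comes from a simulation run of the queue, and the bias of that estimate, unlike the unbiased noise here, is why the simulation length generally has to grow across iterations.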