
Showing papers by "French Institute for Research in Computer Science and Automation" published in 1982


Journal ArticleDOI
TL;DR: This work shows how to prove (and disprove) theorems in the initial algebra of an equational variety by a simple extension of the Knuth-Bendix completion algorithm, with applications to proofs of programs computing over data structures and to algebraic summation identities.
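The completion-based proof technique rests on ordinary equational rewriting: reduce both sides of a conjectured identity to normal form under a confluent rule set. The sketch below is a minimal, illustrative Python rewriter for Peano addition; the term encoding and the two rules are my own illustration, not the paper's completion procedure.

```python
def rewrite_step(t):
    """One root rewrite with the rules  add(0,y) -> y  and
    add(s(x),y) -> s(add(x,y)); returns (term, changed)."""
    if t[0] == 'add':
        a, b = t[1], t[2]
        if a == ('0',):
            return b, True
        if a[0] == 's':
            return ('s', ('add', a[1], b)), True
    return t, False

def normalize(t):
    """Innermost normalization: normalize subterms first, then the root."""
    if len(t) > 1:
        t = (t[0],) + tuple(normalize(s) for s in t[1:])
    new, changed = rewrite_step(t)
    return normalize(new) if changed else new

def num(n):
    """Peano numeral: num(2) == ('s', ('s', ('0',)))."""
    return ('0',) if n == 0 else ('s', num(n - 1))
```

Since the two rules are confluent and terminating, equality of normal forms decides equality in the initial algebra, e.g. `normalize(('add', num(2), num(1)))` reduces to `num(3)`.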

284 citations


Journal ArticleDOI
TL;DR: The average height of a binary tree with n internal nodes is shown to be asymptotic to 2√(πn).
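That asymptotic estimate is easy to probe numerically. The sketch below samples uniformly random binary trees with Rémy's grafting algorithm and compares the empirical mean height with 2√(πn); this is a Monte Carlo illustration only, not the analytic method of the paper.

```python
import math
import random

def random_binary_tree(n, rng):
    """Rémy-style grafting: a uniformly random binary tree with n internal
    nodes.  Encoding: leaf = None, internal node = [left, right].
    `slots` holds one (parent_list, index) handle per node of the tree,
    so picking a uniform slot picks a uniform node to graft above."""
    root = [None]          # one-cell holder; the tree lives at root[0]
    slots = [(root, 0)]    # initially a single leaf
    for _ in range(n):
        parent, i = rng.choice(slots)
        old = parent[i]
        # new internal node; the old subtree goes left or right at random
        node = [old, None] if rng.random() < 0.5 else [None, old]
        parent[i] = node
        slots.append((node, 0))   # handles for the two new child positions
        slots.append((node, 1))
    return root[0]

def height(t):
    """Number of edges on the longest root-to-leaf path."""
    if t is None:
        return 0
    return 1 + max(height(t[0]), height(t[1]))
```

For n = 400 the asymptote predicts about 2√(400π) ≈ 70.9, and sample means over a few hundred trees land in that vicinity.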

277 citations


Journal ArticleDOI
TL;DR: In this article, the gradient field of the objective function on the constraint manifold is defined intrinsically and descent methods along geodesics, including the gradient projection and reduced gradient methods for special choices of coordinate systems, are analyzed.
Abstract: To generalize the descent methods of unconstrained optimization to the constrained case, we define intrinsically the gradient field of the objective function on the constraint manifold and analyze descent methods along geodesics, including the gradient projection and reduced gradient methods for special choices of coordinate systems. In particular, we generalize the quasi-Newton methods and establish their superlinear convergence; we show that they only require the updating of a reduced-size matrix. In practice, the geodesic search is approximated by a tangent step followed by a constraint restoration, or by a simple arc search again followed by a restoration step.

269 citations


Journal ArticleDOI
TL;DR: A criterion is given for measuring the tracking capability of recursive algorithms when applied to slowly time-varying systems; the optimal gain for a given disturbance is also calculated.
Abstract: A criterion is given for measuring the tracking capability of recursive algorithms when applied to slowly time-varying systems; the optimal gain for a given disturbance is also calculated. This criterion is seen to have some connection with the Fisher information matrix, and allows us to select a priori the best algorithm for identifying a given unknown parameter which may be subject to smooth unknown disturbances. Examples of applications are given in the areas of identification theory and data communication theory.

122 citations



Journal ArticleDOI
TL;DR: In this paper, a numerical simulation of the displacement of oil by water in a vertical porous slab is studied, where the water saturation is governed by a quasilinear diffusion-convection equation.

82 citations


Journal ArticleDOI
TL;DR: Several classical combinatorial quantities (factorials, Bell numbers, tangent numbers, among others) are shown to form eventually periodic sequences modulo any integer; this property is related to the existence of continued fraction expansions for the corresponding ordinary (and divergent) generating functions.
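The eventual periodicity is easy to observe computationally. Below is a small illustrative Python sketch (mine, not from the paper): compute Bell numbers modulo m via the Bell triangle and detect the period of the tail of the sequence; modulo 2 the period is 3, modulo 3 it is 13.

```python
def bell_mod(n_terms, m):
    """First n_terms Bell numbers reduced mod m, via the Bell triangle:
    each row starts with the last entry of the previous row, and each
    further entry is the sum of its left neighbour and the entry above."""
    vals, row = [], [1 % m]
    for _ in range(n_terms):
        vals.append(row[0])
        nxt = [row[-1]]
        for x in row:
            nxt.append((nxt[-1] + x) % m)
        row = nxt
    return vals

def tail_period(seq):
    """Smallest p making the second half of seq p-periodic (the sequences
    in question are only *eventually* periodic, so ignore the prefix)."""
    half = len(seq) // 2
    for p in range(1, half):
        if all(seq[i] == seq[i + p] for i in range(half, len(seq) - p)):
            return p
    return None
```

The same `tail_period` check applies to factorials (which become 0 mod m from some point on, period 1) and to tangent numbers.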

62 citations


Journal ArticleDOI
TL;DR: In this article, the Von Karman equations were approximated by the mixed finite element scheme of Miyoshi, and the solution arcs in a neighbourhood of the first eigenvalue of the linearized problem were followed by a continuation method.
Abstract: The purpose of this paper is to study the approximation of the Von Karman equations by the mixed finite element scheme of Miyoshi and to follow the solution arcs in a neighbourhood of the first eigenvalue of the linearized problem. This last problem is solved by a continuation method.

53 citations


Book ChapterDOI
12 Jul 1982
TL;DR: The treatment relies on the saddle point method of complex analysis which is used here for extracting coefficients of a probability generating function, and on a particular technique that reveals periodic fluctuations in the behaviour of algorithms which are precisely quantified.
Abstract: We obtain average value and distribution estimates for the height of a class of trees that occurs in various contexts in computer algorithms: in trie searching, as an index in several dynamic schemes, and as an underlying partition structure in polynomial factorization algorithms. In particular, results given here completely solve the problem of analyzing Extendible Hashing, for which practical conclusions are given. The treatment relies on the saddle point method of complex analysis, which is used here for extracting coefficients of a probability generating function, and on a particular technique that reveals periodic fluctuations in the behaviour of algorithms, which are precisely quantified.

40 citations


Journal ArticleDOI
TL;DR: This paper provides a specification that ensures convergence even in cases where the curvature scalar tends to zero or infinity, and shows that the convergence is superlinear.
Abstract: This paper studies an algorithm for minimizing a convex function based upon a combination of polyhedral and quadratic approximation. The method was given earlier, but without a good specification for updating the algorithm's curvature matrix. Here, for the case of one-dimensional minimization, we provide a specification that ensures convergence even in cases where the curvature scalar tends to zero or infinity. Under mild additional assumptions, we show that the convergence is superlinear.

36 citations


Journal ArticleDOI
TL;DR: In this article, various classes of two-dimensional finite impulse response (FIR) filters that can be implemented with such techniques are studied and design procedures are presented for each of the classes.
Abstract: Sequentially convolving images with small-size operators is a promising idea for performing image filtering. Various classes of two-dimensional finite impulse response (FIR) filters that can be implemented with such techniques are studied. Design procedures are presented for each of the classes. They are based upon minimizing L2 and L∞ criteria. Results are assessed in an image processing context.
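The benefit of sequential small convolutions is easiest to see for separable kernels: a rank-one 2-D kernel can be applied as a row pass followed by a column pass. The sketch below (kernel and image are illustrative choices of mine; pure Python for clarity) checks that the two-pass scheme matches direct 2-D convolution.

```python
def conv2d(img, ker):
    """'Valid'-mode 2-D correlation of an image with a kernel, both given
    as lists of lists (pure Python, for illustration only)."""
    H, W = len(img), len(img[0])
    h, w = len(ker), len(ker[0])
    return [[sum(ker[i][j] * img[r + i][c + j]
                 for i in range(h) for j in range(w))
             for c in range(W - w + 1)]
            for r in range(H - h + 1)]

# A separable (rank-one) kernel: outer product of a column and a row vector.
col, row = [1, 2, 1], [1, 0, -1]        # smoothing column x difference row
ker = [[c * r for r in row] for c in col]
```

Applying `[row]` (a 1×3 kernel) and then `[[c] for c in col]` (a 3×1 kernel) gives exactly the same output as the full 3×3 kernel, with (h + w) instead of h·w multiplications per output pixel.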

Journal ArticleDOI
TL;DR: A method is proposed which determines the possible ways to grasp an object, defined by its silhouette, and applies, in particular, to all the automatic prehension problems when the object and/or its orientation are unknown.
Abstract: A method is proposed which determines the possible ways to grasp an object, defined by its silhouette. This method is quite general and versatile: the geometry of the object is arbitrary and a large class of grippers is allowed; no a priori information is needed, except the parameters of the gripper, and so the method applies, in particular, to all the automatic prehension problems when the object and/or its orientation are unknown. Suitable segmentation and parametrization of the silhouette yield an explicit solution and a very fast and simple algorithm.

Proceedings ArticleDOI
01 May 1982
TL;DR: It is shown how the techniques of Digital Image Analysis can be used efficiently to help interpret seismic cross-sections, with new techniques invented to cope with the characteristics of seismic signals.
Abstract: We show how the techniques of Digital Image Analysis can be efficiently used to help in the interpretation of seismic cross-sections. The problem of automatically finding homogeneous facies is in fact directly tied to Image Processing problems such as edge detection, texture analysis, segmentation and classification. We briefly overview some existing tools, outline some new techniques which had to be invented to cope with the characteristics of seismic signals and demonstrate on many examples the power of our approach.

Journal ArticleDOI
TL;DR: Some decomposition of the parsing of the sentences of context-free grammars into sequences of independent sub-tasks is proposed, for which this decomposition provides an efficient parser for a multiprocessing environment.
Abstract: Some decomposition of the parsing of the sentences of context-free grammars into sequences of independent sub-tasks is proposed. An example grammar is presented, for which this decomposition provides an efficient parser for a multiprocessing environment. The average speed-up resulting from the parallelization of the parser of an arithmetic infix grammar is evaluated by means of probabilistic models and real-world measurements.

Journal ArticleDOI
TL;DR: The approach developed here concerns the design of commands by the users themselves, and supports both evidence of a generation effect and the importance of structuring the commands.
Abstract: Computer commands have been created from natural language words for a broad range of naive and occasional users. These command languages have been investigated for their ease-of-use according to different perspectives. The approach developed here concerns the design of commands by the users themselves. A number of methodological problems are highlighted. The experimental simulation that was run supports both evidence of a generation effect and the importance of structuring the commands.

Proceedings ArticleDOI
29 Mar 1982
TL;DR: This work presents a model-independent treatment of some important data base problems defined by a set of data base updates rather than by the usual view definition mapping.
Abstract: We present a model-independent treatment of some important data base problems. Our approach uses the data base update as elementary concept. The basic tool is the data base view defined by a set of data base updates rather than by the usual view definition mapping.

Patent
17 Dec 1982
TL;DR: In this article, a system is presented for the management of the physical memory of a processor which utilizes a base register which is loaded, for each virtual address of the memory, with the base address of a descriptive register corresponding to a task to be performed by the processor.
Abstract: A system for the management of the physical memory of a processor which utilizes a base register which is loaded, for each virtual address of the memory, with the base address of a descriptive register corresponding to a task to be performed by the processor. This system utilizes a descriptive-register table and an adder receiving the binary value of the base address of the first descriptive register and the binary value of the index corresponding to the first register. The outputs of the adder address one of the inputs of the descriptive-register table, thus selecting a segment descriptive register corresponding to the task to be performed. Each of the descriptive registers of the table contains control bits sent to the processor which make it possible for the processor to check whether, for the segment to which the processor must have access, the processor must operate in the local or overall mode and whether the processor must process an input-output operation or an access to the memory.

Proceedings ArticleDOI
18 May 1982
TL;DR: MESSIDOR is an interactive retrieval system that allows the simultaneous search of several bibliographic databases; the user converses with the system in a single language, the MESSIDOR language.
Abstract: MESSIDOR is an interactive retrieval system. It differs from current systems in that it allows the simultaneous search of several bibliographic databases. The databases may be on different sites such as ESA in Frascati, TELESYSTEMES in Sophia Antipolis... The databases may use different query languages such as QUEST, MISTRAL... However, a user converses with MESSIDOR in a single language, the MESSIDOR language. The system's prototype is implemented on a MICRAL 80-30 micro-computer.

Book ChapterDOI
01 Jan 1982
TL;DR: Numerical methods for the simulation and the optimal control of free-boundary problems are presented in the case of the two-phase Stefan problem and these methods are illustrated for the continuous casting process.
Abstract: Numerical methods for the simulation and the optimal control of free-boundary problems are presented in the case of the two-phase Stefan problem. These methods are illustrated for the continuous casting process, more precisely the optimization of the water spray system.

Proceedings ArticleDOI
01 May 1982
TL;DR: A way is shown to avoid the tedious computation of the density and its maxima in the (very common) case where the locations of the maxima are roughly known a priori.
Abstract: The Hough transform is a means to detect, from the edge points of an image, the presence or absence of some parametric shapes whose analytic formulation is already known, and to determine the value of the parameter vector. The principle of the method is to establish a good transform from the points in the image plane to a curve in the parameter space, then to compute an experimental density in the parameter space, and finally to find the values of the parameters where the experimental density is maximum. In this paper, we show a way to avoid the tedious computation of the density and its maxima in the (very common) case where the locations of the maxima are roughly known a priori. Our method is based on a linearization of the equations of the contours, with respect to the parameters, around each a priori parameter estimate. This linearization makes it possible to transform the problem, for each edge point, into a linear statistical estimator combined with a hypothesis-testing operator. This method can be used with high profit when a sequence of images is to be treated in which the contours vary slowly. Also, as pointed out by BALLARD, it is not necessary that the contours be simple parametric curves such as straight lines, circles, ellipses, etc.; the same results can be applied to contours whose shape is known but which move, with rotation, translation and scaling.
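For contrast with the linearized estimator proposed here, the classical accumulator-based Hough transform that the paper seeks to avoid can be sketched in a few lines (a minimal line-detection illustration; the discretization parameters are arbitrary choices of mine, not the paper's).

```python
import math

def hough_lines(points, rho_max, n_theta=180):
    """Classical Hough accumulator for lines rho = x*cos(t) + y*sin(t),
    with t sampled in [0, pi) and rho rounded to integer bins.
    Returns the (theta_index, rho) bin receiving the most votes."""
    acc = {}
    for x, y in points:
        for ti in range(n_theta):
            t = math.pi * ti / n_theta
            rho = round(x * math.cos(t) + y * math.sin(t))
            if -rho_max <= rho <= rho_max:
                acc[(ti, rho)] = acc.get((ti, rho), 0) + 1
    return max(acc, key=acc.get)
```

Edge points on the vertical line x = 5, for example, concentrate their votes at θ = 0, ρ = 5; the cost the paper avoids is precisely this exhaustive vote-and-scan over the whole parameter grid.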

Book ChapterDOI
01 Jan 1982
TL;DR: This paper studies the situation where the impulse cost is small, so that the dimension of the system is reduced when the impulse cost is neglected; a continuity result proves that this introduces only a small error.
Abstract: It is possible to optimize such systems by dynamic programming methods. Unfortunately this kind of method can be applied only to systems of small dimension. Perturbation techniques are a way to decrease the dimensionality of the system. In this paper we study the situation where the impulse cost is small, and the dimension of the system is reduced when we neglect the impulse cost. We give a continuity result which proves that in this way we introduce only a small error.

Book ChapterDOI
06 Apr 1982
TL;DR: This paper presents the compiler-producing system Perluette, a system based upon a formal semantics of programming languages that makes it possible to specify and prove their implementations as representations of one algebraic data type into another.
Abstract: Real compilers are usually ad hoc programs. They are costly to write and maintain, and too complex to be proved correct. This paper presents the compiler-producing system Perluette. This system is based upon a formal semantics of programming languages. Programming languages are considered to be algebraic data types. It then becomes possible to specify and prove their implementations as representations of one algebraic data type into another.

Proceedings ArticleDOI
02 Jun 1982
TL;DR: This paper presents and analyzes algorithms for computing semi-joins in a multiprocessor database machine, and shows that the method using semi-joins is generally better.
Abstract: Semi-join is a relational operator that decreases the cost of processing queries involving binary operations. This is accomplished by initially selecting the data relevant to answering the queries and thereby reducing the size of the operand relations. This paper presents and analyzes algorithms for computing semi-joins in a multiprocessor database machine. First, an architecture model of a multiprocessor system is described. The model incorporates I/O, CPU and message-transmission cost parameters to enable the evaluation of these algorithms in terms of their execution costs. Then two equi-semi-join algorithms are presented and compared, and one inequi-semi-join algorithm is proposed. The execution costs of these algorithms are generally linearly proportional to the size of the operand and result relations, and inversely proportional to the number of processors. Then, the method of joining two relations directly and the method of joining their semi-joins are compared. Finally, it is shown that the method using semi-joins is generally better.
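The cost-reduction idea behind semi-joins can be sketched quickly: R ⋉ S keeps only the tuples of R whose join attribute appears in S, so the subsequent join operates on a smaller operand. Below is a minimal Python illustration (the relation encoding and attribute names are mine, not the paper's machine model).

```python
def semi_join(r, s, r_key, s_key):
    """R semi-join S: the tuples of R whose join attribute appears in S.
    Relations are lists of dicts; r_key/s_key name the join attributes."""
    keys = {t[s_key] for t in s}
    return [t for t in r if t[r_key] in keys]

def join(r, s, r_key, s_key):
    """Plain hash equi-join, for comparison."""
    index = {}
    for t in s:
        index.setdefault(t[s_key], []).append(t)
    return [{**a, **b} for a in r for b in index.get(a[r_key], [])]
```

`join(semi_join(R, S), S)` produces the same result as `join(R, S)` while shipping and processing fewer R tuples, which is the property the paper's cost model quantifies.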

Proceedings ArticleDOI
01 May 1982
TL;DR: A new procedure for 2-D Separable Denominator Recursive (SDR) filter design is introduced, based upon the minimization of mean-square error criteria between impulse responses; it develops a new approach based upon a single-input multi-output 1-D filter approximation.
Abstract: A new procedure for 2-D Separable Denominator Recursive (SDR) filter design is introduced. It is based upon the minimization of mean-square error criteria between impulse responses. The algorithm is twofold. First, the finite impulse response of the prototype is approximated by a finite sum of separable filters using the Singular Value Decomposition Theorem as described in TREITEL & SHANKS (1). Second, the finite sum is approximated by an SDR filter. In this part we develop a new approach based upon a single-input multi-output 1-D filter approximation. In the last part of the paper, we present experimental results that compare our new algorithm to related previous ones.

Journal ArticleDOI
TL;DR: A variational approach to a continuous multi-echelon inventory problem allows us to determine the optimal instants and sizes of the orders to be requested by each installation.

Book ChapterDOI
01 Jan 1982
TL;DR: The Dynamic Programming Equation arising in the problem of stochastic control in discrete time over an infinite horizon is studied, and the structure of the set of solutions is dealt with.
Abstract: We study the Dynamic Programming Equation arising in the problem of stochastic control in discrete time over an infinite horizon. The data are unbounded, in which case the solution is not unique. Our results deal with the structure of the set of solutions.

Dissertation
08 Oct 1982
TL;DR: In this dissertation, the notion of a test set over a test context is defined, and a practical testing method for programs is presented.
Abstract: This work presents an original modelling of the notion of program testing, based on first-order equational logic. Several applications are proposed, notably concerning algebraic abstract data types and the automatic validation of specifications. Starting from an intuitive study of the notion of testing, we derive the notion of a testing process, founded on the coupling principle. We define the notion of a test set over a test context. Its mathematical properties are studied: reliability, validity, absence of bias, acceptability. Several partial orders are defined: absolute and asymptotic fineness. The resulting equivalences give rise to important theorems. A practical testing method is built from this theory and applied to a sorting program. This method is particularly well suited to validating an axiom of an algebraic abstract data type over an algebra. An example is presented, and the implementation of an experimental tool using this method is described. Numerous appendices are attached: a summary of previous work on the problem, a bibliography on program validation by testing, an introduction to first-order logic, and a partial listing of the implementation.

Journal ArticleDOI
TL;DR: HCS not only provides a significant compaction ratio but also renders processing operations such as map intersection or union easier and allows the use of variable scale.

Journal ArticleDOI
TL;DR: In this article, a controlled perturbed Markov chain of transition matrix is studied, where the state space and the control set are finite and the solution expansion is polynomial.

Proceedings ArticleDOI
01 May 1982
TL;DR: It is shown that one of the key problems in Image Understanding is the matching of two symbolic structures, a model and the result of a segmentation and a formalism is presented which can deal with the inexact or fuzzy matching of such structures in a highly parallel fashion.
Abstract: We show that one of the key problems in Image Understanding is the matching of two symbolic structures, a model and the result of a segmentation. These symbolic structures are conveniently represented as labeled graphs and we present on two examples a formalism which can deal with the inexact or fuzzy matching of such structures in a highly parallel fashion.