Showing papers by "French Institute for Research in Computer Science and Automation" published in 1993


Proceedings Article
31 Dec 1993
TL;DR: A comprehensive treatment of three-dimensional computer vision, from projective geometry and camera calibration to stereo, motion, and object recognition, with appendices on constrained optimization, results from algebraic geometry, and differential geometry.
Abstract: Projective geometry, modelling and calibrating cameras, edge detection, representing geometric primitives and their uncertainty, stereo vision, determining discrete motion from points and lines, tracking tokens over time, motion fields of curves, interpolating and approximating three-dimensional data, recognizing and locating objects and places, answers to problems. Appendices: constrained optimization, some results from algebraic geometry, differential geometry.

2,744 citations


Journal ArticleDOI
01 Dec 1993
TL;DR: This paper shows how simple and parallel techniques can be combined to compute reliable dense depth maps of complex real-world scenes; the algorithm relies on correlation followed by interpolation and performs very well on difficult images such as faces and cluttered ground-level scenes.
Abstract: To compute reliable dense depth maps, a stereo algorithm must preserve depth discontinuities and avoid gross errors. In this paper, we show how simple and parallel techniques can be combined to achieve this goal and deal with complex real-world scenes. Our algorithm relies on correlation followed by interpolation. During the correlation phase the two images play a symmetric role, and we use a validity criterion for the matches that eliminates gross errors: at places where the images cannot be correlated reliably, due to lack of texture or occlusions for example, the algorithm does not produce wrong matches but rather a very sparse disparity map, as opposed to the dense one obtained where the correlation is successful. To generate a dense depth map, the information is then propagated across the featureless areas, but not across discontinuities, by an interpolation scheme that takes image grey levels into account to preserve image features. We show that our algorithm performs very well on difficult images such as faces and cluttered ground level scenes. Because all the algorithms described here are parallel and very regular, they could be implemented in hardware and lead to extremely fast stereo systems.
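
The validity criterion lends itself to a compact sketch: match both ways and keep only disparities on which the two directions agree. Below is a minimal, illustrative version assuming rectified grey-level images; all names, the SSD score, and the window parameters are ours, not the paper's code.

```python
import numpy as np

def correlation_stereo(left, right, max_disp=32, win=5):
    """Correlation matching with a symmetric left/right validity check.
    Assumes rectified images (purely horizontal disparity). Sketch only."""
    h, w = left.shape
    half = win // 2

    def best_match(src, dst, sign):
        # For each pixel of src, pick the disparity minimising an SSD score.
        disp = np.zeros((h, w), dtype=int)
        for y in range(half, h - half):
            for x in range(half, w - half):
                patch = src[y-half:y+half+1, x-half:x+half+1].astype(float)
                best, best_d = np.inf, 0
                for d in range(max_disp):
                    xx = x + sign * d
                    if xx - half < 0 or xx + half >= w:
                        break
                    cand = dst[y-half:y+half+1, xx-half:xx+half+1]
                    score = np.sum((patch - cand) ** 2)
                    if score < best:
                        best, best_d = score, d
                disp[y, x] = best_d
        return disp

    d_lr = best_match(left, right, -1)   # left -> right
    d_rl = best_match(right, left, +1)   # right -> left

    # Validity criterion: keep a match only if both directions agree,
    # yielding a sparse map wherever correlation is unreliable.
    valid = np.zeros((h, w), dtype=bool)
    for y in range(h):
        for x in range(w):
            xx = x - d_lr[y, x]
            if 0 <= xx < w and abs(d_rl[y, xx] - d_lr[y, x]) <= 1:
                valid[y, x] = True
    return np.where(valid, d_lr, -1)     # -1 marks "no reliable match"
```

The interpolation phase described in the abstract would then fill the -1 pixels while respecting grey-level discontinuities.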

483 citations


Journal ArticleDOI
TL;DR: Estimates of Internet workload are consistent with the hypothesis of a mix of bulk traffic with larger packet size and interactive traffic with smaller packet size; a phenomenon of compression of the probe packets, similar to the acknowledgement compression recently observed in TCP, is also reported.
Abstract: We use the measured round trip delays of small UDP probe packets sent at regular time intervals to characterize the end-to-end packet delay and loss behavior in the Internet. By varying the interval between probe packets, it is possible to study the structure of the Internet load over different time scales. In this paper, the time scales of interest range from a few milliseconds to a few minutes. Our observations agree with results obtained by others using simulation and experimental approaches. For example, our estimates of Internet workload are consistent with the hypothesis of a mix of bulk traffic with larger packet size, and interactive traffic with smaller packet size. The interarrival time distribution for Internet packets is consistent with an exponential distribution. We also observe a phenomenon of compression (or clustering) of the probe packets similar to the acknowledgement compression phenomenon recently observed in TCP. Our results also show interesting and less expected behavior. For example, we find that the losses of probe packets are essentially random when the probe traffic uses a small fraction of the available bandwidth.
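
The measurement method is easy to reproduce in outline. A hedged sketch, assuming a UDP echo service at the target host (port 7); sequence numbers make lost probes visible as missing replies, and the interval can be varied to study different time scales:

```python
import socket, struct, time

def probe(host, port=7, count=100, interval=0.05, size=32):
    """Send small UDP probes at regular intervals and record round-trip
    delays. Illustrative sketch, not the paper's measurement tool."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(1.0)
    rtts = {}
    for seq in range(count):
        payload = struct.pack("!I", seq).ljust(size, b"\0")
        t0 = time.monotonic()
        sock.sendto(payload, (host, port))
        try:
            data, _ = sock.recvfrom(2048)
            rtts[struct.unpack("!I", data[:4])[0]] = time.monotonic() - t0
        except socket.timeout:
            pass                        # lost probe
        time.sleep(interval)            # vary to probe other time scales
    loss = 1 - len(rtts) / count
    return rtts, loss
```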

419 citations


Proceedings ArticleDOI
01 Sep 1993
TL;DR: An interactive texture tool, fast and easy to use, is constructed to manipulate atlases in texture space, and the tool's large set of interactive operations on mapping functions is presented.
Abstract: This paper describes a new approach to texture mapping. A global method to lower the distortion of the mapped image is presented; by considering a general optimization function we view the mapping as an energy-minimization process. We have constructed an interactive texture tool, which is fast and easy to use, to manipulate atlases in texture space. We present the tool’s large set of interactive operations on mapping functions. We also introduce an algorithm which automatically generates an atlas for any type of object. These techniques allow the mapping of different textures onto the same object and handle non-continuous mapping functions, needed for complicated mapped objects. CR Categories and subject descriptors: I.3.3 [Computer Graphics] Picture/Image Generation. I.3.7 [Computer Graphics] Graphics and Realism - Color, Shading and Texture. Additional Keywords: Texture Mapping, Texture Map Distortion, Realistic Rendering, Interaction.
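
The energy-minimization view can be made concrete with a toy distortion energy: treat each mesh edge as a spring in texture space whose rest length is the corresponding 3-D edge length, and descend the gradient. A sketch under our own simplifying assumptions (single chart, no boundary constraints); it is not the paper's general optimization function:

```python
import numpy as np

def relax_uv(uv, edges, rest_lengths, iters=200, step=0.1):
    """Minimise a simple spring-like distortion energy
        E = sum_(i,j) (|uv_i - uv_j| - L_ij)^2
    over texture coordinates uv (n x 2 float array), where L_ij is the
    3-D length of edge (i, j). Gradient-descent toy, for illustration."""
    for _ in range(iters):
        grad = np.zeros_like(uv)
        for (i, j), L in zip(edges, rest_lengths):
            d = uv[i] - uv[j]
            dist = np.linalg.norm(d) + 1e-12
            g = 2 * (dist - L) * d / dist
            grad[i] += g
            grad[j] -= g
        uv -= step * grad
    return uv
```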

377 citations


01 Jun 1993
TL;DR: This technical report first presents a brief summary of the research conducted towards the design of ergonomic criteria for the evaluation of human-computer interfaces (HCI), and then the full description of the most recent set of criteria (version 2.1), in both English and French.
Abstract: This technical report presents first a brief summary of the research conducted towards the design of ergonomic criteria for the evaluation of human-computer interfaces (HCI), and then the full description of the most recent set of criteria (version 2.1), in both English and French. The summary outlines the context in which the criteria were developed, the goal of the criteria approach, the experiments conducted, and the results obtained. The set of ergonomic criteria that resulted from this work consists of a list of 18 elementary criteria (including 9 main criteria). The criteria are presented along with their definitions, rationales, examples of guidelines, and comments setting out the distinctions between some of them.

364 citations


Journal ArticleDOI
TL;DR: A new scale-space based approach that combines useful properties of the Laplacian and Beaudet's measure (Beaudet 1978) is proposed in order to correct classical detectors and detect the exact corner position, and an extension of this approach is developed to solve the problem of trihedral vertex characterization and detection.
Abstract: Corners and vertexes are strong and useful features in computer vision for scene analysis, stereo matching, and motion analysis. Here, we deal with the development of a computational approach to these important features. We consider first a corner model and study analytically its behavior once it has been smoothed using the well-known Gaussian filter. This allows us to clarify the behavior of some well-known cornerness-measure based approaches used to detect these points of interest. Most of these classical approaches appear to detect points that do not correspond to the exact position of the corner. A new scale-space based approach that combines useful properties of the Laplacian and Beaudet's measure (Beaudet 1978) is then proposed in order to correct this and detect the exact corner position. An extension of this approach is then developed to solve the problem of trihedral vertex characterization and detection. In particular, it is shown that a trihedral vertex has two elliptic maxima on extremal contrast surfaces if the contrast is sufficient, and this allows us to classify trihedral vertexes into 2 classes: “vertex” and “vertex as corner.” The corner-detection approach developed is applied to accurately detect trihedral vertexes using an additional test in order to make a distinction between trihedral vertexes and corners. Many experiments have been carried out using noisy synthetic data and real images containing corners and vertexes. Most of the promising results obtained are used to illustrate the experimental section of this paper.
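
For reference, the two measures the approach combines are straightforward to compute with standard filters. A sketch (the paper's exact scale-space combination and vertex tests are not reproduced here):

```python
import numpy as np
from scipy import ndimage

def corner_measures(image, sigma=2.0):
    """Beaudet's measure (determinant of the Hessian) and the Laplacian
    of a Gaussian-smoothed image, the two quantities the paper combines.
    Illustrative sketch only."""
    smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
    Ix  = ndimage.sobel(smoothed, axis=1)
    Iy  = ndimage.sobel(smoothed, axis=0)
    Ixx = ndimage.sobel(Ix, axis=1)
    Iyy = ndimage.sobel(Iy, axis=0)
    Ixy = ndimage.sobel(Ix, axis=0)
    beaudet   = Ixx * Iyy - Ixy ** 2      # DET operator (Beaudet 1978)
    laplacian = Ixx + Iyy
    return beaudet, laplacian
```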

314 citations


Journal ArticleDOI
TL;DR: A set of stabilizing smooth time-varying feedbacks is derived, and simulation results are given.
Abstract: Many nonholonomic mechanical systems, such as common wheeled mobile robots, are controllable but cannot be stabilized to given positions and orientations by using smooth pure-state feedback control...

297 citations


Journal ArticleDOI
TL;DR: This article presents a motion-based segmentation method relying on 2-D affine motion models and a statistical regularization approach which ensures stable motion- based partitions and results obtained on several real-image sequences corresponding to complex outdoor situations are reported.
Abstract: This article deals with analysis of the dynamic content of a scene from an image sequence, irrespective of the static or dynamic nature of the camera. The tasks involved can be the detection of moving objects in a scene observed by a mobile camera, or the identification of the movements of some relevant components of the scene relative to the camera. This problem basically requires a motion-based segmentation step. We present a motion-based segmentation method relying on 2-D affine motion models and a statistical regularization approach which ensures stable motion-based partitions. This can be done without the explicit estimation of optic flow fields. Besides, these partitions are linked in time. Therefore, the motion interpretation process can be performed on more than two successive frames. The ability to follow a given coherently moving region within an interval of several images of the sequence makes the interpretation process more robust and more comprehensive. Identification of the kinematic components of the scene is induced from an intermediate layer accomplishing a generic qualitative motion labeling. No 3-D measurements are required. Results obtained on several real-image sequences corresponding to complex outdoor situations are reported.
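
For concreteness, a 2-D affine motion model has six parameters per region. The sketch below fits such a model to point displacements by least squares; note that the paper estimates its models without explicit optic flow fields, so this only illustrates the model family, not the paper's estimation scheme:

```python
import numpy as np

def fit_affine_motion(points, displacements):
    """Least-squares fit of a 2-D affine motion model
        u(x, y) = a1 + a2*x + a3*y
        v(x, y) = a4 + a5*x + a6*y
    to observed point displacements (both n x 2 arrays). Sketch only."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([np.ones_like(x), x, y])
    a_u, *_ = np.linalg.lstsq(A, displacements[:, 0], rcond=None)
    a_v, *_ = np.linalg.lstsq(A, displacements[:, 1], rcond=None)
    return a_u, a_v  # (a1, a2, a3), (a4, a5, a6)
```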

258 citations


Journal ArticleDOI
TL;DR: This work presents a simple and unified technique to establish convergence of various minimization methods, as well as implementable forms such as bundle algorithms, including the classical subgradient relaxation algorithm with divergent series.
Abstract: We present a simple and unified technique to establish convergence of various minimization methods. These contain the (conceptual) proximal point method, as well as implementable forms such as bundle algorithms, including the classical subgradient relaxation algorithm with divergent series.
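
As a toy instance of the (conceptual) proximal point method the paper covers, take f(x) = |x|, whose proximal map has a closed form known as soft-thresholding. The iteration below is a minimal sketch, not the paper's general setting:

```python
def soft_threshold(x, lam):
    # Closed-form proximal operator of f(x) = |x|.
    if x > lam:  return x - lam
    if x < -lam: return x + lam
    return 0.0

def proximal_point(x0, lam=0.5, iters=20):
    """Conceptual proximal point method on f(x) = |x|:
        x_{k+1} = argmin_y f(y) + (1/(2*lam)) * (y - x_k)^2
    Toy instance of the class of methods the paper's technique unifies."""
    x = x0
    for _ in range(iters):
        x = soft_threshold(x, lam)
    return x

print(proximal_point(3.0))  # converges to the minimizer 0.0
```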

238 citations


Proceedings ArticleDOI
30 Mar 1993
TL;DR: The authors propose a lossless algorithm based on regularities, such as the presence of palindromes, in the DNA, whose results go far beyond those of classical algorithms.
Abstract: The authors propose a lossless algorithm based on regularities, such as the presence of palindromes, in the DNA. The results obtained, although not satisfactory, are far beyond those of classical algorithms.
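
To make the notion of regularity concrete, here is a brute-force sketch that locates reverse-complement palindromes, one of the structures such a compressor can encode as references rather than literals. Illustrative only; a real compressor would use suffix structures and an entropy coder:

```python
COMPLEMENT = str.maketrans("ACGT", "TGCA")

def reverse_complement(s):
    return s.translate(COMPLEMENT)[::-1]

def find_palindromes(dna, min_len=4):
    """Find substrings equal to their own reverse complement,
    a typical DNA regularity. Brute-force sketch, small inputs only."""
    hits = []
    n = len(dna)
    for i in range(n):
        for j in range(i + min_len, n + 1):
            sub = dna[i:j]
            if sub == reverse_complement(sub):
                hits.append((i, sub))
    return hits

print(find_palindromes("ACGTTTGCAT"))  # [(0, 'ACGT'), (5, 'TGCA')]
```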

202 citations


Journal ArticleDOI
TL;DR: This paper proposes an approximation method based on Gibbs sampling which allows an effective derivation of Bayes estimators for hidden Markov models.

Book ChapterDOI
28 Oct 1993
TL;DR: A general form is derived for the interaction matrix associated with a set of image points, which makes all the contributions of camera parameters explicit, and the experiments show an apparently good robustness of the method.
Abstract: This paper addresses the problem of the influence of camera calibration errors on the performance of a control algorithm. More precisely, we examine the effects of errors in the values of intrinsic and extrinsic camera parameters on the stability and the transient behavior of a visual servoing scheme. After having recalled the basic principles of the visual servoing approach, we derive a general form for the interaction matrix associated with a set of image points, which makes all the contributions of camera parameters explicit. We then briefly give some examples of analytical stability results in very simple cases. Since any further theoretical analysis appears extremely difficult, we then present a purely experimental study of the problem. After having given a few conclusions drawn from a simulation step, we select and comment on some results of the experimental approach based on a robot/camera testbed. We observe an apparently good robustness of the method. General comments on the approach and guidelines for future work are given in the conclusion.
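
For a single normalized image point, the interaction matrix has a well-known classical form, sketched below. Sign conventions vary across references, and the paper's version makes the intrinsic-parameter contributions explicit, which this sketch does not:

```python
import numpy as np

def interaction_matrix(x, y, Z):
    """Classical interaction matrix of a normalized image point (x, y) at
    depth Z, relating image motion to the camera velocity screw
    (vx, vy, vz, wx, wy, wz). Illustrative sketch; not the paper's
    derivation with explicit camera-parameter terms."""
    return np.array([
        [-1.0/Z,    0.0, x/Z,     x*y, -(1 + x*x),  y],
        [   0.0, -1.0/Z, y/Z, 1 + y*y,       -x*y, -x],
    ])

# Stacking the matrices of several points gives the L used in a visual
# servoing control law such as  v = -gain * pinv(L) @ error.
```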

Proceedings ArticleDOI
29 Jul 1993
TL;DR: In this paper, the Delaunay triangulation of object contours is used for 3D reconstruction from cross-sections, and the reconstruction of complex shapes is improved by adding vertices on and inside contours.
Abstract: We propose a solution to the problem of 3D reconstruction from cross-sections, based on the Delaunay triangulation of object contours. Its properties--especially the close relationship to the medial axis--enable us to do a compact tetrahedrization resulting in a nearest-neighbor connection. The reconstruction of complex shapes is improved by adding vertices on and inside contours.

Journal ArticleDOI
TL;DR: It is still an open issue to decide which of the various architectures among shared-memory, shared-disk, and shared-nothing is best for database management under various conditions.
Abstract: Parallel database systems attempt to exploit recent multiprocessor computer architectures in order to build high-performance and high-availability database servers at a much lower price than equivalent mainframe computers. Although there are commercial SQL-based products, a number of open problems hamper the full exploitation of the capabilities of parallel systems. These problems touch on issues ranging from those of parallel processing to distributed database management. Furthermore, it is still an open issue to decide which of the various architectures among shared-memory, shared-disk, and shared-nothing is best for database management under various conditions. Finally, there are new issues raised by the introduction of higher functionality such as knowledge-based or object-oriented capabilities within a parallel database system.

Book ChapterDOI
30 Aug 1993
TL;DR: This paper contains the first (to the authors' knowledge) proposal to optimize nested queries in the object-oriented context; queries are translated to nested algebraic expressions which are then unnested to allow for more efficient evaluation.
Abstract: Many declarative query languages for object-oriented databases allow nested subqueries. This paper contains the first (to our knowledge) proposal to optimize them. A two-phase approach is used to optimize nested queries in the object-oriented context. The first phase, called dependency-based optimization, transforms queries at the query language level in order to treat common subexpressions and independent subqueries more efficiently. The transformed queries are translated to nested algebraic expressions. These entail nested-loop evaluation, which may be very inefficient. Hence, the second phase unnests nested algebraic expressions to allow for more efficient evaluation.
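
The gain from unnesting can be shown on a toy example, with Python comprehensions standing in for the algebra: the nested form re-evaluates the independent subquery once per outer element, while the unnested form evaluates it once and then joins. Data and names are hypothetical:

```python
departments = [{"name": "D1", "city": "Paris"}, {"name": "D2", "city": "Nice"}]
employees   = [{"name": "a", "dept": "D1"}, {"name": "b", "dept": "D2"}]

# Nested form: the inner "subquery" runs for every outer element
# (nested-loop evaluation, potentially very inefficient).
nested = [e["name"] for e in employees
          if e["dept"] in [d["name"] for d in departments if d["city"] == "Paris"]]

# Unnested form: evaluate the independent subquery once, then "join".
paris_depts = {d["name"] for d in departments if d["city"] == "Paris"}
unnested = [e["name"] for e in employees if e["dept"] in paris_depts]

assert nested == unnested == ["a"]
```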

Journal ArticleDOI
TL;DR: Recent advances in parallel manipulators are summarized and various applications for this kind of manipulator are illustrated.
Abstract: Parallel manipulators have been increasingly developed over the last few years, from a theoretical viewpoint as well as for practical applications. In this paper, recent advances are summarized and various applications for this kind of manipulator are illustrated.

Journal ArticleDOI
TL;DR: This work studies a conjecture stated in [6] about the numbers of non-zeros of the auto-correlation function and the Walsh transform of the function (−1)^f(x), where f(x) is any Boolean function on {0,1}^n.
Abstract: We study a conjecture stated in [6] about the numbers of non-zeros of, respectively, the auto-correlation function and the Walsh transform of the function (−1)^f(x), where f(x) is any Boolean function on {0,1}^n. The result that we obtain leads us to introduce the class of partially-bent functions. We study the propagation criterion within these functions. We characterize those partially-bent functions which are balanced and prove a relation between their number (which is unknown) and the number of non-balanced partially-bent functions on {0,1}^(n−1). Finally, we study their correlation immunity.
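
Both spectra are straightforward to compute by brute force for small n, which makes the quantities in the conjecture concrete. A sketch using our own truth-table convention:

```python
import numpy as np

def walsh_and_autocorrelation(f_table, n):
    """Walsh transform and auto-correlation of F(x) = (-1)^f(x) for a
    Boolean function f on {0,1}^n given as a truth table of 2^n bits.
    Returns the numbers of non-zeros of each spectrum, the quantities
    the studied conjecture relates. Brute-force sketch, small n only."""
    N = 2 ** n
    F = np.array([(-1) ** f_table[x] for x in range(N)])
    walsh = np.array([sum(F[x] * (-1) ** bin(x & u).count("1")
                          for x in range(N)) for u in range(N)])
    autocorr = np.array([sum(F[x] * F[x ^ s] for x in range(N))
                         for s in range(N)])
    return np.count_nonzero(walsh), np.count_nonzero(autocorr)

# Example: f(x1, x2) = x1*x2 is bent for n = 2: every Walsh coefficient
# is non-zero and the auto-correlation vanishes except at s = 0.
print(walsh_and_autocorrelation([0, 0, 0, 1], 2))  # (4, 1)
```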

Proceedings ArticleDOI
11 May 1993
TL;DR: A physically based deformable model which can be used to track and analyze non-rigid motion of dynamic structures in time sequences of 2-D or 3-D medical images and provides a sound framework for modal analysis, which allows a compact representation of a general deformation by a reduced number of parameters.
Abstract: The authors present a physically based deformable model which can be used to track and analyze non-rigid motion of dynamic structures in time sequences of 2-D or 3-D medical images. The model considers an object undergoing an elastic deformation as a set of masses linked by springs, where the natural length of the springs is set equal to zero and is replaced by a set of constant equilibrium forces, which characterize the shape of the elastic structure in the absence of external forces. This model has the extremely nice property of yielding dynamic equations which are linear and decoupled for each coordinate, irrespective of the amplitude of the deformation. It provides a reduced algorithmic complexity, and a sound framework for modal analysis, which allows a compact representation of a general deformation by a reduced number of parameters. The power of the approach to segment, track and analyze 2-D and 3-D images is demonstrated by a set of experimental results on various complex medical images.
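
The linearity property can be made concrete: with zero-natural-length springs, the elastic force is linear in node positions and one stiffness matrix drives each coordinate independently. A hedged sketch of one explicit integration step; variable names and the explicit Euler scheme are ours, for illustration:

```python
import numpy as np

def step(X, V, K, F_eq, F_ext, mass=1.0, damping=0.5, dt=0.01):
    """One explicit step of a zero-natural-length mass-spring system.
    X, V: (n, 3) node positions and velocities; K: (n, n) stiffness
    matrix encoding the spring graph; F_eq: constant equilibrium forces
    characterizing the rest shape. Illustrative sketch only."""
    # Zero rest lengths make the elastic force linear in the positions,
    # and the same K acts on each coordinate column independently,
    # whatever the amplitude of the deformation.
    A = (-(K @ X) - damping * V + F_eq + F_ext) / mass
    V = V + dt * A
    X = X + dt * V
    return X, V
```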

Journal ArticleDOI
TL;DR: In this paper, a three-dimensional thermodynamic model is proposed for large deformations of viscoelastic incompressible solids, which leads to a nonlinear variational problem which is approximated in space by mixed finite elements and in time by a first order implicit scheme.

Proceedings Article
24 Aug 1993
TL;DR: The cost of query optimization is affected by both the search space and the search strategy of the optimizer, and the trade-off between optimization cost and parallel execution cost is investigated using the DBS3 parallel query optimizer.
Abstract: The cost of query optimization is affected by both the search space and the search strategy of the optimizer. In a parallel execution environment, the search space tends to be much larger than in the centralized case. This is due to the high number of execution alternatives, which implies a significant increase in the optimization cost. In this paper, we investigate the trade-off between optimization cost and parallel execution cost using the DBS3 parallel query optimizer. We describe its cost model, which captures all essential aspects of parallel executions. We show how the cost metrics imply a significant increase in the search space and optimization cost. However, instead of restricting the search space, which may lead to losing better plans, we reduce the optimization cost by controlling the search strategy. We extend randomized strategies to adapt well to parallel query optimization. In particular, we propose Toured Simulated Annealing, which provides a better trade-off between optimization cost and quality of the parallel execution plan.
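
The "toured" idea admits a compact generic rendition: several short annealing runs from different starting plans, keeping the best plan found overall. A hypothetical sketch; the cost and neighbour functions for parallel execution plans are problem-specific and not shown:

```python
import math, random

def toured_simulated_annealing(cost, neighbour, starts,
                               temp0=100.0, alpha=0.95, steps=200):
    """Run one short simulated-annealing 'tour' per starting plan and
    keep the overall best. Toy rendition of the Toured Simulated
    Annealing idea, not the DBS3 optimizer's implementation."""
    best, best_c = None, float("inf")
    for plan in starts:
        t, c = temp0, cost(plan)
        for _ in range(steps):
            cand = neighbour(plan)
            cc = cost(cand)
            # Accept improvements, and worse plans with probability
            # exp(-(cc - c) / t), decreasing as the temperature cools.
            if cc < c or random.random() < math.exp((c - cc) / t):
                plan, c = cand, cc
            if c < best_c:
                best, best_c = plan, c
            t *= alpha
    return best, best_c
```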

Journal ArticleDOI
TL;DR: An original system, Open Robot Controller Computer-Aided Design (ORCCAD), for the computer-aided design of robot controllers is described, providing a coherent approach from a high-level specification down to its implementation, and offers several tools for design, display, and test.
Abstract: An original system, Open Robot Controller Computer-Aided Design (ORCCAD), for the computer-aided design of robot controllers is described. Accessed by three different user levels (system, control, and application), it provides a coherent approach from a high-level specification down to its implementation, and offers several tools for design, display, and test. Following a critical study of the main architectures reported in the literature, the basic principles and underlying concepts of ORCCAD are presented. The main entity considered is the robot task, an elementary control action associated with a local behavior controlled by a set of observers and modeled by a finite-state automaton. It is made of a set of real-time communicating tasks, called module tasks. The module task which handles the behavior of the robot task is described using the synchronous language ESTEREL. The application level is defined as a set of synchronized robot tasks, also described using ESTEREL. Two detailed examples are discussed.

Journal ArticleDOI
TL;DR: In this article, Monte-Carlo numerical experiments comparing both approaches, mixture and classification, under both assumptions, equal and unknown mixing proportions, are reported, and the differences between the finite-sample and the asymptotic behaviour of both approaches are analyzed through additional simulations.
Abstract: Generally, the mixture and the classification approaches via maximum likelihood have been contrasted under different underlying assumptions. In the classification approach, the mixing proportions are assumed to be equal whereas, in the mixture approach, they are supposed to be unknown. In this paper, Monte-Carlo numerical experiments comparing both approaches, mixture and classification, under both assumptions, equal and unknown mixing proportions, are reported. These numerical experiments show that the assumption on the mixing proportions is a more sensitive factor than the choice of the clustering approach, especially in the small sample setting. Moreover, the differences between the finite-sample and the asymptotic behaviour of both approaches are analyzed through additional simulations.

Book
01 Jan 1993
TL;DR: The grammatical point of view shows that many useful extensions of Horn clauses incorporated in Prolog without theoretical justification correspond to well-established grammatical concepts, which gives a natural framework for defining extensions to the concept of logic program.
Abstract: We present a grammatical view of logic programming where logic programs are considered grammars. This gives a natural framework for defining extensions to the concept of logic program. The paper shows that many useful extensions of Horn clauses incorporated in Prolog without theoretical justification correspond to well-established grammatical concepts. In particular, the notion of DCG is a special case of W-grammar, modes are related to the dependency relation of AGs, domain declarations of Turbo Prolog can be seen as a metagrammar of W-grammar, and Prolog arithmetic fits naturally in the framework of RAGs with non-term interpretations. The grammatical point of view also shows a possibility of further extensions, not incorporated in Prolog, like a natural use of external procedures in logic programs. It also opens the way for the use of “grammatical techniques” like parsing or attribute evaluation in the implementation of logic programs. On the other hand, the comparison of the formalisms shows that resolution techniques can be used for some grammars which were considered practically intractable, like RAGs or W-grammars. Last but not least, the grammatical point of view makes it possible to apply in logic programming some proof techniques developed originally for proving correctness of attribute grammars.

Journal ArticleDOI
TL;DR: It is shown that the total program using primitive recursive functionals obtained from a structural proof of termination leads to an (at first) surprisingly efficient algorithm.

Proceedings Article
16 Aug 1993
TL;DR: An abstract instruction set for a constraint solver over finite domains, which can be smoothly integrated into the WAM architecture, is presented; it is based on the use of a single primitive constraint X in r which embeds the core propagation mechanism.
Abstract: We present an abstract instruction set for a constraint solver over finite domains, which can be smoothly integrated in the WAM architecture. It is based on the use of a single primitive constraint X in r which embeds the core propagation mechanism. Complex user constraints such as linear equations or inequations are compiled into X in r expressions which encode the propagation scheme chosen to solve the constraint. The uniform treatment of a single primitive constraint leads to a better understanding of the overall constraint solving process and makes possible three main global optimizations which encompass many previous particular optimizations of "black box" finite domains solvers. Implementation results show that this approach combines both simplicity and efficiency. Our clp(FD) system is more than twice as fast as CHIP on average, with peak speedup reaching seve
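
The flavour of compiling a user constraint into X in r indexicals can be shown on interval domains. A toy sketch of propagating X + Y = c to a fixpoint; it illustrates the propagation idea only, not clp(FD)'s actual abstract instruction set:

```python
def propagate_x_plus_y_eq_c(dx, dy, c):
    """Toy version of compiling X + Y = c into two 'X in r' indexicals:
        X in c - max(Y) .. c - min(Y)
        Y in c - max(X) .. c - min(X)
    Domains are (lo, hi) intervals, narrowed to a fixpoint."""
    changed = True
    while changed:
        changed = False
        for (a, b) in ((0, 1), (1, 0)):
            doms = [dx, dy]
            lo = max(doms[a][0], c - doms[b][1])
            hi = min(doms[a][1], c - doms[b][0])
            if (lo, hi) != doms[a]:
                doms[a] = (lo, hi)
                dx, dy = doms
                changed = True
            if lo > hi:
                return None  # failure: empty domain
    return dx, dy

print(propagate_x_plus_y_eq_c((0, 10), (4, 6), 7))  # ((1, 3), (4, 6))
```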

Proceedings ArticleDOI
01 Dec 1993
TL;DR: A model for data organized as graphs is presented and an algebraic language based on regular expressions over the types of the node and edge labels and supporting a restricted form of recursion is introduced.
Abstract: We present a model for data organized as graphs. Regular expressions over the types of the node and edge labels are used to qualify connected subgraphs. An algebraic language based on these regular expressions and supporting a restricted form of recursion is introduced. A natural application of this model and its query language is hypertext querying.
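
Evaluating such regular expressions over edge labels reduces to a reachability search on the product of the graph with a finite automaton. A minimal sketch, assuming a deterministic automaton for brevity; the paper's algebra also qualifies connected subgraphs and supports restricted recursion, which this does not:

```python
from collections import deque

def regular_path_query(edges, automaton, start_nodes, q0, accepting):
    """Nodes reachable along a path whose edge-label sequence is accepted
    by the automaton. edges: {node: [(label, node), ...]};
    automaton: {(state, label): state}. Illustrative sketch only."""
    seen = {(n, q0) for n in start_nodes}
    queue = deque(seen)
    results = set()
    while queue:
        node, state = queue.popleft()
        if state in accepting:
            results.add(node)
        for label, nxt in edges.get(node, ()):
            q = automaton.get((state, label))
            if q is not None and (nxt, q) not in seen:
                seen.add((nxt, q))
                queue.append((nxt, q))
    return results

# Example: nodes reachable from node 1 via a label path matching  a b*
edges = {1: [("a", 2)], 2: [("b", 2), ("b", 3)]}
aut = {(0, "a"): 1, (1, "b"): 1}
print(regular_path_query(edges, aut, {1}, 0, {1}))  # {2, 3}
```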

Journal ArticleDOI
01 Oct 1993-Tellus A
TL;DR: This paper presents the implementation of the strategy used in the weather forecasting arpege/ifs project to produce the adjoint code from the code representing the numerical model, and describes the Odyssee system, an open system built as a toolkit, written in a high-level programming language adapted to this purpose.
Abstract: This paper describes the design of Odyssee, a system for the manipulation of Fortran programs, and its application to automatic differentiation. The Odyssee system manipulates Fortran programs as symbolic objects. It is an open system built as a toolkit, written in a high-level programming language adapted to this purpose. The use of a variational method to perform data assimilation requires the computation of the gradient of a cost function represented by a large-size Fortran program. The usual drawback of the reverse automatic differentiation method is the storage requirement. The Odyssee system allows one to implement storage/recomputation strategies in order to achieve the needed compromises. We present the implementation of the strategy used in the weather forecasting ARPEGE/IFS project to produce the adjoint code from the code representing the numerical model. Odyssee produces the same code as the hand-written adjoint code for the ARPEGE/IFS project. DOI: 10.1034/j.1600-0870.1993.00016.x
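
The reverse mode's storage requirement is visible even in a hand-written toy: the forward sweep must record intermediates for the reverse sweep to consume. A sketch on a two-variable function; Odyssee instead transforms Fortran source and lets the user trade storage for recomputation:

```python
import math

def f_and_gradient(xs):
    """Tape-based reverse-mode differentiation of
        f(x0, x1) = sin(x0 * x1) + x0
    Hand-written illustration of the reverse mode, not Odyssee output."""
    x0, x1 = xs
    # Forward sweep: record intermediate values (the "tape").
    v1 = x0 * x1
    v2 = math.sin(v1)
    y  = v2 + x0
    # Reverse sweep: propagate adjoints back through the tape.
    y_bar  = 1.0
    v2_bar = y_bar                      # dy/dv2
    x0_bar = y_bar                      # direct use of x0 in the sum
    v1_bar = v2_bar * math.cos(v1)      # d sin(v1)/dv1
    x0_bar += v1_bar * x1               # dv1/dx0
    x1_bar  = v1_bar * x0               # dv1/dx1
    return y, (x0_bar, x1_bar)

print(f_and_gradient((1.0, 2.0)))  # value and gradient of sin(x0*x1) + x0
```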

Proceedings ArticleDOI
02 May 1993
TL;DR: It is shown that by adding four sensors on the passive joints, a unique closed-form solution of the posture of the end-effector can be obtained for the most general case.
Abstract: In the most general case the measurement of the link lengths of a six-degree-of-freedom parallel manipulator is not sufficient to determine the actual unique posture of its platform. It is shown that by adding four sensors on the passive joints, a unique closed-form solution of the posture of the end-effector can be obtained for the most general case. It is shown that three sensors are sufficient for a particular mechanical architecture.

Proceedings ArticleDOI
15 Jun 1993
TL;DR: The development of an efficient model-based approach to detect and precisely characterize important features such as edges, corners and vertices is discussed; the key is to propose efficient models associated with each of these features and to fit them directly to the image by searching for the parameters of the model that best approximate the observed grey-level image intensities.
Abstract: The development of an efficient model-based approach to detect and precisely characterize important features such as edges, corners and vertices is discussed. The key is to propose some efficient models associated with each of these features and to fit them directly to the image by searching for the parameters of the model that best approximate the observed grey-level image intensities. Due to the large amount of time required by a first approach that assumes the blur of the imaging acquisition system to be describable by a 2-D Gaussian filter, different solutions that drastically reduce this computational time are considered and developed. The problem of the initialization phase in the minimization process is considered, and an original and efficient solution is proposed. A large number of experiments involving real images are conducted in order to test and compare the reliability, the robustness, and the efficiency of the proposed approaches.
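
The fitting idea carries over to a one-dimensional sketch: model a blurred step edge as an ideal step convolved with a Gaussian, and search for the parameters that best approximate the observed grey levels. Illustrative only; the paper fits 2-D edge, corner, and vertex models:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erf

def blurred_step(x, x0, sigma, low, high):
    # Ideal step edge at x0 convolved with a Gaussian of scale sigma.
    return low + (high - low) * 0.5 * (1 + erf((x - x0) / (sigma * np.sqrt(2))))

# Fit the model to a noisy 1-D grey-level profile; the recovered x0 is a
# sub-pixel edge position. Synthetic data, hypothetical parameters.
np.random.seed(0)
x = np.arange(0, 40, 1.0)
profile = blurred_step(x, 17.3, 1.5, 20, 180) + np.random.normal(0, 2, x.size)
params, _ = curve_fit(blurred_step, x, profile, p0=(20, 1.0, 0, 255))
print("estimated edge position:", params[0])
```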

Proceedings ArticleDOI
01 Jun 1993
TL;DR: It is shown how classical datalog semantics can be used directly and very simply to provide semantics to a syntactic extension of datalog with methods, classes, inheritance, overloading and late binding.
Abstract: We show how classical datalog semantics can be used directly and very simply to provide semantics to a syntactic extension of datalog with methods, classes, inheritance, overloading and late binding. Several approaches to resolution are considered, implemented in the model, and formally compared. They range from resolution in C++ style to original kinds of resolution suggested by the declarative nature of the language. We show connections to view specification and a further extension allowing runtime derivation of the class hierarchy.