
Jacobian matrix and determinant

About: Jacobian matrix and determinant is a research topic. Over its lifetime, 9,334 publications have been published within this topic, receiving 183,747 citations. The topic is also known as: Jacobi matrix & Jacobi system.
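For readers skimming the topic, the object in question is the matrix of first-order partial derivatives of a vector-valued function; when the matrix is square, its determinant measures local volume scaling. A minimal statement of the standard definition (not drawn from any particular paper below):

```latex
% Jacobian matrix of f : R^n -> R^m, evaluated at x
J_f(x) =
\begin{pmatrix}
\dfrac{\partial f_1}{\partial x_1} & \cdots & \dfrac{\partial f_1}{\partial x_n} \\
\vdots & \ddots & \vdots \\
\dfrac{\partial f_m}{\partial x_1} & \cdots & \dfrac{\partial f_m}{\partial x_n}
\end{pmatrix},
\qquad
f(x + h) \approx f(x) + J_f(x)\, h .

% When m = n, det J_f(x) is the Jacobian determinant, which appears in the
% change-of-variables formula:
\int_{f(U)} g(y)\, dy = \int_{U} g(f(x))\, \lvert \det J_f(x) \rvert \, dx .
```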


Papers
Book
01 Jan 2006
TL;DR: In this book, rigid motions and homogeneous transformations, forward and inverse kinematics, and the velocity Jacobian are developed as the foundation for trajectory planning, dynamics, multivariable and force control, and vision-based control.
Abstract: Preface. 1. Introduction. 2. Rigid Motions and Homogeneous Transformations. 3. Forward and Inverse Kinematics. 4. Velocity Kinematics-The Jacobian. 5. Path and Trajectory Planning. 6. Independent Joint Control. 7. Dynamics. 8. Multivariable Control. 9. Force Control. 10. Geometric Nonlinear Control. 11. Computer Vision. 12. Vision-Based Control. Appendix A: Trigonometry. Appendix B: Linear Algebra. Appendix C: Dynamical Systems. Appendix D: Lyapunov Stability. Index.

3,100 citations
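The "Velocity Kinematics - The Jacobian" chapter listed above concerns the manipulator Jacobian, which maps joint velocities to end-effector velocities. A minimal numerical sketch for a two-link planar arm (the link lengths and function name are illustrative, not taken from the book):

```python
import numpy as np

def planar_2link_jacobian(q1, q2, l1=1.0, l2=0.7):
    """Jacobian of the end-effector position (x, y) of a 2-link planar arm
    with respect to the joint angles (q1, q2)."""
    # Forward kinematics: x = l1*cos(q1) + l2*cos(q1+q2),
    #                     y = l1*sin(q1) + l2*sin(q1+q2)
    return np.array([
        [-l1*np.sin(q1) - l2*np.sin(q1 + q2), -l2*np.sin(q1 + q2)],
        [ l1*np.cos(q1) + l2*np.cos(q1 + q2),  l2*np.cos(q1 + q2)],
    ])

# End-effector velocity from joint velocities: v = J(q) @ qdot
q = np.array([0.3, 0.5])
qdot = np.array([0.1, -0.2])
print(planar_2link_jacobian(*q) @ qdot)
```

Configurations where this Jacobian loses rank (here, q2 = 0 or pi) are the arm's kinematic singularities.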

Book
01 Apr 1988
TL;DR: In this book, the authors develop matrix differential calculus: basic properties of vectors and matrices, Kronecker products, the vec-operator and the Moore-Penrose inverse, first- and second-order differentials with their Jacobian and Hessian matrices, and applications to the linear regression model and maximum likelihood estimation.
Abstract: Preface. MATRICES: Basic Properties of Vectors and Matrices. Kronecker Products, the Vec-Operator and the Moore-Penrose Inverse. Miscellaneous Matrix Results. DIFFERENTIALS: THE THEORY: Mathematical Preliminaries. Differentials and Differentiability. The Second Differential. Static Optimization. DIFFERENTIALS: THE PRACTICE: Some Important Differentials. First-Order Differentials and Jacobian Matrices. Second-Order Differentials and Hessian Matrices. INEQUALITIES: Inequalities. THE LINEAR MODEL: Statistical Preliminaries. The Linear Regression Model. Further Topics in the Linear Model. APPLICATIONS TO MAXIMUM LIKELIHOOD ESTIMATION: Maximum Likelihood Estimation. Simultaneous Equations. Topics in Psychometrics. Subject Index. Bibliography.

2,868 citations
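A useful takeaway from the "First-Order Differentials and Jacobian Matrices" material is the identification rule: if df = A(x) dx, then the Jacobian at x is A(x). A quick numerical check of that rule for a linear map (the sizes and random seed here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
x = rng.standard_normal(4)

def f(x):
    return A @ x  # linear map, so d(Ax) = A dx and the Jacobian is A everywhere

# Forward-difference approximation of the Jacobian, built column by column
eps = 1e-6
J = np.column_stack([(f(x + eps*e) - f(x)) / eps for e in np.eye(4)])

print(np.allclose(J, A, atol=1e-5))  # True: the identified Jacobian matches A
```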

Proceedings ArticleDOI
23 Jun 2013
TL;DR: A Supervised Descent Method (SDM) is proposed for minimizing a Non-linear Least Squares (NLS) function and achieves state-of-the-art performance in the problem of facial feature detection.
Abstract: Many computer vision problems (e.g., camera calibration, image alignment, structure from motion) are solved through a nonlinear optimization method. It is generally accepted that 2nd order descent methods are the most robust, fast and reliable approaches for nonlinear optimization of a general smooth function. However, in the context of computer vision, 2nd order descent methods have two main drawbacks: (1) The function might not be analytically differentiable and numerical approximations are impractical. (2) The Hessian might be large and not positive definite. To address these issues, this paper proposes a Supervised Descent Method (SDM) for minimizing a Non-linear Least Squares (NLS) function. During training, the SDM learns a sequence of descent directions that minimizes the mean of NLS functions sampled at different points. In testing, SDM minimizes the NLS objective using the learned descent directions without computing the Jacobian or the Hessian. We illustrate the benefits of our approach in synthetic and real examples, and show how SDM achieves state-of-the-art performance in the problem of facial feature detection. The code is available at www.humansensing.cs.cmu.edu/intraface.

2,138 citations
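A hedged sketch of the training loop the abstract describes: learn a cascade of linear maps that regress from features at the current estimate to the update toward the ground truth, so that no Jacobian or Hessian is formed at test time. The function and variable names below are illustrative, not from the authors' released code:

```python
import numpy as np

def train_sdm(X0, X_true, features, n_stages=4, reg=1e-3):
    """Learn a cascade of linear descent maps (R_k, b_k).
    X0, X_true : (N, d) initial and ground-truth parameter estimates.
    features   : callable mapping an (N, d) batch of estimates to (N, p) features."""
    X = X0.copy()
    stages = []
    for _ in range(n_stages):
        H = features(X)                      # (N, p) features at current estimates
        dX = X_true - X                      # desired updates, (N, d)
        # Ridge-regularised least squares for R_k (p x d) and bias b_k (d,)
        H1 = np.hstack([H, np.ones((len(H), 1))])
        W = np.linalg.solve(H1.T @ H1 + reg*np.eye(H1.shape[1]), H1.T @ dX)
        R, b = W[:-1], W[-1]
        stages.append((R, b))
        X = X + H @ R + b                    # apply the learned descent step
    return stages

def apply_sdm(x0, features, stages):
    """Run the learned cascade from a single initial estimate x0 (shape (d,))."""
    x = x0.copy()
    for R, b in stages:
        x = x + features(x[None, :])[0] @ R + b
    return x
```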

Journal ArticleDOI
TL;DR: The aim of this paper is to give the reader a perspective on how Jacobian-free Newton-Krylov (JFNK) methods may be applied to problems of interest and to provide sources of further practical information.

1,803 citations
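JFNK here is the Jacobian-free Newton-Krylov approach: each Newton step solves J(u) delta = -F(u) with a Krylov solver, and the matrix-vector product J(u) v is approximated by a finite difference of F, so the Jacobian is never assembled. A minimal sketch using SciPy's GMRES (the test system and step sizes are illustrative):

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, gmres

def jfnk(F, u0, tol=1e-8, max_newton=20, eps=1e-7):
    """Jacobian-free Newton-Krylov: Newton's method with GMRES inner solves,
    where J(u) @ v is approximated by (F(u + eps*v) - F(u)) / eps."""
    u = u0.astype(float).copy()
    for _ in range(max_newton):
        Fu = F(u)
        if np.linalg.norm(Fu) < tol:
            break
        Jv = lambda v: (F(u + eps*v) - Fu) / eps          # matrix-free J @ v
        J = LinearOperator((u.size, u.size), matvec=Jv)
        delta, _ = gmres(J, -Fu)                          # inner Krylov solve
        u = u + delta
    return u

# Example: solve x0^2 + x1 - 3 = 0, x0 + x1^2 - 5 = 0 (solution (1, 2))
F = lambda x: np.array([x[0]**2 + x[1] - 3, x[0] + x[1]**2 - 5])
print(jfnk(F, np.array([1.0, 1.0])))
```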

Journal ArticleDOI
TL;DR: VODE is a new initial value ODE solver for stiff and nonstiff systems that uses variable-coefficient Adams-Moulton and Backward Differentiation Formula methods in Nordsieck form, treating the Jacobian as full or banded.
Abstract: VODE is a new initial value ODE solver for stiff and nonstiff systems. It uses variable-coefficient Adams-Moulton and Backward Differentiation Formula (BDF) methods in Nordsieck form, as taken from the older solvers EPISODE and EPISODEB, treating the Jacobian as full or banded. Unlike the older codes, VODE has a highly flexible user interface that is nearly identical to that of the ODEPACK solver LSODE. In the process, several algorithmic improvements have been made in VODE, aside from the new user interface. First, a change in stepsize and/or order that is decided upon at the end of one successful step is not implemented until the start of the next step, so that interpolations performed between steps use the more correct data. Second, a new algorithm for setting the initial stepsize has been included, which iterates briefly to estimate the required second derivative vector. Efficiency is often greatly enhanced by an added algorithm for saving and reusing the Jacobian matrix J, as it occurs in the Newton m...

1,601 citations
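VODE's Fortran code still ships with SciPy as the 'vode' integrator of scipy.integrate.ode, where the full/banded Jacobian handling mentioned above maps to the with_jacobian, lband, and uband options. A small sketch solving a stiff Van der Pol oscillator in BDF mode with a user-supplied full Jacobian (the stiffness parameter and time grid are illustrative):

```python
import numpy as np
from scipy.integrate import ode

mu = 1000.0  # stiffness parameter of the Van der Pol oscillator

def f(t, y):
    return [y[1], mu*(1 - y[0]**2)*y[1] - y[0]]

def jac(t, y):
    # Full 2x2 Jacobian of f with respect to y
    return [[0.0, 1.0],
            [-2*mu*y[0]*y[1] - 1.0, mu*(1 - y[0]**2)]]

solver = ode(f, jac).set_integrator('vode', method='bdf',
                                    with_jacobian=True, nsteps=5000)
solver.set_initial_value([2.0, 0.0], 0.0)

while solver.successful() and solver.t < 3000.0:
    solver.integrate(solver.t + 100.0)
    print(solver.t, solver.y)
```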


Network Information
Related Topics (5)
Nonlinear system: 208.1K papers, 4M citations, 86% related
Differential equation: 88K papers, 2M citations, 84% related
Linear system: 59.5K papers, 1.4M citations, 84% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 83% related
Partial differential equation: 70.8K papers, 1.6M citations, 83% related
Performance Metrics
No. of papers in the topic in previous years:
Year: Papers
2023: 475
2022: 978
2021: 400
2020: 398
2019: 409
2018: 397