Author

Jack Sherman

Bio: Jack Sherman is an academic researcher. The author has contributed to research in topics: Generalised logistic function & Gompertz function. The author has an h-index of 2 and has co-authored 2 publications receiving 1,124 citations.

Papers
Journal Article
TL;DR: In this article, simplified methods for fitting a Gompertz curve and a modified exponential curve are described; together with the method described by Spurr and Arnold [1] for fitting a logistic curve, these are useful in determining which type of growth curve is most appropriate for a given set of data.
Abstract: This paper describes simplified methods for fitting a Gompertz curve and a modified exponential curve. These methods, together with the one described by Spurr and Arnold [1] for fitting a logistic curve, are useful in determining which type of growth curve is most appropriate for a given set of data.
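The paper's simplified fitting procedure is not reproduced here; as a rough illustration of the curve itself, the sketch below fits a Gompertz curve $y = a e^{-b e^{-ct}}$ to synthetic data by ordinary nonlinear least squares in Python. The data, noise level, and starting values are all invented for the example.

# Sketch: fitting a Gompertz curve y = a * exp(-b * exp(-c*t)) by nonlinear
# least squares. This is NOT the paper's simplified method; the data below
# is synthetic and the starting guesses are arbitrary.
import numpy as np
from scipy.optimize import curve_fit

def gompertz(t, a, b, c):
    """Gompertz growth curve: upper asymptote a, displacement b, growth rate c."""
    return a * np.exp(-b * np.exp(-c * t))

t = np.arange(10, dtype=float)                            # hypothetical time points
y = gompertz(t, 100.0, 5.0, 0.4)                          # synthetic observations
y += np.random.default_rng(0).normal(0.0, 1.0, t.size)    # measurement noise

params, _ = curve_fit(gompertz, t, y, p0=[y.max(), 1.0, 0.1])
a, b, c = params
print(f"a={a:.2f}, b={b:.2f}, c={c:.2f}")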

9 citations


Cited by
Journal Article
TL;DR: The history of these formulas is presented and various applications to statistics, networks, structural analysis, asymptotic analysis, optimization, and partial differential equations are discussed.
Abstract: The Sherman–Morrison–Woodbury formulas relate the inverse of a matrix after a small-rank perturbation to the inverse of the original matrix. The history of these formulas is presented and various applications to statistics, networks, structural analysis, asymptotic analysis, optimization, and partial differential equations are discussed.
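As a concrete illustration of the rank-one case, here is a minimal NumPy check of the Sherman–Morrison identity; the matrix and vectors are arbitrary examples, not taken from the paper.

# Sherman-Morrison: (A + u v^T)^{-1} = A^{-1} - (A^{-1} u v^T A^{-1}) / (1 + v^T A^{-1} u),
# valid when A is invertible and 1 + v^T A^{-1} u != 0.
import numpy as np

rng = np.random.default_rng(1)
n = 5
A = rng.normal(size=(n, n)) + n * np.eye(n)   # arbitrary well-conditioned matrix
u = rng.normal(size=(n, 1))
v = rng.normal(size=(n, 1))

A_inv = np.linalg.inv(A)
denom = 1.0 + (v.T @ A_inv @ u).item()
updated_inv = A_inv - (A_inv @ u @ v.T @ A_inv) / denom

# Compare against inverting the perturbed matrix directly.
direct = np.linalg.inv(A + u @ v.T)
print(np.allclose(updated_inv, direct))       # True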

1,026 citations

Journal Article
TL;DR: In this paper, available expressions for the inverse of the sum of two matrices, one of them nonsingular, are reviewed and new ones are derived.
Abstract: Available expressions are reviewed and new ones derived for the inverse of the sum of two matrices, one of them being nonsingular. Particular attention is given to $(\mathbf{A} + \dots$
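For context, one closely related identity (stated here from general knowledge, not quoted from this paper) covers the case where $A$, $B$, and $A + B$ are all nonsingular:

$$(A + B)^{-1} = A^{-1} - A^{-1}\left(A^{-1} + B^{-1}\right)^{-1} A^{-1}.$$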

834 citations

Posted Content
TL;DR: In this article, a background-aware correlation filter is proposed that models how both the foreground and background of the object vary over time and can be used for real-time tracking.
Abstract: Correlation Filters (CFs) have recently demonstrated excellent performance in terms of rapidly tracking objects under challenging photometric and geometric variations. The strength of the approach comes from its ability to efficiently learn - "on the fly" - how the object is changing over time. A fundamental drawback to CFs, however, is that the background of the object is not modelled over time, which can result in suboptimal results. In this paper we propose a Background-Aware CF that can model how both the foreground and background of the object vary over time. Our approach, like conventional CFs, is extremely computationally efficient - and extensive experiments over multiple tracking benchmarks demonstrate the superior accuracy and real-time performance of our method compared to state-of-the-art trackers, including those based on a deep learning paradigm.

725 citations

Proceedings Article
01 Oct 2017
TL;DR: This work proposes a Background-Aware CF based on hand-crafted features (HOG) that can efficiently model how both the foreground and background of the object vary over time; experiments demonstrate the superior accuracy and real-time performance of the method compared to state-of-the-art trackers.
Abstract: Correlation Filters (CFs) have recently demonstrated excellent performance in terms of rapidly tracking objects under challenging photometric and geometric variations. The strength of the approach comes from its ability to efficiently learn - on the fly - how the object is changing over time. A fundamental drawback to CFs, however, is that the background of the target is not modeled over time, which can result in suboptimal performance. Recent tracking algorithms have suggested resolving this drawback by either learning CFs from more discriminative deep features (e.g. DeepSRDCF [9] and CCOT [11]) or learning complex deep trackers (e.g. MDNet [28] and FCNT [33]). While such methods have been shown to work well, they suffer from high complexity: extracting deep features or applying deep tracking frameworks is very computationally expensive. This limits the real-time performance of such methods, even on high-end GPUs. This work proposes a Background-Aware CF based on hand-crafted features (HOG [6]) that can efficiently model how both the foreground and background of the object vary over time. Our approach, like conventional CFs, is extremely computationally efficient - and extensive experiments over multiple tracking benchmarks demonstrate the superior accuracy and real-time performance of our method compared to state-of-the-art trackers.
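For readers unfamiliar with conventional CFs, the sketch below implements a minimal single-channel correlation filter in the Fourier domain (a MOSSE-style closed form), which is the generic baseline this line of work builds on. It is not the authors' Background-Aware CF; the patch data, regularizer, and response shape are illustrative assumptions.

# Minimal single-channel correlation filter (MOSSE-style), learned and
# applied in the Fourier domain. This is the generic CF baseline, NOT the
# paper's Background-Aware CF; all data and parameters are illustrative.
import numpy as np

def gaussian_response(shape, sigma=2.0):
    """Desired correlation output: a Gaussian peak at the patch center."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    cy, cx = h // 2, w // 2
    return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2.0 * sigma ** 2))

def learn_filter(patch, lam=1e-2):
    """Closed-form CF: H* = (G * conj(F)) / (F * conj(F) + lam), elementwise."""
    F = np.fft.fft2(patch)
    G = np.fft.fft2(gaussian_response(patch.shape))
    return (G * np.conj(F)) / (F * np.conj(F) + lam)

def correlate(H_conj, patch):
    """Apply the learned filter to a patch; the response peak locates the target."""
    F = np.fft.fft2(patch)
    response = np.real(np.fft.ifft2(F * H_conj))
    return np.unravel_index(np.argmax(response), response.shape)

patch = np.random.default_rng(2).normal(size=(64, 64))  # stand-in for image features
H_conj = learn_filter(patch)
print(correlate(H_conj, patch))  # peak near the center for the training patch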

679 citations

Journal Article
TL;DR: In this paper, the authors show that for the most commonly used covariance functions, the matrix $C$ can be hierarchically factored into a product of block low-rank updates of the identity matrix, yielding an $\mathcal{O}(n \log^2 n)$ algorithm for inversion.
Abstract: A number of problems in probability and statistics can be addressed using the multivariate normal (Gaussian) distribution. In the one-dimensional case, computing the probability for a given mean and variance simply requires the evaluation of the corresponding Gaussian density. In the $n$-dimensional setting, however, it requires the inversion of an $n \times n$ covariance matrix, $C$, as well as the evaluation of its determinant, $\det(C)$. In many cases, such as regression using Gaussian processes, the covariance matrix is of the form $C = \sigma^2 I + K$, where $K$ is computed using a specified covariance kernel which depends on the data and additional parameters (hyperparameters). The matrix $C$ is typically dense, causing standard direct methods for inversion and determinant evaluation to require $\mathcal{O}(n^3)$ work. This cost is prohibitive for large-scale modeling. Here, we show that for the most commonly used covariance functions, the matrix $C$ can be hierarchically factored into a product of block low-rank updates of the identity matrix, yielding an $\mathcal{O}(n \log^2 n)$ algorithm for inversion. More importantly, we show that this factorization enables the evaluation of the determinant $\det(C)$, permitting the direct calculation of probabilities in high dimensions under fairly broad assumptions on the kernel defining $K$. Our fast algorithm brings many problems in marginalization and the adaptation of hyperparameters within practical reach using a single CPU core. The combination of nearly optimal scaling in terms of problem size with high-performance computing resources will permit the modeling of previously intractable problems. We illustrate the performance of the scheme on standard covariance kernels.
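For contrast with the paper's $\mathcal{O}(n \log^2 n)$ scheme, the sketch below evaluates the Gaussian log-likelihood for $C = \sigma^2 I + K$ by a dense Cholesky factorization, the standard $\mathcal{O}(n^3)$ baseline the paper accelerates. The squared-exponential kernel, hyperparameters, and data are assumed examples, not taken from the paper.

# Baseline sketch: dense O(n^3) evaluation of the Gaussian log-likelihood for
# C = sigma^2 I + K via Cholesky -- the cost the paper's hierarchical
# factorization avoids. Kernel choice and data are illustrative.
import numpy as np

def sq_exp_kernel(x, length=1.0):
    """Squared-exponential kernel K_ij = exp(-(x_i - x_j)^2 / (2 length^2))."""
    d = x[:, None] - x[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gaussian_loglik(y, x, sigma=0.1, length=1.0):
    n = y.size
    C = sigma**2 * np.eye(n) + sq_exp_kernel(x, length)
    L = np.linalg.cholesky(C)                              # O(n^3) for dense C
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))    # C^{-1} y
    logdet = 2.0 * np.sum(np.log(np.diag(L)))              # log det(C)
    return -0.5 * (y @ alpha + logdet + n * np.log(2.0 * np.pi))

rng = np.random.default_rng(3)
x = np.sort(rng.uniform(0, 10, 200))
y = np.sin(x) + 0.1 * rng.normal(size=x.size)
print(gaussian_loglik(y, x))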

545 citations