Author

Kapil Ahuja

Other affiliations: Virginia Tech
Bio: Kapil Ahuja is an academic researcher from the Indian Institute of Technology Indore. He has contributed to research on topics including linear systems and the generalized minimal residual method. He has an h-index of 10 and has co-authored 59 publications receiving 338 citations. Previous affiliations of Kapil Ahuja include Virginia Tech.


Papers
Journal ArticleDOI
TL;DR: Recycling BiCG is introduced, a BiCG method that recycles two Krylov subspaces from one pair of dual linear systems to the next pair; its derivation also builds the foundation for developing recycling variants of other bi-Lanczos-based methods, such as CGS, BiCGSTAB, QMR, and TFQMR.
Abstract: Science and engineering problems frequently require solving a sequence of dual linear systems. Besides having to store only a few Lanczos vectors, using the biconjugate gradient method (BiCG) to solve dual linear systems has advantages for specific applications. For example, using BiCG to solve the dual linear systems arising in interpolatory model reduction provides a backward error formulation in the model reduction framework. Using BiCG to evaluate bilinear forms---for example, in quantum Monte Carlo (QMC) methods for electronic structure calculations---leads to a quadratic error bound. Since our focus is on sequences of dual linear systems, we introduce recycling BiCG, a BiCG method that recycles two Krylov subspaces from one pair of dual linear systems to the next pair. The derivation of recycling BiCG also builds the foundation for developing recycling variants of other bi-Lanczos based methods, such as CGS, BiCGSTAB, QMR, and TFQMR. We develop an augmented bi-Lanczos algorithm and a modified two-te...

54 citations
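The abstract above builds on the standard BiCG iteration for a primary/dual pair. As a point of reference only, here is a minimal NumPy sketch of plain, unpreconditioned BiCG applied to A x = b and A^T xt = b_tilde (real A), without the recycling machinery that is the paper's actual contribution; the function and variable names are illustrative, not from the paper, and breakdown handling is omitted.

import numpy as np

def bicg_dual(A, b, b_tilde, tol=1e-8, maxiter=500):
    """Plain BiCG for the dual pair A x = b and A^T xt = b_tilde (real A).

    Both solutions are updated with the same scalars alpha/beta, which is
    part of what makes BiCG attractive for dual systems and bilinear forms.
    """
    n = A.shape[0]
    x, xt = np.zeros(n), np.zeros(n)
    r, rt = b - A @ x, b_tilde - A.T @ xt      # primary and dual residuals
    p, pt = r.copy(), rt.copy()                # primary and dual search directions
    rho = rt @ r
    for _ in range(maxiter):
        Ap, Atpt = A @ p, A.T @ pt
        alpha = rho / (pt @ Ap)
        x += alpha * p
        xt += alpha * pt
        r -= alpha * Ap
        rt -= alpha * Atpt
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        rho_new = rt @ r
        beta = rho_new / rho                   # one pair of scalars drives both recurrences
        p = r + beta * p
        pt = rt + beta * pt
        rho = rho_new
    return x, xt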

Journal ArticleDOI
TL;DR: In this article, the recycling BiCG algorithm is extended to BiCGSTAB, which uses a recycle space built from left and right approximate invariant subspaces.
Abstract: Krylov subspace recycling is a process for accelerating the convergence of sequences of linear systems. Based on this technique, the recycling BiCG algorithm has been developed recently. Here, we now generalize and extend this recycling theory to BiCGSTAB. Recycling BiCG focuses on efficiently solving sequences of dual linear systems, while the focus here is on efficiently solving sequences of single linear systems (assuming nonsymmetric matrices for recycling BiCG and recycling BiCGSTAB). As compared with other methods for solving sequences of single linear systems with nonsymmetric matrices (e.g., recycling variants of GMRES), BiCG-based recycling algorithms, like recycling BiCGSTAB, have the advantage that they involve a short-term recurrence and hence do not suffer from storage issues and are also cheaper with respect to the orthogonalizations. We modify the BiCGSTAB algorithm to use a recycle space, which is built from left and right approximate invariant subspaces. Using our algorithm for a parametr...

38 citations
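The recycling described above maintains and reuses approximate invariant subspaces across the sequence, which takes more machinery than fits in a short snippet. As a much simpler illustration of the "sequence of slowly changing linear systems" setting, the sketch below just reuses each solution as the initial guess for the next system via SciPy's stock BiCGSTAB; the matrices and the perturbation are made up for illustration and are not from the paper.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import bicgstab

rng = np.random.default_rng(0)
n = 1000
# A slowly changing sequence A_k x_k = b: a tridiagonal base plus a small shift.
base = sp.diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = rng.standard_normal(n)

x_prev = np.zeros(n)
for k in range(5):
    A_k = base + 0.01 * k * sp.eye(n, format="csr")
    # Warm start from the previous solution; recycling methods additionally
    # carry over a subspace to deflate slowly converging components.
    x_k, info = bicgstab(A_k, b, x0=x_prev)
    assert info == 0, "BiCGSTAB did not converge"
    x_prev = x_k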

Journal ArticleDOI
TL;DR: A Localized Multiple Kernel learning approach for Anomaly Detection (LMKAD) using OCC, where the weight for each kernel is assigned locally and the parameters of the gating function and one-class classifier are optimized simultaneously through a two-step optimization process.
Abstract: Multi-kernel learning has been well explored in the recent past and has exhibited promising outcomes for multi-class classification and regression tasks. In this paper, we present a multiple kernel learning approach for the One-class Classification (OCC) task and employ it for anomaly detection. Recently, a basic multi-kernel approach has been proposed to solve the OCC problem, which is simply a convex combination of different kernels with equal weights. This paper proposes a Localized Multiple Kernel learning approach for Anomaly Detection (LMKAD) using OCC, where the weight for each kernel is assigned locally. The proposed LMKAD approach adapts the weight for each kernel using a gating function. The parameters of the gating function and the one-class classifier are optimized simultaneously through a two-step optimization process. We present empirical results of the performance of LMKAD on 25 benchmark datasets from various disciplines. This performance is evaluated against the existing Multiple Kernel Anomaly Detection (MKAD) algorithm and four other existing kernel-based one-class classifiers to showcase the credibility of our approach. LMKAD achieves significantly better Gmean scores while using fewer support vectors than MKAD. A Friedman test is also performed to verify the statistical significance of the results claimed in this paper.

32 citations
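To illustrate what "locally weighted kernel combination via a gating function" means (this is not the paper's LMKAD training procedure, which jointly optimizes the gate and a one-class classifier in a two-step scheme), the sketch below combines two standard kernels with per-sample softmax gating weights; the gating parameter matrix V is a hypothetical, assumed-given quantity.

import numpy as np
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def gating_weights(X, V):
    """Per-sample kernel weights eta_m(x) from a softmax gating function.

    V has one row (linear weights + bias) per kernel; here it is assumed to
    have been learned elsewhere, e.g. alternated with classifier training.
    """
    scores = X @ V[:, :-1].T + V[:, -1]              # (n_samples, n_kernels)
    scores -= scores.max(axis=1, keepdims=True)      # numerical stability
    e = np.exp(scores)
    return e / e.sum(axis=1, keepdims=True)

def localized_kernel(X, Z, V):
    """Locally weighted combination K(x, z) = sum_m eta_m(x) K_m(x, z) eta_m(z)."""
    Ks = [rbf_kernel(X, Z, gamma=0.5), polynomial_kernel(X, Z, degree=2)]
    Wx, Wz = gating_weights(X, V), gating_weights(Z, V)
    return sum(Wx[:, [m]] * Ks[m] * Wz[:, m] for m in range(len(Ks)))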

Journal ArticleDOI
TL;DR: A new, hybrid approach is proposed that combines the cheap iterations of BiCGStab with the robustness of rGCROT, and is evaluated on a turbulent channel flow problem and on a porous medium flow problem.

30 citations

Journal ArticleDOI
TL;DR: In this paper, a variation of the HOG and Gabor filter combination, called the Histogram of Oriented Texture (HOT), was proposed for classifying mammogram patches as normal-abnormal and benign-malignant.
Abstract: Breast cancer is becoming pervasive with each passing day. Hence, its early detection is a big step in saving the life of any patient. Mammography is a common tool in breast cancer diagnosis. The most important step here is the classification of mammogram patches as normal–abnormal and benign–malignant. The texture of a breast in a mammogram patch plays a significant role in these classifications. We propose a variation of the Histogram of Oriented Gradients (HOG) and Gabor filter combination, called the Histogram of Oriented Texture (HOT), that exploits this fact. We also revisit the Pass Band Discrete Cosine Transform (PB-DCT) descriptor, which captures texture information well. Not all features of a mammogram patch may be useful, so we apply a feature selection technique called Discrimination Potentiality (DP). Our resulting descriptors, DP-HOT and DP-PB-DCT, are compared with the standard descriptors. The density of a mammogram patch is important for classification but has not been studied exhaustively. The Image Retrieval in Medical Application (IRMA) database from RWTH Aachen, Germany, is a standard database that provides mammogram patches, and most researchers have tested their frameworks only on a subset of patches from this database. We apply our two new descriptors to all images of the IRMA database for density-wise classification and compare them with the standard descriptors. We achieve higher accuracy (more than 92%) than all of the existing standard descriptors.

30 citations
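The exact HOT descriptor is defined in the paper above; as a rough analogue of the idea of combining Gabor texture responses with a HOG-style descriptor, one could compute HOG over Gabor filter magnitudes with scikit-image, as sketched below. The frequencies and HOG cell sizes are placeholder values, not the paper's settings.

import numpy as np
from skimage.feature import hog
from skimage.filters import gabor

def gabor_hog_descriptor(patch, frequencies=(0.1, 0.2, 0.3)):
    """Concatenate HOG descriptors computed on Gabor magnitude responses.

    A crude stand-in for HOG + Gabor texture descriptors: each Gabor frequency
    highlights a different texture scale, and HOG then summarizes the local
    orientation structure of that response.
    """
    features = []
    for freq in frequencies:
        real, imag = gabor(patch, frequency=freq)
        magnitude = np.hypot(real, imag)
        features.append(hog(magnitude, orientations=9,
                            pixels_per_cell=(8, 8), cells_per_block=(2, 2)))
    return np.concatenate(features)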


Cited by

Journal ArticleDOI
TL;DR: Model reduction aims to reduce the computational burden by generating reduced models that are faster and cheaper to simulate yet accurately represent the original large-scale system behavior. Model reduction of linear, nonparametric dynamical systems has reached a considerable level of maturity, as reflected by several survey papers and books, while parametric model reduction has emerged more recently as an important research area.
Abstract: Numerical simulation of large-scale dynamical systems plays a fundamental role in studying a wide range of complex physical phenomena; however, the inherent large-scale nature of the models often leads to unmanageable demands on computational resources. Model reduction aims to reduce this computational burden by generating reduced models that are faster and cheaper to simulate, yet accurately represent the original large-scale system behavior. Model reduction of linear, nonparametric dynamical systems has reached a considerable level of maturity, as reflected by several survey papers and books. However, parametric model reduction has emerged only more recently as an important and vibrant research area, with several recent advances making a survey paper timely. Thus, this paper aims to provide a resource that draws together recent contributions in different communities to survey the state of the art in parametric model reduction methods. Parametric model reduction targets the broad class of problems for wh...

1,230 citations
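To make the projection idea behind this survey concrete, the following is a minimal one-sided (Galerkin) reduction of a linear time-invariant system by moment matching at a single shift. It is only a sketch of the non-parametric building block, with dense solves and an arbitrary shift and order; it does not represent the parametric methods the survey covers.

import numpy as np

def reduce_lti(A, B, C, s0=1.0, r=5):
    """Reduce x' = A x + B u, y = C x by projecting onto a rational Krylov
    basis at shift s0: V spans span{(s0 I - A)^{-1} B, ..., (s0 I - A)^{-r} B},
    so the reduced transfer function matches r moments at s0."""
    n = A.shape[0]
    shifted = s0 * np.eye(n) - A
    blocks = [np.linalg.solve(shifted, B)]           # B assumed n x 1 here
    for _ in range(r - 1):
        blocks.append(np.linalg.solve(shifted, blocks[-1]))
    V, _ = np.linalg.qr(np.hstack(blocks))           # orthonormal basis, n x r
    return V.T @ A @ V, V.T @ B, C @ V               # reduced (A_r, B_r, C_r)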

Book
01 Sep 2014
TL;DR: It is quite impossible to include, in a single volume of reasonable size, an adequate and exhaustive discussion of the calculus in its more advanced stages, so it becomes necessary, in planning a thoroughly sound course in the subject, to consider several important aspects of the vast field confronting a modern writer.
Abstract: With the ever-widening scope of modern mathematical analysis and its many ramifications, it is quite impossible to include, in a single volume of reasonable size, an adequate and exhaustive discussion of the calculus in its more advanced stages. It therefore becomes necessary, in planning a thoroughly sound course in the subject, to consider several important aspects of the vast field confronting a modern writer. The limitation of space renders the selection of subject-matter fundamentally dependent upon the aim of the course, which may or may not be related to the content of specific examination syllabuses. Logical development, too, may lead to the inclusion of many topics which, at present, may only be of academic interest, while others, of greater practical value, may have to be omitted. The experience and training of the writer may also have, more or less, a bearing on both these considerations. Advanced Calculus. By Dr. C. A. Stewart. Pp. xviii + 523. (London: Methuen and Co., Ltd., 1940.) 25s.

881 citations

Journal ArticleDOI
TL;DR: This review aims to identify the common underlying principles and the assumptions that are often made implicitly by various methods in deep learning, and draws connections between classic “shallow” and novel deep approaches and shows how this relation might cross-fertilize or extend both directions.
Abstract: Deep learning approaches to anomaly detection have recently improved the state of the art in detection performance on complex datasets such as large collections of images or text. These results have sparked a renewed interest in the anomaly detection problem and led to the introduction of a great variety of new methods. With the emergence of numerous such methods, including approaches based on generative models, one-class classification, and reconstruction, there is a growing need to bring methods of this field into a systematic and unified perspective. In this review we aim to identify the common underlying principles as well as the assumptions that are often made implicitly by various methods. In particular, we draw connections between classic 'shallow' and novel deep approaches and show how this relation might cross-fertilize or extend both directions. We further provide an empirical assessment of major existing methods that is enriched by the use of recent explainability techniques, and present specific worked-through examples together with practical advice. Finally, we outline critical open challenges and identify specific paths for future research in anomaly detection.

310 citations
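As a toy instance of the "reconstruction" family of detectors this review discusses, the sketch below scores points by PCA reconstruction error, a classic shallow baseline; deep variants replace the linear PCA map with an autoencoder. It is entirely illustrative and not taken from the review.

import numpy as np
from sklearn.decomposition import PCA

def pca_anomaly_scores(X_train, X_test, n_components=10):
    """Reconstruction-error anomaly scores: fit PCA on (mostly normal) training
    data, then score test points by how poorly the low-rank model reconstructs
    them; larger scores mean more anomalous."""
    pca = PCA(n_components=n_components).fit(X_train)
    X_rec = pca.inverse_transform(pca.transform(X_test))
    return np.linalg.norm(X_test - X_rec, axis=1)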