Author

Raghunathan Rengaswamy

Bio: Raghunathan Rengaswamy is an academic researcher from the Indian Institute of Technology Madras. The author has contributed to research in topics such as proton exchange membrane fuel cells and fault detection and isolation. The author has an h-index of 39 and has co-authored 210 publications receiving 9632 citations. Previous affiliations of Raghunathan Rengaswamy include the Indian Institute of Technology Bombay and Bosch.


Papers
Journal ArticleDOI
TL;DR: In this paper, a univariate interval halving technique is fused with the Mahalanobis distance to develop a multivariate tool that accounts for interactions between variables and can be used for reliable CLPA and/or for user-defined benchmarking of control loops.
Abstract: Control loop performance assessment (CLPA) techniques assume that the data being analyzed is generated during steady state operation with fixed plant dynamics and controller parameters. However, in industrial settings one often encounters environmental and feedstock variations that can induce significant changes in the plant dynamics. The availability of data sets corresponding to fixed configurations is therefore questionable in industrial scenarios, in which case it becomes imperative to extract such data from routine plant operating data. This article proposes a technique for segmenting multivariate control loop data into portions corresponding to fixed steady state operation of the system. The proposed technique exploits the fact that changes in the operating region of the system lead to changes in the variance-covariance matrix of multivariate control loop data. The univariate interval halving technique is fused with the Mahalanobis distance to develop a multivariate tool that accounts for interactions between variables. The resulting data segments can be used for reliable CLPA and/or for user-defined benchmarking of control loops. A multivariate control loop performance index is also proposed that requires significantly less data than one of the previously proposed techniques. The proposed technique requires only routine operating data from the plant and is tested on benchmark systems from the literature through simulations. Experimental validation on a model predictive control system aimed at maintaining the temperature profile of a metal plate demonstrates the applicability of the technique to industrial systems. The proposed technique acts as a tool for preprocessing data relevant to CLPA and can be applied to large-scale interacting multivariate systems. © 2017 American Institute of Chemical Engineers AIChE J, 63: 3311–3328, 2017
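The segmentation idea lends itself to a short illustration. The following is a minimal sketch, not the authors' published algorithm: it recursively halves a window of multivariate data and uses the Mahalanobis distance between the means of the two halves, under a pooled covariance, to decide whether the window spans a change in operating region. The threshold, minimum segment length, and test statistic are illustrative assumptions.

```python
# Minimal sketch of interval-halving segmentation with a Mahalanobis distance test.
# Not the paper's exact algorithm; threshold and minimum length are assumptions.
import numpy as np

def mahalanobis(mean_a, mean_b, pooled_cov):
    """Mahalanobis distance between two segment means under a pooled covariance."""
    diff = mean_a - mean_b
    return float(np.sqrt(diff @ np.linalg.pinv(pooled_cov) @ diff))

def segment(data, start, end, threshold, min_len, segments):
    """Recursively halve [start, end); keep halves whose means agree, split otherwise."""
    n = end - start
    if n < 2 * min_len:
        segments.append((start, end))
        return
    mid = start + n // 2
    left, right = data[start:mid], data[mid:end]
    pooled = 0.5 * (np.cov(left, rowvar=False) + np.cov(right, rowvar=False))
    d = mahalanobis(left.mean(axis=0), right.mean(axis=0), pooled)
    if d > threshold:   # halves differ: the operating point changed inside the window
        segment(data, start, mid, threshold, min_len, segments)
        segment(data, mid, end, threshold, min_len, segments)
    else:               # halves agree: treat the whole window as one steady segment
        segments.append((start, end))

# Example: 2000 samples of a 3-variable loop with a mean shift halfway through.
rng = np.random.default_rng(0)
data = np.vstack([rng.normal(0.0, 1.0, (1000, 3)),
                  rng.normal(2.0, 1.0, (1000, 3))])
segs = []
segment(data, 0, len(data), threshold=1.0, min_len=100, segments=segs)
print(segs)   # two segments split near the change point are expected
```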

12 citations

Journal ArticleDOI
TL;DR: A genetic algorithm optimization-based design tool for discovering very large-scale integration of discrete microfluidic networks for a given objective function is presented; it can be a significant step toward drastically cutting down on the laborious trial-and-error design process.
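As an illustration of the evolutionary search that underlies such a design tool, the sketch below runs a generic genetic algorithm over a binary encoding of candidate network elements. The encoding, the objective function, and all parameters are hypothetical stand-ins rather than the paper's formulation.

```python
# A minimal, illustrative genetic-algorithm loop; the binary design encoding and
# the placeholder objective are hypothetical, not the paper's microfluidic model.
import numpy as np

rng = np.random.default_rng(1)
N_GENES, POP, GENS = 32, 60, 200
target = rng.integers(0, 2, N_GENES)          # hypothetical "desired" network layout

def fitness(design):
    # Placeholder objective: agreement with a target connectivity pattern.
    return int(np.sum(design == target))

pop = rng.integers(0, 2, (POP, N_GENES))
for _ in range(GENS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]           # truncation selection
    children = []
    while len(children) < POP:
        a, b = parents[rng.integers(len(parents), size=2)]
        cut = rng.integers(1, N_GENES)
        child = np.concatenate([a[:cut], b[cut:]])           # one-point crossover
        flip = rng.random(N_GENES) < 0.02                    # bit-flip mutation
        child[flip] ^= 1
        children.append(child)
    pop = np.array(children)

best = max(pop, key=fitness)
print(fitness(best), "/", N_GENES)
```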

12 citations

Proceedings ArticleDOI
11 Jun 2008
TL;DR: This contribution presents the use of two well-known techniques of multi-objective optimization to solve for a plant-friendly input design in which the plant-friendly objective is to keep input move sizes low.
Abstract: In optimal input design problems, the designer seeks to solve for maximally informative inputs to be used as perturbation signals in system identification experiments. Plant-friendly identification experiments are those that satisfy plant or operator constraints on experiment time, input and output amplitudes, or input move sizes. These constraints have been reported to be in direct conflict with the requirements for good identification; hence plant-friendly input design is inherently multi-objective in nature. In this contribution, we present the use of two well-known techniques of multi-objective optimization to solve for a plant-friendly input design in which the plant-friendly objective is to keep input move sizes low. We relax the constraint on the input move sizes by constraining the variance of the move size instead. Both techniques result in convex optimization problems which can be solved efficiently using powerful algorithms.
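A rough sense of the trade-off can be given with a small convex program. The sketch below is a minimal illustration, not the paper's formulation: it designs an input power spectrum over a frequency grid for an assumed FIR model, trading off the log-determinant of the information matrix against the variance of the input move size. The model, frequency grid, power budget, weight lam, and the use of cvxpy with a conic solver are all assumptions.

```python
# Weighted-sum scalarization of informativeness vs. input move-size variance,
# over an input power spectrum on a frequency grid (illustrative assumptions).
import numpy as np
import cvxpy as cp

n_freq, n_par = 20, 3                        # frequency grid size, FIR parameter count
w = np.linspace(0.05, np.pi, n_freq)         # frequency grid (rad/sample)

# Regressor for an FIR(3) model: F(w) = [1, e^{-jw}, e^{-j2w}]^T
F = np.exp(-1j * np.outer(w, np.arange(n_par)))           # (n_freq, n_par)

phi = cp.Variable(n_freq, nonneg=True)       # input power at each grid frequency

# Information matrix is linear in phi; the real part of F F^H is symmetric PSD.
M = sum(phi[k] * np.real(np.outer(F[k].conj(), F[k])) for k in range(n_freq))

# "Plant-friendly" term: variance of the input move size u(t) - u(t-1).
move_var = sum(phi[k] * np.abs(1 - np.exp(-1j * w[k])) ** 2 for k in range(n_freq))

lam = 0.5                                    # trade-off weight between the two objectives
objective = cp.Maximize(cp.log_det(M) - lam * move_var)
constraints = [cp.sum(phi) <= 1.0]           # total input power budget
cp.Problem(objective, constraints).solve()   # assumes a log_det-capable solver (e.g. SCS)
print("optimal spectrum:", np.round(phi.value, 3))
```

Sweeping lam, or alternatively placing an explicit bound on the move-size variance, traces out the trade-off curve between informativeness and plant-friendliness; these are two common scalarization approaches and are not claimed to be the paper's exact pair of techniques.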

11 citations

Proceedings ArticleDOI
01 Jan 2004
TL;DR: In this article, a recursive nonlinear dynamic data reconciliation (RNDDR) formulation is presented, which extends the capability of the EKF by allowing for incorporation of algebraic constraints and bounds.
Abstract: The task of improving the quality of data so that it is consistent with material and energy balances is called reconciliation. Since chemical processes often operate dynamically in nonlinear regimes, techniques such as the extended Kalman filter (EKF) and nonlinear dynamic data reconciliation (NDDR) have been developed. Various issues arise with the use of either of these techniques: the EKF cannot handle inequality or equality constraints, while NDDR has a high computational cost. In this paper, a recursive nonlinear dynamic data reconciliation (RNDDR) formulation is presented. The RNDDR formulation extends the capability of the EKF by allowing for the incorporation of algebraic constraints and bounds. The RNDDR is evaluated on four case studies previously studied by Haseltine and Rawlings. It is shown that the EKF fails to construct reliable state estimates in all four cases because of its inability to handle algebraic constraints, whereas the RNDDR formulation achieves reliable state estimates in all cases even in the presence of large initialization errors.
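The recursive constrained-update idea can be sketched as follows: propagate the state with the nonlinear model as an EKF would, then replace the unconstrained Kalman correction with a small optimization over a single time step subject to bounds such as nonnegative concentrations. The toy reaction model, noise levels, weights, and simplified covariance propagation below are illustrative assumptions, not the case studies from the paper.

```python
# Minimal sketch of a recursive constrained update in the spirit of RNDDR.
# Toy model and tuning are assumptions; covariance propagation is simplified.
import numpy as np
from scipy.optimize import minimize

def f(x, dt=0.1, k=0.16):
    """Toy nonlinear dynamics: 2A -> B in a batch reactor, explicit Euler step."""
    ca, cb = x
    r = k * ca ** 2
    return np.array([ca - 2 * r * dt, cb + r * dt])

def h(x):
    """Measurement: a pressure-like signal proportional to cA + cB."""
    return np.array([x[0] + x[1]])

def rnddr_update(x_pred, P_pred, y, R):
    """Constrained update: min (x - x_pred)' P^-1 (x - x_pred) + (y - h(x))' R^-1 (y - h(x))."""
    Pinv, Rinv = np.linalg.inv(P_pred), np.linalg.inv(R)
    def cost(x):
        dx, dy = x - x_pred, y - h(x)
        return dx @ Pinv @ dx + dy @ Rinv @ dy
    res = minimize(cost, x_pred, bounds=[(0.0, None), (0.0, None)])  # enforce x >= 0
    return res.x

rng = np.random.default_rng(2)
x_true, x_hat = np.array([3.0, 1.0]), np.array([0.1, 4.5])   # deliberately poor initial guess
P, Q, R = np.eye(2) * 4.0, np.eye(2) * 1e-4, np.array([[0.01]])
for _ in range(50):
    x_true = f(x_true)
    y = h(x_true) + rng.normal(0, 0.1, 1)
    x_hat = f(x_hat)                     # prediction step
    P = P + Q                            # simplified covariance propagation (assumption)
    x_hat = rnddr_update(x_hat, P, y, R)
print("true:", np.round(x_true, 3), "estimate:", np.round(x_hat, 3))
```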

11 citations

Journal ArticleDOI
TL;DR: In this paper, the authors identify and categorize various hybrid models available in the literature that integrate machine learning models with different forms of domain knowledge, and summarize benefits such as enhanced predictive power, extrapolation capabilities, and other advantages of combining the two approaches.
Abstract: Model building and parameter estimation are traditional concepts widely used in chemical, biological, metallurgical, and manufacturing industries. Early modeling methodologies focused on mathematically capturing the process knowledge and domain expertise of the modeler. The models thus developed are termed first principles models (or white-box models). Over time, computational power became cheaper, and massive amounts of data became available for modeling. This led to the development of cutting edge machine learning models (black-box models) and artificial intelligence (AI) techniques. Hybrid models (gray-box models) are a combination of first principles and machine learning models. The development of hybrid models has captured the attention of researchers as this combines the best of both modeling paradigms. Recent attention to this field stems from the interest in explainable AI (XAI), a critical requirement as AI systems become more pervasive. This work aims at identifying and categorizing various hybrid models available in the literature that integrate machine-learning models with different forms of domain knowledge. Benefits such as enhanced predictive power, extrapolation capabilities, and other advantages of combining the two approaches are summarized. The goal of this article is to consolidate the published corpus in the area of hybrid modeling and develop a comprehensive framework to understand the various techniques presented. This framework can further be used as the foundation to explore rational associations between several models.
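One widely used hybrid (gray-box) structure, the residual hybrid, can be sketched briefly: a first-principles model supplies the baseline prediction and a machine-learning model is trained on its residuals. The rate law, synthetic data, and choice of regressor below are illustrative assumptions rather than any specific model from the review.

```python
# Minimal residual-hybrid (gray-box) sketch: white-box baseline + ML correction.
# The physics model and data are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def first_principles(T):
    """Simplified Arrhenius-type rate law (the 'white-box' part)."""
    return 1e3 * np.exp(-2000.0 / T)

rng = np.random.default_rng(3)
T = rng.uniform(300.0, 400.0, 500).reshape(-1, 1)
# Synthetic "plant" data: the true rate deviates from the simple law (unmodeled effect).
rate_true = first_principles(T[:, 0]) * (1.0 + 0.3 * np.sin(T[:, 0] / 20.0))
rate_meas = rate_true + rng.normal(0, 0.01, 500)

residual = rate_meas - first_principles(T[:, 0])      # what the physics cannot explain
ml_model = RandomForestRegressor(n_estimators=100, random_state=0).fit(T, residual)

def hybrid_predict(T_new):
    """Gray-box prediction = white-box baseline + learned residual correction."""
    T_new = np.asarray(T_new, dtype=float).reshape(-1, 1)
    return first_principles(T_new[:, 0]) + ml_model.predict(T_new)

print(hybrid_predict([350.0]))
```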

11 citations


Cited by
Journal ArticleDOI


08 Dec 2001 - BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one, which seemed an odd beast: an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Christopher M. Bishop
01 Jan 2006
TL;DR: Probability distributions, linear models for regression and classification, neural networks, kernel methods, graphical models, and approaches to combining models are covered in this book in the context of machine learning.
Abstract: Probability Distributions.- Linear Models for Regression.- Linear Models for Classification.- Neural Networks.- Kernel Methods.- Sparse Kernel Machines.- Graphical Models.- Mixture Models and EM.- Approximate Inference.- Sampling Methods.- Continuous Latent Variables.- Sequential Data.- Combining Models.

10,141 citations

01 Apr 2003
TL;DR: The EnKF has a large user group, and numerous publications have discussed its applications and theoretical aspects; this paper reviews the important results from these studies and also presents new ideas and alternative interpretations that further explain the success of the EnKF.
Abstract: The purpose of this paper is to provide a comprehensive presentation and interpretation of the Ensemble Kalman Filter (EnKF) and its numerical implementation. The EnKF has a large user group, and numerous publications have discussed applications and theoretical aspects of it. This paper reviews the important results from these studies and also presents new ideas and alternative interpretations which further explain the success of the EnKF. In addition to providing the theoretical framework needed for using the EnKF, there is also a focus on the algorithmic formulation and optimal numerical implementation. A program listing is given for some of the key subroutines. The paper also touches upon specific issues such as the use of nonlinear measurements, in situ profiles of temperature and salinity, and data which are available with high frequency in time. An ensemble based optimal interpolation (EnOI) scheme is presented as a cost-effective approach which may serve as an alternative to the EnKF in some applications. A fairly extensive discussion is devoted to the use of time correlated model errors and the estimation of model bias.
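The analysis step of the EnKF can be illustrated in a few lines. The sketch below uses the perturbed-observations form with a linear observation operator; the ensemble size, state dimension, and noise statistics are illustrative assumptions, and this is not the program listing given in the paper.

```python
# Minimal EnKF analysis step (perturbed-observations form, linear observation operator H).
import numpy as np

def enkf_analysis(ensemble, y, H, R, rng):
    """Update each ensemble member with a Kalman gain built from ensemble statistics."""
    n_ens = ensemble.shape[1]
    X_mean = ensemble.mean(axis=1, keepdims=True)
    A = ensemble - X_mean                                  # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                              # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)           # Kalman gain
    # Perturbed observations: each member is updated against a noisy copy of y.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return ensemble + K @ (Y - H @ ensemble)

rng = np.random.default_rng(4)
n_state, n_obs, n_ens = 4, 2, 50
H = np.eye(n_obs, n_state)                                 # observe the first two states
R = 0.1 * np.eye(n_obs)
truth = np.array([1.0, -1.0, 0.5, 2.0])
ensemble = rng.normal(0.0, 1.0, (n_state, n_ens))          # prior ensemble
y = H @ truth + rng.multivariate_normal(np.zeros(n_obs), R)
ensemble = enkf_analysis(ensemble, y, H, R, rng)
print("analysis mean:", np.round(ensemble.mean(axis=1), 3))
```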

2,975 citations

Journal ArticleDOI
TL;DR: A bibliographical review on reconfigurable fault-tolerant control systems (FTCS) is presented, with emphasis on reconfigurable/restructurable controller design techniques.

2,455 citations