Author
Palaniappan Ramu
Other affiliations: University of Florida
Bio: Palaniappan Ramu is an academic researcher from the Indian Institute of Technology Madras. The author has contributed to research in the topics of Steganography and Reliability (statistics), has an h-index of 13, and has co-authored 41 publications receiving 847 citations. Previous affiliations of Palaniappan Ramu include the University of Florida.
Papers
TL;DR: Updates and builds on 'Modelling with Stakeholders' (Voinov and Bousquet, 2010), and suggests structured mechanisms to examine and account for human biases and beliefs in participatory modelling.
Abstract: This paper updates and builds on 'Modelling with Stakeholders' (Voinov and Bousquet, 2010), which demonstrated the importance of, and demand for, stakeholder participation in resource and environmental modelling. This position paper returns to the concepts of that publication and reviews the progress made since 2010. A new development is the wide introduction and acceptance of social media and web applications, which dramatically changes the context and scale of stakeholder interactions and participation. Technology advances make it easier to incorporate information in interactive formats via visualization and games to augment participatory experiences. Citizens as stakeholders are increasingly demanding to be engaged in planning decisions that affect them and their communities, at scales from local to global. How people interact with and access models and data is rapidly evolving. In turn, this requires changes in how models are built, packaged, and disseminated: citizens are less in awe of experts and external authorities, and they are increasingly aware of their own capabilities to provide inputs to planning processes, including models. The continued acceleration of environmental degradation and natural resource depletion accompanies these societal changes, even as there is a growing acceptance of the need to transition to alternative, possibly very different, lifestyles. Substantive transitions cannot occur without significant changes in human behaviour and perceptions. The important and diverse roles that models can play in guiding human behaviour, and in disseminating and increasing societal knowledge, are a feature of stakeholder processes today.
Highlights: Participatory modelling has become mainstream in resource and environmental management. We review recent contributions to participatory environmental modelling to identify the tools, methods and processes applied. Global internet connectivity, social media and crowdsourcing create opportunities for participatory modelling. We suggest structured mechanisms to examine and account for human biases and beliefs in participatory modelling. Advanced visualization tools, gaming, and virtual environments improve communication with stakeholders.
325 citations
TL;DR: In this article, a convex hull approach is adopted to isolate the points corresponding to unwanted bifurcations in the design space; the approach is applied to a tube impacting a rigid wall, a representative transient dynamic problem.
Abstract: Nonlinear problems such as transient dynamic problems exhibit structural responses that can be discontinuous due to numerous bifurcations. This hinders gradient-based or response surface-based optimization. This paper proposes a novel approach to split the design space into regions where the response is continuous. This makes traditional optimization viable. A convex hull approach is adopted to isolate the points corresponding to unwanted bifurcations in the design space. The proposed approach is applied to a tube impacting a rigid wall representing a transient dynamic problem. Since nonlinear behavior is highly sensitive to small variations in design, reliability-based design optimization is performed. The proposed method provides the designer an optimal design with a prescribed dynamic behavior.
61 citations
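The convex-hull idea above translates into a few lines of SciPy. The sketch below is illustrative only: the 2-D design space, the sampled points, and the `is_bifurcated` classifier are hypothetical stand-ins for the paper's transient dynamic simulations.

```python
# Minimal sketch: fence off the region of unwanted bifurcations with a convex
# hull, then restrict optimization to designs outside it (assumed setup).
import numpy as np
from scipy.spatial import ConvexHull, Delaunay

rng = np.random.default_rng(0)
designs = rng.uniform(0.0, 1.0, size=(200, 2))   # sampled design points

def is_bifurcated(x):
    # Placeholder classifier: in the paper this label would come from
    # inspecting the transient dynamic response of each sampled design.
    return x[0] + x[1] > 1.4

bad = np.array([x for x in designs if is_bifurcated(x)])

# Convex hull around the points with unwanted bifurcations; a Delaunay
# triangulation of its vertices gives a cheap point-in-hull test.
hull = ConvexHull(bad)
tri = Delaunay(bad[hull.vertices])

def feasible(x):
    """True if x lies outside the hull, i.e. in the region where the
    response is continuous and gradient-based optimization is safe."""
    return tri.find_simplex(x) < 0

candidates = [x for x in designs if feasible(x)]
print(f"{len(candidates)} of {len(designs)} designs kept for optimization")
```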
TL;DR: An approach that uses the discrete wavelet transform to decompose the ECG signal and singular value decomposition (SVD) to embed the secret information into it; the observations validate that HH is the ideal sub-band in which to hide data.
Abstract: ECG Steganography provides secured transmission of secret information, such as patient personal information, through ECG signals. This paper proposes an approach that uses the discrete wavelet transform to decompose signals and singular value decomposition (SVD) to embed the secret information into the decomposed ECG signal. The novelty of the proposed method is to embed the watermark using SVD into the two-dimensional (2D) ECG image. The embedding of secret information in a selected sub-band of the decomposed ECG is achieved by replacing the singular values of the decomposed cover image with the singular values of the secret data. The performance assessment of the proposed approach identifies the sub-band best suited to hiding secret data and the signal degradation that would affect diagnosability. Performance is measured using metrics such as Kullback-Leibler divergence (KL), percentage residual difference (PRD), peak signal to noise ratio (PSNR) and bit error rate (BER). A dynamic location selection approach for embedding the singular values is also discussed. The proposed approach is demonstrated on the MIT-BIH database, and the observations validate that HH is the ideal sub-band to hide data. It is also observed that the signal degradation is minimal (less than 0.6%) even when the secret data is as large as the sub-band size, so the approach does not affect diagnosability and is reliable for transmitting patient information.
58 citations
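A minimal sketch of the DWT-SVD embedding step described above, using PyWavelets and NumPy. The cover and secret arrays are random stand-ins; the paper works on a 2-D image formed from the ECG signal and on real patient data, and its exact embedding details may differ.

```python
# Sketch of singular-value replacement in the HH sub-band (assumed setup).
import numpy as np
import pywt

cover = np.random.rand(128, 128)   # stand-in for the 2-D ECG image
secret = np.random.rand(64, 64)    # stand-in for the patient-data block

# One-level 2-D DWT; the last element is the diagonal-detail (HH) sub-band,
# which the paper found to be the best place to hide data.
LL, (LH, HL, HH) = pywt.dwt2(cover, 'haar')

# Replace the singular values of HH with those of the secret data.
U, _, Vt = np.linalg.svd(HH, full_matrices=False)
s_secret = np.linalg.svd(secret, compute_uv=False)
HH_marked = U @ np.diag(s_secret) @ Vt

# Inverse DWT gives the stego ECG image to transmit.
stego = pywt.idwt2((LL, (LH, HL, HH_marked)), 'haar')
```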
TL;DR: The novelty of the proposed approach is the use of CACO in ECG steganography to identify Multiple Scaling Factors (MSFs) that provide a better tradeoff than a uniform Single Scaling Factor (SSF); the results validate that the tradeoff curve obtained through MSFs is better than the tradeoffs obtained for any SSF.
Highlights: ECG steganography is performed using a DWT-SVD and quantization watermarking scheme. The imperceptibility-robustness tradeoff is investigated. Continuous Ant Colony Optimization provides optimized Multiple Scaling Factors. MSFs are superior to an SSF in providing a better imperceptibility-robustness tradeoff.
Abstract: ECG Steganography ensures protection of patient data when ECG signals embedded with patient data are transmitted over the internet. Steganography algorithms strive to recover the embedded patient data entirely and to minimize the deterioration of the cover signal caused by the embedding. This paper presents a Continuous Ant Colony Optimization (CACO) based ECG Steganography scheme using the Discrete Wavelet Transform and Singular Value Decomposition. Quantization techniques allow embedding the patient data into the ECG signal. The scaling factor in the quantization technique governs the tradeoff between imperceptibility and robustness. The novelty of the proposed approach is the use of CACO in ECG Steganography to identify Multiple Scaling Factors (MSFs) that provide a better tradeoff than a uniform Single Scaling Factor (SSF). The optimal MSFs significantly improve the performance of ECG steganography, which is measured by metrics such as Peak Signal to Noise Ratio, Percentage Residual Difference, Kullback-Leibler distance and Bit Error Rate. Performance of the proposed approach is demonstrated on the MIT-BIH database, and the results validate that the tradeoff curve obtained through MSFs is better than the tradeoff curve obtained for any SSF. The results also suggest appropriate SSFs for target imperceptibility or robustness.
49 citations
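The core of the MSF search can be sketched as a simple continuous-ACO loop (ACO_R-style Gaussian sampling around an archive of good solutions). Everything here is a stand-in: `tradeoff_cost` replaces the paper's PSNR/PRD/BER-based objective, and the archive size, bounds, and iteration count are arbitrary.

```python
# Sketch of a continuous ant colony search for Multiple Scaling Factors.
import numpy as np

rng = np.random.default_rng(1)
n_factors = 4                       # one scaling factor per coefficient band

def tradeoff_cost(msf):
    # Placeholder objective: in the paper this would embed with `msf` and
    # combine imperceptibility (PSNR/PRD) and robustness (BER) into a score.
    return np.sum((msf - 0.3) ** 2)

# Solution archive of candidate MSF vectors with their costs.
archive = rng.uniform(0.01, 1.0, size=(10, n_factors))
costs = np.array([tradeoff_cost(a) for a in archive])

for _ in range(50):
    order = np.argsort(costs)       # best solutions first
    archive, costs = archive[order], costs[order]
    # Each "ant" samples around one of the best archive members, with a
    # step size set by the spread of the archive.
    guide = archive[rng.integers(0, 3)]
    sigma = archive.std(axis=0) + 1e-6
    ant = np.clip(guide + sigma * rng.standard_normal(n_factors), 0.01, 1.0)
    c = tradeoff_cost(ant)
    if c < costs[-1]:               # replace the worst archive member
        archive[-1], costs[-1] = ant, c

print("best MSFs:", archive[np.argmin(costs)])
```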
TL;DR: Curvelet transforms are used in ECG steganography to identify the coefficients that store the crucial diagnostic information; the results validate that coefficients around zero are ideal for watermarking, minimizing deterioration with no loss in the retrieved data.
Abstract: ECG steganography allows secured transmission of patient data that are tagged to the ECG signals. Signal deterioration leading to loss of diagnostic information and the inability to retrieve patient data fully are the major challenges in ECG steganography. In this work, an attempt has been made to use curvelet transforms, which permit identifying the coefficients that store the crucial information about diagnosis. The novelty of the proposed approach is the usage of the curvelet transform for ECG steganography, adaptive selection of the watermark location and a new threshold selection algorithm. It is observed that when coefficients around zero are modified to embed the watermark, the signal deterioration is the least. In order to avoid overlap of the watermark, an n × n sequence is used to embed the watermark. The imperceptibility of the watermark is measured using metrics such as Peak Signal to Noise Ratio, Percentage Residual Difference and Kullback-Leibler distance. The ability to extract the patient data is measured by the Bit Error Rate. Performance of the proposed approach is demonstrated on the MIT-BIH database, and the results validate that coefficients around zero are ideal for watermarking to minimize deterioration, with no loss in the data retrieved. For an increased patient data size, the cover signal deteriorates but the Bit Error Rate is zero. Therefore, the proposed approach does not affect diagnosability and allows reliable steganography.
48 citations
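The "embed around zero" idea is easy to sketch. Since curvelet transforms require a specialised library, the sketch below operates on a generic flat array of transform coefficients; the magnitude `t`, bit count, and sign-based embedding rule are illustrative assumptions, not the paper's exact algorithm.

```python
# Sketch: hide bits in the transform coefficients closest to zero, where
# modification deteriorates the reconstructed signal the least.
import numpy as np

rng = np.random.default_rng(2)
coeffs = rng.standard_normal(10_000)   # stand-in transform coefficients
bits = rng.integers(0, 2, size=500)    # watermark / patient-data bits
t = 0.05                               # assumed embedding magnitude

# Pick the coefficients with the smallest magnitude (closest to zero).
idx = np.argsort(np.abs(coeffs))[:bits.size]

# Embed: push the coefficient to +t for a 1 bit, -t for a 0 bit.
marked = coeffs.copy()
marked[idx] = np.where(bits == 1, t, -t)

# Extract: the sign of each marked coefficient recovers the bit exactly,
# consistent with the zero Bit Error Rate reported above.
recovered = (marked[idx] > 0).astype(int)
assert np.array_equal(recovered, bits)
```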
Cited by
TL;DR: In this article, the authors present a brief survey on some of the most relevant developments in the field of optimization under uncertainty, including reliability-based optimization, robust design optimization and model updating.
Abstract: This article presents a brief survey on some of the most relevant developments in the field of optimization under uncertainty. In particular, the scope and the relevance of the papers included in this Special Issue are analyzed. The importance of uncertainty quantification and optimization techniques for producing improved models and designs is thoroughly discussed. The focus of the discussion is on three specific research areas, namely reliability-based optimization, robust design optimization and model updating. The arguments presented indicate that optimization under uncertainty should become customary in engineering design in the foreseeable future. Computational aspects play a key role in analyzing and modeling realistic systems and structures.
442 citations
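To make "reliability-based optimization" concrete, here is a toy sketch: minimise a design cost while a Monte Carlo estimate of the failure probability stays below a target. The limit state, penalty weight, and target are invented for illustration and are not from the article.

```python
# Crude penalty-based RBDO sketch on a one-dimensional toy problem.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
noise = rng.normal(0.0, 0.1, size=10_000)   # fixed samples (common random numbers)
p_target = 0.01                             # allowed failure probability

def failure_prob(d):
    # Limit state g = d + noise - 1; failure when g < 0.
    return np.mean(d + noise - 1.0 < 0.0)

def objective(x):
    d = x[0]
    cost = d                                # a stronger design costs more
    penalty = 1e3 * max(failure_prob(d) - p_target, 0.0)
    return cost + penalty

res = minimize(objective, x0=[1.5], method='Nelder-Mead')
print("optimal design:", res.x[0], "P_f:", failure_prob(res.x[0]))
```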
TL;DR: A comprehensive review of Uncertainty-Based Multidisciplinary Design Optimization (UMDO) theory and the state of the art in UMDO methods for aerospace vehicles is presented.
Abstract: This paper presents a comprehensive review of Uncertainty-Based Multidisciplinary Design Optimization (UMDO) theory and the state of the art in UMDO methods for aerospace vehicles. UMDO has been widely acknowledged as an advanced methodology to address competing objectives of aerospace vehicle design, such as performance, cost, reliability and robustness. However, the major challenges of UMDO, namely the computational and organizational complexity caused by both time-consuming disciplinary analysis models and UMDO algorithms, still greatly hamper its application in aerospace engineering. In recent years there has been a surge of research in this field aimed at solving these problems. The purpose of this paper is to review these existing approaches systematically, highlight research challenges and opportunities, and help guide future efforts. Firstly, the UMDO theory preliminaries are introduced to clarify the basic UMDO concepts and mathematical formulations, as well as to provide a panoramic view of the general UMDO solving process. Then, following the UMDO solving process, the research progress of each key step is surveyed and discussed separately, specifically including uncertainty modeling, uncertainty propagation and analysis, optimization under uncertainty, and the UMDO procedure. Finally, some conclusions are given, and future research trends and prospects are discussed.
366 citations
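As a concrete instance of the uncertainty modeling and propagation steps in the UMDO process described above, the toy sketch below pushes random inputs through a trivial disciplinary model by Monte Carlo; the model and input distributions are invented for illustration.

```python
# Sketch of Monte Carlo uncertainty propagation through one disciplinary model.
import numpy as np

rng = np.random.default_rng(4)

def disciplinary_model(thickness, load):
    # Placeholder analysis: stress in a square bar, stress = load / area.
    return load / (thickness ** 2)

# Uncertainty modeling: assumed input distributions.
thickness = rng.normal(0.05, 0.002, size=100_000)                 # m
load = rng.lognormal(mean=np.log(1e4), sigma=0.1, size=100_000)   # N

# Propagation and analysis: output statistics and an exceedance probability.
stress = disciplinary_model(thickness, load)
print("mean stress [Pa]:", stress.mean())
print("P(stress > 5 MPa):", np.mean(stress > 5e6))
```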