Journal ArticleDOI

Reliability Hardening Mechanisms in Cyber-Physical Digital-Microfluidic Biochips

TL;DR: An algorithm that minimizes the number of checkpoints and determines their locations to cover every path in a given droplet-routing solution is proposed, which provides reliability-hardening mechanisms for a wide class of cyber-physical DMFBs.
Abstract: In the area of biomedical engineering, digital-microfluidic biochips (DMFBs) have received considerable attention because of their capability of providing an efficient and reliable platform for conducting point-of-care clinical diagnostics. System reliability, in turn, mandates error-recoverability while implementing biochemical assays on-chip for medical applications. Unfortunately, the technology of DMFBs is not yet fully equipped to handle error-recovery from various microfluidic operations involving droplet motion and reaction. Recently, a number of cyber-physical systems have been proposed to provide real-time checking and error-recovery in assays based on the feedback received from a few on-chip checkpoints. However, to synthesize robust feedback systems for different types of DMFBs, certain practical issues need to be considered such as co-optimization of checkpoint placement, error-recoverability, and layout of droplet-routing pathways. For application-specific DMFBs, we propose here an algorithm that minimizes the number of checkpoints and determines their locations to cover every path in a given droplet-routing solution. Next, for general-purpose DMFBs, where the checkpoints are pre-deployed in specific locations, we present a checkpoint-aware routing algorithm such that every droplet-routing path passes through at least one checkpoint to enable error-recovery and to ensure physical routability of all droplets. Furthermore, we also propose strategies for executing the algorithms in reliable mode to enhance error-recoverability. The proposed methods thus provide reliability-hardening mechanisms for a wide class of cyber-physical DMFBs.
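The first problem described above, minimizing the number of checkpoints so that every droplet-routing path passes through at least one of them, is an instance of a covering problem. The paper's own algorithm is based on bipartite matching and is not reproduced here; the following is only an illustrative greedy hitting-set sketch of the same covering idea, with a made-up example routing solution:

```python
# Greedy hitting-set heuristic: repeatedly pick the grid cell shared by
# the most still-uncovered routing paths until every path contains a
# checkpoint. Illustrative sketch only -- NOT the paper's matching-based
# checkpoint-assignment algorithm.

def place_checkpoints(paths):
    """paths: list of sets of grid cells, one set per droplet route.
    Returns a set of checkpoint cells hitting every path at least once."""
    uncovered = list(paths)
    checkpoints = set()
    while uncovered:
        # count how many uncovered paths each candidate cell would cover
        counts = {}
        for path in uncovered:
            for cell in path:
                counts[cell] = counts.get(cell, 0) + 1
        best = max(counts, key=counts.get)
        checkpoints.add(best)
        uncovered = [p for p in uncovered if best not in p]
    return checkpoints

# hypothetical droplet routes on a small grid (cells as (row, col))
routes = [
    {(0, 1), (0, 2), (1, 2)},
    {(1, 2), (2, 2), (2, 3)},
    {(3, 0), (3, 1)},
]
cps = place_checkpoints(routes)
assert all(route & cps for route in routes)  # every path is covered
```

Here the shared cell `(1, 2)` covers the first two routes at once, so two checkpoints suffice for three paths; the actual paper additionally co-optimizes checkpoint locations with the routing layout.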
Citations
Proceedings ArticleDOI
23 Apr 2019
TL;DR: A systematic algorithm is presented for the assignment of checkpoints required for error-recovery of available bioprotocols in case of hardware Trojans attacks in performing operations by biochip to enhance the security concerns of digital microfluidic biochips.
Abstract: Recent security studies of the manipulation of individual droplets of samples and reagents by digital microfluidic biochips have revealed that the biochip design flow is vulnerable to piracy attacks, hardware-Trojan attacks, overproduction, denial-of-service attacks, and counterfeiting. Attackers can mount bioprotocol-manipulation attacks against biochips used for medical diagnosis, biochemical analysis, and disease detection in the healthcare industry. Among these attacks, hardware Trojans pose a major security threat, offering multiple ways to leak sensitive data or alter original functionality by performing malicious operations on the biochip. In this paper, we present a systematic algorithm for assigning the checkpoints required for error-recovery of bioprotocols in the presence of hardware-Trojan attacks on biochip operations. The algorithm also guides the placement and timing of checkpoints so that the impact of an attack is reduced, thereby strengthening the security of digital microfluidic biochips. A comparative study with traditional checkpoint schemes demonstrates the superiority of the proposed algorithm, which incurs no overhead in bioprotocol completion time while achieving higher error-detection accuracy.

7 citations


Cites background from "Reliability Hardening Mechanisms in..."

  • ...[18], [19] minimize the number of checkpoints to some extent, but unfortunately their technique cannot always detect HT insertion early, because checkpoints are assigned only at common electrodes of the biochip....


Journal ArticleDOI
TL;DR: A new microelectrode cell (MC) design is presented such that the droplet-sensing operation can be enabled/disabled for individual MCs and a wear-leveling synthesis method is proposed to ensure uniform utilization of MCs on MEDA.
Abstract: A digital microfluidic biochip (DMFB) enables the miniaturization of immunoassays, point-of-care clinical diagnostics, DNA sequencing, and other laboratory procedures in biochemistry. A recent generation of biochips uses a micro-electrode-dot-array (MEDA) architecture, which provides fine-grained control of droplets and seamlessly integrates microelectronics and microfluidics using CMOS technology and a TSMC fabrication process. To ensure that bioassays are carried out on MEDA biochips efficiently, high-level synthesis algorithms have recently been proposed. However, as in the case of conventional DMFBs, microelectrodes are likely to fail when they are heavily utilized, and previous methods fail to consider reliability issues. In this article, we first present a new microelectrode cell (MC) design such that the droplet-sensing operation can be enabled/disabled for individual MCs. Next, “partial update” and “partial sensing” operations are presented based on an IEEE Std. 1687 IJTAG network design. Finally, a wear-leveling synthesis method is proposed to ensure uniform utilization of MCs on MEDA. A comprehensive set of simulation results demonstrates the effectiveness of the proposed hardware design and design automation methods.

4 citations

Journal ArticleDOI
TL;DR: In this article, the authors proposed an improved whale optimization algorithm (IWOA) that reduces the excessive use of individual electrodes and balances electrode reuse, thereby maximizing the lifetime of digital microfluidic biochips.

1 citation

References
Book
01 Jan 1979
TL;DR: The second edition of a quarterly column that provides a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and the author in their book “Computers and Intractability: A Guide to the Theory of NP-Completeness,” W. H. Freeman & Co., San Francisco, 1979.
Abstract: This is the second edition of a quarterly column the purpose of which is to provide a continuing update to the list of problems (NP-complete and harder) presented by M. R. Garey and myself in our book ‘‘Computers and Intractability: A Guide to the Theory of NP-Completeness,’’ W. H. Freeman & Co., San Francisco, 1979 (hereinafter referred to as ‘‘[G&J]’’; previous columns will be referred to by their dates). A background equivalent to that provided by [G&J] is assumed. Readers having results they would like mentioned (NP-hardness, PSPACE-hardness, polynomial-time-solvability, etc.), or open problems they would like publicized, should send them to David S. Johnson, Room 2C355, Bell Laboratories, Murray Hill, NJ 07974, including details, or at least sketches, of any new proofs (full papers are preferred). In the case of unpublished results, please state explicitly that you would like the results mentioned in the column. Comments and corrections are also welcome. For more details on the nature of the column and the form of desired submissions, see the December 1981 issue of this journal.

40,020 citations

Book
01 Jan 1990
TL;DR: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures and presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers.
Abstract: From the Publisher: The updated new edition of the classic Introduction to Algorithms is intended primarily for use in undergraduate or graduate courses in algorithms or data structures. Like the first edition, this text can also be used for self-study by technical professionals since it discusses engineering issues in algorithm design as well as the mathematical aspects. In its new edition, Introduction to Algorithms continues to provide a comprehensive introduction to the modern study of algorithms. The revision has been updated to reflect changes in the years since the book's original publication. New chapters on the role of algorithms in computing and on probabilistic analysis and randomized algorithms have been included. Sections throughout the book have been rewritten for increased clarity, and material has been added wherever a fuller explanation has seemed useful or new information warrants expanded coverage. As in the classic first edition, this new edition of Introduction to Algorithms presents a rich variety of algorithms and covers them in considerable depth while making their design and analysis accessible to all levels of readers. Further, the algorithms are presented in pseudocode to make the book easily accessible to students from all programming language backgrounds. Each chapter presents an algorithm, a design technique, an application area, or a related topic. The chapters are not dependent on one another, so the instructor can organize his or her use of the book in the way that best suits the course's needs. Additionally, the new edition offers a 25% increase over the first edition in the number of problems, giving the book 155 problems and over 900 exercises that reinforce the concepts the students are learning.

21,651 citations


"Reliability Hardening Mechanisms in..." refers methods in this paper

  • ...In this subsection, we present the details of the proposed checkpoint assignment algorithm based on a matching problem of a bipartite graph [6]....


  • ...Since the general clique-partitioning problem is known to be NP-hard [9], we use a heuristic based on the union-find algorithm [6], which takes O(N³) time, where N is the number of droplets on the chip....

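The union-find (disjoint-set) structure cited in the snippet can be sketched as follows. This is a generic path-compression implementation for merging compatible droplet groups, not the authors' code:

```python
# Minimal union-find (disjoint-set) with path compression, as used by
# clique-partitioning heuristics to merge compatible droplet groups.
# Generic textbook sketch; not the authors' implementation.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))  # each element starts in its own set

    def find(self, x):
        # follow parent pointers to the root, halving the path as we go
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, a, b):
        # merge the sets containing a and b
        ra, rb = self.find(a), self.find(b)
        if ra != rb:
            self.parent[rb] = ra

# group four hypothetical droplets: 0 is compatible with 1, 2 with 3
uf = UnionFind(4)
uf.union(0, 1)
uf.union(2, 3)
assert uf.find(0) == uf.find(1)   # same group
assert uf.find(0) != uf.find(2)   # different groups
```

Each resulting set of mutually merged droplets corresponds to one cluster produced by the clique-partitioning heuristic.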

Journal ArticleDOI
TL;DR: How heuristic information from the problem domain can be incorporated into a formal mathematical theory of graph searching is described and an optimality property of a class of search strategies is demonstrated.
Abstract: Although the problem of determining the minimum cost path through a graph arises naturally in a number of interesting applications, there has been no underlying theory to guide the development of efficient search procedures. Moreover, there is no adequate conceptual framework within which the various ad hoc search strategies proposed to date can be compared. This paper describes how heuristic information from the problem domain can be incorporated into a formal mathematical theory of graph searching and demonstrates an optimality property of a class of search strategies.

10,366 citations


"Reliability Hardening Mechanisms in..." refers methods in this paper

  • ...The required shortest routing paths stated above are obtained by invoking the A∗ search algorithm [11]....

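For reference, A* on a 4-connected grid with the Manhattan-distance heuristic is the standard way such shortest routing paths are computed. The following is a generic sketch on a hypothetical grid, not the paper's implementation:

```python
# Generic A* shortest-path search on a 4-connected grid with an
# admissible Manhattan-distance heuristic -- illustrative of how a
# shortest droplet route could be computed. Not the paper's code.
import heapq

def astar(start, goal, blocked, width, height):
    def h(c):  # Manhattan distance: never overestimates on a 4-grid
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])

    frontier = [(h(start), 0, start)]  # (f = g + h, g, cell)
    best_g = {start: 0}
    while frontier:
        f, g, cur = heapq.heappop(frontier)
        if cur == goal:
            return g  # length of the shortest path
        x, y = cur
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            nx, ny = nxt
            if 0 <= nx < width and 0 <= ny < height and nxt not in blocked:
                ng = g + 1
                if ng < best_g.get(nxt, float("inf")):
                    best_g[nxt] = ng
                    heapq.heappush(frontier, (ng + h(nxt), ng, nxt))
    return None  # goal unreachable

# 5x5 grid with a wall occupying column 2, rows 0..3 (gap at row 4):
wall = {(2, r) for r in range(4)}
assert astar((0, 0), (4, 0), wall, 5, 5) == 12  # detour through (2, 4)
```

The heuristic guides the search toward the target while guaranteeing optimality, which is why A* is a common choice for droplet routing on DMFB grids.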

Book
31 Jul 2009
TL;DR: Pseudo-code explanation of the algorithms coupled with proof of their accuracy makes this book a great resource on the basic tools used to analyze the performance of algorithms.
Abstract: If you had to buy just one text on algorithms, Introduction to Algorithms is a magnificent choice. The book begins by considering the mathematical foundations of the analysis of algorithms and maintains this mathematical rigor throughout the work. The tools developed in these opening sections are then applied to sorting, data structures, graphs, and a variety of selected algorithms including computational geometry, string algorithms, parallel models of computation, fast Fourier transforms (FFTs), and more. This book's strength lies in its encyclopedic range, clear exposition, and powerful analysis. Pseudo-code explanation of the algorithms coupled with proof of their accuracy makes this book a great resource on the basic tools used to analyze the performance of algorithms.

2,972 citations