
Showing papers by "Stevens Institute of Technology" published in 2002


Journal ArticleDOI
TL;DR: In this article, the prospect of small fuel cells replacing batteries in portable equipment is considered in terms of their prospective energy density, technological feasibility, safety, and cost; fuel cells appear best suited to applications that require significantly more energy storage than today's portable devices provide.

655 citations


Journal ArticleDOI
TL;DR: In this article, Fourier transform infrared spectroscopy in attenuated total reflection (FTIR-ATR) was used to estimate the fraction of hydrogen-bonded carboxylic groups.
Abstract: Robust multilayers can be formed on solid surfaces, and subsequently destroyed by changing the environmental conditions, by the layer-by-layer sequential assembly of monomolecular films of a polyacid and polybase from aqueous solution. Interlayer hydrogen bonding produces stable multilayers up to the point where altered pH or other environmental stimulus introduces an unacceptably large electrical charge within them. This is demonstrated for the polyacids poly(acrylic acid), PAA, and poly(methacrylic acid), PMAA, and for the polybases poly(vinylpyrrolidone), PVPON, and poly(ethylene oxide), PEO, in D2O. The adsorption was quantified by Fourier transform infrared spectroscopy in attenuated total reflection (FTIR-ATR). The ratio between suppressed ionization of the carboxylic groups within the film and their ionization in solution, as directly measured by FTIR-ATR, was used to estimate the fraction of hydrogen-bonded carboxylic groups; this was ∼0.5 in PVPON/PMAA but only ∼0.1 in the PEO/PMAA system, though...

517 citations


Journal ArticleDOI
TL;DR: The results suggested the combined effects of phosphate, silicate, and bicarbonate caused the high mobility of arsenic in Bangladesh water.

374 citations



Posted Content
TL;DR: Assessing the variants of managerial variables and their impact on project success for various types of projects also serves as a step toward the establishment of a typological theory of projects.
Abstract: Although the causes of project success and failure have been the subject of many studies, no conclusive evidence or common agreement has been achieved so far. One criticism involves the universalistic approach often used in project management studies, according to which all projects are assumed to be similar. A second problem is the issue of subjectiveness and sometimes weakly defined success measures; yet another concern is the limited number of managerial variables examined by previous research. In the present study we use a project-specific typological approach, multidimensional criteria for assessing project success, and a multivariate statistical analysis method. According to our typology, projects were classified by their technological uncertainty at project initiation and their system scope, which is their location on a hierarchical ladder of systems and subsystems. For each of the 127 projects in our study, all executed in Israel, we recorded 360 managerial variables and 13 success measures. The use of very detailed data and multivariate methods such as canonical correlation and eigenvector analysis enables us to account for all the interactions between managerial and success variables and to address a number of perspectives often left unanalyzed by previous research. Assessing the variants of managerial variables and their impact on project success for various types of projects also serves as a step toward the establishment of a typological theory of projects. Although some success factors are common to all projects, our study identified project-specific lists of factors, indicating, for example, that high-uncertainty projects must be managed differently than low-uncertainty projects, and high-scope projects differently than low-scope projects.

323 citations


Posted Content
TL;DR: In this article, the authors present the results of an empirical study devoted to this question and examine the extent of usage of some risk management practices, such as risk identification, probabilistic risk analysis, planning for uncertainty and trade-off analysis, the difference in application across different types of projects and their impact on various project success dimensions.
Abstract: In times of increased competition and globalization, project success becomes even more critical to business performance, and yet many projects still suffer delays, overruns, and even failure. Ironically, however, risk management tools and techniques, which have been developed to improve project success, are used too little, and many still wonder how helpful they are. In this paper we present the results of an empirical study devoted to this question. Based on data collected on over 100 projects performed in Israel in a variety of industries, we examine the extent of usage of some risk management practices, such as risk identification, probabilistic risk analysis, planning for uncertainty, and trade-off analysis; the difference in application across different types of projects; and their impact on various project success dimensions. Our findings suggest that risk management practices are still not widely used. Only a limited number of projects in our study used any kind of risk management practice, and many used only some, but not all, of the available tools. When used, risk management practices seem to work and appear to be related to project success. We also found that risk management practices were more applicable to higher-risk projects. The impact of risk management is mainly on better meeting time and budget goals and less on product performance and specification. We also found some differences according to levels of technological uncertainty. Our conclusion is that risk management is still in its infancy and that, at this time, greater attention to application, training, tool development, and research on risk management is needed.

318 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide a comprehensive review of recent developments that will likely enable important advances in areas such as optical communications, ultra-high resolution spectroscopy and applications to ultrahigh sensitivity gas-sensing systems.
Abstract: Following an introduction to the history of the invention of the quantum cascade (QC) laser and of the band-structure engineering advances that have led to laser action over most of the mid-infrared (IR) and part of the far-IR spectrum, the paper provides a comprehensive review of recent developments that will likely enable important advances in areas such as optical communications, ultrahigh-resolution spectroscopy, and applications to ultrahigh-sensitivity gas-sensing systems. We discuss the experimental observation of the remarkably different frequency response of QC lasers compared to diode lasers, i.e., the absence of relaxation oscillations, their high-speed digital modulation, and results on mid-IR optical wireless communication links, which demonstrate the possibility of reliably transmitting complex multimedia data streams. Ultrashort pulse generation by gain switching and active and passive modelocking is subsequently discussed. Recent data on the linewidth of free-running QC lasers (∼150 kHz) and their frequency stabilization down to 10 kHz are presented. Experiments on the relative frequency stability (∼5 Hz) of two QC lasers locked to optical cavities are discussed. Finally, developments in metallic waveguides with surface plasmon modes, which have enabled extension of the operating wavelength to the far IR, are reported.

293 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present the results of an empirical study devoted to this question and examine the extent of usage of some risk management practices, such as risk identification, probabilistic risk analysis, planning for uncertainty and trade-off analysis, the difference in application across different types of projects and their impact on various project success dimensions.
Abstract: In times of increased competition and globalization, project success becomes even more critical to business performance, and yet many projects still suffer delays, overruns, and even failure. Ironically, however, risk management tools and techniques, which have been developed to improve project success, are used too little, and many still wonder how helpful they are. In this paper we present the results of an empirical study devoted to this question. Based on data collected on over 100 projects performed in Israel in a variety of industries, we examine the extent of usage of some risk management practices, such as risk identification, probabilistic risk analysis, planning for uncertainty, and trade-off analysis; the difference in application across different types of projects; and their impact on various project success dimensions. Our findings suggest that risk management practices are still not widely used. Only a limited number of projects in our study used any kind of risk management practice, and many used only some, but not all, of the available tools. When used, risk management practices seem to work and appear to be related to project success. We also found that risk management practices were more applicable to higher-risk projects. The impact of risk management is mainly on better meeting time and budget goals and less on product performance and specification. We also found some differences according to levels of technological uncertainty. Our conclusion is that risk management is still in its infancy and that, at this time, greater attention to application, training, tool development, and research on risk management is needed.

270 citations


Journal ArticleDOI
TL;DR: In this paper, the authors use a project-specific typological approach, multidimensional criteria for assessing project success, and a multivariate statistical analysis method to assess project success.
Abstract: Although the causes of project success and failure have been the subject of many studies, no conclusive evidence or common agreement has been achieved so far. One criticism involves the universalistic approach often used in project management studies, according to which all projects are assumed to be similar. A second problem is the issue of subjectiveness and sometimes weakly defined success measures; yet another concern is the limited number of managerial variables examined by previous research. In the present study we use a project-specific typological approach, multidimensional criteria for assessing project success, and a multivariate statistical analysis method. According to our typology, projects were classified by their technological uncertainty at project initiation and their system scope, which is their location on a hierarchical ladder of systems and subsystems. For each of the 127 projects in our study, all executed in Israel, we recorded 360 managerial variables and 13 success measures. The use of very detailed data and multivariate methods such as canonical correlation and eigenvector analysis enables us to account for all the interactions between managerial and success variables and to address a number of perspectives often left unanalyzed by previous research. Assessing the variants of managerial variables and their impact on project success for various types of projects also serves as a step toward the establishment of a typological theory of projects. Although some success factors are common to all projects, our study identified project-specific lists of factors, indicating, for example, that high-uncertainty projects must be managed differently than low-uncertainty projects, and high-scope projects differently than low-scope projects.

263 citations


Book ChapterDOI
25 Mar 2002
TL;DR: This paper studies the problem of approximate XML query matching based on tree pattern relaxations, devises efficient algorithms to evaluate relaxed tree patterns, and designs data pruning algorithms in which intermediate query results are filtered dynamically during the evaluation process.
Abstract: Tree patterns are fundamental to querying tree-structured data like XML. Because of the heterogeneity of XML data, it is often more appropriate to permit approximate query matching and return ranked answers, in the spirit of Information Retrieval, than to return only exact answers. In this paper, we study the problem of approximate XML query matching, based on tree pattern relaxations, and devise efficient algorithms to evaluate relaxed tree patterns. We consider weighted tree patterns, where exact and relaxed weights, associated with nodes and edges of the tree pattern, are used to compute the scores of query answers. We are interested in the problem of finding answers whose scores are at least as large as a given threshold. We design data pruning algorithms where intermediate query results are filtered dynamically during the evaluation process. We develop an optimization that exploits scores of intermediate results to improve query evaluation efficiency. Finally, we show experimentally that our techniques outperform rewriting-based and post-pruning strategies.

208 citations


Journal ArticleDOI
TL;DR: In this article, the role of deviation measures and risk measures in optimization is analyzed, and the possible influence of "acceptably free lunches" is brought out, and optimality conditions based on concepts of convex analysis are derived in support of a variety of potential applications, such as portfolio optimization and variants of linear regression in statistics.
Abstract: General deviation measures, which include standard deviation as a special case but need not be symmetric with respect to ups and downs, are defined and shown to correspond to risk measures in the sense of Artzner, Delbaen, Eber and Heath when those are applied to the dierence between a random variable and its expectation, instead of to the random variable itself. A property called expectation-boundedness of the risk measure is uncovered as essential for this correspondence. It is shown to be satisfied by conditional value-at-risk and by worst-case risk, as well as various mixtures, although not by ordinary value-at-risk. Interpretations are developed in which inequalities that are “acceptably sure”, relative to a designated acceptance set, replace inequalities that are “almost sure” in the usual sense being violated only with probability zero. Acceptably sure inequalities fix the standard for a particular choice of a deviation measure. This is explored in examples that rely on duality with an associated risk envelope, comprised of alternative probability densities. The role of deviation measures and risk measures in optimization is analyzed, and the possible influence of “acceptably free lunches” is thereby brought out. Optimality conditions based on concepts of convex analysis, but relying on the special features of risk envelopes, are derived in support of a variety of potential applications, such as portfolio optimization and variants of linear regression in statistics.
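The correspondence between deviation measures and risk measures can be made concrete with a small numerical sketch. The example below is illustrative only (the sample data and function names are not from the paper): it applies conditional value-at-risk to the shortfall of a random variable below its mean, yielding a deviation measure that, unlike standard deviation, weights only downside outcomes.

```python
import random
import statistics

def cvar(losses, alpha=0.95):
    """Conditional value-at-risk: the mean of the worst (1 - alpha) fraction of losses."""
    k = max(1, int(round(len(losses) * (1 - alpha))))
    tail = sorted(losses, reverse=True)[:k]
    return sum(tail) / len(tail)

def cvar_deviation(samples, alpha=0.95):
    """Deviation measure obtained by applying CVaR to the shortfall X below E[X]."""
    mean = statistics.fmean(samples)
    return cvar([mean - x for x in samples], alpha)

# Hypothetical return sample for illustration.
random.seed(0)
returns = [random.gauss(0.05, 0.2) for _ in range(10_000)]

sd = statistics.pstdev(returns)
dv = cvar_deviation(returns, alpha=0.95)

# The CVaR-based deviation exceeds the standard deviation here because it
# averages only the worst 5% of shortfalls rather than all fluctuations.
print(f"std dev ~ {sd:.3f}, CVaR(0.95) deviation ~ {dv:.3f}")
```

For a Gaussian sample this asymmetric deviation is roughly twice the standard deviation at the 95% level, which illustrates why the two measures can rank portfolios differently.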

Proceedings ArticleDOI
06 Nov 2002
TL;DR: Project-based learning (PBL) is an instructional approach that is gaining increasing interest within the engineering education community; it facilitates the development of many of the "soft skills" demanded of engineering graduates, as embodied in ABET EC 2000.
Abstract: Project-based learning (PBL) is an instructional approach that is gaining increasing interest within the engineering education community. The benefits of PBL include enhanced student participation in the learning process (active learning and self-learning), enhanced communication skills, addressing of a wider set of learning styles, and promotion of critical and proactive thinking. PBL also facilitates the development of many of the "soft skills" demanded from engineering graduates, as embodied in the ABET EC 2000. Examples include effective teaming skills, project management, communications, ethics, engineering economics, etc. At Stevens Institute of Technology the undergraduate engineering curriculum has undergone significant revisions to reflect the latest trend towards enhancement of traditional lecture-based courses with both a design spine and a laboratory experience propagating through the entire educational program. Project-based learning is also being integrated throughout the curriculum. An initial implementation of PBL and its preliminary assessment in a freshman-level course on Mechanics of Solids and a junior-level course on Mechanisms and Machine Dynamics is presented.

Posted Content
TL;DR: In this paper, the optimal team member personal style for new product development success is discussed; the five-factor model of personality is used as a framework for understanding how personality relates to different kinds of behavior and for reviewing literature related to team performance.
Abstract: The paper centers on the optimal team member personal style for new product development success. Although the role of personality in team performance is not well understood, research suggests that personality plays a critical role in the effective performance of teams. Personality variables should be especially important for new product development (NPD) teams which typically include highly coordinated activities among multidisciplinary members. The five-factor model provides a consistent structure for understanding how personality relates to different kinds of behavior and is used as a framework for reviewing literature related to team performance. The same model is then used to form a set of research propositions that can serve to guide future research on the role of personality in NPD teams. Because the literature suggests that the role of personality is dependent upon the type of task involved, we differentiate our research propositions for two specific types of new product development: incremental innovation and radical innovation. We offer research propositions for the average level of each of the five-factor model variables and performance in the two types of teams. Finally, we suggest a set of research propositions for the effect of heterogeneity of personality on performance in radical and incremental innovation teams.

Journal ArticleDOI
TL;DR: Microwave-assisted rapid organic reactions constitute an emerging technology that could make industrially important organic syntheses more eco-friendly than conventional reactions as mentioned in this paper, and are potentially valuable as they reduce the need for organic solvents and also increase the atom economy by improving product selectivity and chemical yield.
Abstract: Microwave-assisted rapid organic reactions constitute an emerging technology that could make industrially important organic syntheses more eco-friendly than conventional reactions. In our laboratory Microwave-Induced Organic Reaction Enhancement (MORE) chemistry techniques have been developed that are safe since all reactions are conducted in open systems to avoid any chance of explosions that have been observed in sealed systems. MORE chemistry can be conducted without an added solvent if one or more of the reactants is a liquid that absorbs microwaves efficiently. When it is necessary to add a dipolar solvent for transferring microwave energy to the reactants, it is adequate to add just enough solvent to form a slurry at room temperature. The growing concern about the effect of organic solvents and chemical wastes on the environment is attracting attention to non-traditional synthetic approaches that might 'reduce pollution at the source'. In this context MORE chemistry techniques are potentially valuable as they reduce the need for organic solvents and also increase 'atom economy' by improving product selectivity and chemical yield.

Journal ArticleDOI
TL;DR: To resolve the scalar ambiguity intrinsic to all blind schemes, a semi-blind implementation of the Capon receiver is proposed, which capitalizes on periodically inserted pilots and the interference suppression ability of theCapon filters, for (slowly) time-varying channels.
Abstract: We present in this paper a linear blind multiuser receiver, referred to as the Capon receiver, for code-division multiple-access (CDMA) systems utilizing multiple transmit antennas and space-time (ST) block coding. The Capon receiver is designed by exploiting signal structures imposed by both spreading and ST coding. We highlight the unique ST coding induced structure, which is shown to be critical in establishing several analytical results, including self-interference (i.e., spatially mixed signals of the same user) cancellation, receiver output signal-to-interference-and-noise ratio (SINR), and blind channel estimation of the Capon receiver. To resolve the scalar ambiguity intrinsic to all blind schemes, we propose a semi-blind implementation of the Capon receiver, which capitalizes on periodically inserted pilots and the interference suppression ability of the Capon filters, for (slowly) time-varying channels. Numerical examples are presented to compare the Capon receiver with several other training-assisted and (semi-)blind receivers and to illustrate the performance gain of ST-coded CDMA systems over those without ST coding.

Journal ArticleDOI
12 Dec 2002
TL;DR: A fast algorithm, CDM, is presented that identifies and eliminates local redundancies due to ICs, based on propagating "information labels" up the tree pattern; it can be applied prior to ACIM to improve minimization efficiency.
Abstract: Tree patterns form a natural basis to query tree-structured data such as XML and LDAP. To improve the efficiency of tree pattern matching, it is essential to quickly identify and eliminate redundant nodes in the pattern. In this paper, we study tree pattern minimization both in the absence and in the presence of integrity constraints (ICs) on the underlying tree-structured database. In the absence of ICs, we develop a polynomial-time query minimization algorithm called CIM, whose efficiency stems from two key properties: (i) a node cannot be redundant unless its children are; and (ii) the order of elimination of redundant nodes is immaterial. When ICs are considered for minimization, we develop a technique for query minimization based on three fundamental operations: augmentation (an adaptation of the well-known chase procedure), minimization (based on homomorphism techniques), and reduction. We show the surprising result that the algorithm, referred to as ACIM, obtained by first augmenting the tree pattern using ICs, and then applying CIM, always finds the unique minimal equivalent query. While ACIM is polynomial time, it can be expensive in practice because of its inherent non-locality. We then present a fast algorithm, CDM, that identifies and eliminates local redundancies due to ICs, based on propagating "information labels" up the tree pattern. CDM can be applied prior to ACIM for improving the minimization efficiency. We complement our analytical results with an experimental study that shows the effectiveness of our tree pattern minimization techniques.

Journal ArticleDOI
TL;DR: In this paper, the authors measured the center-of-mass diffusion coefficient of uncharged flexible linear chains adsorbed at the solid-liquid interface at dilute surface coverage.
Abstract: We report direct measurement of the center-of-mass diffusion coefficient, D, of uncharged flexible linear chains adsorbed at the solid−liquid interface at dilute surface coverage. We find D ∼ N-3/2 (N is degree of polymerization) when N was varied by more than an order of magnitude (N = 48, 113, 244, 456, and 693) and the scatter of the data was low. The experimental system was poly(ethylene glycol), PEG, adsorbed from dilute aqueous solution onto a self-assembled hydrophobic monolayer, condensed octadecyltriethoxysilane. The method of measurement was fluorescence correlation spectroscopy of a rhodamine green derivative dye that was end-attached to one sole end of the adsorbed PEG chains. The observed scaling implies the diffusion time τ ∼ N3 if Rg ∼ N3/4 as expected for a chain in good solvent in two dimensions (Rg is the radius of gyration), but a variety of other theoretical approaches to describe the dynamical scaling are also plausible. The multiplicity of plausible dynamical transport scenarios is c...
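The reported D ∼ N^(-3/2) scaling can be recovered from a log-log fit of D against N. The sketch below is purely illustrative: it generates synthetic diffusion coefficients with the paper's exponent (the measured D values are not reproduced here, and the prefactor is arbitrary) and fits the slope by least squares.

```python
import math

# Degrees of polymerization studied in the paper.
N_values = [48, 113, 244, 456, 693]

# Synthetic (hypothetical) diffusion coefficients obeying D = c * N**(-1.5);
# real data would carry experimental scatter around this law.
c = 1.0e-7  # arbitrary prefactor
D_values = [c * n ** -1.5 for n in N_values]

# Least-squares slope of log D versus log N recovers the scaling exponent.
xs = [math.log(n) for n in N_values]
ys = [math.log(d) for d in D_values]
x_mean = sum(xs) / len(xs)
y_mean = sum(ys) / len(ys)
slope = (sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys))
         / sum((x - x_mean) ** 2 for x in xs))
print(f"fitted exponent ~ {slope:.2f}")
```

With experimental data the same fit would report the exponent with its scatter, which is how a D ∼ N^(-3/2) claim over the quoted order-of-magnitude range of N is tested.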

Journal ArticleDOI
TL;DR: This article explored the antecedent factors that impact new product development team stability as well as its consequences and found that the most direct antecedents of team stability are goal stability and goal support.

Journal ArticleDOI
TL;DR: In this paper, the authors use the five-factor model as a framework for understanding how personality relates to different kinds of behavior, review literature related to team performance, and suggest a set of research propositions for the effect of heterogeneity of personality on performance in radical and incremental innovation teams.

Proceedings ArticleDOI
01 Jan 2002
TL;DR: Denotational semantics for a Java-like language with pointers, subclassing and dynamic dispatch, class oriented visibility control, recursive types and methods, and privilege-based access control are given in this article.
Abstract: Denotational semantics is given for a Java-like language with pointers, subclassing and dynamic dispatch, class oriented visibility control, recursive types and methods, and privilege-based access control. Representation independence (relational parametricity) is proved, using a semantic notion of confinement similar to ones for which static disciplines have been recently proposed.

Book ChapterDOI
20 Aug 2002
TL;DR: This paper devises efficient algorithms that optimally determine when the recursive check can be eliminated, and when it can be simplified to just a local check on the element's attributes, without violating the access control policy.
Abstract: The rapid emergence of XML as a standard for data exchange over the Web has led to considerable interest in the problem of securing XML documents. In this context, query evaluation engines need to ensure that user queries only use and return XML data the user is allowed to access. These added access control checks can considerably increase query evaluation time. In this paper, we consider the problem of optimizing the secure evaluation of XML twig queries. We focus on the simple, but useful, multi-level access control model, where a security level can be either specified at an XML element, or inherited from its parent. For this model, secure query evaluation is possible by rewriting the query to use a recursive function that computes an element's security level. Based on security information in the DTD, we devise efficient algorithms that optimally determine when the recursive check can be eliminated, and when it can be simplified to just a local check on the element's attributes, without violating the access control policy. Finally, we experimentally evaluate the performance benefits of our techniques using a variety of XML data and queries.

Journal ArticleDOI
06 Dec 2002-Science
TL;DR: "Lost Discoveries: The Ancient Roots of Modern Science, from the Babylonians to the Maya" is a popular account of the contributions of non-Western societies to modern science.
Abstract: Lost Discoveries: The Ancient Roots of Modern Science, from the Babylonians to the Maya. Dick Teresi. Simon and Schuster, New York, 2002. 463 pp. $27, C$41. ISBN 0-684-83718-8. Writing for a popular audience, Teresi offers an account of the contributions of non-Western societies to modern science.

Journal ArticleDOI
TL;DR: In this article, experimental results for an optical free-space high-speed link using directly modulated mid-infrared quantum cascade (QC) lasers are presented.
Abstract: Experimental results for an optical free-space high-speed link using directly modulated mid-infrared (λ = 8.1 μm) quantum cascade lasers are presented. A total of 800 digitally encoded multimedia channels were transmitted. The reliability of the system against weather influence (fog) was experimentally compared to that of a near-infrared (λ = 0.85 μm) link.

Journal ArticleDOI
TL;DR: The proposed approach overcomes the difficulty of orthogonal identification codes required by the original protocol, thereby improving channel utilization and system capacity, while being insensitive to multipath effects and synchronization errors.
Abstract: This paper deals with the multiuser medium access problem in the packet radio environment. Under the framework of network diversity multiple access (NDMA), a previously proposed medium access method, a blind collision resolution scheme is proposed employing rotational invariance and factor analysis techniques. The proposed approach (dubbed B-NDMA for blind NDMA) overcomes the difficulty of orthogonal identification codes required by the original protocol, thereby improving channel utilization and system capacity, while being insensitive to multipath effects and synchronization errors. Performance issues of the proposed technique are addressed both analytically and numerically.

Posted Content
TL;DR: In this article, the authors formalize representation independence for classes, in an imperative, object-oriented language with pointers, subclassing and dynamic dispatch, class oriented visibility control, recursive types and methods, and a simple form of module.
Abstract: Dedicated to the memory of Edsger W. Dijkstra. Representation independence or relational parametricity formally characterizes the encapsulation provided by language constructs for data abstraction and justifies reasoning by simulation. Representation independence has been shown for a variety of languages and constructs but not for shared references to mutable state; indeed it fails in general for such languages. This paper formulates representation independence for classes, in an imperative, object-oriented language with pointers, subclassing and dynamic dispatch, class oriented visibility control, recursive types and methods, and a simple form of module. An instance of a class is considered to implement an abstraction using private fields and so-called representation objects. Encapsulation of representation objects is expressed by a restriction, called confinement, on aliasing. Representation independence is proved for programs satisfying the confinement condition. A static analysis is given for confinement that accepts common designs such as the observer and factory patterns. The formalization takes into account not only the usual interface between a client and a class that provides an abstraction but also the interface (often called "protected") between the class and its subclasses.

Journal ArticleDOI
TL;DR: Because the seismo-acoustic technique intrinsically detects buried containers, it can discriminate mines from noncompliant false targets such as rocks, tree roots, chunks of metal, and bricks.
Abstract: A novel technique for detection and discrimination of artificial objects, such as land mines, pipes, containers, etc., buried in the ground, has been developed and tested. The developed approach utilizes vibration (using seismic or airborne acoustic waves) of buried objects, remote measurements of soil surface vibration (using laser or microwave vibrometers), and processing of the measured vibration to extract mine’s “vibration signatures.” The technique does not depend upon the material from which the mine is fabricated whether it be metal, plastic, wood, or any other material. It depends upon the fact that a mine is a “container” whose purpose is to contain explosive materials and associated detonation apparatus. The mine container is in contact with the soil in which it is buried. The container is an acoustically compliant article, whose compliance is notably different from the compliance of the surrounding soil. Dynamic interaction of the compliant container and soil on top of it leads to specific linear and nonlinear effects used for mine detection and discrimination. The mass of the soil on top of a compliant container creates a classical mass–spring system with a well-defined resonance response. Besides, the connection between mass (soil) and spring (mine) is not elastic (linear) but rather nonlinear, due to the separation of the soil/mine interface in the tensile phase of applied dynamic stress. These two effects, constituting the mine’s vibration signature have been measured in numerous laboratory and field tests, which proved that the resonance and nonlinear responses of a mine/soil system can be used for detection and discrimination of buried mines. Thus, the fact that the mine is buried is turned into a detection advantage. Because the seismo-acoustic technique intrinsically detects buried containers, it can discriminate mines from noncompliant false targets such as rocks, tree roots, chunks of metal, bricks, etc. 
This was also confirmed experimentally in laboratory and field tests.
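The mass–spring picture above can be made concrete: a soil plug of mass m resting on a compliant mine lid of effective stiffness k resonates at f₀ = (1/2π)·√(k/m). A minimal sketch, where every numerical value (lid diameter, burial depth, soil density, lid stiffness) is an illustrative assumption rather than data from the paper:

```python
import math

def soil_mass(density, area, depth):
    """Mass of the soil column resting on the buried container (kg)."""
    return density * area * depth

def resonance_frequency(stiffness, mass):
    """Linear resonance of the soil/mine mass-spring system (Hz)."""
    return math.sqrt(stiffness / mass) / (2.0 * math.pi)

# Assumed values: a 0.1 m-diameter compliant lid buried 5 cm deep
# in soil of density 1600 kg/m^3, with lid stiffness 2e5 N/m.
area = math.pi * 0.05 ** 2            # lid area, m^2
m = soil_mass(1600.0, area, 0.05)     # overburden mass, ~0.63 kg
f0 = resonance_frequency(2.0e5, m)

print(f"soil mass = {m:.2f} kg")
print(f"resonance = {f0:.0f} Hz")     # lands in the ~100 Hz range
```

With these assumed numbers the resonance falls near 90 Hz, consistent with the low-frequency, well-defined peak the abstract describes; a rigid false target (rock, brick) presents a much higher effective stiffness and therefore no comparable resonance in this band.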

Patent
04 Nov 2002
TL;DR: In this article, a non-thermal plasma discharge device is disposed upstream of a suspension medium (e.g., a filter, electrostatic precipitator, or carbon bed).
Abstract: A sterilization and decontamination system in which a non-thermal plasma discharge device is disposed upstream of a suspension medium (e.g., a filter, electrostatic precipitator, or carbon bed). The plasma discharge device generates a plasma that is emitted through apertures (e.g., capillaries or slits) in the primary dielectric. Plasma-generated active sterilizing species, when exposed to contaminants or undesirable particulate matter, deactivate or reduce such matter in the contaminated fluid stream and/or on objects. Thus, the undesirable contaminants in the fluid to be treated are first reduced during their exposure to the plasma-generated active sterilizing species in the plasma region of the discharge device. Furthermore, the plasma-generated active sterilizing species are carried downstream to the suspension media and, upon contact therewith, deactivate the contaminants collected on the suspension media itself. Advantageously, the suspension media may be cleansed in situ. To increase the sterilization efficiency, an additive, free, or carrier gas (e.g., alcohol, water, dry air) may be injected into the apertures defined in the primary dielectric. These additives increase the concentration of plasma-generated active sterilizing agents while reducing the generation of undesirable ozone byproducts. Downstream of the filter, the fluid stream may be further treated by exposure to a catalyst media or additional suspension media to further reduce the amount of undesirable particulate matter.

Journal ArticleDOI
TL;DR: In this article, a wide-angle X-ray diffraction (WAXD)-based quantitative phase analysis method was used to characterize the variations of the concentrations of the insulating binder and the conductive particles around their mean values as a function of mixing time in an intensive batch mixer.
Abstract: The development of the electrical properties of composites as a function of the degree of mixedness of a conductive filler distributed into an insulating polymer is investigated. A wide-angle X-ray diffraction (WAXD)-based quantitative phase analysis method was used to characterize the variations of the concentrations of the insulating binder and the conductive particles around their mean values as a function of mixing time in an intensive batch mixer. Increasing the time, and hence the specific energy input, during the mixing process results in a more homogeneous spatial distribution of the conductive filler in the polymeric matrix, which in turn results in a decrease of the volume conductivity of the composite. The decreasing conductivity of the composite is attributed to the better coating, and hence the isolation, of the conductive particles from each other, thus hindering the formation of a conductive (“percolation”) network. Overall, these results suggest that the control of the electrical properties of conductive composites could benefit from a good understanding and adequate control of the dynamics of the mixing process and the resulting degree of mixedness of the conductive particles in the polymer matrix.
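The "degree of mixedness" discussed above is commonly quantified by the scatter of local filler concentrations about the batch mean, e.g., a relative standard deviation that falls toward zero as mixing proceeds. A minimal sketch of such a mixing index; the sampled concentration values below are invented for illustration (in the paper, WAXD-derived local concentrations would play this role):

```python
import statistics

def mixing_index(concentrations):
    """Relative standard deviation of local filler concentrations:
    large for a poorly mixed batch, approaching 0 as mixing improves."""
    mean = statistics.fmean(concentrations)
    return statistics.pstdev(concentrations) / mean

# Hypothetical local filler volume fractions sampled across the mixer
# at two mixing times (both average ~0.20; values are invented).
early = [0.05, 0.32, 0.11, 0.38, 0.14]   # short mixing time
late  = [0.19, 0.21, 0.20, 0.18, 0.22]   # long mixing time

print(f"index, short mixing time: {mixing_index(early):.3f}")
print(f"index, long mixing time:  {mixing_index(late):.3f}")
```

A falling index signals a more uniform filler distribution; per the abstract's finding, that same homogenization can lower the composite's conductivity by isolating the conductive particles and suppressing the percolation network.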