Bio: Chris Aldrich is an academic researcher from the Colorado School of Mines. The author has contributed to research in topics: Froth flotation & Artificial neural network. The author has an h-index of 37, co-authored 238 publications receiving 5065 citations. Previous affiliations of Chris Aldrich include Curtin University & University of Melbourne.
TL;DR: In this article, the authors used carrier magnetic materials for more effective separation of water and solids, as well as the oxidation pretreatment that is also used to sterilize the water.
Abstract: Acid mine water from a South African gold mine was characterised and treated by the precipitation of heavy metals with lime and sulphides, followed by ion exchange. The novelty of the proposed process lies in the use of carrier magnetic materials for more effective separation of water and solids, as well as the oxidation pretreatment that is also used to sterilize the water. The process can generate very pure water from acid mine water with great flexibility at an acceptable cost. The oxidation and precipitation of heavy metals with lime and subsequent sulphide-carrier magnetic separation appeared to be particularly suitable for the removal of heavy metal ions from the effluent of the particular gold mine that was investigated. The cation exchange resin IR120 can be used to reduce the salinity of the mine water effluent after removal of heavy metals by precipitation. Low-cost sulphuric acid can be used as the cation resin regenerator. The anion exchange resin A375 could reduce the anions (sulphate, chloride, bromide and fluoride) to acceptably low levels in the mine water after precipitation of heavy metals. A combination of sodium hydroxide and saturated lime solution can be used as the anion resin regenerator. A mixture of acidic gypsum from the cation elution section and alkaline gypsum from the anion elution section could generate high-quality gypsum as a byproduct, which could be sold as a valuable raw material to the gypsum industry to offset the process cost. Although these experiments were conducted on the acid mine water of a specific mine, the process could be extended to other mine waters contaminated with heavy metals and high salinities.
TL;DR: Machine vision has been used to extract froth characteristics, both physical (e.g. bubble size) and dynamic (e.g. froth velocity), from digital images and present these results to operators and/or use the results as inputs to process control systems.
Abstract: Research and development into the application of machine vision in froth flotation systems has continued since its introduction in the late 1980s. Machine vision is able to accurately and rapidly extract froth characteristics, both physical (e.g. bubble size) and dynamic (e.g. froth velocity) in nature, from digital images and present these results to operators and/or use the results as inputs to process control systems. Currently, machine vision has been implemented on several industrial sites worldwide and the technology continues to benefit from advances in computer technology. Effort continues to be directed into linking concentrate grade with measurable attributes of the froth phase, although this is proving difficult. As a result, other extracted variables, such as froth velocity, have to be used to infer process performance. However, despite more than 20 years of development, a long-term, fully automated control system using machine vision is yet to materialise. In this review, the various methods of data extraction from images are investigated and the associated challenges facing each method discussed. This is followed by a look at how machine vision has been implemented in process control structures and a review of some of the commercial froth imaging systems currently available. Lastly, the review assesses future trends and draws several conclusions on the current status of machine vision technology.
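The dynamic froth characteristics mentioned above, such as froth velocity, are typically estimated by correlating consecutive camera frames. A minimal sketch of this idea (not any specific commercial system) using FFT-based cross-correlation on grayscale NumPy arrays; the frame interval `dt` and scale `px_per_mm` are illustrative parameters, not values from the review:

```python
import numpy as np

def froth_velocity(frame_a, frame_b, dt=1.0, px_per_mm=1.0):
    """Estimate froth displacement between two consecutive grayscale frames
    from the peak of their circular cross-correlation, then convert to speed."""
    a = frame_a.astype(float) - frame_a.mean()
    b = frame_b.astype(float) - frame_b.mean()
    # Cross-correlation via the FFT; the peak sits at the shift of b relative to a.
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices wrap circularly, so map large positive indices to negative shifts.
    dy, dx = (p - n if p > n // 2 else p for p, n in zip(peak, corr.shape))
    speed = np.hypot(dx, dy) / (px_per_mm * dt)  # mm per second
    return dx, dy, speed
```

Real systems refine this with sub-pixel interpolation of the correlation peak and block-wise processing to obtain a velocity field rather than a single global shift.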
TL;DR: In this article, the adsorption of heavy metals onto biomaterial derived from the marine alga Ecklonia maxima was investigated via batch experiments and the results indicated that the activated biomass derived from E. maxima could be used as an efficient biosorbent for the treatment of waste waters containing heavy metals.
Abstract: The adsorption of heavy metals onto biomaterial derived from the marine alga Ecklonia maxima was investigated via batch experiments. The adsorption equilibria of Cu, Pb and Cd could be represented by Langmuir isotherms and the capacity of fresh alga for Cu, Pb and Cd was approximately 85–94, 227–243 and 83.5 mg/g dry alga, respectively. The rate of adsorption onto the marine alga was high. The alga particle size played an important role in the adsorption behaviour. The coarse alga particles had a higher adsorption capacity and slower adsorption kinetics and could be regenerated without significant loss of capacity. In contrast, the fine alga particles had a lower adsorption capacity and faster adsorption kinetics and could not be regenerated without significant loss of capacity. Comparison with a commercial resin indicated that the activated biomass derived from E. maxima could be used as an efficient biosorbent for the treatment of waste waters containing heavy metals.
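The Langmuir representation used in the abstract above can be reproduced from batch equilibrium data by an ordinary least-squares fit of the linearised isotherm C/q = 1/(q_max·b) + C/q_max. A minimal sketch, assuming concentrations in mg/L and loadings in mg/g; the parameter values in the usage below are illustrative, not the paper's measurements:

```python
import numpy as np

def langmuir(C, q_max, b):
    """Langmuir isotherm: equilibrium loading q (mg/g) at concentration C (mg/L)."""
    C = np.asarray(C, dtype=float)
    return q_max * b * C / (1.0 + b * C)

def fit_langmuir(C, q):
    """Estimate q_max and b from equilibrium data via the linearised form
    C/q = 1/(q_max*b) + C/q_max, fitted by least squares."""
    C = np.asarray(C, dtype=float)
    q = np.asarray(q, dtype=float)
    slope, intercept = np.polyfit(C, C / q, 1)  # slope = 1/q_max
    q_max = 1.0 / slope
    b = slope / intercept                       # intercept = 1/(q_max*b)
    return q_max, b
```

For example, synthetic data generated with q_max = 240 mg/g (of the order of the Pb capacity reported above) and b = 0.05 L/mg is recovered exactly by the fit.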
TL;DR: In this article, the removal of pollutants from acid mine drainage using metallurgical by-product slags was studied at laboratory scale, where the calcium glass type of slags had high surface area and porosity.
Abstract: The removal of pollutants from acid mine drainage using metallurgical by-product slags was studied at laboratory scale. Metallurgical by-product furnace slags were used as sorbents for metal ions and dispersed air column flotation was employed for the solid/liquid separation of the loaded slags. Batch sorption/pH/kinetic studies were conducted using simulated Cu and Pb bearing wastewater. The calcium glass type of slags had high surface area and porosity. Promising results were obtained from the combined slag sorption/flotation process in the treatment of an acid mine drainage from a South African gold mine.
TL;DR: The heavy metal uptake by the tobacco dust may be interpreted as metal-H ion exchange, metal ion surface complexation, or both, and the surface changes appeared to have resulted from a loss of some of the structures on the surface of the particles, owing to leaching in the acid metal ion solution.
Abstract: A typical lignocellulosic agricultural residue, namely tobacco dust, was investigated for its heavy metal binding efficiency. The tobacco dust exhibited a strong capacity for heavy metals, such as Pb(II), Cu(II), Cd(II), Zn(II) and Ni(II), with respective equilibrium loadings of 39.6, 36.0, 29.6, 25.1 and 24.5 mg of metal per g of sorbent. Moreover, the heavy metals loaded onto the biosorbent could be released easily with a dilute HCl solution. Zeta potential and surface acidity measurements showed that the tobacco dust was negatively charged over a wide pH range (pH > 2), with a strong surface acidity and a high OH(-) adsorption capacity. Changes in the surface morphology of the tobacco dust as visualized by atomic force microscopy suggested that the sorption of heavy metal ions on the tobacco could be associated with changes in the surface properties of the dust particles. These surface changes appeared to have resulted from a loss of some of the structures on the surface of the particles, owing to leaching in the acid metal ion solution. However, Fourier transform infrared spectroscopy (FTIR) showed no substantial change in the chemical structure of the tobacco dust subjected to biosorption. The heavy metal uptake by the tobacco dust may be interpreted as metal-H ion exchange, metal ion surface complexation, or both.
TL;DR: This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment.
Abstract: THE DESIGN AND ANALYSIS OF EXPERIMENTS. By Oscar Kempthorne. New York, John Wiley and Sons, Inc., 1952. 631 pp. $8.50. This book by a teacher of statistics (as well as a consultant for "experimenters") is a comprehensive study of the philosophical background for the statistical design of experiment. It is necessary to have some facility with algebraic notation and manipulation to be able to use the volume intelligently. The problems are presented from the theoretical point of view, without such practical examples as would be helpful for those not acquainted with mathematics. The mathematical justification for the techniques is given. As a somewhat advanced treatment of the design and analysis of experiments, this volume will be interesting and helpful for many who approach statistics theoretically as well as practically. With emphasis on the "why," and with description given broadly, the author relates the subject matter to the general theory of statistics and to the general problem of experimental inference. MARGARET J. ROBERTSON
TL;DR: It is evident from the literature survey articles that ion-exchange, adsorption and membrane filtration are the most frequently studied methods for the treatment of heavy metal wastewater.
Abstract: Heavy metal pollution has become one of the most serious environmental problems today. The treatment of heavy metals is of special concern due to their recalcitrance and persistence in the environment. In recent years, various methods for heavy metal removal from wastewater have been extensively studied. This paper reviews the current methods that have been used to treat heavy metal wastewater and evaluates these techniques. These technologies include chemical precipitation, ion-exchange, adsorption, membrane filtration, coagulation-flocculation, flotation and electrochemical methods. About 185 published studies (1988-2010) are reviewed in this paper. It is evident from the literature survey articles that ion-exchange, adsorption and membrane filtration are the most frequently studied methods for the treatment of heavy metal wastewater.
TL;DR: This paper attempts to summarise and review the recent research and developments in diagnostics and prognostics of mechanical systems implementing CBM with emphasis on models, algorithms and technologies for data processing and maintenance decision-making.
Abstract: Condition-based maintenance (CBM) is a maintenance program that recommends maintenance decisions based on the information collected through condition monitoring. It consists of three main steps: data acquisition, data processing and maintenance decision-making. Diagnostics and prognostics are two important aspects of a CBM program. Research in the CBM area is growing rapidly. Hundreds of papers in this area, including theory and practical applications, appear every year in academic journals, conference proceedings and technical reports. This paper attempts to summarise and review the recent research and developments in diagnostics and prognostics of mechanical systems implementing CBM with emphasis on models, algorithms and technologies for data processing and maintenance decision-making. Realising the increasing trend of using multiple sensors in condition monitoring, the authors also discuss different techniques for multiple sensor data fusion. The paper concludes with a brief discussion on current practices and possible future trends of CBM.
TL;DR: In this paper, a taxonomy of recent contributions related to the explainability of different machine learning models is presented, including those aimed at explaining Deep Learning methods, for which a second dedicated taxonomy is built and examined in detail.
Abstract: In the last few years, Artificial Intelligence (AI) has achieved a notable momentum that, if harnessed appropriately, may deliver the best of expectations over many application sectors across the field. For this to occur shortly in Machine Learning, the entire community stands in front of the barrier of explainability, an inherent problem of the latest techniques brought by sub-symbolism (e.g. ensembles or Deep Neural Networks) that were not present in the last hype of AI (namely, expert systems and rule-based models). Paradigms underlying this problem fall within the so-called eXplainable AI (XAI) field, which is widely acknowledged as a crucial feature for the practical deployment of AI models. The overview presented in this article examines the existing literature and contributions already done in the field of XAI, including a prospect toward what is yet to be reached. For this purpose we summarize previous efforts made to define explainability in Machine Learning, establishing a novel definition of explainable Machine Learning that covers such prior conceptual propositions with a major focus on the audience for which the explainability is sought. Departing from this definition, we propose and discuss a taxonomy of recent contributions related to the explainability of different Machine Learning models, including those aimed at explaining Deep Learning methods for which a second dedicated taxonomy is built and examined in detail. This critical literature analysis serves as the motivating background for a series of challenges faced by XAI, such as the interesting crossroads of data fusion and explainability. Our prospects lead toward the concept of Responsible Artificial Intelligence, namely, a methodology for the large-scale implementation of AI methods in real organizations with fairness, model explainability and accountability at its core.
Our ultimate goal is to provide newcomers to the field of XAI with a thorough taxonomy that can serve as reference material in order to stimulate future research advances, but also to encourage experts and professionals from other disciplines to embrace the benefits of AI in their activity sectors, without any prior bias for its lack of interpretability.