
Showing papers by "Edinburgh Napier University" published in 1997


Proceedings Article
24 Mar 1997
TL;DR: A fresh look is presented at the nature of complexity in the building of computer based systems, motivated by post-mortems which reveal failure causes ranging from hardware failures through software errors to major system level mistakes.
Abstract: Every organisation from the scale of whole countries down to small companies has a list of system developments which have ended in various forms of disaster. The nature of the failures varies but typical examples are: cost overruns; timescale overruns and sometimes, loss of life. The post-mortems to these systems reveal a wide range of reasons all the way from hardware failures, through software errors right to major system level mistakes. More importantly a large number of these systems share one attribute: complexity. This paper presents a fresh look at the nature of complexity in the building of computer based systems.

620 citations


Journal ArticleDOI
TL;DR: It is concluded that the free radical activity of PM10 particles derives either from a fraction that is not centrifugeable on a bench centrifuge or from a radical-generating system that is released into solution.
Abstract: The purpose of this study was to test the hypothesis that particulate matter < or = 10 microns in aerodynamic diameter (PM10) particles have the ability to generate free radical activity at their surface. We collected PM10 filters from the Edinburgh, United Kingdom, Enhanced Urban Network sampling site, removed particles from the filter, and tested their ability to cause free radical damage to supercoiled plasmid DNA. We found that the PM10 particles did cause damage to the DNA that was mediated by hydroxyl radicals, as shown by inhibition of the injury with mannitol. The PM10-associated hydroxyl radical activity was confirmed using a high-performance liquid chromatography-based assay to measure the hydroxyl radical adduct of salicylic acid. Desferrioxamine abolished the hydroxyl radical-mediated injury, which suggests that iron was involved. Analysis of PM10 filters confirmed the presence of large amounts of iron and leaching studies confirmed that the PM10 samples could release substantial amounts of Fe(III) and lesser amounts of Fe(II). To investigate the size of the particles involved in the hydroxyl radical injury, we centrifuged the suspension of PM10 to clarity, tested the clear supernatant, and found that it had all of the suspension activity. We conclude, therefore, that the free radical activity is derived either from a fraction that is not centrifugeable on a bench centrifuge, or that the radical generating system is released into solution.

237 citations


Journal ArticleDOI
01 Jan 1997
TL;DR: The background to approaches for adaptively controlling one or more of the Genetic Algorithm operators is described, and a framework for their classification is suggested, based on the learning strategy used to control them and on which facets of the algorithm are susceptible to adaptation.
Abstract: Genetic Algorithms are a class of powerful, robust search techniques based on genetic inheritance and the Darwinian metaphor of “Natural Selection”. These algorithms maintain a finite memory of individual points on the search landscape known as the “population”. Members of the population are usually represented as strings written over some fixed alphabet, each of which has a scalar value attached to it reflecting its quality or “fitness”. The search may be seen as the iterative application of a number of operators, such as selection, recombination and mutation, to the population with the aim of producing progressively fitter individuals. These operators are usually static, that is to say that their mechanisms, parameters, and probability of application are fixed at the beginning and constant throughout the run of the algorithm. However, there is an increasing body of evidence that not only is there no single choice of operators which is optimal for all problems, but that in fact the optimal choice of operators for a given problem will be time-variant i.e. it will depend on such factors as the degree of convergence of the population. Based on theoretical and practical approaches, a number of authors have proposed methods of adaptively controlling one or more of the operators, usually invoking some kind of “meta-learning” algorithm, in order to try and improve the performance of the Genetic Algorithm as a function optimiser. In this paper we describe the background to these approaches, and suggest a framework for their classification, based on the learning strategy used to control them, and what facets of the algorithm are susceptible to adaptation. We then review a number of significant pieces of work within the context of this setting, and draw some conclusions about the relative merits of various approaches and promising directions for future work.

206 citations
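To make operator adaptation concrete, here is a minimal Python sketch (not from the paper) of one scheme in the class the survey covers: a one-max genetic algorithm whose mutation probability rises as population diversity collapses. The fitness function, diversity measure and adaptive rule are all illustrative assumptions.

import random

GENES, POP, GENS = 32, 40, 60

def fitness(ind):
    return sum(ind)                          # one-max: count of 1-bits

def diversity(pop):
    # Mean per-locus minority-allele frequency: 0.5 = fully mixed, 0 = converged.
    n = len(pop)
    return sum(min(c, n - c) / n for c in map(sum, zip(*pop))) / GENES

def tournament(pop):
    return max(random.sample(pop, 2), key=fitness)

pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
for gen in range(GENS):
    p_mut = 0.01 + 0.2 * (0.5 - diversity(pop))   # assumed rule: mutate more as diversity falls
    nxt = []
    for _ in range(POP):
        a, b = tournament(pop), tournament(pop)
        cut = random.randrange(1, GENES)          # one-point crossover
        child = [g ^ (random.random() < p_mut) for g in a[:cut] + b[cut:]]
        nxt.append(child)
    pop = nxt

print("best fitness:", fitness(max(pop, key=fitness)), "of", GENES)

Only the line that sets p_mut embodies the adaptation; swapping in a different rule (or adapting crossover instead) gives other members of the taxonomy the paper proposes.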


Journal ArticleDOI
TL;DR: Evidence is provided that PM10 has free radical activity and causes lung inflammation and epithelial injury; although the mechanism for the adverse effects of particulate air pollution on patients with airway diseases is unknown, the findings support the hypothesis that PM10 induces oxidant stress, causing inflammation and injury to airway epithelium.
Abstract: Epidemiologic studies have reported associations between fine particulate air pollution, especially particles less than 10 μm in diameter (PM10), and the development of exacerbations of asthma and ...

173 citations


Journal ArticleDOI
TL;DR: In this paper, principal components analysis was used to describe forest structure at each site, with the first principal component (PRINI) serving as an index of forest disturbance, and the results indicated that the pattern of proportional abundance of tropical butterfly species may be used as an 'instantaneous' indicator of forest disturbance.
Abstract: Butterfly assemblages within lowland monsoon forest were compared at four sites on Sumba, Indonesia that differed in terms of protection and exhibited associated differences in levels of human disturbance. A numerical method employing principal components analysis was devised for describing forest structure at each site. The first principal component (PRINI) grouped attributes tending towards dense forest with closely-spaced trees, a closed canopy and a poorly developed field layer, with trees that tended to be large with the point of inversion in the upper half of the trunk. The highest values for PRINI were recorded within protected forest, and PRINI values were considered to be a useful index of forest disturbance at each site. Species diversity of butterflies was highest in unprotected secondary forest, but was not affected by lower levels of disturbance. Those species occurring at highest density in secondary forest generally had wide geographical distributions, whereas those species occurring at highest density in undisturbed primary forest had restricted ranges of distribution, in most cases with a separate subspecies on Sumba. Overall, an index of biogeographical distinctiveness decreased with increasing disturbance, and this supports the hypothesis that the most characteristic species of undisturbed climax forest have the smallest geographical ranges of distribution. Species abundance data for butterflies fitted a log-normal distribution at all but the most disturbed site. These results indicate that the pattern of proportional abundance of tropical butterfly species may be used as an 'instantaneous' indicator of forest disturbance, and that changes in the structure of tropical forests in S.E. Asia resulting from human disturbance, even within partially-protected forest, may result in the presence of butterfly assemblages of higher species diversity but of lower biogeographical distinctiveness, and therefore of lower value in terms of the conservation of global biodiversity.

154 citations
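As a rough illustration of the numerical method described, the Python sketch below derives a first principal component from standardised forest-structure attributes via SVD. The plot values and attribute choices are invented for illustration, not the paper's data.

import numpy as np

# rows = plots; columns = tree spacing (m), canopy cover (%),
# field-layer cover (%), trunk inversion-point height fraction
X = np.array([
    [2.1, 92.0, 10.0, 0.71],
    [2.4, 88.0, 15.0, 0.66],
    [5.8, 45.0, 70.0, 0.38],
    [6.3, 40.0, 80.0, 0.31],
    [3.0, 80.0, 25.0, 0.60],
])

Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardise each attribute
_, _, Vt = np.linalg.svd(Z, full_matrices=False)
prin1 = Z @ Vt[0]                          # plot scores on the first component

print("loadings:    ", np.round(Vt[0], 2))
print("PRIN1 scores:", np.round(prin1, 2))

Plots with dense, closed-canopy structure load together at one end of the component, so the scores can be read as a disturbance index in the way the paper uses PRINI.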


Journal ArticleDOI
TL;DR: A mixed Lorentzian‐Gaussian (Voigt) lineshape model was developed that gave more accurate results with synthetic FIDs than pure Lorentzian or Gaussian models, and the three models gave significantly different peak areas in vivo.
Abstract: Quantification of NMR visible metabolites by spectral modeling usually assumes a Lorentzian or Gaussian lineshape, despite the fact that experimental lineshapes are neither. To minimize systematic fitting errors, a mixed Lorentzian-Gaussian (Voigt) lineshape model was developed. When tested with synthetic FIDs, the Voigt lineshape model gave more accurate results (maximum error 2%) than either Lorentzian (maximum error 20%) or Gaussian models (maximum error 12%). The three lineshape models gave substantially different peak areas in an in vitro experiment, with the Voigt model having a much lower χ2 (2.1 compared with 5.2 for the Lorentzian model and 6.2 for the Gaussian model). In a group of 10 healthy volunteers, fitting of 1H spectra from cerebral white matter gave significantly different peak areas between the methods. Even when area ratios were taken, the Lorentzian model gave higher values (+5% for NAA/choline and +2% for NAA/creatine) than the Voigt lineshape model, whereas the Gaussian model gave lower values (-2% and -1%, respectively).

125 citations
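The Voigt profile is the convolution of a Lorentzian and a Gaussian, and a standard way to evaluate it is via the Faddeeva function. The Python sketch below is a generic illustration, not the authors' fitting code: it compares the three lineshapes, whose differing widths and tails are why forcing the wrong shape biases fitted peak areas.

import numpy as np
from scipy.special import wofz

def gaussian(x, sigma):
    return np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

def lorentzian(x, gamma):
    return gamma / (np.pi * (x**2 + gamma**2))

def voigt(x, sigma, gamma):
    # Real part of the Faddeeva function gives the Lorentzian-Gaussian convolution.
    z = (x + 1j * gamma) / (sigma * np.sqrt(2))
    return np.real(wofz(z)) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-10, 10, 2001)
for name, y in [("Gaussian", gaussian(x, 1.0)),
                ("Lorentzian", lorentzian(x, 1.0)),
                ("Voigt", voigt(x, 1.0, 1.0))]:
    # Each profile integrates to one over the real line, yet peak heights
    # and tail weights differ markedly between the three shapes.
    print(f"{name:10s} peak height = {y.max():.3f}")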


Journal ArticleDOI
TL;DR: The analysis of a database created by merging road casualty information and census data for the former Lothian region in Scotland found that the casualty rates amongst residents from areas classified as relatively deprived were significantly higher than those from relatively affluent areas.

116 citations


Journal ArticleDOI
TL;DR: In this paper, a probit stochastic method (SAM) is proposed for traffic assignment which does not require path enumeration and, unlike Dial's logit method, takes account of the correlation between alternative routes.
Abstract: Stochastic methods of traffic assignment have received much less attention in the literature than those based on deterministic user equilibrium (UE). The two best known methods for stochastic assignment are those of Burrell and Dial, both of which have certain weaknesses which have limited their usefulness. Burrell's is a Monte Carlo method, whilst Dial's logit method takes no account of the correlation, or overlap,between alternative routes. This paper describes, firstly, a probit stochastic method (SAM) which does not suffer from these weaknesses and which does not require path enumeration. While SAM has a different route-finding methodology to Burrell, it is shown that assigned flows are similar. The paper then goes on to show how, by incorporating capacity restraint (in the form of link-based cost-flow functions) into this stochastic loading method, a new stochastic user equilibrium (SUE) model can be developed. The SUE problem can be expressed as a mathematical programming problem, and its solution found by an iterative search procedure similar to that of the Frank-Wolfe algorithm commonly used to solve the UE problem. The method is made practicable because quantities calculated during the stochastic loading process make the SUE objective function easy to compute. As a consequence, at each iteration, the optimal step length along the search direction can be estimated using a simple interpolation method. The algorithm is demonstrated by applying it successfully to a number of test problems, in which the algorithm shows good behaviour. It is shown that, as the values of parameters describing the variability and degree of capacity restraint are varied, the SUE solution moves smoothly between the UE and pure stochastic solutions.

111 citations
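The fixed-point structure that such algorithms share can be shown on a toy two-route network. The Python sketch below is emphatically not the paper's SAM method: it uses Monte Carlo probit loading (which SAM specifically avoids) and plain successive averages rather than the paper's interpolated optimal step, and all cost-function parameters are assumed.

import random

DEMAND = 1000.0
THETA = 0.1                                    # perception-error scale (assumed)

def cost(route, flow):
    # Assumed BPR-style link cost-flow functions for the two routes.
    free_flow, capacity = {"A": (10.0, 600.0), "B": (12.0, 800.0)}[route]
    return free_flow * (1 + 0.15 * (flow / capacity) ** 4)

def probit_share(cA, cB, draws=5000):
    # Fraction of drivers whose normally perturbed perceived cost favours A.
    wins = sum(random.gauss(cA, THETA * cA) < random.gauss(cB, THETA * cB)
               for _ in range(draws))
    return wins / draws

fA = DEMAND / 2
for n in range(1, 51):
    target = probit_share(cost("A", fA), cost("B", DEMAND - fA)) * DEMAND
    fA += (target - fA) / n                    # method of successive averages
print(f"SUE flows: A = {fA:.0f}, B = {DEMAND - fA:.0f}")

At the fixed point, the stochastic loading of current costs reproduces the current flows, which is exactly the SUE condition the paper's objective function characterises; as THETA shrinks toward 0 the solution tends to deterministic UE, matching the smooth transition reported in the paper.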


Book ChapterDOI
14 Jul 1997
TL;DR: This paper takes a critical look at the alternatives for assisting users to navigate information spaces and concludes by outlining a research agenda for navigation support.
Abstract: The issue of how users can navigate their way through large information spaces is one that is crucial to the ever expanding and interlinking of computer systems. There are many ways of dealing with the issue of navigation. The use of appropriate metaphors is one; virtual reality and 3D interfaces another. A third is to provide adaptive interfaces based on individual differences in users' navigational ability. This paper takes a critical look at the alternatives for assisting users to navigate information spaces and concludes by outlining a research agenda for navigation support.

106 citations


Journal ArticleDOI
TL;DR: It is concluded that the intrinsic free radical activity is the major determinant of transcription factor activation and therefore gene expression in alveolar macrophages.
Abstract: We studied asbestos, vitreous fiber (MMVF10), and refractory ceramic fiber (RCF1) from the Thermal Insulation Manufacturers' Association fiber repository regarding the following: free radical damage to plasmid DNA, iron release, ability to deplete glutathione (GSH), and activate redox-sensitive transcription factors in macrophages. Asbestos had much more free radical activity than any of the man-made vitreous fibers. More Fe3+ was released than Fe2+ and more of both was released at pH 4.5 than at pH 7.2. Release of iron from the different fibers was generally not a good correlate of ability to cause free radical injury to the plasmid DNA. All fiber types caused some degree of oxidative stress, as revealed by depletion of intracellular GSH. Amosite asbestos upregulated nuclear binding of activator protein 1 transcription factor to a greater level than MMVF10 and RCF1; long-fiber amosite was the only fiber to enhance activation of the transcription factor nuclear factor kappa B (NF kappa B). The use of cysteine methyl ester and buthionine sulfoximine to modulate GSH suggested that GSH homeostasis was important in leading to activation of transcription factors. We conclude that the intrinsic free radical activity is the major determinant of transcription factor activation and therefore gene expression in alveolar macrophages. Although this was not related to iron release or ability to deplete macrophage GSH at 4 hr, GSH does play a role in activation of NF kappa B.

93 citations


Journal ArticleDOI
TL;DR: In this paper, a technique for detecting gender bias in cases where student raters have awarded marks to same and opposite sex peers is described, and illustrated by data from two case studies.
Abstract: Concerns relating to the reliability of teacher and student peer assessments are discussed, and some correlational analyses comparing student and teacher marks described. The benefits of the use of multiple ratings are elaborated. The distinction between gender differences and gender bias is drawn, and some studies which have reported gender bias are reviewed. The issue of ‘blind marking’ is addressed. A technique for detecting gender bias in cases where student raters have awarded marks to same and opposite sex peers is described, and illustrated by data from two case studies. Effect sizes were found to be very small, indicating an absence of gender bias in these two cases. Results are discussed in relation to task and other contextual variables. The authors conclude that the technique described can contribute to the good practice necessary to ensure the success of peer assessment in terms of pedagogical benefits and reliable and fair marking outcomes.
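The abstract does not spell out the bias-detection calculation, but an effect size comparing marks awarded to same-sex and opposite-sex peers is the standard quantity behind a finding of "very small" effects. A hedged Python illustration using Cohen's d, with invented marks:

import statistics

# Marks invented for illustration only; the paper's case-study data are not reproduced.
same_sex = [62, 58, 71, 65, 60, 68, 64, 59]
opposite = [61, 60, 69, 66, 58, 67, 65, 61]

def cohens_d(a, b):
    na, nb = len(a), len(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    pooled_sd = (((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

d = cohens_d(same_sex, opposite)
print(f"Cohen's d = {d:.2f}  (|d| < 0.2 is conventionally 'very small')")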

Journal ArticleDOI
TL;DR: This paper describes the approach to the development of an Internet-based course designed for distance education and provides general observations on the opportunities and constraints which the web provides and on the pedagogic issues which arise when using this delivery mechanism.
Abstract: The phenomenal growth of the Internet over the last few years, coupled with the development of various multimedia applications which exploit the Internet presents exciting opportunities for educators. In the context of distance education, the World Wide Web provides a unique challenge as a new delivery mechanism for course material allowing students to take a course (potentially) from anywhere in the world. In this paper, we describe our approach to the development of an Internet-based course designed for distance education. Using this experience, we provide general observations on the opportunities and constraints which the web provides and on the pedagogic issues which arise when using this delivery mechanism. We have found that the process of developing web-based courses is one area which requires careful consideration as technologies and tools for both the authoring and the delivery of courses are evolving so rapidly. We have also found that current tools are severely lacking in a number of important respects, particularly with respect to the design of pedagogically sound courseware.

Journal ArticleDOI
TL;DR: Investigation of the effects of changing C:N:P loading rate and retention time on pond performance as measured by nutrient removal and dry matter biomass found that increased loading rate was related to increased nitrogen removal; however, more complete nitrification occurred at low COD loading rates.
Abstract: Small pilot ponds in a glasshouse at the Scottish Agricultural College (Auchincruive) were used to investigate the effects of changing C:N:P loading rate and retention time on pond performance as measured by nutrient removal and dry matter biomass. One experiment investigated ponds operated at two C:N:P ratios: low (9:7:1) and high (104:10:1), and two retention times (θ = 4 and 7 days). Increasing retention time from 4 to 7 days increased the concentration of total (dry matter) and algal (chlorophyll a) biomass and the degree of nitrification. It also increased removal of phosphorus, but had no effect on nitrogen or COD removal. Cyanobacteria predominated in ponds operated at both 4 and 7 days, and the density of cyanobacteria increased with increased retention time. Nitrogen removal was independent of C:N:P ratio; indeed the lower C:N:P ratio favoured increased nitrification. A high C:N:P ratio increased phosphorus and COD removal and increased the concentration of algal biomass (chlorophyll a), but had little effect on total biomass (dry matter). A second experiment varied COD loading rate (600, 350 and 100 kg COD ha-1 d-1) while maintaining a constant retention time (θ = 5 or 7 days). Species composition was independent of retention time. The longer retention time increased both total and algal biomass concentration and also the percentage of nitrogen removed. Nitrification was independent of retention time. Increasing loading rate increased dry matter production and resulted in a predominance of cyanobacteria over Chlorophyceae. Increased loading rate was related to increased nitrogen removal; however, more complete nitrification occurred at low COD loading rates. Phosphorus removal in the pond with 5-day θ remained constant independent of loading rate, but in the pond with 7-day θ phosphorus removal increased with increased COD loading. COD removal was independent of both retention time and loading rate.

Journal ArticleDOI
TL;DR: Eight successive models of the information chain (analogous to ecology's food chain), each incorporating contemporary thoughts and experiences, are described, taking the Royal Society Scientific Information Conference, 1948, as the seminal point.
Abstract: Reviews the literature of the information chain, analogous to ecology’s food chain, taking the Royal Society Scientific Information Conference, 1948, as the seminal point. Describes eight successive models of the information chain each incorporating contemporary thoughts and experiences. Each model is labelled with the year to which it may be said to refer: Distribution of Scientific Information 1948; Document Network 1967; Dissemination of Scientific and Technical Information 1978; Structure of Scientific Literature 1979; Ecosystem of Scientific Communication 1980; Information Chain 1988; Information Chain 1989; Pathways of Information Flow 1993. Although of wide applicability, the focus of interest for information scientists tends to be the communication of learned information.

Journal ArticleDOI
TL;DR: In this article, the authors report the findings from a study of entrepreneurial activity in the small hotel sector in a Scottish town, St Andrews, and suggest that if a significant proportion of the sector overall is representative of the small entrepreneurial firm, this may have positive consequences for local economic prosperity.
Abstract: Reports the findings from a study of entrepreneurial activity in the small hotel sector in a Scottish town, St Andrews. Applies bodies of theory on the small entrepreneurial firm, developed for other sectors of the economy, to an examination of small firm activity in the hotel sector. The central thesis is that small hotel entrepreneurs will have had to adopt a business‐oriented approach to ensure the success, or at least the survival, of their firms. Findings from a survey of the small hotel sector in St Andrews provide some evidence to support this thesis, and conflict with those of an earlier study of the small hotel sector in the Bournemouth area in the 1980s. Recommends that further research should consider the nature of entrepreneurial activity in the small hotel sector generally. If a significant proportion of the sector overall is representative of the small entrepreneurial firm, this may have positive consequences for local economic prosperity in many areas.

Journal ArticleDOI
01 Sep 1997-Thorax
TL;DR: The spore-derived toxin may exert its effect through its ability to diffuse rapidly into the lung lining fluid and diminish the macrophage oxidative burst, and may play a part in allowing A fumigatus to persist in the lung and manifest its well known pathogenic effects.
Abstract: BACKGROUND: The fungus Aspergillus fumigatus, whose spores are present ubiquitously in the air, causes a range of diseases in the human lung. A small molecular weight (

Journal ArticleDOI
TL;DR: In this paper, conditions for local and asymptotic stability are established for the linear car-following model, and the nonlinear model is considered by linearization and numerical integration.
Abstract: This paper investigates the stability of the classical car-following model (for example, Chandler et al., Operations Research, 6, 165–184, 1958; Herman et al., Operations Research, 7, 86–106, 1959; Wilhelm and Schmidt, Transportation Engineering Journal (ASCE) 99, 923–933, 1973). Conditions for local and asymptotic stability as defined in the references cited are established for the linear model. These differ from those in the literature in two ways. First, it will be shown that, in the autonomous model when the product of the coefficient of proportionality α and the reaction time τ is less than or equal to 1/e, there exist oscillatory solutions with frequencies higher than 2π, although there are none with lower frequencies. Secondly, asymptotic stability is considered along with local stability. The derived condition for asymptotic stability is both necessary and sufficient. In addition, the condition depends on the frequency of the forcing term, with the sufficient condition ατ < 1/2 for asymptotic stability found in the literature being included as a special case. The nonlinear model is considered by linearization and numerical integrations. Some practical values of parameters are tested for the stability of the model. The analyses in this paper are extended to consider different values of α and τ for different drivers in the line.
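For reference, the classical linear car-following model the paper analyses takes the standard form from the cited literature, in which follower n+1 adjusts its acceleration, after reaction time τ, in proportion to the speed difference from its leader:

\[ \dot{v}_{n+1}(t+\tau) \;=\; \alpha\,\bigl[\,v_{n}(t) - v_{n+1}(t)\,\bigr] \]

In this notation the abstract's results read: for ατ ≤ 1/e the autonomous model admits no oscillatory solutions with frequency below 2π (though, as the paper shows, higher-frequency oscillatory solutions do exist), and the classical sufficient condition for asymptotic stability, ατ < 1/2, is recovered as a special case of the paper's frequency-dependent necessary and sufficient condition.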

Journal ArticleDOI
TL;DR: The mother's age, whether contraception has ever been used, the death of a child at any time, whether the woman has ever worked, religion, region of residence, and female independence are the important covariates for explaining recent fertility in Bangladesh.

Journal ArticleDOI
08 Nov 1997-BMJ
TL;DR: The distribution of gestational age has changed noticeably over the past decade because of the gradual introduction of its assessment by ultrasonography and trends in birth weights adjusted for gestation may be misleading.
Abstract: Over the past decade the weights of babies born in the United Kingdom have been increasing,1 which may have implications for the pattern of adult disease.2 From 1980 to 1992 the mean birth weight of live singleton births in Scotland increased steadily from 3326 g to 3382 g. We investigated factors that may explain this trend. The distribution of gestational age has changed noticeably over the past decade because of the gradual introduction of its assessment by ultrasonography.3 In 1980 around 42% of all live births occurred at 40 weeks' gestation; by 1992 this had fallen to 32%. Thus trends in birth weights adjusted for gestation may be misleading and are not considered here. We assessed data on live, singleton births in …

Journal ArticleDOI
TL;DR: In this paper, the authors argue that the people element is the cornerstone of successful yield management practices and suggest a matrix framework for an organizational, team and individual approach to yield management based on commitment, focus and boundaries.
Abstract: Yield management is a process based on forecasting, strategy and people. Most research has converged on the forecasting and strategy elements, neglecting the people element of yield management. Argues that the people element is the cornerstone of successful yield management practices and suggests a matrix framework for an organizational, team and individual approach to yield management based on commitment, focus and boundaries. Explains yield management as a human activity system, therefore adopting a systems theory framework in analysis.

Book ChapterDOI
07 Apr 1997
TL;DR: The CALTROP program is presented, which provides a test of the feasibility of representing a decision tree as a linear chromosome and applying a genetic algorithm to the optimisation of the decision tree with respect to the classification of test sets of example data.
Abstract: The CALTROP program which is presented in this paper provides a test of the feasibility of representing a decision tree as a linear chromosome and applying a genetic algorithm to the optimisation of the decision tree with respect to the classification of test sets of example data. The unit of the genetic alphabet (the “caltrop”) is a 3-integer string corresponding to a subtree of the decision tree. The program offers a user a choice of mating strategies and mutation rates. Test runs with different data sets show that the decision trees produced by the CALTROP program usually compare favourably with those produced by the popular automatic induction algorithm, ID3.
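The abstract gives only the outline of the encoding, so the following Python sketch fills in assumed details: each 3-integer caltrop is read here as (attribute to test, target if the attribute is 0, target if it is 1), with negative targets encoding leaf classes. A genetic algorithm can then apply crossover and mutation directly to the flat list of caltrops.

# Hypothetical caltrop semantics -- the paper's exact representation may differ.
chromosome = [
    (0, 1, 2),       # caltrop 0: test attribute 0, then go to caltrop 1 or 2
    (1, -1, -2),     # caltrop 1: test attribute 1 -> class 0 or class 1
    (2, -2, -1),     # caltrop 2: test attribute 2 -> class 1 or class 0
]

def classify(example, chrom):
    i = 0
    while True:
        attr, if0, if1 = chrom[i]
        nxt = if1 if example[attr] else if0
        if nxt < 0:
            return -nxt - 1          # negative target decodes to a leaf class
        i = nxt

print(classify([1, 0, 1], chromosome))   # -> 0

Because the chromosome is linear, standard one-point crossover and integer mutation operate on it unchanged, which is the feasibility point the CALTROP program tests.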

Journal ArticleDOI
TL;DR: It is demonstrated here that ultrafine TiO2 and PM10 both have hydroxyl radical activity and that UFTiO2 is capable of causing hydroxyl radical-mediated membrane damage to erythrocytes; fine TiO2 has much less of these properties.
Abstract: Epidemiological evidence has been accumulating showing a strong relationship between particulate environmental air pollution (PM10) and end-points of respiratory ill health such as attacks of asthma, COPD, diminished lung function and cardio-vascular deaths (Pope et al., 1995). To date there has been no plausible biological hypothesis to explain this relationship at the very low airborne mass concentrations of particulate air pollution that are found (< 50 μg m−3). We recently hypothesised (Seaton et al., 1995) that an ultrafine (< 100 nm diameter) component of PM10 is responsible for its adverse effects. This is based on the initial studies of Oberdorster and colleagues (Ferin et al., 1992) who demonstrated that titanium dioxide in the ultrafine form (20 nm diameter) was highly inflammogenic to the lungs of rats compared to fine (200 nm diameter) TiO2 particles at the same airborne mass concentration. We now hypothesise that the adverse effects of PM10 on the lung result from free radical activity at the surface of an ultrafine fraction. We further hypothesise that the interstitialisation that was seen with UFTiO2 (Ferin et al., 1992) could similarly occur with the ultrafine component of PM10. If the ultrafine material has free radical activity then the increased surface area that is presented to the epithelial surface by a relatively small mass of ultrafine particles could compromise epithelial integrity leading to interstitialisation. We demonstrate here that ultrafine TiO2 and PM10 both have hydroxyl radical activity and that UFTiO2 is capable of causing hydroxyl radical-mediated membrane damage to erythrocytes; fine TiO2 has much less of these properties. Additionally PM10 hydroxyl radical activity is either in the ultrafine fraction or is released in soluble form.

Journal ArticleDOI
TL;DR: In this paper, a fractional Brownian motion particle-tracking model was proposed for simulating pollutant dispersion in coastal waters, and numerical test cases were used to compare this new model with the results obtained from a traditional Gaussian particle-tracking model.
Abstract: SUMMARY The work is motivated by the recent discovery that ocean surface drifter trajectories contain fractal properties. This suggests that the dispersion of pollutants in coastal waters may also be described using fractal statistics. The paper describes the development of a fractional Brownian motion model for simulating pollutant dispersion using particle tracking. Numerical test cases are used to compare this new model with the results obtained from a traditional Gaussian particle-tracking model. The results seem to be significantly different, which may have implications for pollution modelling in the coastal zone. © 1997 John Wiley & Sons, Ltd.
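A minimal sketch of the contrast the paper draws, with all parameters assumed: one-dimensional particle tracks driven by fractional Gaussian noise (Hurst exponent H > 0.5, giving the persistent, fractal-like motion reported for drifters) spread faster than tracks driven by independent Gaussian steps. The covariance-exact Cholesky synthesis below is a generic method, not the paper's scheme.

import numpy as np

rng = np.random.default_rng(1)
N, H, PARTICLES = 200, 0.75, 500

# Autocovariance of fractional Gaussian noise (the increments of fBm).
k = np.arange(N)
gamma = 0.5 * ((k + 1.0)**(2*H) - 2 * k**(2.0*H) + np.abs(k - 1.0)**(2*H))
cov = np.array([[gamma[abs(i - j)] for j in range(N)] for i in range(N)])
L = np.linalg.cholesky(cov)

fgn = L @ rng.standard_normal((N, PARTICLES))     # correlated increments
fbm = np.cumsum(fgn, axis=0)                      # fractional Brownian tracks
bm = np.cumsum(rng.standard_normal((N, PARTICLES)), axis=0)  # ordinary random walk

# Plume width grows like t**H for fBm versus t**0.5 for Brownian motion.
print(f"fBm plume spread after {N} steps: {fbm[-1].std():.1f}")
print(f"BM  plume spread after {N} steps: {bm[-1].std():.1f}")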

Journal ArticleDOI
TL;DR: In this paper, a case-study approach is used to show the ways in which several different international organizations are dealing with current situations and their perceived future needs; it describes the extent to which companies have had to adapt to the new conditions and suggests that organizations should think longer-term and more holistically when designing these systems.
Abstract: Human resource management (HRM) policies in international organizations with wide geographic distribution and operating in high numbers of different cultural environments must be underpinned with a strong international management development (IMD) programme. Uses an exploratory, case‐study approach to show the ways in which several different international organizations are dealing with current situations and their perceived future needs. Describes how IMD is seen to be a comprehensive approach covering selection, training and career support, and how international recruitment is becoming a much more important feature, with the expatriate model of management fading. Explains the extent to which companies have faced adapting to the new conditions and suggests organizations should think longer‐term and more holistically when designing these systems.

Journal ArticleDOI
TL;DR: In this paper, the authors discuss the rural dimensions of the objectives and activities of Local Enterprise Companies (LECs) and provide lessons both for rural economic development partnerships and for the possible setting up of other regional development economic development agencies.

Book ChapterDOI
13 Jul 1997
TL;DR: XBarnacle provides a more useable version of CLAM, with potential to extend its user base, since not all theorems may be proved automatically, even with the provision of lemmas; it is also a tool for experimenting with different methods and heuristics.
Abstract: XBarnacle provides: 1. An extension to the capabilities of CLAM, as not all theorems may be proved automatically, even with the provision of lemmas. 2. A more useable version of CLAM, with potential to extend its user base. 3. A tool for experimenting with different methods and heuristics.

Journal ArticleDOI
TL;DR: ATP measured by the luciferin-luciferase bioluminescence assay was used to examine the effect of toxic substances on whole microbial communities in activated sludge mixed liquor samples to determine the toxicity of wastes discharged to sewer.
Abstract: ATP measured by the luciferin-luciferase bioluminescence assay was used to examine the effect of toxic substances on whole microbial communities in activated sludge mixed liquor samples. The response of the microorganisms to toxicants is rapid using ATP reduction as the criterion. The sensitivity of the mixed populations to various toxicant types (e.g., organic material and heavy metals) is lower than when using single species toxicity tests such as the Microtox bioassay. The differences in sensitivity are considered a function of acclimatization, modification of the toxicant by the waste physicochemical environment, and the predominance of less sensitive organisms than those used in the Microtox bioassay (Photobacterium phosphoreum). ATP bioluminescence is, however, considered an important rapid test utilizing natural waste treatment microorganisms in determining the toxicity of wastes discharged to sewer. It can detect whether wastewater will have an effect on the biodegradation capability of the resident population of microorganisms. © 1997 by John Wiley & Sons, Inc. Environ Toxicol Water Qual 12: 23–29, 1997

Journal ArticleDOI
TL;DR: In this paper, the effects of the addition of 0-3 mol% cholesterol, cholesteryl stearate (18:0), cholesteryl oleate (18:1), cholesteryl linoleate (18:2) or cholesteryl linolenate (18:3) upon the main transition and pretransition of fully hydrated dimyristoylphosphatidylcholine (DMPC) multilamellar liposomes have been measured.

Journal ArticleDOI
TL;DR: In this paper, the DRCRISP interface element has been modified by introducing a limiting adhesive strength in the normal direction; a flag system has also been introduced to track the various conditions of separation and closure for any gap that is formed.