
Showing papers in "Journal of Loss Prevention in The Process Industries in 2011"


Journal ArticleDOI
TL;DR: In this paper, the main risk analysis and risk assessment methods and techniques are reviewed from the scientific literature and classified into three main categories: (a) qualitative, (b) quantitative, and (c) hybrid techniques (qualitative–quantitative, semi-quantitative).
Abstract: The objective of this work is to determine, study, analyze, elaborate, classify and categorize the main risk analysis and risk-assessment methods and techniques by reviewing the scientific literature. The paper consists of two parts: a) the investigation, presentation and elaboration of the main risk-assessment methodologies and b) the statistical analysis, classification, and comparative study of the corresponding scientific papers published by six representative scientific journals of Elsevier B.V. covering the decade 2000–2009. The literature review showed that risk analysis and assessment techniques fall into three main categories: (a) qualitative, (b) quantitative, and (c) hybrid techniques (qualitative–quantitative, semi-quantitative). The qualitative techniques are based both on analytical estimation processes and on the ability of safety managers and engineers. In quantitative techniques, risk is treated as a quantity that can be estimated and expressed by a mathematical relation, with the help of real accident data recorded at a work site. The hybrid techniques present great complexity due to their ad hoc character, which prevents their wide adoption. The statistical analysis shows that the quantitative methods present the highest relative frequency (65.63%) and the qualitative methods a lower one (27.68%), while the hybrid methods remain constantly at a very low level (6.70%) throughout the period studied.

371 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present an inventory of major process industry accidents involving "domino effect", which includes, among other relevant information, the sequence of accidents that had occurred in each domino episode.
Abstract: The paper presents an inventory, perhaps the most comprehensive to date, of the major process-industry accidents involving the 'domino effect'. The inventory includes, among other relevant information, the sequence of accidents that occurred in each domino episode. The information has been analyzed to identify several patterns which may be useful in further work on understanding the domino effect and reducing the probability of its occurrence in the future. A concept of 'local domino effect' has been introduced.

196 citations


Journal ArticleDOI
TL;DR: In this paper, a simulation of heavy gas dispersion in the presence of obstacles is presented, and the results show that the realizable k–ε model was the most apt and enabled the closest prediction of the actual findings in terms of spatial and temporal concentration profiles.
Abstract: Quantification of spatial and temporal concentration profiles of vapor clouds resulting from accidental loss of containment of toxic and/or flammable substances is of great importance, as correct prediction of these profiles can not only help in designing mitigation/prevention equipment such as gas detection alarms and shutdown procedures, but also help decide on modifications that may prevent any escalation of the event. The most commonly used models – SLAB (Ermak, 1990), HEGADAS (Colenbrander, 1980), DEGADIS (Spicer & Havens, 1989), HGSYSTEM (Witlox & McFarlane, 1994), PHAST (DNV, 2007), ALOHA (EPA & NOAA, 2007), SCIPUFF (Sykes, Parker, Henn, & Chowdhury, 2007), TRACE (SAFER Systems, 2009), etc. – for simulation of dense gas dispersion consider the dispersion over a flat featureless plain and are unable to consider the effect of obstacles in the path of the dispersing medium. In this context, computational fluid dynamics (CFD) has been recognized as a potent tool for realistic estimation of the consequences of accidental loss of containment, because of its ability to take into account the effect of complex terrain and obstacles present in the path of the dispersing fluid. The key to a successful application of CFD in dispersion simulation lies in the accuracy with which the effect of turbulence generated by the presence of obstacles is assessed. Hence a correct choice of the most appropriate turbulence model is crucial to a successful implementation of CFD in the modeling and simulation of dispersion of toxic and/or flammable substances. In this paper an attempt has been made to employ CFD in the assessment of heavy gas dispersion in the presence of obstacles. For this purpose several turbulence models were studied by simulating the experiments conducted earlier by the Health and Safety Executive (HSE), U.K., at Thorney Island, U.K. (Lees, 2005).
From the various experiments done at that time, the findings of Trial 26 have been used to determine which turbulence model enables the best fit of the CFD simulation to the actual findings. It is found that the realizable k–ε model was the most apt and enabled the closest prediction of the actual findings in terms of spatial and temporal concentration profiles. It was also able to capture the phenomenon of gravity slumping associated with dense gas dispersion.

156 citations


Journal ArticleDOI
TL;DR: In this article, the reliability of safety instrumented systems (SISs) is quantified by the probability of failure on demand (PFD) and the frequency of entering a hazardous state that will lead to an accident if the situation is not controlled by additional barriers.
Abstract: Safety instrumented systems (SISs) are commonly used in the process industry to respond to hazardous events. In line with the important standard IEC 61508, SISs are generally classified into two types: low-demand systems and high-demand systems. This article explores this classification by studying SIS reliability for varying demand rates, demand durations, and test intervals. The approach is based on Markov models and is exemplified by two simple system configurations. The SIS reliability is quantified by the probability of failure on demand (PFD) and the frequency of entering a hazardous state that will lead to an accident if the situation is not controlled by additional barriers. The article concludes that very low-demand systems are similar and may be treated as a group. The same applies to very high-demand systems. Between these groups, there is a rather long interval where the demand rate is neither high nor low. These medium-demand systems need specific treatment. The article shows that the frequency of entering a hazardous state increases with the demand rate for low-demand systems, while it is nearly independent of both the demand rate and the demand duration for high-demand systems. The PFD is an adequate measure of SIS reliability for low-demand systems, but may be confusing and difficult to interpret for high-demand systems.

76 citations
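For low-demand systems of the kind studied above, the PFD and the hazardous-event frequency admit a well-known closed-form approximation; a minimal sketch of the standard IEC 61508-style 1oo1 formula (not the paper's Markov models, and with assumed parameter values):

```python
# Sketch: low-demand 1oo1 SIS approximations (illustrative numbers).

def pfd_avg_1oo1(lambda_du: float, tau: float) -> float:
    """Average probability of failure on demand for a single channel
    with dangerous-undetected failure rate lambda_du (per hour) and
    proof-test interval tau (hours): PFD_avg ~= lambda_du * tau / 2."""
    return lambda_du * tau / 2.0

def hazard_frequency(demand_rate: float, pfd: float) -> float:
    """Frequency of entering a hazardous state for a low-demand SIS:
    demands per hour times the chance the SIS is unavailable."""
    return demand_rate * pfd

lam_du = 1e-6          # dangerous undetected failures per hour (assumed)
tau = 8760.0           # yearly proof test
demand = 0.1 / 8760.0  # one demand every ten years (assumed)

pfd = pfd_avg_1oo1(lam_du, tau)
print(pfd)                            # ~4.4e-3
print(hazard_frequency(demand, pfd))  # ~5e-8 per hour
```

For high- and medium-demand systems this simple product breaks down, which is exactly the regime the paper addresses with Markov models.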


Journal ArticleDOI
TL;DR: In this paper, the authors examined the relationship among three latent variables: safety leadership, safety climate, and safety performance, and found that safety climate mediated the relationship between safety leadership and performance.
Abstract: This study examines the relationship among three latent variables: safety leadership, safety climate, and safety performance. Employees from 23 plants in seven departments of a petrochemical company in central Taiwan completed a questionnaire survey. From this, a sample of 521 responses was randomly selected. Structural equation modeling (SEM) analysis using the AMOS 5.0 was employed to test the hypothesized model relating the above-mentioned variables. The results indicate that the model was supported, and that safety climate mediated the relationship between safety leadership and performance. Practical implications of these results for process safety management in the petrochemical industries are discussed.

76 citations


Journal ArticleDOI
TL;DR: In this article, the authors propose, in a systematic way, a set of short generic event trees for the main loss-of-containment scenarios involving different types of hazardous materials; most are taken from the literature (BEVI Reference Manual) and modified, intermediate probabilities (immediate ignition, delayed ignition, flame-front acceleration, etc.) obtained from a literature review and expert judgment are added, and the use of each event tree is associated with the hazardous properties of the material (flammability, volatility and toxicity) and with its category according to EC labeling directives.
Abstract: To simplify quantitative risk analysis, the initiating events leading to loss of containment are normally described using generic hypotheses. For example, the following hypotheses are applied to the loss of containment from a storage tank: instantaneous release of the complete inventory; continuous release of the complete inventory in 10 min; and continuous release from a hole with a diameter of 10 mm. Once the initiating events have been specified, the corresponding event trees must be drawn to establish the sequences from each initiating event to the diverse final outcomes or accident scenarios, which will depend on the properties of the released material or on other specific factors. In this paper we propose, in a systematic way, a set of short generic event trees for the main loss-of-containment scenarios involving different types of hazardous materials. Even though most of them have been taken from the literature (BEVI Reference Manual), we have modified some of them, added the corresponding intermediate probabilities (immediate ignition, delayed ignition, flame-front acceleration, etc.) obtained from a literature review and expert judgment, and associated the use of each event tree with the hazardous properties of the material (flammability, volatility and toxicity) and with its category according to EC labeling directives.

73 citations
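A generic event tree of the kind proposed reduces to multiplying branch probabilities along each path; a sketch with placeholder probabilities (not the BEVI values):

```python
# Sketch of a generic event tree for a continuous flammable release;
# branch probabilities here are illustrative placeholders.

def event_tree_outcomes(freq, p_imm, p_del, p_acc):
    """freq: initiating-event frequency (per year);
    p_imm: immediate ignition; p_del: delayed ignition given no
    immediate ignition; p_acc: flame-front acceleration (explosion)
    given delayed ignition."""
    return {
        "jet fire":        freq * p_imm,
        "explosion":       freq * (1 - p_imm) * p_del * p_acc,
        "flash fire":      freq * (1 - p_imm) * p_del * (1 - p_acc),
        "safe dispersion": freq * (1 - p_imm) * (1 - p_del),
    }

out = event_tree_outcomes(1e-4, 0.065, 0.2, 0.4)
print(out)
```

Because the branches are exhaustive and mutually exclusive, the outcome frequencies always sum back to the initiating-event frequency, which is a useful sanity check when assembling such trees.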


Journal ArticleDOI
TL;DR: In this article, the authors present a systematic framework toward the development of a Transportation Model for Hazardous Materials (HazMat), in which the objective is to minimize transportation cost while reducing risks at the desired levels.
Abstract: This paper presents a systematic framework toward the development of a Transportation Model for Hazardous Materials (HazMat). In practice, the proposed modeling framework is realized through an appropriate generalization of the traditional transportation network problem in the presence of safety constraints that need to be satisfied. The objective is to minimize transportation cost while keeping risks at the desired levels. In particular, the present research study identifies and evaluates different risk factors that influence the HazMat transportation network. Next, the transportation model is depicted graphically using nodes and arcs, and optimal conditions are identified by solving the associated minimum cost flow network problem. The results show safety levels that help in making informed decisions on choosing the optimal transportation configuration for hazardous material shipments. Within the proposed methodological context, appropriately parameterized simulation studies elucidate the effects of occurrence probabilities of the different risk events on transportation cost. Furthermore, as the appropriate management decisions must consider the effect of actions in one time period on future periods, the proposed model is structured as a multi-period model. Finally, the proposed methodological approach is employed to demonstrate the utility of proper analytical tools in decision making, and particularly in ensuring that scientifically informed safety procedures are in place while transporting goods that can potentially prove dangerous to the public and the surroundings.

70 citations
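The cost-versus-risk trade-off in such a HazMat network can be illustrated by brute-force path enumeration on a toy graph; the arc costs and risks below are invented for illustration (a real model would solve the constrained min-cost flow problem, not enumerate paths):

```python
# Sketch: choosing a HazMat route that minimizes transport cost
# subject to a cap on cumulative route risk (toy network).

def all_simple_paths(graph, src, dst, path=None):
    path = path or [src]
    if src == dst:
        yield path
        return
    for nxt in graph.get(src, {}):
        if nxt not in path:
            yield from all_simple_paths(graph, nxt, dst, path + [nxt])

def best_route(graph, src, dst, risk_cap):
    """graph[u][v] = (cost, risk). Returns (cost, risk, path) of the
    cheapest path whose summed risk stays within risk_cap."""
    best = None
    for p in all_simple_paths(graph, src, dst):
        cost = sum(graph[u][v][0] for u, v in zip(p, p[1:]))
        risk = sum(graph[u][v][1] for u, v in zip(p, p[1:]))
        if risk <= risk_cap and (best is None or cost < best[0]):
            best = (cost, risk, p)
    return best

net = {                       # (cost, risk) per arc -- illustrative
    "depot": {"cityA": (4, 5), "bypass": (6, 1)},
    "cityA": {"plant": (3, 4)},
    "bypass": {"plant": (5, 2)},
}
print(best_route(net, "depot", "plant", risk_cap=4))
# -> (11, 3, ['depot', 'bypass', 'plant'])  -- the cap forces the bypass
```

Relaxing the cap (e.g. risk_cap=10) makes the cheaper but riskier route through cityA optimal, which is exactly the cost/safety tension the framework formalizes.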


Journal ArticleDOI
TL;DR: The present work is an attempt to develop a comprehensive open-source database to assist past accident analysis, named PUPAD (Pondicherry University Process-industry Accident Database), which does not aim to replace or substitute well-established databases such as MHIDAS and MARS but, rather, aims to complement them.
Abstract: Past accident analysis (PAA) is one of the most potent and oft-used exercises for gaining insights into the reasons why accidents occur in the chemical process industry (CPI) and the damage they cause. PAA provides invaluable 'wisdom of hindsight' with which strategies to prevent accidents, or to cushion the impact of inevitable accidents, can be developed. A number of databases maintain records of past accidents in the CPI. The most comprehensive of the existing databases include Major Hazard Incident Data Service (MHIDAS), Major Accident Reporting System (MARS), and Failure and Accidents Technical Information Systems (FACTS). But each of these databases has some limitations. For example, MHIDAS can be accessed only after paying a substantial fee. Moreover, as detailed in the paper, it is not infallible and contains some inaccuracies. Other databases, besides having similar problems, are seldom confined to accidents in chemical process industries but also cover accidents from other domains such as nuclear power plants, the construction industry, and natural disasters. This makes them difficult to use for PAA relating to the CPI. Operational injuries not related to loss of containment are also often included. Moreover, the detailing of events does not follow a consistent pattern or classification; a good deal of relevant information is either missing or is misclassified. The present work is an attempt to develop a comprehensive open-source database to assist PAA. To this end, information on about 8000 accidents, available in different open-source clearing houses, has been brought into a new database named PUPAD (Pondicherry University Process-industry Accident Database). Multiple and overlapping accident records have been carefully eliminated and a search engine has been developed for retrieval of the records on the basis of appropriate classification.
PUPAD does not aim to replace or substitute well-established databases such as MHIDAS and MARS but, rather, aims to complement them.

67 citations


Journal ArticleDOI
TL;DR: In this paper, the authors analyse the explosion behaviour of heterogeneous/homogeneous fuel–air (hybrid) mixtures, compare it to the explosion features of heterogeneous fuel–air and homogeneous fuel–air mixtures separately, and show that the increase in explosion severity of dust/gas–air mixtures has to be attributed to the role of the initial level of turbulence prior to ignition.
Abstract: The explosion behaviour of heterogeneous/homogeneous fuel–air (hybrid) mixtures is here analysed and compared to the explosion features of heterogeneous fuel–air and homogeneous fuel–air mixtures separately. Experiments are performed to measure the pressure history, deflagration index and flammability limits of nicotinic acid/acetone–air mixtures in a standard 20 L Siwek bomb adapted to vapour–air mixtures. Literature data are also used for comparison. The explosion tests performed on gas–air mixtures under the same conditions as the explosion tests of dust–air mixtures show that the increase in explosion severity of dust/gas–air mixtures has to be attributed to the role of the initial level of turbulence prior to ignition. At a fixed value of the equivalence ratio, substituting dust for the flammable gas in a dust/gas–air mixture decreases the explosion severity. Furthermore, the most severe conditions of dust–gas/air mixtures are found during explosion of the gas–air mixture at stoichiometric concentration.

67 citations
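The deflagration index measured in such 20 L sphere tests is conventionally obtained from the maximum rate of pressure rise via the cube-root law; a minimal helper (the measured rate below is an assumed example value, not from the paper):

```python
# Cube-root law for the deflagration index from closed-vessel tests.

def deflagration_index(dp_dt_max_bar_s: float, volume_m3: float = 0.020) -> float:
    """K_St = (dP/dt)_max * V**(1/3), in bar*m/s, for a vessel of
    volume V (default: the 20 L = 0.020 m^3 Siwek sphere)."""
    return dp_dt_max_bar_s * volume_m3 ** (1.0 / 3.0)

# e.g. an assumed measured maximum rate of pressure rise of 400 bar/s:
print(round(deflagration_index(400.0), 1))  # 108.6
```

Because K_St is volume-normalized, it is the quantity that lets severities measured in the 20 L bomb be compared across apparatus and mixtures.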


Journal ArticleDOI
TL;DR: In this article, a method is described for evaluating the effectiveness of learning, based on the idea of the "level of learning" of the lessons learned, expressed in terms of how broadly the lesson learned is applied geographically, how much organizational learning is involved and how long-lasting the effect of learning is.
Abstract: Learning from incidents is considered a very important source for learning and improving safety in the process industries. However, the effectiveness of learning from reported incidents can often be questioned. Therefore, there is a need to be able to evaluate the effectiveness of learning from incidents, and for that purpose we need methods and tools. In this paper, a method is described for evaluating the effectiveness of learning, based on the idea of "level of learning" of the lessons learned. The level of learning is expressed in terms of how broadly the lesson learned is applied geographically, how much organizational learning is involved and how long-lasting the effect of learning is. In the 6-step method, the incidents reported in a typical incident learning system are evaluated both for the actual and the potential level of learning in a semi-quantitative way with different tools. The method was applied in six process industries on a large number of incidents. It was found to be very useful and to give insights into aspects that influence learning from incidents.

64 citations
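The three dimensions of the "level of learning" could be combined into a semi-quantitative score along the following lines; the scales and the additive combination are hypothetical illustrations, not the authors' actual 6-step tool:

```python
# Hypothetical semi-quantitative scoring over the three dimensions
# named in the paper (geographic breadth, organizational depth,
# persistence of the effect); scales and weights are invented.

GEOGRAPHIC = {"single site": 1, "whole plant": 2, "company-wide": 3}
ORGANIZATIONAL = {"individual": 1, "work group": 2, "organization": 3}
PERSISTENCE = {"one-off fix": 1, "procedure change": 2, "design change": 3}

def level_of_learning(geo: str, org: str, persist: str) -> int:
    return GEOGRAPHIC[geo] + ORGANIZATIONAL[org] + PERSISTENCE[persist]

# comparing the actual level with the potential level of an incident:
actual = level_of_learning("single site", "individual", "one-off fix")
potential = level_of_learning("company-wide", "organization", "design change")
print(actual, potential)  # the gap indicates unexploited learning
```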


Journal ArticleDOI
TL;DR: The derailment of a freight train carrying 14 LPG (Liquefied Petroleum Gas) tank-cars near Viareggio, Italy, caused a massive LPG release and a flash fire that resulted in 31 fatalities and extensive damage to residential buildings around the railway line.
Abstract: On June 29th, 2009, the derailment of a freight train carrying 14 LPG (Liquefied Petroleum Gas) tank-cars near Viareggio, in Italy, caused a massive LPG release. A gas cloud formed and ignited, triggering a flash fire that resulted in 31 fatalities and in extensive damage to residential buildings around the railway line. The vulnerability of the area impacted by the flash fire emerged as the main factor in determining the severity of the final consequences. Important lessons learnt from the accident concern the need for specific regulations and the possible implementation of safety devices for tank-cars carrying LPG and other liquefied gases under pressure. Integrated tools for consequence assessment of heavy gas releases in urban areas may contribute to robust decision making for mitigation and emergency planning.

Journal ArticleDOI
Niansheng Kuai, Jianming Li, Zhi Chen, Weixing Huang, Jingjie Yuan, Wenqing Xu
TL;DR: In this article, an experimental investigation was carried out on magnesium dust explosions using the Siwek 20-L vessel and influences of dust concentration, particle size, ignition energy, initial pressure and added inertant were taken into account.
Abstract: An experimental investigation was carried out on magnesium dust explosions. Tests of explosion severity, flammability limit and solid inerting were conducted in the Siwek 20 L vessel, and the influences of dust concentration, particle size, ignition energy, initial pressure and added inertant were taken into account. That magnesium dust is more of an explosion hazard than coal dust is confirmed and quantified by comparative investigation. The Chinese procedure GB/T 16425 is overly conservative for LEL determination, while EN 14034-3 yields realistic LEL data. It is also suggested that 2000–5000 J is the most appropriate ignition energy to use in the LEL determination of magnesium dusts in the 20 L vessel. It is essential to point out that the overdriving phenomenon that usually occurs for carbonaceous and less volatile metal materials is not notable for magnesium dusts. Trends of faster burning velocity and more efficient and adiabatic flame propagation are associated with fuel-rich dust clouds, smaller particles and hyperbaric conditions. Moreover, the inerting effectiveness of CaCO3 appears to be higher than that of KCl thermodynamically, whereas KCl shows higher effectiveness kinetically. Finer inertant shows better inerting effectiveness.

Journal ArticleDOI
TL;DR: Wang et al. propose an equipment maintenance and safety integrity management system (MSI), which integrates ERP, MES, RBI, RCM, SIL and PMIS.
Abstract: Equipment management in the process industry in China still essentially follows the traditional breakdown-maintenance pattern, and basic inspection/maintenance decision-making support is weak. Equipment inspection/maintenance tasks are mainly based on empirical or qualitative methods, and critical equipment is not identified and classified, so maintenance resources cannot be reasonably allocated. The reliability, availability and safety of equipment are difficult to control and guarantee due to existing maintenance deficiencies, maintenance surplus, potential dangers and possible accidents. In order to ensure stable production and reduce operating cost, an equipment maintenance and safety integrity management system (MSI) is established in this paper, which integrates ERP, MES, RBI, RCM, SIL and PMIS. MSI can provide dynamic risk-ranking data, predictive maintenance data and RAM decision-making data, through which personnel at all levels can grasp the risk state of equipment timely and accurately and optimize maintenance schedules to support decision-making. The results of an engineering case show that the system can improve reliability, availability and safety, lower failure frequency, decrease failure consequences and make full use of maintenance resources, thus achieving reasonable and positive results.

Journal ArticleDOI
TL;DR: In this article, the authors used AutoReaGas, a finite element computational fluid dynamics (CFD) code suitable for gas explosions and blast problems, to carry out the numerical simulation for the explosion processes of a methane-air mixture in the gallery or duct at various scales.
Abstract: Explosion experiments using premixed gas in a duct have become a significant method of investigating methane–air explosions in underground coal mines. The duct sizes are far smaller than those of an actual mine gallery. Whether the experimental results obtained in a duct can be used to analyze a methane–air explosion in a practical mine gallery remained to be investigated. This issue involves the effects of scale on a gas explosion and its shockwave in a constrained space. The commercial software package AutoReaGas, a finite element computational fluid dynamics (CFD) code suitable for gas explosions and blast problems, was used to carry out the numerical simulation of the explosion processes of a methane–air mixture in a gallery (or duct) at various scales. Based on the numerical simulation and its analysis, the effect of scale on the degree of correlation with the real situation was studied for a methane–air explosion and its shockwave in a square-section gallery (or duct). This study shows that the explosion process of the methane–air mixture depends on the scale of the gallery or duct. The effect of scale decreases gradually with distance from the space containing the methane–air mixture, and the air shock wave propagation conforms approximately to the geometric similarity law in the far field, where the scaled distance (ratio of the propagation distance to the height (or width) of the gallery section) is over 80.

Journal ArticleDOI
TL;DR: In this paper, a CFD-based method is proposed on the basis of the author's finding that, among the various models available for assessing turbulence, the realizable k–ε model yields results closer to experimental findings than the other, more frequently used, turbulence models if used in conjunction with the eddy-dissipation model.
Abstract: The effectiveness of the application of CFD to vapour cloud explosion (VCE) modelling depends on the accuracy with which geometrical details of the obstacles likely to be encountered by the vapour cloud are represented and the correctness with which turbulence is predicted. This is because the severity of a VCE strongly depends on the types of obstacles encountered by the cloud undergoing combustion; the turbulence generated by the obstacles influences flame speed and feeds the process of explosion through enhanced mixing of fuel and oxidant. In this paper a CFD-based method is proposed on the basis of the author's finding that, among the various models available for assessing turbulence, the realizable k–ε model yields results closer to experimental findings than the other, more frequently used, turbulence models if used in conjunction with the eddy-dissipation model. The applicability of the method has been demonstrated by simulating the dispersion and ignition of a typical vapour cloud formed as a result of a spill from a liquefied petroleum gas (LPG) tank situated in a refinery. The simulation made it possible to assess the overpressures resulting from the combustion of the flammable vapour cloud. The phenomenon of flame acceleration, which is characteristic of combustion enhanced in the presence of obstacles, was clearly observed. Comparison of the results with an oft-used commercial software package reveals that the present CFD-based method achieves a more realistic simulation of the VCE phenomena.

Journal ArticleDOI
TL;DR: Results of simulation tests and field experiments show that it is possible to distinguish weak non-stationarities from complicated noise by harmonic wavelet analysis in a pipeline small-leak detection system.
Abstract: In long-distance pipeline remote monitoring systems, small-leak detection is an important issue. Weak singularities in small-leak signals are usually difficult to detect precisely against a complicated noise background, which may cause false alarms or missed alarms. The advantage of applying the harmonic wavelet method is explored in this paper. Pipeline small-leak-sensitive characteristics are recognized and the negative pressure wave inflexions are extracted by harmonic wavelet analysis, expressed in terms of the harmonic wavelet time-frequency mesh map, time-frequency contour map, and time-frequency profile plot. This paper also presents a comparative study of Daubechies wavelet and harmonic wavelet analysis applied to pipeline small-leak detection under complicated background noise. Results of simulation tests and field experiments show that it is possible to distinguish weak non-stationarities from complicated noise by harmonic wavelet analysis in a pipeline small-leak detection system. The comparison clearly illustrates that the harmonic-wavelet-based pipeline small-leak detection method is significantly more accurate than other wavelet analyses such as the Daubechies wavelet. This work provides a reliable and safe guarantee for long-distance oil and gas transportation, reducing petroleum product losses and protecting the surrounding environment.
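Harmonic wavelet components are, by construction, frequency bands of the DFT (Newland's construction), which is what makes them attractive for isolating a leak-sensitive band; a stdlib-only sketch with a naive O(N²) DFT and an illustrative two-tone signal (not the paper's pipeline data):

```python
# Sketch of harmonic-wavelet band extraction: a harmonic wavelet
# component is exactly a frequency band of the DFT, so the band of
# interest can be isolated by zeroing all other bins.
import cmath
import math

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def harmonic_band(x, lo, hi):
    """Keep DFT bins lo..hi-1 (plus their conjugate mirrors) -- the
    harmonic-wavelet component for that band."""
    X = dft(x)
    N = len(X)
    Y = [0j] * N
    for k in range(lo, hi):
        Y[k] = X[k]
        Y[(N - k) % N] = X[(N - k) % N]
    return [v.real for v in idft(Y)]

N = 64
sig = [math.sin(2 * math.pi * 3 * n / N) + 0.5 * math.sin(2 * math.pi * 20 * n / N)
       for n in range(N)]
low = harmonic_band(sig, 2, 5)   # isolates the 3-cycle tone
err = max(abs(a - b) for a, b in
          zip(low, [math.sin(2 * math.pi * 3 * n / N) for n in range(N)]))
print(err)  # ~0: the 3-cycle component is recovered
```

In practice an FFT replaces the naive DFT, and the band limits follow the octave structure of the harmonic wavelet transform.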

Journal ArticleDOI
TL;DR: This paper applies social network analysis, an analytical tool used by social scientists to study human interactions, to analyze characteristics of the critical infrastructure network, and identifies Oil & Gas, Information & Communication Technologies, and Electricity as the three infrastructures most relied upon by other infrastructures.
Abstract: As a typical process industry, the Oil & Gas industries play a key role within a networked critical infrastructure system in terms of their interconnection and interdependency. While the tight coupling of infrastructures increases the efficiency of infrastructure operations, interdependency between infrastructures may cause cascading failure of infrastructures. The interdependency between critical infrastructures gives rise to an infrastructure network. In this paper, we apply social network analysis, an analytical tool used by social scientists to study human interactions, to analyze characteristics of the critical infrastructure network. We identify Oil & Gas, Information & Communication Technologies (ICT), and Electricity as the three infrastructures most relied upon by other infrastructures; these may thus cause the greatest cascading failures. Among the three, we further determine that Oil & Gas and Electricity are the more vulnerable infrastructures. As a result, priority in critical infrastructure protection should be given to the Oil & Gas and Electricity infrastructures, since they are most relied upon but at the same time depend more on other infrastructures.
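The "most relied upon" ranking in the abstract corresponds to simple degree counts on a directed dependency graph; a sketch with an invented edge list (not the paper's data):

```python
# Sketch: ranking infrastructures by how much others depend on them.
# Edge (u, v) means "u depends on v"; the edges are illustrative.
from collections import Counter

depends_on = [
    ("Water", "Electricity"), ("Transport", "OilGas"),
    ("Electricity", "OilGas"), ("Banking", "ICT"),
    ("ICT", "Electricity"), ("OilGas", "ICT"),
    ("Health", "Electricity"), ("Health", "ICT"),
    ("Transport", "Electricity"), ("OilGas", "Electricity"),
]

relied_upon = Counter(v for _, v in depends_on)   # in-degree: criticality
dependent   = Counter(u for u, _ in depends_on)   # out-degree: exposure

print(relied_upon.most_common(3))
# -> [('Electricity', 5), ('ICT', 3), ('OilGas', 2)]
# a critical node that itself depends on others is the vulnerable case:
print({n: dependent[n] for n, _ in relied_upon.most_common(3)})
```

Degree counts are the simplest of the social-network-analysis measures; centrality variants (betweenness, eigenvector) refine the same idea.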

Journal ArticleDOI
TL;DR: In this paper, the thermal stability of organic peroxides (cumene hydroperoxide 80% and dicumyl peroxide) was studied by means of calorimetric measurement (DSC, TA Q1000) in an isothermal mode and a dynamic mode.
Abstract: The thermal stability of organic peroxides (cumene hydroperoxide 80 wt% and dicumyl peroxide) was studied by means of calorimetric measurement (DSC, TA Q1000) in an isothermal mode and a dynamic mode. Analysis of the power profiles released in the isothermal mode was combined with analysis of the decomposition products by gas chromatography/mass spectrometry (GC/MS) to determine the reaction mechanisms corresponding to each of the two reactions. In this work, a methodology for estimating kinetic parameters was based on comparing the power profile (dynamic mode) given by the model to that obtained experimentally, while varying the parameter values. Parameter estimation is achieved using a mixed estimation method in which a genetic algorithm is combined with a locally convergent method.
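The estimation idea, matching a modeled power profile to the measured one while varying parameters, can be illustrated with a first-order isothermal model and a plain grid search standing in for the genetic algorithm; all numbers are made up for illustration:

```python
# Toy version of power-profile fitting: simulate a first-order
# isothermal decomposition heat release and recover the rate constant
# by sweeping candidate values (grid search stands in for the
# genetic + locally convergent method of the paper).
import math

def power_profile(k, dH=500.0, t_end=100.0, n=200):
    """Isothermal first-order heat release q(t) = dH * k * exp(-k t)."""
    dt = t_end / n
    return [dH * k * math.exp(-k * i * dt) for i in range(n)]

measured = power_profile(k=0.05)      # pretend this came from the DSC

best_k, best_err = None, float("inf")
for k in [i / 1000.0 for i in range(1, 200)]:
    model = power_profile(k)
    err = sum((m - q) ** 2 for m, q in zip(model, measured))
    if err < best_err:
        best_k, best_err = k, err
print(best_k)  # 0.05 (the planted value)
```

A genetic algorithm does the same misfit minimization but searches the parameter space stochastically, which matters when several parameters (frequency factor, activation energy, reaction order) are fitted at once.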

Journal ArticleDOI
TL;DR: It is argued that for potential terrorist attacks involving hazardous chemical releases, observation points should be placed around the main line of the downwind direction when the source is known, while a uniform distribution of observation points is an efficient solution for unknown incidents.
Abstract: Source determination is vital in decision making and emergency planning involving hazardous chemical releases. This work concentrates on inverse calculation approaches for source determination, as well as current trends and future perspectives. In this paper, these approaches are reviewed by dividing them into two categories: probability modeling methods and optimization modeling methods. The traits of these approaches are comparatively analyzed. It is then shown how these approaches behave when applied to practical cases, and their feasibility, applicability, stability, and limitations in determining the source location and strength are presented. It is argued that for potential terrorist attacks involving hazardous chemical releases, observation points should be placed around the main line of the downwind direction when the source is known, while a uniform distribution of observation points is an efficient solution for unknown incidents. Probability modeling methods are shown to be insufficient during emergency responses due to their lack of sufficient prior information on the unknown parameters, while optimization modeling methods are efficient and are becoming a new trend in source determination. The findings reflect an urgent need for the development of high-accuracy detectors and further research on data transmission techniques in order to ensure the validity of these approaches.
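The optimization-modeling route can be sketched as a least-squares grid search over candidate source locations and strengths; the exponential "dispersion kernel" below is a deliberately toy stand-in for a real plume model, and all sensor positions and readings are invented:

```python
# Sketch: inverse source determination by minimizing squared misfit
# between modeled and observed concentrations at a few receptors.
import math

def conc(q, sx, sy, rx, ry):
    """Toy dispersion kernel: strength q decayed with distance."""
    d = math.hypot(rx - sx, ry - sy)
    return q * math.exp(-0.5 * d)

sensors = [(3, 0), (5, 1), (4, -2), (7, 0)]          # receptor positions
true_q, true_s = 10.0, (1.0, 0.0)                    # hidden ground truth
obs = [conc(true_q, *true_s, rx, ry) for rx, ry in sensors]

best = None
for qi in range(1, 21):                              # strength 1..20
    for sxi in range(0, 9):                          # x in 0..8
        for syi in range(-2, 3):                     # y in -2..2
            err = sum((conc(qi, sxi, syi, rx, ry) - o) ** 2
                      for (rx, ry), o in zip(sensors, obs))
            if best is None or err < best[0]:
                best = (err, qi, sxi, syi)
print(best[1:])  # recovers (10, 1, 0)
```

Real implementations swap the toy kernel for an atmospheric dispersion model and the grid search for a proper optimizer, but the misfit-minimization structure is the same.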

Journal ArticleDOI
TL;DR: The European Working Group on Land Use Planning (EWP) as discussed by the authors has been established and is operating under the coordination of the European Commission Joint Research Centre (JRC) to understand the different approaches and their implications to LUP decision-making, to develop guidelines in support to these decisions and to examine data sources and tools for consistent application of risk assessment in support of LUP.
Abstract: Recognising the importance of establishing appropriate separation distances between hazardous installations and vulnerable residential areas for mitigating the effects of industrial accidents, the European legislation for the control of major accident hazards – the so-called Seveso II Directive – calls for procedures ensuring that technical advice is taken systematically into account for land-use planning (LUP) purposes. Due to historical, administrative, cultural and other reasons, those European Union Member States which have consolidated procedures for addressing this issue have employed different approaches, methods and criteria, with a potential for great divergence in the resulting land-use planning decisions. In order to address this situation and to increase the consistency and 'defendability' of land-use planning decisions in the EU, a European Working Group has been established and is operating under the coordination of the European Commission's Joint Research Centre (JRC). This Group, consisting of experts from the EU Member States, industry and academia, is trying to understand the different approaches and their implications for LUP decision-making, to develop guidelines in support of these decisions, and to examine data sources and tools for consistent application of risk assessment in support of LUP. This paper presents the activities of the Group, reviews the situation with respect to LUP in Europe and discusses whether a direction towards more consistent LUP decisions is being followed in Europe.

Journal ArticleDOI
TL;DR: A project was performed for the Explosion Research Cooperative to develop algorithms for predicting the frequencies of explosions based on a variety of design, operating and environmental conditions as mentioned in this paper, which were developed for estimating unit-based explosion frequencies, such as those reported in API Recommended Practice 752.
Abstract: A project was performed for the Explosion Research Cooperative to develop algorithms for predicting the frequencies of explosions based on a variety of design, operating and environmental conditions. Algorithms were developed for estimating unit-based explosion frequencies, such as those reported in API Recommended Practice 752, but in more detail and covering a much broader range of chemical process types. The project also developed methods for predicting scenario-based explosion frequencies, using frequencies of initiating events and conditional probabilities of immediate ignition and delayed ignition resulting in explosion. The algorithms were based on a combination of published data and expert opinion.
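The scenario-based logic described above can be sketched as a simple event-tree calculation: a release frequency multiplied by the conditional probabilities along the branch that ends in an explosion. The branch structure and all numbers below are illustrative assumptions, not the Cooperative's published algorithms:

```python
def explosion_frequency(f_release, p_imm, p_del, p_expl_given_del):
    """Scenario-based explosion frequency (events/yr).

    Immediate ignition is assumed to produce a fire rather than an
    explosion, so only the delayed-ignition branch contributes here.
    """
    return f_release * (1.0 - p_imm) * p_del * p_expl_given_del

# Hypothetical scenario: 1e-3 releases/yr, 5 % immediate ignition,
# 10 % delayed ignition, 40 % of delayed ignitions yield an explosion.
f = explosion_frequency(1e-3, 0.05, 0.1, 0.4)   # -> 3.8e-5 events/yr
```

With real initiating-event frequencies and ignition probabilities substituted, the same structure yields the scenario-based estimates the abstract refers to.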

Journal ArticleDOI
TL;DR: In the past 10 years, the vapor cloud explosion at Texas City, the ammonium nitrate explosion in Toulouse, a pipeline disaster in Belgium, and three near total loss events in Norway have highlighted that major accident process safety is still a serious issue as discussed by the authors.
Abstract: In the past 10 years, the vapor cloud explosion at Texas City, the ammonium nitrate explosion in Toulouse, a pipeline disaster in Belgium, and three near total loss events in Norway have highlighted that major accident process safety is still a serious issue. Hopes that PSM or Safety Case regulations would reduce process events by 80% have not proven true. The Baker Panel, convened after Texas City, developed a series of recommendations, mainly around leadership, incentives, safety culture and more effective implementation of PSM systems. Many US-based companies are working hard to implement the Baker recommendations. In Europe, an approach built around safety barriers, especially relating to technical safety systems, is being widely adopted. The author’s company has carried out a global survey of process industry initiatives, for both upstream and downstream activities, to identify what the industry itself is planning to enhance process safety in the next 5–10 years. This paper presents a summary of some of the major programs and initiatives as they apply to traditional oil majors, newer national oil companies, and the chemical industry. These are a mixture of Baker recommendations, barrier approaches and tighter integration of process safety and asset integrity. While the factor of 10 improvement achieved in occupational safety over the past 20 years seems unattainable for process safety, a factor of 3–4 improvement in the next 20 years does seem possible. This would call for significant effort on the part of operators, but the benefits fully justify the effort.

Journal ArticleDOI
TL;DR: In this paper, the authors propose a methodology to predict gas emission rates from the size of the dispersed gas plume or cloud at the minimum detectable concentration, for different meteorological conditions.
Abstract: Recently, infrared optical imaging has been applied in the oil and gas industry as a method to detect potential leaks in pipelines, components and equipment. The EPA has suggested that this emerging technique be considered a smart-gas LDAR (leak detection and repair) method for its rapid recognition of leaks, accuracy and robustness. In addition, compared to the conventional method using a Total Vapor Analyzer (TVA) or gas sniffer, it has several other advantages, such as the ability to perform real-time scanning and remote sensing, the ability to provide area measurement instead of point measurement, and the ability to provide an image of the gas, which is not visible to the naked eye. However, there are still limitations in the application of optical imaging techniques: they do not give any measurement of the gas emission rate or the concentration of the leaking gas. Infrared cameras can recognize a target gas and distinguish the gas from its surroundings down to a certain concentration, namely the minimum detectable concentration. The value of the minimum detectable concentration depends on the camera design, environmental conditions and surface characteristics when the measurement is taken. This paper proposes a methodology to predict the gas emission rate from the size of the dispersed gas plume or cloud at the minimum detectable concentration. The gas emission rate is predicted from the downwind distance and the height of the cloud at the minimum detectable concentration for different meteorological conditions. Gas release and dispersion from leaks in natural gas pipeline systems is simulated, and the results are presented.
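The back-calculation idea can be illustrated by inverting a textbook Gaussian plume model: if the camera resolves the plume out to a known downwind distance at which the ground-level centreline concentration equals the minimum detectable concentration, the source rate follows directly. This is only a sketch of the concept; the Briggs rural class-D dispersion coefficients and all numbers below are assumptions, not the paper's dispersion model:

```python
import math

def sigma_y(x):
    """Briggs rural, stability class D, lateral spread (m). Assumed."""
    return 0.08 * x / math.sqrt(1.0 + 0.0001 * x)

def sigma_z(x):
    """Briggs rural, stability class D, vertical spread (m). Assumed."""
    return 0.06 * x / math.sqrt(1.0 + 0.0015 * x)

def emission_rate(c_min, u, x_detect):
    """Back-calculate source rate Q (kg/s) from the farthest downwind
    distance x_detect (m) at which the plume is still resolved, i.e.
    where the centreline concentration equals c_min (kg/m3), for a
    ground-level source with ground reflection and wind speed u (m/s)."""
    return c_min * math.pi * u * sigma_y(x_detect) * sigma_z(x_detect)

# Hypothetical leak: plume visible out to 50 m at 3 m/s wind.
Q = emission_rate(c_min=1e-4, u=3.0, x_detect=50.0)
```

A plume visible to a larger downwind distance at the same detection threshold implies a larger source rate, which is the monotone relationship the methodology exploits.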

Journal ArticleDOI
TL;DR: In this article, an age-dependent unavailability model that integrates the effects of the test and maintenance (T&M) activities as well as component ageing is developed and represents the basis for calculating risk.
Abstract: The improvement of safety in the process industries is related to assessment and reduction of risk in a cost-effective manner. This paper addresses the trade-off between risk and cost related to standby safety systems. An age-dependent unavailability model that integrates the effects of the test and maintenance (T&M) activities as well as component ageing is developed and represents the basis for calculating risk. The repair “same-as-new” process is considered regarding the T&M activities. Costs are expressed as a function of the selected risk measure. The time-averaged function of the selected risk measure is obtained from probabilistic safety assessment, i.e. the fault tree analysis. This function is further extended with the inclusion of additional parameters related to T&M activities as well as ageing parameters related to component ageing. In that sense, a new model of system unavailability, incorporating component ageing and T&M costs, is presented. The testing strategy is also addressed: sequential and staggered testing are compared. The developed approach is applied to a standard safety system in a nuclear power plant, although the method is applicable to standby safety systems that are tested and maintained in other industries as well. The results show that the risk-informed surveillance requirements differ from existing ones in technical specifications, which are deterministically based. Moreover, the presented approach achieves a significant reduction in system unavailability for a relatively small increase in total T&M costs.
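A minimal sketch of such an age-dependent unavailability model makes the risk/cost trade-off concrete. The three-term structure (dormant failures, test downtime, repair downtime), the linear ageing law and all parameter values below are illustrative assumptions, not the paper's exact formulation:

```python
def mean_unavailability(lam0, a, age, T, t_test, t_repair):
    """Time-averaged unavailability of a periodically tested standby
    component. Linear ageing lam(age) = lam0 + a * age (1/h); repair is
    assumed to restore the component 'same as new'. T is the test
    interval (h), t_test the test duration (h), t_repair the mean
    repair time (h). Illustrative model only."""
    lam = lam0 + a * age            # age-dependent standby failure rate
    q_dormant = lam * T / 2.0       # undetected failures between tests
    q_testing = t_test / T          # component out of service during tests
    q_repair = lam * t_repair       # expected downtime spent in repair
    return q_dormant + q_testing + q_repair

# Trade-off: frequent testing cuts the dormant-failure term but adds
# test downtime (and T&M cost); an optimum interval balances the two.
u_monthly = mean_unavailability(1e-5, 1e-9, 5e4, 730.0, 2.0, 24.0)
u_yearly = mean_unavailability(1e-5, 1e-9, 5e4, 8760.0, 2.0, 24.0)
```

Scanning `T` over a range, and weighting each term by its T&M cost, reproduces the kind of risk-versus-cost curve the paper optimises.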

Journal ArticleDOI
TL;DR: This is the first study that introduces an integrated ANN algorithm for assessment and improvement of human job satisfaction with respect to HSEE program in complex systems.
Abstract: Researchers have been continuously trying to improve human performance with respect to Health, Safety and Environment (HSE) and ergonomics (hence HSEE). This study proposes an adaptive neural network (ANN) algorithm for measuring and improving job satisfaction among operators with respect to HSEE in a gas refinery. To achieve the objectives of this study, standard questionnaires with respect to HSEE are completed by operators. The average results for each category of HSEE are used as inputs and job satisfaction is used as the output for the ANN algorithm. Moreover, the ANN is used to rank operators’ performance with respect to HSEE and job satisfaction. Finally, a normal probability technique is used to identify outlier operators, and operators with inadequate job satisfaction with respect to HSEE are identified. This would help managers to see whether operators are satisfied with their jobs in the context of HSEE. This is the first study that introduces an integrated ANN algorithm for assessment and improvement of human job satisfaction with respect to an HSEE program in complex systems.
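The input/output mapping described above (mean HSEE category scores in, job satisfaction out) can be sketched with a minimal one-hidden-layer network trained by gradient descent. The data here are synthetic stand-ins and the network is far simpler than the study's adaptive algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: rows = operators, columns = hypothetical mean
# questionnaire scores for Health, Safety, Environment, Ergonomics.
X = rng.uniform(1.0, 5.0, size=(200, 4))
y = (0.2 * X[:, 0] + 0.4 * X[:, 1] + 0.1 * X[:, 2] + 0.3 * X[:, 3]
     + rng.normal(0.0, 0.1, 200))      # assumed job-satisfaction score

# One hidden layer of 8 tanh units, trained with plain gradient descent.
W1 = rng.normal(0.0, 0.1, (4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.1, (8, 1)); b2 = np.zeros(1)
lr, n = 0.01, len(y)
mse0 = None
for step in range(2000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    err = (H @ W2 + b2).ravel() - y    # prediction error
    if step == 0:
        mse0 = float(np.mean(err ** 2))
    gH = (err[:, None] @ W2.T) * (1.0 - H ** 2)   # backprop through tanh
    W2 -= lr * (H.T @ err[:, None]) / n
    b2 -= lr * err.mean(keepdims=True)
    W1 -= lr * (X.T @ gH) / n
    b1 -= lr * gH.mean(axis=0)
mse = float(np.mean(err ** 2))
```

Ranking operators then amounts to sorting them by the fitted network's predicted satisfaction; flagging outliers (as the study does with a normal probability technique) can be done on the residuals.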

Journal ArticleDOI
TL;DR: In this article, the authors evaluated the earthquake performance of Turkish industrial facilities, especially storage tanks, in terms of earthquake resistance, and the vulnerability of storage tanks in Turkey was determined and the probabilistic risk was defined with the results of the analysis.
Abstract: In 1999, two earthquakes in northwest Turkey caused heavy damage to a large number of industrial facilities. This region is the most industrialized in the country, and heavy damage there has a significant economic impact. Industrial storage tanks ruptured by earthquakes exacerbate the damage through the spread of fire. Storage tanks are uniquely structured, tall cylindrical vessels, some supported by relatively short reinforced concrete columns. The main aim of this study is to evaluate the earthquake performance of Turkish industrial facilities, especially storage tanks, in terms of earthquake resistance. Modeling a typical storage tank of an industrial facility helps in understanding the structure’s seismic response. A typical tank structure was modelled as a solid with lumped mass and spring systems. Performance was estimated using 40 different earthquake records through nonlinear time-history analyses. After the time-history analyses, fragility analyses produced a probabilistic seismic assessment for the tank model. The analysis results for the model structure were evaluated and compared. In this study, the vulnerability of storage tanks in Turkey was determined and the probabilistic risk was defined from the results of the analysis.
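Fragility analyses of the kind mentioned above typically condense the time-history results into a lognormal curve giving the probability of exceeding a damage state as a function of ground-motion intensity. A minimal sketch follows; the median capacity and dispersion values are illustrative, not the paper's fitted parameters:

```python
import math

def fragility(pga, median=0.45, beta=0.5):
    """Lognormal fragility curve: probability that a damage state is
    exceeded at a given peak ground acceleration (g).

    median: PGA (g) at which exceedance probability is 50 % (assumed).
    beta: lognormal standard deviation of capacity (assumed).
    """
    z = math.log(pga / median) / (beta * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))    # standard normal CDF of ln-ratio

# Exceedance probabilities at a few hazard levels (g):
curve = [fragility(a) for a in (0.1, 0.3, 0.45, 0.6, 0.9)]
```

In practice the median and beta are fitted to the binary exceed/not-exceed outcomes of the nonlinear time-history runs, one pair per damage state.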

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the diffusion behavior of gases in the ground, such as the diffusion range and time, in the case of leakage, and the impact on the surrounding area.
Abstract: Chemical plants and gas utilities own large underground pipelines to transport material such as combustible gas. For example, city gas utilities in Japan have about 230,000 km of pipelines, even counting only the pipelines that deliver gas to their customers. Any accident involving such pipelines can lead to enormous human and physical damage, and their security is therefore a top-priority issue for utilities. For the safe management of underground pipelines, in addition to assessing the long-term reliability of pipeline materials, it is extremely important to understand the diffusion behavior of gases in the ground in the case of leakage, such as the diffusion range and time. The impact on the surrounding area is a fundamental factor to be considered in the design and maintenance of safe facilities and in emergency response. Although many papers address gas dispersion in the atmosphere, under both indoor and outdoor conditions, only fundamental surveys have been conducted on gas diffusion in the ground, and there have been few full-scale empirical studies. This study reports the results of verifying the diffusion behavior with full-scale gas leakage experiments simulating real underground pipelines, as well as the outcome of an applicability test of a numerical simulation model proposed on the basis of those results. This technical knowledge regarding security will contribute to further improvement of safety in the industry.
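The quantities the study verifies experimentally (diffusion range and time after a leak) can be illustrated with a toy one-dimensional Fickian diffusion calculation in soil. The effective diffusivity, grid, threshold and boundary treatment below are all assumptions, far simpler than the paper's numerical simulation model:

```python
import numpy as np

# Minimal 1-D soil-diffusion sketch (Fick's second law), explicit scheme.
D = 1e-5              # effective diffusivity in soil, m2/s (assumed)
dx, dt = 0.05, 60.0   # grid spacing (m) and time step (s)
assert D * dt / dx**2 <= 0.5     # explicit-scheme stability limit

c = np.zeros(200)     # relative concentration along a line from the leak
c[0] = 1.0            # leak point held at constant concentration
for _ in range(1000): # simulate ~16.7 h of continuous leakage
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2.0 * c[1:-1] + c[:-2])
    c[0] = 1.0        # boundary: leak keeps feeding gas

# Diffusion range: distance at which concentration drops below 1 %.
extent = dx * int(np.argmax(c < 0.01))
```

Repeating the run at several times gives the range-versus-time behaviour; a full model would add soil porosity, moisture, ground-surface venting and 3-D geometry.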

Journal ArticleDOI
TL;DR: Fuzzy logic offers better insight into the hazards and safety phenomena of each explosion risk scenario; such conclusions cannot be drawn from the traditional ExLOPA calculation results.
Abstract: Safety and health of workers potentially at risk from explosive atmospheres are regulated by separate regulations (ANSI/AIHA in the USA and ATEX in the European Union). The ANSI/AIHA standard does not require risk assessment, whereas it is compulsory under ATEX. There is no standard method for carrying out that assessment. For that purpose we have applied explosion Layer of Protection Analysis (ExLOPA), which enables semi-quantitative risk assessment for process plants where explosive atmospheres occur. ExLOPA is based on the original CCPS work on LOPA, taking into account an explosion accident scenario at the workplace. That includes typical variables appropriate for a workplace explosion, such as the occurrence of the explosive atmosphere, the presence of effective ignition sources, the activity of the explosion prevention and mitigation independent protection layers, and the severity of consequences. All these variables are expressed in the form of qualitative linguistic categories, and the relations between them are represented using expert-based engineering knowledge, expressed as an appropriate set of rules. In this way the category of explosion risk may be estimated by semi-quantitative analysis. However, this simplified method carries substantial uncertainty, leading to over- or under-estimation of the explosion risk, and may not provide realistic output data. In order to overcome this problem and obtain more detailed quantitative results, a fuzzy logic system was applied. In the first stage, called fuzzification, all linguistic categories of the variables are mapped to fuzzy sets. In the second stage, the relations between all variables of the analysis are enumerated combinatorially, yielding a set of 810 “IF-THEN” fuzzy rules. Each rule enables determination of the fuzzy risk level for a particular accident scenario. In the last stage, called defuzzification, the crisp value of the final risk is obtained using a centroid method.
The final result presents the contribution of each risk category, represented by the fuzzy sets (A, TA, TNA and NA), and is therefore more precise and readable than the traditional approach, which produces only one category of risk. Fuzzy logic offers better insight into the hazards and safety phenomena of each explosion risk scenario; such conclusions cannot be drawn from the traditional ExLOPA calculation results. However, it requires computer-aided analysis, which may partially conflict with the simplicity of ExLOPA. A practical example provides a comparison between the traditional results obtained by ExLOPA and those obtained by the fuzzy ExLOPA method.
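The fuzzification, Mamdani-style rule aggregation and centroid defuzzification stages can be sketched in a few lines. The membership functions, the normalised risk scale and the rule firing strengths below are illustrative assumptions, not the paper's 810-rule base:

```python
import numpy as np

x = np.linspace(0.0, 1.0, 501)   # normalised explosion-risk scale (assumed)

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Output fuzzy sets for the four risk categories (supports are assumed):
# A = acceptable, TA = tolerable-acceptable, TNA = tolerable-not
# acceptable, NA = not acceptable.
sets = {
    "A":   tri(x, 0.00, 0.10, 0.30),
    "TA":  tri(x, 0.20, 0.40, 0.60),
    "TNA": tri(x, 0.50, 0.70, 0.90),
    "NA":  tri(x, 0.75, 0.90, 1.05),
}

# Aggregated firing strengths of the IF-THEN rules for one scenario
# (hypothetical numbers, as would come out of the rule evaluation):
strengths = {"A": 0.1, "TA": 0.6, "TNA": 0.3, "NA": 0.0}

agg = np.zeros_like(x)
for name, mu in sets.items():
    agg = np.maximum(agg, np.minimum(mu, strengths[name]))  # Mamdani clipping

# Centroid defuzzification on the uniform grid -> crisp risk value.
crisp_risk = float((agg * x).sum() / agg.sum())
```

The clipped sets retained in `agg` show each category's contribution, while `crisp_risk` is the single defuzzified value; this is the extra information the fuzzy approach offers over a one-category ExLOPA result.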

Journal ArticleDOI
TL;DR: An experimental study has been carried out in Jebel Dhanna (JD) terminal area by Abu Dhabi Company for Onshore Oil Operations (ADCO) with support of Resource Protection International (RPI) consultant as mentioned in this paper.
Abstract: Storage tanks are important facilities at major hazard installations (MHIs) for storing large quantities of crude oil. Several fire types can occur in large-diameter open-top floating-roof storage tanks. Boilover is considered one of the most dangerous fires in large-scale oil tanks, and the world has witnessed many incidents due to boilover in floating-roof storage tanks. The boilover problem has been studied in experiments and with models to understand how to control the phenomenon. An experimental study has been carried out in the Jebel Dhanna (JD) terminal area by the Abu Dhabi Company for Onshore Oil Operations (ADCO) with the support of the consultancy Resource Protection International (RPI). Pans of 2.4 m and 4.5 m diameter were used to study the characteristics of large oil-tank fires: (i) to gain more knowledge of the boilover phenomenon of crude oil, (ii) to verify whether the crude oil stored by ADCO would boil over, (iii) to estimate the rate of hot-zone growth and the period from ignition to boilover, and (iv) to estimate the radiant heat and consequences of boilover. This paper presents an overview of floating-roof storage tank boilover and briefly describes the boilover experimental research study carried out by ADCO.

Journal ArticleDOI
TL;DR: In this paper, a quantitative structure-property relationship (QSPR) model was developed from molecular structures for the prediction of standard net heat of combustion from a diverse set of 1650 organic compounds.
Abstract: A quantitative structure–property relationship (QSPR) model for predicting the standard net heat of combustion was developed from molecular structures. A diverse set of 1650 organic compounds was employed as the dataset, and a total of 1481 molecular descriptors were calculated for each compound. The novel variable-selection method of the ant colony optimization (ACO) algorithm coupled with partial least squares (PLS) was employed to select, from the large pool of calculated descriptors, the optimal subset of descriptors that contribute significantly to the standard net heat of combustion. As a result, four molecular descriptors were screened out as the input parameters, and a four-variable multi-linear model was constructed using the multi-linear regression (MLR) method. The resulting squared correlation coefficient R² of the model was 0.995 for the training set of 1322 compounds and 0.996 for the external test set of 328 compounds. The results showed that an accurate prediction model for the net heat of combustion can be obtained using the ant colony optimization method. Moreover, this study provides a new way to predict the net heat of combustion of organic compounds for engineering purposes based only on their molecular structures.
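The final fitting step (a four-variable multi-linear model with an R² check) can be sketched with ordinary least squares. The descriptor matrix below is a synthetic stand-in; in the actual work the four columns would be the descriptors selected by the ACO–PLS procedure from the pool of 1481:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 100 "compounds" x 4 selected descriptors, with a
# hypothetical linear property plus noise (not real combustion data).
X = rng.normal(size=(100, 4))
y = X @ np.array([3.0, -1.5, 0.8, 2.2]) + 5.0 + rng.normal(0.0, 0.2, 100)

# Four-variable multi-linear model fitted by ordinary least squares.
A = np.column_stack([X, np.ones(len(X))])       # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Squared correlation coefficient of the fit.
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

In a QSPR workflow the same R² statistic would be reported separately on a held-out external test set, as the abstract does for its 328 test compounds.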