
Showing papers in "Software - Practice and Experience in 1986"


Journal ArticleDOI
TL;DR: A refinement to a well‐known selection algorithm is described, which results in a useful improvement in the performance of the original algorithm, particularly when the selection index is small relative to the median.
Abstract: A refinement to a well-known selection algorithm is described. The refinement results in a useful improvement in the performance of the original algorithm, particularly when the selection index is small relative to the median.

350 citations
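
The refinement itself is not reproduced in this listing. For context, here is a minimal C sketch of the classic Hoare-style selection (quickselect) that such refinements build on; the function and variable names are chosen here for illustration only.

```c
#include <stdio.h>

/* Classic Hoare-style selection (quickselect): returns the k-th smallest
 * element (k is 0-based) of a[lo..hi].  This is the well-known family of
 * algorithms such refinements start from; the paper's refinement itself
 * is not reproduced here. */
static void swap(int *x, int *y) { int t = *x; *x = *y; *y = t; }

static int quickselect(int a[], int lo, int hi, int k)
{
    while (lo < hi) {
        int pivot = a[(lo + hi) / 2];
        int i = lo, j = hi;
        while (i <= j) {                 /* partition around the pivot */
            while (a[i] < pivot) i++;
            while (a[j] > pivot) j--;
            if (i <= j) swap(&a[i++], &a[j--]);
        }
        if (k <= j)      hi = j;         /* k-th element lies in the left part */
        else if (k >= i) lo = i;         /* ... or in the right part           */
        else             break;          /* middle region already holds it     */
    }
    return a[k];
}

int main(void)
{
    int a[] = { 9, 1, 8, 2, 7, 3, 6, 4, 5 };
    printf("3rd smallest = %d\n", quickselect(a, 0, 8, 2));   /* prints 3 */
    return 0;
}
```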



Journal ArticleDOI
Jason Gait
TL;DR: For sufficiently large delays, the probe effect can almost completely mask synchronization errors in concurrent programs, and for sufficiently large concurrent process sets, even small values of embedded delay may mask synchronization errors.
Abstract: This paper reports on an experimental study of the probe effect, defined as an alteration in the frequency of run-time computational errors observed when delays are introduced into concurrent programs. If the concurrent program being studied has no synchronization errors, then there is no probe effect. In the presence of synchronization errors, the frequency of observable output errors for a sample experimental program starts at a high value for small delays, oscillates rapidly as the delay is increased, and apparently settles at zero errors for larger values of delay. Thus, for sufficiently large delays, the probe effect can almost completely mask synchronization errors in concurrent programs. For sufficiently large concurrent process sets, even small values of embedded delay may mask synchronization errors, provided side effects in shared memory are not included in the observation.

217 citations
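
As an illustration of the kind of synchronization error the probe effect concerns, the C/pthreads sketch below (not from the paper) shows an unprotected shared counter whose lost-update error is usually visible with no delay, while enabling the embedded `PROBE_DELAY_US` delay spaces out the unsynchronized accesses and can make the error much harder to observe. All names and constants are illustrative.

```c
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

/* Sketch (not from the paper): two threads increment a shared counter with
 * no synchronization.  With PROBE_DELAY_US set to 0 the lost-update error
 * is usually observable (the final count falls short); a non-zero delay
 * spaces out the unsynchronized read-modify-write sequences and tends to
 * hide the error, a toy version of the probe effect. */
#define PROBE_DELAY_US 0          /* embedded "probe" delay, microseconds */
#define N_INCREMENTS   50000

static long counter = 0;          /* shared, deliberately unprotected */

static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < N_INCREMENTS; i++) {
        if (PROBE_DELAY_US)
            usleep(PROBE_DELAY_US);
        long tmp = counter;       /* unsynchronized read            */
        counter = tmp + 1;        /* unsynchronized write: the race */
    }
    return NULL;
}

int main(void)
{
    pthread_t t1, t2;
    pthread_create(&t1, NULL, worker, NULL);
    pthread_create(&t2, NULL, worker, NULL);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    printf("expected %d, got %ld\n", 2 * N_INCREMENTS, counter);
    return 0;
}
```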



Proceedings ArticleDOI
TL;DR: Foam stability in the presence of Salem crude oil and pure hydrocarbons is investigated as a function of the chain length of alpha-olefin sulfonates and electrolyte concentration, using transmitted light, incident light interferometric and differential interferometric microscopic techniques.
Abstract: Foam stability in the presence of Salem crude oil and pure hydrocarbons is investigated as a function of the chain length of alpha-olefin sulfonates and electrolyte concentration. Interactions between aqueous foam films and emulsified oil droplets are observed using transmitted light, incident light interferometric and differential interferometric microscopic techniques. Foam destabilization factors are identified, including the pseudoemulsion film tension and the surface and interfacial tension gradients. Results from foam-enhanced oil recovery experiments in Berea Sandstone cores are presented, using the combined gamma ray/microwave absorption technique to measure dynamic fluid saturation profiles.

139 citations


Journal ArticleDOI
Robert L. Bernstein
TL;DR: Methods are given for finding a sequence of ‘add’, ‘subtract’ and ‘shift’ instructions to multiply the contents of a register by an integer constant.
Abstract: Methods are given for finding a sequence of ‘add’, ‘subtract’ and ‘shift’ instructions to multiply the contents of a register by an integer constant. Each method generalizes the previous one and requires only a few intermediate or scratch registers. A variation of the last method is used in the PL.8 compiler and uses an unnoticeable amount of the overall compile time. Some statistics roughly indicating the effectiveness of the methods are presented.

121 citations
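
The paper's methods are not reproduced here; the C sketch below only shows the baseline binary decomposition they generalize, emitting a symbolic 'add'/'shift' sequence for a given constant. Register names and output format are invented for illustration.

```c
#include <stdio.h>

/* Emit a naive add/shift sequence that multiplies register r1 by the
 * constant m, leaving the product in r0.  This is only the baseline
 * binary method; the paper's methods improve on it, for example by
 * using 'subtract' to handle runs of 1-bits (since 2^k - 1 = 11...1). */
static void emit_multiply(unsigned m)
{
    printf("    r0 <- 0\n");
    printf("    r2 <- r1\n");               /* scratch copy that we shift */
    while (m != 0) {
        if (m & 1)
            printf("    r0 <- r0 + r2\n");  /* add for each 1 bit */
        m >>= 1;
        if (m != 0)
            printf("    r2 <- r2 << 1\n");  /* shift for each remaining bit */
    }
}

int main(void)
{
    printf("multiply by 10 (binary 1010):\n");
    emit_multiply(10);
    return 0;
}
```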



Proceedings ArticleDOI
TL;DR: A new theoretical model for predicting flow rates and the critical-subcritical flow boundary was tested against data from two published studies and substantially improves the existing methods for predicting choke behavior in two-phase flow.
Abstract: Two-phase flow through wellhead chokes, including both critical and subcritical flow and the boundary between them, was studied. Data were gathered for air-water and air-kerosene flows through five choke diameters from 1/4 in. (6.35 mm) to 1/2 in. (12.7 mm), and results were compared to published correlations. A new theoretical model for predicting flow rates and the critical-subcritical flow boundary was tested against these data, as well as data from two published studies. The new model substantially improves the existing methods for predicting choke behavior in two-phase flow.

112 citations


Journal Article
TL;DR: In this article, the rendezvous concept is used to extend the C programming language to support concurrent programs on both single computers and multicomputers, and a distributed version of Concurrent C is being implemented.
Abstract: Concurrent programming is becoming increasingly important because multicomputers, particularly networks of microprocessors, are rapidly becoming attractive alternatives to traditional maxicomputers. Effective use of such network computers requires that programs be written with components that can be executed in parallel. The C programming language does not have concurrent programming facilities. Our objective is to enhance C so that it can be used to write concurrent programs that can run efficiently on both single computers and multicomputers. Our concurrent programming extensions to C are based on the rendezvous concept. These extensions include mechanisms for the declaration and creation of processes, for process synchronization and interaction, and for process termination and abortion. We give a rationale for our decisions and compare Concurrent C extensions with the concurrent programming facilities in Ada. Concurrent C has been implemented on the UNIX system running on a single processor. A distributed version of Concurrent C is being implemented.

105 citations
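
Concurrent C syntax is not shown in this listing. As a rough analogue of the rendezvous idea it adopts, in which the caller blocks until the accepting process has taken the request and produced a reply, here is a plain C/pthreads sketch; the types and function names are invented for illustration and are not Concurrent C constructs.

```c
#include <pthread.h>
#include <stdio.h>

/* Rough analogue of a rendezvous in plain C + pthreads (NOT Concurrent C
 * syntax): the caller blocks in transaction_call() until the "server"
 * process has accepted the request and produced a reply. */
typedef struct {
    pthread_mutex_t lock;
    pthread_cond_t  cv;
    int has_request, has_reply;
    int request, reply;
} rendezvous_t;

static rendezvous_t rv = { PTHREAD_MUTEX_INITIALIZER, PTHREAD_COND_INITIALIZER,
                           0, 0, 0, 0 };

static int transaction_call(int arg)             /* caller side */
{
    pthread_mutex_lock(&rv.lock);
    rv.request = arg;
    rv.has_request = 1;
    pthread_cond_broadcast(&rv.cv);
    while (!rv.has_reply)                        /* wait for the rendezvous */
        pthread_cond_wait(&rv.cv, &rv.lock);
    int r = rv.reply;
    rv.has_reply = 0;
    pthread_mutex_unlock(&rv.lock);
    return r;
}

static void *server(void *unused)                /* accepting process */
{
    (void)unused;
    pthread_mutex_lock(&rv.lock);
    while (!rv.has_request)                      /* akin to an 'accept' */
        pthread_cond_wait(&rv.cv, &rv.lock);
    rv.reply = rv.request * 2;                   /* body of the transaction */
    rv.has_request = 0;
    rv.has_reply = 1;
    pthread_cond_broadcast(&rv.cv);
    pthread_mutex_unlock(&rv.lock);
    return NULL;
}

int main(void)
{
    pthread_t t;
    pthread_create(&t, NULL, server, NULL);
    printf("reply = %d\n", transaction_call(21));   /* prints 42 */
    pthread_join(t, NULL);
    return 0;
}
```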



Journal ArticleDOI
TL;DR: In this article, the authors use a model of distributed computation and measurement to implement a program monitoring system for programs running on the Berkeley UNIX 4.2BSD operating system, which describes the activities of the processes within a distributed program in terms of computation and communication.
Abstract: Writing and debugging distributed programs can be difficult. When a program is working, it can be difficult to achieve reasonable execution performance. A major cause of these difficulties is a lack of tools for the programmer. We use a model of distributed computation and measurement to implement a program monitoring system for programs running on the Berkeley UNIX 4.2BSD operating system. The model of distributed computation describes the activities of the processes within a distributed program in terms of computation (internal events) and communication (external events). The measurement model focuses on external events and separates the detection of external events, event record selection and data analysis. The implementation of the measurement tools involved changes to the Berkeley UNIX kernel, and the addition of daemon processes to allow the monitoring activity to take place across machine boundaries. A user interface has also been implemented.
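
The paper's kernel data structures are not reproduced here. Purely as an illustration of the external-event record idea (detection in the kernel, selection by daemons, analysis later), a hypothetical C record layout might look like the following; every field name is an assumption.

```c
#include <stdio.h>
#include <stddef.h>
#include <sys/types.h>
#include <sys/time.h>

/* Hypothetical layout (not the paper's actual kernel structures) for one
 * external-event record: the kernel detects the event, daemon processes
 * select which records to keep, and analysis happens later, possibly on
 * another machine. */
enum ext_event { EV_SEND, EV_RECEIVE, EV_PROCESS_CREATE, EV_PROCESS_EXIT };

struct event_record {
    enum ext_event kind;        /* which external event was detected      */
    pid_t          pid;         /* process on the local machine           */
    int            channel;     /* socket / IPC descriptor involved       */
    size_t         nbytes;      /* message size for send/receive events   */
    struct timeval when;        /* local timestamp taken at detection     */
    short          machine_id;  /* which host of the distributed program  */
};

int main(void)
{
    struct event_record r = { EV_SEND, 1234, 5, 128, { 0, 0 }, 2 };
    gettimeofday(&r.when, NULL);
    printf("event %d: pid %d sent %zu bytes on channel %d from machine %d\n",
           (int)r.kind, (int)r.pid, r.nbytes, r.channel, (int)r.machine_id);
    return 0;
}
```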

Proceedings ArticleDOI
TL;DR: A high degree of accuracy is achieved when a correction factor to the torsional propagation velocity in drillpipe due to the offsets at the drillpipe joints is incorporated into the calculations.
Abstract: This paper presents a mathematical method for the computation of torsional resonance frequencies in a drillstring. The torsional resonance frequencies in a drillstring are readily calculated from the nominal drillstring dimensions. A high degree of accuracy is achieved when a correction factor to the torsional propagation velocity in drillpipe due to the offsets at the drillpipe joints is incorporated into the calculations. The theory and computer program were tested against data recorded in a 1000 m deep, nearly vertical well. Torsional spectra were recorded with the bit off bottom at 450 m, 550 m and 1000 m, and while drilling. Well-defined torsional resonances were observed for frequencies up to 40 Hz. The calculated resonance frequencies are in very good agreement with all experimental data.

Proceedings ArticleDOI
TL;DR: In this article, a sample calculation of the influence of natural or artificial fractures on well productivity is presented, showing that a horizontal well can exploit an existing fracture network far better than a vertical well can.
Abstract: Horizontal and drainhole drilling have been used successfully in recent years to increase well productivity. Many formulae can be used to compute the productivity of horizontal or slanted wells compared to vertical ones. One objective of this work is to recall the proper way to use these formulae and the assumptions that may limit their use. The productivity increase due to the longer perforated interval grows as the logarithm of its length and can reach two or three times that of a vertical well. Also presented is a sample calculation of the influence of natural or artificial fractures on well productivity: a horizontal well can exploit an existing fracture network far better than a vertical well can. This is because the most permeable fractures are the open ones, whose planes are parallel to the largest stress, and the largest stress is generally vertical because of rock weight. The productivity of a drainhole in a fractured medium can rise to 10 times that of the drainhole alone.

Proceedings ArticleDOI
TL;DR: In this paper, a comparison of the total cost of the 9 injector gel treatments with the gross revenues generated by the total calculated incremental oil from the 13 responding producers indicates that the treatments were an economic success.
Abstract: Following initiation by Amoco in 1973 of a secondary miscible (WAG) flood in the South Swan Hills Unit, tracer performance indicated severe channeling of solvent and water, particularly in the northwest quarter of the Unit. In 1975 and 1977, problem zones at 9 injectors were isolated and selectively treated with chrome-lignosulfonate time-set gels in an attempt to reduce solvent and water cycling. The treatment volumes varied between 8500 and 14,000 bbls (1350 and 2225 m³) per well. As part of an evaluation of the gel treatments, incremental oil production responses observed at a number of the 56 producers offset to the 9 treated injectors were examined. The responses which could not be explained by other contributory factors, such as the drilling of infill seam injection wells, well workovers (e.g. acid jobs and asphaltene treatments), casing repairs, and pump changeouts, were considered to be the result of improved sweep due to the gel treatments. Thirteen of the 56 offset producers (23%) were found to have responded to the gel treatments, and these were rated as to the duration of response. Response durations of up to 18 months were rated as 'good', and those lasting longer were rated as 'excellent'. Although a detailed economic analysis was not performed, a comparison of the total cost of the 9 injector gel treatments with the gross revenues generated by the total calculated incremental oil from the 13 responding producers indicates that the treatments were an economic success.

Proceedings ArticleDOI
TL;DR: It is shown that the DPC can dictate the strategy for a drilling program and what the economics of drilling a sequence of wells in a given area should be, and that its constants indicate how well an operation is prepared to drill a given location and how difficult the area is to drill.
Abstract: The Drilling Performance Curve (DPC) is a simple yet powerful tool to assess the drilling performance in any given area where a consecutive series of similar wells have been drilled. All the information needed to perform the analysis is the sequence numbers of the wells and the time it takes to reach a given depth. This paper presents some typical examples of DPCs covering a study of over 30 different areas (onshore and offshore) including over 2000 wells. From the data, a simple model for the overall drilling performance was derived. Three constants C1, C2 and C3 are unique to the DPC. From the numerical value of the constants, the drilling performance can be derived. For example, the C1 constant indicates how well an operation is prepared to drill a given location and/or how difficult the area is to drill. The C2 constant directly reflects the rate of learning. The C3 constant indicates the level of technology and organization for drilling in a particular area. This paper presents cases from poor to excellent drilling with the associated coefficients. It is shown that the DPC can dictate the strategy for a drilling program and what the economics of drilling a sequence of wells should be in a given area. It is suggested that the DPC become the yardstick for evaluating drilling, much as the decline curve is for production.
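
The abstract does not give the functional form of the model, so the C sketch below assumes a simple three-constant exponential learning curve consistent with the roles described for C1, C2 and C3; both the formula and the constants used are illustrative assumptions, not the paper's results.

```c
#include <math.h>
#include <stdio.h>

/* ASSUMED functional form (the abstract does not state it): an exponential
 * learning curve in which drilling time approaches a technical limit as the
 * well sequence number n grows.
 *   t(n) = C1 * exp(C2 * (1 - n)) + C3
 * C1 ~ extra time on the first well (preparedness / area difficulty),
 * C2 ~ rate of learning, C3 ~ asymptotic time set by technology. */
static double dpc_days(double c1, double c2, double c3, int n)
{
    return c1 * exp(c2 * (1.0 - n)) + c3;
}

int main(void)
{
    /* Illustrative constants only, not taken from the paper's data set. */
    double c1 = 40.0, c2 = 0.35, c3 = 25.0;
    for (int n = 1; n <= 10; n++)
        printf("well %2d: %5.1f days\n", n, dpc_days(c1, c2, c3, n));
    return 0;
}
```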

Journal ArticleDOI
E. Adams, S. S. Muchnick
TL;DR: Dbxtool is a window‐ and mouse‐based debugger for C, Pascal and FORTRAN programs running on Sun workstations that has been extended with the abilities to debug multiple‐process programs, already‐running processes, and the Sun Operating System kernel.
Abstract: Dbxtool is a window- and mouse-based debugger for C, Pascal and FORTRAN programs running on Sun workstations. Its use of the mouse as the primary input mechanism eliminates the need to type variables, line numbers, breakpoints and most commands. Its multiple windows provide several qualitatively different perspectives on the debugging problem. Compared to the Unix 4.2 BSD dbx from which it is derived, it has been extended with the abilities to debug multiple-process programs, already-running processes, and the Sun Operating System kernel.


Proceedings ArticleDOI
TL;DR: In this paper, both adsorption and desorption isotherms were determined for several coal samples and it was found that there are hysteresis effects which cause the two sorption modes to be different.
Abstract: Methane production from coal seams has been actively pursued in the U.S. since the mid-1970s. This resource is regarded as an unconventional natural gas supply due to several unique characteristics of the reservoir, including: two-phase (water and gas) flow, low effective permeability to gas, and the majority of the methane being held in an adsorbed state on the internal surfaces of the microporous coal structure rather than as compressed or dissolved gas within the pores. In a virgin coal reservoir with ideal conditions the adsorbed methane concentration is in equilibrium with the initial reservoir pressure. Production of gas from the coal bed depends on the release of this adsorbed methane to a free state, which then flows through the natural fracture system of the coal towards the wellbore. Release of the adsorbed methane is accomplished by dewatering the reservoir, which decreases the local pressure and thereby upsets the equilibrium. In the past, researchers have utilized laboratory-measured adsorption isotherms to model the gas concentration/reservoir pressure equilibrium relationship. In the current study both adsorption (pressure increasing) and desorption (pressure decreasing) isotherms were determined for several coal samples. It was found that there are hysteresis effects which cause the two sorption modes to be different. The desorption isotherms were more non-linear than the adsorption isotherms. In terms of reservoir performance this means that coalbed reservoirs must be drawn down to pressures lower than those indicated by adsorption isotherms to achieve the same degree of methane release.
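
The measured isotherms are not reproduced here. As background only, adsorbed gas content in coal is conventionally modelled with a Langmuir-type curve, sketched below in C with illustrative constants; the paper's finding is that a single adsorption-based curve of this kind can overstate the methane released at a given drawdown.

```c
#include <stdio.h>

/* Conventional Langmuir-type isotherm used to model adsorbed gas content
 * in coal (background only; the paper's measured adsorption/desorption
 * curves and their hysteresis are not reproduced here):
 *   V(P) = VL * P / (PL + P)
 * VL = Langmuir volume (gas content at infinite pressure),
 * PL = Langmuir pressure (pressure at which V = VL/2). */
static double langmuir_scf_per_ton(double vl, double pl, double p_psia)
{
    return vl * p_psia / (pl + p_psia);
}

int main(void)
{
    double vl = 500.0, pl = 300.0;            /* illustrative constants */
    for (double p = 0.0; p <= 1200.0; p += 200.0)
        printf("P = %6.0f psia  V = %6.1f scf/ton\n",
               p, langmuir_scf_per_ton(vl, pl, p));
    return 0;
}
```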

Journal ArticleDOI
TL;DR: A table‐driven algorithm for drawing a variety of space‐filling curves is presented and a method for discovering new curves is described.
Abstract: A table-driven algorithm for drawing a variety of space-filling curves is presented. A method for discovering new curves is described. Numerous examples are shown.
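
The paper's tables are not reproduced in this listing. The C sketch below is merely in the same table-driven spirit: an L-system production table for the Hilbert curve drives a turtle that prints the points it visits; the particular rules and names are standard textbook material, not the paper's.

```c
#include <stdio.h>

/* Table-driven sketch (in the spirit of the paper, not its actual tables):
 * the Hilbert curve as an L-system.  Each non-terminal expands according to
 * a production table; terminals drive a turtle that prints the points it
 * visits.  'F' = step forward, '+' = turn left 90, '-' = turn right 90. */
static const char *production(char c)
{
    switch (c) {
    case 'A': return "-BF+AFA+FB-";
    case 'B': return "+AF-BFB-FA+";
    default:  return NULL;              /* terminals expand to themselves */
    }
}

static int x = 0, y = 0, dx = 1, dy = 0;      /* turtle state */

static void interpret(char c)
{
    int t;
    switch (c) {
    case 'F': x += dx; y += dy; printf("(%d,%d) ", x, y); break;
    case '+': t = dx; dx = -dy; dy =  t; break;   /* turn left  */
    case '-': t = dx; dx =  dy; dy = -t; break;   /* turn right */
    }
}

static void expand(char c, int depth)
{
    const char *rule = production(c);
    if (depth == 0 || rule == NULL) {
        interpret(c);                   /* non-terminals are simply skipped */
        return;
    }
    for (const char *p = rule; *p; p++)
        expand(*p, depth - 1);
}

int main(void)
{
    printf("(%d,%d) ", x, y);
    expand('A', 3);                     /* order-3 curve: 64 grid cells */
    printf("\n");
    return 0;
}
```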

Journal ArticleDOI
TL;DR: This paper examines a common design for a lexical analyser and its supporting modules and recommends several specific design and optimization strategies that are also valid for software other than lexical analysers.
Abstract: This paper examines a common design for a lexical analyser and its supporting modules. An implementation of the design was tuned to produce the best possible performance. In effect, many of the optimizations that one would expect of a production-quality compiler were carried out by hand. After measuring the cost of tokenizing two large programs with this version, the code was ‘detuned’ to remove specific optimizations and the measurements were repeated. In all cases, the basic algorithm was unchanged, so that the difference in cost is an indication of the effectiveness of the optimization. Comparisons were also made with a tool-generated lexical analyser for the same task. On the basis of the measurements, several specific design and optimization strategies are recommended. These recommendations are also valid for software other than lexical analysers.
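
The measured analyser itself is not shown here. As an example of the kind of hand optimization such a study evaluates, the C sketch below uses a character-class table so the scanner's inner loop consumes an identifier with one table lookup per character and no per-character function calls; all names are illustrative.

```c
#include <stdio.h>

/* Illustrative hand-tuned scanner fragment (not the paper's code): a
 * 256-entry character-class table lets the inner loop recognize an
 * identifier with one array lookup per character and no per-character
 * function calls, one of the classic optimizations such studies measure. */
enum { C_OTHER, C_LETTER, C_DIGIT, C_SPACE };

static unsigned char cclass[256];           /* all entries start as C_OTHER */

static void init_classes(void)
{
    for (int c = 'a'; c <= 'z'; c++) cclass[c] = C_LETTER;
    for (int c = 'A'; c <= 'Z'; c++) cclass[c] = C_LETTER;
    for (int c = '0'; c <= '9'; c++) cclass[c] = C_DIGIT;
    cclass['_'] = C_LETTER;
    cclass[' '] = cclass['\t'] = cclass['\n'] = C_SPACE;
}

/* Scan one identifier starting at *p; advance *p past it, return its length. */
static int scan_identifier(const char **p)
{
    const char *s = *p;
    while (cclass[(unsigned char)*s] == C_LETTER ||
           cclass[(unsigned char)*s] == C_DIGIT)
        s++;
    int len = (int)(s - *p);
    *p = s;
    return len;
}

int main(void)
{
    init_classes();
    const char *src = "count1 = count1 + 1";
    int len = scan_identifier(&src);
    printf("first token: %.*s (%d chars)\n", len, src - len, len);
    return 0;
}
```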


Proceedings ArticleDOI
TL;DR: In this article, a rate of penetration prediction equation for an insert roller cone bit is determined from laboratory drilling tests as a function of bit weight, well depth and laboratory measured rock properties.
Abstract: A rate of penetration (ROP) prediction equation for an insert roller cone bit is determined from laboratory drilling tests as a function of bit weight, well depth and laboratory measured rock properties. A complete description of each of the seven rock types used in the study is presented. A comparison of ROP at different flow rates for each rock shows minimal hydraulic cleaning problems at the base hydraulic energy levels chosen. The predictive equation will be valuable in predicting drilling rate and understanding how the rock properties affect it.

Proceedings ArticleDOI
TL;DR: The rheological behaviour of invert emulsion muds has been studied at pressures up to 1000 bar and temperatures up to 140 °C, and a pair of similar exponential expressions was found to model the pressure and temperature behaviour of the two parameters of the Casson model.
Abstract: The rheological behaviour of invert emulsion muds has been studied at pressures up to 1000 bar and temperatures up to 140 °C. Rheological parameters were calculated for the Bingham, Herschel-Bulkley and Casson rheological models. The Herschel-Bulkley and Casson models both give good fits to the experimental rheograms. The Casson model is more reliable for extrapolation purposes than the Herschel-Bulkley model. A pair of similar exponential expressions was found to model the pressure and temperature behaviour of the two parameters of the Casson model. The expressions, which are based on the relation for pure liquids derived theoretically, contain temperature-dependent pressure coefficients. The simplifications inherent in the temperature and pressure model are discussed in the light of the temperature and pressure behaviour of the viscosity of common base oils and their constituent hydrocarbons. Field application of the model requires measurement of the rheology of the mud at two or more temperatures and knowledge of the pressure coefficients relating the behaviour of the plastic viscosity to that of the yield point, or the Casson high shear viscosity to that of the Casson yield stress. Pressure measurements or other information are then not required. Applications can be based on Casson or Bingham rheological measurements. The relationships between the parameters of the Casson and Bingham models are discussed.
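
The fitted coefficients and the pressure/temperature expressions are not given in this listing. For reference, the three rheological models named in the abstract have the standard constitutive forms sketched below in C; the parameter values are illustrative only.

```c
#include <math.h>
#include <stdio.h>

/* Standard constitutive forms of the three models named in the abstract,
 * giving shear stress tau as a function of shear rate g.  The paper's
 * fitted coefficients and its pressure/temperature expressions are not
 * shown here; the parameter values below are illustrative only. */
static double bingham(double ty, double pv, double g)
{
    return ty + pv * g;                 /* tau = ty + PV * g  */
}

static double herschel_bulkley(double ty, double k, double n, double g)
{
    return ty + k * pow(g, n);          /* tau = ty + K * g^n */
}

static double casson(double tc, double mu_inf, double g)
{
    /* sqrt(tau) = sqrt(tc) + sqrt(mu_inf * g) */
    double r = sqrt(tc) + sqrt(mu_inf * g);
    return r * r;
}

int main(void)
{
    double g = 511.0;   /* shear rate in 1/s (300 rpm on a Fann viscometer) */
    printf("Bingham          : %6.1f Pa\n", bingham(5.0, 0.02, g));
    printf("Herschel-Bulkley : %6.1f Pa\n", herschel_bulkley(3.0, 0.2, 0.7, g));
    printf("Casson           : %6.1f Pa\n", casson(2.0, 0.015, g));
    return 0;
}
```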

Journal ArticleDOI
TL;DR: The possibility of executing a language's semantic description directly supports a methodology of language design that is advocated: express the design as a formal language description, and use this to test and refine the design, before becoming committed to constructing a compiler.
Abstract: We describe how the denotational semantics of a programming language can be executed directly, if it is expressed in a suitable functional programming language such as ML. We also apply Mosses' idea of ‘semantic algebras’ to construct semantic descriptions that are significantly more modular and understandable than usual. The possibility of executing a language's semantic description directly supports a methodology of language design that we advocate: express the design as a formal language description, and use this to test and refine the design, before becoming committed to constructing a compiler.

Journal ArticleDOI
G. Davies, S. Bowsher
TL;DR: Although there is no overall ‘best’ algorithm, the more complex algorithms are worth considering as they are generally more efficient in terms of the number of comparisons made and execution time.

Abstract: This paper describes four algorithms of varying complexity used for pattern matching, and investigates their behaviour. The algorithms are tested using patterns of varying length from several alphabets. It is concluded that although there is no overall ‘best’ algorithm, the more complex algorithms are worth considering as they are generally more efficient in terms of the number of comparisons made and execution time.
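
The four algorithms are not named in this listing. To illustrate the simple-versus-complex trade-off the paper measures, the C sketch below contrasts naive search with a Horspool-style bad-character skip table; the Horspool variant is chosen here for illustration and is not necessarily one of the paper's four.

```c
#include <stdio.h>
#include <string.h>

/* Illustration of the trade-off the paper measures (these are not
 * necessarily its four algorithms): the naive search compares at every
 * text position, while a Horspool-style bad-character table lets the
 * more complex search skip ahead by up to the pattern length. */
static const char *search_naive(const char *text, const char *pat)
{
    size_t n = strlen(text), m = strlen(pat);
    for (size_t i = 0; i + m <= n; i++)
        if (memcmp(text + i, pat, m) == 0)
            return text + i;
    return NULL;
}

static const char *search_horspool(const char *text, const char *pat)
{
    size_t n = strlen(text), m = strlen(pat);
    size_t skip[256];
    if (m == 0 || m > n) return m == 0 ? text : NULL;
    for (int c = 0; c < 256; c++) skip[c] = m;
    for (size_t k = 0; k + 1 < m; k++)        /* last pattern char keeps skip m */
        skip[(unsigned char)pat[k]] = m - 1 - k;
    for (size_t i = 0; i + m <= n; i += skip[(unsigned char)text[i + m - 1]])
        if (memcmp(text + i, pat, m) == 0)
            return text + i;
    return NULL;
}

int main(void)
{
    const char *text = "practice and experience";
    printf("naive   : %s\n", search_naive(text, "experience"));
    printf("horspool: %s\n", search_horspool(text, "experience"));
    return 0;
}
```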

Proceedings ArticleDOI
TL;DR: In this paper, the CO2 Huff-Puff (Immiscible Carbon Dioxide Displacement) process is described, including reservoir mechanics, and the planning, design, and implementation of the projects, together with the equipment specifically designed and constructed for them, are discussed.
Abstract: The CO2 Huff-Puff (Immiscible Carbon Dioxide Displacement) process is described, including reservoir mechanics. The planning, design, and implementation of the projects, including the equipment specifically designed and constructed for them, are discussed. Case histories of selected projects are included. The paper also details field operations, covering the problems encountered and subsequently corrected.


Proceedings ArticleDOI
R. E. Hinkley, Lorne A. Davis
TL;DR: In this paper, the effect of matrix discontinuities on two-phase flow in homogeneous composite cores was studied as a function of flow rate and wettability; the composite cores were constructed by splicing multiple core segments with bridging materials of differing wettabilities.
Abstract: The optimum construction methodology of composite cores for multiphase flow tests is an important practical petrophysical concern. The critical feature of construction is the minimization of saturation disturbances which occur due to capillary pressure discontinuities at individual core segment interfaces. The effect of matrix discontinuities on two-phase flow in homogeneous composite cores was studied as a function of flow rate and wettability. Five composite cores, representing a range of wettabilities, were prepared. The composite cores were constructed by splicing multiple core segments with bridging materials of differing wettabilities. Saturation profile data were measured with a microwave saturation scanner during imbibition and drainage floods. The interface between core segments had a strong effect on the saturation distributions when capillary contact was not maintained between contiguous segments. Under drainage conditions at low flow rates, each core segment behaved as though it were an independent core experiencing a strong exit end effect. In steady-state fractional flow experiments this occurred only at fractional flows close to unity. The most effective bridging materials for water-wet Berea were thin paper sheets. No bridging material was found to be completely adequate in an oil-wet environment. In the absence of good capillary contact, increasing the flow rate is a practical remedy for the saturation disturbances.


Proceedings ArticleDOI
TL;DR: In this paper, the relationships between cement mixing and cement slurry quality are investigated and a tentative explanation is proposed through a mechanism of particle deflocculation and dissolution, leading to an increase in the available specific surface area.
Abstract: In this paper, the relationships between cement mixing and cement slurry quality are investigated. Laboratory mixing conditions, using a high shear mixer, are compared to field mixing conditions, including the conventional jet mixer, recirculating-type mixer and batch mixer. All the mixing conditions can be reduced to a single parameter, the specific mixing energy, which allows laboratory and field mixing to be compared with confidence. Typical cement slurry properties, like rheology, free water, fluid loss, thickening time and compressive strength, are measured as a function of the specific mixing energy. All these properties improve when the specific mixing energy increases. The efficiency of cement additives, like dispersants and fluid-loss agents, is also found to vary significantly with the energy. A tentative explanation is proposed through a mechanism of particle deflocculation and dissolution, leading to an increase in the available specific surface area.
Abstract: In this paper, the relationships between cement mixing and cement slurry quality are investigated. Laboratory mixing conditions, using a high shear mixer, are compared to field mixing conditions, including conventional jet mixer, recirculating type mixer and batch mixer. All the mixing conditions can be reduced according to a single parameter, the specific mixing energy, that allows the comparison of laboratory and field mixing with confidence. Typical cement slurry properties, like rheology, free water, fluid loss, thickening time and compressive strength, are measured as a function of the specific mixing energy. All these properties improve when the specific mixing energy increases. The efficiency of cement additives, like dispersants and fluid-loss agents, is also found to vary significantly with the energy. A tentative explanation is proposed through a mechanism of particle deflocculation and dissolution, leading to an increase in the available specific surface area.