Institution

Langley Research Center

Facility: Hampton, Virginia, United States
About: Langley Research Center is a facility organization based in Hampton, Virginia, United States. It is known for research contributions in the topics of Mach number & Wind tunnel. The organization has 15945 authors who have published 37602 publications receiving 821623 citations. The organization is also known as: NASA Langley & NASA Langley Research Center.


Papers
Journal Article
TL;DR: In this paper, the authors estimate that boreal forest fires release 2.4 ± 0.6 Tg C yr−1 in the form of NMVOCs, with approximately 41 % of the carbon released as C1-C2 NMVOCs and 21 % as pinenes.
Abstract: Boreal regions comprise about 17 % of the global land area, and they both affect and are influenced by climate change. To better understand boreal forest fire emissions and plume evolution, 947 whole air samples were collected aboard the NASA DC-8 research aircraft in summer 2008 as part of the ARCTAS-B field mission, and analyzed for 79 non-methane volatile organic compounds (NMVOCs) using gas chromatography. Together with simultaneous measurements of CO2, CO, CH4, CH2O, NO2, NO, HCN and CH3CN, these measurements represent the most comprehensive assessment of trace gas emissions from boreal forest fires to date. Based on 105 air samples collected in fresh Canadian smoke plumes, 57 of the 80 measured NMVOCs (including CH2O) were emitted from the fires, including 45 species that were quantified from boreal forest fires for the first time. After CO2, CO and CH4, the largest emission factors (EFs) for individual species were formaldehyde (2.1 ± 0.2 g kg−1), followed by methanol, NO2, HCN, ethene, α-pinene, β-pinene, ethane, benzene, propene, acetone and CH3CN. Globally, we estimate that boreal forest fires release 2.4 ± 0.6 Tg C yr−1 in the form of NMVOCs, with approximately 41 % of the carbon released as C1-C2 NMVOCs and 21 % as pinenes. These are the first reported field measurements of monoterpene emissions from boreal forest fires, and we speculate that the pinenes, which are relatively heavy molecules, were detected in the fire plumes as the result of distillation of stored terpenes as the vegetation is heated. Their inclusion in smoke chemistry models is expected to improve model predictions of secondary organic aerosol (SOA) formation. The fire-averaged EF of dichloromethane (CH2Cl2), (6.9 ± 8.6) × 10−4 g kg−1, was not significantly different from zero and supports recent findings that its global biomass burning source appears to have been overestimated. Similarly, we found no evidence for emissions of chloroform (CHCl3) or methyl chloroform (CH3CCl3) from boreal forest fires. The speciated hydrocarbon measurements presented here show the importance of carbon released by short-chain NMVOCs, the strong contribution of pinene emissions from boreal forest fires, and the wide range of compound classes in the most abundantly emitted NMVOCs, all of which can be used to improve biomass burning inventories in local/global models and reduce uncertainties in model estimates of trace gas emissions and their impact on the atmosphere.

206 citations
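As a rough illustration of how per-species emission factors of this kind feed a carbon budget, the sketch below converts the formaldehyde EF quoted above into a carbon yield per kilogram of fuel burned and then scales it to an annual total. Only the 2.1 g kg−1 EF is taken from the abstract; the molar masses are standard values, and the annual fuel-consumption figure is a placeholder assumption, not a number from the paper.

```python
# Minimal sketch of emission-factor bookkeeping (illustrative only).
# Only the formaldehyde EF (2.1 g per kg dry fuel burned) comes from the
# abstract above; the fuel-consumption figure is a PLACEHOLDER assumption.

M_C = 12.011  # g/mol, carbon

def carbon_emitted(ef_g_per_kg_fuel, n_carbon, molar_mass):
    """Carbon released (g C per kg dry fuel) for one species, given its
    emission factor, number of carbon atoms, and molar mass (g/mol)."""
    return ef_g_per_kg_fuel * (n_carbon * M_C / molar_mass)

# Formaldehyde (CH2O): EF quoted in the abstract.
ch2o_c = carbon_emitted(2.1, n_carbon=1, molar_mass=30.03)
print(f"CH2O carbon yield: {ch2o_c:.2f} g C per kg fuel")

# Scaling a per-kg yield to an annual total needs an estimate of dry fuel
# consumed by boreal fires; the value below is hypothetical, NOT from the paper.
ASSUMED_FUEL_TG_PER_YR = 100.0  # Tg dry fuel / yr (placeholder)
# g C per kg fuel equals 1e-3 Tg C per Tg fuel, hence the factor below.
annual_tg_c = ch2o_c * 1e-3 * ASSUMED_FUEL_TG_PER_YR
print(f"Annual CH2O carbon (placeholder fuel estimate): {annual_tg_c:.2f} Tg C/yr")
```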

Proceedings Article
15 Jun 1998
TL;DR: In this article, the accuracy and complexity of solving multi-component gaseous diffusion using the detailed multi-component equations, the Stefan-Maxwell equations, and two commonly used approximate equations have been examined in a two-part study.
Abstract: The accuracy and complexity of solving multi-component gaseous diffusion using the detailed multi-component equations, the Stefan-Maxwell equations, and two commonly used approximate equations have been examined in a two-part study. Part I examined the equations in a basic study with specified inputs in which the results are applicable to many applications. Part II addressed the application of the equations in the Langley Aerothermodynamic Upwind Relaxation Algorithm (LAURA) computational code for high-speed entries into Earth's atmosphere. The results showed that the presented iterative scheme for solving the Stefan-Maxwell equations is an accurate and effective method as compared with solutions of the detailed equations. In general, good accuracy with the approximate equations cannot be guaranteed for a given species or for all species in a multi-component mixture. "Corrected" forms of the approximate equations that ensured the diffusion mass fluxes sum to zero, as required, were more accurate than the uncorrected forms. Good accuracy, as compared with the Stefan-Maxwell results, was obtained with the "corrected" approximate equations in defining the heating rates for the three Earth entries considered in Part II.

204 citations
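The "corrected" approximate-diffusion idea described above can be sketched in a few lines: compute Fick-type fluxes with effective diffusivities, then redistribute the residual so that the species mass fluxes sum to zero. This is a generic illustration under assumed inputs, not the LAURA implementation or the paper's exact iterative Stefan-Maxwell scheme; all variable names and numbers are hypothetical.

```python
# Sketch of a "corrected" approximate diffusion flux: Fick-type fluxes with
# effective diffusivities, plus a mass-fraction-weighted correction so the
# species mass fluxes sum to zero. Generic illustration, not the LAURA code.
import numpy as np

def corrected_fick_fluxes(rho, D_eff, dY_dx, Y):
    """Approximate species diffusion mass fluxes (1-D), corrected so that
    sum_i J_i = 0.

    rho    : mixture density
    D_eff  : effective diffusion coefficient of each species in the mixture
    dY_dx  : mass-fraction gradients
    Y      : mass fractions (sum to 1)
    """
    J = -rho * D_eff * dY_dx   # uncorrected Fick-type fluxes
    residual = J.sum()         # should be zero but generally is not
    return J - Y * residual    # corrected fluxes, sum exactly to zero

# Toy 3-species example (all numbers hypothetical):
rho = 1.2e-2                                 # kg/m^3
D_eff = np.array([1.0e-3, 8.0e-4, 1.2e-3])   # m^2/s
dY_dx = np.array([2.0, -1.5, -0.5])          # 1/m (gradients sum to zero)
Y = np.array([0.2, 0.5, 0.3])

J = corrected_fick_fluxes(rho, D_eff, dY_dx, Y)
print(J, J.sum())   # corrected fluxes and their sum (machine zero)
```

Without the correction, the approximate fluxes imply a spurious net diffusive mass flux; redistributing that residual in proportion to the mass fractions is the simple fix the abstract refers to as the "corrected" form.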

Journal Article
TL;DR: This paper investigated the roles of climate forcings and chaos (unforced variability) in climate change via ensembles of climate simulations in which forcings are added one by one, concluding that most interannual climate variability in the period 1979-1996 at middle and high latitudes is chaotic.
Abstract: We investigate the roles of climate forcings and chaos (unforced variability) in climate change via ensembles of climate simulations in which we add forcings one by one. The experiments suggest that most interannual climate variability in the period 1979–1996 at middle and high latitudes is chaotic. But observed SST anomalies, which themselves are partly forced and partly chaotic, account for much of the climate variability at low latitudes and a small portion of the variability at high latitudes. Both a natural radiative forcing (volcanic aerosols) and an anthropogenic forcing (ozone depletion) leave clear signatures in the simulated climate change that are identified in observations. Pinatubo aerosols warm the stratosphere and cool the surface globally, causing a tendency for regional surface cooling. Ozone depletion cools the lower stratosphere, troposphere and surface, steepening the temperature lapse rate in the troposphere. Solar irradiance effects are small, but our model is inadequate to fully explore this forcing. Well-mixed anthropogenic greenhouse gases cause a large surface warming that, over the 17 years, approximately offsets cooling by the other three mechanisms. Thus the net calculated effect of all measured radiative forcings is approximately zero surface temperature trend and zero heat storage in the ocean for the period 1979–1996. Finally, in addition to the four measured radiative forcings, we add an initial (1979) disequilibrium forcing of +0.65 W/m2. This forcing yields a global surface warming of about 0.2°C over 1979–1996, close to observations, and measurable heat storage in the ocean. We argue that the results represent evidence of a planetary radiative imbalance of at least 0.5 W/m2; this disequilibrium presumably represents unrealized warming due to changes of atmospheric composition prior to 1979. One implication of the disequilibrium forcing is an expectation of new record global temperatures in the next few years. The best opportunity for observational confirmation of the disequilibrium is measurement of ocean temperatures adequate to define heat storage.

204 citations
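To make the closing point about ocean heat storage concrete, the back-of-envelope calculation below converts a sustained 0.5 W/m2 imbalance over the 17-year period into an implied mean warming of the upper ocean. The imbalance and time span come from the abstract; the 500 m mixing depth is an assumption, and the physical constants are standard approximate values.

```python
# Back-of-envelope check (illustration only) of how a sustained ~0.5 W/m^2
# planetary radiative imbalance maps onto ocean heat storage over 1979-1996.
# The imbalance and 17-year span come from the abstract; the mixing depth is
# an ASSUMPTION, and the physical constants are standard approximate values.

SECONDS_PER_YEAR = 3.156e7
EARTH_AREA_M2 = 5.1e14        # total surface area of Earth
OCEAN_FRACTION = 0.71
RHO_SEAWATER = 1025.0         # kg/m^3
CP_SEAWATER = 4000.0          # J/(kg K), approximate

imbalance_w_m2 = 0.5
years = 17
mixing_depth_m = 500.0        # assumed depth over which the heat is stored

energy_j = imbalance_w_m2 * EARTH_AREA_M2 * years * SECONDS_PER_YEAR
ocean_mass_kg = OCEAN_FRACTION * EARTH_AREA_M2 * mixing_depth_m * RHO_SEAWATER
delta_t_k = energy_j / (ocean_mass_kg * CP_SEAWATER)

print(f"Stored energy: {energy_j:.2e} J")                       # ~1.4e23 J
print(f"Mean warming of top {mixing_depth_m:.0f} m: {delta_t_k:.2f} K")  # ~0.2 K
```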

Journal Article
TL;DR: In this paper, an intercomparison study of midlatitude continental cumulus convection simulated by eight two-dimensional and two three-dimensional cloud-resolving models (CRMs), driven by observed large-scale advective temperature and moisture tendencies, surface turbulent fluxes, and radiative-heating profiles during three sub-periods of the summer 1997 Intensive Observation Period of the US Department of Energy's Atmospheric Radiation Measurement (ARM) program, was performed.
Abstract: This paper reports an intercomparison study of midlatitude continental cumulus convection simulated by eight two-dimensional and two three-dimensional cloud-resolving models (CRMs), driven by observed large-scale advective temperature and moisture tendencies, surface turbulent fluxes, and radiative-heating profiles during three sub-periods of the summer 1997 Intensive Observation Period of the US Department of Energy's Atmospheric Radiation Measurement (ARM) program. Each sub-period includes two or three precipitation events of various intensities over a span of 4 or 5 days. The results can be summarized as follows. CRMs can reasonably simulate midlatitude continental summer convection observed at the ARM Cloud and Radiation Testbed site in terms of the intensity of convective activity, and the temperature and specific-humidity evolution. Delayed occurrences of the initial precipitation events are a common feature for all three sub-cases among the models. Cloud mass fluxes, condensate mixing ratios and hydrometeor fractions produced by all CRMs are similar. Some of the simulated cloud properties such as cloud liquid-water path and hydrometeor fraction are rather similar to available observations. All CRMs produce large downdraught mass fluxes with magnitudes similar to those of updraughts, in contrast to CRM results for tropical convection. Some inter-model differences in cloud properties are likely to be related to those in the parametrizations of microphysical processes. There is generally good agreement between the CRMs and observations, with CRMs being significantly better than single-column models (SCMs), suggesting that current results are suitable for use in improving parametrizations in SCMs. However, improvements can still be made in the CRM simulations; these include the proper initialization of the CRMs and a better method of diagnosing cloud boundaries in model outputs for comparison with satellite and radar cloud observations.

203 citations

Journal Article
TL;DR: In this paper, a procedure for superposing linear cohesive laws to approximate an experimentally determined R-curve is proposed and demonstrated for the longitudinal fracture of a fiber-reinforced polymer-matrix composite.
Abstract: The relationships between a resistance curve (R-curve), the corresponding fracture process zone length, the shape of the traction-displacement softening law, and the propagation of fracture are examined in the context of the through-the-thickness fracture of composite laminates. A procedure for superposing linear cohesive laws to approximate an experimentally-determined R-curve is proposed. Simple equations are developed for determining the separation of the critical energy release rates and the strengths that define the independent contributions of each linear softening law in the superposition. The proposed procedure is demonstrated for the longitudinal fracture of a fiber-reinforced polymer-matrix composite. It is shown that the R-curve measured with a Compact Tension Specimen test cannot be predicted using a linear softening law, but can be reproduced by superposing two linear softening laws.

203 citations
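The superposition idea above is easy to illustrate: each linear softening law is a traction that falls from its strength to zero at a final opening fixed by its fracture energy, and the combined law is simply their sum, so the strengths and critical energy release rates add. The sketch below builds two such laws and checks the summed toughness numerically; the strengths and fracture energies are hypothetical inputs, not values from the paper or the cited Compact Tension tests.

```python
# Sketch of superposing two linear softening (cohesive) laws, in the spirit of
# the procedure described above. The strengths and fracture energies below are
# HYPOTHETICAL inputs, not values from the paper.
import numpy as np

def linear_softening(delta, strength, G):
    """Traction of one linear softening law: it falls linearly from `strength`
    at zero opening to zero at the final opening delta_f = 2*G/strength."""
    delta_f = 2.0 * G / strength
    return np.clip(strength * (1.0 - delta / delta_f), 0.0, None)

# Two hypothetical components: a strong, brittle law (sharp crack-tip process)
# plus a weaker law with a long tail (e.g. bridging that raises the R-curve).
laws = [(60.0e6, 200.0), (10.0e6, 800.0)]    # (strength [Pa], Gc [J/m^2])

delta = np.linspace(0.0, 2.0e-4, 4001)       # opening displacement [m]
traction = sum(linear_softening(delta, s, G) for s, G in laws)

# The superposed law inherits the summed strength and summed toughness.
area = np.sum(0.5 * (traction[1:] + traction[:-1]) * np.diff(delta))
print(f"peak traction  : {traction[0] / 1e6:.0f} MPa")   # 70 MPa
print(f"total toughness: {area:.0f} J/m^2")              # ~1000 J/m^2
```

Splitting a measured R-curve into such components amounts to choosing the individual strengths and energies so that their sum reproduces the experimental toughness rise, which is the role of the simple equations mentioned in the abstract.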


Authors


Name                       H-index   Papers   Citations
Daniel J. Jacob            162       656      76530
Donald R. Blake            118       727      49697
Veerabhadran Ramanathan    100       301      47561
Raja Parasuraman           91        402      41455
Robert W. Platt            88        638      31918
James M. Russell           87        691      29383
Daniel J. Inman            83        918      37920
Antony Jameson             79        474      31518
Ya-Ping Sun                79        277      28722
Patrick M. Crill           79        228      20850
Richard B. Miles           78        759      25239
Patrick Minnis             77        490      23403
Robert W. Talbot           77        297      19783
Raphael T. Haftka          76        773      28111
Jack E. Dibb               75        344      18399
Network Information
Related Institutions (5)
Ames Research Center
35.8K papers, 1.3M citations

89% related

German Aerospace Center
26.7K papers, 553.3K citations

89% related

Air Force Research Laboratory
24.6K papers, 493.8K citations

87% related

United States Naval Research Laboratory
45.4K papers, 1.5M citations

85% related

Jet Propulsion Laboratory
14.3K papers, 548.1K citations

85% related

Performance Metrics
No. of papers from the Institution in previous years
Year    Papers
2023    35
2022    86
2021    571
2020    540
2019    669
2018    797