
Showing papers by "Langley Research Center published in 2002"


01 Jun 2002
TL;DR: In this article, a decohesion element with mixed-mode capability is proposed and demonstrated at the interface between solid finite elements to model the initiation and non-self-similar growth of delaminations.
Abstract: A new decohesion element with mixed-mode capability is proposed and demonstrated. The element is used at the interface between solid finite elements to model the initiation and non-self-similar growth of delaminations. A single relative displacement-based damage parameter is applied in a softening law to track the damage state of the interface and to prevent the restoration of the cohesive state during unloading. The softening law for mixed-mode delamination propagation can be applied to any mode interaction criterion such as the two-parameter power law or the three-parameter Benzeggagh-Kenane criterion. To demonstrate the accuracy of the predictions and the irreversibility capability of the constitutive law, steady-state delamination growth is simulated for quasistatic loading-unloading cycles of various single mode and mixed-mode delamination test specimens.
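The irreversibility idea above, tracking damage with a single history-dependent parameter so unloading cannot restore cohesion, can be sketched in a few lines. This is a minimal single-mode bilinear illustration, not the paper's mixed-mode formulation; the stiffness and displacement thresholds are illustrative values:

```python
def cohesive_traction(delta_history, K=1.0e5, delta0=1e-3, delta_f=1e-2):
    """Bilinear cohesive law with an irreversible damage parameter.

    delta_history : sequence of relative displacements (loading/unloading)
    K       : initial (penalty) stiffness          -- illustrative value
    delta0  : displacement at damage onset         -- illustrative value
    delta_f : displacement at complete decohesion  -- illustrative value
    """
    tractions, d = [], 0.0
    for delta in delta_history:
        # Damage is driven by the maximum displacement ever reached,
        # so unloading/reloading cannot restore the cohesive state.
        if delta > delta0:
            d_trial = delta_f * (delta - delta0) / (delta * (delta_f - delta0))
            d = max(d, min(1.0, d_trial))
        tractions.append((1.0 - d) * K * delta)
    return tractions

# Load up, unload, reload: the unloading branch returns to the origin
# with the degraded stiffness (1 - d) * K, not the original stiffness.
t = cohesive_traction([0.002, 0.005, 0.001, 0.005])
```

Reloading to a previously reached displacement reproduces the same traction, while the softening branch gives lower traction at higher opening.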

909 citations


Journal ArticleDOI
TL;DR: In this article, an in-depth review of boundary-layer flow-separation control by a passive method using low-profile vortex generators, defined as devices with a height between 10% and 50% of the boundary-layer thickness, is presented.

874 citations


Journal ArticleDOI
TL;DR: In this article, single wall carbon nanotube reinforced polyimide nanocomposites were synthesized by in situ polymerization of monomers of interest in the presence of sonication.

764 citations


Journal ArticleDOI
TL;DR: Interrupting technologies are already widespread and include concurrent multitasking; mixed-initiative interaction; support for delegation and supervisory control of automation, including intelligent agents; and other distributed, background services and technologies that increase human-human communication.
Abstract: At first glance it seems absurd that busy people doing important jobs should want their computers to interrupt them. Interruptions are disruptive and people need to concentrate to make good decisions. However, successful job performance also frequently depends on people's abilities to (a) constantly monitor their dynamically changing information environments, (b) collaborate and communicate with other people in the system, and (c) supervise background autonomous services. These critical abilities can require people to simultaneously query a large set of information sources, continuously monitor for important events, and respond to and communicate with other human operators. Automated monitoring and alerting systems minimize the need to constantly monitor, but they induce alerts that may interrupt other activities. Such interrupting technologies are already widespread and include concurrent multitasking; mixed-initiative interaction; support for delegation and supervisory control of automation, including intelligent agents; and other distributed, background services and technologies that increase human-human communication. People do not perform sustained, simultaneous, multichannel sampling well; however, they have great capacity to manage concurrent activities when given specific kinds of interface support. Literature from many domains shows deleterious consequences of human performance in interrupt-laden situations when interfaces do not support this aspect of the task environment. This article identifies why human interruption is an important human-computer interaction problem, and why it will continue to grow in ubiquity and importance. We provide examples of this problem in real-world systems, and we review theoretical tools for understanding human interruption. Based on interdisciplinary scientific results, we suggest potential approaches to user-interface design to help people effectively manage interruptions.

505 citations


Journal ArticleDOI
TL;DR: Two approaches for force evaluation in the lattice Boltzmann equation are investigated: the momentum-exchange method and the stress-integration method on the surface of a body; the momentum-exchange method is found to be reliable, accurate, and easy to implement for both two-dimensional and three-dimensional flows.
Abstract: The present work investigates two approaches for force evaluation in the lattice Boltzmann equation: the momentum-exchange method and the stress-integration method on the surface of a body. The boundary condition for the particle distribution functions on curved geometries is handled with second-order accuracy based on our recent works [Mei et al., J. Comput. Phys. 155, 307 (1999); ibid. 161, 680 (2000)]. The stress-integration method is computationally laborious for two-dimensional flows and in general difficult to implement for three-dimensional flows, while the momentum-exchange method is reliable, accurate, and easy to implement for both two-dimensional and three-dimensional flows. Several test cases are selected to evaluate the present methods, including: (i) two-dimensional pressure-driven channel flow; (ii) two-dimensional uniform flow past a column of cylinders; (iii) two-dimensional flow past a cylinder asymmetrically placed in a channel (with vortex shedding); (iv) three-dimensional pressure-driven flow in a circular pipe; and (v) three-dimensional flow past a sphere. The drag evaluated by using the momentum-exchange method agrees well with the exact or other published results.
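The momentum-exchange principle described above can be sketched compactly. The following assumes the simplest case, full bounce-back at a stationary wall on a uniform D2Q9 grid with periodic boundaries; the paper's second-order curved-boundary treatment interpolates the populations, which this sketch omits:

```python
import numpy as np

# D2Q9 lattice velocities; index 0 is the rest particle.
E = np.array([[0, 0], [1, 0], [0, 1], [-1, 0], [0, -1],
              [1, 1], [-1, 1], [-1, -1], [1, -1]])

def momentum_exchange_force(f, solid):
    """Net force on a solid body via the momentum-exchange method.

    f     : (9, nx, ny) post-collision distribution functions
    solid : (nx, ny) boolean mask of solid nodes (periodic domain)

    With simple bounce-back at a resting wall, each fluid->solid link
    transfers momentum 2 * e_i * f_i to the body; curved-boundary
    schemes replace this with interpolated populations.
    """
    force = np.zeros(2)
    for i in range(1, 9):
        ex, ey = E[i]
        # Mark fluid nodes whose neighbor in direction e_i is solid.
        neighbor_is_solid = np.roll(np.roll(solid, -ex, axis=0), -ey, axis=1)
        links = (~solid) & neighbor_is_solid
        force += 2.0 * E[i] * f[i][links].sum()
    return force
```

A quick sanity check: a body immersed in a uniform fluid at rest experiences zero net force, since link contributions cancel by symmetry.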

389 citations


Journal ArticleDOI
01 Feb 2002-Science
TL;DR: New evidence is presented that the top-of-atmosphere (TOA) tropical radiative energy budget is much more dynamic and variable than previously thought and the results indicate that the radiation budget changes are caused by changes in tropical mean cloudiness.
Abstract: It is widely assumed that variations in Earth's radiative energy budget at large time and space scales are small. We present new evidence from a compilation of over two decades of accurate satellite data that the top-of-atmosphere (TOA) tropical radiative energy budget is much more dynamic and variable than previously thought. Results indicate that the radiation budget changes are caused by changes in tropical mean cloudiness. The results of several current climate model simulations fail to predict this large observed variation in tropical energy budget. The missing variability in the models highlights the critical need to improve cloud modeling in the tropics so that prediction of tropical climate on interannual and decadal time scales can be improved.

368 citations


Journal ArticleDOI
TL;DR: In this article, the authors define the radiative forcings used in climate simulations with the SI2000 version of the Goddard Institute for Space Studies (GISS) global climate model and illustrate the global response to these forcings with specified sea surface temperature and with a simple Q-flux ocean.
Abstract: We define the radiative forcings used in climate simulations with the SI2000 version of the Goddard Institute for Space Studies (GISS) global climate model. These include temporal variations of well-mixed greenhouse gases, stratospheric aerosols, solar irradiance, ozone, stratospheric water vapor, and tropospheric aerosols. Our illustrations focus on the period 1951–2050, but we make the full data sets available for those forcings for which we have earlier data. We illustrate the global response to these forcings for the SI2000 model with specified sea surface temperature and with a simple Q-flux ocean, thus helping to characterize the efficacy of each forcing. The model yields good agreement with observed global temperature change and heat storage in the ocean. This agreement does not yield an improved assessment of climate sensitivity or a confirmation of the net climate forcing because of possible compensations with opposite changes of these quantities. Nevertheless, the results imply that observed global temperature change during the past 50 years is primarily a response to radiative forcings. It is also inferred that the planet is now out of radiation balance by 0.5 to 1 W/m² and that additional global warming of about 0.5°C is already "in the pipeline."

365 citations


Journal ArticleDOI
TL;DR: Instrumentation and retrieval algorithms are described which use the forward scattered range-coded signals from the global positioning system (GPS) radio navigation system for the measurement of sea surface roughness.
Abstract: Instrumentation and retrieval algorithms are described which use the forward scattered range-coded signals from the global positioning system (GPS) radio navigation system for the measurement of sea surface roughness. This roughness has long been known to be dependent upon the surface wind speed. Experiments were conducted from aircraft along the TOPEX ground track and over experimental surface truth buoys. These flights used a receiver capable of recording the cross-correlation power in the reflected signal. The shape of this power distribution was then compared against analytical models, which employ a geometric optics approach. Two techniques for matching these functions were studied. The first recognized that the most significant information content in the reflected signal is contained in the trailing edge slope of the waveform. The second attempted to match the complete shape of the waveform by approximating it as a series expansion and obtaining the nonlinear least squares estimate. Discussion is also presented on anomalies in the receiver operation and their identification and correction.

328 citations


Proceedings ArticleDOI
TL;DR: The fundamental scientific implications of this form of image processing, namely: the visual inadequacy of the linear representation of digital images, the existence of a canonical or statistical ideal visual image, and new measures of visual quality based upon these insights derived from the extensive experience with MSRCR enhanced images are explored.
Abstract: In the last published concept (1986) for a Retinex computation, Edwin Land introduced a center/surround spatial form, which was inspired by the receptive field structures of neurophysiology. With this as our starting point we have over the years developed this concept into a full scale automatic image enhancement algorithm - the Multi-Scale Retinex with Color Restoration (MSRCR) which combines color constancy with local contrast/lightness enhancement to transform digital images into renditions that approach the realism of direct scene observation. The MSRCR algorithm has proven to be quite general purpose, and very resilient to common forms of image pre-processing such as reasonable ranges of gamma and contrast stretch transformations. More recently we have been exploring the fundamental scientific implications of this form of image processing, namely: (i) the visual inadequacy of the linear representation of digital images, (ii) the existence of a canonical or statistical ideal visual image, and (iii) new measures of visual quality based upon these insights derived from our extensive experience with MSRCR enhanced images. The lattermost serves as the basis for future schemes for automating visual assessment - a primitive first step in bringing visual intelligence to computers.
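The center/surround form that MSRCR builds on is simple to state: at each scale, subtract the log of a Gaussian-blurred "surround" from the log of the pixel itself, then average over scales. Below is a minimal NumPy sketch of that multi-scale retinex core only; the color-restoration stage of MSRCR is omitted, and the function names and default scales are our own illustrative choices, not the authors' implementation:

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable Gaussian blur with reflective padding (NumPy only).
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, r, mode='reflect')
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 0, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode='valid'), 1, tmp)

def multiscale_retinex(img, sigmas=(15, 80, 250)):
    """Multi-scale retinex: average of log(center) - log(surround) over scales.

    sigmas: surround scales in pixels -- illustrative defaults.
    """
    img = img.astype(float) + 1.0            # avoid log(0)
    out = np.zeros_like(img)
    for s in sigmas:
        out += np.log(img) - np.log(gaussian_blur(img, s))
    return out / len(sigmas)
```

A uniform image maps to zero everywhere (no local contrast), while a bright spot on a dark background produces a positive response at the spot, which is the local contrast/lightness enhancement the abstract describes.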

287 citations


Journal ArticleDOI
TL;DR: An adaptive control scheme using output feedback for output tracking is developed for systems with unknown actuator failures and closed-loop signal boundedness and asymptotic output tracking are ensured analytically and verified by simulation results.
Abstract: An adaptive control scheme using output feedback for output tracking is developed for systems with unknown actuator failures. Such actuator failures are characterized by some unknown inputs stuck at some unknown fixed values at unknown time instants. An effective output feedback controller structure is proposed for actuator failure compensation. When implemented with true matching parameters, the controller achieves desired plant-model output matching. When implemented with adaptive parameter estimates, the controller achieves asymptotic output tracking. A stable adaptive law is derived for parameter adaptation in the presence of parameter uncertainties. Closed-loop signal boundedness and asymptotic output tracking, despite the uncertainties in actuator failures and plant parameters, are ensured analytically and verified by simulation results.

276 citations


Journal ArticleDOI
TL;DR: In this paper, a set of design codes based on a discrete adjoint method is extended to a multiprocessor environment using a shared memory approach, and a nearly linear speedup is demonstrated, and the consistency of the linearizations is shown to remain valid.
Abstract: Recent improvements in an unstructured-grid method for large-scale aerodynamic design are presented. Previous work had shown such computations to be prohibitively long in a sequential processing environment. Also, robust adjoint solutions and mesh movement procedures were difficult to realize, particularly for viscous flows. To overcome these limiting factors, a set of design codes based on a discrete adjoint method is extended to a multiprocessor environment using a shared memory approach. A nearly linear speedup is demonstrated, and the consistency of the linearizations is shown to remain valid. The full linearization of the residual is used to precondition the adjoint system, and a significantly improved convergence rate is obtained. A new mesh movement algorithm is implemented, and several advantages over an existing technique are presented.

Journal ArticleDOI
TL;DR: A survey of CFD methods applied to the computation of high-lift multi-element configurations over the last 10-15 years is presented in this article, covering both 2-D and 3-D configurations.

Book ChapterDOI
TL;DR: In this paper, the development and evaluation of an original flexible-wing-based Micro Air Vehicle (MAV) technology that reduces adverse effects of gusty wind conditions and unsteady aerodynamics, exhibits desirable flight stability, and enhances structural durability is described.
Abstract: This paper documents the development and evaluation of an original flexible-wing-based Micro Air Vehicle (MAV) technology that reduces adverse effects of gusty wind conditions and unsteady aerodynamics, exhibits desirable flight stability, and enhances structural durability. The flexible wing concept has been demonstrated on aircraft with wingspans ranging from 18 inches to 5 inches. Salient features of the flexible-wing-based MAV, including the vehicle concept, flexible wing design, novel fabrication methods, aerodynamic assessment, and flight data analysis are presented.

Journal ArticleDOI
TL;DR: This work examines a representative class of MDO problem formulations known as collaborative optimization, and discusses an alternative problem formulation, distributed analysis optimization, that yields a more tractable computational optimization problem.
Abstract: Analytical features of multidisciplinary optimization (MDO) problem formulations have significant practical consequences for the ability of nonlinear programming algorithms to solve the resulting computational optimization problems reliably and efficiently. We explore this important but frequently overlooked fact using the notion of disciplinary autonomy. Disciplinary autonomy is a desirable goal in formulating and solving MDO problems; however, the resulting system optimization problems are frequently difficult to solve. We illustrate the implications of MDO problem formulation for the tractability of the resulting design optimization problem by examining a representative class of MDO problem formulations known as collaborative optimization. We also discuss an alternative problem formulation, distributed analysis optimization, that yields a more tractable computational optimization problem.

Journal ArticleDOI
TL;DR: In this paper, an intercomparison study of midlatitude continental cumulus convection simulated by eight two-dimensional and twothree-dimensional cloud-resolving models (CRMs), driven by observed large-scale advective temperature and moisture tendencies, surface turbulent euxes, and radiative-heating proe les during three sub-periods of the summer 1997 Intensive Observation Period of the US Department of Energy's Atmospheric Radiation Measurement (ARM) program was performed.
Abstract: This paper reports an intercomparison study of midlatitude continental cumulus convection simulated by eight two-dimensional and two three-dimensional cloud-resolving models (CRMs), driven by observed large-scale advective temperature and moisture tendencies, surface turbulent fluxes, and radiative-heating profiles during three sub-periods of the summer 1997 Intensive Observation Period of the US Department of Energy's Atmospheric Radiation Measurement (ARM) program. Each sub-period includes two or three precipitation events of various intensities over a span of 4 or 5 days. The results can be summarized as follows. CRMs can reasonably simulate midlatitude continental summer convection observed at the ARM Cloud and Radiation Testbed site in terms of the intensity of convective activity, and the temperature and specific-humidity evolution. Delayed occurrences of the initial precipitation events are a common feature for all three sub-cases among the models. Cloud mass fluxes, condensate mixing ratios, and hydrometeor fractions produced by all CRMs are similar. Some of the simulated cloud properties such as cloud liquid-water path and hydrometeor fraction are rather similar to available observations. All CRMs produce large downdraught mass fluxes with magnitudes similar to those of updraughts, in contrast to CRM results for tropical convection. Some inter-model differences in cloud properties are likely to be related to those in the parametrizations of microphysical processes. There is generally good agreement between the CRMs and observations, with CRMs being significantly better than single-column models (SCMs), suggesting that current results are suitable for use in improving parametrizations in SCMs. However, improvements can still be made in the CRM simulations; these include the proper initialization of the CRMs and a more proper method of diagnosing cloud boundaries in model outputs for comparison with satellite and radar cloud observations.

29 Oct 2002
TL;DR: The vertical profiles of ozone trends provide a fingerprint for the mechanisms of ozone depletion over the last two decades, particularly for northern hemisphere midlatitudes where most balloon and ground-based measurements are made.
Abstract: Analyses of satellite, ground-based, and balloon measurements allow updated estimates of trends in the vertical profile of ozone since 1979. The results show overall consistency among several independent measurement systems, particularly for northern hemisphere midlatitudes where most balloon and ground-based measurements are made. Combined trend estimates over these latitudes for the period 1979-96 show statistically significant negative trends at all altitudes between 10 and 45 km, with two local extremes: -7.4 +/- 2.0% per decade at 40 km and -7.3 +/- 4.6% per decade at 15 km altitude. There is a strong seasonal variation in trends over northern midlatitudes in the altitude range of 10 to 18 km, with the largest ozone loss during winter and spring. The profile trends are in quantitative agreement with independently measured trends in column ozone, the amount of ozone in a column above the surface. The vertical profiles of ozone trends provide a fingerprint for the mechanisms of ozone depletion over the last two decades.


Journal ArticleDOI
TL;DR: It is concluded that reliable integration is most efficiently provided by fourth-order Runge–Kutta methods for this problem, where order reduction is not observed.
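For reference, the classical fourth-order Runge–Kutta step the TL;DR refers to has the familiar form below; the test problem y' = -y is our own illustration, not taken from the paper:

```python
def rk4_step(f, t, y, h):
    """One step of the classical fourth-order Runge-Kutta method for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Integrate y' = -y, y(0) = 1 from t = 0 to t = 1 in ten steps;
# the exact answer is e^{-1} ≈ 0.3679, and the global error is O(h^4).
y, t, h = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: -y, t, y, h)
    t += h
```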

Journal ArticleDOI
TL;DR: In this paper, an active separation control experiment was conducted in a cryogenic pressurized wind tunnel on a wall-mounted bump at chord Reynolds numbers from 2.4 × 10⁶ to 26 × 10⁶ and a Mach number of 0.25.
Abstract: An active separation control experiment was conducted in a cryogenic pressurized wind tunnel on a wall-mounted bump at chord Reynolds numbers from 2.4 × 10⁶ to 26 × 10⁶ and a Mach number of 0.25. The model simulates the upper surface of a 20% thick Glauert-Goldschmied-type airfoil at zero incidence. The turbulent boundary layer of the tunnel sidewall flows over the model and eliminates laminar-turbulent transition from the problem. Indeed, the Reynolds number either based on the chord or boundary-layer thickness had a negligible effect on the flow and its control. Without control, a large turbulent separation bubble is formed at the lee side of the model. Periodic excitation and steady suction or blowing were applied to eliminate gradually the separation bubble. Detailed effects due to variations in the excitation frequency, amplitude, and the steady mass flux are described and compared to those of steady suction or blowing.

01 Jul 2002
TL;DR: The needs and opportunities for computational and experimental methods that provide accurate, efficient solutions to nondeterministic multidisciplinary aerospace vehicle design problems are identified.
Abstract: This report consists of a survey of the state of the art in uncertainty-based design together with recommendations for a base research activity in this area for the NASA Langley Research Center. This report identifies the needs and opportunities for computational and experimental methods that provide accurate, efficient solutions to nondeterministic multidisciplinary aerospace vehicle design problems. Barriers to the adoption of uncertainty-based design methods are identified, and the benefits of the use of such methods are explained. Particular research needs are listed.

Journal ArticleDOI
TL;DR: A new version of the bilevel integrated system synthesis (BLISS) method is introduced for optimization of engineering systems conducted by distributed specialty groups working concurrently in a multiprocessor computing environment; it is shown that, if the problem is convex, the solution of the decomposed problem is the same as that obtained without decomposition.
Abstract: The paper introduces a new version of the Bi-Level Integrated System Synthesis (BLISS) method intended for optimization of engineering systems conducted by distributed specialty groups working concurrently and using a multiprocessor computing environment. The method decomposes the overall optimization task into subtasks associated with disciplines or subsystems, in which the local design variables are numerous, and a single system-level optimization whose design variables are relatively few. The subtasks are fully autonomous as to their inner operations and decision making. Their purpose is to eliminate the local design variables and generate a wide spectrum of feasible designs whose behavior is represented by response surfaces to be accessed by the system-level optimization. It is shown that, if the problem is convex, the solution of the decomposed problem is the same as that obtained without decomposition. A simplified example of an aircraft design shows the method working as intended. The paper includes a discussion of the method's merits and demerits and recommendations for further research.

Proceedings ArticleDOI
01 Jan 2002
TL;DR: The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented here and recommendations as to the utility of the algorithm in future multidisciplinary optimization applications are made.
Abstract: The purpose of this paper is to demonstrate the application of particle swarm optimization to a realistic multidisciplinary optimization test problem. The paper's new contributions to multidisciplinary optimization are the application of a new algorithm for dealing with the unique challenges associated with multidisciplinary optimization problems and recommendations as to the utility of the algorithm in future multidisciplinary optimization applications. The selected example is a bi-level optimization problem that demonstrates severe numerical noise and has a combination of continuous and truly discrete design variables. The use of traditional gradient-based optimization algorithms is thus not practical. The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented here. The algorithm is capable of dealing with the unique challenges posed by multidisciplinary optimization as well as the numerical noise and truly discrete variables present in the current example problem.
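Particle swarm optimization is attractive for noisy, mixed-variable problems like the one above precisely because it uses no gradients: each particle moves under an inertia term plus random pulls toward its personal best and the swarm's global best. The sketch below is a generic textbook PSO, not the specific variant used in the paper, and the parameter values (inertia w, cognitive/social weights c1, c2) are common defaults rather than the authors' settings:

```python
import numpy as np

def pso(obj, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimizer (gradient-free, noise-tolerant).

    obj    : objective mapping an (ndim,) array to a scalar
    bounds : (ndim, 2) array of lower/upper bounds
    """
    rng = np.random.default_rng(seed)
    bounds = np.asarray(bounds, dtype=float)
    lo, hi = bounds[:, 0], bounds[:, 1]
    x = rng.uniform(lo, hi, (n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([obj(p) for p in x])
    g = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)  # velocity update
        x = np.clip(x + v, lo, hi)                             # position update
        f = np.array([obj(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[pbest_f.argmin()].copy()
    return g, pbest_f.min()
```

On a smooth test function such as the 2-D sphere, this swarm converges to the minimum without ever evaluating a derivative, which is why the approach survives the numerical noise and discrete variables that defeat gradient-based methods.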

Journal ArticleDOI
01 Jan 2002-Polymer
TL;DR: In this paper, polyimide/organoclay mixtures in N-methyl-2-pyrrolidinone (NMP) were used to achieve fully exfoliated nanocomposites, which were characterized by differential scanning calorimetry, dynamic thermogravimetric analysis (TGA), transmission electron microscopy (TEM), X-ray diffraction (XRD), and thin-film tensile properties.

Journal ArticleDOI
TL;DR: The atmospheric ionizing radiation (AIR) project made simultaneous radiation measurements with 14 instruments on five flights of a NASA ER-2 high-altitude aircraft, measuring the cosmic-ray neutron spectrum, total neutron fluence rate, and neutron effective dose and dose equivalent rates and their dependence on altitude and geomagnetic cutoff.
Abstract: Crews working on present-day jet aircraft are a large occupationally exposed group with a relatively high average effective dose from galactic cosmic radiation. Crews of future high-speed commercial aircraft flying at higher altitudes would be even more exposed. To help reduce the significant uncertainties in calculations of such exposures, the atmospheric ionizing radiation (AIR) project, an international collaboration of 15 laboratories, made simultaneous radiation measurements with 14 instruments on five flights of a NASA ER-2 high-altitude aircraft. The primary AIR instrument was a highly sensitive extended-energy multisphere neutron spectrometer with lead and steel shells placed within the moderators of two of its 14 detectors to enhance response at high energies. Detector responses were calculated for neutrons and charged hadrons at energies up to 100 GeV using MCNPX. Neutron spectra were unfolded from the measured count rates using the new MAXED code. We have measured the cosmic-ray neutron spectrum (thermal to >10 GeV), total neutron fluence rate, and neutron effective dose and dose equivalent rates and their dependence on altitude and geomagnetic cutoff. The measured cosmic-ray neutron spectra have almost no thermal neutrons, a large "evaporation" peak near 1 MeV and a second broad peak near 100 MeV which contributes about 69% of the neutron effective dose. At high altitude, geomagnetic latitude has very little effect on the shape of the spectrum, but it is the dominant variable affecting neutron fluence rate, which was eight times higher at the northernmost measurement location than it was at the southernmost. The shape of the spectrum varied only slightly with altitude from 21 km down to 12 km (56–201 g cm⁻² atmospheric depth), but was significantly different on the ground. In all cases, ambient dose equivalent was greater than effective dose for cosmic-ray neutrons.

Journal ArticleDOI
TL;DR: In this paper, the authors used rate effects to study low-temperature deformation mechanisms using nanoindentation creep and load relaxation, found that the rate effects are conspicuous in terms of the rate sensitivity of the hardness, ∂H/∂ ln ε̇_eff, calculated the activation volume, V*, and compared data from indentation creep with data from uniaxial loading.

Journal ArticleDOI
TL;DR: In this article, the authors developed automated routines to derive water vapor mixing ratio, relative humidity, aerosol extinction and backscatter coefficient, and linear depolarization profiles, as well as total precipitable water vapor and aerosol optical thickness from the operational Raman lidar at the Atmospheric Radiation Measurement (ARM) program's site in north-central Oklahoma.
Abstract: Automated routines have been developed to derive water vapor mixing ratio, relative humidity, aerosol extinction and backscatter coefficient, and linear depolarization profiles, as well as total precipitable water vapor and aerosol optical thickness, from the operational Raman lidar at the Atmospheric Radiation Measurement (ARM) program's site in north-central Oklahoma. These routines have been devised to maintain the calibration of these data products, which have proven sensitive to the automatic alignment adjustments that are made periodically by the instrument. Since this Raman lidar does not scan, aerosol extinction cannot be directly computed below approximately 800 m due to the incomplete overlap of the outgoing laser beam with the detector's field of view. Therefore, the extinction-to-backscatter ratio at 1 km is used with the aerosol backscatter coefficient profile to compute aerosol extinction from 60 m to the level of complete overlap. Comparisons of aerosol optical depth derived using

Journal ArticleDOI
TL;DR: In this paper, the shortwave bulk optical properties of seven ice particle shapes, or "habits", are parameterized as a function of the effective radius and ice water content by integrating the scattering properties over 30 in situ size distributions.
Abstract: The relative importance of ice clouds in the climate system is highly uncertain. Measurements of their microphysical properties are sparse, especially given their complex structure and large variability in particle size, shape, and density. To better understand the role of ice clouds in the climate system, parameterizations of their radiative properties are needed. The shortwave bulk optical properties of seven ice particle shapes, or "habits," are parameterized as a function of the effective "radius" and ice water content by integrating the scattering properties over 30 in situ size distributions. The particle habits are solid and hollow hexagonal columns, hexagonal plates, two- and three-dimensional bullet rosettes, aggregates of columns, and dendrites. Parameterizations of the volume extinction coefficient, single-scattering albedo, and the asymmetry parameter are presented for 6, 24, and 56 band shortwave schemes from 0.2 to 5.0 µm. Applications to downwelling flux and upwelling radiance calculations indicate that differences in fluxes for various habits can be more than 15%, and differences in retrievals of cloud optical depth from satellite visible reflectances can be more than 50%.

Journal ArticleDOI
TL;DR: This paper supplies a presentation of experiments on a commercial robot that demonstrate the effectiveness of iterative learning control, improving the tracking accuracy of the robot performing a high speed maneuver by a factor of 100 in six repetitions.
Abstract: Iterative learning control (ILC) applies to control systems that perform the same finite-time tracking command repeatedly. It iteratively adjusts the command from one repetition to the next in order to reduce the tracking error. This creates a two-dimensional (2-D) system, with time step and repetition number as independent variables. The simplest form of ILC uses only one gain times one error in the previous repetition, and can be shown to converge to zero tracking error independent of the system dynamics. Hence, it appears very effective from a mathematical perspective. However, in practice, there are unacceptable learning transients. A zero-phase low-pass filter is introduced here to eliminate the worst transients. The main purpose of this paper is to supply a presentation of experiments on a commercial robot that demonstrate the effectiveness of this approach, improving the tracking accuracy of the robot performing a high-speed maneuver by a factor of 100 in six repetitions. Experiments using a two-gain ILC reach this error level in only three iterations. It is suggested that these two simple ILC laws are, for learning control, the equivalent of proportional and PD control in classical control system design. Thus, what was an impractical approach becomes practical, easy to apply, and effective.
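The single-gain ILC law the abstract describes, u_{k+1}(t) = u_k(t) + gamma * e_k(t+1), can be demonstrated on a toy plant. The first-order discrete system, the learning gain, and the sinusoidal command below are invented for illustration (they are not the robot or tuning from the paper), but they show the per-repetition error reduction that makes ILC attractive.

```python
import numpy as np

def simulate(u, a=0.7, b=1.0):
    """First-order plant y[t+1] = a*y[t] + b*u[t], y[0] = 0.
    Returns the output sequence aligned so index t pairs with input u[t]."""
    y = np.zeros(len(u) + 1)
    for t in range(len(u)):
        y[t + 1] = a * y[t] + b * u[t]
    return y[1:]

T = 50
t = np.arange(T)
y_des = np.sin(2 * np.pi * t / T)   # finite-time tracking command, repeated each run

u = np.zeros(T)
gamma = 0.5                          # single learning gain
errors = []
for k in range(6):                   # six repetitions, as in the abstract's experiment
    y = simulate(u)
    e = y_des - y                    # error at t+1 is aligned with u[t] here
    errors.append(np.max(np.abs(e)))
    u = u + gamma * e                # u_{k+1}(t) = u_k(t) + gamma * e_k(t+1)
```

For this well-behaved plant no filtering is needed; on real hardware the learning transients mentioned in the abstract appear at high frequencies, which is what the zero-phase low-pass filter (applied forward and backward so it adds no phase lag) is meant to suppress.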

Journal ArticleDOI
TL;DR: In this article, a spatiotemporal resolution of the free shear layer in the slat-cove region is used to obtain the farfield acoustics of a multi-element, high-lift configuration.
Abstract: Unsteady computational simulations of a multi-element, high-lift configuration are performed. Emphasis is placed on accurate spatiotemporal resolution of the free shear layer in the slat-cove region. The excessive dissipative effects of the turbulence model, so prevalent in previous simulations, are circumvented by switching off the turbulence-production term in the slat-cove region. The justifications and physical arguments for taking such a step are explained in detail. The removal of this excess damping allows the shear layer to amplify large-scale structures, to achieve a proper nonlinear saturation state, and to permit vortex merging. The large-scale disturbances are self-excited, and unlike our prior fully turbulent simulations, no external forcing of the shear layer is required. To obtain the farfield acoustics, the Ffowcs Williams and Hawkings equation is evaluated numerically using the simulated time-accurate flow data. The present comparison between the computed and measured farfield acoustic spectra shows much better agreement for the amplitude and frequency content than past calculations. The effects of the angle of attack on the slat's flow features and radiated acoustic field are also simulated and presented.

Journal ArticleDOI
TL;DR: In this article, structural analysis and design technology for buckling-critical shell structures are discussed, including a hierarchical analysis strategy that includes analyses that range from classical analysis methods to high-fidelity nonlinear finite element analysis methods, reliability based design methods, and the identification of traditional and nontraditional initial imperfections for composite shell structures.
Abstract: Recent advances in structural analysis and design technology for buckling-critical shell structures are discussed. These advances include a hierarchical analysis strategy that includes analyses that range from classical analysis methods to high-fidelity nonlinear finite element analysis methods, reliability-based design methods, the development of imperfection data bases, and the identification of traditional and nontraditional initial imperfections for composite shell structures. When used judiciously, these advances provide the basis for a viable alternative to the traditional and conservative lower-bound design philosophy of the 1960s. These advances also help answer the question of why, after so many years of concentrated research effort to understand the behavior of buckling-critical thin-walled shells, one has not been able to improve on this conservative lower-bound design philosophy in the past.
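The "conservative lower-bound design philosophy of the 1960s" referred to here is the practice of multiplying the classical buckling load by an empirical knockdown factor enveloping test scatter. A worked instance of that approach, using the well-known NASA SP-8007 knockdown form for axially compressed isotropic cylinders, is sketched below; the material and geometry numbers are an invented example, not data from the paper.

```python
import math

def classical_axial_buckling_stress(E, t, R, nu=0.3):
    """Classical critical stress for an axially compressed isotropic cylinder:
    sigma_cl = E*t / (R * sqrt(3*(1 - nu^2))), i.e. ~0.605*E*t/R for nu = 0.3."""
    return E * t / (R * math.sqrt(3.0 * (1.0 - nu ** 2)))

def sp8007_knockdown(R, t):
    """Empirical lower-bound knockdown factor in the NASA SP-8007 form:
    gamma = 1 - 0.901*(1 - exp(-phi)),  phi = sqrt(R/t)/16."""
    phi = math.sqrt(R / t) / 16.0
    return 1.0 - 0.901 * (1.0 - math.exp(-phi))

# Example: thin aluminum shell, R/t = 400 (E in Pa, t and R in m)
E, t, R = 70e9, 1.0e-3, 0.4
sigma_cl = classical_axial_buckling_stress(E, t, R)
gamma = sp8007_knockdown(R, t)
sigma_design = gamma * sigma_cl   # lower-bound design allowable
```

For this geometry the knockdown factor is roughly 0.36, i.e. the design allowable is about a third of the classical prediction, which illustrates the conservatism that the imperfection-characterization and high-fidelity analysis advances in the abstract aim to relax.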