
Showing papers by "Naval Postgraduate School" published in 2006


Journal ArticleDOI
TL;DR: New bilevel and trilevel optimization models are applied to make critical infrastructure more resilient against terrorist attacks, and insights gained from the modeling experience and many “red-team” exercises are reported.
Abstract: We apply new bilevel and trilevel optimization models to make critical infrastructure more resilient against terrorist attacks. Each model features an intelligent attacker (terrorists) and a defender (us), information transparency, and sequential actions by attacker and defender. We illustrate with examples of the US Strategic Petroleum Reserve, the US Border Patrol at Yuma, Arizona, and an electrical transmission system. We conclude by reporting insights gained from the modeling experience and many “red-team” exercises. Each exercise gathers open-source data on a real-world infrastructure system, develops an appropriate bilevel or trilevel model, and uses these to identify vulnerabilities in the system or to plan an optimal defense.
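
As a hedged illustration of the max-min structure such attacker-defender models share (a toy sketch solved by enumeration, not the authors' formulation; component names, loss values, and budgets are hypothetical):

```python
from itertools import combinations

# Hypothetical infrastructure components and the loss incurred if each is disabled.
loss = {"pump_A": 40, "pump_B": 35, "pipeline": 25, "depot": 50, "substation": 30}
DEFENSE_BUDGET = 2   # components the defender can harden
ATTACK_BUDGET = 2    # components the attacker can disable

def attacker_best_response(hardened):
    """Attacker disables the ATTACK_BUDGET unhardened components with the largest loss."""
    targets = sorted((c for c in loss if c not in hardened),
                     key=lambda c: loss[c], reverse=True)[:ATTACK_BUDGET]
    return targets, sum(loss[c] for c in targets)

# Defender minimizes the attacker's maximum achievable loss (the bilevel structure, here by enumeration).
best_plan, best_worst_case = None, float("inf")
for plan in combinations(loss, DEFENSE_BUDGET):
    _, worst_case = attacker_best_response(set(plan))
    if worst_case < best_worst_case:
        best_plan, best_worst_case = plan, worst_case

print("harden:", best_plan, "worst-case loss:", best_worst_case)
```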

731 citations


Journal ArticleDOI
TL;DR: It is found that if remanufacturing is very profitable, the original-equipment manufacturer may forgo some of the first-period margin by lowering the price and selling additional units to increase the number of cores available for remanufacturing in future periods.
Abstract: We study a firm that makes new products in the first period and uses returned cores to offer remanufactured products, along with new products, in future periods. We introduce the monopoly environment in two-period and multiperiod scenarios to identify thresholds in remanufacturing operations. Next, we focus our attention on the duopoly environment where an independent operator (IO) may intercept cores of products made by the original equipment manufacturer (OEM) to sell remanufactured products in future periods. We characterize the production quantities associated with self-selection and explore the effect of various parameters in the Nash equilibrium. Among other results, we find that if remanufacturing is very profitable, the original-equipment manufacturer may forgo some of the first-period margin by lowering the price and selling additional units to increase the number of cores available for remanufacturing in future periods. Further, as the threat of competition increases, the OEM is more likely to completely utilize all available cores, offering the remanufactured products at a lower price.
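
A toy two-period sketch, not the paper's model, can illustrate the first-period effect described above. It assumes linear inverse demand, perfect substitution between new and remanufactured units, and a fixed core return rate; all parameter values are hypothetical. With cheap remanufacturing, the optimal first-period quantity exceeds the one-period monopoly quantity because extra units create cores for period two.

```python
import numpy as np

# Hypothetical parameters: linear inverse demand p = a - Q, unit costs, and the core return rate.
a, c_new, c_reman, gamma = 100.0, 60.0, 20.0, 0.8

def second_period_profit(q1):
    """Best period-2 profit when at most gamma*q1 cores are available for remanufacturing."""
    q_n = np.linspace(0.0, a, 201)[:, None]           # new units in period 2
    q_r = np.linspace(0.0, gamma * q1, 201)[None, :]  # remanufactured units (core-constrained)
    price = a - (q_n + q_r)                           # new and remanufactured treated as perfect substitutes
    profit = (price - c_new) * q_n + (price - c_reman) * q_r
    return profit.max()

def total_profit(q1):
    return (a - q1 - c_new) * q1 + second_period_profit(q1)

q1_grid = np.linspace(0.0, a - c_new, 81)
q1_star = q1_grid[np.argmax([total_profit(q) for q in q1_grid])]
print(f"one-period monopoly quantity: {(a - c_new) / 2:.1f}")
print(f"optimal period-1 quantity with remanufacturing: {q1_star:.1f}")
```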

699 citations


Journal ArticleDOI
TL;DR: This paper argues for the independence of prediction and control, so that the pursuit of successful outcomes can occur through either predictive or control-oriented approaches, and develops and highlights control-oriented approaches to open new avenues for dealing with the uncertainty inherent in the question of what organizations should do next.
Abstract: Two prescriptions dominate the topic of what firms should do next in uncertain situations: planning approaches and adaptive approaches. These differ primarily on the appropriate role of prediction in the decision process. Prediction is a central issue in strategy making owing to the presumption that what can be predicted can be controlled. In this paper we argue for the independence of prediction and control. This implies that the pursuit of successful outcomes can occur through control-oriented approaches that may essentially be non-predictive. We further develop and highlight control-oriented approaches with particular emphasis on the question of what organizations should do next. We also explore how these approaches may impact the costs and risks of firm strategies as well as the firm's continual efforts to innovate. Copyright © 2006 John Wiley & Sons, Ltd.

640 citations


Journal ArticleDOI
TL;DR: Experimental results validate the quaternion-based Kalman filter design and show the feasibility of using inertial/magnetic sensor modules for real-time human body motion tracking.
Abstract: Real-time tracking of human body motion is an important technology in synthetic environments, robotics, and other human-computer interaction applications. This paper presents an extended Kalman filter designed for real-time estimation of the orientation of human limb segments. The filter processes data from small inertial/magnetic sensor modules containing triaxial angular rate sensors, accelerometers, and magnetometers. The filter represents rotation using quaternions rather than Euler angles or axis/angle pairs. Preprocessing of the acceleration and magnetometer measurements using the Quest algorithm produces a computed quaternion input for the filter. This preprocessing reduces the dimension of the state vector and makes the measurement equations linear. Real-time implementation and testing results of the quaternion-based Kalman filter are presented. Experimental results validate the filter design and show the feasibility of using inertial/magnetic sensor modules for real-time human body motion tracking.
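
A minimal sketch of the two stages the abstract describes, assuming (rather than implementing) a QUEST-like preprocessing step that supplies a quaternion measurement; the noise levels and rates are illustrative and this is not the authors' filter:

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions [w, x, y, z]."""
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([w0*w1 - x0*x1 - y0*y1 - z0*z1,
                     w0*x1 + x0*w1 + y0*z1 - z0*y1,
                     w0*y1 - x0*z1 + y0*w1 + z0*x1,
                     w0*z1 + x0*y1 - y0*x1 + z0*w1])

def propagate(q, omega, dt):
    """First-order quaternion integration of the body angular rate omega (rad/s)."""
    dq = 0.5 * quat_mult(q, np.concatenate(([0.0], omega)))
    q = q + dq * dt
    return q / np.linalg.norm(q)

dt, n_steps = 0.01, 500
omega_true = np.array([0.3, -0.2, 0.5])      # constant body rate, illustrative
q_true = np.array([1.0, 0.0, 0.0, 0.0])

q_est = np.array([1.0, 0.0, 0.0, 0.0])       # filter state: orientation quaternion
P = np.eye(4) * 0.1
Q = np.eye(4) * 1e-5                          # process noise (gyro error), illustrative
R = np.eye(4) * 1e-2                          # measurement noise, illustrative
rng = np.random.default_rng(0)

for _ in range(n_steps):
    q_true = propagate(q_true, omega_true, dt)
    gyro = omega_true + rng.normal(0, 0.02, 3)        # noisy rate-sensor reading
    # Predict: integrate the gyro rates.
    q_est = propagate(q_est, gyro, dt)
    P = P + Q
    # Update: linear measurement model z = q + v (H = I), quaternion assumed precomputed
    # by a QUEST-like step from accelerometer/magnetometer data; then renormalize.
    z = q_true + rng.normal(0, 0.05, 4)
    K = P @ np.linalg.inv(P + R)
    q_est = q_est + K @ (z - q_est)
    q_est /= np.linalg.norm(q_est)
    P = (np.eye(4) - K) @ P

print("final attitude error (deg):",
      np.degrees(2 * np.arccos(np.clip(abs(q_true @ q_est), -1, 1))))
```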

556 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that the surface wind speed associated with the east Asian monsoon has significantly weakened in both winter and summer in the recent three decades, and they found that the monsoon wind speed is also highly correlated with incoming solar radiation at the surface.
Abstract: [1] It is commonly believed that greenhouse-gas-induced global warming can weaken the east Asian winter monsoon but strengthen the summer monsoon, because of stronger warming over high-latitude land as compared to low-latitude oceans. In this study, we show that the surface wind speed associated with the east Asian monsoon has significantly weakened in both winter and summer in the recent three decades. From 1969 to 2000, the annual mean wind speed over China has decreased steadily by 28%, and the prevalence of windy days (daily mean wind speed > 5 m/s) has decreased by 58%. The temperature trends during this period have not been uniform. Significant winter warming in northern China may explain the decline of the winter monsoon, while the summer cooling in central south China and warming over the South China Sea and the western North Pacific Ocean may be responsible for weakening the summer monsoon. In addition, we found that the monsoon wind speed is also highly correlated with incoming solar radiation at the surface. The present results, when interpreted together with those of recent climate model simulations, suggest two mechanisms that govern the decline of the east Asian winter and summer monsoons, both of which may be related to human activity. The winter decline is associated with global-scale warming that may be attributed to increased greenhouse gas emission, while the summer decline is associated with local cooling over south-central China that may result from air pollution.

441 citations


BookDOI
01 Apr 2006
TL;DR: This book surveys critical infrastructure protection, moving from strategy, origins, and challenges through network-based vulnerability and risk analysis to sector studies of water, SCADA, power, energy, telecommunications, the Internet, cyber-threats, and cyber-security.
Abstract: Preface. About the Author. 1. Strategy. 2. Origins. 3. Challenges. 4. Networks. 5. Vulnerability Analysis. 6. Risk Analysis. 7. Water. 8. SCADA. 9. Power. 10. Energy. 11. Telecommunications. 12. Internet. 13. Cyber-Threats. 14. Cyber-Security.

348 citations


Journal ArticleDOI
TL;DR: The need to conduct a literature review is by no means limited to graduate students; scholarly researchers carry out literature reviews throughout their careers, and such reviews can pose challenges even to an experienced researcher.
Abstract: Students entering a graduate program often encounter a new type of assignment that differs from the papers they had to write in high school or as college undergraduates: the literature review (also known as a critical review essay). Put briefly, a literature review summarizes and evaluates a body of writings about a specific topic. The need to conduct such reviews is by no means limited to graduate students; scholarly researchers generally carry out literature reviews throughout their research careers. In a world where the Internet has broadened the range of potentially relevant sources, however, doing a literature review can pose challenges even to an experienced researcher. In drafting this overview, I have incorporated some points made by Paul Pitman in a lecture delivered to students at the Naval Postgraduate School. I have also incorporated some suggestions contained in a handout prepared by John Odell for students in the School of International Relations at the University of Southern California.

327 citations


Journal ArticleDOI
TL;DR: This article synthesizes an overview of rip current kinematics based on field observations and the scientific advances obtained from them; rip current flows are partitioned into mean, infragravity, very low frequency (vorticity), and tidal contributions, and each is found to contribute significantly to the total.

292 citations


Journal ArticleDOI
TL;DR: It is proved that a sequence of solutions to the PS-discretized constrained problem converges to the optimal solution of the continuous-time optimal control problem under mild and numerically verifiable conditions.
Abstract: We consider the optimal control of feedback linearizable dynamical systems subject to mixed state and control constraints. In general, a linearizing feedback control does not minimize the cost function. Such problems arise frequently in astronautical applications where stringent performance requirements demand optimality over feedback linearizing controls. In this paper, we consider a pseudospectral (PS) method to compute optimal controls. We prove that a sequence of solutions to the PS-discretized constrained problem converges to the optimal solution of the continuous-time optimal control problem under mild and numerically verifiable conditions. The spectral coefficients of the state trajectories provide a practical method to verify the convergence of the computed solution. The proposed ideas are illustrated by several numerical examples.
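
The key object behind a Legendre pseudospectral discretization is the differentiation matrix on Legendre-Gauss-Lobatto nodes; the sketch below (not the authors' algorithm) builds it and demonstrates the spectral accuracy that underlies the convergence checks mentioned in the abstract:

```python
import numpy as np
from numpy.polynomial import legendre

def lgl_nodes_and_diff_matrix(N):
    """Legendre-Gauss-Lobatto nodes on [-1, 1] and the pseudospectral differentiation matrix."""
    cN = np.zeros(N + 1); cN[-1] = 1.0                 # coefficients of P_N
    interior = np.sort(legendre.legroots(legendre.legder(cN)))   # roots of P_N'
    nodes = np.concatenate(([-1.0], interior, [1.0]))
    LN = legendre.legval(nodes, cN)                    # P_N evaluated at the nodes
    D = np.zeros((N + 1, N + 1))
    for i in range(N + 1):
        for j in range(N + 1):
            if i != j:
                D[i, j] = LN[i] / (LN[j] * (nodes[i] - nodes[j]))
    D[0, 0] = -N * (N + 1) / 4.0
    D[N, N] = N * (N + 1) / 4.0
    return nodes, D

# Spectral accuracy check: differentiating exp(x) on 13 nodes gives an error that
# decays faster than any fixed power of N (here already near round-off level).
nodes, D = lgl_nodes_and_diff_matrix(12)
err = np.max(np.abs(D @ np.exp(nodes) - np.exp(nodes)))
print(f"max derivative error with 13 LGL nodes: {err:.2e}")
```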

248 citations


Journal ArticleDOI
TL;DR: In this paper, the Mie-equivalent aerosol asymmetry parameter (g) was derived using a variety of methods from the large suite of measurements (in situ and remote from surface and aircraft) made in Oklahoma during the 2003 aerosol Intensive Operations Period (IOP).
Abstract: Values for Mie-equivalent aerosol asymmetry parameter (g) were derived using a variety of methods from the large suite of measurements (in situ and remote from surface and aircraft) made in Oklahoma during the 2003 aerosol Intensive Operations Period (IOP). Median values derived for dry asymmetry parameter at 550 nm ranged between 0.55 and 0.63 over all instruments and for all derivation methods, with the exception of one instrument which did not measure over the full size range of optically important aerosol. Median values for the “wet” asymmetry parameter (i.e., asymmetry parameter at humidity conditions closer to ambient) were between 0.59 and 0.72. Values for g derived for surface and airborne in situ measurements were highly correlated, but in situ and remote sensing measurements both at the surface and aloft did not agree as well because of vertical inhomogeneity of the aerosol. Radiative forcing calculations suggest that a 10% decrease in g would result in a 19% reduction in top of atmosphere radiative forcing for the conditions observed during the IOP. Comparison of the different methods for deriving g suggests that in computing the asymmetry parameter, aerosol size is the most important parameter to measure; composition is less important except for how it influences the hygroscopic growth (i.e., size) of particles.
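
As a reminder of what the asymmetry parameter is (the intensity-weighted mean cosine of the scattering angle), the following sketch recovers g by quadrature for the Henyey-Greenstein phase function, an analytic stand-in rather than anything used in the paper:

```python
import numpy as np

def henyey_greenstein(theta, g):
    """Henyey-Greenstein scattering phase function (illustrative stand-in for a Mie phase function)."""
    return (1.0 - g**2) / (4.0 * np.pi * (1.0 + g**2 - 2.0 * g * np.cos(theta))**1.5)

theta = np.linspace(0.0, np.pi, 20001)
dtheta = theta[1] - theta[0]
for g_true in (0.55, 0.63, 0.72):                        # span of the median values quoted above
    w = henyey_greenstein(theta, g_true) * np.sin(theta)  # solid-angle weighting
    g_num = np.sum(np.cos(theta) * w * dtheta) / np.sum(w * dtheta)
    print(f"input g = {g_true:.2f} -> recovered g = {g_num:.4f}")
```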

241 citations



Journal ArticleDOI
TL;DR: In this paper, the relative contributions of two aqueous-phase routes responsible for oxalic acid formation were examined; the oxidation of glyoxylic acid was predicted to dominate over the decay of longer-chain dicarboxylic acids.
Abstract: inorganic ions (including SO4^2-) and five organic acid ions (including oxalate) were measured on board the Center for Interdisciplinary Remotely Piloted Aircraft Studies (CIRPAS) Twin Otter research aircraft by a particle-into-liquid sampler (PILS) during flights over Ohio and surrounding areas. Five local atmospheric conditions were studied: (1) cloud-free air, (2) power plant plume in cloud-free air with precipitation from scattered clouds overhead, (3) power plant plume in cloud-free air, (4) power plant plume in cloud, and (5) clouds uninfluenced by local pollution sources. The aircraft sampled from two inlets: a counterflow virtual impactor (CVI) to isolate droplet residuals in clouds and a second inlet for sampling total aerosol. A strong correlation was observed between oxalate and SO4^2- when sampling through both inlets in clouds. Predictions from a chemical cloud parcel model considering the aqueous-phase production of dicarboxylic acids and SO4^2- show good agreement for the relative magnitude of SO4^2- and oxalate growth for two scenarios: power plant plume in clouds and clouds uninfluenced by local pollution sources. The relative contributions of the two aqueous-phase routes responsible for oxalic acid formation were examined; the oxidation of glyoxylic acid was predicted to dominate over the decay of longer-chain dicarboxylic acids. Clear evidence is presented for aqueous-phase oxalic acid production as the primary mechanism for oxalic acid formation in ambient aerosols.

Proceedings ArticleDOI
14 Jun 2006
TL;DR: This paper addresses the development of a vision-based target tracking system for a small unmanned air vehicle that performs autonomous tracking of a moving target, while simultaneously estimating GPS coordinates of the target.
Abstract: This paper addresses the development of a vision-based target tracking system for a small unmanned air vehicle. The algorithm performs autonomous tracking of a moving target, while simultaneously estimating GPS coordinates of the target. A low-cost, off-the-shelf system is utilized, with a modified radio-controlled aircraft airframe, gas engine and servos. Tracking is enabled using a low-cost, miniature pan-tilt gimbal. The control algorithm provides rapid and sustained target acquisition and tracking capability. A target position estimator was designed and shown to provide reasonable targeting accuracy. The impact of target loss events on the control and estimation algorithms is analyzed in detail.
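
A minimal flat-earth geolocation sketch, not the authors' estimator: the gimbal boresight is rotated into the local North-East-Down frame and intersected with the ground plane to estimate the target position. The rotation convention, angles, and altitude are illustrative assumptions.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def geolocate(uav_ned, yaw, pitch, roll, pan, tilt):
    """Intersect the gimbal boresight with the flat ground plane (NED frame, z positive down)."""
    R_body_to_ned = rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)
    R_gimbal_to_body = rot_z(pan) @ rot_y(-tilt)        # tilt > 0 means the camera points below the horizon
    boresight = R_body_to_ned @ R_gimbal_to_body @ np.array([1.0, 0.0, 0.0])
    if boresight[2] <= 0:
        raise ValueError("boresight does not intersect the ground")
    t = -uav_ned[2] / boresight[2]                      # uav_ned[2] is minus the altitude
    return uav_ned + t * boresight

# Illustrative numbers: 200 m altitude, heading 45 deg, camera panned 20 deg and tilted 30 deg down.
uav = np.array([0.0, 0.0, -200.0])
target = geolocate(uav, yaw=np.radians(45), pitch=0.0, roll=0.0,
                   pan=np.radians(20), tilt=np.radians(30))
print("estimated target position, NED (m):", np.round(target, 1))
```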

Book ChapterDOI
TL;DR: In this article, the role of atomic scattering factors in the computation of crystal structure factors by summation over unit-cell atoms, and the reflecting power of small crystals is described.
Abstract: Section 6.1.1 covers X-ray scattering from atoms and ions. Scattering is described by the Thomson formula, including coherent (Rayleigh) and incoherent (Compton) X-ray scattering. Atomic scattering factors, calculated using relativistic Hartree–Fock or Dirac–Slater wavefunctions, give the X-ray scattering from an atom (in fact, its ensemble of electrons) in terms of that from a single electron. Free-atom scattering factors are tabulated for neutral atoms from atomic number 1 (hydrogen) to 98 (californium) over a scattering range of sin θ/λ from 0 to 6 A−1, and for ions from H1− to Pu6+ over 0 to 2 A−1. Analytical fits to the scattering factors are given and methods for interpolation of the tabulated factors are described. Perturbations from free-atom electron density for bound atoms are handled with generalized scattering factors expressed as spherical harmonics. Probability density functions for atom displacement due to temperature are described in terms of generalized temperature factors related to atom vibration symmetries. The final parts of Section 6.1.1 describe the role of atomic scattering factors in the computation of crystal structure factors by summation over unit-cell atoms, and the reflecting power of small crystals. Section 6.1.2 presents the basic equations governing magnetic scattering of neutrons. They are used to define the useful intermediate quantities of the magnetic interaction vector, the magnetic structure factor and the magnetic form factor, which are used in calculations of magnetic cross sections. A brief account of the way in which the magnetic scattering depends upon the neutron spin direction (neutron polarization) is included. Formulae for the scattering of neutrons by the nuclei of an atom are given in Section 6.1.3. The scattering cross sections for a single nucleus, for an element containing a mixture of isotopes, and for a single crystal are considered.
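
The analytical fits mentioned above are commonly expressed as a sum of Gaussians plus a constant in s = sin(theta)/lambda; the sketch below shows the functional form only, with placeholder coefficients rather than the tabulated values for any element:

```python
import numpy as np

def scattering_factor(s, a, b, c):
    """Analytic fit f(s) = sum_i a_i * exp(-b_i * s^2) + c, with s = sin(theta)/lambda in 1/Angstrom."""
    s = np.atleast_1d(np.asarray(s, dtype=float))
    return np.sum(a[:, None] * np.exp(-b[:, None] * s[None, :] ** 2), axis=0) + c

# Placeholder coefficients for illustration only; the real per-element values are tabulated in the chapter.
a = np.array([2.0, 1.5, 1.0, 0.5])
b = np.array([12.0, 4.0, 0.5, 50.0])
c = 0.3

s = np.linspace(0.0, 2.0, 5)
print("s (1/A):", s)
print("f(s):   ", np.round(scattering_factor(s, a, b, c), 3))
print("f(0) equals the sum of the fit coefficients:", a.sum() + c)
```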

Journal ArticleDOI
TL;DR: The ability to form multi-organizational networks rapidly is crucial to humanitarian aid, disaster relief, and large urgent projects, and designing and implementing the network's conversation space is the central challenge.
Abstract: The ability to form multi-organizational networks rapidly is crucial to humanitarian aid, disaster relief, and large urgent projects. Designing and implementing the network's conversation space is the central challenge.

Journal ArticleDOI
TL;DR: In this paper, the authors quantify uncertainties in both data and model estimates to understand limitations and identify the research needed to increase accuracies, which will lead to fundamental progress in ocean modeling.
Abstract: A multitude of physical and biological processes occur in the ocean over a wide range of temporal and spatial scales. Many of these processes are nonlinear and highly variable, and involve interactions across several scales and oceanic disciplines. For example, sound propagation is influenced by physical and biological properties of the water column and by the seabed. From observations and conservation laws, ocean scientists formulate models that aim to explain and predict dynamics of the sea. This formulation is intricate because it is challenging to observe the ocean on a sustained basis and to transform basic laws into generic but usable models. There are imperfections in both data and model estimates. It is important to quantify such uncertainties to understand limitations and identify the research needed to increase accuracies, which will lead to fundamental progress.

Journal ArticleDOI
TL;DR: In this article, the authors examined four years of daily time exposure images from an embayed beach to study the spacing, persistence, and location preferences of rips in a natural rip channel system.
Abstract: [1] Four years of daily time exposure images from an embayed beach were examined to study the spacing, persistence, and location preferences of rips in a natural rip channel system. A total of 5271 rip channels was observed on 782 days. Occurrence statistics showed no evidence of the preferred location pattern associated with standing edge waves trapped in an embayed beach. The histogram of rip spacing, the primary diagnostic observable for most models, was well modeled by a lognormal distribution (mean spacing of 178 m). However, spacings were highly longshore variable (time mean of the standard deviation/longshore mean of rip spacing was 39%), so they are of questionable merit as a diagnostic variable. Storm-driven resets to the longshore uniform condition required by most models occurred only four times per year on average, making rip generation models relevant to only a small fraction of the system behavior. Rip spacings after the 15 observed reset events were uncorrelated with bar crest distance. The lifetime of the 324 individual rip channel trajectories averaged 45.6 days. Rips were equally mobile in both longshore directions, but the coefficient of variation of rip migration rates was large, even for high migration rates. The mean migration rate was well correlated to a proxy for the longshore current (R2 of 0.78). Thus there is no significant evidence that the formation, spacing, and migration of rip channels on this beach can be explained by currently existing simple models. Moreover, the alongshore uniform initial conditions assumed by these models are rare on Palm Beach, making the models generally inapplicable.
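
A short sketch of the distributional check described above, using synthetic spacings rather than the Palm Beach observations (the lognormal width below is illustrative; only the 178 m mean spacing is taken from the abstract):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic rip spacings (m): lognormal chosen so the mean is ~178 m; sigma is illustrative.
sigma = 0.5
mu = np.log(178.0) - 0.5 * sigma**2          # lognormal mean = exp(mu + sigma^2 / 2)
spacings = rng.lognormal(mean=mu, sigma=sigma, size=500)

shape, loc, scale = stats.lognorm.fit(spacings, floc=0)
print(f"sample mean spacing: {spacings.mean():.0f} m")
print(f"fitted lognormal: sigma = {shape:.2f}, median = {scale:.0f} m")
print(f"coefficient of variation of spacing: {spacings.std() / spacings.mean():.2f}")
```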

Journal ArticleDOI
TL;DR: The Power of Everyday Politics: How Vietnamese Peasants Transformed National Policy by Benedict J. Tria Kerkvliet offers not only a persuasive answer to the question of how powerless peasants influence policy under a domineering state but also a useful theoretical framework for understanding the power of everyday politics.
Abstract: The Power of Everyday Politics: How Vietnamese Peasants Transformed National Policy. By Benedict J. Tria Kerkvliet. Ithaca, NY: Cornell University Press, 2005. 320p. $39.95. Short of rebelling, how do powerless peasants under a domineering Stalinist state influence policy? Benedict J. Tria Kerkvliet's book offers not only a persuasive answer to this question but also a useful theoretical framework for understanding the power of “everyday politics.”

Journal ArticleDOI
TL;DR: In this paper, the authors focus on the validation of remotely sensed ocean surface currents from SeaSonde-type high-frequency (HF) radar systems and compare the results obtained using measured and ideal (i.e., perfect) antenna patterns.
Abstract: This paper focuses on the validation of remotely sensed ocean surface currents from SeaSonde-type high-frequency (HF) radar systems. Hourly observations during the period July 22, 2003 through September 9, 2003 are used from four separate radar sites deployed around the shores of Monterey Bay, CA. Calibration of direction-finding techniques is addressed through the comparisons of results obtained using measured and ideal (i.e., perfect) antenna patterns. Radial currents are compared with observations from a moored current meter and from 16 surface drifter trajectories. In addition, four overwater baselines are used for radar-to-radar comparisons. Use of measured antenna patterns improves system performance in almost all cases. Antenna-pattern measurements repeated one year later at three of the four radar locations exhibit only minor changes indicating that pattern distortions are stable. Calibrated results show root-mean-square (rms) radial velocity differences in the range of 9.8-13.0 cm/s, which suggest radar observation error levels in the range of 6.9-9.2 cm/s. In most cases, clear evidence of bearing errors can be seen, which range up to 30° for uncalibrated radar-derived radial currents and up to 15° for currents obtained using measured antenna patterns. Bearing errors are not, however, constant with angle. The results recommend use of measured antenna patterns in all SeaSonde-type applications. They also recommend an expanded simulation effort to better describe the effects of antenna-pattern distortions on bearing determination under a variety of ocean conditions.
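
The step from rms radial differences to single-radar error levels is consistent with assuming that the two radars viewing a baseline contribute roughly equal, independent errors, so the per-site error is the rms difference divided by sqrt(2); a quick check of the quoted numbers:

```python
import numpy as np

# If two radars contribute equal, independent errors, rms_difference = sqrt(2) * single_site_error.
for rms_diff in (9.8, 13.0):   # cm/s, from the baseline comparisons above
    print(f"rms difference {rms_diff:.1f} cm/s -> implied single-radar error {rms_diff / np.sqrt(2):.1f} cm/s")
```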

Journal ArticleDOI
TL;DR: By investigating the comparative performance of human and software agents across varying levels of ambiguity in the procurement domain, the experimentation described in this article helps to elucidate some new boundaries of computer-based decision making quite broadly.
Abstract: Recently, researchers have begun investigating an emerging, technology-enabled innovation that involves the use of intelligent software agents in enterprise supply chains. Software agents combine and integrate capabilities of several information technology classes in a novel manner that enables supply chain management and decision making in modes not supported previously by IT and not reported previously in the information systems literature. Indeed, federations and swarms of software agents today are moving the boundaries of computer-aided decision making more generally. Such moving boundaries highlight promising new opportunities for competitive advantage in business, in addition to novel theoretical insights. But they also call for shifting research thrusts in information systems. The stream of research associated with this article is taking some first steps to address such issues by examining experimentally the capabilities, limitations, and boundaries of agent technology for computer-based decision support and automation in the procurement domain. Procurement represents an area of particular potential for agent-based process innovation, as well as reflecting some of the greatest technological advances in terms of agents emerging from the laboratory. Procurement is imbued with considerable ambiguity in its task environment, ambiguity that presents a fundamental limitation to IT-based automation of decision making and knowledge work. By investigating the comparative performance of human and software agents across varying levels of ambiguity in the procurement domain, the experimentation described in this article helps to elucidate some new boundaries of computer-based decision making quite broadly. We seek in particular to learn from this domain and to help inform computer-based decision making, agent technological design, and IS research more generally.

Journal ArticleDOI
TL;DR: This paper considers a situation where the firm does not have an accurate demand forecast, but can only roughly estimate the customer arrival rate before the sale begins, and shows how this arrival rate estimate can be revised during the sale and used to dynamically adjust the product price in order to maximize the expected total revenue.
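
A toy sketch of the idea in the TL;DR, not the paper's model: the seller starts with a rough prior on the arrival rate, updates it from observed sales with a conjugate Gamma-Poisson step, and re-prices each period with a simple pacing heuristic. The exponential price-sensitivity curve and every parameter below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)
alpha = 0.08                      # price sensitivity (illustrative): demand rate = lam0 * exp(-alpha * price)
lam0_true = 5.0                   # true base arrival rate per period, unknown to the seller
inventory, horizon = 30, 10       # units to sell over 10 periods

# Rough prior on lam0: Gamma(shape, rate) with mean 4.0, deliberately off from the truth.
a_post, b_post = 4.0, 1.0

for period in range(1, horizon + 1):
    lam0_hat = a_post / b_post
    periods_left = horizon - period + 1
    # Heuristic re-pricing (not the paper's policy): pick the price whose expected demand
    # paces the remaining inventory evenly over the remaining periods.
    target_rate = max(inventory / periods_left, 1e-6)
    price = max(np.log(lam0_hat / target_rate) / alpha, 0.0)
    sales = min(rng.poisson(lam0_true * np.exp(-alpha * price)), inventory)
    inventory -= sales
    # Conjugate Gamma-Poisson update of lam0 (exposure exp(-alpha*price) per unit of lam0).
    a_post += sales
    b_post += np.exp(-alpha * price)
    print(f"period {period:2d}: price {price:5.1f}, est. base rate {lam0_hat:4.2f}, sales {sales}, left {inventory}")
    if inventory == 0:
        break
```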

Journal ArticleDOI
TL;DR: The von Karman constant is evaluated using the largest, most comprehensive atmospheric data set ever assembled for that purpose.
Abstract: This is the largest, most comprehensive atmospheric data set ever used to evaluate the von Karman constant.
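
As a stand-alone illustration (not the paper's analysis) of how the constant is typically evaluated from surface-layer data: fit the neutral log law u(z) = (u*/kappa) ln(z/z0) to a wind profile with an independently measured friction velocity u*, and read kappa from the slope. The profile below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

kappa_true, u_star, z0 = 0.40, 0.30, 0.01        # synthetic "truth" (dimensionless, m/s, m)
z = np.array([2.0, 4.0, 8.0, 16.0, 32.0])        # measurement heights (m)
u = (u_star / kappa_true) * np.log(z / z0) + rng.normal(0, 0.03, z.size)

# Neutral log law: u = (u*/kappa) ln(z) - (u*/kappa) ln(z0), so the slope against ln(z) is u*/kappa.
slope, intercept = np.polyfit(np.log(z), u, 1)
kappa_est = u_star / slope
print(f"estimated von Karman constant: {kappa_est:.3f}")
```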

Journal ArticleDOI
TL;DR: This paper builds upon integrative modeling work that composes a parsimonious, multidimensional, analytical framework for representing and visualizing dynamic knowledge, and focuses on understanding the dynamics of knowledge phenomenologically and on developing and applying techniques for modeling and visualizing dynamic knowledge flows and stocks.
Abstract: Knowledge represents a critical resource in the modern enterprise. But it is dynamic and distributed unevenly. Capitalizing on this dynamic resource for enterprise performance depends upon its rapid and reliable flows across people, organizations, locations, and times of application. From a technological perspective, this points immediately to the design of information systems to enhance knowledge flows. The problem is, the design of information systems to enhance knowledge flows requires new understanding. The research described in this paper concentrates on understanding the dynamics of knowledge phenomenologically and on developing and applying techniques for modeling and visualizing dynamic knowledge flows and stocks. We draw key, theoretical concepts from multiple literatures, and we build upon integrative modeling work that composes a parsimonious, multidimensional, analytical framework for representing and visualizing dynamic knowledge. We then conduct field research to learn how this theoretical framework may be used to model knowledge flows in practice. By focusing this empirical work on an extreme organization and processes that involve and rely upon tacit knowledge, we illustrate how dynamic knowledge patterns can inform design in new ways. New chunks of kernel theory deriving from this fieldwork are articulated in terms of a propositional model, which provides a basis for the development of testable design theory hypotheses.

Journal ArticleDOI
TL;DR: In this paper, the authors show that the strong tidal modulation of infragravity (200 to 20 s period) waves observed on the southern California shelf is the result of nonlinear transfers of energy from these low-frequency long waves to higher-frequency motions.
Abstract: [1] The strong tidal modulation of infragravity (200 to 20 s period) waves observed on the southern California shelf is shown to be the result of nonlinear transfers of energy from these low-frequency long waves to higher-frequency motions. The energy loss occurs in the surfzone, and is stronger as waves propagate over the convex low-tide beach profile than over the concave high-tide profile, resulting in a tidal modulation of seaward-radiated infragravity energy. Although previous studies have attributed infragravity energy losses in the surfzone to bottom drag and turbulence, theoretical estimates using both observations and numerical simulations suggest nonlinear transfers dominate. The observed beach profiles and energy transfers are similar along several km of the southern California coast, providing a mechanism for the tidal modulation of infragravity waves observed in bottom-pressure and seismic records on the continental shelf and in the deep ocean.

Journal ArticleDOI
TL;DR: Nonlinear energy transfers with sea and swell (frequencies 0.05-0.40 Hz) are shown to be responsible for much of the generation and loss of infragravity wave energy.
Abstract: [1] Nonlinear energy transfers with sea and swell (frequencies 0.05–0.40 Hz) were responsible for much of the generation and loss of infragravity wave energy (frequencies 0.005–0.050 Hz) observed under moderate- and low-energy conditions on a natural beach. Cases with energetic shear waves were excluded, and mean currents, a likely shear wave energy source, were neglected. Within 150 m of the shore, estimated nonlinear energy transfers to (or from) the infragravity band roughly balanced the divergence (or convergence) of the infragravity energy flux, consistent with a conservative energy equation. Addition of significant dissipation (requiring a bottom drag coefficient exceeding about 10−2) degraded the energy balance.

Journal ArticleDOI
TL;DR: An analysis of the costs and benefits of fielding Radio Frequency Identification/MicroElectroMechanical System (RFID/MEMS) technology for the management of ordnance inventory is presented, and a factorial model of these benefits is proposed.

Journal ArticleDOI
TL;DR: Four contributions to the decision analysis literature are offered: an instructive application of multiple-objective decision analysis methods to portfolio selection, a useful method for constructing scales for interdependent attributes, a new method for assessing weights that explicitly considers importance and variation (Swing Weight Matrix), and practical advice on how to use multiple-objective decision analysis methods in a complex and controversial political environment.
Abstract: In 2001, Congress enacted legislation that required a 2005 Base Realignment and Closure (BRAC) round to realign military units, remove excess facility capacity, and support defense transformation. The United States Army used multiple-objective decision analysis to determine the military value of installations and an installation portfolio model to develop the starting point to identify potential unit realignments and base closures, providing the basis for all recommendations. Ninety-five percent of the Army's recommendations were accepted by the BRAC 2005 Commission. The Army expects these recommendations to create recurring savings of $1.5 billion annually after completion of BRAC implementation. This paper offers four contributions to the decision analysis literature: an instructive application of multiple-objective decision analysis methods to portfolio selection, a useful method for constructing scales for interdependent attributes, a new method for assessing weights that explicitly considers importance and variation (Swing Weight Matrix), and practical advice on how to use multiple-objective decision analysis methods in a complex and controversial political environment.
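
A minimal additive-value sketch of how swing weights are applied once elicited (not the Army's BRAC model): raw swing weights reflecting both importance and variation are normalized, and each installation's military value is the weighted sum of its 0-100 single-attribute scores. Attributes, weights, and scores are hypothetical.

```python
# Hypothetical attributes with raw swing weights (importance x variation), to be normalized.
swing_weights = {"training_land": 80, "deployment_access": 60, "facility_condition": 40, "expansion_capacity": 20}
total = sum(swing_weights.values())
weights = {k: v / total for k, v in swing_weights.items()}

# Hypothetical single-attribute value scores (0-100) for three installations.
installations = {
    "Fort A": {"training_land": 90, "deployment_access": 50, "facility_condition": 70, "expansion_capacity": 40},
    "Fort B": {"training_land": 60, "deployment_access": 85, "facility_condition": 55, "expansion_capacity": 75},
    "Fort C": {"training_land": 40, "deployment_access": 60, "facility_condition": 90, "expansion_capacity": 80},
}

# Additive value model: military value = sum over attributes of weight x single-attribute value.
for name, scores in installations.items():
    value = sum(weights[attr] * scores[attr] for attr in weights)
    print(f"{name}: military value = {value:.1f}")
```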

Journal ArticleDOI
TL;DR: In this article, the microtexture and microstructure evolution during repetitive equal-channel angular pressing (ECAP) of pure aluminum through a 90° die was evaluated by orientation imaging microscopy and transmission electron microscopy (TEM).
Abstract: Microtexture and microstructure evolution during repetitive equal-channel angular pressing (ECAP) of pure aluminum through a 90° die was evaluated by orientation imaging microscopy (OIM) and transmission electron microscopy (TEM). Billet distortion appears to conform to the idealized ECAP model. After the initial pass, the textures were inhomogeneous but one or more shear-texture components and long-range lattice rotations were apparent. Following repetitive ECAP, the textures became more homogeneous but still included either two or three distinct shear-texture orientations. The OIM and TEM data revealed meso-scale deformation bands that were inclined at about 26° to the axis of the as-pressed samples and that involved alternation of lattice orientations between distinct shear-texture orientations. The band interfaces were of high disorientation (40–62.8°) and were distinct boundaries in TEM. The evolution of the band structures during repetitive ECAP accounts for an increasing population of high-angle boundaries in repetitively processed materials.

Journal ArticleDOI
TL;DR: A unified framework is presented for computing optimal controls in which neither the description of the governing equations nor that of the path constraints is a limitation, and any inherent smoothness present in the optimal system trajectories is harnessed.

Proceedings ArticleDOI
01 Dec 2006
TL;DR: For a general class of vehicles moving in either two or three-dimensional space, Lyapunov-based techniques and graph theory are brought together to yield a decentralized control structure where the dynamics of the cooperating vehicles and the constraints imposed by the topology of the inter-vehicle communications network are explicitly taken into account.
Abstract: This paper addresses the problem of steering a group of underactuated autonomous vehicles along given spatial paths, while holding a desired inter-vehicle formation pattern. For a general class of vehicles moving in either two or three-dimensional space, we show how Lyapunov-based techniques and graph theory can be brought together to yield a decentralized control structure where the dynamics of the cooperating vehicles and the constraints imposed by the topology of the inter-vehicle communications network are explicitly taken into account. Path-following for each vehicle amounts to reducing an appropriately defined geometric error to a small neighborhood of the origin. Vehicle coordination is achieved by adjusting the speed of each vehicle along its path according to information on the positions of a subset of the other vehicles, as determined by the communications topology adopted. The system obtained by putting together the path-following and vehicle coordination strategies adopted takes a cascade form, where the former subsystem is input-to-state stable (ISS) with the error variables of the latter as inputs. Convergence and stability of the overall system are proved formally. The results are also extended to solve the problem of temporary communication failures. Using the concept of "brief instabilities" we show that for a given maximum failure rate, the coordinated path following system is stable and the errors converge to a small neighborhood of the origin. We illustrate our design procedure for underwater vehicles moving in three-dimensional space. Simulation results are presented and discussed.
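
A toy sketch of the coordination layer only (not the Lyapunov-based path-following design): each vehicle's along-path speed is perturbed by a consensus term driven by the path-parameter mismatches of its communication neighbours, and the coordination errors decay to near zero. The graph, gains, and speeds are illustrative.

```python
import numpy as np

dt, n_steps = 0.05, 400
v_desired = 1.0                                  # nominal along-path speed (m/s), illustrative
k_coord = 0.5                                    # coordination gain, illustrative

# Communication graph for three vehicles (fully connected): adjacency matrix.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]])

s = np.array([0.0, 3.0, -2.0])                   # initial along-path positions, deliberately out of formation
for _ in range(n_steps):
    # Each vehicle speeds up or slows down according to its neighbours' path parameters.
    coord_error = A @ s - A.sum(axis=1) * s      # sum over neighbours j of (s_j - s_i)
    s_dot = v_desired + k_coord * coord_error
    s = s + dt * s_dot

final_spread = np.max(np.abs(s - s.mean()))
print(f"final coordination error (m): {final_spread:.4f}")
```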