Institution
United States Air Force Academy
Government • Colorado Springs, Colorado, United States
About: United States Air Force Academy is a government organization based in Colorado Springs, Colorado, United States. It is known for research contributions in the topics of Aerodynamics and Wind tunnel. The organization has 4595 authors who have published 5969 publications, receiving 130757 citations. The organization is also known as Air Force Academy and USAFA.
Topics: Aerodynamics, Wind tunnel, Laser, Population, Angle of attack
Papers published on a yearly basis
Papers
TL;DR: In this paper, a new method for obtaining optimized parameters for semi-empirical methods has been developed and applied to the modified neglect of diatomic overlap (MNDO) method.
Abstract: A new method for obtaining optimized parameters for semiempirical methods has been developed and applied to the modified neglect of diatomic overlap (MNDO) method. The method uses derivatives of calculated values for properties with respect to adjustable parameters to obtain the optimized values of parameters. The large increase in speed is a result of using a simple series expression for calculated values of properties rather than employing full semiempirical calculations. With this optimization procedure, the rate-determining step for parameterizing elements changes from the mechanics of parameterization to the assembling of experimental reference data.
7,125 citations
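The parameterization idea described in the abstract above — using derivatives of calculated property values with respect to adjustable parameters, and a cheap series expansion in place of full semiempirical calculations — can be illustrated with a generic linearized least-squares fit. This is a hedged sketch only: the function `calc_props` and the data are hypothetical stand-ins, not the actual MNDO/PM3 machinery.

```python
import numpy as np

def linearized_fit(calc_props, params0, ref_values, eps=1e-4, n_iter=20):
    """Fit parameters by repeatedly linearizing calc_props around the
    current parameter set.

    calc_props(params) -> array of calculated property values (a stand-in
    for an expensive full calculation); ref_values are experimental
    reference data the parameters should reproduce.
    """
    p = np.asarray(params0, dtype=float)
    for _ in range(n_iter):
        f0 = calc_props(p)
        # Finite-difference derivatives of each property w.r.t. each parameter
        J = np.empty((len(f0), len(p)))
        for j in range(len(p)):
            dp = np.zeros_like(p)
            dp[j] = eps
            J[:, j] = (calc_props(p + dp) - f0) / eps
        # Series expansion f0 + J @ delta replaces the full calculation,
        # turning each step into a linear least-squares problem.
        delta, *_ = np.linalg.lstsq(J, ref_values - f0, rcond=None)
        p += delta
    return p
```

The speedup claimed in the abstract comes from the same structural trick: once the derivatives are in hand, candidate parameter values are evaluated through the cheap expansion rather than by rerunning the full semiempirical calculation.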
TL;DR: In this paper, the average difference between the predicted heats of formation and experimental values for 657 compounds is 7.8 kcal/mol, and for 106 hypervalent compounds, 13.6 kcal/mol.
Abstract: MNDO/AM1‐type parameters for twelve elements have been optimized using a newly developed method for optimizing parameters for semiempirical methods. With the new method, MNDO‐PM3, the average difference between the predicted heats of formation and experimental values for 657 compounds is 7.8 kcal/mol, and for 106 hypervalent compounds, 13.6 kcal/mol. For MNDO the equivalent differences are 13.9 and 75.8 kcal/mol, while those for AM1, in which MNDO parameters are used for aluminum, phosphorus, and sulfur, are 12.7 and 83.1 kcal/mol, respectively. Average errors for ionization potentials, bond angles, and dipole moments are intermediate between those for MNDO and AM1, while errors in bond lengths are slightly reduced.
3,465 citations
TL;DR: MOPAC is a fully integrated semiempirical molecular orbital program for studying chemical reactions involving molecules, ions, and linear polymers, combining calculations of vibrational spectra, thermodynamic quantities, isotopic substitution effects, and force constants.
Abstract: Before we start, we need a working definition for MOPAC. The following description has been used many times to describe MOPAC: MOPAC is a general-purpose, semiempirical molecular orbital program for the study of chemical reactions involving molecules, ions, and linear polymers. It implements the semiempirical Hamiltonians MNDO, AM1, MINDO/3, and MNDO-PM3, and combines the calculations of vibrational spectra, thermodynamic quantities, isotopic substitution effects, and force constants in a fully integrated program. Elements parameterized at the MNDO level include H, Li, Be, B, C, N, O, F, Al, Si, P, S, Cl, Ge, Br, Sn, Hg, Pb, and I; at the PM3 level the elements H, C, N, O, F, Al, Si, P, S, Cl, Br, and I are available. Within the electronic part of the calculation, molecular and localized orbitals, excited states up to sextets, chemical bond indices, charges, etc. are computed. Both intrinsic and dynamic reaction coordinates can be calculated. A transition-state location routine and two transition-state optimizing routines are available for studying chemical reactions.
2,422 citations
TL;DR: Ionic liquids, defined here as salts with melting temperatures below 100 °C, evolved from traditional high-temperature molten salts and were observed as far back as the mid 19th century.
1,456 citations
09 Jul 1995
TL;DR: Direct and residual gradient algorithms are shown to be special cases of residual algorithms, which can combine the advantages of each approach.
Abstract: A number of reinforcement learning algorithms have been developed that are guaranteed to converge to the optimal solution when used with lookup tables. It is shown, however, that these algorithms can easily become unstable when implemented directly with a general function-approximation system, such as a sigmoidal multilayer perceptron, a radial-basis-function system, a memory-based learning system, or even a linear function-approximation system. A new class of algorithms, residual gradient algorithms, is proposed, which perform gradient descent on the mean squared Bellman residual, guaranteeing convergence. It is shown, however, that they may learn very slowly in some cases. A larger class of algorithms, residual algorithms, is proposed that has the guaranteed convergence of the residual gradient algorithms, yet can retain the fast learning speed of direct algorithms. In fact, both direct and residual gradient algorithms are shown to be special cases of residual algorithms, and it is shown that residual algorithms can combine the advantages of each approach. The direct, residual gradient, and residual forms of value iteration, Q-learning, and advantage learning are all presented. Theoretical analysis is given explaining the properties these algorithms have, and simulation results are given that demonstrate these properties.
1,147 citations
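The distinction the abstract above draws between direct updates and gradient descent on the mean squared Bellman residual can be sketched for linear function approximation, where V(s) = w · φ(s). This is a hedged illustration under assumed conventions (the blending parameter `beta` and the toy feature vectors are illustrative, not the paper's notation):

```python
import numpy as np

def td_updates(w, phi_s, phi_s2, r, gamma=0.9, alpha=0.1, beta=1.0):
    """One weight update for linear value approximation V(s) = w . phi(s).

    beta=0 reproduces the direct (semi-gradient) TD update, beta=1 the
    pure residual-gradient update (true gradient descent on delta**2 / 2),
    and intermediate beta blends the two, as a residual algorithm does.
    """
    delta = r + gamma * (w @ phi_s2) - (w @ phi_s)    # Bellman residual
    direct_grad = -delta * phi_s                      # direct method
    residual_grad = delta * (gamma * phi_s2 - phi_s)  # d(delta^2/2)/dw
    grad = (1.0 - beta) * direct_grad + beta * residual_grad
    return w - alpha * grad
```

The direct update ignores the dependence of the successor value on w, which is what makes it fast but potentially unstable off-policy; the residual-gradient update differentiates through both terms of the residual, which guarantees descent on the squared residual at the cost of slower learning.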
Authors
Showing all 4614 results
Name | H-index | Papers | Citations |
---|---|---|---|
Jonathan I. Epstein | 138 | 1121 | 80975 |
Dennis Stello | 109 | 442 | 41667 |
Michael J. Zaworotko | 97 | 519 | 44441 |
W. Keith Campbell | 83 | 216 | 26435 |
Pramod K. Varshney | 79 | 894 | 30834 |
Michael W. Ross | 77 | 828 | 25680 |
Steven R. Steinhubl | 72 | 323 | 23387 |
Thomas W. Bauer | 72 | 473 | 19058 |
Lawrence T. Drzal | 70 | 274 | 21711 |
Gertrude Henle | 70 | 223 | 19452 |
Gary S. Gronseth | 70 | 141 | 16959 |
Brian L. Scott | 69 | 555 | 17928 |
Christopher A. Lowry | 61 | 252 | 10883 |
James Alm | 59 | 287 | 13294 |
Okyay Kaynak | 56 | 346 | 13990 |