
Atoms, molecules, solids, and surfaces: Applications of the generalized gradient approximation for exchange and correlation.

TL;DR: A way is found to visualize and understand the nonlocality of exchange and correlation, its origins, and its physical effects, although significant interconfigurational and interterm errors remain.
Abstract: Generalized gradient approximations (GGA's) seek to improve upon the accuracy of the local-spin-density (LSD) approximation in electronic-structure calculations. Perdew and Wang have developed a GGA based on real-space cutoff of the spurious long-range components of the second-order gradient expansion for the exchange-correlation hole. We have found that this density functional performs well in numerical tests for a variety of systems: (1) Total energies of 30 atoms are highly accurate. (2) Ionization energies and electron affinities are improved in a statistical sense, although significant interconfigurational and interterm errors remain. (3) Accurate atomization energies are found for seven hydrocarbon molecules, with a rms error per bond of 0.1 eV, compared with 0.7 eV for the LSD approximation and 2.4 eV for the Hartree-Fock approximation. (4) For atoms and molecules, there is a cancellation of error between density functionals for exchange and correlation, which is most striking whenever the Hartree-Fock result is furthest from experiment. (5) The surprising LSD underestimation of the lattice constants of Li and Na by 3--4 % is corrected, and the magnetic ground state of solid Fe is restored. (6) The work function, surface energy (neglecting the long-range contribution), and curvature energy of a metallic surface are all slightly reduced in comparison with LSD. Taking account of the positive long-range contribution, we find surface and curvature energies in good agreement with experimental or exact values. Finally, a way is found to visualize and understand the nonlocality of exchange and correlation, its origins, and its physical effects.

Summary (3 min read)

1 Introduction

  • Section 2 provides an overview of the SecReq approach and how it can be used to simulate the presence of a security expert in requirements elicitation.
  • Section 3 outlines how Bayesian classifiers can improve security awareness.
  • Section 4 presents the evaluation of their technical solution.
  • Section 6 outlines related work, and Section 7 concludes the paper by summing up the main results and outlining directions for future work.

2 Simulating the Presence of a Security Expert

  • To support the identification of hidden security aspects, the authors need to identify security-relevant requirements.
  • Furthermore, every functional requirement could be regarded as somewhat security-relevant: safety and confidentiality of data should always be ensured.
  • The classification question proved very instrumental when classifying a requirement.
  • Assume there is only a limited budget for refining requirements into security requirements.

3 Using Bayesian Classification to Enhance Security Awareness

  • Naive Bayesian Classification has some drawbacks and limitations.
  • Rennie et al. discuss technical limitations as well as strategies to overcome them [9] .
  • Nevertheless, Naive Bayesian Classifiers are widespread because they are easy to implement and efficient.
  • This makes them a good choice for their evaluation, despite the known drawbacks.
  • It is considerably more difficult for humans to identify security-relevant requirements than to identify spam e-mail.

3.1 Assessing the Security-Relatedness of a Single Word

  • The approach the authors describe is called Naive Bayesian Classification, owing to its simplifying independence assumptions.
  • The authors chose the classic variant to investigate whether even this simple form would work in security requirements identification.
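The summary does not reproduce the scoring formula, but a classic spam-filter-style per-word estimate under the naive independence assumption can be sketched as follows. This is a minimal illustration with standard choices (add-one smoothing, uniform class priors); the function and variable names are hypothetical and not taken from the SecReq tooling.

```python
from collections import Counter

def word_security_relatedness(word: str,
                              relevant_with_word: Counter,
                              irrelevant_with_word: Counter,
                              n_relevant: int,
                              n_irrelevant: int) -> float:
    """Estimate P(security-relevant | word) from how many requirements of
    each class contain the word, in the style of classic spam filtering.

    Add-one smoothing keeps unseen words from yielding probabilities of
    exactly 0 or 1; class priors are assumed uniform for simplicity.
    """
    p_word_given_rel = (relevant_with_word[word] + 1) / (n_relevant + 2)
    p_word_given_irr = (irrelevant_with_word[word] + 1) / (n_irrelevant + 2)
    return p_word_given_rel / (p_word_given_rel + p_word_given_irr)
```

A whole requirement would then be scored by combining its per-word probabilities under the naive independence assumption, for example by multiplying them (or summing their logarithms) for each class and normalising.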

4 Evaluation of Bayesian Classifiers

  • This section discusses the quality of classifiers and how they can be used to assist in security requirements elicitation.
  • Then the authors describe their strategy to reach these goals and the general process of evaluation.
  • Finally, the authors show and discuss the results for each evaluation goal.

4.1 Evaluation Goals

  • For the goals (G1) and (G2) the authors used expert evaluation and data mining to create meaningful test data, as described in Section 2.
  • Subsets of this test data were used to train and evaluate the Bayesian Classifiers.
  • The authors' evaluation strategy had to ensure that training and evaluation sets were kept disjoint.
  • In the context of this paper, goal (G3) is informally evaluated by asking experts for their opinions about classification results.

4.2 Evaluation Strategy

  • Table 1 provides an overview of the three specifications the authors used to evaluate their classifiers.
  • For each specification (left column), the authors list the total number of requirements it contains (second column) and the number of requirements considered security-relevant (third column).
  • The authors used either experts (see Sect. 2) or existing databases for identifying security-relevant requirements (last column).

4.5 Transferability of Classifiers Trained in Multiple Domains: G2.b

  • The bottom entry in Table 3 shows the results when the authors combined all three specifications for training.
  • Now the authors got good results for all three specifications included in the evaluation.
  • Figure 3 shows the learning curve by giving the results when fewer than 9 parts are used for training.
  • The learning curve does not grow as fast as the one in Figure 2, probably because the classifier cannot leverage domain-specific concepts.
  • Nevertheless, the authors get a recall of 91 %, a precision of 79 %, and an f-measure of 84 %, results that clearly show that the trained classifier is suitable to support security requirements elicitation in all three domains used for training.
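For readers unfamiliar with the metric, the f-measure quoted above is the harmonic mean of precision and recall. The following quick check uses only the standard definition, not code from the paper:

```python
def f_measure(precision: float, recall: float) -> float:
    """Balanced F1 score: the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported values from Section 4.5: precision 79 %, recall 91 %.
print(round(f_measure(0.79, 0.91), 2))  # ~0.85, i.e. the ~84 % quoted above,
                                        # up to rounding of the reported inputs
```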

5 Discussion and Implications on Industrial Practice

  • In this section the authors discuss whether the observed results are sufficient for employing the filter in practice in its current state.
  • Then the authors take a look at the validity of their evaluation of the Bayesian classifier filter.
  • Finally, the authors summarise the discussion with practitioners and describe how they perceive the implications of the filter for practice, i.e., for their development projects.

5.1 Interpretation of Evaluation Results

  • To summarise, in its current state the classifier is already a valuable addition, for example in the context of software evolution or product lines.
  • That is, the classifier could be trained on the last version of the requirements specification and then offer valuable help in developing the new software version.
  • Typically, subsequent specifications resemble their predecessor in large parts and add only small new parts.
  • Evaluation of this situation is covered by k-fold cross-validation, in which (k − 1) parts of a specification are used for training and the classifier is applied to the small held-out part (see the sketch after this list).
  • Therefore, the results in Figure 2 apply to this situation.
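As a concrete illustration of the k-fold scheme referred to above, the following generic sketch keeps the training and evaluation data disjoint in every fold. The interfaces (train_fn, predict_fn) and the choice of k are assumptions made for illustration; they do not describe the authors' published tooling.

```python
import random

def k_fold_evaluate(requirements, labels, train_fn, predict_fn, k=10):
    """Generic k-fold cross-validation: train on (k - 1) folds and evaluate
    on the held-out fold, so training and evaluation sets stay disjoint."""
    indices = list(range(len(requirements)))
    random.shuffle(indices)
    folds = [indices[i::k] for i in range(k)]

    scores = []
    for held_out in folds:
        held_out_set = set(held_out)
        train_idx = [i for i in indices if i not in held_out_set]
        model = train_fn([requirements[i] for i in train_idx],
                         [labels[i] for i in train_idx])
        predictions = [predict_fn(model, requirements[i]) for i in held_out]
        truth = [labels[i] for i in held_out]
        tp = sum(1 for p, t in zip(predictions, truth) if p and t)
        fp = sum(1 for p, t in zip(predictions, truth) if p and not t)
        fn = sum(1 for p, t in zip(predictions, truth) if not p and t)
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        scores.append((precision, recall))
    return scores
```

Averaging the per-fold precision and recall values then gives figures comparable to those reported in Figures 2 and 3.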

5.2 Discussion on Validity

  • The authors used specifications from two different domains in their evaluation.
  • Therefore, the authors cannot guarantee that their results would hold for a third domain.
  • To mitigate this threat, the authors share their evaluation tool, classified data sets, and databases of learned words at http://www.se.uni-hannover.de/.
  • The authors invite others to replicate their experiment, or use their results.

5.3 Implications on industrial practice

  • The Bayesian classification as an addition to SecReq not only contributes to a more effective and focused security elicitation process, but also helps to separate important from less important security-relevant aspects.
  • In particular, the Bayesian classification and security expert simulation in HeRA, with its ability to train the classification to be system- and project-specific, directly enables effective reuse of earlier experience, as well as prioritisation and company-specific security-related focus areas or policies.
  • The ability to first train the classification engine to separate important security-relevant aspects from less important ones, and then use this newly gained knowledge to traverse functional descriptions and already specified security requirements, has promising potential to contribute to better control of security spending in development projects.

7 Conclusion

  • The authors' approach does not aim at completeness in a strict logical sense.
  • There is no 100% guarantee that all security-relevant requirements are found, nor that no non-security-relevant requirements are falsely reported.
  • This is, however, a limitation directly imposed by the current state of computational linguistics (essentially, the fact that true automated text understanding is not yet available).
  • Therefore, the authors believe that the approach provides useful assistance in that it supports requirements engineers in identifying security-relevant requirements when no security expert is present.
  • Moreover, since this selection process is supported by automated tools, its execution is easy to document and it is repeatable and thus well auditable.
