
Showing papers in "Wiley Interdisciplinary Reviews: Computational Molecular Science in 2013"


Journal ArticleDOI
TL;DR: The most recent developments of the Amber and AmberTools MD software packages (referred to here simply as the Amber package) since version 9 was released in April 2006 are outlined.
Abstract: Molecular dynamics (MD) allows the study of biological and chemical systems at the atomistic level on timescales from femtoseconds to milliseconds. It complements experiment while also offering a way to follow processes difficult to discern with experimental techniques. Numerous software packages exist for conducting MD simulations, one of the most widely used of which is Amber. Here, we outline the most recent developments, since version 9 was released in April 2006, of the Amber and AmberTools MD software packages, referred to here simply as the Amber package. The latest release represents six years of continued development, since version 9, by multiple research groups and the culmination of over 33 years of work beginning with the first version in 1979. The latest release of the Amber package, version 12 released in April 2012, includes a substantial number of important developments in both the scientific and computer science arenas. We present here a condensed vision of what Amber currently supports and where things are likely to head over the coming years. Figure 1 shows the performance in ns/day of the Amber package version 12 on a single core of an AMD FX-8120 8-core 3.6 GHz CPU, on the Cray XT5 system, and on a single GTX680 GPU. © 2012 John Wiley & Sons, Ltd.

1,734 citations



Journal ArticleDOI
TL;DR: Q‐Chem is a general‐purpose electronic structure package featuring a variety of established and new methods implemented using innovative algorithms that enable fast calculations of large systems on regular laboratory workstations using density functional and wave‐function‐based approaches.
Abstract: Q-Chem is a general-purpose electronic structure package featuring a variety of established and new methods implemented using innovative algorithms that enable fast calculations of large systems on regular laboratory workstations using density functional and wave-function-based approaches. It features an integrated graphical interface and input generator, a large selection of functionals and correlation approaches including methods for electronically excited states and open-shell systems. In addition to serving the computational chemistry community, Q-Chem also provides an excellent development platform. © 2012 John Wiley & Sons, Ltd.

304 citations



Journal ArticleDOI
Frank Jensen1
TL;DR: A number of hierarchical basis sets have been proposed over the last two decades, enabling systematic approaches to assessing and controlling the errors due to incomplete basis sets; this paper outlines the principles for constructing basis sets and compares the compositions of eight families of basis sets available in several different qualities and for a reasonable number of elements in the periodic table.
Abstract: Electronic structure methods for molecular systems rely heavily on using basis sets composed of Gaussian functions for representing the molecular orbitals. A number of hierarchical basis sets have been proposed over the last two decades, and they have enabled systematic approaches to assessing and controlling the errors due to incomplete basis sets. We outline some of the principles for constructing basis sets, and compare the compositions of eight families of basis sets that are available in several different qualities and for a reasonable number of elements in the periodic table. © 2012 John Wiley & Sons, Ltd.
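
As an illustration of the hierarchy idea (standard textbook forms, not equations taken from the paper), a contracted Gaussian basis function and the common two-point extrapolation of the correlation energy toward the complete-basis-set (CBS) limit can be written as:

```latex
% Contracted Gaussian-type basis function built from primitive Gaussians
% with exponents \alpha_k and contraction coefficients d_{k\mu}
\chi_{\mu}(\mathbf{r}) = \sum_{k} d_{k\mu}\, N_{k}\, x^{l} y^{m} z^{n}\, e^{-\alpha_{k} r^{2}}

% Assuming E_corr(X) = E_corr(CBS) + A X^{-3} for cardinal numbers X-1 and X
E_{\mathrm{corr}}^{\mathrm{CBS}} \approx
\frac{X^{3} E_{\mathrm{corr}}(X) - (X-1)^{3} E_{\mathrm{corr}}(X-1)}{X^{3} - (X-1)^{3}}
```

Extrapolations of this kind are one concrete way in which hierarchical basis-set families allow the incompleteness error to be estimated and controlled.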

183 citations


Journal ArticleDOI
TL;DR: In this article, an explanation of the physical meaning of the electron propagator's poles and residues is followed by a discussion of its couplings to more complicated propagators and connections between Dyson orbitals and transition probabilities.
Abstract: Electron propagator theory provides a practical means of calculating electron binding energies, Dyson orbitals, and ground-state properties from first principles. This approach to ab initio electronic structure theory also facilitates the interpretation of its quantitative predictions in terms of concepts that closely resemble those of one-electron theories. An explanation of the physical meaning of the electron propagator's poles and residues is followed by a discussion of its couplings to more complicated propagators. These relationships are exploited in superoperator theory and lead to a compact form of the electron propagator that is derived by matrix partitioning. Expressions for reference-state properties, relationships to the extended Koopmans's theorem technique for evaluating electron binding energies, and connections between Dyson orbitals and transition probabilities follow from this discussion. The inverse form of the Dyson equation for the electron propagator leads to a strategy for obtaining electron binding energies and Dyson orbitals that generalizes the Hartree–Fock equations through the introduction of the self-energy operator. All relaxation and correlation effects reside in this operator, which has an energy-dependent, nonlocal form that is systematically improvable. Perturbative arguments produce several, convenient (e.g. partial third order, outer valence Green's function, and second-order, transition-operator) approximations for the evaluation of valence ionization energies, electron affinities, and core ionization energies. Renormalized approaches based on Hartree–Fock or approximate Brueckner orbitals are employed when correlation effects become qualitatively important. Reference-state total energies based on contour integrals in the complex plane and gradients of electron binding energies enable exploration of final-state potential energy surfaces. © 2012 John Wiley & Sons, Ltd.
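
For orientation, the central working equations named above can be stated in their standard form (generic notation, not copied from the review): the inverse Dyson equation and the resulting Hartree–Fock-like eigenvalue problem for Dyson orbitals.

```latex
% Inverse form of the Dyson equation: the self-energy \Sigma(E) corrects the
% zeroth-order (e.g., Hartree-Fock) propagator G_0
G^{-1}(E) = G_{0}^{-1}(E) - \Sigma(E)

% Poles E_i (electron binding energies) and Dyson orbitals follow from a
% generalized Koopmans-like equation in which the energy-dependent, nonlocal
% self-energy operator is added to the Fock operator
\bigl[\hat{F} + \hat{\Sigma}(E_{i})\bigr]\,\phi_{i}^{\mathrm{Dyson}} = E_{i}\,\phi_{i}^{\mathrm{Dyson}}
```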

172 citations


Journal ArticleDOI
TL;DR: Although a final and generally accepted multireference CC theory is still lacking, it is emphasized that recent developments render the new MRCC schemes useful tools for solving chemical problems.
Abstract: The multireference problem is considered one of the great challenges in coupled-cluster (CC) theory. Most recent developments are based on state-specific approaches, which focus on a single state and avoid some of the numerical problems of more general approaches. We review various state-of-the-art methods, including Mukherjee's state-specific multireference coupled-cluster (Mk-MRCC) theory, multireference Brillouin–Wigner coupled-cluster (MR-BWCC) theory, the MRexpT method, and internally contracted multireference coupled-cluster (ic-MRCC) theory. Related methods such as extended single-reference schemes [e.g., the complete active space coupled-cluster (CASCC) theory] and canonical transformation (CT) theory are covered as well. The comparison is done on the basis of formal arguments, implementation issues, and numerical results. Although a final and generally accepted multireference CC theory is still lacking, it is emphasized that recent developments render the new MRCC schemes useful tools for solving chemical problems. © 2012 John Wiley & Sons, Ltd.
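
For context (standard notation, not text from the review), the Jeziorski–Monkhorst ansatz underlying state-specific methods such as Mk-MRCC and MR-BWCC, and the single-exponential ansatz of internally contracted MRCC, read:

```latex
% Jeziorski-Monkhorst ansatz: one cluster operator T^{(\mu)} per reference
% determinant \Phi_\mu of the model space
|\Psi\rangle = \sum_{\mu} c_{\mu}\, e^{\hat{T}^{(\mu)}}\, |\Phi_{\mu}\rangle

% Internally contracted MRCC: a single exponential acts on the whole
% multiconfigurational reference function
|\Psi_{\mathrm{ic}}\rangle = e^{\hat{T}}\, |\Psi_{0}\rangle, \qquad
|\Psi_{0}\rangle = \sum_{\mu} c_{\mu}\, |\Phi_{\mu}\rangle
```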

113 citations


Journal ArticleDOI
TL;DR: This paper presents the key concepts underlying the computation of selected material properties and discusses the major classes of materials to which they are applied, focusing on methods used to describe single or polycrystalline bulk materials of semiconductor, metal or ceramic form.
Abstract: Materials science is a highly interdisciplinary field. It is devoted to the understanding of the relationship between (a) fundamental physical and chemical properties governing processes at the atomistic scale with (b) typically macroscopic properties required of materials in engineering applications. For many materials, this relationship is not only determined by chemical composition, but strongly governed by microstructure. The latter is a consequence of carefully selected process conditions (e.g., mechanical forming and annealing in metallurgy or epitaxial growth in semiconductor technology). A key task of computational materials science is to unravel the often hidden composition–structure–property relationships using computational techniques. The present paper does not aim to give a complete review of all aspects of materials science. Rather, we will present the key concepts underlying the computation of selected material properties and discuss the major classes of materials to which they are applied. Specifically, our focus will be on methods used to describe single or polycrystalline bulk materials of semiconductor, metal or ceramic form. © 2013 John Wiley & Sons, Ltd.

103 citations


Journal ArticleDOI
TL;DR: This review gives a brief overview of selected linear‐scaling QM approaches at the Hartree–Fock and density‐functional theory level with a particular emphasis on density matrix‐based approaches.
Abstract: Over the last decades, linear-scaling quantum–chemical methods (QM) have become an important tool for studying large molecular systems, so that already with modest computer resources molecules with more than a thousand atoms are well in reach. The key feature of the methods is the reduction of the steep scaling of the computational effort of conventional ab initio schemes to linear while reliability and accuracy of the underlying quantum–chemical approximation is preserved in the most successful schemes. This review gives a brief overview of selected linear-scaling approaches at the Hartree–Fock and density-functional theory level with a particular emphasis on density matrix-based approaches. The focus is not only on energetics, but also on the calculation of molecular properties providing an important link between theory and experiment. In addition, the usefulness of linear-scaling QM approaches within quantum mechanical/molecular mechanical (QM/MM) hybrid schemes is briefly discussed. WIREs Comput Mol Sci 2013, 3:614–636. doi: 10.1002/wcms.1138 For further resources related to this article, please visit the WIREs website.
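
One representative density-matrix-based ingredient (sketched here generically; the review covers a broader set of techniques) is purification, which replaces diagonalization by matrix polynomials that become linear scaling when the matrices are sparse:

```latex
% McWeeny purification in an orthonormal basis: eigenvalues of P above 1/2
% are driven to 1 and those below 1/2 to 0, so P converges to an idempotent
% projector (P^2 = P) onto the occupied orbitals; with sparse matrices each
% multiplication scales as O(N).
P_{n+1} = 3P_{n}^{2} - 2P_{n}^{3}
```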

102 citations


Journal ArticleDOI
TL;DR: The aim is to review the existing CG models of the double-stranded DNA; a small selection of models, which are believed to provide avenues for promising future development, is discussed in some detail.
Abstract: The growing interest in DNA-based mesoscale systems of biological and nonbiological nature has encouraged the computational molecular science community to develop coarse-grained (CG) representations of the DNA that are simple enough to permit exhaustive simulations in a reasonable amount of time, yet complex enough to capture the essential physics at play. In recent years, there have been some major developments in the DNA coarse-graining area, and several fairly sophisticated models are now available that faithfully reproduce key mechanical and chemical properties of the double- and single-stranded DNA. However, there are still many challenges that limit the applicability of the present models, and much remains to be done to develop more reliable schemes with predictive power beyond the target domain of the intrinsic parametrization. The development of robust, controllable, and transferable CG DNA force fields will provide an invaluable tool for gaining physical insights into the molecular nature of complex DNA-based nanoscale entities such as the chromatin, virus capsids, and DNA nanocomposites. In the present contribution, we provide an overview of the recent developments in the DNA coarse-graining field. Our aim is to review the existing CG models of the double-stranded DNA; a small selection of models, which we believe provide avenues for promising future development, is discussed in some detail. © 2012 John Wiley & Sons, Ltd.

95 citations


Journal ArticleDOI
TL;DR: In this paper, a Hessian-based predictor-corrector algorithm and a high-accuracy Hessian updating algorithm are described for enhancing the efficiency of direct dynamics simulations, in which an ensemble of trajectories is calculated which represents the experimental and chemical system under study.
Abstract: In classical and quasiclassical trajectory chemical dynamics simulations, the atomistic dynamics of collisions, chemical reactions, and energy transfer are studied by solving the classical equations of motion. These equations require the potential energy and its gradient for the chemical system under study, and they may be obtained directly from electronic structure theory. This article reviews such direct dynamics simulations. The accuracy of classical chemical dynamics is considered, with simulations highlighted for the F− + CH3OOH reaction and for energy transfer in collisions of CO2 with a perfluorinated self-assembled monolayer (F-SAM) surface. Procedures for interfacing chemical dynamics and electronic structure theory computer codes are discussed. A Hessian-based predictor–corrector algorithm and a high-accuracy Hessian updating algorithm, for enhancing the efficiency of direct dynamics simulations, are described. In these simulations, an ensemble of trajectories is calculated which represents the experimental and chemical system under study. Algorithms are described for selecting the appropriate initial conditions for bimolecular and unimolecular reactions and gas-surface collisions, and for initializing trajectories at transition states and conical intersections. Illustrative direct dynamics simulations are presented for the Cl− + CH3I SN2 reaction, unimolecular decomposition of the epoxy resin constituent CH3NHCHCHCH3 versus temperature, collisions and reactions of N-protonated diglycine with an F-SAM surface that has a reactive head group, and the product energy partitioning for the post-transition state dynamics of C2H5F → HF + C2H4 dissociation. © 2012 John Wiley & Sons, Ltd.
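
A minimal sketch of the propagation step at the heart of such simulations (velocity Verlet in atomic units); the `energy_and_gradient` callback standing in for the electronic structure code is hypothetical, and no real program interface is implied:

```python
import numpy as np

def energy_and_gradient(coords):
    """Hypothetical wrapper around an electronic structure code.

    Returns the potential energy (hartree) and its gradient (hartree/bohr)
    for Cartesian coordinates of shape (n_atoms, 3).
    """
    raise NotImplementedError("call your QM program here")

def velocity_verlet(coords, velocities, masses, dt, n_steps):
    """Propagate one classical trajectory with the velocity Verlet integrator.

    coords, velocities : (n_atoms, 3) arrays in atomic units
    masses             : (n_atoms,) array in atomic units
    dt                 : time step in atomic time units
    """
    m = masses[:, None]
    _, grad = energy_and_gradient(coords)
    forces = -grad
    for _ in range(n_steps):
        # half-step velocity update, then full-step position update
        velocities += 0.5 * dt * forces / m
        coords += dt * velocities
        # new forces from the electronic structure code (the expensive part)
        _, grad = energy_and_gradient(coords)
        forces = -grad
        # second half-step velocity update
        velocities += 0.5 * dt * forces / m
    return coords, velocities
```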

Journal ArticleDOI
TL;DR: A robust and efficient implementation of the Cholesky decomposition techniques for handling two‐electron integrals has been developed which is unique to MOLCAS and a powerful and multilayer graphical and scripting user interface is available, which is an essential ingredient for the setup of MC‐WFT calculations.
Abstract: At variance with most of the quantum chemistry software presently available, MOLCAS is a package that is specialized in multiconfigurational wave function theory (MC-WFT) rather than density functional theory (DFT). Given the much higher algorithmic complexity of MC-WFT versus DFT, an extraordinary effort needs to be made from the programming point of view to achieve state-of-the-art performance for large-scale calculations. In particular, a robust and efficient implementation of the Cholesky decomposition techniques for handling two-electron integrals has been developed which is unique to MOLCAS. Together with this 'Cholesky infrastructure', a powerful and multilayer graphical and scripting user interface is available, which is an essential ingredient for the setup of MC-WFT calculations. These two aspects of the MOLCAS software constitute the focus of the present report. © 2012 John Wiley & Sons, Ltd.
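
A minimal sketch, not the MOLCAS implementation, of the pivoted incomplete Cholesky decomposition used for two-electron integrals: the positive semidefinite matrix V with composite indices (pq|rs) is approximated as V ≈ L Lᵀ to a chosen threshold, with far fewer columns than the full composite dimension.

```python
import numpy as np

def cholesky_eri(V, threshold=1e-6):
    """Pivoted incomplete Cholesky decomposition of a PSD matrix V.

    V : (n, n) symmetric positive semidefinite matrix, e.g. the two-electron
        integrals (pq|rs) with composite row/column indices.
    Returns L of shape (n, k) with V ~ L @ L.T to the given threshold.
    """
    n = V.shape[0]
    diag = np.diag(V).copy()          # residual diagonal
    L = []
    while diag.max() > threshold:
        p = int(np.argmax(diag))      # pivot: largest residual diagonal element
        col = V[:, p].copy()
        # subtract contributions of previously generated Cholesky vectors
        for l in L:
            col -= l * l[p]
        l_new = col / np.sqrt(diag[p])
        L.append(l_new)
        diag -= l_new ** 2            # update residual diagonal
        diag = np.clip(diag, 0.0, None)
    return np.array(L).T if L else np.zeros((n, 0))
```

The savings come from the number of Cholesky vectors typically growing roughly linearly with the basis size, while the full integral matrix grows quadratically.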

Journal ArticleDOI
TL;DR: Most of the new algorithms are optimization based, designed to find optimal mappings with the minimum number of broken and formed bonds, and incorporate the chemical knowledge into the searching process in the form of bond weights.
Abstract: A reaction center is the part of a chemical reaction that undergoes changes, the heart of the chemical reaction. The reaction atom–atom mapping indicates which reactant atom becomes which product atom during the reaction. Automatic reaction mapping and reaction center detection are of great importance in many applications, such as developing chemical and biochemical reaction databases and studying reaction mechanisms. Traditional reaction mapping algorithms are either based on extended-connectivity or maximum common substructure (MCS) algorithms. With the development of several biochemical reaction databases (such as KEGG database) and increasing interest in studying metabolic pathways in recent years, several novel reaction mapping algorithms have been developed to serve the new needs. Most of the new algorithms are optimization based, designed to find optimal mappings with the minimum number of broken and formed bonds. Some algorithms also incorporate the chemical knowledge into the searching process in the form of bond weights. Some new algorithms showed better accuracy and performance than the MCS-based method. WIREs Comput Mol Sci 2013, 3:560–593. doi: 10.1002/wcms.1140 For further resources related to this article, please visit the WIREs website.
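
As a toy illustration of the objective such optimization-based mappers minimize (not any of the published algorithms, and ignoring bond orders and chemical weights), the cost of a candidate atom-to-atom mapping can be scored as the number of bonds broken plus bonds formed:

```python
def bond_change_cost(reactant_bonds, product_bonds, mapping):
    """Count broken plus formed bonds implied by an atom-atom mapping.

    reactant_bonds, product_bonds : sets of frozenset({atom_i, atom_j})
    mapping : dict sending each reactant atom index to a product atom index
    """
    # reactant bonds re-expressed in product atom numbering
    mapped = {frozenset(mapping[a] for a in bond) for bond in reactant_bonds}
    broken = mapped - product_bonds   # present in reactants, absent in products
    formed = product_bonds - mapped   # absent in reactants, present in products
    return len(broken) + len(formed)

# Hypothetical example: a 1,3-shift of atom 3 from atom 2 to atom 0
reactant = {frozenset({0, 1}), frozenset({1, 2}), frozenset({2, 3})}
product  = {frozenset({0, 1}), frozenset({1, 2}), frozenset({0, 3})}
identity = {i: i for i in range(4)}
print(bond_change_cost(reactant, product, identity))  # 2: one broken, one formed
```

An optimization-based mapper would search over permissible mappings for the one minimizing this cost, optionally weighting bond types by chemical knowledge as described above.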

Journal ArticleDOI
TL;DR: In this paper, advanced theories of resonance energy transfer are presented, with applications to photosynthetic light-harvesting complex systems and organic materials that demonstrate the capabilities of the new theoretical approaches and the remaining challenges.
Abstract: Recent experimental and theoretical studies suggest that biological photosynthetic complexes utilize the quantum coherence in a positive manner for efficient and robust flow of electronic excitation energy. Clear and quantitative understanding of such suggestion is important for identifying the design principles behind efficient flow of excitons coherently delocalized over multiple chromophores in condensed environments. Adaptation of such principles for synthetic macromolecular systems has also significant implication for the development of novel photovoltaic systems. Advanced theories of resonance energy transfer are presented, which can address these issues. Applications to photosynthetic light harvesting complex systems and organic materials demonstrate the capabilities of new theoretical approaches and future challenges.
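
As the classical point of departure that the advanced theories refine (standard Förster-type expressions, stated for orientation rather than as results of this paper), the weak-coupling resonance-energy-transfer rate is:

```latex
% Golden-rule form: V is the donor-acceptor electronic coupling and the
% integral is the overlap of the (area-normalized) donor emission and
% acceptor absorption line shapes
k_{\mathrm{RET}} = \frac{2\pi}{\hbar}\, |V|^{2} \int d\omega\; f_{D}(\omega)\, a_{A}(\omega)

% Equivalent Förster form in terms of the Förster radius R_0, the donor
% lifetime \tau_D, and the donor-acceptor separation R
k_{\mathrm{RET}} = \frac{1}{\tau_{D}} \left(\frac{R_{0}}{R}\right)^{6}
```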

Journal ArticleDOI
TL;DR: In this paper, the structural changes induced by vaporization, the exact characteristics of proteins in the gas phase, and the physicochemical forces stabilizing dehydrated proteins are reviewed using both experimental and theoretical sources of information.
Abstract: Proteins are complex macromolecules that evolved over billions of years to be active in aqueous solution. Water is a key element that stabilizes their structure, and most structural studies on proteins have thus been carried out in aqueous environment. However, recent experimental approaches have opened the possibility to gain structural information on proteins from gas-phase measurements. The obtained results revealed significant structural memory in proteins when transferred from water to the gas phase. However, after several years of experimental and theoretical research, the nature of the structural changes induced by vaporization, the exact characteristics of proteins in the gas phase, and the physicochemical forces stabilizing dehydrated proteins are still unclear. We will review here these issues using both experimental and theoretical sources of information. © 2012 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The atomistic techniques reviewed here are deemed necessary for exploration of the near infinite variations in atomic composition that exist even in the canonical nucleosome octamer.
Abstract: Nearly a dozen all-atom molecular dynamics (MD) simulations of the nucleosome have been performed. Collectively, these simulations provide insights into the structure and dynamics of the biomolecular complex that serves as the fundamental folding unit of chromatin. Nucleosomes contain 146 base pairs of DNA wrapped in a left-handed superhelix around a core of eight histones. This review provides a survey of what has been learned about DNA, histones, and solvent interactions based on all-atom MD studies of the nucleosome. The longest simulations to date are on the order of 100 nanoseconds. On this time scale, nucleosomes are quite stable. DNA kinks, the histone tails, solvent, and ions are highly dynamic and can be readily investigated using equilibrium dynamics methods. Steered MD is required to observe large-scale structural changes. The need for explicit solvent techniques is underscored by the inability of continuum solvent methods to properly describe the ion-nucleosome radial distribution functions. The atomistic techniques reviewed here are deemed necessary for exploration of the near infinite variations in atomic composition that exist even in the canonical nucleosome octamer. Continued development of these nascent simulation efforts will enable experimentalists to utilize rational design strategies in their efforts to investigate nucleosomes and chromatin. © 2013 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: This paper gives a brief overview of the current status of studies on the tautomeric properties of nucleic acid bases and base pairs in different environments; high-level theoretical calculations have proved extremely useful in interpreting complex experimental results, for example in the assignment of relatively high-energy tautomers of guanine in the gas phase.
Abstract: This article provides a brief overview of the current status of studies on the tautomeric properties of nucleic acid bases and base pairs in different environments. Applications of high-level theoretical calculations have been found to be extremely useful in interpreting and elucidating complex experimental results. Theory and experiment are complementary to each other, and it is therefore not surprising that theoretical calculations were necessary to assign relatively high-energy tautomers of guanine in the gas phase. The inability to observe the stable tautomers in laser-desorbed guanine in a jet-cooled beam demands more experiments and rigorous analysis. The usefulness of theoretical methods in the assignment of tautomeric forms of other bases and base pairs is also discussed. Furthermore, the recent surge in high-level computations on systems such as nucleic acid bases and base pairs has only been possible because of advances in state-of-the-art computer hardware and computational algorithms. It is therefore expected that future investigations will employ high-level quantum chemistry calculations more rigorously in explaining complex experimental data as well as in revealing new phenomena not studied by experiments. WIREs Comput Mol Sci 2013, 3:637–649. doi: 10.1002/wcms.1145 For further resources related to this article, please visit the WIREs website.

Journal ArticleDOI
TL;DR: The utility of MIF‐based in silico approaches in drug design is extremely broad, including approaches to support experimental design in hit‐finding, lead‐optimization, physicochemical property prediction and PK modeling, drug metabolism prediction, and toxicity.
Abstract: Drug discovery is a highly complex and costly process, and in recent years, the pharmaceutical industry has shifted from traditional to genomics- and proteomics-based drug research strategies. The identification of druggable target sites, promising hits, and high quality leads are crucial steps in the early stages of drug discovery projects. Pharmacokinetic (PK) and drug metabolism profiling to optimize bioavailability, clearance, and toxicity are increasingly important areas to prevent costly failures in preclinical and clinical studies. The integration of a wide variety of technologies and expertise in multidisciplinary research teams, combining synergistic experimental and computational approaches to the selection and optimization of bioactive compounds, is now commonplace in passing these hurdles, although challenging areas remain. Molecular interaction fields (MIFs) are widely used in a range of applications to support the discovery teams, characterizing molecules according to their favorable interaction sites and therefore enabling predictions to be made about how molecules might interact. The utility of MIF-based in silico approaches in drug design is extremely broad, including approaches to support experimental design in hit-finding, lead-optimization, physicochemical property prediction and PK modeling, drug metabolism prediction, and toxicity. © 2013 John Wiley & Sons, Ltd.
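
A minimal sketch of the MIF idea (a probe's Lennard-Jones plus Coulomb energy evaluated on a grid around the molecule); the probe parameters, combination rules, and dielectric treatment here are illustrative placeholders, not the probes or force field of any production MIF program:

```python
import numpy as np

def interaction_field(grid_points, atom_coords, atom_charges, atom_eps, atom_rmin,
                      probe_charge=1.0, probe_eps=0.15, probe_rmin=1.6):
    """Probe interaction energy at each grid point around a molecule.

    grid_points : (G, 3) Cartesian grid in angstrom
    atom_*      : per-atom coordinates, charges (e), LJ eps (kcal/mol), rmin/2 (A)
    Probe defaults are illustrative, not values from any published force field.
    Returns G probe energies (kcal/mol); low values mark favorable sites.
    """
    COULOMB = 332.06  # kcal/mol * A / e^2
    d = np.linalg.norm(grid_points[:, None, :] - atom_coords[None, :, :], axis=-1)
    d = np.maximum(d, 1e-3)                      # avoid division by zero
    # electrostatics with a uniform dielectric of 1, for simplicity
    elec = COULOMB * probe_charge * atom_charges[None, :] / d
    # Lennard-Jones in rmin form with simple combination rules
    eps = np.sqrt(probe_eps * atom_eps)[None, :]
    rmin = (probe_rmin + atom_rmin)[None, :]
    lj = eps * ((rmin / d) ** 12 - 2.0 * (rmin / d) ** 6)
    return (elec + lj).sum(axis=1)
```

Repeating the calculation with different probe types (charged, hydrophobic, hydrogen-bonding) yields the set of fields used to characterize favorable interaction sites.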

Journal ArticleDOI
TL;DR: Druggability predictions assess the ability of a given binding site to host drug‐like organic molecules and can be used to prioritize therapeutic targets and are particularly useful when moving beyond the traditional target classes.
Abstract: Pharmacological treatment with small organic molecules offers important medical, economic, and practical advantages over other therapeutic approaches. However, such small molecules can only elicit an effect when they bind to a biological component at the appropriate site and with sufficient affinity to modify its behavior. Druggability predictions assess the ability of a given binding site to host drug-like organic molecules. Combined with information about the involvement of such component in a disease, druggability predictions can be used to prioritize therapeutic targets and are particularly useful when moving beyond the traditional target classes. In the last few years, significant progress has been made to understand the molecular basis of druggability, to compile datasets of targets with various degrees of druggability, and to develop a diverse set of computational prediction methods. These tools offer a better prospect for target-based drug discovery. © 2012 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: In this article, the use of electrostatic potentials at nuclei (EPN) in characterizing the reactivity of substituted aromatic compounds when the reaction center is situated outside the aromatic ring is discussed.
Abstract: Recent advances have been achieved in the quantitative description of the reactivity of aromatic compounds in terms of simple parameters derived from theoretical computations. The first part of this review surveys the use of electrostatic potentials at nuclei (EPN) in characterizing the reactivity of substituted aromatic compounds when the reaction center is situated outside the aromatic ring. The application of EPN for several typical reactions of substituted aromatic systems is described in detail. The performance of alternative reactivity descriptors, such as theoretical atomic charges, the Parr electrophilicity index, and the experimental Hammett constants, is considered as well. The second part of this review discusses the recently proposed electrophile affinity construct for quantifying reactivity and regiospecificity for the most typical reaction of arenes: electrophilic aromatic substitution. The characterization of reactivity of aromatic molecules in terms of proton affinities and arene nucleophilicity indices is surveyed briefly.
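
For reference, the quantity behind the EPN descriptor is the molecular electrostatic potential evaluated at a nucleus Y, with the self-interaction of that nucleus excluded (standard definition, consistent with the usage described above):

```latex
V_{Y} \;=\; \sum_{A \neq Y} \frac{Z_{A}}{|\mathbf{R}_{A} - \mathbf{R}_{Y}|}
\;-\; \int \frac{\rho(\mathbf{r})}{|\mathbf{r} - \mathbf{R}_{Y}|}\, d\mathbf{r}
```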

Journal ArticleDOI
TL;DR: In this paper, the basic chemicophysical concepts and the most recent developments in the dynamics of elementary electron transfer reactions are reviewed, paying particular attention to discrete state approaches, which combine the use of a few experimental data with reliable ab initio calculations of the equilibrium nuclear configurations and normal coordinates of vibration of the redox partners.
Abstract: The basic chemicophysical concepts and the most recent developments in the dynamics of elementary electron transfer reactions are reviewed, paying particular attention to discrete state approaches, which combine the use of a few experimental data with reliable ab initio calculations of the equilibrium nuclear configurations and normal coordinates of vibration of the redox partners. WIREs Comput Mol Sci 2013, 3:542–559. doi: 10.1002/wcms.1147 For further resources related to this article, please visit the WIREs website.
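
As the baseline that such discrete-state vibronic treatments refine (the standard high-temperature nonadiabatic Marcus rate, stated here for orientation rather than as a result of the review):

```latex
% H_{DA}: electronic coupling between donor and acceptor diabatic states,
% \lambda: reorganization energy, \Delta G^{0}: reaction driving force
k_{\mathrm{ET}} \;=\; \frac{2\pi}{\hbar}\, |H_{DA}|^{2}\,
\frac{1}{\sqrt{4\pi \lambda k_{B} T}}\,
\exp\!\left[-\frac{(\Delta G^{0} + \lambda)^{2}}{4 \lambda k_{B} T}\right]
```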

Journal ArticleDOI
TL;DR: In this paper, the authors consider the computational scheme of the intermediate neglect of differential overlap for spectroscopy method and its recent applications in chemistry, biophysics, and material science.
Abstract: Intermediate neglect of differential overlap for spectroscopy is a semiempirical approach, which is widely used to calculate spectroscopic and electron-transfer properties of various molecular systems including biological molecules, transition metal compounds, and advanced materials. In the review, we consider the computational scheme of the method and its recent applications in chemistry, biophysics, and material science. © 2013 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The use of computational methods in carbene chemistry has a long-standing tradition as mentioned in this paper, and the field has come a long way since the first ab initio calculations on methylene.
Abstract: The use of computational methods in carbene chemistry has a long-standing tradition. Indeed, the field has come a long way since the first ab initio calculations on methylene. Computations now routinely accompany most experimental studies, either to validate the obtained results or to help design appropriate experiments. Advances in computational carbene chemistry within the last decade are covered in this text, encompassing a plethora of studies on alkyl-, aryl-, halo-, and heterocarbenes (N, P, O, S) as well as on persistent triplet carbenes. Moreover, the conceptual advancements in the fields of theoretical chemistry and computing technology have enabled researchers to conduct intricate ab initio studies. The application of leading-edge theory to multireference problems, high-accuracy thermochemical evaluations, atom tunneling, and the description of bonding is thoroughly reviewed. In addition, general recommendations for the choice of an appropriate method for a specific computational problem are given. Practitioners of the art are likely to discover new computational approaches in carbene chemistry applied to various examples from the current literature. © 2012 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: A number of approaches that have been developed to predict the ability of proteins and RNA molecules to associate are discussed.
Abstract: Ribonucleoprotein interactions play important roles in a wide variety of cellular processes, ranging from transcriptional and posttranscriptional regulation of gene expression to host defense against pathogens. High throughput experiments to identify RNA–protein interactions provide information about the complexity of interaction networks, but require time and considerable efforts. Thus, there is need for reliable computational methods for predicting ribonucleoprotein interactions. In this review, we discuss a number of approaches that have been developed to predict the ability of proteins and RNA molecules to associate. © 2012 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: There has been a significant surge in the number of computational approaches available to mine and identify bioisosteric replacements for fragments of bioactive compounds.
Abstract: Bioisosterism is a key concept in medicinal chemistry, as it allows medicinal chemists to interchange structural fragments without significant perturbation in biological activity. Not surprisingly, given the vast amount of bioactivity data and chemoinformatics resources now available, there has been a significant surge in the number of computational approaches available to mine and identify bioisosteric replacements for fragments of bioactive compounds. Such methods have certainly provided medicinal chemists with a diverse arsenal of in-house, commercial, and academic tools and interfaces to aid in the optimization across a number of end points such as bioactivity; selectivity; and absorption, distribution, metabolism, excretion, and toxicity properties for effective and efficient drug design. These in silico bioisosteric replacement mining approaches can generally be divided into two categories, namely ligand based and structure based. The approaches of the former category use information that is derived from ligands, whereas the latter category requires knowledge of the biological target, as well as specific knowledge of the interactions between ligand and target in the binding pocket. Ligand-based methodologies are also typically divided into similarity-based and database mining (or knowledge-based) approaches. In general, the former provide an assessment of putative bioisosteric fragments and substructures, in terms of molecular topology and descriptors, whereas the latter extract structural transformations from large chemical repositories and associate them with the induced change in biological or any other property of interest. Following systematic retrospective studies, a large number of nonclassical bioisosteric equivalents have been reported in the literature.
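
A minimal sketch of the similarity-based strategy (Tanimoto coefficient over binary substructure fingerprints); the bit sets below are hand-made toys, not a real fingerprinting scheme:

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto (Jaccard) similarity between two sets of 'on' fingerprint bits."""
    a, b = set(fp_a), set(fp_b)
    common = len(a & b)
    return common / (len(a) + len(b) - common) if (a or b) else 0.0

# Toy fingerprints: indices of substructure bits set for a query fragment and
# two candidate replacements, ranked by similarity to the query.
query      = {1, 4, 7, 9, 12}
candidate1 = {1, 4, 7, 9, 15}       # hypothetical bioisosteric replacement
candidate2 = {2, 3, 5, 8, 11}
print(tanimoto(query, candidate1))  # 0.666... -> more similar
print(tanimoto(query, candidate2))  # 0.0
```

Knowledge-based mining works differently: rather than scoring similarity, it extracts fragment transformations observed across large datasets and associates them with the induced property changes.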

Journal ArticleDOI
TL;DR: The role of curvature on the electronic properties of curved carbon π-systems is discussed for systems of varying dimensionalities, ranging from 0D (fullerenes, molecular bowls) to 1D (carbon nanotubes) and 3D (bulk crystals).
Abstract: The extended family of curved carbon π-systems offers a unique possibility for building up structures with a tunable spectrum of structural and electronic properties. Such a structure-property profile motivates the creative use of these materials as active components in molecular devices. Key to these functional building blocks is the curvature, which confines the electronic states in one or more directions (nanoscale directions), imparting remarkable physical phenomena to a material. In this respect, the formation of electronic excitations in the form of excitons has a fundamental role in determining the optical and transport properties of this class of materials. The role of the curvature on the electronic properties of curved aromatics is discussed for systems of varying dimensionalities, ranging from 0D (fullerenes, molecular bowls) to 1D (carbon nanotubes) and 3D (bulk crystals). Recent progress in the area of optical and transport properties of the largest classes of curved aromatic systems is discussed, and focus is given to molecules in isolation, molecules on surfaces, crystalline systems, and molecular nanojunctions. © 2012 John Wiley & Sons, Ltd.

Journal ArticleDOI
TL;DR: The quadratic configuration interaction (QCI) approach is a special form of size-extensive configuration interaction, and it has been shown that QCI in its original form can be converted into a series of simplified coupled-cluster methods.
Abstract: Configuration interaction (CI) theory has dominated the first 50 years of quantum chemistry before it was replaced by many-body perturbation theory and coupled cluster theory. However, even today it plays an important role in the education of everybody who wants to enter the realm of quantum chemistry. Apart from this, full CI is the method of choice for getting exact energies for a given basis set. The development of CI theory from the early days of quantum chemistry up to our time is described with special emphasis on the size-extensivity problem, which after its discovery has reduced the use of CI methods considerably. It led to the development of the quadratic CI (QCI) approach as a special form of size-extensive CI. Intimately linked with QCI is the scientific dispute between QCI developers and their opponents, who argued that the QCI approach in its original form does not lead to a set of size-extensive CI methods. This dispute was settled when it was shown that QCI in its original form can be converted into a generally defined series of size-extensive methods, which however have to be viewed as a series of simplified coupled cluster methods rather than a series of size-extensive CI methods. © 2013 John Wiley & Sons, Ltd.
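
To make the size-extensivity issue concrete (standard notation, not taken from the review): truncated CI keeps only linear excitation operators, whereas the coupled-cluster exponential also generates the disconnected terms (such as T2²/2) whose absence makes CISD energies non-additive for noninteracting fragments.

```latex
% Truncated CI: linear parametrization
|\Psi_{\mathrm{CISD}}\rangle = \bigl(1 + \hat{T}_{1} + \hat{T}_{2}\bigr)\, |\Phi_{0}\rangle

% Coupled cluster: exponential parametrization, restoring size-extensivity
|\Psi_{\mathrm{CCSD}}\rangle = e^{\hat{T}_{1} + \hat{T}_{2}}\, |\Phi_{0}\rangle
= \Bigl(1 + \hat{T}_{1} + \hat{T}_{2} + \tfrac{1}{2}\hat{T}_{1}^{2}
+ \hat{T}_{1}\hat{T}_{2} + \tfrac{1}{2}\hat{T}_{2}^{2} + \cdots\Bigr)\, |\Phi_{0}\rangle
```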

Journal ArticleDOI
TL;DR: Cheminformatics is evolving from being a field of study associated primarily with drug discovery into a discipline that embraces the distribution, management, access, and sharing of chemical data; these developments depend on a range of factors, including the principles of chemical identifiers and their role in relationships between chemical and biological entities.
Abstract: Cheminformatics is evolving from being a field of study associated primarily with drug discovery into a discipline that embraces the distribution, management, access, and sharing of chemical data. The relationship with the related subject of bioinformatics is becoming stronger and better defined, owing to the influence of Semantic Web technologies, which enable researchers to integrate heterogeneous sources of chemical, biochemical, biological, and medical information. These developments depend on a range of factors: the principles of chemical identifiers and their role in relationships between chemical and biological entities; the importance of preserving provenance and properly curated metadata; and an understanding of the contribution that the Semantic Web can make at all stages of the research lifecycle. The movements toward open access, open source, and open collaboration all contribute to progress toward the goals of integration.

Journal ArticleDOI
TL;DR: Now well into its second century, the determination of enthalpies of formation remains an active research field: classical combustion thermochemistry is carried out with precision in several laboratories, though usually on the microscale, as appropriate to the small quantities of rare or unstable species preparative chemists are able to win and purify.
Abstract: Determination of enthalpies of formation, now well into its second century, continues to be an active research field. Classical combustion thermochemistry, known by Lavoisier, is carried out with precision in several laboratories, though usually on the microscale, as appropriate to the small quantities of rare or unstable species preparative chemists are able to win and purify. Nonclassical methods such as differential scanning calorimetry and proton emission techniques are practiced. Enthalpy estimation based on additivity has been brought to an improved level of accuracy, and its basis in molecular structure has been examined with the goal of achieving maximum simplicity. Discrepancies between experimental results and additive estimates due to 'special effects' have brought about a considerable amount of causative speculation in the literature. Quantum mechanical methods have enjoyed increased proliferation through new methods of finding enthalpies of formation and other thermochemical and molecular properties such as heat capacity and entropy. Powerful basis set and configuration interaction software is available within the Gaussian suites of programs. New levels of accuracy, in the kilojoules per mole range, have been achieved by Wn methods, and wider generality is enjoyed by methods based on density functional theory. New tabulation methods have been introduced that use computer error estimation procedures to root out flawed experimental results and increase overall reliability of the data one selects from the compilation. © 2012 John Wiley & Sons, Ltd.
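
Two of the estimation routes mentioned can be summarized in generic form (illustrative symbols only; the specific group values and the Wn working equations are not reproduced here):

```latex
% Group additivity: a sum of transferable group contributions, plus
% corrections for 'special effects' such as ring strain
\Delta_{f}H^{\circ}(\mathrm{M}) \approx \sum_{i} n_{i}\, \Delta H(\mathrm{group}_{i}) + \text{corrections}

% Quantum-chemical route via the computed atomization energy \sum D_0(M),
% anchored to the experimental enthalpies of formation of the free atoms
\Delta_{f}H^{\circ}_{0\,\mathrm{K}}(\mathrm{M}) =
\sum_{A \in \mathrm{M}} \Delta_{f}H^{\circ}_{0\,\mathrm{K}}(A) \;-\; \textstyle\sum D_{0}(\mathrm{M})
```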

Journal ArticleDOI
Peifeng Su1, Wei Wu1
Abstract: Ministry of Science and Technology of China [2011CB808504]; Natural Science Foundation of China [21120102035, 21003101]