SciSpace
Author

Eric Vanden-Eijnden

Bio: Eric Vanden-Eijnden is an academic researcher from New York University. The author has contributed to research in topics: Limit (mathematics) & Rare events. The author has an h-index of 65, has co-authored 234 publications receiving 16,483 citations. Previous affiliations of Eric Vanden-Eijnden include Institute for Advanced Study & Mercer University.


Papers
Journal ArticleDOI
TL;DR: In this paper, the authors present an efficient method for computing transition pathways, free energy barriers, and transition rates in complex systems with relatively smooth energy landscapes. The method evolves strings, i.e., smooth curves with intrinsic parametrization whose dynamics takes them to the most probable transition path between two metastable regions in configuration space.
Abstract: We present an efficient method for computing the transition pathways, free energy barriers, and transition rates in complex systems with relatively smooth energy landscapes. The method proceeds by evolving strings, i.e., smooth curves with intrinsic parametrization whose dynamics takes them to the most probable transition path between two metastable regions in configuration space. Free energy barriers and transition rates can then be determined by a standard umbrella sampling around the string. Applications to Lennard-Jones cluster rearrangement and thermally induced switching of a magnetic film are presented.
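The string evolution described in the abstract can be illustrated with a minimal serial sketch: relax each image of a discretized curve downhill on the potential, then redistribute the images uniformly by arc length (the "intrinsic parametrization"). The double-well potential, step sizes, and discretization below are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

def grad_V(p):
    # Gradient of a toy double-well potential V(x, y) = (x^2 - 1)^2 + y^2
    # (illustrative choice with minima at (-1, 0) and (1, 0), saddle at the origin)
    x, y = p[..., 0], p[..., 1]
    return np.stack([4.0 * x * (x**2 - 1.0), 2.0 * y], axis=-1)

def reparametrize(string):
    # Redistribute images uniformly by arc length via linear interpolation
    seg = np.linalg.norm(np.diff(string, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])
    s /= s[-1]
    t = np.linspace(0.0, 1.0, len(string))
    return np.stack([np.interp(t, s, string[:, k]) for k in range(2)], axis=1)

def string_method(a, b, n_images=21, dt=0.01, n_steps=2000):
    # Initial string: straight line between the two metastable states,
    # perturbed off the axis so the relaxation is visible
    string = np.linspace(a, b, n_images)
    string[:, 1] += 0.3 * np.sin(np.pi * np.linspace(0.0, 1.0, n_images))
    for _ in range(n_steps):
        string -= dt * grad_V(string)      # relax each image downhill
        string[0], string[-1] = a, b       # keep endpoints fixed at the minima
        string = reparametrize(string)     # restore equal arc-length spacing
    return string

path = string_method(np.array([-1.0, 0.0]), np.array([1.0, 0.0]))
# The converged string passes through the saddle region near (0, 0)
```

For this symmetric toy potential, the converged midpoint of the string sits at the saddle point; in the paper's setting, free energy barriers and rates are then obtained by umbrella sampling around the converged string.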

1,070 citations

Journal Article
TL;DR: This paper gives a systematic introduction to HMM, the heterogeneous multiscale methods, including the fundamental design principles behind the HMM philosophy and the main obstacles that have to be overcome.
Abstract: This paper gives a systematic introduction to HMM, the heterogeneous multiscale methods, including the fundamental design principles behind the HMM philosophy and the main obstacles that have to be overcome.

774 citations

Journal ArticleDOI
TL;DR: An approach is presented that allows for the reconstruction of the full ensemble of folding pathways from simulations that are much shorter than the folding time, and reveals the existence of misfolded trap states outside the network of efficient folding intermediates that significantly reduce the folding speed.
Abstract: Characterizing the equilibrium ensemble of folding pathways, including their relative probability, is one of the major challenges in protein folding theory today. Although this information is in principle accessible via all-atom molecular dynamics simulations, it is difficult to compute in practice because protein folding is a rare event and the affordable simulation length is typically not sufficient to observe an appreciable number of folding events, unless very simplified protein models are used. Here we present an approach that allows for the reconstruction of the full ensemble of folding pathways from simulations that are much shorter than the folding time. This approach can be applied to all-atom protein simulations in explicit solvent. It does not use a predefined reaction coordinate but is based on partitioning the state space into small conformational states and constructing a Markov model between them. A theory is presented that allows for the extraction of the full ensemble of transition pathways from the unfolded to the folded configurations. The approach is applied to the folding of a PinWW domain in explicit solvent where the folding time is two orders of magnitude larger than the length of individual simulations. The results are in good agreement with kinetic experimental data and give detailed insights about the nature of the folding process which is shown to be surprisingly complex and parallel. The analysis reveals the existence of misfolded trap states outside the network of efficient folding intermediates that significantly reduce the folding speed.
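The Markov-model construction can be sketched in a few lines: count transitions between discrete conformational states observed in many short trajectories, row-normalize into a transition matrix, and solve for the committor (the probability of reaching the folded set before the unfolded one), from which transition pathways are extracted. The four-state chain below is a toy stand-in, not the paper's PinWW model.

```python
import numpy as np

def transition_matrix(trajs, n_states, lag=1):
    # Count observed jumps at the chosen lag time, then row-normalize
    C = np.zeros((n_states, n_states))
    for traj in trajs:
        for i, j in zip(traj[:-lag], traj[lag:]):
            C[i, j] += 1.0
    C += 1e-12  # guard against empty rows
    return C / C.sum(axis=1, keepdims=True)

def committor(T, unfolded, folded):
    # q[i] = probability of hitting `folded` before `unfolded`, starting from i.
    # Solve (T - I) q = 0 on intermediate states with boundary conditions
    # q = 0 on the unfolded set and q = 1 on the folded set.
    n = T.shape[0]
    q = np.zeros(n)
    q[list(folded)] = 1.0
    inter = [i for i in range(n) if i not in unfolded and i not in folded]
    A = T[np.ix_(inter, inter)] - np.eye(len(inter))
    b = -T[np.ix_(inter, list(folded))].sum(axis=1)
    q[inter] = np.linalg.solve(A, b)
    return q

# Toy 4-state chain: 0 = unfolded, 3 = folded, 1 and 2 intermediates
rng = np.random.default_rng(0)
T_true = np.array([[0.9, 0.1, 0.0, 0.0],
                   [0.2, 0.6, 0.2, 0.0],
                   [0.0, 0.2, 0.6, 0.2],
                   [0.0, 0.0, 0.1, 0.9]])
trajs = []
for _ in range(200):             # many short trajectories, as in the paper's setting
    s, traj = int(rng.integers(4)), []
    for _ in range(50):
        traj.append(s)
        s = int(rng.choice(4, p=T_true[s]))
    trajs.append(traj)
T = transition_matrix(trajs, 4)
q = committor(T, unfolded={0}, folded={3})
# q increases monotonically from 0 (unfolded) to 1 (folded)
```

Note that each trajectory here is far shorter than a typical first-passage time from state 0 to state 3, yet the committor is still recovered — the same reason the paper can reconstruct folding pathways from simulations much shorter than the folding time.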

772 citations

Journal ArticleDOI
TL;DR: The heterogeneous multiscale method (HMM), a general framework for designing multiscale algorithms, is reviewed, and emphasis is given to the error analysis that comes naturally with the framework.
Abstract: The heterogeneous multiscale method (HMM), a general framework for designing multiscale algorithms, is reviewed. Emphasis is given to the error analysis that comes naturally with the framework. Examples of finite element and finite difference HMM are presented. Applications to dynamical systems and stochastic simulation algorithms with multiple time scales, spall fracture and heat conduction in microprocessors are discussed.
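For dynamical systems with multiple time scales, the HMM pattern is: a macro solver advances the slow variable, calling a micro solver at each step to relax the fast variable and estimate the effective slow force. A minimal sketch on an assumed toy system dx/dt = -y, dy/dt = -(y - x)/eps (whose effective dynamics is dx/dt = -x):

```python
import numpy as np

def hmm_step(x, y, dt_macro, eps=1e-4, n_micro=200):
    # Micro solver: relax the fast variable y toward its equilibrium at frozen x
    dt_micro = eps / 10.0
    for _ in range(n_micro):
        y += dt_micro * (-(y - x) / eps)
    # Macro solver: one forward-Euler step for x using the estimated slow force -y
    x += dt_macro * (-y)
    return x, y

x, y, t = 1.0, 0.0, 0.0
dt = 0.01          # macro step, far larger than the fast time scale eps
while t < 1.0:
    x, y = hmm_step(x, y, dt)
    t += dt
# Effective dynamics is dx/dt = -x, so x(1) is close to exp(-1)
```

The macro step (0.01) is a hundred times larger than the fast time scale (1e-4); a direct solver would need steps below eps everywhere, which is the cost saving the framework is designed around.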

675 citations

Journal ArticleDOI
TL;DR: A computational technique is proposed which combines the string method with a sampling technique to determine minimum free energy paths. It captures the mechanism of transition in that it allows one to determine the committor function for the reaction and, in particular, the transition state region.
Abstract: A computational technique is proposed which combines the string method with a sampling technique to determine minimum free energy paths. The technique requires only the computation of the mean force and another conditional expectation locally along the string, and therefore can be applied even if the number of collective variables kept in the free energy calculation is large. This is in contrast with other free energy sampling techniques, which aim at mapping the full free energy landscape and whose cost increases exponentially with the number of collective variables kept in the free energy calculation. Provided that the number of collective variables is large enough, the new technique captures the mechanism of transition in that it allows one to determine the committor function for the reaction and, in particular, the transition state region. The new technique is illustrated on the example of alanine dipeptide, for which we compute the minimum free energy path for the isomerization transition using either two or four dihedral angles as collective variables. It is shown that the mechanism of transition can be captured using the four dihedral angles, but it cannot be captured using only two of them.

662 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently, namely those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90%, and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
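The spatial-decomposition idea — each processor owns a region of the box, and only neighboring regions can contain interaction partners for short-range forces — is the same locality exploited by a serial cell list. A minimal sketch (box size, cutoff, and particle data are illustrative), checked against the O(N^2) double loop:

```python
from itertools import product
import numpy as np

def neighbor_pairs_cells(pos, box, cutoff):
    # Bin particles into cubic cells of side >= cutoff; then only the 27
    # neighboring cells (with periodic wrap) can hold partners within the cutoff.
    n_cells = max(1, int(box // cutoff))
    side = box / n_cells
    cells = {}
    for idx, p in enumerate(pos):
        key = tuple((p // side).astype(int) % n_cells)
        cells.setdefault(key, []).append(idx)
    pairs = set()
    for key, members in cells.items():
        for d in product((-1, 0, 1), repeat=3):
            nb = tuple((k + dk) % n_cells for k, dk in zip(key, d))
            for i in members:
                for j in cells.get(nb, []):
                    if i < j:
                        r = pos[i] - pos[j]
                        r -= box * np.round(r / box)   # minimum-image convention
                        if np.dot(r, r) < cutoff**2:
                            pairs.add((i, j))
    return pairs

rng = np.random.default_rng(1)
box, cutoff = 10.0, 2.5
pos = rng.uniform(0.0, box, size=(200, 3))
fast = neighbor_pairs_cells(pos, box, cutoff)
# Brute-force check: the O(N^2) double loop must find exactly the same pairs
brute = set()
for i in range(len(pos)):
    for j in range(i + 1, len(pos)):
        r = pos[i] - pos[j]
        r -= box * np.round(r / box)
        if np.dot(r, r) < cutoff**2:
            brute.add((i, j))
assert fast == brute
```

The cell search touches only a constant number of neighboring cells per particle, so its cost scales as O(N) rather than O(N^2); assigning each cell (or block of cells) to a different processor gives the paper's spatial algorithm.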

29,323 citations

Journal ArticleDOI
TL;DR: QUANTUM ESPRESSO as discussed by the authors is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave).
Abstract: QUANTUM ESPRESSO is an integrated suite of computer codes for electronic-structure calculations and materials modeling, based on density-functional theory, plane waves, and pseudopotentials (norm-conserving, ultrasoft, and projector-augmented wave). The acronym ESPRESSO stands for opEn Source Package for Research in Electronic Structure, Simulation, and Optimization. It is freely available to researchers around the world under the terms of the GNU General Public License. QUANTUM ESPRESSO builds upon newly-restructured electronic-structure codes that have been developed and tested by some of the original authors of novel electronic-structure algorithms and applied in the last twenty years by some of the leading materials modeling groups worldwide. Innovation and efficiency are still its main focus, with special attention paid to massively parallel architectures, and a great effort being devoted to user friendliness. QUANTUM ESPRESSO is evolving towards a distribution of independent and interoperable codes in the spirit of an open-source project, where researchers active in the field of electronic-structure calculations are encouraged to participate in the project by contributing their own codes or by implementing their own ideas into existing codes.

19,985 citations

28 Jul 2005
TL;DR: PfPMP1 interacts with one or more receptors on infected erythrocytes, dendritic cells, and the placenta, playing a key role in adhesion and immune evasion.
Abstract: Antigenic variation allows many pathogenic microorganisms to evade host immune responses. Plasmodium falciparum erythrocyte membrane protein 1 (PfPMP1), expressed on the surface of infected erythrocytes, interacts with one or more receptors on infected erythrocytes, endothelial cells, dendritic cells, and the placenta, and plays a key role in adhesion and immune evasion. Each haploid genome contains a var gene family encoding about 60 members; switching transcription among different var gene variants provides the molecular basis for antigenic variation.

18,940 citations

Journal ArticleDOI
TL;DR: A thorough exposition of community structure, or clustering, is attempted, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists.
Abstract: The modern science of networks has brought significant advances to our understanding of complex systems. One of the most relevant features of graphs representing real systems is community structure, or clustering, i.e. the organization of vertices in clusters, with many edges joining vertices of the same cluster and comparatively few edges joining vertices of different clusters. Such clusters, or communities, can be considered as fairly independent compartments of a graph, playing a role similar to, e.g., that of tissues or organs in the human body. Detecting communities is of great importance in sociology, biology and computer science, disciplines where systems are often represented as graphs. This problem is very hard and not yet satisfactorily solved, despite the huge effort of a large interdisciplinary community of scientists working on it over the past few years. We will attempt a thorough exposition of the topic, from the definition of the main elements of the problem, to the presentation of most methods developed, with a special focus on techniques designed by statistical physicists, from the discussion of crucial issues like the significance of clustering and how methods should be tested and compared against each other, to the description of applications to real networks.
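The "many edges within clusters, few between" notion is commonly quantified by Newman–Girvan modularity, Q = (1/2m) Σ_ij [A_ij − k_i k_j / 2m] δ(c_i, c_j), which many of the surveyed methods optimize. A minimal sketch (the two-triangle toy graph is illustrative):

```python
import numpy as np

def modularity(A, labels):
    # Q = (1/2m) * sum_ij (A_ij - k_i * k_j / 2m) * delta(c_i, c_j)
    m2 = A.sum()                       # equals 2m for an undirected adjacency matrix
    k = A.sum(axis=1)                  # vertex degrees
    same = labels[:, None] == labels[None, :]
    return float(((A - np.outer(k, k) / m2) * same).sum() / m2)

# Toy graph: two triangles joined by a single bridge edge (2, 3)
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
good = np.array([0, 0, 0, 1, 1, 1])    # split along the bridge: the two triangles
bad = np.array([0, 1, 0, 1, 0, 1])     # an arbitrary split for comparison
assert modularity(A, good) > modularity(A, bad)
```

The natural split along the bridge scores Q = 5/14 ≈ 0.357, while the arbitrary split scores negative; comparing against the k_i k_j / 2m null model is what keeps a single giant community from trivially winning.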

9,057 citations

Journal ArticleDOI
TL;DR: A thorough exposition of the main elements of the clustering problem can be found in this paper, with a special focus on techniques designed by statistical physicists, from the discussion of crucial issues like the significance of clustering and how methods should be tested and compared against each other, to the description of applications to real networks.

8,432 citations