Journal Article

J. Appl. Cryst.の発刊に際して (On the Launch of J. Appl. Cryst.)

10 Mar 1970 - Vol. 12, Iss. 1, pp. 1-1
About: The article was published on 10 March 1970 and is currently open access. It has received 8159 citations to date.
Citations
Journal Article
TL;DR: This paper could serve as a general literature citation when one or more of the open-source SHELX programs (and the Bruker AXS version SHELXTL) are employed in the course of a crystal-structure determination.
Abstract: An account is given of the development of the SHELX system of computer programs from SHELX-76 to the present day. In addition to identifying useful innovations that have come into general use through their implementation in SHELX, a critical analysis is presented of the less-successful features, missed opportunities and desirable improvements for future releases of the software. An attempt is made to understand how a program originally designed for photographic intensity data, punched cards and computers over 10000 times slower than an average modern personal computer has managed to survive for so long. SHELXL is the most widely used program for small-molecule refinement and SHELXS and SHELXD are often employed for structure solution despite the availability of objectively superior programs. SHELXL also finds a niche for the refinement of macromolecules against high-resolution or twinned data; SHELXPRO acts as an interface for macromolecular applications. SHELXC, SHELXD and SHELXE are proving useful for the experimental phasing of macromolecules, especially because they are fast and robust and so are often employed in pipelines for high-throughput phasing. This paper could serve as a general literature citation when one or more of the open-source SHELX programs (and the Bruker AXS version SHELXTL) are employed in the course of a crystal-structure determination.

81,116 citations


Cites background from "J. Appl. Cryst.の発刊に際して"

  • ...These days such padding is less desirable and there are excellent programs such as enCIFer (Allen et al., 2004) for working with CIF files, so CIFTAB is now effectively redundant....

    [...]

Journal Article
TL;DR: This paper describes the goals of the PDB, the systems in place for data deposition and access, how to obtain further information, and plans for the future development of the resource.
Abstract: The Protein Data Bank (PDB; http://www.rcsb.org/pdb/ ) is the single worldwide archive of structural data of biological macromolecules. This paper describes the goals of the PDB, the systems in place for data deposition and access, how to obtain further information, and near-term plans for the future development of the resource.

34,239 citations


Cites methods from "J. Appl. Cryst.の発刊に際して"

  • ...This dictionary contains among other items descriptions of the solution components, the experimental conditions, enumerated lists of the instruments used, as well as information about structure refinement....

    [...]

Journal Article
TL;DR: New features added to the refinement program SHELXL since 2008 are described and explained.
Abstract: The improvements in the crystal structure refinement program SHELXL have been closely coupled with the development and increasing importance of the CIF (Crystallographic Information Framework) format for validating and archiving crystal structures. An important simplification is that now only one file in CIF format (for convenience, referred to simply as `a CIF') containing embedded reflection data and SHELXL instructions is needed for a complete structure archive; the program SHREDCIF can be used to extract the .hkl and .ins files required for further refinement with SHELXL. Recent developments in SHELXL facilitate refinement against neutron diffraction data, the treatment of H atoms, the determination of absolute structure, the input of partial structure factors and the refinement of twinned and disordered structures. SHELXL is available free to academics for the Windows, Linux and Mac OS X operating systems, and is particularly suitable for multiple-core processors.

28,425 citations


Cites methods from "J. Appl. Cryst.の発刊に際して"

  • ...Multithreading is achieved using OpenMP along the lines suggested by Diederichs (2000), and the program is particularly suitable for multiple-core processors....

    [...]

Journal Article
TL;DR: CCP4mg is a project that aims to provide a general-purpose tool for structural biologists, providing tools for X-ray structure solution, structure comparison and analysis, and publication-quality graphics.
Abstract: CCP4mg is a project that aims to provide a general-purpose tool for structural biologists, providing tools for X-ray structure solution, structure comparison and analysis, and publication-quality graphics. The map-fitting tools are available as a stand-alone package, distributed as `Coot'.

27,505 citations


Cites background or methods from "J. Appl. Cryst.の発刊に際して"

  • ...…e-mail: emsley@ysbl.york.ac.uk © 2004 International Union of Crystallography Printed in Denmark – all rights reserved CCP4mg is a project that aims to provide a general-purpose tool for structural biologists, providing tools for X-ray structure solution, structure comparison and…...

    [...]

  • ...The introduction of FRODO (Jones, 1978) and then O (Jones et al., 1991) to the field of protein crystallography was in each case revolutionary, each in their time breaking new ground in demonstrating what was possible with the current hardware....

    [...]

Journal Article
TL;DR: The PHENIX software system for macromolecular structure determination, along with its uses and benefits, is described.
Abstract: Macromolecular X-ray crystallography is routinely applied to understand biological processes at a molecular level. However, significant time and effort are still required to solve and complete many of these structures because of the need for manual interpretation of complex numerical data using many software packages and the repeated use of interactive three-dimensional graphics. PHENIX has been developed to provide a comprehensive system for macromolecular crystallographic structure solution with an emphasis on the automation of all procedures. This has relied on the development of algorithms that minimize or eliminate subjective input, the development of algorithms that automate procedures that are traditionally performed by hand and, finally, the development of a framework that allows a tight integration between the algorithms.

18,531 citations


Cites methods from "J. Appl. Cryst.の発刊に際して"

  • ...After ensuring that the diffraction data are sound and understood, the next critical necessity for solving a structure is the determination of phases using one of several strategies (Adams, Afonine et al., 2009)....

    [...]

  • ...Tools such as efficient rigid-body refinement (multiple-zones algorithm; Afonine et al., 2009), simulated-annealing refinement (Brünger et al., 1987) in Cartesian or torsion-angle space (Grosse-Kunstleve et al., 2009), automatic NCS detection and its use as restraints in refinement are important at…

    [...]

References
Journal Article
TL;DR: The results of the fifth blind test of crystal structure prediction, which show important success with more challenging large and flexible molecules, are presented and discussed.
Abstract: Following on from the success of the previous crystal structure prediction blind tests (CSP1999, CSP2001, CSP2004 and CSP2007), a fifth such collaborative project (CSP2010) was organized at the Cambridge Crystallographic Data Centre. A range of methodologies was used by the participating groups in order to evaluate the ability of the current computational methods to predict the crystal structures of the six organic molecules chosen as targets for this blind test. The first four targets, two rigid molecules, one semi-flexible molecule and a 1:1 salt, matched the criteria for the targets from CSP2007, while the last two targets belonged to two new challenging categories – a larger, much more flexible molecule and a hydrate with more than one polymorph. Each group submitted three predictions for each target it attempted. There was at least one successful prediction for each target, and two groups were able to successfully predict the structure of the large flexible molecule as their first place submission. The results show that while not as many groups successfully predicted the structures of the three smallest molecules as in CSP2007, there is now evidence that methodologies such as dispersion-corrected density functional theory (DFT-D) are able to reliably do so. The results also highlight the many challenges posed by more complex systems and show that there are still issues to be overcome.

352 citations

Journal Article
TL;DR: Integral breadth methods for line profile analysis are reviewed, including modifications of the Williamson–Hall method recently proposed for the specific case of dislocation strain broadening; case studies are supported by the results of a TEM investigation.
Abstract: Integral breadth methods for line profile analysis are reviewed, including modifications of the Williamson–Hall method recently proposed for the specific case of dislocation strain broadening. Two cases of study, supported by the results of a TEM investigation, are considered in detail: nanocrystalline ceria crystallized from amorphous precursors and highly deformed nickel powder produced by extensive ball milling. A further application concerns a series of Fe–Mo powder specimens that were ball milled for increasing time. Traditional and modified Williamson–Hall methods confirm their merits for a rapid overview of the line broadening effects and possible understanding of the main causes. However, quantitative results are generally not reliable. Limits in the applicability of integral breadth methods and reliability of the results are discussed in detail.

347 citations

Journal Article
TL;DR: In this paper, a fast and non-destructive method for generating three-dimensional maps of the grain boundaries in undeformed polycrystals is presented, which relies on tracking of micro-focused high-energy X-rays.
Abstract: A fast and non-destructive method for generating three-dimensional maps of the grain boundaries in undeformed polycrystals is presented. The method relies on tracking of micro-focused high-energy X-rays. It is verified by comparing an electron microscopy map of the orientations on the 2.5 × 2.5 mm surface of an aluminium polycrystal with tracking data produced at the 3DXRD microscope at the European Synchrotron Radiation Facility. The average difference in grain boundary position between the two techniques is 26 µm, comparable with the spatial resolution of the 3DXRD microscope. As another extension of the tracking concept, algorithms for determining the stress state of the individual grains are derived. As a case study, 3DXRD results are presented for the tensile deformation of a copper specimen. The strain tensor for one embedded grain is determined as a function of load. The accuracy on the strain is Δε ≃ 10⁻⁴.

347 citations

Journal Article
TL;DR: A new software system called PHENIX (Python-based Hierarchical ENvironment for Integrated Xtallography) is being developed for the automation of crystallographic structure solution, which will provide the necessary algorithms to proceed from reduced intensity data to a refined molecular model, and facilitate structure solution for both the novice and expert crystallographer.
Abstract: A new software system called PHENIX (Python-based Hierarchical ENvironment for Integrated Xtallography) is being developed for the automation of crystallographic structure solution. This will provide the necessary algorithms to proceed from reduced intensity data to a refined molecular model, and facilitate structure solution for both the novice and expert crystallographer. Here, the features of PHENIX are reviewed and the recent advances in infrastructure and algorithms are briefly described.

346 citations

Journal Article
TL;DR: The emerging technique of serial X-ray diffraction requires new software tools for the efficient analysis of large data volumes; event selection early in the analysis pipeline is highly advantageous.
Abstract: The emerging technique of serial X-ray diffraction, in which diffraction data are collected from samples flowing across a pulsed X-ray source at repetition rates of 100 Hz or higher, has necessitated the development of new software in order to handle the large data volumes produced. Sorting of data according to different criteria and rapid filtering of events to retain only diffraction patterns of interest results in significant reductions in data volume, thereby simplifying subsequent data analysis and management tasks. Meanwhile the generation of reduced data in the form of virtual powder patterns, radial stacks, histograms and other meta data creates data set summaries for analysis and overall experiment evaluation. Rapid data reduction early in the analysis pipeline is proving to be an essential first step in serial imaging experiments, prompting the authors to make the tool described in this article available to the general community. Originally developed for experiments at X-ray free-electron lasers, the software is based on a modular facility-independent library to promote portability between different experiments and is available under version 3 or later of the GNU General Public License.

340 citations