Journal Article

On the Launch of J. Appl. Cryst. (J. Appl. Cryst.の発刊に際して)

10 Mar 1970 - Vol. 12, Iss. 1, pp. 1-1
About: The article was published on 1970-03-10 and is currently open access. It has received 8,159 citations to date.
Citations
Journal Article
TL;DR: This paper could serve as a general literature citation when one or more of the open-source SHELX programs (and the Bruker AXS version SHELXTL) are employed in the course of a crystal-structure determination.
Abstract: An account is given of the development of the SHELX system of computer programs from SHELX-76 to the present day. In addition to identifying useful innovations that have come into general use through their implementation in SHELX, a critical analysis is presented of the less-successful features, missed opportunities and desirable improvements for future releases of the software. An attempt is made to understand how a program originally designed for photographic intensity data, punched cards and computers over 10000 times slower than an average modern personal computer has managed to survive for so long. SHELXL is the most widely used program for small-molecule refinement and SHELXS and SHELXD are often employed for structure solution despite the availability of objectively superior programs. SHELXL also finds a niche for the refinement of macromolecules against high-resolution or twinned data; SHELXPRO acts as an interface for macromolecular applications. SHELXC, SHELXD and SHELXE are proving useful for the experimental phasing of macromolecules, especially because they are fast and robust and so are often employed in pipelines for high-throughput phasing. This paper could serve as a general literature citation when one or more of the open-source SHELX programs (and the Bruker AXS version SHELXTL) are employed in the course of a crystal-structure determination.

81,116 citations


Cites background from "On the Launch of J. Appl. Cryst."

  • ...These days such padding is less desirable and there are excellent programs such as enCIFer (Allen et al., 2004) for working with CIF files, so CIFTAB is now effectively redundant....


Journal Article
TL;DR: The goals of the PDB, the systems in place for data deposition and access, how to obtain further information and plans for the future development of the resource are described.
Abstract: The Protein Data Bank (PDB; http://www.rcsb.org/pdb/ ) is the single worldwide archive of structural data of biological macromolecules. This paper describes the goals of the PDB, the systems in place for data deposition and access, how to obtain further information, and near-term plans for the future development of the resource.

34,239 citations


Cites methods from "On the Launch of J. Appl. Cryst."

  • ...This dictionary contains among other items descriptions of the solution components, the experimental conditions, enumerated lists of the instruments used, as well as information about structure refinement....


Journal Article
TL;DR: New features added to the refinement program SHELXL since 2008 are described and explained.
Abstract: The improvements in the crystal structure refinement program SHELXL have been closely coupled with the development and increasing importance of the CIF (Crystallographic Information Framework) format for validating and archiving crystal structures. An important simplification is that now only one file in CIF format (for convenience, referred to simply as `a CIF') containing embedded reflection data and SHELXL instructions is needed for a complete structure archive; the program SHREDCIF can be used to extract the .hkl and .ins files required for further refinement with SHELXL. Recent developments in SHELXL facilitate refinement against neutron diffraction data, the treatment of H atoms, the determination of absolute structure, the input of partial structure factors and the refinement of twinned and disordered structures. SHELXL is available free to academics for the Windows, Linux and Mac OS X operating systems, and is particularly suitable for multiple-core processors.
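
The single-CIF archive workflow described above can be scripted. Below is a minimal sketch, assuming the shredcif and shelxl executables are on the PATH and that shredcif accepts the CIF file name as its argument; the exact command-line conventions should be checked against the SHELX documentation, and the base name used here is hypothetical.

```python
import subprocess
from pathlib import Path

structure = "mystruct"  # hypothetical base name; mystruct.cif embeds the instructions and reflections

# Step 1: recover the refinement input from the archived CIF.
# Assumption: shredcif takes the CIF name and writes <name>.ins and <name>.hkl.
subprocess.run(["shredcif", f"{structure}.cif"], check=True)

# Step 2: continue the refinement with SHELXL, which reads <name>.ins and <name>.hkl.
if Path(f"{structure}.ins").exists() and Path(f"{structure}.hkl").exists():
    subprocess.run(["shelxl", structure], check=True)
```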

28,425 citations


Cites methods from "On the Launch of J. Appl. Cryst."

  • ...Multithreading is achieved using OpenMP along the lines suggested by Diederichs (2000), and the program is particularly suitable for multiple-core processors....


Journal Article
TL;DR: CCP4mg is a project that aims to provide a general-purpose tool for structural biologists, providing tools for X-ray structure solution, structure comparison and analysis, and publication-quality graphics.
Abstract: CCP4mg is a project that aims to provide a general-purpose tool for structural biologists, providing tools for X-ray structure solution, structure comparison and analysis, and publication-quality graphics. The map-fitting tools are available as a stand-alone package, distributed as `Coot'.

27,505 citations


Cites background or methods from "On the Launch of J. Appl. Cryst."

  • ...…e-mail: emsley@ysbl.york.ac.uk © 2004 International Union of Crystallography Printed in Denmark – all rights reserved CCP4mg is a project that aims to provide a general-purpose tool for structural biologists, providing tools for X-ray structure solution, structure comparison and…...


  • ...The introduction of FRODO (Jones, 1978) and then O (Jones et al., 1991) to the field of protein crystallography was in each case revolutionary, each in their time breaking new ground in demonstrating what was possible with the current hardware....


Journal Article
TL;DR: PHENIX, a comprehensive software system for automated macromolecular crystallographic structure determination, is described.
Abstract: Macromolecular X-ray crystallography is routinely applied to understand biological processes at a molecular level. However, significant time and effort are still required to solve and complete many of these structures because of the need for manual interpretation of complex numerical data using many software packages and the repeated use of interactive three-dimensional graphics. PHENIX has been developed to provide a comprehensive system for macromolecular crystallographic structure solution with an emphasis on the automation of all procedures. This has relied on the development of algorithms that minimize or eliminate subjective input, the development of algorithms that automate procedures that are traditionally performed by hand and, finally, the development of a framework that allows a tight integration between the algorithms.

18,531 citations


Cites methods from "On the Launch of J. Appl. Cryst."

  • ...After ensuring that the diffraction data are sound and understood, the next critical necessity for solving a structure is the determination of phases using one of several strategies (Adams, Afonine et al., 2009)....


  • ...Tools such as efficient rigid-body refinement (multiple-zones algorithm; Afonine et al., 2009), simulated-annealing refinement (Brünger et al., 1987) in Cartesian or torsion-angle space (Grosse-Kunstleve et al., 2009), automatic NCS detection and its use as restraints in refinement are important at…...


References
Journal Article
TL;DR: A number of emerging applications, based on the current understanding of the structural properties of peptides, are presented in the context of domain fusion of synthetic multifunctional chimeric proteins.
Abstract: Control of structural flexibility is essential for the proper functioning of a large number of proteins and multiprotein complexes. At the residue level, such flexibility occurs due to local relaxation of peptide bond angles whose cumulative effect may result in large changes in the secondary, tertiary or quaternary structures of protein molecules. Such flexibility, and its absence, most often depends on the nature of interdomain linkages formed by oligopeptides. Both flexible and relatively rigid peptide linkers are found in many multidomain proteins. Linkers are thought to control favorable and unfavorable interactions between adjacent domains by means of variable softness furnished by their primary sequence. Large-scale structural heterogeneity of multidomain proteins and their complexes, facilitated by soft peptide linkers, is now seen as the norm rather than the exception. Biophysical discoveries as well as computational algorithms and databases have reshaped our understanding of the often spectacular biomolecular dynamics enabled by soft linkers. Absence of such motion, as in so-called molecular rulers, also has desirable functional effects in protein architecture. We review here the historic discovery and current understanding of the nature of domains and their linkers from a structural, computational, and biophysical point of view. A number of emerging applications, based on the current understanding of the structural properties of peptides, are presented in the context of domain fusion of synthetic multifunctional chimeric proteins.

216 citations

Journal Article
TL;DR: Recent developments in phenix.refine are reported that allow the use of reference-model torsion restraints, secondary-structure hydrogen-bond restraints and Ramachandran restraints for improved macromolecular refinement at low resolution.
Abstract: Traditional methods for macromolecular refinement often have limited success at low resolution (3.0–3.5 Å or worse), producing models that score poorly on crystallographic and geometric validation criteria. To improve low-resolution refinement, knowledge from macromolecular chemistry and homology was used to add three new coordinate-restraint functions to the refinement program phenix.refine. Firstly, a `reference-model' method uses an identical or homologous higher resolution model to add restraints on torsion angles to the geometric target function. Secondly, automatic restraints for common secondary-structure elements in proteins and nucleic acids were implemented that can help to preserve the secondary-structure geometry, which is often distorted at low resolution. Lastly, we have implemented Ramachandran-based restraints on the backbone torsion angles. In this method, a φ,ψ term is added to the geometric target function to minimize a modified Ramachandran landscape that smoothly combines favorable peaks identified from nonredundant high-quality data with unfavorable peaks calculated using a clash-based pseudo-energy function. All three methods show improved MolProbity validation statistics, typically complemented by a lowered Rfree and a decreased gap between Rwork and Rfree.
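
As a rough illustration of the Ramachandran-restraint idea described above, the sketch below adds a φ,ψ pseudo-energy term to a combined refinement target. The 1° landscape grid, the function names and the weight are illustrative assumptions, not phenix.refine's actual data structures or API.

```python
import numpy as np

def rama_penalty(landscape, phi_deg, psi_deg):
    """Look up the pseudo-energy for one residue on a 360x360 (1 degree) grid."""
    i = int(round(phi_deg + 180.0)) % 360
    j = int(round(psi_deg + 180.0)) % 360
    return landscape[i, j]

def refinement_target(xray_term, geometry_term, landscape, torsions, w_rama=1.0):
    """Data term + standard geometry term + weighted phi,psi restraint term."""
    rama_term = sum(rama_penalty(landscape, phi, psi) for phi, psi in torsions)
    return xray_term + geometry_term + w_rama * rama_term

# Example: a flat (all-zero) landscape contributes nothing to the target.
flat = np.zeros((360, 360))
print(refinement_target(xray_term=10.0, geometry_term=2.5, landscape=flat,
                        torsions=[(-60.0, -45.0), (-120.0, 130.0)]))
```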

216 citations

Journal Article
TL;DR: In this paper, the authors used the Pascal programming language for the calculation of diffraction contrast factors of dislocations in elastically anisotropic cubic, hexagonal and trigonal crystals.
Abstract: The computer program ANIZC has been developed using the Pascal programming language for the calculation of diffraction contrast factors of dislocations in elastically anisotropic cubic, hexagonal and trigonal crystals. The contrast factor is obtained numerically by integrating the angular part of the distortion tensor in the slip plane. The distortion tensor is calculated by solving the sextic equation provided by the mechanical equilibrium of a single dislocation in an infinite anisotropic medium. The contrast factors can be used for the interpretation of strain anisotropy as obtained from peak profile measurements made on either single crystals, textured polycrystals or powders.
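
A minimal sketch of the angular-averaging step mentioned above is given below, assuming the physically meaningful integrand (built from the anisotropic distortion tensor via the sextic equation) is supplied by the caller; it is not a reimplementation of ANIZC, and the names are illustrative.

```python
import numpy as np

def angular_average(integrand, n=3600):
    """Numerically average an angular function over the slip plane.

    `integrand` is a stand-in for the squared projection of the dislocation
    distortion field that the contrast factor is built from; here it is just
    a user-supplied callable of the in-plane angle.
    """
    phi = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    return float(np.mean([integrand(p) for p in phi]))  # (1/2*pi) * integral of f(phi) over the circle

# Example with a toy integrand (not a real distortion field):
print(angular_average(lambda p: np.cos(p) ** 2))  # approximately 0.5
```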

216 citations

Journal Article
TL;DR: A systematic approach to the scaling and merging of data from multiple crystals in macromolecular crystallography is introduced and explained.
Abstract: The availability of intense microbeam macromolecular crystallography beamlines at third-generation synchrotron sources has enabled data collection and structure solution from microcrystals of <10 µm in size. The increased likelihood of severe radiation damage where microcrystals or particularly sensitive crystals are used forces crystallographers to acquire large numbers of data sets from many crystals of the same protein structure. The associated analysis and merging of multi-crystal data is currently a manual and time-consuming step. Here, a computer program, BLEND, that has been written to assist with and automate many of the steps in this process is described. It is demonstrated how BLEND has successfully been used in the solution of a novel membrane protein.
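
The merging step at the core of any multi-crystal strategy can be sketched as below; this is a generic illustration of pooling observations that share a unique reflection index, not BLEND's actual algorithm, which also handles crystal selection, scaling and outlier rejection.

```python
from collections import defaultdict

def merge_crystals(datasets):
    """Merge several crystals' observations into mean intensities per reflection.

    `datasets` is a list of dicts mapping a Miller index (h, k, l) to a list of
    measured intensities from one crystal. Illustrative only: real merging also
    applies per-crystal scale factors and rejects outliers.
    """
    pooled = defaultdict(list)
    for dataset in datasets:
        for hkl, intensities in dataset.items():
            pooled[hkl].extend(intensities)
    return {hkl: sum(vals) / len(vals) for hkl, vals in pooled.items()}

# Example with two tiny "crystals":
xtal1 = {(1, 0, 0): [105.0, 98.0], (0, 1, 1): [40.0]}
xtal2 = {(1, 0, 0): [101.0], (1, 1, 1): [250.0]}
print(merge_crystals([xtal1, xtal2]))
```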

215 citations

Journal Article
TL;DR: Paired-refinement tests show that discarding weak data to improve merging R values produces lower-quality models, and that CC1/2 is the data-quality indicator that correctly identifies the best data-handling strategy.
Abstract: In macromolecular X-ray crystallography, typical data sets have substantial multiplicity. This can be used to calculate the consistency of repeated measurements and thereby assess data quality. Recently, the properties of a correlation coefficient, CC1/2, that can be used for this purpose were characterized and it was shown that CC1/2 has superior properties compared with `merging' R values. A derived quantity, CC*, links data and model quality. Using experimental data sets, the behaviour of CC1/2 and the more conventional indicators were compared in two situations of practical importance: merging data sets from different crystals and selectively rejecting weak observations or (merged) unique reflections from a data set. In these situations controlled `paired-refinement' tests show that even though discarding the weaker data leads to improvements in the merging R values, the refined models based on these data are of lower quality. These results show the folly of such data-filtering practices aimed at improving the merging R values. Interestingly, in all of these tests CC1/2 is the one data-quality indicator for which the behaviour accurately reflects which of the alternative data-handling strategies results in the best-quality refined model. Its properties in the presence of systematic error are documented and discussed.
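
For concreteness, here is a minimal sketch of how CC1/2 and the derived CC* can be computed from multiply-measured reflections by random half-dataset splitting; it is illustrative only, not the paper's production code.

```python
import numpy as np

def cc_half(measurements, rng=None):
    """CC1/2 from repeated measurements of unique reflections.

    `measurements` maps a unique reflection (h, k, l) to a list of observed
    intensities. Each list is split at random into two halves, each half is
    averaged, and the Pearson correlation between the two half-dataset
    averages over all unique reflections is returned.
    """
    rng = rng or np.random.default_rng(0)
    a, b = [], []
    for obs in measurements.values():
        obs = np.asarray(obs, dtype=float)
        if obs.size < 2:  # need at least two observations to split
            continue
        perm = rng.permutation(obs.size)
        half = obs.size // 2
        a.append(obs[perm[:half]].mean())
        b.append(obs[perm[half:]].mean())
    return float(np.corrcoef(a, b)[0, 1])

def cc_star(cc12):
    """CC* = sqrt(2*CC1/2 / (1 + CC1/2)), linking half-data CC to the true signal."""
    return float(np.sqrt(2.0 * cc12 / (1.0 + cc12)))

# Example with toy data: three unique reflections measured several times each.
data = {(1, 0, 0): [120.0, 118.0, 123.0, 119.0],
        (0, 1, 0): [45.0, 47.0, 44.0],
        (1, 1, 1): [300.0, 305.0, 298.0, 301.0]}
cc12 = cc_half(data)
print(cc12, cc_star(cc12))
```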

213 citations