Topic

Model building

About: Model building is a research topic. Over its lifetime, 1,314 publications have appeared within this topic, receiving 47,007 citations.


Papers
Journal ArticleDOI
TL;DR: This work discusses the use of Graduating Functions; Design Aspects of Variance, Bias, and Lack of Fit; and the Practical Choice of a Response Surface Design in relation to Second-Order Response Surfaces.

4,363 citations
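The entry above centers on second-order response surfaces and the practical choice of designs for fitting them. As a minimal, hedged sketch of the kind of model involved, not of the authors' own procedure, the Python snippet below fits a two-factor second-order surface by ordinary least squares; the design points and responses are made up for illustration.

import numpy as np

# Hypothetical design points (x1, x2) and observed responses y.
x1 = np.array([-1, -1, 1, 1, 0, 0, -1.41, 1.41, 0, 0])
x2 = np.array([-1, 1, -1, 1, 0, 0, 0, 0, -1.41, 1.41])
y = np.array([54.3, 60.1, 64.8, 68.0, 72.5, 71.9, 57.2, 66.4, 63.0, 65.5])

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, _, _, _ = np.linalg.lstsq(X, y, rcond=None)

print("estimated coefficients (b0, b1, b2, b11, b22, b12):", coef)

The fitted coefficients describe the curvature and interaction of the surface, which is what questions of variance, bias, and lack of fit are assessed against.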

Journal ArticleDOI
TL;DR: This article synthesizes recent capture-recapture models in a common framework, with an emphasis on flexibility in modeling, model selection, and the analysis of multiple data sets.
Abstract: The understanding of the dynamics of animal populations and of related ecological and evolutionary issues frequently depends on a direct analysis of life history parameters. For instance, examination of trade-offs between reproduction and survival usually relies on individually marked animals, for which the exact time of death is most often unknown, because marked individuals cannot be followed closely through time. Thus, the quantitative analysis of survival studies and experiments must be based on capture-recapture (or resighting) models which consider, besides the parameters of primary interest, recapture or resighting rates that are nuisance parameters. Capture-recapture models oriented to estimation of survival rates are the result of a recent change in emphasis from earlier approaches in which population size was the most important parameter, survival rates having been first introduced as nuisance parameters. This emphasis on survival rates in capture-recapture models developed rapidly in the 1980s and used as a basic structure the Cormack-Jolly-Seber survival model applied to a homogeneous group of animals, with various kinds of constraints on the model parameters. These approaches are conditional on first captures; hence they do not attempt to model the initial capture of unmarked animals as functions of population abundance in addition to survival and capture probabilities. This paper synthesizes, using a common framework, these recent developments together with new ones, with an emphasis on flexibility in modeling, model selection, and the analysis of multiple data sets. The effects on survival and capture rates of time, age, and categorical variables characterizing the individuals (e.g., sex) can be considered, as well as interactions between such effects. This "analysis of variance" philosophy emphasizes the structure of the survival and capture process rather than the technical characteristics of any particular model. The flexible array of models encompassed in this synthesis uses a common notation. As a result of the great level of flexibility and relevance achieved, the focus is changed from fitting a particular model to model building and model selection. The following procedure is recommended: (1) start from a global model compatible with the biology of the species studied and with the design of the study, and assess its fit; (2) select a more parsimonious model using Akaike's Information Criterion to limit the number of formal tests; (3) test for the most important biological questions by comparing this model with neighboring ones using likelihood ratio tests; and (4) obtain maximum likelihood estimates of model parameters with estimates of precision. Computer software is critical, as few of the models now available have parameter estimators that are in closed form. A comprehensive table of existing computer software is provided. We used RELEASE for data summary and goodness-of-fit tests and SURGE for iterative model fitting and the computation of likelihood ratio tests. Five increasingly complex examples are given to illustrate the theory. The first, using two data sets on the European Dipper (Cinclus cinclus), tests for sex-specific parameters.

4,038 citations
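The procedure recommended in the abstract above moves from a global model to a more parsimonious one via Akaike's Information Criterion, then probes specific biological questions with likelihood ratio tests. As a hedged, greatly simplified sketch of that bookkeeping, assuming a toy binomial survival setup rather than a full Cormack-Jolly-Seber model (the counts and sex grouping below are hypothetical):

import numpy as np
from scipy.stats import binom, chi2

# Hypothetical data: animals released and found alive one year later, by sex.
released = np.array([120, 110])   # males, females
survived = np.array([66, 71])

def log_lik(phi_by_group):
    # Binomial log-likelihood for given per-group survival rates.
    return np.sum(binom.logpmf(survived, released, phi_by_group))

# General model: separate survival rate per sex (2 parameters, closed-form MLEs).
phi_general = survived / released
ll_general = log_lik(phi_general)

# Reduced model: one common survival rate (1 parameter).
phi_common = survived.sum() / released.sum()
ll_reduced = log_lik(np.repeat(phi_common, 2))

# AIC = -2 log L + 2K; the lower value indicates the more parsimonious fit.
aic_general = -2 * ll_general + 2 * 2
aic_reduced = -2 * ll_reduced + 2 * 1

# Likelihood ratio test of "survival differs by sex" (1 degree of freedom).
lr_stat = 2 * (ll_general - ll_reduced)
p_value = chi2.sf(lr_stat, df=1)

print(f"AIC sex-specific: {aic_general:.2f}, AIC common: {aic_reduced:.2f}")
print(f"LRT statistic: {lr_stat:.2f}, p = {p_value:.3f}")

Software such as RELEASE and SURGE (named in the abstract) handles this kind of fitting for real capture-recapture data; the sketch only illustrates the AIC comparison and likelihood-ratio step.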

Journal ArticleDOI
TL;DR: Automatic pattern recognition (model building) combined with refinement allows a structural model to be obtained reliably within a few CPU hours; the method is demonstrated with examples of a few recently solved structures.
Abstract: In protein crystallography, much time and effort are often required to trace an initial model from an interpretable electron density map and to refine it until it best agrees with the crystallographic data. Here, we present a method to build and refine a protein model automatically and without user intervention, starting from diffraction data extending to resolution higher than 2.3 Å and reasonable estimates of crystallographic phases. The method is based on an iterative procedure that describes the electron density map as a set of unconnected atoms and then searches for protein-like patterns. Automatic pattern recognition (model building) combined with refinement allows a structural model to be obtained reliably within a few CPU hours. We demonstrate the power of the method with examples of a few recently solved structures.

2,463 citations
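The abstract above describes an iterative procedure that first represents the electron density map as a set of unconnected ("free") atoms and then searches for protein-like patterns. As a toy, hedged sketch of that first step only, assuming the map has already been loaded into a 3-D NumPy array (the random grid and the place_free_atoms helper below are illustrative inventions, not the published algorithm):

import numpy as np
from scipy.ndimage import maximum_filter

rng = np.random.default_rng(0)
density = rng.normal(size=(32, 32, 32))   # stand-in for a real density map

def place_free_atoms(rho, sigma_cut=2.5, window=3):
    # Return grid indices of local maxima lying more than sigma_cut
    # standard deviations above the mean density.
    threshold = rho.mean() + sigma_cut * rho.std()
    local_max = (rho == maximum_filter(rho, size=window))
    return np.argwhere(local_max & (rho > threshold))

atoms = place_free_atoms(density)
print(f"placed {len(atoms)} candidate free atoms")

In a real run, such positions would seed the pattern-recognition and refinement cycles described in the paper.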


Network Information
Related Topics (5)
Software: 130.5K papers, 2M citations, 73% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 72% related
Cluster analysis: 146.5K papers, 2.9M citations, 71% related
Artificial neural network: 207K papers, 4.5M citations, 68% related
Inference: 36.8K papers, 1.3M citations, 68% related
Performance Metrics
No. of papers in the topic in previous years

Year    Papers
2023    14
2022    24
2021    35
2020    43
2019    57
2018    59