Topic
Multivariate adaptive regression splines
About: Multivariate adaptive regression splines (MARS) is a research topic. Over the lifetime of the topic, 1,982 publications have been published, receiving 97,308 citations.
Papers published on a yearly basis
Papers
TL;DR: In this article, a new method is presented for flexible regression modeling of high dimensional data, which takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data.
Abstract: A new method is presented for flexible regression modeling of high dimensional data. The model takes the form of an expansion in product spline basis functions, where the number of basis functions as well as the parameters associated with each one (product degree and knot locations) are automatically determined by the data. This procedure is motivated by the recursive partitioning approach to regression and shares its attractive properties. Unlike recursive partitioning, however, this method produces continuous models with continuous derivatives. It has more power and flexibility to model relationships that are nearly additive or involve interactions in at most a few variables. In addition, the model can be represented in a form that separately identifies the additive contributions and those associated with the different multivariable interactions.
6,651 citations
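Friedman's full procedure searches knot locations, product degrees, and variables automatically; as an illustrative sketch only, the core idea of an expansion in truncated-linear (hinge) basis functions can be shown by fixing a single knot by hand and fitting the coefficients with least squares. The knot at 0.5 and the toy data below are assumptions for illustration, not taken from the paper:

```python
import numpy as np

def hinge(x, knot):
    """Truncated-linear (hinge) basis function max(0, x - knot)."""
    return np.maximum(0.0, x - knot)

# Toy data: a piecewise-linear response with a kink at x = 0.5.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = 2.0 * x + 3.0 * np.maximum(0.0, x - 0.5)

# MARS-style basis expansion with one hand-picked knot (the real
# algorithm chooses the number of basis functions and knot locations
# from the data via forward selection and backward pruning).
B = np.column_stack([
    np.ones_like(x),     # intercept
    x,                   # linear term
    hinge(x, 0.5),       # hinge term activating past the knot
])
coef, *_ = np.linalg.lstsq(B, y, rcond=None)
```

Because the hinge basis is continuous with a continuous fit on each side of the knot, the resulting model is piecewise linear but continuous, unlike the step functions produced by recursive partitioning.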
03 Dec 1996
TL;DR: This work compares support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and with ridge regression done in feature space, and expects SVR to have advantages in high-dimensional settings because SVR optimization does not depend on the dimensionality of the input space.
Abstract: A new regression technique based on Vapnik's concept of support vectors is introduced. We compare support vector regression (SVR) with a committee regression technique (bagging) based on regression trees and ridge regression done in feature space. On the basis of these experiments, it is expected that SVR will have advantages in high dimensionality space because SVR optimization does not depend on the dimensionality of the input space.
4,009 citations
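As a minimal sketch of the epsilon-insensitive, kernel-based formulation described above (the toy sine data and the hyperparameter values are illustrative assumptions, not the paper's experimental setup), scikit-learn's `SVR` can be used directly:

```python
import numpy as np
from sklearn.svm import SVR

# Toy 1-D regression problem: a noisy sine curve.
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200)).reshape(-1, 1)
y = np.sin(X).ravel() + 0.1 * rng.normal(size=200)

# RBF-kernel SVR: C trades off model flatness against deviations
# larger than the epsilon-insensitive tube width.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1)
model.fit(X, y)
pred = model.predict(X)

# Error against the noise-free target.
rmse = np.sqrt(np.mean((pred - np.sin(X).ravel()) ** 2))
```

Only the training points lying on or outside the epsilon tube become support vectors, which is why the solution depends on a subset of the data rather than on the input dimensionality.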
01 Jul 1982
TL;DR: In this book, the authors present the foundations of multiple regression analysis and its extensions, from variance partitioning and categorical predictors through structural equation models and multivariate analysis.
Abstract: Part I: Foundations of Multiple Regression Analysis. Overview. Simple Linear Regression and Correlation. Regression Diagnostics. Computers and Computer Programs. Elements of Multiple Regression Analysis: Two Independent Variables. General Method of Multiple Regression Analysis: Matrix Operations. Statistical Control: Partial and Semi-Partial Correlation. Prediction. Part II: Multiple Regression Analysis. Variance Partitioning. Analysis of Effects. A Categorical Independent Variable: Dummy, Effect, And Orthogonal Coding. Multiple Categorical Independent Variables and Factorial Designs. Curvilinear Regression Analysis. Continuous and Categorical Independent Variables I: Attribute-Treatment Interaction, Comparing Regression Equations. Continuous and Categorical Independent Variables II: Analysis of Covariance. Elements of Multilevel Analysis. Categorical Dependent Variable: Logistic Regression. Part III: Structural Equation Models. Structural Equation Models with Observed Variables: Path Analysis. Structural Equation Models with Latent Variables. Part IV: Multivariate Analysis. Regression, Discriminant, And Multivariate Analysis of Variance: Two Groups. Canonical, Discriminant, And Multivariate Analysis of Variance: Extensions. Appendices.
3,931 citations
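The "two independent variables" and "matrix operations" material outlined above reduces to the normal-equations solution b = (X'X)^(-1) X'y. A minimal sketch with synthetic, noise-free data (the coefficients below are illustrative assumptions, not from the text):

```python
import numpy as np

# Synthetic data generated from y = 1 + 2*x1 - 3*x2 (noise-free,
# so the estimates recover the coefficients exactly).
rng = np.random.default_rng(42)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 1.0 + 2.0 * x1 - 3.0 * x2

# Design matrix with an intercept column, then solve the normal
# equations (X'X) b = X'y rather than inverting X'X explicitly.
X = np.column_stack([np.ones(100), x1, x2])
b = np.linalg.solve(X.T @ X, X.T @ y)
```

Solving the linear system is preferable to forming the inverse, a point that matters once the number of predictors grows.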
TL;DR: A review of Introduction to Linear Regression Analysis, appearing in Technometrics, Vol. 44, No. 2.
Abstract: (2002). Introduction to Linear Regression Analysis. Technometrics: Vol. 44, No. 2, pp. 191-192.
2,818 citations
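The simple linear regression covered in textbooks like the one reviewed above has a closed form: slope = Sxy/Sxx and intercept = ȳ − slope·x̄. A small worked sketch (the data points are made up for illustration):

```python
import numpy as np

# Five points lying roughly on y = 2x.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.1, 8.0, 9.9])

# Closed-form least-squares estimates.
sxy = np.sum((x - x.mean()) * (y - y.mean()))
sxx = np.sum((x - x.mean()) ** 2)
slope = sxy / sxx                     # Sxy / Sxx
intercept = y.mean() - slope * x.mean()
```

Here Sxy = 19.7 and Sxx = 10, giving a slope of 1.97 and an intercept of 0.09.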
01 Jan 1977
TL;DR: In this book, the authors provide up-to-date accounts of the computational methods and algorithms used in regression analysis, reflecting the further development of efficient and accurate regression computer programs, without getting entrenched in minor computing details.
Abstract: Regression analysis is an often-used tool in the statistician's toolbox. This new edition takes into serious consideration the further development of regression computer programs that are efficient, accurate, and considered an important part of statistical research. The book provides up-to-date accounts of computational methods and algorithms currently in use without getting entrenched in minor computing details.
2,811 citations
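One computational method in the spirit of the emphasis above on efficient, accurate regression programs is QR-based least squares, which avoids forming X'X (whose condition number is the square of X's). This sketch is an illustration of the general technique, not an algorithm taken from the text:

```python
import numpy as np

# Synthetic noise-free data with known coefficients.
rng = np.random.default_rng(1)
X = np.column_stack([np.ones(50), rng.normal(size=(50, 2))])
beta_true = np.array([0.5, -1.0, 2.0])
y = X @ beta_true

# Factor X = QR with Q orthonormal and R upper triangular, then
# solve the triangular system R b = Q'y.
Q, R = np.linalg.qr(X)
beta = np.linalg.solve(R, Q.T @ y)
```

Because Q has orthonormal columns, the least-squares problem reduces to a well-conditioned triangular solve, which is why QR (or SVD) underlies most production regression routines.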