Processor design space exploration and performance prediction

About
The article was published on 2009-01-01 and is currently open access. It has received 1 citation to date. The article focuses on the topics Design space exploration and Set (abstract data type).


Citations
Proceedings Article

Architecture-Level Design Space Exploration of Super Scalar Microarchitecture for Network Applications

TL;DR: An exhaustive simulation study is presented that explores the performance of instruction-level parallel superscalar processors executing packet-processing applications, based on the MIPS instruction set architecture.
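As a rough illustration of this kind of architecture-level exploration, the sketch below exhaustively enumerates a small hypothetical design space (issue width, L1 data-cache size, reorder-buffer size) and ranks the configurations with a placeholder analytic performance model; the parameter names and the model are illustrative assumptions, not the MIPS-based simulation setup used in the paper.

```python
from itertools import product

# Hypothetical design space for a superscalar core (illustrative values only).
DESIGN_SPACE = {
    "issue_width":  [1, 2, 4, 8],
    "l1_dcache_kb": [16, 32, 64],
    "rob_entries":  [32, 64, 128],
}

def predicted_ipc(issue_width, l1_dcache_kb, rob_entries):
    """Placeholder analytic model standing in for a cycle-accurate simulation."""
    base = min(issue_width, 4) * 0.6            # assume ILP saturates past 4-wide issue
    cache_bonus = 0.1 * (l1_dcache_kb // 16)    # fewer misses with a larger L1
    window_bonus = 0.05 * (rob_entries // 32)   # a bigger window exposes more ILP
    return base + cache_bonus + window_bonus

def exhaustive_exploration():
    """Evaluate every configuration in the space and sort by predicted IPC."""
    names = list(DESIGN_SPACE)
    results = []
    for values in product(*DESIGN_SPACE.values()):
        config = dict(zip(names, values))
        results.append((predicted_ipc(**config), config))
    return sorted(results, key=lambda item: item[0], reverse=True)

if __name__ == "__main__":
    for ipc, config in exhaustive_exploration()[:5]:
        print(f"IPC={ipc:.2f}  {config}")
```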
References
Journal Article

Greedy function approximation: A gradient boosting machine.

TL;DR: A general gradient descent boosting paradigm is developed for additive expansions based on any fitting criterion, and specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification.
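A minimal sketch of the least-squares case of this boosting paradigm, assuming shallow regression trees from scikit-learn as the base learner; the learning rate, tree depth, and number of rounds are arbitrary illustrative choices rather than settings from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_gradient_boosting(X, y, n_rounds=100, learning_rate=0.1, max_depth=2):
    """Least-squares gradient boosting: each tree is fit to the current residuals,
    which are the negative gradient of the squared-error loss."""
    f0 = float(np.mean(y))                  # constant initial model
    prediction = np.full(len(y), f0)
    trees = []
    for _ in range(n_rounds):
        residuals = y - prediction          # negative gradient for squared error
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)
        prediction += learning_rate * tree.predict(X)
        trees.append(tree)
    return f0, trees

def predict_gradient_boosting(X, f0, trees, learning_rate=0.1):
    """Sum the constant model and the shrunken contributions of all trees."""
    pred = np.full(X.shape[0], f0)
    for tree in trees:
        pred += learning_rate * tree.predict(X)
    return pred
```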
Book

The jackknife, the bootstrap, and other resampling plans

Bradley Efron
TL;DR: Covers the delta method and the influence function; cross-validation, the jackknife, and the bootstrap; balanced repeated replication (half-sampling); random subsampling; and nonparametric confidence intervals.
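A minimal sketch of the nonparametric bootstrap covered in the book, assuming the percentile method for the confidence interval; the statistic (the sample mean) and the number of resamples are illustrative choices.

```python
import numpy as np

def bootstrap_percentile_ci(data, statistic=np.mean, n_resamples=2000,
                            alpha=0.05, rng=None):
    """Percentile bootstrap: resample with replacement, recompute the statistic,
    and take empirical quantiles of the resampled estimates."""
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    estimates = np.array([
        statistic(rng.choice(data, size=data.size, replace=True))
        for _ in range(n_resamples)
    ])
    lower = np.quantile(estimates, alpha / 2)
    upper = np.quantile(estimates, 1 - alpha / 2)
    return lower, upper
```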
Journal Article

Stochastic gradient boosting

TL;DR: It is shown that both the approximation accuracy and execution speed of gradient boosting can be substantially improved by incorporating randomization into the procedure.
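A minimal sketch of the randomization idea, reusing the least-squares setup sketched under the gradient boosting reference above but fitting each tree on a random subsample of the training rows; the subsample fraction and other settings are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def fit_stochastic_gradient_boosting(X, y, n_rounds=100, learning_rate=0.1,
                                     subsample=0.5, max_depth=2, rng=None):
    """Like plain least-squares boosting, except each tree sees only a random
    subsample of the rows, adding randomization and speeding up each round."""
    rng = np.random.default_rng(rng)
    f0 = float(np.mean(y))
    prediction = np.full(len(y), f0)
    trees = []
    n_sample = max(1, int(subsample * len(y)))
    for _ in range(n_rounds):
        residuals = y - prediction                      # negative gradient
        idx = rng.choice(len(y), size=n_sample, replace=False)
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X[idx], residuals[idx])                # fit on the subsample only
        prediction += learning_rate * tree.predict(X)   # update on all rows
        trees.append(tree)
    return f0, trees
```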