Proceedings ArticleDOI

Runtime prediction on new architectures

Alexey Sidnev
- pp 17
TLDR
The paper formulates runtime prediction as a function of algorithm parameters and computational system characteristics, proposes a two-step solution combining linear and non-linear machine learning algorithms, and describes a method for processing experimental data collected from computational systems.
Abstract
This paper formulates the problem of predicting program runtime from algorithm parameters and the characteristics of the computational system on which the algorithm will run. It proposes building a model that represents runtime as a function of algorithm parameters and system characteristics, and then determining the features used to recover this functional dependence. A two-step solution method combining linear and non-linear machine learning algorithms is proposed. The paper examines characteristics specific to software algorithms and suggests a method for processing experimental data collected from computational systems. It also presents a comparative analysis of runtime prediction results for several linear algebra problems on 84 personal computers and servers using a range of machine learning algorithms. A random forest combined with the linear least squares method achieves an error below 15% for most computational systems of similar architecture.
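The two-step scheme described above can be sketched on synthetic data: a linear least-squares fit on hand-crafted features captures the dominant cost, and a random forest models the non-linear residual. The feature choice (an n³/clock term reflecting the cubic cost of dense linear algebra kernels), the synthetic machine characteristic, and the scikit-learn forest are illustrative assumptions, not the paper's exact pipeline.

```python
import numpy as np
from numpy.linalg import lstsq
from sklearn.ensemble import RandomForestRegressor

# Synthetic example: predict runtime from problem size n and one
# machine characteristic (clock speed). All data here is made up.
rng = np.random.default_rng(0)
n = rng.integers(100, 2000, size=300)            # problem size
clock = rng.uniform(1.0, 4.0, size=300)          # GHz (illustrative)
runtime = (n ** 3) / (clock * 1e8) * (1 + rng.normal(0, 0.05, 300))

# Step 1: linear least squares on hand-crafted features; the n^3/clock
# term models the cubic cost of a dense linear algebra kernel.
X_lin = np.column_stack([n.astype(float) ** 3 / clock, np.ones_like(clock)])
coef, *_ = lstsq(X_lin, runtime, rcond=None)
residual = runtime - X_lin @ coef

# Step 2: a random forest learns whatever the linear fit missed.
X_rf = np.column_stack([n, clock])
forest = RandomForestRegressor(n_estimators=100, random_state=0)
forest.fit(X_rf, residual)

pred = X_lin @ coef + forest.predict(X_rf)
rel_err = np.abs(pred - runtime) / runtime
print(f"median relative error: {np.median(rel_err):.3f}")
```

In a real setting the linear step would be fitted per algorithm (using its known asymptotic complexity as the feature) and the forest would generalize across system characteristics; here both steps are evaluated in-sample only.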


Citations

Dynamic Algorithm Portfolios.

TL;DR: This work reformulates algorithm selection as a time allocation problem: all candidate algorithms are run in parallel, and their relative priorities are continually updated based on runtime information, with the aim of minimizing the time to reach a desired performance level.
Book ChapterDOI

Hardware-Specific Selection of the Most Fast-Running Software Components

Alexey Sidnev
TL;DR: The paper presents a comparative analysis of runtime prediction results for solving several linear algebra problems on 84 personal computers and servers and shows an error of less than 22% for computational systems represented in the training data set.
References
Journal ArticleDOI

Random Forests

TL;DR: Internal estimates monitor error, strength, and correlation; these are used to show the response to increasing the number of features used in the forest, and the method is also applicable to regression.
Book

The Elements of Statistical Learning: Data Mining, Inference, and Prediction

TL;DR: In this paper, the authors describe the important ideas in these areas in a common conceptual framework, and the emphasis is on concepts rather than mathematics, with a liberal use of color graphics.
Journal ArticleDOI

A fast learning algorithm for deep belief nets

TL;DR: A fast, greedy algorithm is derived that can learn deep, directed belief networks one layer at a time, provided the top two layers form an undirected associative memory.
Book ChapterDOI

Introduction to Algorithms

Xin-She Yang
TL;DR: This chapter provides an overview of the fundamentals of algorithms and their links to self-organization, exploration, and exploitation.
Journal Article

Radial Basis Functions, Multi-Variable Functional Interpolation and Adaptive Networks

David S. Broomhead, +1 more
- 28 Mar 1988
TL;DR: The relationship between 'learning' in adaptive layered networks and the fitting of data with high dimensional surfaces is discussed, leading naturally to a picture of 'generalization in terms of interpolation between known data points and suggests a rational approach to the theory of such networks.
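The interpolation view summarized above can be illustrated with a minimal Gaussian RBF interpolant in pure NumPy: one basis function is centred at every data point and the weights are found by solving a linear system. The kernel width and the test function are arbitrary choices for this sketch, not taken from the paper.

```python
import numpy as np

def rbf_interpolate(x_train, y_train, x_query, width=0.5):
    """Gaussian radial basis function interpolation of scattered 1-D data."""
    phi = lambda r: np.exp(-(r / width) ** 2)
    # Solve Phi @ w = y for the weights, where Phi[i, j] = phi(|x_i - x_j|).
    Phi = phi(np.abs(x_train[:, None] - x_train[None, :]))
    w = np.linalg.solve(Phi, y_train)
    # Evaluate the interpolant at the query points.
    return phi(np.abs(x_query[:, None] - x_train[None, :])) @ w

x = np.linspace(0, np.pi, 8)
y = np.sin(x)
x_new = np.array([0.5, 1.5, 2.5])
print(rbf_interpolate(x, y, x_new))   # close to sin at the query points
```

By construction the interpolant passes exactly through the training points; "generalization" in this picture is interpolation between them, which is the paper's central observation about adaptive layered networks.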