Proceedings ArticleDOI

On the (dis)similarity of transactional memory workloads

TLDR
The proposed transactional memory workload characterization techniques will help TM architects select a small, diverse set of TM workloads for their design evaluation; the methods presented in the paper can also be used to identify specific feature subsets.
Abstract
Programming to exploit the resources in a multicore system remains a major obstacle for both computer and software engineers. Transactional memory offers an attractive alternative to traditional concurrent programming, but implementations emerged before the programming model, leaving a gap in the design process. In previous research, transactional microbenchmarks have been used to evaluate designs, or lock-based multithreaded workloads have been manually converted into their transactional equivalents; others have even created dedicated transactional benchmarks. Yet, throughout all of these investigations, transactional memory researchers have not settled on a way to describe the runtime characteristics that these programs exhibit, nor has there been any attempt to unify the way transactional memory implementations are evaluated. In addition, the similarity (or redundancy) of these workloads is largely unknown. Evaluating transactional memory designs using workloads that exhibit similar characteristics will unnecessarily increase the number of simulations without contributing new insight. On the other hand, arbitrarily choosing a subset of transactional memory workloads for evaluation can miss important features and lead to biased or incorrect conclusions. In this work, we propose a set of architecture-independent, transaction-oriented workload characteristics that can accurately capture the behavior of transactional code. We apply principal component analysis and clustering algorithms to analyze the proposed workload characteristics collected from a set of SPLASH-2, STAMP, and PARSEC transactional memory programs. Our results show that using transactional characteristics to cluster the chosen benchmarks can reduce the number of required simulations by almost half. We also show that the methods presented in this paper can be used to identify specific feature subsets. With the increasing number of TM workloads in the future, we believe that the proposed transactional memory workload characterization techniques will help TM architects select a small, diverse set of TM workloads for their design evaluation.
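
The methodology described in the abstract (normalize the per-benchmark characteristics, project them with PCA, then cluster and simulate one representative per cluster) can be sketched as follows. This is a minimal illustration, not the authors' code: the benchmark names, the placeholder 10-column characteristic matrix, and the cluster count are assumptions for demonstration only.

```python
# Minimal sketch of PCA + clustering over TM workload characteristics.
# Each row of `features` is one benchmark; each column is one measured,
# architecture-independent characteristic (e.g. transaction length,
# read/write-set size, conflict rate). Values here are placeholders.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

benchmarks = ["barnes", "raytrace", "vacation", "kmeans", "genome", "canneal"]
features = np.random.rand(len(benchmarks), 10)   # placeholder characteristic matrix

# Normalize each characteristic, then keep the principal components that
# explain ~90% of the variance.
scaled = StandardScaler().fit_transform(features)
reduced = PCA(n_components=0.9, svd_solver="full").fit_transform(scaled)

# Group similar benchmarks; one representative per cluster is then simulated,
# which is how redundant simulations are avoided.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(reduced)
for name, cluster in zip(benchmarks, labels):
    print(f"{name}: cluster {cluster}")
```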


Citations
Proceedings ArticleDOI

Exploring GPGPU workloads: Characterization methodology, analysis and microarchitecture evaluation implications

TL;DR: A set of microarchitecture-agnostic GPGPU workload characteristics is proposed to represent workloads in a microarchitecture-independent space, along with a set of evaluation metrics to accurately evaluate the GPGPU design space.
Proceedings ArticleDOI

Eigenbench: A simple exploration tool for orthogonal TM characteristics

TL;DR: EigenBench, a lightweight yet powerful microbenchmark for fully evaluating a transactional memory system, is presented, and it is shown to be useful for thoroughly exploring the orthogonal space of TM application characteristics.
Proceedings ArticleDOI

RMS-TM: a comprehensive benchmark suite for transactional memory systems

TL;DR: RMS-TM is introduced, a transactional memory benchmark suite composed of seven real-world applications from the Recognition, Mining and Synthesis domain that provide a mix of short and long transactions with small/large read and write sets and low/medium/high contention rates.
Proceedings ArticleDOI

Optimizing throughput/power trade-offs in hardware transactional memory using DVFS and intelligent scheduling

TL;DR: New techniques to merge the power and transactional memory domains are proposed and are shown to reduce the energy delay squared product (ED2P) of the STAMP and SPLASH-2 benchmarks by an average of 18% when combined.
Proceedings ArticleDOI

SUV: A Novel Single-Update Version-Management Scheme for Hardware Transactional Memory Systems

TL;DR: This work proposes a novel Single-Update Version-management (SUV) scheme to redirect each transactional store operation to another memory address, track the mapping information between the original and redirected addresses, and switch to the proper version of data upon the transaction's commit or abort.
References
Book

Principal Component Analysis

TL;DR: In this book, the authors present graphical representations of data using principal component analysis (PCA), extensions of PCA to time series and other non-independent data, and generalizations and adaptations of the technique.
Reference EntryDOI

Principal Component Analysis

TL;DR: Principal component analysis (PCA) as discussed by the authors replaces the p original variables by a smaller number, q, of derived variables, the principal components, which are linear combinations of the original variables.
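
As a worked illustration of that definition (standard PCA formulation, not quoted from the entry itself): each principal component is a linear combination of the original variables, with loadings chosen as eigenvectors of the sample covariance matrix.

```latex
% Illustrative standard PCA formulation: z_k is the k-th principal component,
% x_1,...,x_p the original variables, a_k the loading vector, S the sample
% covariance matrix, and lambda_k its eigenvalues in decreasing order.
\[
  z_k = \mathbf{a}_k^{\top}\mathbf{x} = \sum_{j=1}^{p} a_{kj}\, x_j,
  \qquad
  S\,\mathbf{a}_k = \lambda_k \mathbf{a}_k,
  \quad \lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_p .
\]
% Keeping only the first q < p components retains the fraction
% \sum_{k=1}^{q}\lambda_k / \sum_{k=1}^{p}\lambda_k of the total variance,
% which is the sense in which q derived variables replace the p originals.
```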
Proceedings ArticleDOI

The SPLASH-2 programs: characterization and methodological considerations

TL;DR: This paper quantitatively characterizes the SPLASH-2 programs in terms of fundamental properties and architectural interactions that are important to understanding them well, including computational load balance, communication-to-computation ratio and traffic needs, important working set sizes, and issues related to spatial locality.
Proceedings ArticleDOI

The PARSEC benchmark suite: characterization and architectural implications

TL;DR: This paper presents and characterizes the Princeton Application Repository for Shared-Memory Computers (PARSEC), a benchmark suite for studies of Chip-Multiprocessors (CMPs), and shows that the benchmark suite covers a wide spectrum of working sets, locality, data sharing, synchronization and off-chip traffic.
BookDOI

Principal components analysis

TL;DR: In this book, the concept of principal components is introduced and a number of techniques related to principal component analysis are presented, such as using principal components to select a subset of variables for regression analysis, detecting outliers, and detecting influential observations.