
Foundational and Practical Aspects of Resource Analysis 

About: Foundational and Practical Aspects of Resource Analysis is an academic conference. It publishes mainly in the areas of abstract interpretation and heap data structures. Over its lifetime, the conference has published 41 papers, which have received 283 citations.

Papers
Book Chapter
11 Apr 2015
TL;DR: A tool for experimentation with static analysis infers energy functions at two levels, the instruction set architecture and the intermediate code (LLVM IR) levels, and reflects them upwards to the higher source code level, providing insights into the trade-off of precision versus analyzability at these levels.
Abstract: The static estimation of the energy consumed by program executions is an important challenge, which has applications in program optimization and verification, and is instrumental in energy-aware software development. Our objective is to estimate such energy consumption in the form of functions on the input data sizes of programs. We have developed a tool for experimentation with static analysis which infers such energy functions at two levels, the instruction set architecture (ISA) and the intermediate code (LLVM IR) levels, and reflects them upwards to the higher source code level. This required the development of a translation from LLVM IR to an intermediate representation and its integration with existing components, a translation from ISA to the same representation, a resource analyzer, an ISA-level energy model, and a mapping from this model to LLVM IR. The approach has been applied to programs written in the XC language running on XCore architectures, but is general enough to be applied to other languages. Experimental results show that our LLVM IR level analysis is reasonably accurate (less than 6.4% average error vs. hardware measurements) and more powerful than analysis at the ISA level. This paper provides insights into the trade-off of precision versus analyzability at these levels.
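
To make the idea of an energy function on input data sizes concrete, here is a minimal Python sketch (not the paper's tool, and not the XC/XCore setting): a statically inferred instruction-count function is composed with a per-instruction energy model to yield E(n). All opcode names and energy values below are hypothetical.

# Hypothetical per-instruction energy model (nanojoules per execution).
ENERGY_MODEL_NJ = {"load": 1.2, "mul": 0.9, "add": 0.4, "branch": 0.6}

# Statically inferred execution counts for a simple loop body,
# expressed as functions of the input size n (made-up numbers).
def instruction_counts(n: int) -> dict:
    return {"load": 2 * n, "mul": n, "add": n, "branch": n + 1}

def energy_bound_nj(n: int) -> float:
    """Energy estimate E(n) in nanojoules as a function of input size n."""
    counts = instruction_counts(n)
    return sum(ENERGY_MODEL_NJ[op] * c for op, c in counts.items())

print(energy_bound_nj(1000))  # ~4300.6 nJ with these made-up numbers

The point of the sketch is the composition: the analysis produces the symbolic counts, the energy model supplies per-instruction costs, and the product is an energy function of n that can be reflected to the source level.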

37 citations

Book Chapter
06 Nov 2009
TL;DR: It is shown that in the context of orthogonal term rewriting systems, derivational complexity is an invariant cost model, both in innermost and in outermost reduction.
Abstract: We show that in the context of orthogonal term rewriting systems, derivational complexity is an invariant cost model, both in innermost and in outermost reduction. This has some interesting consequences for (asymptotic) complexity analysis, since many existing methodologies only guarantee bounded derivational complexity.
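
As a concrete illustration of derivational complexity (the number of rewrite steps to reach a normal form), here is a minimal Python sketch for one orthogonal TRS, Peano addition, evaluated with an innermost strategy; the term representation and step counting are illustrative, not the paper's formalism, whose result is that such counts form an invariant cost model under both innermost and outermost reduction.

# Rules of the TRS:
#   add(0, y)    -> y
#   add(s(x), y) -> s(add(x, y))
# Terms are nested tuples, e.g. ("add", ("s", ("0",)), ("0",)).

def num(n):
    """Build the Peano numeral s^n(0)."""
    t = ("0",)
    for _ in range(n):
        t = ("s", t)
    return t

def normalize(term):
    """Innermost reduction to normal form; returns (normal_form, step_count)."""
    head = term[0]
    if head == "0":
        return term, 0
    if head == "s":
        sub, k = normalize(term[1])
        return ("s", sub), k
    # head == "add": under innermost reduction, normalize arguments first.
    x, kx = normalize(term[1])
    y, ky = normalize(term[2])
    if x[0] == "0":                          # apply add(0, y) -> y
        return y, kx + ky + 1
    inner, k = normalize(("add", x[1], y))   # apply add(s(x), y) -> s(add(x, y))
    return ("s", inner), kx + ky + 1 + k

nf, steps = normalize(("add", num(4), num(3)))
print(steps)  # 5: the derivation length is linear in the first argument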

36 citations

Book Chapter
29 Aug 2013
TL;DR: The main achievement is the development of a technique for analysing non-functional properties of programs at the source level with little or no loss of accuracy and a small trusted code base.
Abstract: We provide an overview of the FET-Open Project CerCo (‘Certified Complexity’). Our main achievement is the development of a technique for analysing non-functional properties of programs (time, space) at the source level with little or no loss of accuracy and a small trusted code base. The core component is a C compiler, verified in Matita, that produces an instrumented copy of the source code in addition to generating object code. This instrumentation exposes, and tracks precisely, the actual (non-asymptotic) computational cost of the input program at the source level. Untrusted invariant generators and trusted theorem provers may then be used to compute and certify the parametric execution time of the code.
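
A rough sketch of the instrumentation idea follows, written as Python standing in for CerCo's C output; all cost constants are hypothetical. The verified compiler emits a copy of the source in which each program point increments a cost counter by the exact cost of the object code generated for it, so the counter's final value is the concrete execution cost, expressible parametrically at the source level.

__cost = 0  # cost counter threaded through the instrumented copy

def linear_search(xs, key):
    global __cost
    __cost += 7            # hypothetical cost of function entry
    for x in xs:
        __cost += 11       # hypothetical per-iteration cost (load, compare, branch)
        if x == key:
            __cost += 4    # hypothetical cost of the successful exit path
            return True
    __cost += 3            # hypothetical cost of the fall-through exit
    return False

linear_search(list(range(100)), 42)
print(__cost)  # 7 + 11*(i+1) + 4 for a hit at index i; here i = 42 gives 484

An invariant generator can then derive the parametric expression (here 7 + 11*(i+1) + 4) from the instrumented copy, and a theorem prover can certify it.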

24 citations

Book Chapter
06 Nov 2009
TL;DR: The approach first syntactically transforms cost functions into simpler forms and then applies a number of sufficient conditions which guarantee that a set of expressions is smaller than another expression; a preliminary implementation indicates that the approach can be useful in practice.
Abstract: Cost functions provide information about the amount of resources required to execute a program in terms of the sizes of input arguments. They can provide an upper-bound, a lower-bound, or the average-case cost. Motivated by the existence of a number of automatic cost analyzers which produce cost functions, we propose an approach for automatically proving that a cost function is smaller than another one. In all applications of resource analysis, such as resource-usage verification, program synthesis and optimization, etc., it is essential to compare cost functions. This allows choosing an implementation with smaller cost or guaranteeing that the given resource-usage bounds are preserved. Unfortunately, automatically generated cost functions for realistic programs tend to be rather intricate, defined by multiple cases, involving non-linear subexpressions (e.g., exponential, polynomial and logarithmic) and they can contain multiple variables, possibly related by means of constraints. Thus, comparing cost functions is far from trivial. Our approach first syntactically transforms functions into simpler forms and then applies a number of sufficient conditions which guarantee that a set of expressions is smaller than another expression. Our preliminary implementation in the COSTA system indicates that the approach can be useful in practice.
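
To illustrate what a sufficient (sound but incomplete) comparison condition can look like, here is a minimal Python sketch restricted to polynomial cost functions; the representation is an assumption for illustration, and the actual COSTA machinery handles far richer expressions (exponentials, logarithms, multiple variables related by constraints).

def leq_poly(f: dict, g: dict) -> bool:
    """Return True only if f(n) <= g(n) for all n >= 1 is guaranteed.

    f and g map degree -> coefficient, e.g. {2: 3, 0: 1} is 3*n^2 + 1.
    Sufficient condition: every coefficient of f is dominated by the
    corresponding coefficient of g. False means "unknown", not "greater".
    """
    degrees = set(f) | set(g)
    return all(f.get(d, 0) <= g.get(d, 0) for d in degrees)

print(leq_poly({1: 4, 0: 2}, {2: 1, 1: 4, 0: 2}))  # True: 4n+2 <= n^2+4n+2
print(leq_poly({1: 5}, {2: 1}))  # False (unknown): 5n <= n^2 only for n >= 5

Note the asymmetry typical of sufficient conditions: a True answer is a proof, while a False answer only means this particular condition could not establish the comparison.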

19 citations

Book Chapter
29 Aug 2013
TL;DR: Energy-inefficient software implementations may cause battery drain for small systems and high energy costs for large systems; dynamic energy analysis is often applied to mitigate these issues, but it is hardware-specific and requires repetitive measurements using special equipment.
Abstract: Energy-inefficient software implementations may cause battery drain for small systems and high energy costs for large systems. Dynamic energy analysis is often applied to mitigate these issues. However, it is often hardware-specific and requires repetitive measurements using special equipment.
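
For context on the measurement burden the abstract describes, here is a hedged sketch of the repetitive measurement loop that dynamic energy analysis typically entails; PowerMeter is a hypothetical stand-in for vendor-specific measurement equipment, not a real API.

import statistics

class PowerMeter:
    """Hypothetical stand-in for hardware energy-measurement equipment."""
    def read_joules(self) -> float:
        raise NotImplementedError("replace with a vendor-specific readout")

def measure_energy(workload, meter: PowerMeter, runs: int = 30) -> float:
    """Average energy per run (joules) over repeated executions."""
    samples = []
    for _ in range(runs):
        before = meter.read_joules()
        workload()  # run the code under test
        samples.append(meter.read_joules() - before)
    return statistics.mean(samples)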

14 citations

Performance Metrics
No. of papers from the Conference in previous years
Year  Papers
2019  1
2017  2
2015  6
2013  11
2012  1
2011  9