Author

Jakob Engblom

Other affiliations: IAR Systems, Intel, Wind River Systems
Bio: Jakob Engblom is an academic researcher from Uppsala University. He has contributed to research on the topics of worst-case execution time and Simics. He has an h-index of 20 and has co-authored 47 publications receiving 3,143 citations. Previous affiliations of Jakob Engblom include IAR Systems and Intel.

Papers
Journal ArticleDOI
TL;DR: Different approaches to the determination of upper bounds on execution times are described and several commercially available tools and research prototypes are surveyed.
Abstract: The determination of upper bounds on execution times, commonly called worst-case execution times (WCETs), is a necessary step in the development and validation process for hard real-time systems. This problem is hard if the underlying processor architecture has components, such as caches, pipelines, branch prediction, and other speculative components. This article describes different approaches to this problem and surveys several commercially available tools and research prototypes.

1,946 citations

Book
01 Jan 2002
TL;DR: Worst-Case Execution Time (WCET) estimates for programs are necessary when building real-time systems, to ensure timely responses to interrupts and to guarantee the throughput of cyclic systems.
Abstract: Worst-Case Execution Time (WCET) estimates for programs are necessary when building real-time systems. They are used to ensure timely responses to interrupts, to guarantee the throughput of cyclic ...

157 citations

Proceedings ArticleDOI
27 Nov 2000
TL;DR: This paper presents a method for representing program flow information that is compact while still being strong enough to handle the types of flow previously considered in WCET research, and extends the set of representable flows compared to previous research.
Abstract: Knowing the worst-case execution time (WCET) of a program is necessary when designing and verifying real-time systems. The WCET depends both on the program flow (like loop iterations and function calls), and on hardware factors like caches and pipelines. In this paper, we present a method for representing program flow information that is compact while still being strong enough to handle the types of flow previously considered in WCET research. We also extend the set of representable flows compared to previous research. We give an algorithm for converting the flow information to the linear constraints used in calculating a WCET estimate in our WCET analysis tool. We demonstrate the practicality of the representation by modeling the flow of a number of programs, and show that execution time estimates can be made tighter by using flow information.

129 citations
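
The paper's conversion of flow information into linear constraints (the IPET, or implicit path enumeration, style of WCET calculation) can be sketched on a toy control-flow graph. The basic blocks, cycle costs, and loop bound below are invented for illustration and are not taken from the paper:

```python
# Sketch of the IPET idea: program flow facts (here, a loop bound) become
# linear constraints on basic-block execution counts, and the WCET estimate
# is the maximum of sum(count_i * cost_i) over feasible count assignments.

# Hypothetical basic blocks with per-execution cycle costs.
costs = {"entry": 5, "loop_head": 3, "loop_body": 20, "exit": 4}

best = 0
# Structural + flow constraints, enumerated brute-force for the toy case:
# entry and exit run exactly once, the loop body runs at most 10 times
# (a loop-bound annotation), and the head runs once more than the body.
for body in range(0, 11):
    x = {"entry": 1, "exit": 1, "loop_head": body + 1, "loop_body": body}
    time = sum(x[b] * costs[b] for b in x)
    best = max(best, time)

print(best)  # WCET estimate for this toy CFG: 242 cycles
```

A real tool would hand the same counts and constraints to an ILP solver rather than enumerate, but the objective and the role of flow facts are the same.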

Journal ArticleDOI
TL;DR: In this article, the authors present an overview of the worst-case execution time (WCET) analysis research performed by the WCET group of the ASTEC Competence Centre at Uppsala University.
Abstract: In this article we give an overview of the worst-case execution time (WCET) analysis research performed by the WCET group of the ASTEC Competence Centre at Uppsala University. Knowing the WCET of a program is necessary when designing and verifying real-time systems. The WCET depends both on the program flow, such as loop iterations and function calls, and on hardware factors, such as caches and pipelines. WCET estimates should be both safe (no underestimation allowed) and tight (as little overestimation as possible). We have defined a modular architecture for a WCET tool, used both to identify the components of the overall WCET analysis problem, and as a starting point for the development of a WCET tool prototype. Within this framework we have proposed solutions to several key problems in WCET analysis, including representation and analysis of the control flow of programs, modeling of the behavior and timing of pipelines and other low-level timing aspects, integration of control flow information and low-level timing to obtain a safe and tight WCET estimate, and validation of our tools and methods. We have focused on the needs of embedded real-time systems in designing our tools and directing our research. Our long-term goal is to provide WCET analysis as a part of the standard tool chain for embedded development (together with compilers, debuggers, and simulators). This is facilitated by our cooperation with the embedded systems programming-tools vendor IAR Systems.

125 citations
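
The "safe and tight" criterion from the abstract can be made concrete with a tiny check; the measured times and the bound below are invented numbers:

```python
# An estimate is safe if it never underestimates any observed execution
# time, and its tightness can be gauged by the overestimation margin.
observed = [102, 118, 97, 110]   # measured execution times (cycles), invented
estimate = 125                   # hypothetical WCET bound from analysis

safe = estimate >= max(observed)  # no underestimation allowed
margin = estimate - max(observed) # smaller margin = tighter bound
print(safe, margin)
```

Note that measurements can only refute safety, never prove it, which is why static analysis of the kind described above is needed at all.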

Proceedings ArticleDOI
16 Nov 2001
TL;DR: A fast and effective WCET calculation method that takes into account low-level machine aspects like pipelining and caches, and high-level program flow like loops and infeasible paths; the method is more efficient than previous path-based approaches and can easily handle complex programs.
Abstract: Current development tools for embedded real-time systems do not efficiently support the timing aspect. The most important timing parameter for scheduling and system analysis is the Worst-Case Execution Time (WCET) of a program. This paper presents a fast and effective WCET calculation method that takes into account low-level machine aspects like pipelining and caches, and high-level program flow like loops and infeasible paths. The method is more efficient than previous path-based approaches, and can easily handle complex programs. By separating the low-level from the high-level analysis, the method is easy to retarget. Experiments confirm that speed does not sacrifice precision, and that programs with extreme numbers of potential execution paths can be analyzed quickly.

80 citations
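
The path-based calculation the abstract describes can be sketched as path enumeration over a small acyclic region, with infeasible paths removed by flow analysis; the graph, timings, and infeasible-path fact here are made up:

```python
# Toy path-based WCET: enumerate all paths through an acyclic region,
# drop paths that flow analysis has shown infeasible, and take the most
# expensive remaining path.
costs = {"A": 10, "B": 50, "C": 5, "D": 8}           # cycles per block
succ = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
infeasible = {("A", "B", "D")}                        # ruled out by flow facts

def paths(node, prefix):
    """Yield every root-to-leaf path through the toy graph."""
    prefix = prefix + [node]
    if not succ[node]:
        yield tuple(prefix)
    for nxt in succ[node]:
        yield from paths(nxt, prefix)

feasible = [p for p in paths("A", []) if p not in infeasible]
wcet = max(sum(costs[n] for n in p) for p in feasible)
print(wcet)
```

Excluding the infeasible path tightens the estimate from 68 to 23 cycles in this toy case, which is exactly the effect flow information has in the real analyses; the paper's contribution is doing this efficiently without exhaustive enumeration.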


Cited by
Journal ArticleDOI
TL;DR: The survey outlines fundamental results about multiprocessor real-time scheduling that hold independent of the scheduling algorithms employed, and provides a taxonomy of the different scheduling methods, and considers the various performance metrics that can be used for comparison purposes.
Abstract: This survey covers hard real-time scheduling algorithms and schedulability analysis techniques for homogeneous multiprocessor systems. It reviews the key results in this field from its origins in the late 1960s to the latest research published in late 2009. The survey outlines fundamental results about multiprocessor real-time scheduling that hold independent of the scheduling algorithms employed. It provides a taxonomy of the different scheduling methods, and considers the various performance metrics that can be used for comparison purposes. A detailed review is provided covering partitioned, global, and hybrid scheduling algorithms, approaches to resource sharing, and the latest results from empirical investigations. The survey identifies open issues, key research challenges, and likely productive research directions.

910 citations
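
One family of methods the survey classifies, partitioned scheduling, can be illustrated with a first-fit task assignment under the Liu and Layland rate-monotonic utilization bound n(2^(1/n) - 1); the task set below is invented:

```python
# Partitioned multiprocessor scheduling sketch: assign tasks to processors
# first-fit, admitting a task onto a processor only if the processor's total
# utilization stays within the Liu-Layland bound for its task count.
tasks = [0.4, 0.3, 0.3, 0.2, 0.5]   # per-task utilizations (WCET/period), invented

def ll_bound(n):
    """Liu-Layland rate-monotonic schedulability bound for n tasks."""
    return n * (2 ** (1 / n) - 1)

procs = []                             # each entry: utilizations on one processor
for u in sorted(tasks, reverse=True):  # decreasing-utilization first-fit
    for p in procs:
        if sum(p) + u <= ll_bound(len(p) + 1):
            p.append(u)
            break
    else:
        procs.append([u])              # no processor fits: open a new one

print(len(procs))  # processors needed for this toy task set: 3
```

This reduces multiprocessor scheduling to bin packing plus a uniprocessor test, which is why the survey treats partitioned, global, and hybrid approaches as distinct classes with different analysis machinery.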

Journal ArticleDOI
26 Feb 2015 - Sensors
TL;DR: Two projects, PRET and Ptides, show that deterministic CPS models with faithful physical realizations are possible and practical; PRET in particular shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction.
Abstract: This paper is about better engineering of cyber-physical systems (CPSs) through better models. Deterministic models have historically proven extremely useful and arguably form the kingpin of the industrial revolution and the digital and information technology revolutions. Key deterministic models that have proven successful include differential equations, synchronous digital logic and single-threaded imperative programs. Cyber-physical systems, however, combine these models in such a way that determinism is not preserved. Two projects show that deterministic CPS models with faithful physical realizations are possible and practical. The first project is PRET, which shows that the timing precision of synchronous digital logic can be practically made available at the software level of abstraction. The second project is Ptides (programming temporally-integrated distributed embedded systems), which shows that deterministic models for distributed cyber-physical systems have practical faithful realizations. These projects are existence proofs that deterministic CPS models are possible and practical.

468 citations

Journal ArticleDOI
TL;DR: A comprehensive state-of-the-art survey of more than 20 performance prediction and measurement approaches for component-based software systems, classified according to the expressiveness of their component performance modelling languages.

374 citations