scispace - formally typeset
Author

Timo Hämäläinen

Other affiliations: Dalian Medical University, Nokia, Dublin Institute of Technology
Bio: Timo Hämäläinen is an academic researcher from the University of Jyväskylä. The author has contributed to research in the topics Quality of service and Encoder. The author has an h-index of 38 and has co-authored 560 publications receiving 7,648 citations. Previous affiliations of Timo Hämäläinen include Dalian Medical University and Nokia.


Papers
Journal ArticleDOI
TL;DR: Results show that the quality of decoded video can be improved by 1 dB with transparent connections compared to connections designed for general packet data, and a video coding subsystem must have access to the error control in a wireless link for the best quality in varying conditions.
Abstract: An experimental comparison of video protection methods targeted for wireless networks is presented. The basic methods are data partitioning, reversible variable-length coding, and macroblock row interleaving, as well as macroblock scattering for packet loss protection. An implementation is described in which scalable video is protected unequally with forward error correcting codes and retransmissions. Comparisons are performed for a simulated wideband code division multiple access (WCDMA) channel, and measurements are carried out with a wireless local area network, Bluetooth, and GSM high-speed circuit-switched data. Point-to-point connections are used for the measurements. The achieved video quality is examined in our real-time wireless video demonstrator. Performance is measured by the peak signal-to-noise ratio (PSNR) of the received video, data overhead, communication delay, number of lost video frames, and decoding frame rate. Results show that the quality of decoded video can be improved by 1 dB with transparent connections compared to connections designed for general packet data. In conclusion, a video coding subsystem must have access to the error control in a wireless link to achieve the best quality in varying conditions.
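The macroblock row interleaving the paper evaluates can be illustrated with a toy sketch (the function name and packetisation scheme below are invented for illustration, not the paper's implementation):

```python
def interleave_rows(n_rows, n_packets):
    """Macroblock row interleaving: row i goes to packet i mod n_packets,
    so a single lost packet removes scattered rows rather than one
    contiguous band, which is easier for the decoder to conceal."""
    packets = [[] for _ in range(n_packets)]
    for row in range(n_rows):
        packets[row % n_packets].append(row)
    return packets
```

For a 6-row frame split over 2 packets, losing one packet costs every second row instead of the top or bottom half of the frame.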

6 citations

Proceedings ArticleDOI
01 Oct 2016
TL;DR: A novel Behavior Mining Language (BML) is proposed for a log-file analysis framework called LOGDIG, targeting logs that include temporal data and extra-log system-specific data, which are present e.g. in Real Time Passenger Information Systems (RTPIS).
Abstract: Log files are often the only way to identify and locate errors in a deployed system. This paper proposes a novel Behavior Mining Language (BML) for a log-file analysis framework called LOGDIG. It is proposed for logs that include temporal data (timestamps) and extra-log system-specific data (e.g. spatial data with coordinates of moving objects), which are present e.g. in Real Time Passenger Information Systems (RTPIS). BML is state-machine-based and specifies searches for desired events in the log files with adjustable accuracy. The analysis output is static behavioral knowledge and human-friendly composite log files for reporting the results in legacy tools. Field data from a commercial RTPIS is used as a proof-of-concept case study. BML is Python-based for excellent development support, as well as self-explanatory and self-descriptive for correct-by-construction usage. Compared to a general-language approach, BML code is much shorter and easier to maintain. In the RTPIS case study we compare BML to the closest related log-file analysis language, LFAL2. BML can be applied to complicated cases that LFAL2 cannot capture, but at the cost of more set-up effort. Thus, BML is positioned between simple special-purpose languages and fully general programming languages used in log-file analysis. BML specifically fulfills the needs of log analysis in RTPIS-like systems.
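The kind of state-machine search BML specifies can be sketched in plain Python (the log record format, field names, and "arrival at a stop" rule below are invented for illustration; this is not LOGDIG code):

```python
from dataclasses import dataclass

@dataclass
class LogEvent:
    t: float   # timestamp (seconds) -- the temporal data
    x: float   # coordinates of the moving object -- the extra-log
    y: float   # system-specific spatial data

def find_arrivals(events, stop_xy, radius=30.0):
    """Two-state machine: emit the timestamp at which the object first
    enters the circle of `radius` metres around `stop_xy`; re-arm once
    the object leaves again, so repeated arrivals are all captured.
    `radius` is the adjustable accuracy of the search."""
    sx, sy = stop_xy
    state = "SEARCHING"
    arrivals = []
    for e in events:
        dist = ((e.x - sx) ** 2 + (e.y - sy) ** 2) ** 0.5
        if state == "SEARCHING" and dist <= radius:
            arrivals.append(e.t)
            state = "INSIDE"
        elif state == "INSIDE" and dist > radius:
            state = "SEARCHING"  # left the stop; ready for the next arrival
    return arrivals
```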

6 citations

Journal ArticleDOI
TL;DR: Kactus2 is based on the IEEE 1685-2014 IP-XACT standard, which defines an XML format for documents describing components, designs, and configurations, and offers users the easiest-to-use tool for accomplishing IP-XACT-related EDA tasks.
Abstract: Kactus2 is based on the IEEE 1685-2014 IP-XACT standard ("IEEE Standard for IP-XACT, Standard Structure for Packaging, Integrating, and Reusing IP within Tool Flows", 2014), which defines an XML format for documents describing components, designs, and configurations. Ideally, this enables vendor-independent integration between standard-compatible tools. The IP-XACT standard is complex and versatile, but Kactus2 hides most of the complexity and offers users the easiest-to-use tool for accomplishing IP-XACT-related EDA tasks. In addition, Kactus2 includes extensions for software components, software-on-hardware mapping, and API abstraction, as well as a physical product hierarchy including the printed circuit board level.

6 citations

Proceedings ArticleDOI
16 May 2005
TL;DR: A packet scheduling method is presented that guarantees connection bandwidth and optimizes the revenue of the network service provider, together with a mechanism for guaranteeing a specified mean bandwidth for different service classes.
Abstract: In this paper we present a packet scheduling method that guarantees the bandwidth of a connection and optimizes the revenue of the network service provider. A closed-form formula for updating the adaptive weights of a packet scheduler is derived from a revenue-based optimization problem. The weight-updating procedure is fast and independent of assumptions about the connections' statistical behavior. The features of the algorithm are simulated and analyzed with a call admission control (CAC) mechanism. In connection with the CAC procedure, we also show a mechanism for guaranteeing a specified mean bandwidth for different service classes.
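The idea of revenue-driven scheduler weights with guaranteed minimum bandwidth shares can be sketched as follows (an illustrative rule invented here for exposition, not the paper's closed-form update formula):

```python
def update_weights(revenues, min_share):
    """Each service class first receives its guaranteed minimum
    bandwidth share; the remaining capacity is then split in proportion
    to per-class revenue rate, so higher-revenue classes get larger
    scheduler weights. Shares sum to 1 (fractions of link bandwidth)."""
    spare = 1.0 - sum(min_share)           # capacity left after guarantees
    total = sum(revenues)
    return [m + spare * r / total for m, r in zip(min_share, revenues)]
```

With two classes earning revenue at rates 3 and 1 and minimum shares of 0.1 and 0.4, each class keeps its guarantee and the spare half of the link is split 3:1.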

6 citations

Journal ArticleDOI
TL;DR: In this article, the authors study the effects and feasibility of exploiting Log4j2 vulnerabilities in mission-critical aviation and maritime environments using the ACARS, ADS-B, and AIS protocols.
Abstract: Apache Log4j2 is a prevalent logging library for Java-based applications. In December 2021, several critical and high-impact software vulnerabilities, including CVE-2021-44228, were publicly disclosed, enabling remote code execution (RCE) and denial of service (DoS) attacks. To date, these vulnerabilities are considered critical, and the consequences of their disclosure far-reaching. The vulnerabilities potentially affect a wide range of internet of things (IoT) devices, embedded devices, critical infrastructure (CI), and cyber-physical systems (CPSs). In this paper, we study the effects and feasibility of exploiting these vulnerabilities in mission-critical aviation and maritime environments using the ACARS, ADS-B, and AIS protocols. We develop a systematic methodology and an experimental setup to identify the protocols' exploitable fields and associated attack payload features. For our experiments, we employ software-defined radios (SDRs), use open-source software, develop novel tools, and add features to existing software. We evaluate the feasibility of the attacks and demonstrate end-to-end RCE with all three studied protocols. We demonstrate that the aviation and maritime environments are susceptible to the exploitation of the Log4j2 vulnerabilities, and that the attacks are feasible for non-sophisticated attackers. To facilitate further studies of Log4j2 attacks on aerospace, aviation, and maritime infrastructures, we release relevant artifacts (e.g., software, documentation, and scripts) as open source, complemented by patches for bugs in the open-source software used in this study.

6 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
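The mail-filtering example in the fourth category can be made concrete with a toy learner that counts words in rejected versus kept messages (a deliberately naive sketch to show the idea of learned filtering rules; not any real filtering system):

```python
from collections import Counter

class MailFilter:
    """Learns per-user filtering rules from which messages the user
    rejects, instead of requiring the user to program rules by hand."""

    def __init__(self):
        self.reject_words = Counter()
        self.keep_words = Counter()

    def learn(self, message, rejected):
        # update word counts from one labelled example
        words = message.lower().split()
        (self.reject_words if rejected else self.keep_words).update(words)

    def score(self, message):
        """Positive score => message resembles ones the user rejects."""
        return sum(self.reject_words[w] - self.keep_words[w]
                   for w in message.lower().split())

f = MailFilter()
f.learn("cheap offer buy now", rejected=True)
f.learn("meeting agenda attached", rejected=False)
```

A real system would use probabilistic weighting (e.g. naive Bayes) rather than raw count differences, but the learning loop is the same: examples in, rules out.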

13,246 citations

Journal ArticleDOI
01 Nov 2007
TL;DR: Comprehensive performance comparisons including accuracy, precision, complexity, scalability, robustness, and cost are presented.
Abstract: Wireless indoor positioning systems have become very popular in recent years. These systems have been successfully used in many applications such as asset tracking and inventory management. This paper provides an overview of the existing wireless indoor positioning solutions and attempts to classify different techniques and systems. Three typical location estimation schemes of triangulation, scene analysis, and proximity are analyzed. We also discuss location fingerprinting in detail since it is used in most current system or solutions. We then examine a set of properties by which location systems are evaluated, and apply this evaluation method to survey a number of existing systems. Comprehensive performance comparisons including accuracy, precision, complexity, scalability, robustness, and cost are presented.
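Location fingerprinting, the scheme the survey discusses in most detail, can be illustrated with a nearest-neighbour sketch over received-signal-strength (RSS) fingerprints (the radio map values and positions below are hypothetical):

```python
import math

# Offline phase: RSS fingerprints (dBm from three access points)
# surveyed at known positions (metres).
radio_map = {
    (0.0, 0.0): [-40, -70, -80],
    (5.0, 0.0): [-55, -60, -75],
    (0.0, 5.0): [-60, -75, -50],
}

def locate(rss):
    """Online phase: return the surveyed position whose stored
    fingerprint is closest (Euclidean distance in signal space)
    to the currently observed RSS vector."""
    return min(radio_map, key=lambda pos: math.dist(radio_map[pos], rss))
```

This is the simplest nearest-neighbour variant; practical systems average over the k nearest fingerprints or use probabilistic matching to smooth out RSS noise.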

4,123 citations

01 Jan 2006

3,012 citations

01 Jan 1990
TL;DR: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented.
Abstract: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented in this article.
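The core of the self-organizing map algorithm is a simple update loop: find the best-matching unit (BMU) for each input and pull it, together with its map neighbours, toward that input. A minimal one-dimensional sketch (unit count, learning rate, and neighbourhood radius are arbitrary choices for illustration):

```python
import random

def train_som(data, n_units=4, epochs=50, lr=0.3, radius=1):
    """1-D self-organizing map: units are weight vectors arranged on a
    line; each input pulls the best-matching unit and its index
    neighbours toward itself."""
    random.seed(0)  # deterministic initialisation for the sketch
    units = [[random.random() for _ in data[0]] for _ in range(n_units)]
    for _ in range(epochs):
        for x in data:
            # BMU = unit with smallest squared distance to the input
            bmu = min(range(n_units),
                      key=lambda i: sum((u - v) ** 2
                                        for u, v in zip(units[i], x)))
            # move the BMU and its neighbours on the map toward x
            for i in range(max(0, bmu - radius),
                           min(n_units, bmu + radius + 1)):
                units[i] = [u + lr * (v - u) for u, v in zip(units[i], x)]
    return units
```

Full implementations also shrink the learning rate and neighbourhood radius over time, which is what makes the map converge to a topology-preserving ordering.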

2,933 citations