Author

Timo Hämäläinen

Other affiliations: Dalian Medical University, Nokia, Dublin Institute of Technology
Bio: Timo Hämäläinen is an academic researcher from the University of Jyväskylä. The author has contributed to research in the topics of Quality of service and Encoder. The author has an h-index of 38 and has co-authored 560 publications receiving 7648 citations. Previous affiliations of Timo Hämäläinen include Dalian Medical University and Nokia.


Papers
Proceedings ArticleDOI
21 Sep 2003
TL;DR: An adaptive resource allocation model based on the WFQ service policy is proposed, and it is shown that total revenue can be increased by allocating unused resources to more expensive service classes.
Abstract: This paper proposes an adaptive resource allocation model based on the WFQ service policy. The model ensures QoS requirements while trying to maximize a service provider's revenue by manipulating the weights of the WFQ policy. The model is flexible in that different network services are grouped into service classes, each given QoS characteristics such as bandwidth and delay. To adjust the weights dynamically, a usage-based revenue criterion is used that enables the allocation of free resources between service classes. The simulation considers a single node implementing the model that serves several service classes with different QoS requirements and traffic characteristics. It is shown that total revenue can be increased by allocating unused resources to more expensive service classes. Although the simulation revealed that certain QoS thresholds are violated, the percentage of violations is insignificant and can be easily tolerated by many network applications.
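The weight-manipulation idea above can be sketched in a few lines. This is a minimal illustration, assuming a simple "revenue density" heuristic; the function name, the heuristic, and the numbers are not taken from the paper:

```python
# Sketch of usage-based WFQ weight adjustment: shift scheduling weight
# from under-used classes toward classes whose traffic earns more
# revenue per unit of currently allocated weight.
def adjust_weights(weights, usage, prices):
    """Return new WFQ weights, renormalised by revenue density.

    weights -- current per-class WFQ weights (sum to 1)
    usage   -- fraction of each class's allocation actually used
    prices  -- revenue per unit of carried traffic for each class
    """
    # Revenue density: revenue generated per unit of assigned weight.
    density = [p * u / w for p, u, w in zip(prices, usage, weights)]
    total = sum(density)
    return [d / total for d in density]

# Three classes: the cheap class is under-used, the expensive class is
# saturated, so weight migrates toward the expensive class.
w = adjust_weights([0.5, 0.3, 0.2], usage=[0.4, 0.9, 1.0], prices=[1, 2, 4])
```

In a real scheduler such an update would run periodically and be clamped so that each class keeps the minimum weight its QoS guarantees require.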

7 citations

Proceedings ArticleDOI
23 May 2005
TL;DR: Several platform-independent optimizations for a baseline profile H.264/AVC encoder, including adaptive diamond-pattern-based motion estimation, fast sub-pel motion vector refinement and heuristic intra prediction, yield an encoding rate well above real-time limits.
Abstract: Several platform-independent optimizations for a baseline profile H.264/AVC encoder are described. The optimizations include adaptive diamond-pattern-based motion estimation, fast sub-pel motion vector refinement and heuristic intra prediction. In addition, loop unrolling, early-out thresholds and adaptive inverse transforms are used. An experimental complexity analysis is presented, studying the effect of the optimizations on the encoding frame rate on an AMD Athlon processor. Trade-offs in rate-distortion performance are also measured. Compared to a public reference encoder, speed-ups of 4-8 have been obtained with a 0.6-0.8 dB loss in image quality. In practice, our software-only H.264 encoder achieves an encoding rate of 86 QCIF frames/s, well above real-time limits.
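The diamond-pattern motion estimation mentioned above can be sketched as a basic (non-adaptive) diamond search; the paper's adaptive variant differs, so treat this as an illustration of the search pattern only, with frames as plain lists of pixel rows:

```python
# Basic diamond search for block motion estimation: from the current
# best position, evaluate the centre and its four diamond neighbours,
# move to the best candidate, and stop when the centre wins.
def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(a - b) for ra, rb in zip(block_a, block_b)
               for a, b in zip(ra, rb))

def diamond_search(cur, ref, start=(0, 0), max_steps=16):
    """Return the (row, col) in ref that best matches block cur."""
    DIAMOND = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]
    h, w = len(ref), len(ref[0])
    bh, bw = len(cur), len(cur[0])
    best = start
    for _ in range(max_steps):
        candidates = []
        for dy, dx in DIAMOND:
            y, x = best[0] + dy, best[1] + dx
            if 0 <= y <= h - bh and 0 <= x <= w - bw:
                patch = [row[x:x + bw] for row in ref[y:y + bh]]
                candidates.append((sad(cur, patch), (y, x)))
        _, winner = min(candidates)
        if winner == best:   # centre is the minimum: converged
            break
        best = winner
    return best

# Tiny example: the 2x2 block sits at (1, 2) in the reference frame.
cur = [[9, 8], [7, 6]]
ref = [[0, 0, 0, 0], [0, 0, 9, 8], [0, 0, 7, 6], [0, 0, 0, 0]]
mv = diamond_search(cur, ref)
```

A production encoder would use large/small diamond phases, early-out SAD thresholds and a motion-vector predictor as the start point, which is where the "adaptive" and "fast" parts of the paper come in.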

7 citations

Journal ArticleDOI
TL;DR: The design flow for neural network hardware is discussed, the design constraints and implementation possibilities are explored, and the performance measures and the problems of different measurements are examined.
Abstract: In this article we discuss the design flow for neural network hardware and go deeper into the design constraints and implementation possibilities. The performance measures and the problems of different measurements are also discussed. It is noted that performance is one comparison criterion, but there are many others, some of which are also discussed. In order to anchor the discussion in real life, the article includes a case study of our TUTNC neurocomputer. In addition, examples of commercial neural computing systems and their world wide web pages are given.

7 citations

Proceedings ArticleDOI
05 Dec 2005
TL;DR: A performance analysis of the recently proposed flow-based fast handover method for mobile IPv6 (FFHMIPv6) in a real mobile IPv6 environment; the method is found to be an efficient and simple way to reduce handover delay.
Abstract: In this paper we present a performance analysis of the recently proposed flow-based fast handover method for mobile IPv6 (FFHMIPv6) in a real mobile IPv6 environment. FFHMIPv6 uses flow-state information and encapsulation to reduce packet loss during the location update processes of mobile IPv6. In the experiments, the FFHMIPv6 handover is compared with the mobile IPv6 handover using a MIPL-based (mobile IPv6 for Linux) real network scenario. The experiments consider the effect of the distance to the nodes the MN is communicating with, of VoIP traffic, and of the processing delay caused by the FFHMIPv6 method. Some security aspects of the method are also considered. The FFHMIPv6 method is found to be an efficient and simple way to reduce handover delay.

7 citations

Proceedings ArticleDOI
01 Aug 2017
TL;DR: The paper aims to provide a novel model for Cybersecurity Economics and Analysis (CEA) to measure and increase the effectiveness of cybersecurity programs, while a cost-benefit framework helps increase the economic and financial viability, effectiveness and value generation of cybersecurity solutions for an organisation's strategic, tactical and operational imperatives.
Abstract: In recent times, major cybersecurity breaches and cyber fraud have had a huge negative impact on victim organisations, affecting major areas of business activity. Most organisations facing cybersecurity adversity and advanced threats suffer large financial and reputational losses. Current security technologies, policies and processes provide necessary capabilities and cybersecurity mechanisms for addressing cyber threats and risks. However, current solutions do not provide the mechanisms required for decision making on the impact of cybersecurity breaches and fraud. In this paper, we report initial findings and propose a conceptual solution. The paper aims to provide a novel model for Cybersecurity Economics and Analysis (CEA). We will contribute to increasing the harmonisation of European cybersecurity initiatives, reducing the fragmentation of cybersecurity practices, and helping to reach the EU Digital Single Market goal. By introducing Cybersecurity Readiness Level metrics, the project will measure and increase the effectiveness of cybersecurity programmes, while the cost-benefit framework will help increase the economic and financial viability, effectiveness and value generation of cybersecurity solutions for organisations' strategic, tactical and operational imperatives. The ambition of the research, development and innovation (RDI) is to re-establish and increase the trust of European citizens in European digital environments through practical solutions.

7 citations


Cited by
Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, hand-writing recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
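The mail-filtering example in the fourth category can be made concrete with a toy learner. This is a crude sketch under the article's framing (learn from which messages the user rejects), not any particular production algorithm; the class and method names are invented for illustration:

```python
# Toy personalised mail filter: count words seen in rejected vs. kept
# messages, then score new messages by comparing the two tallies.
from collections import Counter

class LearnedFilter:
    def __init__(self):
        self.reject_words = Counter()
        self.keep_words = Counter()

    def feedback(self, message, rejected):
        """Learn from one user decision (reject or keep a message)."""
        words = message.lower().split()
        (self.reject_words if rejected else self.keep_words).update(words)

    def is_unwanted(self, message):
        """Score a new message; positive evidence means reject."""
        score = 0.0
        for w in message.lower().split():
            # Add-one smoothing so unseen words contribute nothing net.
            score += (self.reject_words[w] + 1) / (self.keep_words[w] + 1)
            score -= (self.keep_words[w] + 1) / (self.reject_words[w] + 1)
        return score > 0

f = LearnedFilter()
f.feedback("win a free prize now", rejected=True)
f.feedback("meeting agenda for monday", rejected=False)
```

The point of the example is the interaction pattern, maintaining the rules automatically from feedback, rather than the scoring rule itself, which a real filter would replace with e.g. Naive Bayes over many messages.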

13,246 citations

Journal ArticleDOI
01 Nov 2007
TL;DR: Comprehensive performance comparisons including accuracy, precision, complexity, scalability, robustness, and cost are presented.
Abstract: Wireless indoor positioning systems have become very popular in recent years. These systems have been successfully used in many applications such as asset tracking and inventory management. This paper provides an overview of the existing wireless indoor positioning solutions and attempts to classify different techniques and systems. Three typical location estimation schemes of triangulation, scene analysis, and proximity are analyzed. We also discuss location fingerprinting in detail since it is used in most current systems and solutions. We then examine a set of properties by which location systems are evaluated, and apply this evaluation method to survey a number of existing systems. Comprehensive performance comparisons including accuracy, precision, complexity, scalability, robustness, and cost are presented.
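One member of the triangulation family the survey classifies, lateration from measured distances, can be sketched in closed form for the 2-D, three-anchor case. The anchor coordinates and distances below are made-up illustration values:

```python
# 2-D lateration: given three anchors at known (x, y) positions and the
# measured distances r_i to each, subtracting the circle equations
# (x - x_i)^2 + (y - y_i)^2 = r_i^2 pairwise cancels the quadratic
# terms and leaves a 2x2 linear system for the unknown position.
def trilaterate(anchors, dists):
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = dists
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1        # zero iff the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# A node at (2, 2) is distance sqrt(8) from each of the three anchors.
pos = trilaterate([(0, 0), (4, 0), (0, 4)], [8**0.5, 8**0.5, 8**0.5])
```

With noisy RSS- or time-of-arrival-derived distances the three circles do not intersect in a point, which is why practical systems use least-squares over more anchors or the fingerprinting approach the paper discusses.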

4,123 citations

01 Jan 2006

3,012 citations

01 Jan 1990
TL;DR: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented in this article.
Abstract: An overview of the self-organizing map algorithm, on which the papers in this issue are based, is presented in this article.

2,933 citations