scispace - formally typeset
Author

Xin Liu

Bio: Xin Liu is an academic researcher from Dalian University of Technology. The author has contributed to research in topics including Medicine and Cognitive radio, has an h-index of 69, and has co-authored 885 publications receiving 19,231 citations. Previous affiliations of Xin Liu include Singapore Science Park and Nanjing University of Aeronautics and Astronautics.


Papers
Journal ArticleDOI
TL;DR: It is demonstrated via simulation results that the opportunistic transmission scheduling scheme is robust to estimation errors and also works well for nonstationary scenarios, resulting in performance improvements of 20%-150% compared with a scheduling scheme that does not take into account channel conditions.
Abstract: We present an "opportunistic" transmission scheduling policy that exploits time-varying channel conditions and maximizes the system performance stochastically under a certain resource allocation constraint. We establish the optimality of the scheduling scheme and also that every user experiences a performance improvement over any nonopportunistic scheduling policy when users have independent performance values. We demonstrate via simulation results that the scheme is robust to estimation errors and also works well for nonstationary scenarios, resulting in performance improvements of 20%-150% compared with a scheduling scheme that does not take into account channel conditions. Last, we discuss an extension of our opportunistic scheduling scheme to improve "short-term" performance.
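A minimal sketch of the opportunistic idea: each slot is scheduled to the user whose instantaneous rate is highest relative to its running average, a proportional-fair-style rule. This is an illustration only; the paper's actual policy enforces the resource-allocation constraint via stochastic-approximation offsets, which are omitted here, and all parameter values below are assumptions.

```python
import random

def simulate(n_users=4, n_slots=5000, seed=1):
    """Compare a channel-aware scheduler against round-robin on
    i.i.d. exponentially distributed per-user rates (a stand-in for
    time-varying fading; not the paper's channel model)."""
    rng = random.Random(seed)
    avg = [1.0] * n_users            # running average rate per user
    opp_total = rr_total = 0.0
    for t in range(n_slots):
        rates = [rng.expovariate(1.0) for _ in range(n_users)]
        # opportunistic: serve the user with the best relative channel
        i = max(range(n_users), key=lambda u: rates[u] / avg[u])
        opp_total += rates[i]
        avg[i] = 0.99 * avg[i] + 0.01 * rates[i]
        # non-opportunistic baseline: round-robin ignores channel state
        rr_total += rates[t % n_users]
    return opp_total / n_slots, rr_total / n_slots

opp, rr = simulate()
```

With four users, exploiting multiuser diversity this way comfortably exceeds the 20% lower end of the improvement range reported in the abstract.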

652 citations

Journal ArticleDOI
TL;DR: It is discussed how dynamic operation of cellular base stations, in which redundant base stations are switched off during periods of low traffic such as at night, can provide significant energy savings, and quantitatively estimate these potential savings through a first-order analysis.
Abstract: The operation of cellular network infrastructure incurs significant electrical energy consumption. From the perspective of cellular network operators, reducing this consumption is not only a matter of showing environmental responsibility, but also of substantially reducing their operational expenditure. We discuss how dynamic operation of cellular base stations, in which redundant base stations are switched off during periods of low traffic such as at night, can provide significant energy savings. We quantitatively estimate these potential savings through a first-order analysis based on real cellular traffic traces and information regarding base station locations in a part of Manchester, United Kingdom. We also discuss a number of open issues pertinent to implementing such energy-efficient dynamic base station operation schemes, such as various approaches to ensure coverage, and interoperator coordination.
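The flavor of the first-order estimate can be sketched as follows, assuming a stylized hourly traffic profile and the simplification that capacity scales linearly with the number of active base stations. The numbers are hypothetical, not the Manchester traces used in the paper.

```python
import math

# Hypothetical hourly traffic as a fraction of peak load (illustrative
# values only; NOT the real traces analyzed in the paper).
TRAFFIC = [0.10, 0.06, 0.04, 0.03, 0.03, 0.05, 0.15, 0.35,
           0.55, 0.70, 0.80, 0.85, 0.90, 0.95, 1.00, 0.95,
           0.90, 0.85, 0.80, 0.70, 0.55, 0.40, 0.25, 0.15]

def energy_saving_fraction(traffic, n_bs=10):
    """First-order estimate: each hour, keep just enough base stations
    on to carry the load (at least one for coverage), and compare the
    total on-time against an always-on baseline."""
    # round() guards against float noise before taking the ceiling
    active = [max(1, math.ceil(round(f * n_bs, 6))) for f in traffic]
    always_on = n_bs * len(traffic)
    return 1.0 - sum(active) / always_on

saving = energy_saving_fraction(TRAFFIC)
```

With this toy profile roughly 45% of base-station energy is saved; real savings depend on the actual traffic shape, coverage constraints, and switching overheads the paper discusses.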

587 citations

Journal ArticleDOI
TL;DR: It is demonstrated via simulation that opportunistic scheduling schemes result in significant performance improvement compared with non-opportunistic alternatives.

544 citations

Proceedings ArticleDOI
13 Apr 2008
TL;DR: This work provides closed form analysis on secondary user performance, presents a tight capacity upper bound, and reveals the impact of various design options, such as sensing, packet length distribution, back-off time, packet overhead, and grouping.
Abstract: Driven by regulatory initiatives and radio technology advances, opportunistic spectrum access has the potential to mitigate spectrum scarcity and meet the increasing demand for spectrum. In this paper, we consider a scenario where secondary users can opportunistically access unused spectrum vacated by idle primaries. We introduce two metrics to protect primary performance, namely collision probability and overlapping time. We present three spectrum access schemes using different sensing, back-off, and transmission mechanisms. We show that they achieve indistinguishable secondary performance under given primary constraints. We provide closed form analysis on secondary user performance, present a tight capacity upper bound, and reveal the impact of various design options, such as sensing, packet length distribution, back-off time, packet overhead, and grouping. Our work sheds light on the fundamental properties and design criteria on opportunistic spectrum access.
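The collision-probability metric can be illustrated with a small closed-form sketch. Assuming exponentially distributed primary idle periods (an assumption of this sketch, chosen for its memoryless residual; the paper's model may differ), the probability that the primary returns during a secondary packet of length L is 1 - exp(-lambda*L), which inverts to a cap on packet length:

```python
import math

def max_packet_len(idle_rate, p_collision):
    """Longest secondary packet (in time units) such that the chance
    the primary returns mid-packet stays below p_collision, assuming
    exponential primary idle periods with rate idle_rate."""
    return -math.log(1.0 - p_collision) / idle_rate
```

For example, with idle_rate = 1 and a collision budget of 1 - e^-1, the cap is exactly one mean idle period, which is why packet-length distribution appears among the design options analyzed.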

400 citations

Journal ArticleDOI
TL;DR: This tutorial review discusses recent progress in developing and synthesizing doped semiconductor and metal oxide nanocrystals with LSPR, and in studying the optical properties of these plasmonic nanocrystals, and discusses their growing potential for advancing biomedical and optoelectronic applications.
Abstract: The creation and study of non-metallic nanomaterials that exhibit localized surface plasmon resonance (LSPR) interactions with light is a rapidly growing field of research. These doped nanocrystals, mainly self-doped semiconductor nanocrystals (NCs) and extrinsically-doped metal oxide NCs, have extremely high concentrations of free charge carriers, which allows them to exhibit LSPR at near infrared (NIR) wavelengths. In this tutorial review, we discuss recent progress in developing and synthesizing doped semiconductor and metal oxide nanocrystals with LSPR, and in studying the optical properties of these plasmonic nanocrystals. We go on to discuss their growing potential for advancing biomedical and optoelectronic applications.
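The link between free-carrier concentration and NIR resonance can be illustrated with a Drude-model estimate for a small sphere in the dipole approximation. All parameter values below (effective mass, dielectric constants) are assumptions for illustration, not figures from the review:

```python
import math

E = 1.602e-19      # elementary charge (C)
EPS0 = 8.854e-12   # vacuum permittivity (F/m)
ME = 9.109e-31     # electron rest mass (kg)
C = 2.998e8        # speed of light (m/s)

def lspr_wavelength_nm(n_carriers_cm3, eps_inf=5.0, eps_medium=2.25, m_eff=0.3):
    """Drude/dipole estimate of the LSPR wavelength of a small sphere:
    w_p = sqrt(n e^2 / (eps0 m*)), resonance at w_p / sqrt(eps_inf + 2 eps_m)."""
    n = n_carriers_cm3 * 1e6          # cm^-3 -> m^-3
    w_p = math.sqrt(n * E**2 / (EPS0 * m_eff * ME))
    w_lspr = w_p / math.sqrt(eps_inf + 2.0 * eps_medium)
    return 2.0 * math.pi * C / w_lspr * 1e9

wavelength = lspr_wavelength_nm(1e21)
```

A carrier density around 10^21 cm^-3, typical of the heavily doped nanocrystals the review describes, lands the resonance in the NIR (roughly 1.5-2 um with these assumed parameters), far below the ~10^23 cm^-3 of noble metals.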

340 citations


Cited by
01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models which can be difficult to parallelize efficiently: those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine which allows for message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90% and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
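The third (spatial-decomposition) idea can be sketched minimally by slicing the simulation box into slabs, one per processor. This is a 1-D toy for brevity; production MD codes use 3-D domains and exchange boundary ("ghost") atoms with neighboring domains each step, details this sketch omits:

```python
def spatial_decompose(positions, box, n_domains):
    """Assign each atom index to a domain by slicing the box along x.
    Atoms near a slab boundary interact across domains, which is why
    real codes communicate ghost atoms between neighbors."""
    slab = box / n_domains
    domains = [[] for _ in range(n_domains)]
    for i, (x, y, z) in enumerate(positions):
        # clamp so an atom exactly at the box edge stays in-range
        d = min(int(x // slab), n_domains - 1)
        domains[d].append(i)
    return domains
```

Because each processor owns a region rather than a fixed atom set, communication scales with surface area instead of system size, which is the source of the spatial algorithm's advantage on large problems.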

29,323 citations

Journal ArticleDOI
TL;DR: Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis.
Abstract: Machine Learning is the study of methods for programming computers to learn. Computers are applied to a wide range of tasks, and for most of these it is relatively easy for programmers to design and implement the necessary software. However, there are many tasks for which this is difficult or impossible. These can be divided into four general categories. First, there are problems for which there exist no human experts. For example, in modern automated manufacturing facilities, there is a need to predict machine failures before they occur by analyzing sensor readings. Because the machines are new, there are no human experts who can be interviewed by a programmer to provide the knowledge necessary to build a computer system. A machine learning system can study recorded data and subsequent machine failures and learn prediction rules. Second, there are problems where human experts exist, but where they are unable to explain their expertise. This is the case in many perceptual tasks, such as speech recognition, handwriting recognition, and natural language understanding. Virtually all humans exhibit expert-level abilities on these tasks, but none of them can describe the detailed steps that they follow as they perform them. Fortunately, humans can provide machines with examples of the inputs and correct outputs for these tasks, so machine learning algorithms can learn to map the inputs to the outputs. Third, there are problems where phenomena are changing rapidly. In finance, for example, people would like to predict the future behavior of the stock market, of consumer purchases, or of exchange rates. These behaviors change frequently, so that even if a programmer could construct a good predictive computer program, it would need to be rewritten frequently. A learning program can relieve the programmer of this burden by constantly modifying and tuning a set of learned prediction rules. 
Fourth, there are applications that need to be customized for each computer user separately. Consider, for example, a program to filter unwanted electronic mail messages. Different users will need different filters. It is unreasonable to expect each user to program his or her own rules, and it is infeasible to provide every user with a software engineer to keep the rules up-to-date. A machine learning system can learn which mail messages the user rejects and maintain the filtering rules automatically. Machine learning addresses many of the same research questions as the fields of statistics, data mining, and psychology, but with differences of emphasis. Statistics focuses on understanding the phenomena that have generated the data, often with the goal of testing different hypotheses about those phenomena. Data mining seeks to find patterns in the data that are understandable by people. Psychological studies of human learning aspire to understand the mechanisms underlying the various learning behaviors exhibited by people (concept learning, skill acquisition, strategy change, etc.).
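The mail-filtering example above can be sketched as a toy word-count (naive Bayes) learner. This is a generic illustration of learning filtering rules from labeled examples, not a method from the article itself:

```python
from collections import Counter
import math

class NaiveBayesFilter:
    """Toy spam filter: learns word statistics from messages the user
    has labeled, then scores new messages (illustrative only)."""

    def __init__(self):
        self.counts = {"spam": Counter(), "ham": Counter()}
        self.totals = {"spam": 0, "ham": 0}

    def train(self, text, label):
        for w in text.lower().split():
            self.counts[label][w] += 1
            self.totals[label] += 1

    def classify(self, text):
        scores = {}
        for label in ("spam", "ham"):
            score = 0.0
            for w in text.lower().split():
                # Laplace smoothing avoids log(0) for unseen words
                p = (self.counts[label][w] + 1) / (self.totals[label] + 2)
                score += math.log(p)
            scores[label] = score
        return max(scores, key=scores.get)
```

Each user's filter is trained only on that user's accept/reject decisions, which is exactly the per-user customization the passage describes.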

13,246 citations

Book
01 Jan 2005

9,038 citations

Journal ArticleDOI
TL;DR: A review of P. Billingsley's monograph Convergence of Probability Measures (Wiley, 1968), a standard reference on weak convergence of probability measures.
Abstract: Convergence of Probability Measures. By P. Billingsley. Chichester, Sussex, Wiley, 1968. xii, 253 p. 9 1/4 in. 117s.

5,689 citations

Journal ArticleDOI
TL;DR: In this paper, a survey of spectrum sensing methodologies for cognitive radio is presented and the cooperative sensing concept and its various forms are explained.
Abstract: The spectrum sensing problem has gained new aspects with cognitive radio and opportunistic spectrum access concepts, and it is one of the most challenging issues in cognitive radio systems. In this paper, a survey of spectrum sensing methodologies for cognitive radio is presented. Various aspects of the spectrum sensing problem are studied from a cognitive radio perspective, and the multi-dimensional spectrum sensing concept is introduced. Challenges associated with spectrum sensing are given and enabling spectrum sensing methods are reviewed. The paper explains the cooperative sensing concept and its various forms. External sensing algorithms and other alternative sensing methods are discussed. Furthermore, statistical modeling of network traffic and utilization of these models for prediction of primary user behavior are studied. Finally, sensing features of some current wireless standards are given.
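Energy detection, the simplest and most widely used of the sensing methods such surveys cover, can be sketched in a few lines. Threshold selection, noise uncertainty, and cooperative combining are among the complications the survey treats; this sketch ignores them:

```python
def energy_detect(samples, threshold):
    """Classic energy detector: declare the band occupied when the
    average sample energy exceeds a threshold. Raising the threshold
    lowers the false-alarm rate at the cost of missed detections."""
    energy = sum(s * s for s in samples) / len(samples)
    return energy > threshold
```

For unit-variance noise the average energy is near 1, so a threshold of 2 separates an idle band from one carrying even a moderately strong primary signal.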

4,812 citations