Dissertation

A framework for evaluating proposed technologies for next-generation wireless systems

01 Jan 2019
TL;DR: A framework for evaluating proposed technologies for next-generation wireless systems is presented; using systems modelling approaches, it mixes hard systems modelling into a soft approach, providing a method for managing complexity and yielding learning points for the development of future wireless systems.
Abstract: This thesis presents a framework for evaluating proposed technologies for next-generation wireless systems, using systems modelling approaches. First, the socio-economic system is explored addressing the challenging question of how to develop a strategy for research investment in the complex development space of Fifth Generation (5G) era technologies. By the application of Problem Structuring Methods, and focusing on developing a clearer understanding of the industry landscape, a methodology for strategic decision making is proposed. The approach is used to identify key areas of wireless technology research for the 5G era. Subsequently, identified key areas of wireless technology including, full-duplex, beamforming, clear channel assessment and transmission power adaptation are explored in single and multi-hop wireless networks. A novel conceptual simulation modelling methodology is proposed and applied to investigate the performance impact of these technologies when implemented in the context of Carrier Sense Multiple Access with Collision Avoidance wireless networks. The methodology is designed to aid researchers in the environment of a corporate research and development lab with the goal of developing innovations and intellectual property that can bring commercial success. Whilst each technology is capable in principle of improving system performance, often the gain is limited when implementing in a network environment. The methodology is used to propose strategies for maximising performance gain with quantitative results to support the conclusions. The framework mixes hard systems modelling into a soft approach providing a method for managing complexity and facilitating learning points for the development of future wireless systems.
Citations
Journal ArticleDOI
TL;DR: The expected future WLAN scenarios and use cases that justify the push for a new PHY/MAC IEEE 802.11ax-2019 amendment are reviewed and a set of new technical features that may be included are overviewed.
Abstract: IEEE 802.11ax-2019 will replace both IEEE 802.11n-2009 and IEEE 802.11ac-2013 as the next high-throughput Wireless Local Area Network (WLAN) amendment. In this paper, we review the expected future WLAN scenarios and use-cases that justify the push for a new PHY/MAC IEEE 802.11 amendment. After that, we overview a set of new technical features that may be included in the IEEE 802.11ax-2019 amendment and describe both their advantages and drawbacks. Finally, we discuss some of the network-level functionalities that are required to fully improve the user experience in next-generation WLANs and note their relation with other on-going IEEE 802.11 amendments.

51 citations

01 Jan 2016
Discrete Event Simulation: Modeling, Programming, and Analysis

49 citations

01 Jan 2011
TL;DR: It is shown that the stationary distribution of the CSMA system is in fact insensitive with respect to the transmission durations and the back-off times, and the stability conditions in a few relevant scenarios are identified.
Abstract: Random-access algorithms such as the Carrier-Sense Multiple-Access (CSMA) protocol provide a popular mechanism for distributed medium access control in large-scale wireless networks. In recent years, fairly tractable models have been shown to yield remarkably accurate throughput estimates for CSMA networks. These models typically assume that both the transmission durations and the back-off periods are exponentially distributed. We show that the stationary distribution of the system is in fact insensitive with respect to the transmission durations and the back-off times. These models primarily pertain to a saturated scenario where nodes always have packets to transmit. In reality however, the buffers may occasionally be empty as packets are randomly generated and transmitted over time. The resulting interplay between the activity states and the buffer contents gives rise to quite complicated queueing dynamics, and even establishing the stability criteria is usually a serious challenge. We explicitly identify the stability conditions in a few relevant scenarios, and illustrate the difficulties arising in other cases.
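The saturated regime discussed above (nodes always having packets to transmit) can be illustrated with a toy slotted random-access simulation. This is a deliberate simplification, closer to slotted ALOHA than to the continuous-time CSMA models analysed in the paper, and the function name and parameter values are illustrative choices, not anything from the source:

```python
import random

def slotted_csma_throughput(n_nodes, p_transmit, n_slots=100_000, seed=1):
    """Estimate saturated throughput of a slotted random-access channel:
    in each slot every node transmits with probability p_transmit, and the
    slot carries a packet only when exactly one node transmits."""
    rng = random.Random(seed)
    successes = 0
    for _ in range(n_slots):
        transmitters = sum(rng.random() < p_transmit for _ in range(n_nodes))
        if transmitters == 1:
            successes += 1
    return successes / n_slots

# Aggressive access probabilities cause collisions; throughput peaks near p = 1/n.
print(slotted_csma_throughput(10, 0.1))
print(slotted_csma_throughput(10, 0.5))
```

Sweeping p_transmit shows the characteristic collision-limited peak near p = 1/n, the same congestion effect that makes the stability analysis delicate in the unsaturated case the abstract describes.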

3 citations

References
Journal ArticleDOI
04 Jun 1998 - Nature
TL;DR: Simple models of networks that can be tuned through this middle ground: regular networks ‘rewired’ to introduce increasing amounts of disorder are explored, finding that these systems can be highly clustered, like regular lattices, yet have small characteristic path lengths, like random graphs.
Abstract: Networks of coupled dynamical systems have been used to model biological oscillators, Josephson junction arrays, excitable media, neural networks, spatial games, genetic control networks and many other self-organizing systems. Ordinarily, the connection topology is assumed to be either completely regular or completely random. But many biological, technological and social networks lie somewhere between these two extremes. Here we explore simple models of networks that can be tuned through this middle ground: regular networks 'rewired' to introduce increasing amounts of disorder. We find that these systems can be highly clustered, like regular lattices, yet have small characteristic path lengths, like random graphs. We call them 'small-world' networks, by analogy with the small-world phenomenon (popularly known as 'six degrees of separation'). The neural network of the worm Caenorhabditis elegans, the power grid of the western United States, and the collaboration graph of film actors are shown to be small-world networks. Models of dynamical systems with small-world coupling display enhanced signal-propagation speed, computational power, and synchronizability. In particular, infectious diseases spread more easily in small-world networks than in regular lattices.
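The clustering measure underlying this result is easy to compute directly from an adjacency structure. The sketch below is a minimal illustration (not code from the thesis or the paper): it builds a ring lattice, where each node's neighbours are densely interlinked, and evaluates the mean local clustering coefficient.

```python
import random  # not used below, but handy for experimenting with rewiring

def ring_lattice(n, k):
    """Ring of n nodes, each linked to its k nearest neighbours (k even)."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            j = (i + d) % n
            adj[i].add(j)
            adj[j].add(i)
    return adj

def local_clustering(adj, i):
    """Fraction of pairs of i's neighbours that are themselves linked."""
    nbrs = list(adj[i])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for a in range(k) for b in range(a + 1, k)
                if nbrs[b] in adj[nbrs[a]])
    return 2 * links / (k * (k - 1))

def mean_clustering(adj):
    return sum(local_clustering(adj, i) for i in adj) / len(adj)

adj = ring_lattice(100, 4)
print(mean_clustering(adj))  # 0.5 for a k = 4 ring lattice
```

A k = 4 ring lattice has local clustering 3(k-2)/(4(k-1)) = 0.5 at every node; rewiring edges at random drives this value down while shortening path lengths, which is the small-world trade-off described above.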

39,297 citations


"A framework for evaluating proposed..." refers background or methods in this paper

  • ...For all E matrices, prior to any adaptation being applied (legacy conditions (R1)), the mean local clustering coefficient was calculated [22, 268]....


  • ...Asymmetry, number of hidden nodes, number of suppressed transmitters, clustering coefficient [22, 268] and other measures were considered....


  • ...It is shown that the average throughput gain, for some rules, is higher when only applied to networks with a low clustering coefficient [22, 268]....


Book
15 Jan 1996
TL;DR: Wireless Communications: Principles and Practice, Second Edition is the definitive modern text for wireless communications technology and system design; it covers the fundamental issues impacting all wireless networks and reviews virtually every important new wireless standard and technological development, offering especially comprehensive coverage of the 3G systems and wireless local area networks (WLANs).
Abstract: From the Publisher: The indispensable guide to wireless communications, now fully revised and updated! Wireless Communications: Principles and Practice, Second Edition is the definitive modern text for wireless communications technology and system design. Building on his classic first edition, Theodore S. Rappaport covers the fundamental issues impacting all wireless networks and reviews virtually every important new wireless standard and technological development, offering especially comprehensive coverage of the 3G systems and wireless local area networks (WLANs) that will transform communications in the coming years. Rappaport illustrates each key concept with practical examples, thoroughly explained and solved step by step. Coverage includes:
  • An overview of key wireless technologies: voice, data, cordless, paging, fixed and mobile broadband wireless systems, and beyond
  • Wireless system design fundamentals: channel assignment, handoffs, trunking efficiency, interference, frequency reuse, capacity planning, large-scale fading, and more
  • Path loss, small-scale fading, multipath, reflection, diffraction, scattering, shadowing, spatial-temporal channel modeling, and microcell/indoor propagation
  • Modulation, equalization, diversity, channel coding, and speech coding
  • New wireless LAN technologies: IEEE 802.11a/b, HIPERLAN, BRAN, and other alternatives
  • New 3G air interface standards, including W-CDMA, cdma2000, GPRS, UMTS, and EDGE
  • Bluetooth, wearable computers, fixed wireless and Local Multipoint Distribution Service (LMDS), and other advanced technologies
  • Updated glossary of abbreviations and acronyms, and a thorough list of references
  • Dozens of new examples and end-of-chapter problems
Whether you're a communications/network professional, manager, researcher, or student, Wireless Communications: Principles and Practice, Second Edition gives you an in-depth understanding of the state of the art in wireless technology, today's and tomorrow's.

17,102 citations


"A framework for evaluating proposed..." refers background in this paper

  • ...Considering a single transmitter-receiver link with a bandwidth subject to Additive White Gaussian Noise (AWGN), the maximum capacity over the channel is given by the Shannon-Hartley formula [209]...


  • ..., rate of successful message delivery measured in bits per second (bit/s or bps) or data packets per second (p/s or pps)) approaching Shannon’s limit [209], whereas improvements to medium access control (MAC) layer technologies have accelerated....

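The Shannon-Hartley formula quoted in these excerpts, C = B log₂(1 + S/N), is simple to evaluate numerically. The bandwidth and SNR used below are illustrative choices (a 20 MHz channel, typical of 802.11 channelisation, at 20 dB SNR), not figures from the thesis:

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit for an AWGN channel: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)  # 20 dB converted to a linear power ratio of 100
print(shannon_capacity_bps(20e6, snr) / 1e6)  # capacity in Mbit/s
```

Real PHY rates sit below this bound; the thesis's point is that physical-layer advances have pushed throughput toward it, shifting the bottleneck to the MAC layer.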

Journal ArticleDOI
TL;DR: The search for scientific bases for confronting problems of social policy is bound to fail because of the nature of these problems: they are "wicked" problems, whereas science has developed to deal with "tame" problems.
Abstract: The search for scientific bases for confronting problems of social policy is bound to fail, because of the nature of these problems. They are “wicked” problems, whereas science has developed to deal with “tame” problems. Policy problems cannot be definitively described. Moreover, in a pluralistic society there is nothing like the undisputable public good; there is no objective definition of equity; policies that respond to social problems cannot be meaningfully correct or false; and it makes no sense to talk about “optimal solutions” to social problems unless severe qualifications are imposed first. Even worse, there are no “solutions” in the sense of definitive and objective answers.

13,262 citations


"A framework for evaluating proposed..." refers background or methods in this paper

  • ...This thesis considers the development of a strategy to address research priorities in the development of next-generation communications technologies as a wicked problem [214]....


  • ...the purpose of the model is to bring clarity to a complex, wicked problem [214], the model can still provide a significant contribution....


  • ...From the Rittel and Webber definition above, there is unlikely to be a clear finishing line for declaring a successful exploitation of any technology development in this space....


  • ...The definition of a wicked problem used in this chapter is based on the original 1973 work of Rittel and Webber [214]....



Journal ArticleDOI
TL;DR: In this paper, the authors position mixed methods research (mixed research is a synonym) as the natural complement to traditional qualitative and quantitative research, and present pragmatism as offering an attractive philosophical partner for mixed method research.
Abstract: The purposes of this article are to position mixed methods research (mixed research is a synonym) as the natural complement to traditional qualitative and quantitative research, to present pragmatism as offering an attractive philosophical partner for mixed methods research, and to provide a framework for designing and conducting mixed methods research. In doing this, we briefly review the paradigm “wars” and incompatibility thesis, we show some commonalities between quantitative and qualitative research, we explain the tenets of pragmatism, we explain the fundamental principle of mixed research and how to apply it, we provide specific sets of designs for the two major types of mixed methods research (mixed-model designs and mixed-method designs), and, finally, we explain mixed methods research as following (recursively) an eight-step process. A key feature of mixed methods research is its methodological pluralism or eclecticism, which frequently results in superior research (compared to monomethod research).

11,330 citations


"A framework for evaluating proposed..." refers background or methods in this paper

  • ...Further, there is no guarantee that knowledge generated is transferable to other examples [131]....


  • ...Applying a multi-methodological approach [131] to research in wireless networks, by develop-...


Journal ArticleDOI
TL;DR: When n identical randomly located nodes, each capable of transmitting at W bits per second and using a fixed range, form a wireless network, the throughput λ(n) obtainable by each node for a randomly chosen destination is Θ(W/√(n log n)) bits per second under a noninterference protocol.
Abstract: When n identical randomly located nodes, each capable of transmitting at W bits per second and using a fixed range, form a wireless network, the throughput λ(n) obtainable by each node for a randomly chosen destination is Θ(W/√(n log n)) bits per second under a noninterference protocol. If the nodes are optimally placed in a disk of unit area, traffic patterns are optimally assigned, and each transmission's range is optimally chosen, the bit-distance product that can be transported by the network per second is Θ(W√(An)) bit-meters per second. Thus even under optimal circumstances, the throughput is only Θ(W/√n) bits per second for each node for a destination nonvanishingly far away. Similar results also hold under an alternate physical model where a required signal-to-interference ratio is specified for successful receptions. Fundamentally, it is the need for every node all over the domain to share whatever portion of the channel it is utilizing with nodes in its local neighborhood that is the reason for the constriction in capacity. Splitting the channel into several subchannels does not change any of the results. Some implications may be worth considering by designers. Since the throughput furnished to each user diminishes to zero as the number of users is increased, perhaps networks connecting smaller numbers of users, or featuring connections mostly with nearby neighbors, may be more likely to find acceptance.
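The scaling law can be made concrete with a small sketch. The constant factor below is hidden by the Θ notation (and the logarithm base is immaterial inside it), so the absolute numbers mean nothing; only the trend with n does:

```python
import math

def per_node_throughput(n, W=1.0, c=1.0):
    """Gupta-Kumar scaling: per-node throughput is Theta(W / sqrt(n log n)).
    c stands in for the unspecified constant hidden by the Theta notation."""
    return c * W / math.sqrt(n * math.log(n))

# Each node's share of the channel shrinks as the network grows:
for n in (50, 100, 200):
    print(n, per_node_throughput(n))
```

This vanishing per-node share is the effect the thesis excerpts contrast with capacity-region analyses, where demand to each node is controlled individually rather than assumed equal.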

9,008 citations


"A framework for evaluating proposed..." refers background or methods in this paper

  • ...Gupta and Kumar [107], like Bianchi, use an analytic model considering a situation where all the transmitters in the network are required to transmit at the same bit-rate and further assumptions are made in relation to physical propagation....


  • ...been widely used to capture the impact of interference on network connectivity [76, 107, 108]....


  • ...The authors apply a Poisson bipolar model [16] which inherits the simple interference structure of the classic Gupta and Kumar model [107]....


  • ...Key to the methodology for characterising the capacity region in this paper is that demand to each node is controlled individually, unlike the works of Bianchi [29] and Gupta and Kumar [107] which assume demand for each node to be equal (for more discussion in relation to the capacity region see Sec....
