Author

Guokai Zeng

Bio: Guokai Zeng is an academic researcher from Michigan State University. The author has contributed to research in topics: Multicast & Pragmatic General Multicast. The author has an h-index of 7 and has co-authored 10 publications receiving 415 citations.

Papers
Journal ArticleDOI
Guokai Zeng, Bo Wang, Yong Ding, Li Xiao, Matt W. Mutka
TL;DR: Simulations show that these algorithms greatly outperform the single-channel multicast algorithm, and that MCM achieves better throughput and shorter delay while LCA can be realized in a distributed manner.
Abstract: The wireless mesh network is an emerging technology that provides high-quality service to end users as the "last mile" of the Internet. Furthermore, multicast communication is a key technology for wireless mesh networks. Multicast provides efficient data distribution among a group of nodes. However, unlike other wireless networks, such as sensor networks and MANETs, where multicast algorithms are designed to be energy efficient and to achieve optimal route discovery among mobile nodes, wireless mesh networks need to maximize throughput. This paper proposes two multicast algorithms, the level channel assignment (LCA) algorithm and the multichannel multicast (MCM) algorithm, to improve the throughput for multichannel and multi-interface mesh networks. The algorithms build efficient multicast trees by minimizing the number of relay nodes and the total hop count distances of the trees. The algorithms use dedicated channel assignment strategies to reduce the interference and improve the network capacity. We also demonstrate that using partially overlapping channels can further diminish the interference. Furthermore, additional interfaces help to increase the bandwidth, and multiple gateways can further shorten the total hop count distance. Simulations show that these algorithms greatly outperform the single-channel multicast algorithm. We also observe that MCM achieves better throughput and shorter delay while LCA can be realized in a distributed manner.
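
As an illustration of the level-based idea described above, here is a minimal sketch, assuming a BFS tree rooted at the multicast source and a channel index chosen as the node's level modulo the number of channels; the actual LCA and MCM algorithms also minimize relay nodes and apply interference-aware assignment, which this sketch does not reproduce.

```python
from collections import deque

def bfs_levels(adj, source):
    """Compute BFS levels (hop counts) from the multicast source."""
    level = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in level:
                level[v] = level[u] + 1
                queue.append(v)
    return level

def level_channel_assignment(adj, source, num_channels):
    """Assign each node a receive channel from its BFS level.

    Hypothetical simplification of the LCA idea: nodes at level i
    receive on channel i mod num_channels, so adjacent tree levels
    use different channels and parallel transmissions interfere less.
    """
    level = bfs_levels(adj, source)
    return {node: lvl % num_channels for node, lvl in level.items()}

# Toy topology: source 0 and a small mesh.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(level_channel_assignment(adj, source=0, num_channels=3))
```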

127 citations

Proceedings ArticleDOI
Guokai Zeng, Bo Wang, Yong Ding, Li Xiao, Matt W. Mutka
05 Nov 2007
TL;DR: This work proposes a level channel assignment (LCA) algorithm and a multi-channel multicast (MCM) algorithm to optimize throughput for multi-channel and multi-interface mesh networks, and observes that MCM achieves better throughput and shorter delay while LCA can be realized in a distributed manner.
Abstract: Multicast is a key technology that provides efficient data communication among a set of nodes for wireless multi-hop networks. In sensor networks and MANETs, multicast algorithms are designed to be energy efficient and to achieve optimal route discovery among mobile nodes, respectively. However, in wireless mesh networks, which are required to provide high-quality service to end users as the "last mile" of the Internet, maximizing throughput under scarce bandwidth is the paramount priority. We propose a level channel assignment (LCA) algorithm and a multi-channel multicast (MCM) algorithm to optimize throughput for multi-channel and multi-interface mesh networks. The algorithms first build a multicast structure by minimizing the number of relay nodes and the hop count distances between the source and destinations, and then use dedicated channel assignment strategies to improve the network capacity by reducing interference. We also illustrate that the use of partially overlapping channels can further improve the throughput. Simulations show that our algorithms greatly outperform the single-channel multicast algorithm. We observe that MCM achieves better throughput and shorter delay while LCA can be realized in a distributed manner.

121 citations

Journal ArticleDOI
TL;DR: It is demonstrated that the overall network throughput can be dramatically improved by properly utilizing the partially overlapping channels, and that the genetic algorithm outperforms the greedy algorithm in mitigating network interference and therefore leads to higher network throughput.
Abstract: Wireless mesh networks have attracted great interest in the research community recently. Much effort has been devoted to maximizing the network performance using limited channel resources in a multichannel multiradio wireless mesh network. It is believed that the limited spectrum resource can be fully exploited by utilizing partially overlapping channels in addition to nonoverlapping channels in 802.11b/g networks. However, there are only a few studies of channel assignment algorithms for partially overlapping channels. In this paper, we first formulate the optimal channel assignment problem with the goal of maximizing the overall network throughput or maximizing the throughput of a multicast application. For both cases, we present a greedy algorithm for partially overlapping channel assignment, and then propose a novel genetic algorithm, which has the potential to obtain better solutions. Through evaluation, we demonstrate that the overall network throughput can be dramatically improved by properly utilizing the partially overlapping channels. The genetic algorithm outperforms the greedy algorithm in mitigating the network interference, and therefore leads to higher network throughput. In addition, the multicast throughput can also be dramatically improved by using our algorithms compared to previous work.
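
As a rough illustration of the genetic-algorithm approach mentioned above, the sketch below evolves channel assignments that minimize a weighted interference sum over conflicting link pairs. The channel-separation overlap factors and the GA parameters are illustrative assumptions, not the paper's measured model.

```python
import random

# Assumed residual-interference factor vs. channel separation for 802.11b/g;
# the paper derives its own weights, these numbers are illustrative only.
OVERLAP = {0: 1.0, 1: 0.7, 2: 0.5, 3: 0.3, 4: 0.1}  # separation >= 5: 0

def interference(assign, conflicts):
    """Total interference over conflicting link pairs (i, j, weight)."""
    total = 0.0
    for i, j, w in conflicts:
        sep = abs(assign[i] - assign[j])
        total += w * OVERLAP.get(sep, 0.0)
    return total

def genetic_assignment(n_links, conflicts, channels=range(1, 12),
                       pop_size=40, generations=200, mutation=0.1):
    """Evolve channel assignments that minimize total interference."""
    channels = list(channels)
    pop = [[random.choice(channels) for _ in range(n_links)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: interference(a, conflicts))
        survivors = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(survivors):
            p1, p2 = random.sample(survivors, 2)
            cut = random.randrange(1, n_links)
            child = p1[:cut] + p2[cut:]             # one-point crossover
            if random.random() < mutation:           # random mutation
                child[random.randrange(n_links)] = random.choice(channels)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda a: interference(a, conflicts))

# Three links; link pairs (0, 1) and (1, 2) conflict with unit weight.
conflicts = [(0, 1, 1.0), (1, 2, 1.0)]
best = genetic_assignment(3, conflicts)
print(best, interference(best, conflicts))
```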

90 citations

Proceedings ArticleDOI
17 Nov 2008
TL;DR: An extension of the traditional conflict graph model, the weighted conflict graph, is proposed to model the interference between wireless links more accurately; evaluation demonstrates that the network performance can be dramatically improved by properly utilizing the partially overlapping channels.
Abstract: Many efforts have been devoted to maximizing the network throughput with limited channel resources in multi-radio multi-channel wireless mesh networks. It has been believed that the limited spectrum resource can be fully exploited by utilizing partially overlapping channels in addition to non-overlapping channels in 802.11b/g networks. However, there are only a few studies of channel assignment algorithms for partially overlapping channels. In this paper, an extension of the traditional conflict graph model, the weighted conflict graph, is proposed to model the interference between wireless links more accurately. Based on this model, we first present a greedy algorithm for partially overlapping channel assignment, and then propose a novel genetic algorithm, which has the potential to obtain better solutions. Through evaluation, we demonstrate that the network performance can be dramatically improved by properly utilizing the partially overlapping channels. In addition, the genetic algorithm outperforms the greedy algorithm in mitigating the interference within the network and therefore leads to higher network throughput.
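
A minimal sketch of a weighted-conflict-graph style model, under assumed geometry and an assumed channel-separation weight function (the paper derives its own, more accurate weights): vertices are links, edges join conflicting link pairs, and the residual interference of a channel assignment is summed from the channel separations.

```python
import itertools
import math

def overlap_factor(separation):
    """Assumed residual-interference factor vs. channel separation."""
    return max(0.0, 1.0 - 0.25 * separation)

def links_conflict(link_a, link_b, pos, interference_range):
    """Two links conflict if any endpoint of one lies within the
    interference range of any endpoint of the other."""
    return any(math.dist(pos[u], pos[v]) <= interference_range
               for u in link_a for v in link_b)

def weighted_conflict_graph(links, pos, interference_range):
    """Return the conflicting link pairs; each pair's weight is later
    computed from the channel separation of the chosen assignment."""
    return [(i, j) for (i, a), (j, b)
            in itertools.combinations(enumerate(links), 2)
            if links_conflict(a, b, pos, interference_range)]

def total_interference(conflict_edges, assignment):
    """Sum residual interference under a channel assignment."""
    return sum(overlap_factor(abs(assignment[i] - assignment[j]))
               for i, j in conflict_edges)

# Toy example: four nodes on a line, three links, assumed parameters.
pos = {0: (0, 0), 1: (10, 0), 2: (20, 0), 3: (30, 0)}
links = [(0, 1), (1, 2), (2, 3)]
edges = weighted_conflict_graph(links, pos, interference_range=15)
print(edges, total_interference(edges, assignment=[1, 6, 11]))
```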

45 citations

Journal ArticleDOI
Guokai Zeng, Bo Wang, Matt W. Mutka, Li Xiao, Eric Torng
TL;DR: Simulations show that the algorithm significantly outperforms the current best WMN multicast algorithm by both increasing throughput and reducing delay; to improve the effectiveness of the algorithm, a dedicated channel assignment algorithm is designed.
Abstract: A wireless mesh network (WMN) is a type of communication network made up of wireless devices and organized in a mesh topology. Multicast is a fundamental service in WMNs because it efficiently distributes data among a group of nodes. Multicast algorithms in WMNs are designed to maximize system throughput and minimize delay in order to satisfy the end users' requirements. Previous work has unrealistically assumed that the underlying WMN is link-homogeneous. We consider one important form of link heterogeneity: different link loss ratios, or equivalently different ETX. Unlike other work addressing multicast in wireless networks, we point out that the local broadcast quality relies on the worst involved link. We model different link loss ratios by defining a new graph theory problem, Heterogeneous Weighted Steiner Connected Dominating Set (HW-SCDS), on an edge-weighted directed graph, where the edge weights model ETX, the reciprocal of the link loss ratios. We minimize the number of transmissions in a multicast by computing a minimum HW-SCDS in the edge-weighted graph. We prove that HW-SCDS is NP-hard and devise a greedy algorithm for it. To improve the effectiveness of our algorithm, we design a dedicated channel assignment algorithm. Simulations show that our algorithm significantly outperforms the current best WMN multicast algorithm by both increasing throughput and reducing delay.
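
A small sketch in the spirit of the greedy HW-SCDS heuristic described above: the cost of a local broadcast is taken as the worst (largest) ETX among the links it uses, and forwarders are picked greedily by newly covered receivers per unit cost. The coverage sets, ETX values, and the exact greedy rule are illustrative assumptions, not the paper's algorithm.

```python
def greedy_forwarders(coverage, etx, receivers):
    """Greedy forwarder selection inspired by the HW-SCDS idea.

    coverage[u] : set of receivers node u can reach in one broadcast
    etx[(u, v)] : expected transmission count of link u -> v
    A broadcast by u costs the worst ETX among its covered links,
    since the local broadcast quality depends on the worst link.
    """
    uncovered = set(receivers)
    chosen = []
    while uncovered:
        best, best_ratio = None, 0.0
        for u, covered in coverage.items():
            new = covered & uncovered
            if not new:
                continue
            cost = max(etx[(u, v)] for v in new)   # worst involved link
            ratio = len(new) / cost
            if ratio > best_ratio:
                best, best_ratio = u, ratio
        if best is None:
            raise ValueError("some receivers are unreachable")
        chosen.append(best)
        uncovered -= coverage[best]
    return chosen

# Toy instance with assumed coverage sets and ETX values.
coverage = {"a": {"r1", "r2"}, "b": {"r2", "r3"}}
etx = {("a", "r1"): 1.2, ("a", "r2"): 2.0, ("b", "r2"): 1.1, ("b", "r3"): 1.3}
print(greedy_forwarders(coverage, etx, ["r1", "r2", "r3"]))
```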

10 citations


Cited by
Journal Article
TL;DR: In this article, Stann et al. present RMST (Reliable Multi-Segment Transport), a new transport layer for Directed Diffusion, which provides guaranteed delivery and fragmentation/reassembly for applications that require them.
Abstract: Appearing in the 1st IEEE International Workshop on Sensor Net Protocols and Applications (SNPA), Anchorage, Alaska, USA, May 11, 2003. RMST: Reliable Data Transport in Sensor Networks, Fred Stann and John Heidemann. Reliable data transport in wireless sensor networks is a multifaceted problem influenced by the physical, MAC, network, and transport layers. Because sensor networks are subject to strict resource constraints and are deployed by single organizations, they encourage revisiting traditional layering and are less bound by standardized placement of services such as reliability. This paper presents analysis and experiments resulting in specific recommendations for implementing reliable data transport in sensor nets. To explore reliability at the transport layer, we present RMST (Reliable Multi-Segment Transport), a new transport layer for Directed Diffusion. RMST provides guaranteed delivery and fragmentation/reassembly for applications that require them. RMST is a selective NACK-based protocol that can be configured for in-network caching and repair. The main contribution is an evaluation of the placement of reliability for data transport at different levels of the protocol stack: reliability in the MAC layer, the transport layer, the application layer, and combinations of these are considered, and the conclusion is that reliability is important at both the MAC layer and the transport layer. MAC-level reliability matters not just for hop-by-hop error recovery on behalf of the transport layer, but also because it is needed for route discovery and maintenance. RMST benefits from diffusion routing, adds minimal additional control traffic, and guarantees delivery even when multiple hops exhibit very high error rates.
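
To make the selective-NACK idea behind RMST concrete, here is a toy receiver-side sketch under assumed fragment numbering; the real protocol runs over Directed Diffusion with in-network caching and repair, none of which is modeled here.

```python
class RmstReceiverSketch:
    """Toy receiver-side reassembly with selective NACKs.

    Fragments of one data entity are numbered 0..total-1; the receiver
    records what arrived and, when asked, names exactly the holes so the
    sender (or a caching intermediate node) can retransmit only those.
    """

    def __init__(self, total_fragments):
        self.total = total_fragments
        self.received = {}

    def on_fragment(self, seq, payload):
        self.received[seq] = payload

    def missing(self):
        """Selective NACK payload: the fragment numbers still missing."""
        return [s for s in range(self.total) if s not in self.received]

    def reassemble(self):
        if self.missing():
            raise ValueError("entity incomplete, NACK required")
        return b"".join(self.received[s] for s in range(self.total))

rx = RmstReceiverSketch(total_fragments=4)
for seq in (0, 2, 3):                      # fragment 1 is lost
    rx.on_fragment(seq, bytes([seq]))
print(rx.missing())                        # -> [1], sent as a NACK
rx.on_fragment(1, bytes([1]))              # repair arrives
print(rx.reassemble())
```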

650 citations

Journal ArticleDOI
TL;DR: This paper provides a taxonomy for opportunistic routing proposals, based on their routing objectives as well as the optimization tools and approaches used in the routing design, and identifies and discusses the main future research directions related to opportunistic routing design, optimization, and deployment.
Abstract: The great advances made in the wireless technology have enabled the deployment of wireless communication networks in some of the harshest environments such as volcanoes, hurricane-affected regions, and underground mines. In such challenging environments suffering from the lack of infrastructure, traditional routing is not efficient and sometimes not even feasible. Moreover, the exponential growth of the number of wireless connected devices has created the need for a new routing paradigm that could benefit from the potentials offered by these heterogeneous wireless devices. Hence, in order to overcome the traditional routing limitations, and to increase the capacity of current dynamic heterogeneous wireless networks, the opportunistic routing paradigm has been proposed and developed in recent research works. Motivated by the great interest that has been attributed to this new paradigm within the last decade, we provide a comprehensive survey of the existing literature related to opportunistic routing. We first study the main design building blocks of opportunistic routing. Then, we provide a taxonomy for opportunistic routing proposals, based on their routing objectives as well as the optimization tools and approaches used in the routing design. Hence, five opportunistic routing classes are defined and studied in this paper, namely, geographic opportunistic routing, link-state-aware opportunistic routing, probabilistic opportunistic routing, optimization-based opportunistic routing, and cross-layer opportunistic routing. We also review the main protocols proposed in the literature for each class. Finally, we identify and discuss the main future research directions related to the opportunistic routing design, optimization, and deployment.
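
Of the five classes above, geographic opportunistic routing is the easiest to illustrate. The sketch below is an assumed, simplified candidate-set selection (not a protocol from the survey): neighbors that make positive geographic progress toward the destination are kept and prioritized by that progress.

```python
import math

def candidate_set(node, neighbors, dest, positions, max_candidates=3):
    """Geographic opportunistic forwarding, simplified: keep neighbors
    that are closer to the destination than the sender, ranked by how
    much progress they make; the highest-priority neighbor that actually
    receives the broadcast would forward the packet."""
    d_self = math.dist(positions[node], positions[dest])
    progress = {
        n: d_self - math.dist(positions[n], positions[dest])
        for n in neighbors
    }
    ranked = sorted((p, n) for n, p in progress.items() if p > 0)
    return [n for _, n in reversed(ranked)][:max_candidates]

# Toy placement: source "s", destination "d", three neighbors.
positions = {"s": (0, 0), "a": (3, 1), "b": (5, -1), "c": (-2, 0), "d": (10, 0)}
print(candidate_set("s", ["a", "b", "c"], "d", positions))
```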

229 citations

Dissertation
15 Dec 2010
TL;DR: A new quality link metric, Interference and Bandwidth Adjusted ETX (IBETX), is proposed and implemented, keeping in view the issues of WMhNs, the increasing demands of users, and the capabilities of routing protocols; it reduces average end-to-end delay to up to 16% less than Expected Link Performance (ELP) and 24% less than ETX.
Abstract: This dissertation endeavors to contribute enhancements in the goodput of IEEE 802.11-based Wireless Multi-hop Networks (WMhNs). Through exhaustive simulations providing a deep analysis and detailed assessment of both reactive (AODV, DSR, DYMO) and proactive (DSDV, FSR, OLSR) protocols under varying mobilities, speeds, network loads, and scalabilities, it is observed that the routing link metric is a significant component of a routing protocol: in addition to finding all available paths, the link metric selects the fastest end-to-end route for the routing protocol. This study aims at quality routing. In the class of quality link metrics, Expected Transmission Count (ETX) is extensively used, so the most recently proposed ETX-based metrics have been analyzed; although newly developed metrics outperform ETX, they can still be improved. Through thorough analysis and detailed comparison of routing protocols by class (reactive and proactive) and of ETX-based metrics, we observe that users always demand proficient networks. WMhNs face several issues that they expect the routing protocol operating them to resolve, and the protocol in turn depends upon the link metric to provide quality paths. We therefore identify and analyze the requirements for designing a new routing link metric for WMhNs. When a link metric is proposed with these requirements in mind: firstly, both the design and the implementation of the link metric with a routing protocol become easy; secondly, the underlying network issues can easily be tackled; thirdly, an appreciable network performance is guaranteed. Keeping in view the issues of WMhNs, the increasing demands of users, and the capabilities of routing protocols, we propose and implement a new quality link metric, Interference and Bandwidth Adjusted ETX (IBETX). Since the MAC layer affects link performance and consequently route quality, the metric tackles the issue by achieving twofold MAC-awareness: firstly, interference is calculated using a cross-layer approach by sending probes to the MAC layer; secondly, nominal bit rate information is provided to all nodes in the same contention domain by considering the bandwidth-sharing mechanism of 802.11. Like ETX, our metric also calculates link delivery ratios, which directly affect throughput, and selects routes that bypass dense regions in the network. Simulation results in NS-2 show that IBETX gives 19% higher throughput than ETX and 10% higher than Expected Throughput (ETP). Our metric also reduces average end-to-end delay to up to 16% less than Expected Link Performance (ELP) and 24% less than ETX.
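
Since IBETX builds on ETX, a short sketch of the standard ETX computation from probe delivery ratios may help; the interference/bandwidth adjustment shown is only a hypothetical placeholder, as the exact IBETX formula is not reproduced here.

```python
def etx(forward_delivery, reverse_delivery):
    """Standard ETX: expected transmissions for a successful forward
    data frame plus its reverse ACK."""
    if forward_delivery <= 0 or reverse_delivery <= 0:
        return float("inf")
    return 1.0 / (forward_delivery * reverse_delivery)

def ibetx_like(forward_delivery, reverse_delivery,
               contending_nodes, nominal_bitrate_mbps):
    """Hypothetical IBETX-flavoured adjustment (NOT the dissertation's
    formula): scale ETX by the number of nodes sharing the contention
    domain and by the inverse of the nominal bit rate, so links in
    dense, slow regions look more expensive."""
    return etx(forward_delivery, reverse_delivery) * (
        contending_nodes / nominal_bitrate_mbps
    )

# Two candidate links: a clean but crowded link vs. a lossier, quiet one.
print(etx(0.9, 0.9), ibetx_like(0.9, 0.9, contending_nodes=8,
                                nominal_bitrate_mbps=11))
print(etx(0.7, 0.8), ibetx_like(0.7, 0.8, contending_nodes=2,
                                nominal_bitrate_mbps=11))
```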

133 citations

Journal ArticleDOI
Yuhua Xu, Qihui Wu, Liang Shen, Jinlong Wang, Alagan Anpalagan
TL;DR: This article investigates the problem of distributed channel selection for opportunistic spectrum access systems, where multiple cognitive radio users are spatially located and mutual interference only emerges between neighboring users, and proposes two uncoupled learning algorithms, with which the CR users intelligently learn the desirable actions from their individual action-utility history.
Abstract: This article investigates the problem of distributed channel selection for opportunistic spectrum access systems, where multiple cognitive radio (CR) users are spatially located and mutual interference only emerges between neighboring users. In addition, there is no information exchange among CR users. We first propose a MAC-layer interference minimization game, in which the utility of a player is defined as a function of the number of neighbors competing for the same channel. We prove that the game is a potential game with the optimal Nash equilibrium (NE) point minimizing the aggregate MAC-layer interference. Although this result is promising, it is challenging to achieve an NE point without information exchange, not to mention the optimal one. The reason is that traditional algorithms belong to coupled algorithms, which need information from other users during the convergence towards NE solutions. We propose two uncoupled learning algorithms, with which the CR users intelligently learn the desirable actions from their individual action-utility history. Specifically, the first algorithm asymptotically minimizes the aggregate MAC-layer interference and needs a common control channel to assist learning scheduling, and the second one does not need a control channel and achieves suboptimal solutions on average.
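
The game defined above (utility driven by the number of neighbors competing for the same channel) is easy to prototype. The sketch below uses asynchronous best-response updates as a stand-in for the article's uncoupled learning algorithms, which require no information exchange and are not reproduced here.

```python
import random

def mac_interference(node, assign, neighbors):
    """Number of neighbors competing for the node's channel."""
    return sum(assign[n] == assign[node] for n in neighbors[node])

def best_response_dynamics(neighbors, channels, rounds=100, seed=0):
    """Asynchronous best-response play of the MAC-layer interference
    minimization game: each randomly chosen player switches to a channel
    minimizing its own interference. Because the game is a potential
    game, this converges to a Nash equilibrium; it is a simplification,
    not the article's uncoupled learning algorithms."""
    rng = random.Random(seed)
    nodes = list(neighbors)
    assign = {u: rng.choice(channels) for u in nodes}
    for _ in range(rounds):
        u = rng.choice(nodes)
        assign[u] = min(channels, key=lambda c: sum(
            assign[n] == c for n in neighbors[u]))
    return assign

# 5 CR users on a line topology, 2 available channels.
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
assign = best_response_dynamics(neighbors, channels=[1, 2])
print(assign, sum(mac_interference(u, assign, neighbors) for u in neighbors))
```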

95 citations
