
Showing papers by "Georgia Institute of Technology published in 2008"


Journal ArticleDOI
TL;DR: How the unique tunability of the plasmon resonance properties of metal nanoparticles through variation of their size, shape, composition, and medium allows chemists to design nanostructures geared for specific bio-applications is emphasized.
Abstract: Noble metal nanostructures attract much interest because of their unique properties, including large optical field enhancements resulting in the strong scattering and absorption of light. The enhancement in the optical and photothermal properties of noble metal nanoparticles arises from resonant oscillation of their free electrons in the presence of light, also known as localized surface plasmon resonance (LSPR). The plasmon resonance can either radiate light (Mie scattering), a process that finds great utility in optical and imaging fields, or be rapidly converted to heat (absorption); the latter mechanism of dissipation has opened up applications in several new areas. The ability to integrate metal nanoparticles into biological systems has had the greatest impact in biology and biomedicine. In this Account, we discuss the plasmonic properties of gold and silver nanostructures and present examples of how they are being utilized for biodiagnostics, biophysical studies, and medical therapy.

3,617 citations


Journal ArticleDOI
TL;DR: Third-generation delivery systems target their effects to skin's barrier layer of stratum corneum using microneedles, thermal ablation, microdermabrasion, electroporation and cavitational ultrasound for delivery of macromolecules and vaccines.
Abstract: Transdermal drug delivery has made an important contribution to medical practice, but has yet to fully achieve its potential as an alternative to oral delivery and hypodermic injections. First-generation transdermal delivery systems have continued their steady increase in clinical use for delivery of small, lipophilic, low-dose drugs. Second-generation delivery systems using chemical enhancers, noncavitational ultrasound and iontophoresis have also resulted in clinical products; the ability of iontophoresis to control delivery rates in real time provides added functionality. Third-generation delivery systems target their effects to skin's barrier layer of stratum corneum using microneedles, thermal ablation, microdermabrasion, electroporation and cavitational ultrasound. Microneedles and thermal ablation are currently progressing through clinical trials for delivery of macromolecules and vaccines, such as insulin, parathyroid hormone and influenza vaccine. Using these novel second- and third-generation enhancement strategies, transdermal delivery is poised to significantly increase its impact on medicine.

2,595 citations


Journal ArticleDOI
TL;DR: The authors demonstrate that a properly modified SA approach can be competitive with, and even significantly outperform, the SAA method for a certain class of convex stochastic problems.
Abstract: In this paper we consider optimization problems where the objective function is given in a form of the expectation. A basic difficulty of solving such stochastic optimization problems is that the involved multidimensional integrals (expectations) cannot be computed with high accuracy. The aim of this paper is to compare two computational approaches based on Monte Carlo sampling techniques, namely, the stochastic approximation (SA) and the sample average approximation (SAA) methods. Both approaches, the SA and SAA methods, have a long history. Current opinion is that the SAA method can efficiently use a specific (say, linear) structure of the considered problem, while the SA approach is a crude subgradient method, which often performs poorly in practice. We intend to demonstrate that a properly modified SA approach can be competitive and even significantly outperform the SAA method for a certain class of convex stochastic problems. We extend the analysis to the case of convex-concave stochastic saddle point problems and present (in our opinion highly encouraging) results of numerical experiments.
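The averaged stochastic approximation scheme the abstract advocates can be sketched in a few lines. The toy objective, step-size constant, and iteration budget below are illustrative choices, not values from the paper:

```python
import math
import random

def robust_sa(sample, grad, x0, steps=20000, c=0.5):
    """Robust stochastic approximation: stochastic subgradient steps
    with 'long' step sizes c/sqrt(t), returning the averaged iterate."""
    x, avg = x0, 0.0
    for t in range(1, steps + 1):
        xi = sample()                        # draw one sample of the randomness
        x -= c / math.sqrt(t) * grad(x, xi)  # stochastic subgradient step
        avg += (x - avg) / t                 # running average of the iterates
    return avg

# Toy problem: minimize f(x) = E[(x - xi)^2] with xi ~ N(2, 1);
# the minimizer is x* = E[xi] = 2.
random.seed(0)
x_star = robust_sa(sample=lambda: random.gauss(2.0, 1.0),
                   grad=lambda x, xi: 2.0 * (x - xi),
                   x0=0.0)
```

Because f(x) = E[(x - xi)^2] is minimized at the mean of xi, the averaged iterate should land near 2 despite the noisy gradients.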

2,346 citations


Journal ArticleDOI
TL;DR: This paper presents a detailed overview of the basic concepts of PSO and its variants, and provides a comprehensive survey on the power system applications that have benefited from the powerful nature of PSO as an optimization technique.
Abstract: Many areas in power systems require solving one or more nonlinear optimization problems. While analytical methods might suffer from slow convergence and the curse of dimensionality, heuristics-based swarm intelligence can be an efficient alternative. Particle swarm optimization (PSO), part of the swarm intelligence family, is known to effectively solve large-scale nonlinear optimization problems. This paper presents a detailed overview of the basic concepts of PSO and its variants. Also, it provides a comprehensive survey on the power system applications that have benefited from the powerful nature of PSO as an optimization technique. For each application, technical details that are required for applying PSO, such as its type, particle formulation (solution representation), and the most efficient fitness functions are also discussed.
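A minimal global-best PSO of the kind the survey covers can be sketched as follows. The inertia weight w and acceleration coefficients c1, c2 are conventional defaults, and the quadratic test function is a stand-in for a real power-system fitness function:

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO minimizing f over a box."""
    lo, hi = bounds
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]            # personal best positions
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]    # global best position/value
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            val = f(xs[i])
            if val < pval[i]:
                pval[i], pbest[i] = val, list(xs[i])
                if val < gval:
                    gval, gbest = val, list(xs[i])
    return gbest, gval

# Toy quadratic with its minimum at (1, -2).
random.seed(1)
best, val = pso(lambda x: (x[0] - 1) ** 2 + (x[1] + 2) ** 2, dim=2)
```

In a power-system setting the particle would encode the decision variables (e.g. generator set-points) and f would be the dispatch or planning objective discussed in the survey.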

2,147 citations


Journal ArticleDOI
TL;DR: The development of the PPTT method is discussed with special emphasis on the recent in vitro and in vivo success using gold nanospheres coupled with visible lasers and gold nanorods and silica–gold nanoshells coupled with near-infrared lasers.
Abstract: The use of lasers, over the past few decades, has emerged to be highly promising for cancer therapy modalities, most commonly the photothermal therapy method, which employs light absorbing dyes for achieving the photothermal damage of tumors, and the photodynamic therapy, which employs chemical photosensitizers that generate singlet oxygen that is capable of tumor destruction. However, recent advances in the field of nanoscience have seen the emergence of noble metal nanostructures with unique photophysical properties, well suited for applications in cancer phototherapy. Noble metal nanoparticles, on account of the phenomenon of surface plasmon resonance, possess strongly enhanced visible and near-infrared light absorption, several orders of magnitude more intense compared to conventional laser phototherapy agents. The use of plasmonic nanoparticles as highly enhanced photoabsorbing agents has thus introduced a much more selective and efficient cancer therapy strategy, viz. plasmonic photothermal therapy (PPTT). The synthetic tunability of the optothermal properties and the bio-targeting abilities of the plasmonic gold nanostructures make the PPTT method all the more promising. In this review, we discuss the development of the PPTT method with special emphasis on the recent in vitro and in vivo success using gold nanospheres coupled with visible lasers and gold nanorods and silica-gold nanoshells coupled with near-infrared lasers.

2,024 citations


PatentDOI
TL;DR: Nanostructures, methods of preparing nanostructures, methods of detecting targets in subjects, and methods of treating diseases in subjects are described. An embodiment, among others, includes a metallic gold surface-enhanced Raman scattering nanoparticle, a Raman reporter, and a protection structure.
Abstract: Nanostructures, methods of preparing nanostructures, methods of detecting targets in subjects, and methods of treating diseases in subjects are disclosed. An embodiment, among others, of the nanostructure includes a metallic gold surface-enhanced Raman scattering nanoparticle, a Raman reporter, and a protection structure. The protection structure may include a thiol-polyethylene glycol to which may be attached a target-specific probe.

1,938 citations


Journal ArticleDOI
TL;DR: This article reviews the progress made in CO2 separation and capture research and engineering; various technologies, such as absorption, adsorption, and membrane separation, are thoroughly discussed.
Abstract: This article reviews the progress made in CO2 separation and capture research and engineering. Various technologies, such as absorption, adsorption, and membrane separation, are thoroughly discussed. New concepts such as chemical-looping combustion and hydrate-based separation are also introduced briefly. Future directions are suggested. Sequestration methods, such as forestation, ocean fertilization and mineral carbonation techniques are also covered. Underground injection and direct ocean dump are not covered.

1,899 citations


Journal ArticleDOI
TL;DR: A practical secure communication protocol is developed, which uses a four-step procedure to ensure wireless information-theoretic security, and it is shown that the protocol is effective in secure key renewal, even in the presence of imperfect channel state information.
Abstract: This paper considers the transmission of confidential data over wireless channels. Based on an information-theoretic formulation of the problem, in which two legitimate partners communicate over a quasi-static fading channel and an eavesdropper observes their transmissions through a second independent quasi-static fading channel, the important role of fading is characterized in terms of average secure communication rates and outage probability. Based on the insights from this analysis, a practical secure communication protocol is developed, which uses a four-step procedure to ensure wireless information-theoretic security: (i) common randomness via opportunistic transmission, (ii) message reconciliation, (iii) common key generation via privacy amplification, and (iv) message protection with a secret key. A reconciliation procedure based on multilevel coding and optimized low-density parity-check (LDPC) codes is introduced, which makes it possible to achieve communication rates close to the fundamental security limits in several relevant instances. Finally, a set of metrics for assessing average secure key generation rates is established, and it is shown that the protocol is effective in secure key renewal, even in the presence of imperfect channel state information.
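The outage formulation compares the instantaneous capacities of the main and eavesdropper channels. A hedged sketch, assuming the standard Gaussian wiretap expression and unit-mean exponential (Rayleigh-fading) channel gains; the average SNRs, target rate, and trial count are arbitrary illustrative values:

```python
import math
import random

def secrecy_capacity(snr_main, snr_eve):
    """Instantaneous secrecy capacity of the Gaussian wiretap channel
    (bits per channel use): the positive part of the capacity gap."""
    return max(0.0, math.log2(1.0 + snr_main) - math.log2(1.0 + snr_eve))

def secrecy_outage(avg_snr_m, avg_snr_e, target_rate, trials=200_000):
    """Monte Carlo estimate of the secrecy outage probability under
    quasi-static Rayleigh fading (exponential instantaneous SNRs)."""
    outages = 0
    for _ in range(trials):
        snr_m = random.expovariate(1.0) * avg_snr_m   # main channel
        snr_e = random.expovariate(1.0) * avg_snr_e   # eavesdropper channel
        if secrecy_capacity(snr_m, snr_e) < target_rate:
            outages += 1
    return outages / trials

random.seed(0)
p_out = secrecy_outage(avg_snr_m=10.0, avg_snr_e=1.0, target_rate=1.0)
```

For these parameters the closed-form outage probability is 1 - (10/12)e^(-0.1) ≈ 0.25, which the Monte Carlo estimate should approach.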

1,759 citations


Journal ArticleDOI
TL;DR: Recent developments and open research issues in spectrum management in CR networks are presented and four main challenges of spectrum management are discussed: spectrum sensing, spectrum decision, spectrum sharing, and spectrum mobility.
Abstract: Cognitive radio networks will provide high bandwidth to mobile users via heterogeneous wireless architectures and dynamic spectrum access techniques. However, CR networks impose challenges due to the fluctuating nature of the available spectrum, as well as the diverse QoS requirements of various applications. Spectrum management functions can address these challenges for the realization of this new network paradigm. To provide a better understanding of CR networks, this article presents recent developments and open research issues in spectrum management in CR networks. More specifically, the discussion is focused on the development of CR networks that require no modification of existing networks. First, a brief overview of cognitive radio and the CR network architecture is provided. Then four main challenges of spectrum management are discussed: spectrum sensing, spectrum decision, spectrum sharing, and spectrum mobility.

1,722 citations


Journal ArticleDOI
13 Nov 2008-Nature
TL;DR: Analysis of molecular divergence compared with yeasts and metazoans reveals rapid rates of gene diversification in diatoms, and documents the presence of hundreds of genes from bacteria, likely to provide novel possibilities for metabolite management and for perception of environmental signals.
Abstract: Diatoms are photosynthetic secondary endosymbionts found throughout marine and freshwater environments, and are believed to be responsible for around one-fifth of the primary productivity on Earth(1,2). The genome sequence of the marine centric diatom Thalassiosira pseudonana was recently reported, revealing a wealth of information about diatom biology(3-5). Here we report the complete genome sequence of the pennate diatom Phaeodactylum tricornutum and compare it with that of T. pseudonana to clarify evolutionary origins, functional significance and ubiquity of these features throughout diatoms. Although the pennate and centric lineages have only been diverging for 90 million years, their genome structures are dramatically different and a substantial fraction of genes (approximately 40%) are not shared by these representatives of the two lineages. Analysis of molecular divergence compared with yeasts and metazoans reveals rapid rates of gene diversification in diatoms. Contributing factors include selective gene family expansions, differential losses and gains of genes and introns, and differential mobilization of transposable elements. Most significantly, we document the presence of hundreds of genes from bacteria. More than 300 of these gene transfers are found in both diatoms, attesting to their ancient origins, and many are likely to provide novel possibilities for metabolite management and for perception of environmental signals. These findings go a long way towards explaining the incredible diversity and success of the diatoms in contemporary oceans.

1,500 citations


Journal ArticleDOI
14 Feb 2008-Nature
TL;DR: This work establishes a methodology for scavenging light-wind energy and body-movement energy using fabrics and presents a simple, low-cost approach that converts low-frequency vibration/friction energy into electricity using piezoelectric zinc oxide nanowires grown radially around textile fibres.
Abstract: Nanodevices don't use much energy, and if the little they do need can be scavenged from vibrations associated with footsteps, heartbeats, noises and air flow, a whole range of applications in personal electronics, sensing and defence technologies opens up. Energy gathering of that type requires a technology that works at low frequency range (below 10 Hz), ideally based on soft, flexible materials. A group working at Georgia Institute of Technology has now come up with a system that converts low-frequency vibration/friction energy into electricity using piezoelectric zinc oxide nanowires grown radially around textile fibres. By entangling two fibres and brushing their associated nanowires together, mechanical energy is converted into electricity via a coupled piezoelectric-semiconductor process. This work shows a potential method for creating fabrics which scavenge energy from light winds and body movement. A self-powering nanosystem that harvests its operating energy from the environment is an attractive proposition for sensing, personal electronics and defence technologies1. This is in principle feasible for nanodevices owing to their extremely low power consumption2,3,4,5. Solar, thermal and mechanical (wind, friction, body movement) energies are common and may be scavenged from the environment, but the type of energy source to be chosen has to be decided on the basis of specific applications. Military sensing/surveillance node placement, for example, may involve difficult-to-reach locations, may need to be hidden, and may be in environments that are dusty, rainy, dark and/or in deep forest. In a moving vehicle or aeroplane, harvesting energy from a rotating tyre or wind blowing on the body is a possible choice to power wireless devices implanted in the surface of the vehicle. Nanowire nanogenerators built on hard substrates were demonstrated for harvesting local mechanical energy produced by high-frequency ultrasonic waves6,7.
To harvest the energy from vibration or disturbance originating from footsteps, heartbeats, ambient noise and air flow, it is important to explore innovative technologies that work at low frequencies (such as <10 Hz) and that are based on flexible soft materials. Here we present a simple, low-cost approach that converts low-frequency vibration/friction energy into electricity using piezoelectric zinc oxide nanowires grown radially around textile fibres. By entangling two fibres and brushing the nanowires rooted on them with respect to each other, mechanical energy is converted into electricity owing to a coupled piezoelectric–semiconductor process8,9. This work establishes a methodology for scavenging light-wind energy and body-movement energy using fabrics.

Journal ArticleDOI
TL;DR: A unified view is provided of the principles that underlie the stability of particles protected by thiolate or by phosphine and halide ligands; the exceptional stability is best described by a “noble-gas superatom” analogy.
Abstract: Synthesis, characterization, and functionalization of self-assembled, ligand-stabilized gold nanoparticles are long-standing issues in the chemistry of nanomaterials. Factors driving the thermodynamic stability of well documented discrete sizes are largely unknown. Herein, we provide a unified view of principles that underlie the stability of particles protected by thiolate (SR) or phosphine and halide (PR3, X) ligands. The picture has emerged from analysis of large-scale density functional theory calculations of structurally characterized compounds, namely Au102(SR)44, [Au39(PR3)14X6]−, Au11(PR3)7X3, and [Au13(PR3)10X2]3+, where X is either a halogen or a thiolate. Attributable to a compact, symmetric core and complete steric protection, each compound has a filled spherical electronic shell and a major energy gap to unoccupied states. Consequently, the exceptional stability is best described by a “noble-gas superatom” analogy. The explanatory power of this concept is shown by its application to many monomeric and oligomeric compounds of precisely known composition and structure, and its predictive power is indicated through suggestions offered for a series of anomalously stable cluster compositions which are still awaiting a precise structure determination.

Journal ArticleDOI
TL;DR: In this paper, the authors used a unique data set based on both chronologically compiled ratings as well as reviewer characteristics for a given set of products and geographical location-based purchasing behavior from Amazon, and provided evidence that community norms are an antecedent to reviewer disclosure of identity-descriptive information.
Abstract: Consumer-generated product reviews have proliferated online, driven by the notion that consumers' decision to purchase or not purchase a product is based on the positive or negative information about that product they obtain from fellow consumers. Using research on information processing as a foundation, we suggest that in the context of an online community, reviewer disclosure of identity-descriptive information is used by consumers to supplement or replace product information when making purchase decisions and evaluating the helpfulness of online reviews. Using a unique data set based on both chronologically compiled ratings as well as reviewer characteristics for a given set of products and geographical location-based purchasing behavior from Amazon, we provide evidence that community norms are an antecedent to reviewer disclosure of identity-descriptive information. Online community members rate reviews containing identity-descriptive information more positively, and the prevalence of reviewer disclosure of identity information is associated with increases in subsequent online product sales. In addition, we show that shared geographical location strengthens the relationship between disclosure and product sales, thus highlighting the important role of geography in electronic commerce. Taken together, our results suggest that identity-relevant information about reviewers shapes community members' judgment of products and reviews. Implications for research on the relationship between online word-of-mouth (WOM) and sales, peer recognition and reputation systems, and conformity to online community norms are discussed.

Journal ArticleDOI
TL;DR: The state-of-the-art in nano-machines, including architectural aspects, expected features of future nano-machines, and current developments, is presented for a better understanding of nanonetwork scenarios; nanonetwork features and components are explained and compared with traditional communication networks.

Proceedings Article
28 Jul 2008
TL;DR: This paper presents a general detection framework that is independent of botnet C&C protocol and structure, and requires no a priori knowledge of botnets (such as captured bot binaries and hence the botnet signatures, and C &C server names/addresses).
Abstract: Botnets are now the key platform for many Internet attacks, such as spam, distributed denial-of-service (DDoS), identity theft, and phishing. Most of the current botnet detection approaches work only on specific botnet command and control (C&C) protocols (e.g., IRC) and structures (e.g., centralized), and can become ineffective as botnets change their C&C techniques. In this paper, we present a general detection framework that is independent of botnet C&C protocol and structure, and requires no a priori knowledge of botnets (such as captured bot binaries and hence the botnet signatures, and C&C server names/addresses). We start from the definition and essential properties of botnets. We define a botnet as a coordinated group of malware instances that are controlled via C&C communication channels. The essential properties of a botnet are that the bots communicate with some C&C servers/peers, perform malicious activities, and do so in a similar or correlated way. Accordingly, our detection framework clusters similar communication traffic and similar malicious traffic, and performs cross cluster correlation to identify the hosts that share both similar communication patterns and similar malicious activity patterns. These hosts are thus bots in the monitored network. We have implemented our BotMiner prototype system and evaluated it using many real network traces. The results show that it can detect real-world botnets (IRC-based, HTTP-based, and P2P botnets including Nugache and Storm worm), and has a very low false positive rate.
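The cross-plane correlation step can be illustrated with a toy version: given clusters of hosts with similar communication patterns (the C-plane) and clusters with similar malicious activity (the A-plane), hosts that co-occur in clusters from both planes are flagged. The host names are hypothetical, and the real system uses scored correlation over traffic features rather than this simple set intersection:

```python
def cross_correlate(comm_clusters, activity_clusters, min_size=2):
    """Toy cross-plane correlation in the spirit of BotMiner: a host is
    flagged when it shares BOTH a communication cluster and an activity
    cluster with at least one other host."""
    suspected = set()
    for c in comm_clusters:
        for a in activity_clusters:
            overlap = set(c) & set(a)
            if len(overlap) >= min_size:     # correlated group, not a lone host
                suspected |= overlap
    return suspected

comm = [{"h1", "h2", "h3"}, {"h4", "h5"}]        # similar traffic patterns
activity = [{"h1", "h2"}, {"h3", "h6"}, {"h5"}]  # similar malicious activity
bots = cross_correlate(comm, activity)           # flags h1 and h2
```

Hosts h1 and h2 are flagged because they cluster together in both planes; h3 and h5 each appear in only one correlated grouping and so survive the intersection test.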

Reference EntryDOI
15 Mar 2008
TL;DR: General definitions and terminology for physisorption are given, together with methodology for the determination, evaluation, and classification of adsorption isotherms.
Abstract: The sections in this article are: Introduction; General Definitions and Terminology; Methodology; Methods for the Determination of Adsorption Isotherms; Operational Definitions of Adsorption; Experimental Procedures; Outgassing the Adsorbent; Determination of the Adsorption Isotherm; Evaluation of Adsorption Data; Presentation of Primary Data; Classification of Adsorption Isotherms; Adsorption Hysteresis; Determination of Surface Area; Application of the BET Method; Empirical Procedures for Isotherm Analysis; Assessment of Mesoporosity; Properties of Porous Materials; Application of the Kelvin Equation; Computation of Mesopore Size Distribution; Assessment of Microporosity; Terminology; Concept of Surface Area; Assessment of Micropore Volume; General Conclusions and Recommendations. Keywords: physisorption data; IUPAC; adsorption isotherms; surface area; BET isotherm
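The BET analysis listed among the sections reduces to a linear fit once the isotherm is transformed: plotting x/(v(1-x)) against x = p/p0 gives a line whose slope and intercept yield the monolayer capacity and BET constant. A sketch with synthetic data generated from the BET equation itself; the values v_m = 10 and c = 100 are invented for illustration:

```python
def bet_fit(x, v):
    """Linearized BET analysis: fit y = x/(v(1-x)) against x = p/p0,
    then recover the monolayer capacity v_m and BET constant c from
    v_m = 1/(slope + intercept) and c = 1 + slope/intercept."""
    y = [xi / (vi * (1.0 - xi)) for xi, vi in zip(x, v)]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    v_m = 1.0 / (slope + intercept)
    c = 1.0 + slope / intercept
    return v_m, c

# Synthetic isotherm from the BET equation with v_m = 10, c = 100,
# restricted to the usual 0.05-0.30 relative-pressure range.
x = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]
v = [10.0 * 100.0 * xi / ((1.0 - xi) * (1.0 - xi + 100.0 * xi)) for xi in x]
v_m, c = bet_fit(x, v)
```

With v_m in hand, the specific surface area follows by multiplying the monolayer amount by the cross-sectional area of the adsorbate molecule.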

Journal ArticleDOI
TL;DR: A natural framework that allows any region-based segmentation energy to be re-formulated in a local way is proposed and the localization of three well-known energies are demonstrated in order to illustrate how this framework can be applied to any energy.
Abstract: In this paper, we propose a natural framework that allows any region-based segmentation energy to be re-formulated in a local way. We consider local rather than global image statistics and evolve a contour based on local information. Localized contours are capable of segmenting objects with heterogeneous feature profiles that would be difficult to capture correctly using a standard global method. The presented technique is versatile enough to be used with any global region-based active contour energy and instill in it the benefits of localization. We describe this framework and demonstrate the localization of three well-known energies in order to illustrate how our framework can be applied to any energy. We then compare each localized energy to its global counterpart to show the improvements that can be achieved. Next, an in-depth study of the behaviors of these energies in response to the degree of localization is given. Finally, we show results on challenging images to illustrate the robust and accurate segmentations that are possible with this new class of active contour models.

Journal ArticleDOI
TL;DR: The synthesis and development of state-of-the-art QD probes and their use for molecular and cellular imaging are discussed and key issues for in vivo imaging and therapy, such as nanoparticle biodistribution, pharmacokinetics, and toxicology are examined.


Journal ArticleDOI
TL;DR: The present work demonstrates the feasibility of in vivo PPTT treatment of deep-tissue malignancies using easily-prepared plasmonic gold nanorods and a small, portable, inexpensive near-infrared (NIR) laser.

Journal ArticleDOI
TL;DR: iSAM is efficient even for robot trajectories with many loops as it avoids unnecessary fill-in in the factor matrix by periodic variable reordering and provides efficient algorithms to access the estimation uncertainties of interest based on the factored information matrix.
Abstract: In this paper, we present incremental smoothing and mapping (iSAM), which is a novel approach to the simultaneous localization and mapping problem that is based on fast incremental matrix factorization. iSAM provides an efficient and exact solution by updating a QR factorization of the naturally sparse smoothing information matrix, thereby recalculating only those matrix entries that actually change. iSAM is efficient even for robot trajectories with many loops as it avoids unnecessary fill-in in the factor matrix by periodic variable reordering. Also, to enable data association in real time, we provide efficient algorithms to access the estimation uncertainties of interest based on the factored information matrix. We systematically evaluate the different components of iSAM as well as the overall algorithm using various simulated and real-world datasets for both landmark and pose-only settings.
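The heart of the incremental update is folding a new linearized measurement row into the existing triangular factor with Givens rotations. A minimal sketch of just that linear-algebra step; the full iSAM pipeline (variable reordering, back-substitution, relinearization) is omitted, and the example matrix is made up:

```python
import math

def givens_update(R, row):
    """Fold one new measurement row into an upper-triangular factor R
    (a list of lists, n x n) using Givens rotations, zeroing the new
    row entry by entry so R stays upper triangular."""
    n = len(R)
    row = list(row)
    for k in range(n):
        a, b = R[k][k], row[k]
        if b == 0.0:
            continue                     # already zero, nothing to rotate
        r = math.hypot(a, b)
        c, s = a / r, b / r              # rotation that zeroes row[k]
        for j in range(k, n):
            rkj, wj = R[k][j], row[j]
            R[k][j] = c * rkj + s * wj
            row[j] = -s * rkj + c * wj
    return R

# Existing factor and one new linearized measurement row.
R = [[2.0, 1.0],
     [0.0, 1.0]]
givens_update(R, [1.0, 1.0])
```

After the update, R satisfies R^T R = R_old^T R_old + row^T row, i.e. the information matrix has absorbed the new measurement without refactoring from scratch.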

Book ChapterDOI
TL;DR: The possibilities to collect and store data increase at a faster rate than the ability to use it for making decisions; in most applications raw data has no value in itself, and the goal is instead to extract the information contained in it.
Abstract: We are living in a world which faces a rapidly increasing amount of data to be dealt with on a daily basis. In the last decade, the steady improvement of data storage devices and means to create and collect data along the way influenced our way of dealing with information: Most of the time, data is stored without filtering and refinement for later use. Virtually every branch of industry or business, and any political or personal activity nowadays generate vast amounts of data. Making matters worse, the possibilities to collect and store data increase at a faster rate than our ability to use it for making decisions. However, in most applications, raw data has no value in itself; instead we want to extract the information contained in it.

Journal ArticleDOI
TL;DR: This article introduces compressive sampling and recovery using convex programming; image compression algorithms convert high-resolution images into relatively small bit streams, in effect turning a large digital data set into a substantially smaller one.
Abstract: Image compression algorithms convert high-resolution images into relatively small bit streams, in effect turning a large digital data set into a substantially smaller one. This article introduces compressive sampling and recovery using convex programming.
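Recovery in compressive sampling is typically posed as an l1-regularized convex program. The sketch below solves a tiny instance with iterative soft-thresholding (ISTA), one of the simplest solvers for that program; the 2x3 measurement matrix, the 1-sparse signal, and all parameters are made up for illustration:

```python
import math

def soft(v, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return math.copysign(max(abs(v) - t, 0.0), v)

def ista(A, y, lam=0.01, step=0.4, iters=3000):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = Ax - y, gradient g = A^T r
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# Underdetermined toy system: 2 measurements of a 3-dimensional signal
# whose true value is the 1-sparse vector (1, 0, 0).
A = [[1.0, 0.0, 0.6],
     [0.0, 1.0, 0.8]]
y = [1.0, 0.0]
x = ista(A, y)
```

For this instance the l1 solution is (1 - lam, 0, 0) = (0.99, 0, 0): the sparse signal is recovered from only two measurements, up to the usual small l1 shrinkage.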

Journal ArticleDOI
TL;DR: A new class of colloidal metal nanoparticles that is able to enhance the efficiencies of surface-enhanced Raman scattering (SERS) by as much as 10^14–10^15 fold is discussed; this enhancement allows spectroscopic detection and identification of single molecules located on the nanoparticle surface or at the junction of two particles under ambient conditions.
Abstract: This tutorial review discusses a new class of colloidal metal nanoparticles that is able to enhance the efficiencies of surface-enhanced Raman scattering (SERS) by as much as 10^14–10^15 fold. This enormous enhancement allows spectroscopic detection and identification of single molecules located on the nanoparticle surface or at the junction of two particles under ambient conditions. Considerable progress has been made in understanding the enhancement mechanisms, including definitive evidence for the single-molecule origin of fluctuating SERS signals. For applications, SERS nanoparticle tags have been developed based on the use of embedded reporter molecules and a silica or polymer encapsulation layer. The SERS nanoparticle tags are capable of providing detailed spectroscopic information and are much brighter than semiconductor quantum dots in the near-infrared spectral window. These properties have raised new opportunities for multiplexed molecular diagnosis and in vivo Raman spectroscopy and imaging.

Journal ArticleDOI
TL;DR: A new pattern of climate change, the North Pacific Gyre Oscillation (NPGO), is defined, and its variability is shown to be significantly correlated with previously unexplained fluctuations of salinity, nutrients, and chlorophyll in the Northeast Pacific.
Abstract: Decadal fluctuations in salinity, nutrients, chlorophyll, a variety of zooplankton taxa, and fish stocks in the Northeast Pacific are often poorly correlated with the most widely-used index of large-scale climate variability in the region - the Pacific Decadal Oscillation (PDO). We define a new pattern of climate change, the North Pacific Gyre Oscillation (NPGO) and show that its variability is significantly correlated with previously unexplained fluctuations of salinity, nutrients and chlorophyll. Fluctuations in the NPGO are driven by regional and basin-scale variations in wind-driven upwelling and horizontal advection - the fundamental processes controlling salinity and nutrient concentrations. Nutrient fluctuations drive concomitant changes in phytoplankton concentrations, and may force similar variability in higher trophic levels. The NPGO thus provides a strong indicator of fluctuations in the mechanisms driving planktonic ecosystem dynamics. The NPGO pattern extends beyond the North Pacific and is part of a global-scale mode of climate variability that is evident in global sea level trends and sea surface temperature. Therefore the amplification of the NPGO variance found in observations and in global warming simulations implies that the NPGO may play an increasingly important role in forcing global-scale decadal changes in marine ecosystems.

Journal ArticleDOI
TL;DR: It is shown in this paper that an SNR wall reduction can be achieved by employing cooperation among independent cognitive radio users and a new softened hard combination scheme with two-bit overhead for each user is proposed to achieve a good tradeoff between detection performance and complexity.
Abstract: In this letter, we consider cooperative spectrum sensing based on energy detection in cognitive radio networks. Soft combination of the observed energies from different cognitive radio users is investigated. Based on the Neyman-Pearson criterion, we obtain an optimal soft combination scheme that maximizes the detection probability for a given false alarm probability. Encouraged by the performance gain of soft combination, we further propose a new softened hard combination scheme with two-bit overhead for each user and achieve a good tradeoff between detection performance and complexity.
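Equal-gain soft combining (summing the users' raw energies) already shows the cooperative gain the letter quantifies; the optimal Neyman-Pearson weighting and the two-bit softened hard scheme are refinements on top of this. The signal model, sample sizes, SNR, and false-alarm target below are all arbitrary illustrative values:

```python
import random

def trial_energy(n, snr, users):
    """Summed energy statistic across cooperating users for one trial
    (unit-variance noise; a constant signal of amplitude sqrt(snr))."""
    amp = snr ** 0.5
    return sum(sum((amp + random.gauss(0.0, 1.0)) ** 2 for _ in range(n))
               for _ in range(users))

def detection_prob(users, snr, n=20, trials=2000, pfa=0.1):
    """Equal-gain soft combining: sum all users' energies, set the
    threshold from the empirical no-signal quantile at the target
    false-alarm rate, then estimate the detection probability."""
    h0 = sorted(trial_energy(n, 0.0, users) for _ in range(trials))
    thresh = h0[int((1.0 - pfa) * trials)]           # empirical H0 quantile
    h1 = [trial_energy(n, snr, users) for _ in range(trials)]
    return sum(e > thresh for e in h1) / trials

random.seed(0)
pd_single = detection_prob(users=1, snr=0.2)
pd_coop = detection_prob(users=5, snr=0.2)           # cooperation helps
```

At the same false-alarm rate, five cooperating users detect the low-SNR signal noticeably more often than a single user, which is the gain soft combination exploits.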

Journal ArticleDOI
TL;DR: A scalable architecture for protecting location privacy against threats resulting from uncontrolled usage of LBSs is described, including the development of a personalized location anonymization model and a suite of location perturbation algorithms.
Abstract: Continued advances in mobile networks and positioning technologies have created a strong market push for location-based applications. Examples include location-aware emergency response, location-based advertisement, and location-based entertainment. An important challenge in the wide deployment of location-based services (LBSs) is the privacy-aware management of location information, providing safeguards for the location privacy of mobile clients against abuse. This paper describes a scalable architecture for protecting location privacy against threats resulting from uncontrolled usage of LBSs. This architecture includes the development of a personalized location anonymization model and a suite of location perturbation algorithms. A unique characteristic of our location privacy architecture is the use of a flexible privacy personalization framework to support location k-anonymity for a wide range of mobile clients with context-sensitive privacy requirements. This framework enables each mobile client to specify the minimum level of anonymity that it desires and the maximum temporal and spatial tolerances that it is willing to accept when requesting k-anonymity-preserving LBSs. We devise an efficient message perturbation engine to implement the proposed location privacy framework. The prototype that we develop is designed to be run by the anonymity server on a trusted platform and performs location anonymization on LBS request messages of mobile clients, such as identity removal and spatio-temporal cloaking of the location information. We study the effectiveness of our location cloaking algorithms under various conditions by using realistic location data that is synthetically generated from real road maps and traffic volume data.
Our experiments show that the personalized location k-anonymity model, together with our location perturbation engine, can achieve high resilience to location privacy threats without introducing any significant performance penalty.
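The spatial-cloaking step at the heart of location k-anonymity can be sketched as follows. This is a simplified stand-in for the paper's message perturbation engine: the square-box expansion, the step size, and the function name `cloak` are assumptions of this sketch, not the authors' algorithm.

```python
import numpy as np

def cloak(user_xy, others_xy, k, max_radius, step=0.1):
    # Expand a square box around the requester until it covers at least
    # k users (the requester plus k-1 others), or give up once the
    # user's maximum spatial tolerance is exceeded. Returning a box
    # instead of exact coordinates is the spatial-cloaking idea.
    x, y = user_xy
    pts = np.asarray(others_xy, dtype=float)
    r = step
    while r <= max_radius:
        inside = np.sum((np.abs(pts[:, 0] - x) <= r) &
                        (np.abs(pts[:, 1] - y) <= r))
        if inside + 1 >= k:
            return (x - r, y - r, x + r, y + r)  # cloaked bounding box
        r += step
    return None  # anonymity requirement cannot be met within tolerance
```

Returning `None` corresponds to the case where the client's spatial tolerance is too tight for its requested anonymity level, in which case the request would be dropped or deferred rather than sent with a degraded guarantee.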

Journal ArticleDOI
TL;DR: Simulation results show that the proposed sensing framework can achieve maximum sensing efficiency and opportunities in multi-user/multi-spectrum environments, satisfying interference constraints.
Abstract: Spectrum sensing is the key enabling technology for cognitive radio networks. The main objective of spectrum sensing is to provide more spectrum access opportunities to cognitive radio users without interfering with the operations of the licensed network. Hence, recent research has been focused on the interference avoidance problem. Moreover, current radio frequency (RF) front-ends cannot perform sensing and transmission at the same time, which inevitably decreases their transmission opportunities, leading to the so-called sensing efficiency problem. In this paper, in order to solve both the interference avoidance and the sensing efficiency problems, an optimal spectrum sensing framework is developed. More specifically, first a theoretical framework is developed to optimize the sensing parameters in such a way as to maximize the sensing efficiency subject to interference avoidance constraints. Second, in order to exploit multiple spectrum bands, spectrum selection and scheduling methods are proposed where the best spectrum bands for sensing are selected to maximize the sensing capacity. Finally, an adaptive and cooperative spectrum sensing method is proposed where the sensing parameters are optimized adaptively to the number of cooperating users. Simulation results show that the proposed sensing framework can achieve maximum sensing efficiency and opportunities in multi-user/multi-spectrum environments, satisfying interference constraints.
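The tradeoff the abstract describes can be made concrete with a toy model. Assuming Poisson primary-user returns (an assumption of this sketch, not necessarily the paper's traffic model), the interference constraint caps how long a cognitive user may transmit between sensing periods, and sensing efficiency is simply the fraction of each cycle spent transmitting:

```python
import math

def max_transmission_time(arrival_rate, max_interference_prob):
    # Under Poisson primary-user arrivals, the chance the primary
    # returns within a transmission period T is 1 - exp(-rate * T).
    # Bounding that by max_interference_prob and solving for T gives
    # the longest transmission allowed between sensing periods.
    return -math.log(1.0 - max_interference_prob) / arrival_rate

def sensing_efficiency(sensing_time, transmission_time):
    # Fraction of each sense-then-transmit cycle spent transmitting;
    # this is the quantity the framework maximizes subject to the
    # interference constraint above.
    return transmission_time / (sensing_time + transmission_time)
```

For example, with primary arrivals at rate 0.1 per unit time and a 5% interference cap, the maximum transmission period is about 0.513 time units, giving roughly 84% efficiency when each sensing period takes 0.1 units. A tighter interference cap shrinks the transmission window and so lowers efficiency, which is exactly the tension the optimization resolves.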

Proceedings Article
01 Jan 2008
TL;DR: This paper proposes an approach that uses network-based anomaly detection to identify botnet C&C channels in a local area network without any prior knowledge of signatures or C &C server addresses, and shows that BotSniffer can detect real-world botnets with high accuracy and has a very low false positive rate.
Abstract: Botnets are now recognized as one of the most serious security threats. In contrast to previous malware, botnets have the characteristic of a command and control (C&C) channel. Botnets also often use existing common protocols such as IRC and HTTP, in protocol-conforming manners. This makes the detection of botnet C&C a challenging problem. In this paper, we propose an approach that uses network-based anomaly detection to identify botnet C&C channels in a local area network without any prior knowledge of signatures or C&C server addresses. This detection approach can identify both the C&C servers and infected hosts in the network. Our approach is based on the observation that, because of the pre-programmed activities related to C&C, bots within the same botnet will likely demonstrate spatial-temporal correlation and similarity. For example, they engage in coordinated communication, propagation, and attack and fraudulent activities. Our prototype system, BotSniffer, can capture this spatial-temporal correlation in network traffic and utilize statistical algorithms to detect botnets with theoretical bounds on the false positive and false negative rates. We evaluated BotSniffer using many real-world network traces. The results show that BotSniffer can detect real-world botnets with high accuracy and has a very low false positive rate.
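The spatial-temporal correlation idea can be sketched as a group-similarity check: hosts that all contact the same server and show nearly identical activity patterns over time are suspicious. This is a toy version only; the function name, the cosine-similarity measure, and the thresholds are assumptions of this sketch, not BotSniffer's actual statistical algorithms or their theoretical bounds.

```python
import math

def cosine(u, v):
    # Cosine similarity between two per-time-bin activity vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def suspicious_servers(activity, min_hosts=2, min_similarity=0.9):
    # activity: {server: {host: per-time-bin message counts}}.
    # Flag servers whose client hosts all show highly similar activity
    # patterns - the spatial-temporal correlation a C&C channel induces.
    flagged = []
    for server, hosts in activity.items():
        vecs = list(hosts.values())
        if len(vecs) < min_hosts:
            continue
        sims = [cosine(u, v)
                for i, u in enumerate(vecs) for v in vecs[i + 1:]]
        if sims and min(sims) >= min_similarity:
            flagged.append(server)
    return flagged
```

The key point the abstract makes survives even in this toy: no signature or server blacklist is needed, because the detector keys on the correlated behavior that pre-programmed C&C responses produce across infected hosts.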

Journal ArticleDOI
TL;DR: In this article, the authors present a literature survey of the issues, problems and problematic decisions currently limiting LCA's goal and scope definition and life cycle inventory phases, and identify 15 major problem areas and organize them by the LCA phases in which each appears.
Abstract: Life cycle assessment (LCA) stands as the pre-eminent tool for estimating environmental effects caused by products and processes from ‘cradle to grave’ or ‘cradle to cradle.’ It exists in multiple forms, claims a growing list of practitioners, and remains a focus of continuing research. Despite its popularity and codification by organizations such as the International Organization for Standardization and the Society of Environmental Toxicology and Chemistry, life cycle assessment is a tool in need of improvement. Multiple authors have written about its individual problems, but a unified treatment of the subject is lacking. The following literature survey gathers and explains issues, problems and problematic decisions currently limiting LCA’s goal and scope definition and life cycle inventory phases. The review identifies 15 major problem areas and organizes them by the LCA phases in which each appears. This part of the review focuses on the first 7 of these problems, occurring during the goal and scope definition and life cycle inventory phases. It is meant as a concise summary for practitioners interested in methodological limitations which might degrade the accuracy of their assessments. For new researchers, it provides an overview of pertinent problem areas toward which they might wish to direct their research efforts. Multiple problems occur in each of LCA’s four phases and reduce the accuracy of this tool. Considering problem severity and the adequacy of current solutions, six of the 15 discussed problems are of paramount importance. In LCA’s first two phases, functional unit definition, boundary selection, and allocation are critical problems requiring particular attention. Problems encountered during goal and scope definition arise from decisions about inclusion and exclusion, while those in inventory analysis involve flows and transformations.
Foundational decisions about the basis of comparison (functional unit), bounds of the study, and physical relationships between included processes largely dictate the representativeness and, therefore, the value of an LCA. It is for this reason that problems in functional unit definition, boundary selection, and allocation are the most critical examined in the first part of this review.