scispace - formally typeset

Showing papers by "University of Maryland, Baltimore County" published in 2013


Journal ArticleDOI
TL;DR: The Collection 6 (C6) algorithm as mentioned in this paper was proposed to retrieve aerosol optical depth (AOD) and aerosol size parameters from MODIS-observed spectral reflectance.
Abstract: The twin Moderate resolution Imaging Spectroradiometer (MODIS) sensors have been flying on Terra since 2000 and Aqua since 2002, creating an extensive data set of global Earth observations. Here, we introduce the Collection 6 (C6) algorithm to retrieve aerosol optical depth (AOD) and aerosol size parameters from MODIS-observed spectral reflectance. While not a major overhaul from the previous Collection 5 (C5) version, there are enough changes to significantly impact the products and their interpretation. The C6 aerosol data set will be created from three separate retrieval algorithms that operate over different surface types. These are the two "Dark Target" (DT) algorithms for retrieving (1) over ocean (dark in visible and longer wavelengths) and (2) over vegetated/dark-soiled land (dark in the visible), plus the "Deep Blue" (DB) algorithm developed originally for retrieving (3) over desert/arid land (bright in the visible). Here, we focus on DT-ocean and DT-land (#1 and #2). We have updated assumptions for central wavelengths, Rayleigh optical depths and gas (H2O, O3, CO2, etc.) absorption corrections, while relaxing the solar zenith angle limit (up to ≤ 84°) to increase poleward coverage. For DT-land, we have updated the cloud mask to allow heavy smoke retrievals, fine-tuned the assignments for aerosol type as a function of season/location, corrected bugs in the Quality Assurance (QA) logic, and added diagnostic parameters such as topographic altitude. For DT-ocean, improvements include a revised cloud mask for thin-cirrus detection, inclusion of wind speed dependence in the surface reflectance, updates to the logic of QA Confidence flag (QAC) assignment, and additions of important diagnostic information. At the same time, we quantified how "upstream" changes to instrument calibration, land/sea masking and cloud masking will also impact the statistics of global AOD, and affect Terra and Aqua differently.
For Aqua, all changes will result in reduced global AOD (by 0.02) over ocean and increased AOD (by 0.02) over land, along with changes in spatial coverage. We compared preliminary data to surface-based sun photometer data, and show that C6 should improve upon C5. C6 will include a merged DT/DB product over semi-arid land surfaces for reduced-gap coverage and better visualization, and new information about clouds in the aerosol field. Responding to the needs of the air quality community, in addition to the standard 10 km product, C6 will include a global (DT-land and DT-ocean) aerosol product at 3 km resolution.

1,628 citations


Journal ArticleDOI
A. A. Abdo1, A. A. Abdo2, Marco Ajello3, Alice Allafort4 +254 more · Institutions (60)
TL;DR: In this article, a catalog of gamma-ray pulsar detections using three years of data acquired by the Large Area Telescope (LAT) on the Fermi satellite is presented.
Abstract: This catalog summarizes 117 high-confidence > 0.1 GeV gamma-ray pulsar detections using three years of data acquired by the Large Area Telescope (LAT) on the Fermi satellite. Half are neutron stars discovered using LAT data, through periodicity searches in gamma-ray and radio data around LAT unassociated source positions. The 117 pulsars are evenly divided into three groups: millisecond pulsars, young radio-loud pulsars, and young radio-quiet pulsars. We characterize the pulse profiles and energy spectra and derive luminosities when distance information exists. Spectral analysis of the off-peak phase intervals indicates probable pulsar wind nebula emission for four pulsars, and off-peak magnetospheric emission for several young and millisecond pulsars. We compare the gamma-ray properties with those in the radio, optical, and X-ray bands. We provide flux limits for pulsars with no observed gamma-ray emission, highlighting a small number of gamma-faint, radio-loud pulsars. The large, varied gamma-ray pulsar sample constrains emission models. Fermi's selection biases complement those of radio surveys, enhancing comparisons with predicted population distributions.

929 citations


Journal ArticleDOI
Markus Ackermann, Marco Ajello1, Alice Allafort2, Luca Baldini3 +197 more · Institutions (42)
15 Feb 2013-Science
TL;DR: The characteristic pion-decay feature is detected in the gamma-ray spectra of two SNRs, IC 443 and W44, with the Fermi Large Area Telescope, providing direct evidence that cosmic-ray protons are accelerated in SNRs.
Abstract: Cosmic rays are particles (mostly protons) accelerated to relativistic speeds. Despite wide agreement that supernova remnants (SNRs) are the sources of galactic cosmic rays, unequivocal evidence for the acceleration of protons in these objects is still lacking. When accelerated protons encounter interstellar material, they produce neutral pions, which in turn decay into gamma rays. This offers a compelling way to detect the acceleration sites of protons. The identification of pion-decay gamma rays has been difficult because high-energy electrons also produce gamma rays via bremsstrahlung and inverse Compton scattering. We detected the characteristic pion-decay feature in the gamma-ray spectra of two SNRs, IC 443 and W44, with the Fermi Large Area Telescope. This detection provides direct evidence that cosmic-ray protons are accelerated in SNRs.

846 citations


Journal ArticleDOI
TL;DR: An extended model of the SEIPS, SEIPS 2.0 is a new human factors/ergonomics framework for studying and improving health and healthcare that describes how sociotechnical systems shape health-related work done by professionals and non-professionals, independently and collaboratively.
Abstract: Healthcare practitioners, patient safety leaders, educators and researchers increasingly recognise the value of human factors/ergonomics and make use of the discipline's person-centred models of sociotechnical systems. This paper first reviews one of the most widely used healthcare human factors systems models, the Systems Engineering Initiative for Patient Safety (SEIPS) model, and then introduces an extended model, 'SEIPS 2.0'. SEIPS 2.0 incorporates three novel concepts into the original model: configuration, engagement and adaptation. The concept of configuration highlights the dynamic, hierarchical and interactive properties of sociotechnical systems, making it possible to depict how health-related performance is shaped at 'a moment in time'. Engagement conveys that various individuals and teams can perform health-related activities separately and collaboratively. Engaged individuals often include patients, family caregivers and other non-professionals. Adaptation is introduced as a feedback mechanism that explains how dynamic systems evolve in planned and unplanned ways. Key implications and future directions for human factors research in healthcare are discussed.

773 citations


Journal ArticleDOI
TL;DR: Recent scientific evidence and theory is synthesized to explain why relatively small human populations likely caused widespread and profound ecological changes more than 3,000 y ago, whereas the largest and wealthiest human populations in history are using less arable land per person every decade.
Abstract: Human use of land has transformed ecosystem pattern and process across most of the terrestrial biosphere, a global change often described as historically recent and potentially catastrophic for both humanity and the biosphere. Interdisciplinary paleoecological, archaeological, and historical studies challenge this view, indicating that land use has been extensive and sustained for millennia in some regions and that recent trends may represent as much a recovery as an acceleration. Here we synthesize recent scientific evidence and theory on the emergence, history, and future of land use as a process transforming the Earth System and use this to explain why relatively small human populations likely caused widespread and profound ecological changes more than 3,000 y ago, whereas the largest and wealthiest human populations in history are using less arable land per person every decade. Contrasting two spatially explicit global reconstructions of land-use history shows that reconstructions incorporating adaptive changes in land-use systems over time, including land-use intensification, offer a more spatially detailed and plausible assessment of our planet's history, with a biosphere and perhaps even climate long ago affected by humans. Although land-use processes are now shifting rapidly from historical patterns in both type and scale, integrative global land-use models that incorporate dynamic adaptations in human-environment relationships help to advance our understanding of both past and future land-use changes, including their sustainability and potential global effects.

608 citations


Proceedings ArticleDOI
13 May 2013
TL;DR: The role of Twitter, during Hurricane Sandy (2012) to spread fake images about the disaster was highlighted, and automated techniques can be used in identifying real images from fake images posted on Twitter.
Abstract: In today's world, online social media plays a vital role during real-world events, especially crisis events. Social media coverage of events has both positive and negative effects: it can be used by authorities for effective disaster management or by malicious entities to spread rumors and fake news. The aim of this paper is to highlight the role of Twitter during Hurricane Sandy (2012) in spreading fake images about the disaster. We identified 10,350 unique tweets containing fake images that were circulated on Twitter during Hurricane Sandy. We performed a characterization analysis to understand the temporal, social reputation and influence patterns for the spread of fake images. Eighty-six percent of tweets spreading the fake images were retweets; hence very few were original tweets. Our results showed that the top thirty users out of 10,215 users (0.3%) accounted for 90% of the retweets of fake images; network links such as Twitter follower relationships contributed very little (only 11%) to the spread of these fake photo URLs. Next, we used classification models to distinguish fake images from real images of Hurricane Sandy. The best results were obtained from a Decision Tree classifier, which achieved 97% accuracy in predicting fake images from real ones. Tweet-based features were very effective in distinguishing fake-image tweets from real ones, while the performance of user-based features was very poor. Our results showed that automated techniques can be used to identify real images from fake images posted on Twitter.
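The retweet concentration reported above (thirty users driving 90% of fake-image retweets) can be illustrated with a short sketch. The function name and the greedy descending-count cutoff are illustrative assumptions, not the authors' code:

```python
def top_spreaders(retweets_per_user, coverage=0.90):
    """Return the smallest greedy set of users whose retweets account
    for at least `coverage` of all retweets (highest counts first)."""
    total = sum(retweets_per_user.values())
    cum, top = 0, []
    for user, n in sorted(retweets_per_user.items(), key=lambda kv: -kv[1]):
        top.append(user)
        cum += n
        if cum >= coverage * total:
            break
    return top
```

Applied to per-user retweet counts of the fake-image tweets, such a routine would reproduce the paper's observation that a tiny fraction of accounts covers 90% of the spread.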

586 citations


Journal ArticleDOI
TL;DR: In this article, the spectral signatures of three basic components: atmospheric absorption, surface reflectance, and fluorescence radiance are disentangled using a principal component analysis (PCA) approach.
Abstract: Globally mapped terrestrial chlorophyll fluorescence retrievals are of high interest because they can provide information on the functional status of vegetation including light-use efficiency and global primary productivity that can be used for global carbon cycle modeling and agricultural applications. Previous satellite retrievals of fluorescence have relied solely upon the filling-in of solar Fraunhofer lines that are not significantly affected by atmospheric absorption. Although these measurements provide near-global coverage on a monthly basis, they suffer from relatively low precision and sparse spatial sampling. Here, we describe a new methodology to retrieve global far-red fluorescence information; we use hyperspectral data with a simplified radiative transfer model to disentangle the spectral signatures of three basic components: atmospheric absorption, surface reflectance, and fluorescence radiance. An empirically based principal component analysis approach is employed, primarily using cloudy data over ocean, to model and solve for the atmospheric absorption. Through detailed simulations, we demonstrate the feasibility of the approach and show that moderate-spectral-resolution measurements with a relatively high signal-to-noise ratio can be used to retrieve far-red fluorescence information with good precision and accuracy. The method is then applied to data from the Global Ozone Monitoring Experiment-2 (GOME-2). The GOME-2 fluorescence retrievals display similar spatial structure as compared with those from a simpler technique applied to the Greenhouse gases Observing SATellite (GOSAT). GOME-2 enables global mapping of far-red fluorescence with higher precision over smaller spatial and temporal scales than is possible with GOSAT. Near-global coverage is provided within a few days.
We are able to show clearly for the first time physically plausible variations in fluorescence over the course of a single month at a spatial resolution of 0.5° × 0.5°. We also show some significant differences between fluorescence and coincident normalized difference vegetation indices (NDVI) retrievals.
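As a rough illustration of the retrieval idea described above, the sketch below derives atmospheric-absorption principal components from training spectra and fits a fluorescence amplitude by least squares. The function names are hypothetical, and the model is strongly simplified (absorption treated as multiplicative in radiance, fluorescence as one extra spectral basis vector); it is not the operational GOME-2 algorithm:

```python
import numpy as np

def fit_absorption_pcs(cloudy_ocean_spectra, n_pc=3):
    """Derive leading spectral principal components of atmospheric
    absorption from training spectra (rows = observations, cols =
    wavelengths), e.g. cloudy scenes over ocean as in the paper."""
    X = np.log(cloudy_ocean_spectra)      # absorption is multiplicative
    Xc = X - X.mean(axis=0)               # center before SVD
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    return vt[:n_pc]                      # shape (n_pc, n_wavelengths)

def retrieve_fluorescence(spectrum, pcs, fluor_shape):
    """Least-squares fit of log-radiance as offset + PC weights +
    a fluorescence basis vector; returns the fluorescence amplitude."""
    A = np.column_stack([np.ones_like(spectrum), pcs.T, fluor_shape])
    coef, *_ = np.linalg.lstsq(A, np.log(spectrum), rcond=None)
    return coef[-1]
```

In the real retrieval the filling-in of Fraunhofer lines makes the fluorescence term separable from reflectance; here that separation is simply assumed via the fixed `fluor_shape` basis.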

536 citations


Journal ArticleDOI
TL;DR: The Swift/Burst Alert Telescope (BAT) provides near real-time coverage of the X-ray sky in the energy range 15-50 keV with a detection sensitivity of 5.3 mCrab for a full-day observation and a time resolution as fine as 64 s as mentioned in this paper.
Abstract: The Swift/Burst Alert Telescope (BAT) hard X-ray transient monitor provides near real-time coverage of the X-ray sky in the energy range 15-50 keV. The BAT observes 88% of the sky each day with a detection sensitivity of 5.3 mCrab for a full-day observation and a time resolution as fine as 64 s. The three main purposes of the monitor are (1) the discovery of new transient X-ray sources, (2) the detection of outbursts or other changes in the flux of known X-ray sources, and (3) the generation of light curves of more than 900 sources spanning over eight years. The primary interface for the BAT transient monitor is a public Web site. Between 2005 February 12 and 2013 April 30, 245 sources have been detected in the monitor, 146 of them persistent and 99 detected only in outburst. Among these sources, 17 were previously unknown and were discovered in the transient monitor. In this paper, we discuss the methodology and the data processing and filtering for the BAT transient monitor and review its sensitivity and exposure. We provide a summary of the source detections and classify them according to the variability of their light curves. Finally, we review all new BAT monitor discoveries. For the new sources that are previously unpublished, we present basic data analysis and interpretations.

520 citations


Journal ArticleDOI
TL;DR: This paper reviews the history of mechanochemistry, which begins with prehistoric times, when reactions could be initiated during grinding and rubbing accidentally, and follows the main developments until recent results and current trends.
Abstract: This paper reviews the history of mechanochemistry. It begins with prehistoric times, when reactions could be initiated during grinding and rubbing accidentally, and follows the main developments until recent results and current trends. There are very few records on mechanochemistry until the first systematic investigations by Spring and Lea at the end of the 19th century. For the next decades, mechanochemistry developed slowly; minerals, inorganic compounds, and polymers were the main subjects of investigation. The area became more organized in the 1960s, when several large groups were established and the first dedicated conferences were held. Mechanical alloying was invented in 1966 independently and it became a subject of intense research. Interaction between the two topics was established in the 1990s. In recent years, the mechanochemical synthesis of organic compounds was added to the main subjects and the invention of the atomic force microscope provided new ways to manipulate atoms and molecules by direct mechanical action. The theoretical explanation of mechanochemical phenomena is difficult, as the mechanism is system specific and several length and time scales are involved. Thiessen proposed the first theory, the magma–plasma model, in 1967, and deeper insight is being obtained by computer modelling combined with empirical work. Practical applications have been an important motivation throughout the history of mechanochemistry. It is used alone or in combination with other steps in an increasing number of technologies.

454 citations


Journal ArticleDOI
TL;DR: In this article, the authors demonstrate a new aerial remote sensing system enabling routine and inexpensive aerial 3D measurements of canopy structure and spectral attributes, with properties similar to those of LIDAR, but with RGB (red-green-blue) spectral attributes for each point.

451 citations


Journal ArticleDOI
08 Feb 2013-Science
TL;DR: The Moon's gravity field reveals that impacts have homogenized the density of the crust and fractured it extensively, and GRAIL elucidates the role of impact bombardment in homogenizing the distribution of shallow density anomalies on terrestrial planetary bodies.
Abstract: Spacecraft-to-spacecraft tracking observations from the Gravity Recovery and Interior Laboratory (GRAIL) have been used to construct a gravitational field of the Moon to spherical harmonic degree and order 420. The GRAIL field reveals features not previously resolved, including tectonic structures, volcanic landforms, basin rings, crater central peaks, and numerous simple craters. From degrees 80 through 300, over 98% of the gravitational signature is associated with topography, a result that reflects the preservation of crater relief in highly fractured crust. The remaining 2% represents fine details of subsurface structure not previously resolved. GRAIL elucidates the role of impact bombardment in homogenizing the distribution of shallow density anomalies on terrestrial planetary bodies.

Journal ArticleDOI
27 Sep 2013-Science
TL;DR: Samples from the Rocknest aeolian deposit were heated to ~835°C under helium flow and evolved gases analyzed by Curiosity's Sample Analysis at Mars instrument suite, suggesting that oxygen is produced from thermal decomposition of an oxychloride compound.
Abstract: Samples from the Rocknest aeolian deposit were heated to ~835°C under helium flow and evolved gases analyzed by Curiosity's Sample Analysis at Mars instrument suite. H2O, SO2, CO2, and O2 were the major gases released. Water abundance (1.5 to 3 weight percent) and release temperature suggest that H2O is bound within an amorphous component of the sample. Decomposition of fine-grained Fe or Mg carbonate is the likely source of much of the evolved CO2. Evolved O2 is coincident with the release of Cl, suggesting that oxygen is produced from thermal decomposition of an oxychloride compound. Elevated δD values are consistent with recent atmospheric exchange. Carbon isotopes indicate multiple carbon sources in the fines. Several simple organic compounds were detected, but they are not definitively martian in origin.

Journal ArticleDOI
TL;DR: This edited volume surveys metal-enhanced fluorescence, covering plasmon-fluorophore theory, plasmonic engineering, spectral overlap effects, microwave-accelerated enhancement, ZnO platforms for enhanced directional fluorescence, and biosensing applications.
Abstract: Preface. Contributors. Metal-Enhanced Fluorescence: Progress Towards a Unified Plasmon-Fluorophore Description (Kadir Aslan and Chris D. Geddes). Spectral Profile Modifications In Metal-Enhanced Fluorescence (E. C. Le Ru, J. Grand, N. Felidj, J. Aubard, G. Levi, A. Hohenau, J. R. Krenn, E. Blackie and P. G. Etchegoin). The Role Of Plasmonic Engineering In Metal-Enhanced Fluorescence (Daniel J. Ross, Nicholas P.W. Pieczonka and R. F. Aroca). Importance of Spectral Overlap: Fluorescence Enhancement by Single Metal Nanoparticles (Keiko Munechika, Yeechi Chen, Jessica M. Smith and David S. Ginger). Near-IR Metal Enhanced Fluorescence And Controlled Colloidal Aggregation (Jon P. Anderson, Mark Griffiths, John G. Williams, Daniel L. Grone, Dave L. Steffens, and Lyle M. Middendorf). Optimisation Of Plasmonic Enhancement Of Fluorescence For Optical Biosensor Applications (Colette McDonagh, Ondrej Stranik, Robert Nooney and Brian D. MacCraith). Microwave-Accelerated Metal-Enhanced Fluorescence (Kadir Aslan and Chris D. Geddes). Localized Surface Plasmon Coupled Fluorescence Fiber Optic Based Biosensing (Chien Chou, Ja-An Annie Ho, Chii-Chang Chen, Ming-Yaw, Wei-Chih Liu, Ying-Feng Chang, Chen Fu, Si-Han Chen and Ting-Yang Kuo). Surface Plasmon Enhanced Photochemistry (Stephen K. Gray). Metal-Enhanced Generation of Oxygen Rich Species (Yongxia Zhang, Kadir Aslan and Chris D. Geddes). Synthesis Of Anisotropic Noble Metal Nanoparticles (Damian Aherne, Deirdre M. Ledwith and John M. Kelly). Enhanced Fluorescence Detection Enabled By Zinc Oxide Nanomaterials (Jong-in Hahm). ZnO Platforms For Enhanced Directional Fluorescence Applications (H.C. Ong, D.Y. Lei, J. Li and J.B. Xu). E-Beam Lithography And Spontaneous Galvanic Displacement Reactions For Spatially Controlled MEF Applications (Luigi Martiradonna, S. Shiv Shankar and Pier Paolo Pompa). Metal-Enhanced Chemiluminescence (Yongxia Zhang, Kadir Aslan and Chris D. Geddes).
Enhanced Fluorescence From Gratings (Chii-Wann Lin, Nan-Fu Chiu, Jiun-Haw Lee and Chih-Kung Lee). Enhancing Fluorescence with Sub-Wavelength Metallic Apertures (Steve Blair and Jerome Wenger). Enhanced Multi-Photon Excitation of Tryptophan-Silver Colloid (Renato E. de Araujo, Diego Rativa and Anderson S. L. Gomes). Plasmon-enhanced radiative rates and applications to organic electronics (Lewis Rothberg and Shanlin Pan). Fluorescent Quenching Gold Nanoparticles: Potential Biomedical Applications (Xiaohua Huang, Ivan H. El-Sayed, and Mostafa A. El-Sayed). Index.

Proceedings Article
13 Jun 2013
TL;DR: Three semantic text similarity systems developed for the *SEM 2013 STS shared task used a simple term alignment algorithm augmented with penalty terms, and two used support vector regression models to combine larger sets of features.
Abstract: We describe three semantic text similarity systems developed for the *SEM 2013 STS shared task and the results of the corresponding three runs. All of them shared a word similarity feature that combined LSA word similarity and WordNet knowledge. The first, which achieved the best mean score of the 89 submitted runs, used a simple term alignment algorithm augmented with penalty terms. The other two runs, ranked second and fourth, used support vector regression models to combine larger sets of features.
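A minimal sketch of a "simple term alignment algorithm augmented with penalty terms" of the kind described above might look like the following. The greedy one-to-one alignment and the normalization used as the penalty are assumptions for illustration, and `word_sim` stands in for the paper's LSA-plus-WordNet word similarity (any callable returning a value in [0, 1]):

```python
def align_score(sent1, sent2, word_sim):
    """Greedy one-to-one term alignment between two sentences.
    Unaligned terms act as a penalty because the summed similarity
    is normalized by the total number of tokens in both sentences."""
    t1, t2 = sent1.lower().split(), sent2.lower().split()
    used, total = set(), 0.0
    for w in t1:
        best, best_j = 0.0, None
        for j, v in enumerate(t2):
            s = word_sim(w, v)
            if j not in used and s > best:
                best, best_j = s, j
        if best_j is not None:
            used.add(best_j)
            total += best
    return 2 * total / (len(t1) + len(t2))
```

Identical sentences score 1.0 under an exact-match similarity, while extra unmatched words in either sentence pull the score down, which is the penalty effect.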

Journal ArticleDOI
TL;DR: A research framework with an integrated view of social commerce that consists of four key components: business, technology, people, and information is proposed and helps to understand the development of social commerce research and practice to date.

Book
07 Aug 2013
TL;DR: In this paper, the authors describe how the U.S. Department of Energy, through the Atmospheric Radiation Measurement (ARM) program, has constructed four long-term atmospheric observing sites in strategic climate regimes (north central Oklahoma; Barrow, Alaska; and Nauru and Manus Islands in the tropical western Pacific).
Abstract: Atmospheric radiative forcing, surface radiation budget, and top-of-the-atmosphere radiance interpretation involve knowledge of the vertical height structure of overlying cloud and aerosol layers. During the last decade, the U.S. Department of Energy, through the Atmospheric Radiation Measurement (ARM) program, has constructed four long-term atmospheric observing sites in strategic climate regimes (north central Oklahoma; Barrow, Alaska; and Nauru and Manus Islands in the tropical western Pacific). Micro Pulse Lidar (MPL) systems provide continuous, autonomous observation of all significant atmospheric cloud and aerosol at each of the central ARM facilities. Systems are compact and transmitted pulses are eye-safe. Eye-safety is achieved by expanding relatively low-powered outgoing pulse energy through a shared, coaxial transmit/receive telescope. ARM MPL system specifications and specific unit optical designs are discussed. Data normalization and calibration techniques are presented. A multiple cloud boundary detection algorithm is also described. These techniques in tandem represent an operational value-added processing package used to produce normalized data products for cloud and aerosol research and the historical ARM data archive.

Journal ArticleDOI
TL;DR: In this paper, the authors address the recent trajectory of local e-government in the United States and compare it with the predictions of early eGovernment writings, using empirical data from two nationwide surveys of eGovernment among American local governments.
Abstract: In this article, the authors address the recent trajectory of local e-government in the United States and compare it with the predictions of early e-government writings, using empirical data from two nationwide surveys of e-government among American local governments. The authors find that local e-government has not produced the results that those writings predicted. Instead, its development has largely been incremental, and local e-government is mainly about delivering information and services online, followed by a few transactions and limited interactivity. Local e-government is also mainly one way, from government to citizens, and there is little or no evidence that it is transformative in any way. This disparity between early predictions and actual results is partly attributable to the incremental nature of American public administration. Other reasons include a lack of attention by early writers to the history of information technology in government and the influence of technological determinism on those writings.

Journal ArticleDOI
TL;DR: A single scoop of the Rocknest aeolian deposit was sieved, and four separate sample portions, each with a mass of ~50 mg, were delivered to individual cups inside the Sample Analysis at Mars (SAM) instrument by the Mars Science Laboratory rover's sample acquisition system.
Abstract: A single scoop of the Rocknest aeolian deposit was sieved (<150 μm), and four separate sample portions, each with a mass of ~50 mg, were delivered to individual cups inside the Sample Analysis at Mars (SAM) instrument by the Mars Science Laboratory rover's sample acquisition system. The samples were analyzed separately by the SAM pyrolysis evolved gas and gas chromatograph mass spectrometer analysis modes. Several chlorinated hydrocarbons including chloromethane, dichloromethane, trichloromethane, a chloromethylpropene, and chlorobenzene were identified by SAM above background levels with abundances of ~0.01 to 2.3 nmol. The evolution of the chloromethanes observed during pyrolysis is coincident with the increase in O2 released from the Rocknest sample and the decomposition of a product of N-methyl-N-(tert-butyldimethylsilyl)-trifluoroacetamide (MTBSTFA), a chemical whose vapors were released from a derivatization cup inside SAM. The best candidate for the oxychlorine compounds in Rocknest is a hydrated calcium perchlorate (Ca(ClO4)2·nH2O), based on the temperature release of O2 that correlates with the release of the chlorinated hydrocarbons measured by SAM, although other chlorine-bearing phases are being considered. Laboratory analog experiments suggest that the reaction of Martian chlorine from perchlorate decomposition with terrestrial organic carbon from MTBSTFA during pyrolysis can explain the presence of three chloromethanes and a chloromethylpropene detected by SAM. Chlorobenzene may be attributed to reactions of Martian chlorine released during pyrolysis with terrestrial benzene or toluene derived from 2,6-diphenylphenylene oxide (Tenax) on the SAM hydrocarbon trap. At this time we do not have definitive evidence to support a nonterrestrial carbon source for these chlorinated hydrocarbons, nor do we exclude the possibility that future SAM analyses will reveal the presence of organic compounds native to the Martian regolith.

Journal ArticleDOI
TL;DR: In this article, the MODIS aerosol team released a nominal 3 km product as part of their Collection 6 release, which is able to retrieve over the ocean closer to islands and coastlines, and is better able to resolve fine aerosol features such as smoke plumes over both ocean and land.
Abstract: After more than a decade of producing a nominal 10 km aerosol product based on the dark target method, the MODerate resolution Imaging Spectroradiometer (MODIS) aerosol team will be releasing a nominal 3 km product as part of their Collection 6 release. The new product differs from the original 10 km product only in the manner in which reflectance pixels are ingested, organized and selected by the aerosol algorithm. Overall, the 3 km product closely mirrors the 10 km product. However, the finer resolution product is able to retrieve over the ocean closer to islands and coastlines, and is better able to resolve fine aerosol features such as smoke plumes over both ocean and land. In some situations, it provides retrievals over entire regions that the 10 km product barely samples. In situations traditionally difficult for the dark target algorithm such as over bright or urban surfaces, the 3 km product introduces isolated spikes of artificially high aerosol optical depth (AOD) that the 10 km algorithm avoids. Over land, globally, the 3 km product appears to be 0.01 to 0.02 higher than the 10 km product, while over ocean, the 3 km algorithm is retrieving a proportionally greater number of very low aerosol loading situations. Based on collocations with ground-based observations for only six months, expected errors associated with the 3 km land product are determined to be greater than those of the 10 km product: ±(0.05 + 0.20 AOD). Over ocean, the suggestion is for expected errors to be the same as the 10 km product: ±(0.03 + 0.05 AOD), but slightly less accurate in the coastal zone. The advantage of the product is on the local scale, which will require continued evaluation not addressed here. Nevertheless, the new 3 km product is expected to provide important information complementary to existing satellite-derived products and become an important tool for the aerosol community.
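Expected-error envelopes like the ones quoted above (±0.05 ± 0.20 AOD over land, ±0.03 ± 0.05 AOD over ocean) are typically evaluated as the fraction of collocated satellite retrievals falling inside the envelope relative to ground truth such as AERONET. A minimal sketch, with a hypothetical function name:

```python
def fraction_within_expected_error(aod_sat, aod_ground,
                                   add=0.05, mult=0.20):
    """Fraction of satellite AOD retrievals falling inside the
    expected-error envelope ±(add + mult * AOD_ground); defaults
    correspond to the 3 km land product values quoted above."""
    hits = sum(
        abs(s - g) <= add + mult * g
        for s, g in zip(aod_sat, aod_ground)
    )
    return hits / len(aod_sat)
```

A product is conventionally considered well validated when roughly two-thirds of retrievals fall within the envelope, mirroring a one-sigma criterion.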

Journal ArticleDOI
TL;DR: Four of the authors abstracted all study data, and all study information and results were checked and confirmed by a senior author (CE or CM).
Abstract: Abstraction of Data: Four of the authors (CE, CM, RS, & KW) abstracted all study data. All study information and results were checked and confirmed by a senior author (CE or CM).

Journal ArticleDOI
TL;DR: The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip.
Abstract: The Spitzer Extended Deep Survey (SEDS) is a very deep infrared survey within five well-known extragalactic science fields: the UKIDSS Ultra-Deep Survey, the Extended Chandra Deep Field South, COSMOS, the Hubble Deep Field North, and the Extended Groth Strip. SEDS covers a total area of 1.46 deg² to a depth of 26 AB mag (3σ) in both of the warm Infrared Array Camera (IRAC) bands at 3.6 and 4.5 μm. Because of its uniform depth of coverage in so many widely separated fields, SEDS is subject to roughly 25% smaller errors due to cosmic variance than a single-field survey of the same size. SEDS was designed to detect and characterize galaxies from intermediate to high redshifts (z = 2-7) with a built-in means of assessing the impact of cosmic variance on the individual fields. Because the full SEDS depth was accumulated in at least three separate visits to each field, typically with six-month intervals between visits, SEDS also furnishes an opportunity to assess the infrared variability of faint objects. This paper describes the SEDS survey design, processing, and publicly available data products. Deep IRAC counts for the more than 300,000 galaxies detected by SEDS are consistent with models based on known galaxy populations. Discrete IRAC sources contribute 5.6 ± 1.0 and 4.4 ± 0.8 nW m⁻² sr⁻¹ at 3.6 and 4.5 μm to the diffuse cosmic infrared background (CIB). IRAC sources cannot contribute more than half of the total CIB flux estimated from DIRBE data. Barring an unexpected error in the DIRBE flux estimates, half the CIB flux must therefore come from a diffuse component.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated light absorption of organic aerosol (OA) in fresh and photo-chemically aged biomass-burning emissions and found that SOA is less absorptive than POA in the long visible, but exhibits stronger wavelength-dependence and is more absorptive in the short visible and near-UV.
Abstract: . Experiments were conducted to investigate light absorption of organic aerosol (OA) in fresh and photo-chemically aged biomass-burning emissions. The experiments considered residential hardwood fuel (oak) and fuels commonly consumed in wild-land and prescribed fires in the United States (pocosin pine and gallberry). Photo-chemical aging was performed in an environmental chamber. We constrained the effective light-absorption properties of the OA using conservative limiting assumptions, and found that both primary organic aerosol (POA) in the fresh emissions and secondary organic aerosol (SOA) produced by photo-chemical aging contain brown carbon, and absorb light to a significant extent. This work presents the first direct evidence that SOA produced in aged biomass-burning emissions is absorptive. For the investigated fuels, SOA is less absorptive than POA in the long visible, but exhibits stronger wavelength-dependence and is more absorptive in the short visible and near-UV. Light absorption by SOA in biomass-burning emissions might be an important contributor to the global radiative forcing budget.

Journal ArticleDOI
TL;DR: An assessment of the state and historic development of evaluation practices as reported in papers published at the IEEE Visualization conference found that evaluations specific to assessing resulting images and algorithm performance are the most prevalent and generally the studies reporting requirements analyses and domain-specific work practices are too informally reported.
Abstract: We present an assessment of the state and historic development of evaluation practices as reported in papers published at the IEEE Visualization conference. Our goal is to reflect on a meta-level about evaluation in our community through a systematic understanding of the characteristics and goals of presented evaluations. For this purpose we conducted a systematic review of ten years of evaluations in the published papers using and extending a coding scheme previously established by Lam et al. [2012]. The results of our review include an overview of the most common evaluation goals in the community, how they evolved over time, and how they contrast or align with those of the IEEE Information Visualization conference. In particular, we found that evaluations specific to assessing resulting images and algorithm performance are the most prevalent (consistently 80-90% of all papers since 1997). However, especially over the last six years there has been a steady increase in evaluation methods that include participants, either by evaluating their performance and subjective feedback or by evaluating their work practices and their improved analysis and reasoning capabilities using visual tools. Up to 2010, this trend in the IEEE Visualization conference was much more pronounced than in the IEEE Information Visualization conference, which only showed an increasing percentage of evaluation through user performance and experience testing. Since 2011, however, papers in IEEE Information Visualization have also shown such an increase in evaluations of work practices and of analysis and reasoning using visual tools. Further, we found that studies reporting requirements analyses and domain-specific work practices are generally reported too informally, which hinders cross-comparison and lowers external validity.

Journal ArticleDOI
TL;DR: A real-time biosensor capable of continuously tracking a wide range of circulating drugs in living subjects, and which could replace periodic and disruptive blood draws at the patient’s bedside, much like continuous glucose monitors in widespread use today for diabetes.
Abstract: A sensor capable of continuously measuring specific molecules in the bloodstream in vivo would give clinicians a valuable window into patients’ health and their response to therapeutics. Such technology would enable truly personalized medicine, wherein therapeutic agents could be tailored with optimal doses for each patient to maximize efficacy and minimize side effects. Unfortunately, continuous, real-time measurement is currently only possible for a handful of targets, such as glucose, lactose, and oxygen, and the few existing platforms for continuous measurement are not generalizable for the monitoring of other analytes, such as small-molecule therapeutics. In response, we have developed a real-time biosensor capable of continuously tracking a wide range of circulating drugs in living subjects. Our microfluidic electrochemical detector for in vivo continuous monitoring (MEDIC) requires no exogenous reagents, operates at room temperature, and can be reconfigured to measure different target molecules by exchanging probes in a modular manner. To demonstrate the system’s versatility, we measured therapeutic in vivo concentrations of doxorubicin (a chemotherapeutic) and kanamycin (an antibiotic) in live rats and in human whole blood for several hours with high sensitivity and specificity at subminute temporal resolution. We show that MEDIC can also obtain pharmacokinetic parameters for individual animals in real time. Accordingly, just as continuous glucose monitoring technology is currently revolutionizing diabetes care, we believe that MEDIC could be a powerful enabler for personalized medicine by ensuring delivery of optimal drug doses for individual patients based on direct detection of physiological parameters.
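The abstract notes that MEDIC can extract pharmacokinetic parameters for individual animals in real time. As an illustrative sketch only (not the authors' actual analysis pipeline), the simplest such extraction fits a one-compartment elimination model C(t) = C0·exp(-k·t) to the measured concentration trace by log-linear least squares; the function name and interface below are hypothetical.

```python
import math

def fit_one_compartment(times, concs):
    """Estimate one-compartment PK parameters (C0, elimination rate k,
    half-life) by least-squares regression of log-concentration on time,
    assuming C(t) = C0 * exp(-k * t)."""
    logs = [math.log(c) for c in concs]
    n = len(times)
    mean_t = sum(times) / n
    mean_l = sum(logs) / n
    # Slope of log C vs t is -k; intercept recovers log C0.
    k = -sum((t - mean_t) * (l - mean_l) for t, l in zip(times, logs)) \
        / sum((t - mean_t) ** 2 for t in times)
    c0 = math.exp(mean_l + k * mean_t)
    half_life = math.log(2) / k
    return c0, k, half_life
```

Streaming subminute measurements would let such a fit be refreshed continuously as new points arrive, which is the sense in which parameters can be tracked "in real time".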

Journal ArticleDOI
TL;DR: In this article, the authors performed a blind search for Fe XXV Heα and/or Fe XXVI Lyα absorption lines in a large sample of 51 type 1.0-1.9 AGN.
Abstract: We present the results of a new spectroscopic study of Fe K-band absorption in Active Galactic Nuclei (AGN). Using data obtained from the Suzaku public archive we have performed a statistically driven blind search for Fe XXV Heα and/or Fe XXVI Lyα absorption lines in a large sample of 51 type 1.0-1.9 AGN. Through extensive Monte Carlo simulations we find that statistically significant absorption is detected at E ≳ 6.7 keV in 20/51 sources at the P_MC > 95% level, which corresponds to ~40% of the total sample. In all cases, individual absorption lines are detected independently and simultaneously amongst the two (or three) available XIS detectors, which confirms the robustness of the line detections. The most frequently observed outflow phenomenology consists of two discrete absorption troughs corresponding to Fe XXV Heα and Fe XXVI Lyα at a common velocity shift. From xstar fitting, the mean column density and ionisation parameter for the Fe K absorption components are log(N_H/cm⁻²) ≈ 23 and log(ξ/erg cm s⁻¹) ≈ 4.5, respectively. Measured outflow velocities span a continuous range from <1,500 km s⁻¹ up to ~100,000 km s⁻¹, with mean and median values of ~0.1c and ~0.056c, respectively. The results of this work are consistent with those recently obtained using XMM-Newton and independently provide strong evidence for the existence of very highly ionised circumnuclear material in a significant fraction of both radio-quiet and radio-loud AGN in the local universe.
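The Monte Carlo confidence quoted above (P_MC > 95%) is the fraction of null-hypothesis simulations, i.e. spectra with no absorption line, whose spurious best-fit improvement falls below the improvement observed for the candidate line. A minimal sketch of that bookkeeping, with a hypothetical `simulate_null` callback standing in for the full spectral simulation and refitting step:

```python
import random

def mc_line_significance(delta_chi2_obs, simulate_null, n_sims=10000, seed=1):
    """Monte Carlo confidence P_MC for a candidate absorption line.

    delta_chi2_obs : fit improvement (Delta chi^2) of the real line.
    simulate_null  : callable(rng) -> the spurious Delta chi^2 found when
                     fitting a line to one simulated line-free spectrum.
    Returns the fraction of null simulations that do NOT beat the data.
    """
    rng = random.Random(seed)
    below = sum(simulate_null(rng) < delta_chi2_obs for _ in range(n_sims))
    return below / n_sims
```

In a toy null where the spurious improvement behaves like a chi-squared variable with 2 degrees of freedom (two free line parameters), an observed Δχ² of 15 yields P_MC well above the 95% detection threshold.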

Journal ArticleDOI
TL;DR: The results suggest that, in addition to using the default privacy settings, students have developed a number of strategies to address their privacy needs, and these strategies are used primarily to guard against social privacy threats.
Abstract: The privacy paradox describes people's willingness to disclose personal information on social network sites despite expressing high levels of concern. In this study, we employ the distinction between institutional and social privacy to examine this phenomenon. We investigate what strategies undergraduate students have developed, and their motivations for using specific strategies. We employed a mixed-methods approach that included 77 surveys and 21 in-depth interviews. The results suggest that, in addition to using the default privacy settings, students have developed a number of strategies to address their privacy needs. These strategies are used primarily to guard against social privacy threats and consist of excluding contact information, using the limited profile option, untagging and removing photographs, and limiting Friendship requests from strangers. Privacy strategies are geared toward managing the Facebook profile, which we argue functions as a front stage. This active profile management allows ...

Journal ArticleDOI
TL;DR: In this paper, the authors developed an empirical method for retrieving annual, long-term continuous fields of impervious surface cover from the Landsat archive and applied it to the Washington, D.C.-Baltimore, MD megalopolis from 1984 to 2010.

Journal ArticleDOI
TL;DR: The impact of black carbon (BC) aerosols on the global radiation balance is not well constrained as mentioned in this paper, and at least 20% of the present uncertainty in modeled BC direct radiative forcing (RF) is due to diversity in the simulated vertical profile of BC mass.
Abstract: The impact of black carbon (BC) aerosols on the global radiation balance is not well constrained. Here twelve global aerosol models are used to show that at least 20% of the present uncertainty in modeled BC direct radiative forcing (RF) is due to diversity in the simulated vertical profile of BC mass. Results are from phases 1 and 2 of the global aerosol model intercomparison project (AeroCom). Additionally, a significant fraction of the variability is shown to come from high altitudes, as, globally, more than 40% of the total BC RF is exerted above 5 km. BC emission regions and areas with transported BC are found to have differing characteristics. These insights into the importance of the vertical profile of BC lead us to suggest that observational studies are needed to better characterize the global distribution of BC, including in the upper troposphere.
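The claim that more than 40% of the total BC RF is exerted above 5 km reduces to a simple column integral over the modeled vertical profile. The sketch below is an illustrative diagnostic under the simplifying assumption that each layer's forcing contribution is attributed to its midpoint altitude; it is not the AeroCom diagnostic itself, and the function name is hypothetical.

```python
def rf_fraction_above(profile, z_cut_km):
    """Fraction of the total column direct radiative forcing exerted
    above z_cut_km.

    profile : list of (layer_midpoint_km, rf_contribution) pairs,
              one entry per model layer.
    """
    total = sum(rf for _, rf in profile)
    above = sum(rf for z, rf in profile if z > z_cut_km)
    return above / total
```

For example, a four-layer profile with contributions (3, 3, 2, 2) at midpoints (1, 4, 6, 10) km places 40% of the column forcing above 5 km.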

Journal ArticleDOI
TL;DR: In this article, an unsupervised outlier detection approach for wireless sensor networks is proposed, which is flexible with respect to the outlier definition and uses only single-hop communication, thus permitting very simple node failure detection and message reliability assurance mechanisms.
Abstract: To address the problem of unsupervised outlier detection in wireless sensor networks, we develop an approach that (1) is flexible with respect to the outlier definition, (2) computes the result in-network to reduce both bandwidth and energy consumption, (3) uses only single-hop communication, thus permitting very simple node failure detection and message reliability assurance mechanisms (e.g., carrier-sense), and (4) seamlessly accommodates dynamic updates to data. We examine performance by simulation, using real sensor data streams. Our results demonstrate that our approach is accurate and imposes reasonable communication and power consumption demands.
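The single-hop, in-network design described above means each node decides locally using only values broadcast by its direct neighbors. A minimal sketch of one common flexible outlier definition (distance-based support counting); the specific rule and names here are illustrative assumptions, not the paper's exact definition:

```python
def is_local_outlier(reading, neighbor_readings, radius, min_support):
    """Flag a sensor reading as an outlier when fewer than `min_support`
    single-hop neighbors report a value within `radius` of it.

    Because only directly received broadcasts are used, no multi-hop
    routing, clustering, or global coordination is required, and a
    silent neighbor is simply absent from `neighbor_readings`.
    """
    support = sum(abs(r - reading) <= radius for r in neighbor_readings)
    return support < min_support
```

Both `radius` and `min_support` are tunable, which is one way the approach stays "flexible with respect to the outlier definition": swapping the distance test or threshold changes the definition without changing the communication pattern.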

Journal ArticleDOI
TL;DR: It is concluded that spatial heterogeneity in drivers and responses, and lack of strong continental interconnectivity, probably induce relatively smooth changes at the global scale, without an expectation of marked tipping patterns.
Abstract: Tipping points – where systems shift radically and potentially irreversibly into a different state – have received considerable attention in ecology. Although there is convincing evidence that human drivers can cause regime shifts at local and regional scales, the increasingly invoked concept of planetary scale tipping points in the terrestrial biosphere remains unconfirmed. By evaluating potential mechanisms and drivers, we conclude that spatial heterogeneity in drivers and responses, and lack of strong continental interconnectivity, probably induce relatively smooth changes at the global scale, without an expectation of marked tipping patterns. This implies that identifying critical points along global continua of drivers might be unfeasible and that characterizing global biotic change with single aggregates is inapt.