scispace - formally typeset

Showing papers on "Context (language use) published in 2012"


Journal ArticleDOI
TL;DR: In this paper, the authors extended the unified theory of acceptance and use of technology (UTAUT) to study acceptance and use of technology in a consumer context and proposed UTAUT2, which incorporates three constructs into UTAUT: hedonic motivation, price value, and habit.
Abstract: This paper extends the unified theory of acceptance and use of technology (UTAUT) to study acceptance and use of technology in a consumer context. Our proposed UTAUT2 incorporates three constructs into UTAUT: hedonic motivation, price value, and habit. Individual differences--namely, age, gender, and experience--are hypothesized to moderate the effects of these constructs on behavioral intention and technology use. Results from a two-stage online survey, with technology use data collected four months after the first survey, of 1,512 mobile Internet consumers supported our model. Compared to UTAUT, the extensions proposed in UTAUT2 produced a substantial improvement in the variance explained in behavioral intention (56 percent to 74 percent) and technology use (40 percent to 52 percent). The theoretical and managerial implications of these results are discussed.

6,744 citations


MonographDOI
01 Jan 2012
TL;DR: Social networks operate on many levels, from families up to the level of nations, and play a critical role in determining the way problems are solved, organizations are run, and the degree to which individuals achieve their goals.
Abstract: This book introduces the non-specialist reader to the principal ideas, nature and purpose of social network analysis. Social networks operate on many levels, from families up to the level of nations, and play a critical role in determining the way problems are solved, organizations are run, and the degree to which individuals achieve their goals. Social network theory maps these relationships between individual actors. Though relatively new on the scene it has become hugely influential across the social sciences. Assuming no prior knowledge of quantitative sociology, this book presents the key ideas in context through examples and illustrations. Using a structured approach to understanding work in this area, John Scott signposts further reading and online sources so readers can develop their knowledge and skills to become practitioners of this research method. A series of Frequently Asked Questions takes the reader through the main objections raised against social network analysis and answers the various queries that will come up once the reader has worked their way through the book.
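To make the method concrete, here is a minimal sketch of one of the simplest social-network measures, degree centrality. The actors and ties are invented purely for illustration; this is not an example from the book:

```python
# Toy sketch of degree centrality over a small undirected friendship network.
# All names and edges are invented for the example.

def degree_centrality(edges):
    """Return each actor's degree divided by (n - 1), the standard
    normalization for an undirected network with n actors."""
    neighbors = {}
    for a, b in edges:
        neighbors.setdefault(a, set()).add(b)
        neighbors.setdefault(b, set()).add(a)
    n = len(neighbors)
    return {node: len(nbrs) / (n - 1) for node, nbrs in neighbors.items()}

friendships = [("Ann", "Bob"), ("Ann", "Cal"), ("Ann", "Dee"), ("Bob", "Cal")]
centrality = degree_centrality(friendships)
print(centrality["Ann"])  # Ann has 3 of 3 possible ties -> 1.0
```

Measures of this kind (degree, betweenness, closeness) are the quantitative core that the book's structured approach builds on.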

5,439 citations


Posted Content
TL;DR: In this paper, the authors extended the unified theory of acceptance and use of technology (UTAUT) to study acceptance and use of technology in a consumer context and proposed UTAUT2, which incorporates three constructs into UTAUT: hedonic motivation, price value, and habit.
Abstract: This paper extends the unified theory of acceptance and use of technology (UTAUT) to study acceptance and use of technology in a consumer context. Our proposed UTAUT2 incorporates three constructs into UTAUT: hedonic motivation, price value, and habit. Individual differences — namely, age, gender, and experience — are hypothesized to moderate the effects of these constructs on behavioral intention and technology use. Results from a two-stage online survey, with technology use data collected four months after the first survey, of 1,512 mobile Internet consumers supported our model. Compared to UTAUT, the extensions proposed in UTAUT2 produced a substantial improvement in the variance explained in behavioral intention (56 percent to 74 percent) and technology use (40 percent to 52 percent). The theoretical and managerial implications of these results are discussed.

4,986 citations


Journal ArticleDOI
01 Feb 2012-JAMA
TL;DR: The most recent estimates of obesity prevalence in US children and adolescents for 2009-2010 are presented; trend analyses over a 12-year period indicated a significant increase in obesity prevalence between 1999-2000 and 2009-2010 in males aged 2 through 19 years but not in females.
Abstract: Context: The prevalence of childhood obesity increased in the 1980s and 1990s but there were no significant changes in prevalence between 1999-2000 and 2007-2008 in the United States.

3,941 citations


Journal ArticleDOI
TL;DR: A pre-trained deep neural network hidden Markov model (DNN-HMM) hybrid architecture that trains the DNN to produce a distribution over senones (tied triphone states) as its output that can significantly outperform the conventional context-dependent Gaussian mixture model (GMM)-HMMs.
Abstract: We propose a novel context-dependent (CD) model for large-vocabulary speech recognition (LVSR) that leverages recent advances in using deep belief networks for phone recognition. We describe a pre-trained deep neural network hidden Markov model (DNN-HMM) hybrid architecture that trains the DNN to produce a distribution over senones (tied triphone states) as its output. The deep belief network pre-training algorithm is a robust and often helpful way to initialize deep neural networks generatively that can aid in optimization and reduce generalization error. We illustrate the key components of our model, describe the procedure for applying CD-DNN-HMMs to LVSR, and analyze the effects of various modeling choices on performance. Experiments on a challenging business search dataset demonstrate that CD-DNN-HMMs can significantly outperform the conventional context-dependent Gaussian mixture model (GMM)-HMMs, with an absolute sentence accuracy improvement of 5.8% and 9.2% (or relative error reduction of 16.0% and 23.2%) over the CD-GMM-HMMs trained using the minimum phone error rate (MPE) and maximum-likelihood (ML) criteria, respectively.
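The quoted absolute gains and relative error reductions are linked by simple arithmetic. A quick back-of-envelope check (my own, with the implied baseline accuracy as an assumption, not a figure from the paper):

```python
# Relative error reduction = absolute accuracy gain / baseline error.
# Back-of-envelope check of how the paper's two ways of quoting the
# improvement relate (the implied baseline accuracy is my inference).

def relative_error_reduction(baseline_acc, improved_acc):
    baseline_error = 1.0 - baseline_acc
    return (improved_acc - baseline_acc) / baseline_error

# A 5.8-point absolute gain at a 16.0% relative error reduction implies a
# baseline error of 5.8 / 0.160 = 36.25 points, i.e. ~63.75% sentence accuracy:
print(round(relative_error_reduction(0.6375, 0.6375 + 0.058), 3))  # 0.16
```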

3,120 citations


Journal ArticleDOI
TL;DR: It is argued that individual differences in EFs, as measured with simple laboratory tasks, show both unity and diversity, are related to various clinically and societally important phenomena, and show some developmental stability.
Abstract: Executive functions (EFs)—a set of general-purpose control processes that regulate one’s thoughts and behaviors—have become a popular research topic lately and have been studied in many subdisciplines of psychological science. This article summarizes the EF research that our group has conducted to understand the nature of individual differences in EFs and their cognitive and biological underpinnings. In the context of a new theoretical framework that we have been developing (the unity/diversity framework), we describe four general conclusions that have emerged. Specifically, we argue that individual differences in EFs, as measured with simple laboratory tasks, (a) show both unity and diversity (different EFs are correlated yet separable), (b) reflect substantial genetic contributions, (c) are related to various clinically and societally important phenomena, and (d) show some developmental stability.

2,776 citations


Journal ArticleDOI
TL;DR: The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.
Abstract: An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): ‘Knowledge’, ‘Skills’, ‘Social/Professional Role and Identity’, ‘Beliefs about Capabilities’, ‘Optimism’, ‘Beliefs about Consequences’, ‘Reinforcement’, ‘Intentions’, ‘Goals’, ‘Memory, Attention and Decision Processes’, ‘Environmental Context and Resources’, ‘Social Influences’, ‘Emotions’, and ‘Behavioural Regulation’. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.
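The average silhouette value the validation reports (0.29) can be sketched from scratch. The data below are invented, purely to show what the statistic measures (cohesion within clusters versus separation between them):

```python
# From-scratch silhouette computation on invented 1-D data. For each point:
# a = mean distance to its own cluster, b = mean distance to the nearest
# other cluster, s = (b - a) / max(a, b); the statistic is the mean of s.

def silhouette(points, labels):
    """Mean silhouette width over all points (1-D Euclidean, tiny n)."""
    scores = []
    for i, (p, lab) in enumerate(zip(points, labels)):
        own = [abs(p - points[j]) for j in range(len(points))
               if labels[j] == lab and j != i]
        a = sum(own) / len(own)
        b = min(
            sum(abs(p - points[j]) for j in range(len(points)) if labels[j] == l)
            / labels.count(l)
            for l in set(labels) if l != lab
        )
        scores.append((b - a) / max(a, b))
    return sum(scores) / len(scores)

pts = [0.0, 0.2, 0.4, 5.0, 5.1]
labs = ["A", "A", "A", "B", "B"]
print(round(silhouette(pts, labs), 2))  # well-separated clusters score near 1
```

A value of 0.29, as in the study, indicates moderately coherent domains rather than sharply separated ones.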

2,663 citations


Journal ArticleDOI
TL;DR: The basic elements of the transforming growth factor-β (TGFβ) pathway were revealed and the concept of how the TGFβ signal travels from the membrane to the nucleus has been enriched with additional findings.
Abstract: The basic elements of the transforming growth factor-β (TGFβ) pathway were revealed more than a decade ago. Since then, the concept of how the TGFβ signal travels from the membrane to the nucleus has been enriched with additional findings, and its multifunctional nature and medical relevance have relentlessly come to light. However, an old mystery has endured: how does the context determine the cellular response to TGFβ? Solving this question is key to understanding TGFβ biology and its many malfunctions. Recent progress is pointing at answers.

2,481 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a survey of the use of Wannier functions in the context of electronic-structure theory, including their applications in analyzing the nature of chemical bonding, or as a local probe of phenomena related to electric polarization and orbital magnetization.
Abstract: The electronic ground state of a periodic system is usually described in terms of extended Bloch orbitals, but an alternative representation in terms of localized "Wannier functions" was introduced by Gregory Wannier in 1937. The connection between the Bloch and Wannier representations is realized by families of transformations in a continuous space of unitary matrices, carrying a large degree of arbitrariness. Since 1997, methods have been developed that allow one to iteratively transform the extended Bloch orbitals of a first-principles calculation into a unique set of maximally localized Wannier functions, accomplishing the solid-state equivalent of constructing localized molecular orbitals, or "Boys orbitals" as previously known from the chemistry literature. These developments are reviewed here, and a survey of the applications of these methods is presented. This latter includes a description of their use in analyzing the nature of chemical bonding, or as a local probe of phenomena related to electric polarization and orbital magnetization. Wannier interpolation schemes are also reviewed, by which quantities computed on a coarse reciprocal-space mesh can be used to interpolate onto much finer meshes at low cost, and applications in which Wannier functions are used as efficient basis functions are discussed. Finally the construction and use of Wannier functions outside the context of electronic-structure theory is presented, for cases that include phonon excitations, photonic crystals, and cold-atom optical lattices.
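As a pointer to the underlying mathematics (standard textbook material, not quoted from the review): a Wannier function is a lattice-Fourier transform of the Bloch orbitals, and the arbitrariness mentioned above enters through the gauge freedom of the Bloch states:

```latex
% Wannier function for band n at lattice vector R (standard definition):
w_{n\mathbf{R}}(\mathbf{r}) \;=\; \frac{V}{(2\pi)^3}
  \int_{\mathrm{BZ}} d\mathbf{k}\; e^{-i\mathbf{k}\cdot\mathbf{R}}\,
  \psi_{n\mathbf{k}}(\mathbf{r}),
\qquad
% gauge freedom: any smooth unitary mixing of the Bloch states
\tilde{\psi}_{n\mathbf{k}} \;=\; \sum_m U^{(\mathbf{k})}_{mn}\, \psi_{m\mathbf{k}}
```

The "maximally localized" construction reviewed here chooses the unitaries U^(k) so as to minimize the total quadratic spread of the resulting w_nR.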

2,217 citations


Book
09 Feb 2012
TL;DR: A new type of output layer that allows recurrent networks to be trained directly for sequence labelling tasks where the alignment between the inputs and the labels is unknown, and an extension of the long short-term memory network architecture to multidimensional data, such as images and video sequences.
Abstract: Recurrent neural networks are powerful sequence learners. They are able to incorporate context information in a flexible way, and are robust to localised distortions of the input data. These properties make them well suited to sequence labelling, where input sequences are transcribed with streams of labels. The aim of this thesis is to advance the state-of-the-art in supervised sequence labelling with recurrent networks. Its two main contributions are (1) a new type of output layer that allows recurrent networks to be trained directly for sequence labelling tasks where the alignment between the inputs and the labels is unknown, and (2) an extension of the long short-term memory network architecture to multidimensional data, such as images and video sequences.
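The output layer referred to in (1) is connectionist temporal classification (CTC). Its central device is a many-to-one collapse map B: merge repeated symbols, then delete a reserved "blank" token, so that many frame-level alignments correspond to a single label sequence. A minimal sketch (the symbols are invented):

```python
# CTC's collapse map B: merge adjacent repeats, then drop the blank token.
# Because many frame-level paths collapse to the same label string, the
# network can be trained without knowing the input-label alignment.

BLANK = "-"

def collapse(path):
    """B(path): merge adjacent repeats, then remove blanks."""
    merged = [s for i, s in enumerate(path) if i == 0 or s != path[i - 1]]
    return "".join(s for s in merged if s != BLANK)

print(collapse("aa--ab-bb"))            # -> "aabb"
print(collapse("a-a"), collapse("aa"))  # blank preserves a true repeat: "aa" vs "a"
```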

2,101 citations


Journal ArticleDOI
TL;DR: In this paper, the authors synthesize and extend a suite of conceptual frameworks, research findings, and practice-based knowledge into an integrative framework for collaborative governance, which specifies a set of nested dimensions that encompass a larger system context, a collaborative governance regime, and internal collaborative dynamics and actions that can generate impacts and adaptations across the systems.
Abstract: Collaborative governance draws from diverse realms of practice and research in public administration. This article synthesizes and extends a suite of conceptual frameworks, research findings, and practice-based knowledge into an integrative framework for collaborative governance. The framework specifies a set of nested dimensions that encompass a larger system context, a collaborative governance regime, and its internal collaborative dynamics and actions that can generate impacts and adaptations across the systems. The framework provides a broad conceptual map for situating and exploring components of cross-boundary governance systems that range from policy or program-based intergovernmental cooperation to place-based regional collaboration with nongovernmental stakeholders to public-private partnerships. The framework integrates knowledge about individual incentives and barriers to collective action, collaborative social learning and conflict resolution processes, and institutional arrangements for cross-boundary collaboration. It is presented as a general framework that might be applied to analyses at different scales, in different policy arenas, and varying levels of complexity. The article also offers 10 propositions about the dynamic interactions among components within the framework and concludes with a discussion about the implications of the framework for theory, research, evaluation, and practice.

31 Dec 2012
TL;DR: The current state of the SUMO package, its major applications, both by research topic and by example, as well as future developments and extensions are described.
Abstract: SUMO is an open source traffic simulation package including the simulation application itself as well as supporting tools, mainly for network import and demand modeling. SUMO helps to investigate a large variety of research topics, mainly in the context of traffic management and vehicular communications. We describe the current state of the package, its major applications, both by research topic and by example, as well as future developments and extensions.
Keywords: microscopic traffic simulation; traffic management; open source; software

Journal ArticleDOI
TL;DR: In this paper, a review of wearable sensors and systems that are relevant to the field of rehabilitation is presented, focusing on health and wellness, safety, home rehabilitation, assessment of treatment efficacy, and early detection of disorders.
Abstract: The aim of this review paper is to summarize recent developments in the field of wearable sensors and systems that are relevant to the field of rehabilitation. The growing body of work focused on the application of wearable technology to monitor older adults and subjects with chronic conditions in the home and community settings justifies the emphasis of this review paper on summarizing clinical applications of wearable technology currently undergoing assessment rather than describing the development of new wearable sensors and systems. A short description of key enabling technologies (i.e. sensor technology, communication technology, and data analysis techniques) that have allowed researchers to implement wearable systems is followed by a detailed description of major areas of application of wearable technology. Applications described in this review paper include those that focus on health and wellness, safety, home rehabilitation, assessment of treatment efficacy, and early detection of disorders. The integration of wearable and ambient sensors is discussed in the context of achieving home monitoring of older adults and subjects with chronic conditions. Future work required to advance the field toward clinical deployment of wearable sensors and systems is discussed.

Posted Content
TL;DR: New estimates of the global economic burden of non-communicable diseases in 2010 are developed, and the size of the burden through 2030 is projected, to capture the thinking of the business community about the impact of NCDs on their enterprises.
Abstract: As policy-makers search for ways to reduce poverty and income inequality, and to achieve sustainable income growth, they are being encouraged to focus on an emerging challenge to health, well-being and development: non-communicable diseases (NCDs). After all, 63% of all deaths worldwide currently stem from NCDs – chiefly cardiovascular diseases, cancers, chronic respiratory diseases and diabetes. These deaths are distributed widely among the world’s population – from high-income to low-income countries and from young to old (about one-quarter of all NCD deaths occur below the age of 60, amounting to approximately 9 million deaths per year). NCDs have a large impact, undercutting productivity and boosting healthcare outlays. Moreover, the number of people affected by NCDs is expected to rise substantially in the coming decades, reflecting an ageing and increasing global population. With this in mind, the United Nations is holding its first High-Level Meeting on NCDs on 19-20 September 2011 – this is only the second time that a high-level UN meeting is being dedicated to a health topic (the first time being on HIV/AIDS in 2001). Over the years, much work has been done estimating the human toll of NCDs, but work on estimating the economic toll is far less advanced. In this report, the World Economic Forum and the Harvard School of Public Health try to inform and stimulate further debate by developing new estimates of the global economic burden of NCDs in 2010, and projecting the size of the burden through 2030. Three distinct approaches are used to compute the economic burden: (1) the standard cost of illness method; (2) macroeconomic simulation and (3) the value of a statistical life. This report includes not only the four major NCDs (the focus of the UN meeting), but also mental illness, which is a major contributor to the burden of disease worldwide.
This evaluation takes place in the context of enormous global health spending, serious concerns about already strained public finances and worries about lacklustre economic growth. The report also tries to capture the thinking of the business community about the impact of NCDs on their enterprises. Five key messages emerge:
• First, NCDs already pose a substantial economic burden and this burden will evolve into a staggering one over the next two decades. For example, with respect to cardiovascular disease, chronic respiratory disease, cancer, diabetes and mental health, the macroeconomic simulations suggest a cumulative output loss of US$ 47 trillion over the next two decades. This loss represents 75% of global GDP in 2010 (US$ 63 trillion). It also represents enough money to eradicate two dollar-a-day poverty among the 2.5 billion people in that state for more than half a century.
• Second, although high-income countries currently bear the biggest economic burden of NCDs, the developing world, especially middle-income countries, is expected to assume an ever larger share as their economies and populations grow.
• Third, cardiovascular disease and mental health conditions are the dominant contributors to the global economic burden of NCDs.
• Fourth, NCDs are front and centre on business leaders’ radar. The World Economic Forum’s annual Executive Opinion Survey (EOS), which feeds into its Global Competitiveness Report, shows that about half of all business leaders surveyed worry that at least one NCD will hurt their company’s bottom line in the next five years, with similarly high levels of concern in low-, middle- and high-income countries – especially in countries where the quality of healthcare or access to healthcare is perceived to be poor. These NCD-driven concerns are markedly higher than those reported for the communicable diseases of HIV/AIDS, malaria and tuberculosis.
• Fifth, the good news is that there appear to be numerous options available to prevent and control NCDs. For example, the WHO has identified a set of interventions they call “Best Buys”. There is also considerable scope for the design and implementation of programmes aimed at behaviour change among youth and adolescents, and more cost-effective models of care – models that reduce the care-taking burden that falls on untrained family members. Further research on the benefits of such interventions in relation to their costs is much needed.
It is our hope that this report informs the resource allocation decisions of the world’s economic leaders – top government officials, including finance ministers and their economic advisors – who control large amounts of spending at the national level and have the power to react to the formidable economic threat posed by NCDs.
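The headline ratio in the first message can be checked with one line of arithmetic (my own check, not the report's):

```python
# Sanity check of the report's headline figures: a US$ 47 trillion cumulative
# loss set against 2010 global GDP of US$ 63 trillion.
loss_usd_trillion = 47
gdp_2010_usd_trillion = 63
share = loss_usd_trillion / gdp_2010_usd_trillion
print(round(share * 100))  # ~75 (percent), matching the text
```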

Journal ArticleDOI
TL;DR: In this paper, the authors examine the nature of access as it contrasts to ownership and sharing, specifically the consumer-object, consumer-consumer, and consumer-marketer relationships, and identify four outcomes of negative reciprocity resulting in a big-brother model of governance, and a deterrence of brand community.
Abstract: Access-based consumption, defined as transactions that can be market mediated but where no transfer of ownership takes place, is becoming increasingly popular, yet it is not well theorized. This study examines the nature of access as it contrasts to ownership and sharing, specifically the consumer-object, consumer-consumer, and consumer-marketer relationships. Six dimensions are identified to distinguish among the range of access-based consumptionscapes: temporality, anonymity, market mediation, consumer involvement, the type of accessed object, and political consumerism. Access-based consumption is examined in the context of car sharing via an interpretive study of Zipcar consumers. Four outcomes of these dimensions in the context of car sharing are identified: lack of identification, varying significance of use and sign value, negative reciprocity resulting in a big-brother model of governance, and a deterrence of brand community. The implications of our findings for understanding the nature of exchange, consumption, and brand community are discussed.

Book
05 Jun 2012
TL;DR: This chapter discusses semantic structure and semantic changes: English perception-verbs in an Indo-European context with a focus on the role of subordination in the meaning ofverbs.
Abstract: Dedication Acknowledgements Preface 1. Introduction 2. Semantic structure and semantic changes: English perception-verbs in an Indo-European context 3. Modality 4. Conjunction, coordination and subordination 5. Conditionals 6. Retrospect and prospect References Index.

Journal ArticleDOI
TL;DR: The current PANTHER process as a whole is described, along with the website tools for analysis of user-uploaded data; PANTHER now includes stable database identifiers for inferred ancestral genes, which are used to associate inferred gene attributes with particular genes in the common ancestral genomes of extant species.
Abstract: The data and tools in PANTHER—a comprehensive, curated database of protein families, trees, subfamilies and functions available at http://pantherdb.org—have undergone continual, extensive improvement for over a decade. Here, we describe the current PANTHER process as a whole, as well as the website tools for analysis of user-uploaded data. The main goals of PANTHER remain essentially unchanged: the accurate inference (and practical application) of gene and protein function over large sequence databases, using phylogenetic trees to extrapolate from the relatively sparse experimental information from a few model organisms. Yet the focus of PANTHER has continually shifted toward more accurate and detailed representations of evolutionary events in gene family histories. The trees are now designed to represent gene family evolution, including inference of evolutionary events, such as speciation and gene duplication. Subfamilies are still curated and used to define HMMs, but gene ontology functional annotations can now be made at any node in the tree, and are designed to represent gain and loss of function by ancestral genes during evolution. Finally, PANTHER now includes stable database identifiers for inferred ancestral genes, which are used to associate inferred gene attributes with particular genes in the common ancestral genomes of extant species.

Posted Content
TL;DR: Overall, this chapter describes elements of the practice used to successfully and efficiently train and debug large-scale and often deep multi-layer neural networks and closes with open questions about the training difficulties observed with deeper architectures.
Abstract: Learning algorithms related to artificial neural networks and in particular for Deep Learning may seem to involve many bells and whistles, called hyper-parameters. This chapter is meant as a practical guide with recommendations for some of the most commonly used hyper-parameters, in particular in the context of learning algorithms based on back-propagated gradient and gradient-based optimization. It also discusses how to deal with the fact that more interesting results can be obtained when allowing one to adjust many hyper-parameters. Overall, it describes elements of the practice used to successfully and efficiently train and debug large-scale and often deep multi-layer neural networks. It closes with open questions about the training difficulties observed with deeper architectures.
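A toy illustration (invented for this summary, not taken from the chapter) of why one such hyper-parameter, the learning rate, matters in gradient-based optimization:

```python
# Gradient descent on f(w) = (w - 3)^2: the learning rate alone decides
# whether the iterates converge to the optimum w* = 3 or blow up.

def gradient_descent(lr, steps=50, w=0.0):
    for _ in range(steps):
        grad = 2 * (w - 3)  # f'(w) for f(w) = (w - 3)^2
        w -= lr * grad
    return w

print(round(gradient_descent(lr=0.1), 4))   # converges near w* = 3
print(abs(gradient_descent(lr=1.5)) > 1e6)  # too large a rate diverges -> True
```

Real deep networks add many more such knobs (initialization, momentum, batch size, schedules), which is exactly the practice the chapter codifies.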

Journal ArticleDOI
TL;DR: In this article, the authors present an overview of the achievements and the status of integrability in the context of the AdS/CFT correspondence as of the year 2010.
Abstract: This is the introductory chapter of a review collection on integrability in the context of the AdS/CFT correspondence. In the collection we present an overview of the achievements and the status of this subject as of the year 2010.

Journal ArticleDOI
15 Nov 2012-Nature
TL;DR: It is shown that the previously reported increase in global drought is overestimated because the PDSI uses a simplified model of potential evaporation that responds only to changes in temperature and thus responds incorrectly to global warming in recent decades.
Abstract: A physically based approach to drought modelling shows that there has been little change in drought from 1950 to 2008, contradicting previous work that suggested an increase in recent years. Published assessments of historic changes in drought during recent decades have suggested that the frequency and area of droughts have been increasing. Here Justin Sheffield et al. show that this prior work was flawed, because of an inappropriate calculation of drought metrics. Using a more physically based approach, the team shows that there has in fact been little change in drought during the period 1950 to 2008. Drought is expected to increase in frequency and severity in the future as a result of climate change, mainly as a consequence of decreases in regional precipitation but also because of increasing evaporation driven by global warming [1,2,3]. Previous assessments of historic changes in drought over the late twentieth and early twenty-first centuries indicate that this may already be happening globally. In particular, calculations of the Palmer Drought Severity Index (PDSI) show a decrease in moisture globally since the 1970s with a commensurate increase in the area in drought that is attributed, in part, to global warming [4,5]. The simplicity of the PDSI, which is calculated from a simple water-balance model forced by monthly precipitation and temperature data, makes it an attractive tool in large-scale drought assessments, but may give biased results in the context of climate change [6]. Here we show that the previously reported increase in global drought is overestimated because the PDSI uses a simplified model of potential evaporation [7] that responds only to changes in temperature and thus responds incorrectly to global warming in recent decades.
More realistic calculations, based on the underlying physical principles [8] that take into account changes in available energy, humidity and wind speed, suggest that there has been little change in drought over the past 60 years. The results have implications for how we interpret the impact of global warming on the hydrological cycle and its extremes, and may help to explain why palaeoclimate drought reconstructions based on tree-ring data diverge from the PDSI-based drought record in recent years [9,10].
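To make the contrast concrete (standard hydrology reference material, not taken from the paper): the Thornthwaite-type potential evaporation inside the PDSI depends on temperature alone, whereas the FAO-56 Penman–Monteith reference evapotranspiration combines available energy, humidity and wind:

```latex
% FAO-56 Penman-Monteith reference evapotranspiration (mm day^-1):
ET_0 \;=\; \frac{0.408\,\Delta\,(R_n - G)
      \;+\; \gamma\,\dfrac{900}{T + 273}\,u_2\,(e_s - e_a)}
     {\Delta \;+\; \gamma\,(1 + 0.34\,u_2)}
% Delta: slope of the saturation vapour-pressure curve; R_n - G: net
% radiation minus soil heat flux; gamma: psychrometric constant; T: mean
% air temperature; u_2: wind speed at 2 m; e_s - e_a: vapour-pressure deficit.
```

Under warming, the two formulations diverge: the temperature-only form attributes all extra atmospheric demand to temperature, which is the bias the paper identifies.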

Journal ArticleDOI
TL;DR: The focus of this Commentary will be on identifying and describing the fundamental features of 3D cell culture systems that influence cell structure, adhesion, mechanotransduction and signaling in response to soluble factors, which regulate overall cellular function in ways that depart dramatically from traditional 2D culture formats.
Abstract: Summary Much of our understanding of the biological mechanisms that underlie cellular functions, such as migration, differentiation and force-sensing has been garnered from studying cells cultured on two-dimensional (2D) glass or plastic surfaces. However, more recently the cell biology field has come to appreciate the dissimilarity between these flat surfaces and the topographically complex, three-dimensional (3D) extracellular environments in which cells routinely operate in vivo. This has spurred substantial efforts towards the development of in vitro 3D biomimetic environments and has encouraged much cross-disciplinary work among biologists, material scientists and tissue engineers. As we move towards more-physiological culture systems for studying fundamental cellular processes, it is crucial to define exactly which factors are operative in 3D microenvironments. Thus, the focus of this Commentary will be on identifying and describing the fundamental features of 3D cell culture systems that influence cell structure, adhesion, mechanotransduction and signaling in response to soluble factors, which – in turn – regulate overall cellular function in ways that depart dramatically from traditional 2D culture formats. Additionally, we will describe experimental scenarios in which 3D culture is particularly relevant, highlight recent advances in materials engineering for studying cell biology, and discuss examples where studying cells in a 3D context provided insights that would not have been observed in traditional 2D systems.

Journal ArticleDOI
TL;DR: In this paper, the authors provide an update on ROS and redox signalling in the context of abiotic stress responses, while addressing their role in retrograde regulation, systemic acquired acclimation and cellular coordination in plants.
Abstract: The redox state of the chloroplast and mitochondria, the two main powerhouses of photosynthesizing eukaryotes, is maintained by a delicate balance between energy production and consumption, and affected by the need to avoid increased production of reactive oxygen species (ROS). These demands are especially critical during exposure to extreme environmental conditions, such as high light (HL) intensity, heat, drought or a combination of different environmental stresses. Under these conditions, ROS and redox cues, generated in the chloroplast and mitochondria, are essential for maintaining normal energy and metabolic fluxes, optimizing different cell functions, activating acclimation responses through retrograde signalling, and controlling whole-plant systemic signalling pathways. Regulation of the multiple redox and ROS signals in plants requires a high degree of coordination and balance between signalling and metabolic pathways in different cellular compartments. In this review, we provide an update on ROS and redox signalling in the context of abiotic stress responses, while addressing their role in retrograde regulation, systemic acquired acclimation and cellular coordination in plants.

Journal ArticleDOI
TL;DR: The Integrated Microbial Genomes system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context and provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context.
Abstract: The Integrated Microbial Genomes (IMG) system serves as a community resource for comparative analysis of publicly available genomes in a comprehensive integrated context. IMG integrates publicly available draft and complete genomes from all three domains of life with a large number of plasmids and viruses. IMG provides tools and viewers for analyzing and reviewing the annotations of genes and genomes in a comparative context. IMG's data content and analytical capabilities have been continuously extended through regular updates since its first release in March 2005. IMG is available at http://img.jgi.doe.gov. Companion IMG systems provide support for expert review of genome annotations (IMG/ER: http://img.jgi.doe.gov/er), teaching courses and training in microbial genome analysis (IMG/EDU: http://img.jgi.doe.gov/edu) and analysis of genomes related to the Human Microbiome Project (IMG/HMP: http://www.hmpdacc-resources.org/img_hmp).

Journal ArticleDOI
04 Jan 2012-JAMA
TL;DR: The Swedish Obese Subjects (SOS) study, as discussed by the authors, was conducted at 25 public surgical departments and 480 primary health care centers in Sweden with 2010 obese participants who underwent bariatric surgery and 2037 contemporaneously matched obese controls who received usual care.
Abstract: Context Obesity is a risk factor for cardiovascular events. Weight loss might protect against cardiovascular events, but solid evidence is lacking. Objective To study the association between bariatric surgery, weight loss, and cardiovascular events. Design, Setting, and Participants The Swedish Obese Subjects (SOS) study is an ongoing, nonrandomized, prospective, controlled study conducted at 25 public surgical departments and 480 primary health care centers in Sweden of 2010 obese participants who underwent bariatric surgery and 2037 contemporaneously matched obese controls who received usual care. Patients were recruited between September 1, 1987, and January 31, 2001. Date of analysis was December 31, 2009, with median follow-up of 14.7 years (range, 0-20 years). Inclusion criteria were age 37 to 60 years and a body mass index of at least 34 in men and at least 38 in women. Exclusion criteria were identical in surgery and control patients. Surgery patients underwent gastric bypass (13.2%), banding (18.7%), or vertical banded gastroplasty (68.1%), and controls received usual care in the Swedish primary health care system. Physical and biochemical examinations and database cross-checks were undertaken at preplanned intervals. Main Outcome Measures The primary end point of the SOS study (total mortality) was published in 2007. Myocardial infarction and stroke were predefined secondary end points, considered separately and combined. Results Bariatric surgery was associated with a reduced number of cardiovascular deaths (28 events among 2010 patients in the surgery group vs 49 events among 2037 patients in the control group; adjusted hazard ratio [HR], 0.47; 95% CI, 0.29-0.76; P = .002).
The number of total first-time (fatal or nonfatal) cardiovascular events (myocardial infarction or stroke, whichever came first) was lower in the surgery group (199 events among 2010 patients) than in the control group (234 events among 2037 patients; adjusted HR, 0.67; 95% CI, 0.54-0.83). Conclusion Compared with usual care, bariatric surgery was associated with a reduced number of cardiovascular deaths and a lower incidence of cardiovascular events in obese adults.

Journal ArticleDOI
TL;DR: In this article, the authors review cellulose nanofibril-based green composites research and applications through examples, discussing the processing, extraction, properties, chronological development and applications of cellulose and cellulose-based nanocomposite materials.

Journal ArticleDOI
TL;DR: In this paper, a new approach for the assessment of both vertical and lateral collinearity in variance-based structural equation modeling is proposed and demonstrated in the context of the illustrative analysis.
Abstract: Variance-based structural equation modeling is extensively used in information systems research, and many related findings may have been distorted by hidden collinearity. This is a problem that may extend to multivariate analyses in general, in the field of information systems as well as in many other fields. In multivariate analyses, collinearity is usually assessed as a predictor-predictor relationship phenomenon, where two or more predictors are checked for redundancy. This type of assessment addresses vertical, or “classic,” collinearity. However, another type of collinearity may also exist, called here “lateral” collinearity. It refers to predictor-criterion collinearity. Lateral collinearity problems are exemplified based on an illustrative variance-based structural equation modeling analysis. The analysis employs WarpPLS 2.0, with the results double-checked with other statistical analysis software tools. It is shown that standard validity and reliability tests do not properly capture lateral collinearity. A new approach for the assessment of both vertical and lateral collinearity in variance-based structural equation modeling is proposed and demonstrated in the context of the illustrative analysis.
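The vertical/lateral distinction above can be illustrated with variance inflation factors (VIFs). The sketch below is a minimal numpy illustration, not the paper's WarpPLS procedure: the vertical check regresses one predictor on the others, while the lateral ("full collinearity") check treats the criterion as the regressed variable. The synthetic data, the helper name `vif`, and the commonly cited ~3.3 cutoff are assumptions made for the example.

```python
import numpy as np

def vif(target, others):
    """Variance inflation factor of `target` given the columns in `others`:
    1 / (1 - R^2) from an OLS regression of `target` on `others` + intercept."""
    X = np.column_stack([np.ones(len(target)), others])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    resid = target - X @ beta
    r2 = 1.0 - np.sum(resid ** 2) / np.sum((target - target.mean()) ** 2)
    return 1.0 / (1.0 - r2)

rng = np.random.default_rng(0)
n = 500
x1 = rng.normal(size=n)
x2 = x1 + 0.1 * rng.normal(size=n)   # predictor nearly redundant with x1
y = x1 + 0.05 * rng.normal(size=n)   # criterion almost identical to x1

# Vertical (predictor-predictor) collinearity: VIF of x1 given x2
v_vertical = vif(x1, np.column_stack([x2]))
# Lateral (predictor-criterion) collinearity: VIF of y given the predictors
v_lateral = vif(y, np.column_stack([x1, x2]))
print(v_vertical, v_lateral)  # both far above the usual ~3.3 cutoff
```

Standard reliability tests would accept `x1` and `y` as distinct constructs here, yet the lateral VIF flags them as statistically near-redundant, which is the paper's central point.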

Proceedings Article
08 Jul 2012
TL;DR: A new neural network architecture is presented which learns word embeddings that better capture the semantics of words by incorporating both local and global document context, and accounts for homonymy and polysemy by learning multiple embedDings per word.
Abstract: Unsupervised word representations are very useful in NLP tasks both as inputs to learning algorithms and as extra word features in NLP systems. However, most of these models are built with only local context and one representation per word. This is problematic because words are often polysemous and global context can also provide useful information for learning word meanings. We present a new neural network architecture which 1) learns word embeddings that better capture the semantics of words by incorporating both local and global document context, and 2) accounts for homonymy and polysemy by learning multiple embeddings per word. We introduce a new dataset with human judgments on pairs of words in sentential context, and evaluate our model on it, showing that our model outperforms competitive baselines and other neural language models.
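The multi-prototype idea (one embedding per word *sense* rather than per word) can be sketched without the paper's neural architecture: represent each occurrence of a word by the average of its neighbors' vectors, cluster those occurrence vectors, and keep one prototype per cluster. The toy 2-dimensional embeddings, corpus, and helper names below are invented for illustration only.

```python
import numpy as np

def context_vectors(corpus, word, embeddings, window=2):
    """Average embedding of the words around each occurrence of `word`."""
    vecs = []
    for sent in corpus:
        for i, w in enumerate(sent):
            if w != word:
                continue
            ctx = sent[max(0, i - window):i] + sent[i + 1:i + 1 + window]
            ctx = [c for c in ctx if c in embeddings]
            if ctx:
                vecs.append(np.mean([embeddings[c] for c in ctx], axis=0))
    return np.array(vecs)

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means; returns the k centroids (one prototype per sense)."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

# Toy corpus: "bank" appears in a river sense and a finance sense.
emb = {"river": np.array([1.0, 0.0]), "water": np.array([0.9, 0.1]),
       "money": np.array([0.0, 1.0]), "loan": np.array([0.1, 0.9])}
corpus = [["river", "bank", "water"], ["money", "bank", "loan"],
          ["water", "bank", "river"], ["loan", "bank", "money"]]
protos = kmeans(context_vectors(corpus, "bank", emb), k=2)
print(protos)  # one prototype near the river words, one near the finance words
```

At lookup time, an occurrence of "bank" would be assigned the prototype nearest its own context vector, which is how polysemy is disambiguated in this family of models.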

Journal ArticleDOI
TL;DR: The generalized form of context-dependent PPI approach has increased flexibility of statistical modeling, and potentially improves model fit, specificity to true negative findings, and sensitivity to true positive findings.
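The difference between a standard PPI design and the generalized (condition-specific) form can be sketched as a design-matrix comparison. This is a deliberately simplified illustration with a made-up two-condition design: real gPPI also deconvolves the BOLD signal to the neural level before forming the interaction terms, which is omitted here.

```python
import numpy as np

rng = np.random.default_rng(1)
n_scans = 200
seed_ts = rng.normal(size=n_scans)  # seed-region time series (simulated)

# Two task conditions as boxcar regressors (hypothetical design)
cond_a = np.zeros(n_scans); cond_a[20:60] = cond_a[120:160] = 1.0
cond_b = np.zeros(n_scans); cond_b[60:100] = cond_b[160:200] = 1.0

# Standard PPI: one psychological contrast (A - B) and ONE interaction term
contrast = cond_a - cond_b
X_standard = np.column_stack([np.ones(n_scans), seed_ts, contrast,
                              seed_ts * contrast])

# Generalized PPI: each condition gets its own regressor AND its own
# seed-by-condition interaction, so coupling is modeled per condition
X_gppi = np.column_stack([np.ones(n_scans), seed_ts, cond_a, cond_b,
                          seed_ts * cond_a, seed_ts * cond_b])
print(X_standard.shape, X_gppi.shape)
```

Because the generalized form spans the full task space rather than a single contrast, condition-specific coupling that would be averaged away in the single interaction term remains estimable, which is the source of the improved model fit the summary describes.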

Posted Content
TL;DR: For example, Flyvbjerg as discussed by the authors argues that the success of the natural sciences in producing cumulative and predictive theory simply does not work in any of the social sciences, and argues that social scientific research should be transformed into an activity performed in public for publics, sometimes to clarify, often to intervene, and always to serve as eyes and ears in ongoing efforts to understand the present and to deliberate about the future.
Abstract: If we want to empower and re-enchant social scientific research, we need to do three things. First, we must drop all pretence, however indirect, at emulating the success of the natural sciences in producing cumulative and predictive theory, for their approach simply does not work in any of the social sciences. (For the full argument see Flyvbjerg, 2001.) Second, we must address problems that matter to groups in the local, national and global communities in which we live, and we must do it in ways that matter; we must focus on issues of context, values and power, as advocated by great social scientists from Aristotle and Machiavelli to Max Weber and Pierre Bourdieu. Finally, we must effectively and dialogically communicate the results of our research to our fellow citizens, the ‘public’, and carefully listen to their feedback. If we do this – focus on specific values and interests in the context of particular power relations – we may successfully transform social scientific research into an activity performed in public for publics, sometimes to clarify, sometimes to intervene, sometimes to generate new perspectives, and always to serve as eyes and ears in ongoing efforts to understand the present and to deliberate about the future. We may, in short, arrive at social research that matters.

Journal ArticleDOI
TL;DR: In this paper, the authors propose Kernel Density Estimation (KDE) as a more robust alternative to the Probability Density Plot (PDP); like the PDP, the KDE involves summing a set of Gaussian distributions, but it does not explicitly take the analytical uncertainties into account, instead choosing its bandwidth from the data themselves.
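The contrast between the two estimators can be sketched in a few lines: both sum one Gaussian per measured grain age, but the PDP sets each Gaussian's width to that grain's analytical uncertainty, whereas the KDE uses a single data-driven bandwidth. The ages, uncertainties, and fixed bandwidth below are hypothetical, and this generic sketch is not the paper's adaptive-bandwidth implementation.

```python
import numpy as np

ages = np.array([1050., 1060., 1500., 1520., 2700.])  # hypothetical ages (Ma)
errs = np.array([10., 40., 15., 60., 20.])            # 1-sigma analytical errors
grid = np.linspace(900., 2900., 2001)

def pdp(grid, ages, errs):
    """Probability density plot: one Gaussian per grain, width equal to that
    grain's analytical uncertainty, summed and normalised."""
    dens = sum(np.exp(-0.5 * ((grid - a) / s) ** 2) / (s * np.sqrt(2 * np.pi))
               for a, s in zip(ages, errs))
    return dens / len(ages)

def kde(grid, ages, bandwidth):
    """Kernel density estimate: the same sum of Gaussians, but with a single
    bandwidth chosen from the data, independent of the analytical errors."""
    dens = sum(np.exp(-0.5 * ((grid - a) / bandwidth) ** 2)
               / (bandwidth * np.sqrt(2 * np.pi)) for a in ages)
    return dens / len(ages)

d_pdp = pdp(grid, ages, errs)
d_kde = kde(grid, ages, bandwidth=30.0)
step = grid[1] - grid[0]
print(d_pdp.sum() * step, d_kde.sum() * step)  # both integrate to ~1
```

The precisely dated grains produce sharp PDP spikes while imprecise ones are smeared out, which is the behavior the paper criticises; the KDE smooths all grains equally.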