
Showing papers by "University of Nebraska Omaha published in 2013"


Journal ArticleDOI
TL;DR: A comprehensive molecular phylogeny for bony fishes that includes representatives of all major lineages and the order Perciformes, considered by many a polyphyletic taxonomic waste basket, is defined for the first time as a monophyletic group in the global phylogeny.
Abstract: The tree of life of fishes is in a state of flux because we still lack a comprehensive phylogeny that includes all major groups. The situation is most critical for a large clade of spiny-finned fishes, traditionally referred to as percomorphs, whose uncertain relationships have plagued ichthyologists for over a century. Most of what we know about the higher-level relationships among fish lineages has been based on morphology, but a rapid influx of molecular studies is changing many established systematic concepts. We report a comprehensive molecular phylogeny for bony fishes that includes representatives of all major lineages. DNA sequence data for 21 molecular markers (one mitochondrial and 20 nuclear genes) were collected for 1410 bony fish taxa, plus four tetrapod species and two chondrichthyan outgroups (total 1416 terminals). Bony fish diversity is represented by 1093 genera, 369 families, and all traditionally recognized orders. The maximum likelihood tree provides unprecedented resolution and high bootstrap support for most backbone nodes, defining for the first time a global phylogeny of fishes. The general structure of the tree is in agreement with expectations from previous morphological and molecular studies, but significant new clades arise. Most interestingly, the high degree of uncertainty among percomorphs is now resolved into nine well-supported supraordinal groups. The order Perciformes, considered by many a polyphyletic taxonomic waste basket, is defined for the first time as a monophyletic group in the global phylogeny. A new classification that reflects our phylogenetic hypothesis is proposed to facilitate communication about the newly found structure of the tree of life of fishes. Finally, the molecular phylogeny is calibrated using 60 fossil constraints to produce a comprehensive time tree.
The new time-calibrated phylogeny will provide the basis for and stimulate new comparative studies to better understand the evolution of the amazing diversity of fishes.

740 citations


Journal ArticleDOI
TL;DR: The results demonstrate that both ApEn and SampEn are extremely sensitive to parameter choices, especially for very short data sets, N ≤ 200, and should be used with extreme caution when choosing parameters for experimental studies with both algorithms.
Abstract: Approximate entropy (ApEn) and sample entropy (SampEn) are mathematical algorithms created to measure the repeatability or predictability within a time series. Both algorithms are extremely sensitive to their input parameters: m (length of the data segment being compared), r (similarity criterion), and N (length of data). There is no established consensus on parameter selection in short data sets, especially for biological data. Therefore, the purpose of this research was to examine the robustness of these two entropy algorithms by exploring the effect of changing parameter values on short data sets. Data with known theoretical entropy qualities, as well as experimental data from both healthy young and older adults, were utilized. Our results demonstrate that both ApEn and SampEn are extremely sensitive to parameter choices, especially for very short data sets, N ≤ 200. We suggest using N larger than 200 and an m of 2, and examining several r values before selecting parameters. Extreme caution should be used when choosing parameters for experimental studies with both algorithms. Based on our current findings, it appears that SampEn is more reliable for short data sets. SampEn was less sensitive to changes in data length and demonstrated fewer problems with relative consistency.
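The parameter sensitivity the abstract describes is easy to reproduce. Below is a minimal NumPy sketch of the SampEn algorithm (not the authors' implementation): m and r match the abstract's definitions, and r is expressed as a fraction of the series' standard deviation, a common convention.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy of a 1-D time series.

    m : length of the data segments being compared
    r : similarity criterion, as a fraction of the standard deviation
    """
    x = np.asarray(x, dtype=float)
    tol = r * np.std(x)
    n = len(x)

    def count_matches(m):
        # All overlapping templates of length m.
        templates = np.array([x[i:i + m] for i in range(n - m)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to every later template (self-matches excluded).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= tol)
        return count

    b = count_matches(m)      # template matches of length m
    a = count_matches(m + 1)  # template matches of length m + 1
    if a == 0 or b == 0:
        return np.inf  # undefined: too few matches (typical of very short N)
    return -np.log(a / b)
```

Running this on a short noisy series versus a smooth periodic one illustrates the paper's point: for small N, small changes in m or r can swing the estimate substantially, and a predictable signal yields a much lower SampEn than noise.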

669 citations


Journal ArticleDOI
08 Nov 2013-BMJ
TL;DR: Overall, among all stent types, the newer generation durable polymer drug eluting stents were the most efficacious (lowest target vessel revascularization rate) stents, and cobalt chromium everolimus eluting stents were the safest with significant reductions in definite stent thrombosis.
Abstract: Objective To compare the efficacy and safety of biodegradable polymer drug eluting stents with those of bare metal stents and durable polymer drug eluting stents. Design Mixed treatment comparison meta-analysis of 258 544 patient years of follow-up from randomized trials. Data sources and study selection PubMed, Embase, and Central were searched for randomized trials comparing any of the Food and Drug Administration approved durable polymer drug eluting stents (sirolimus eluting, paclitaxel eluting, cobalt chromium everolimus eluting, platinum chromium everolimus eluting, zotarolimus eluting-Endeavor, and zotarolimus eluting-Resolute) or biodegradable polymer drug eluting stents, with each other or against bare metal stents. Outcomes Long term efficacy (target vessel revascularization, target lesion revascularization) and safety (death, myocardial infarction, stent thrombosis). Landmark analysis at more than one year was evaluated to assess the potential late benefit of biodegradable polymer drug eluting stents. Results From 126 randomized trials and 258 544 patient years of follow-up, for long term efficacy (target vessel revascularization), biodegradable polymer drug eluting stents were superior to paclitaxel eluting stents (rate ratio 0.66, 95% credibility interval 0.57 to 0.78) and zotarolimus eluting stent-Endeavor (0.69, 0.56 to 0.84) but not to newer generation durable polymer drug eluting stents (for example: 1.03, 0.89 to 1.21 versus cobalt chromium everolimus eluting stents). Similarly, biodegradable polymer drug eluting stents were superior to paclitaxel eluting stents (rate ratio 0.61, 0.37 to 0.89) but inferior to cobalt chromium everolimus eluting stents (2.04, 1.27 to 3.35) for long term safety (definite stent thrombosis). 
In the landmark analysis after one year, biodegradable polymer drug eluting stents were superior to sirolimus eluting stents for definite stent thrombosis (rate ratio 0.29, 0.10 to 0.82) but were associated with increased mortality compared with cobalt chromium everolimus eluting stents (1.52, 1.02 to 2.22). Overall, among all stent types, the newer generation durable polymer drug eluting stents (zotarolimus eluting stent-Resolute, cobalt chromium everolimus eluting stents, and platinum chromium everolimus eluting stents) were the most efficacious (lowest target vessel revascularization rate) stents, and cobalt chromium everolimus eluting stents were the safest with significant reductions in definite stent thrombosis (rate ratio 0.35, 0.21 to 0.53), myocardial infarction (0.65, 0.55 to 0.75), and death (0.72, 0.58 to 0.90) compared with bare metal stents. Conclusions Biodegradable polymer drug eluting stents are superior to first generation durable polymer drug eluting stents but not to newer generation durable polymer stents in reducing target vessel revascularization. Newer generation durable polymer stents, and especially cobalt chromium everolimus eluting stents, have the best combination of efficacy and safety. The utility of biodegradable polymer stents in the context of excellent clinical outcomes with newer generation durable polymer stents needs to be proven.

276 citations


Journal ArticleDOI
TL;DR: A model for task-oriented resource allocation in a cloud computing environment where an induced bias matrix is used to identify the inconsistent elements and improve the consistency ratio when conflicting weights in various tasks are assigned is proposed.
Abstract: Resource allocation is a complicated task in a cloud computing environment because there are many alternative computers with varying capacities. The goal of this paper is to propose a model for task-oriented resource allocation in a cloud computing environment. Resource allocation tasks are ranked using the pairwise comparison matrix technique and the Analytic Hierarchy Process, given the available resources and user preferences. The computing resources can then be allocated according to the rank of tasks. Furthermore, an induced bias matrix is used to identify inconsistent elements and improve the consistency ratio when conflicting weights are assigned to various tasks. Two illustrative examples are introduced to validate the proposed method.
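For readers unfamiliar with the Analytic Hierarchy Process step, here is a minimal sketch (not the paper's model) of how a pairwise comparison matrix yields priority weights and a consistency ratio. Saaty's random-index table and the example matrix below are standard textbook illustrations, not values from the paper.

```python
import numpy as np

# Saaty's random-index values for matrices of order 1..9.
RANDOM_INDEX = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45]

def ahp_weights(A):
    """Priority weights and consistency ratio of a pairwise comparison matrix A."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)          # principal (Perron) eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                         # normalized priority vector
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)         # consistency index
    cr = ci / RANDOM_INDEX[n - 1]        # consistency ratio; < 0.1 is acceptable
    return w, cr

# Hypothetical example: rank three tasks by relative resource demand.
A = np.array([[1,     3,   5],
              [1 / 3, 1,   2],
              [1 / 5, 1 / 2, 1]])
w, cr = ahp_weights(A)
```

A consistency ratio above 0.1 signals conflicting judgments; the paper's induced bias matrix is one technique for locating and revising the offending matrix entries.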

251 citations


Journal ArticleDOI
TL;DR: In this paper, a taxonomic theory of crowdsourcing is developed by organizing the empirical variants in nine distinct forms of crowd-sourcing models, focusing on the notion of managerial control systems.
Abstract: In this article, the authors first provide a practical yet rigorous definition of crowdsourcing that incorporates “crowds,” outsourcing, and social web technologies. They then analyze 103 well-known crowdsourcing web sites using content analysis methods and the hermeneutic reading principle. Based on their analysis, they develop a “taxonomic theory” of crowdsourcing by organizing the empirical variants in nine distinct forms of crowdsourcing models. They also discuss key issues and directions, concentrating on the notion of managerial control systems.

244 citations


Journal ArticleDOI
TL;DR: In this paper, the authors explored the relationship between urban form and air pollution among 86 U.S. metropolitan areas and found that areas with higher levels of urban sprawl exhibited higher concentrations and emissions of air pollution and CO2 when controlling for population, land area, and climate.
Abstract: In this article we explore the relationships between urban form and air pollution among 86 U.S. metropolitan areas. Urban form was quantified using preexisting sprawl indexes and spatial metrics applied to remotely sensed land cover data. Air pollution data included the nonpoint source emission of the ozone (O3) precursors nitrogen oxides (NOx) and volatile organic compounds (VOCs), the concentration of O3, the concentration and nonpoint source emission of fine particulate matter (PM2.5), and the emission of carbon dioxide (CO2) from on-road sources. Metropolitan areas that exhibited higher levels of urban sprawl, or sprawl-like urban morphologies, generally exhibited higher concentrations and emissions of air pollution and CO2 when controlling for population, land area, and climate.

209 citations


Proceedings ArticleDOI
07 Jan 2013
TL;DR: A literature survey of crowd sourcing research, focusing on top journals and conferences in the Information Systems field, and shows how existing IS literature applies to the elements of that conceptual model: Problem, People, Individual, and Crowd, Governance, Process, Technology, and Outcome.
Abstract: Crowd sourcing is a collaboration model enabled by people-centric web technologies to solve individual, organizational, and societal problems using a dynamically formed crowd of people who respond to an open call for participation. We report on a literature survey of crowd sourcing research, focusing on top journals and conferences in the Information Systems (IS) field. To our knowledge, ours is the first effort of this type in the IS discipline. Contributions include providing a synopsis of crowd sourcing research to date, a common definition for crowd sourcing, and a conceptual model for guiding future studies of crowd sourcing. We show how existing IS literature applies to the elements of that conceptual model: Problem, People (Problem Owner, Individual, and Crowd), Governance, Process, Technology, and Outcome. We close with suggestions for future research.

202 citations


Journal ArticleDOI
TL;DR: Pregnant women in the Midwestern United States used the internet for health information during pregnancy, including information related to physical activity and nutrition; women who increased their physical activity had greater increases in confidence for making decisions from using the internet compared with women who decreased or did not change their physical activity.
Abstract: The purpose of this study was to determine how pregnant women in the Midwestern United States use the internet for health information during pregnancy, including information related to physical activity and nutrition, and to determine the impact of the internet on women's confidence in making decisions about physical activity participation and eating behaviors during pregnancy. This was a descriptive, exploratory study using a convenience, non-probabilistic sample. Women were recruited through handouts provided in person, fliers posted at venues, or local websites that cater to women who are pregnant or up to 1 year post-partum. Overall, 293 women (28.5 ± 4.9 years) completed the survey online (Survey Monkey) or in print. Data were analyzed using descriptive statistics, paired t tests, and analyses of covariance. Almost all women used the internet for health information during their pregnancy. Half of the women used the internet for information related to physical activity during their pregnancy, and some increased their physical activity as a result. Women reported an increase in their confidence for making decisions related to physical activity during pregnancy after using the internet for physical activity information. Women who reported increases in physical activity during pregnancy had greater increases in confidence for making decisions from using the internet compared to women who decreased or did not change their physical activity. Findings related to nutrition were similar to those for physical activity. However, there were no significant differences in increases in confidence between those who did or did not change the foods they ate. This study provides health promotion professionals with useful information to consider when designing future physical activity and/or nutrition interventions for pregnant women.

157 citations


Journal ArticleDOI
TL;DR: The tunable bandgaps in general and possible indirect-direct bandgap transitions due to tensile strain or external electric field make the TMD heterobilayer materials viable candidates for optoelectronic applications.
Abstract: We have performed a comprehensive first-principles study of the electronic and magnetic properties of two-dimensional (2D) transition-metal dichalcogenide (TMD) heterobilayers MX2/MoS2 (M = Mo, Cr, W, Fe, V; X = S, Se). For M = Mo, Cr, W; X = S, Se, all heterobilayers show semiconducting characteristics with an indirect bandgap, with the exception of the WSe2/MoS2 heterobilayer, which retains the direct-bandgap character of the constituent monolayer. For M = Fe, V; X = S, Se, the MX2/MoS2 heterobilayers exhibit metallic character. Particular attention is focused on engineering the bandgap of the TMD heterobilayer materials via application of either a tensile strain or an external electric field. We find that with increasing either biaxial or uniaxial tensile strain, the MX2/MoS2 (M = Mo, Cr, W; X = S, Se) heterobilayers can undergo a semiconductor-to-metal transition. For the WSe2/MoS2 heterobilayer, a direct-to-indirect bandgap transition may occur beyond a critical biaxial or uniaxial strain. For M (= Fe, V) and X (= S, Se), the magnetic moments of both metal and chalcogen atoms are enhanced when the MX2/MoS2 heterobilayers are under a biaxial tensile strain. Moreover, the bandgap of MX2/MoS2 (M = Mo, Cr, W; X = S, Se) heterobilayers can be reduced by the electric field. For two heterobilayers MSe2/MoS2 (M = Mo, Cr), PBE calculations suggest that the indirect-to-direct bandgap transition may occur under an external electric field. The transition is attributed to the enhanced spontaneous polarization. The tunable bandgaps in general and possible indirect-direct bandgap transitions due to tensile strain or external electric field make the TMD heterobilayer materials viable candidates for optoelectronic applications.

153 citations


Journal ArticleDOI
TL;DR: The findings support prior research indicating that exposure to multiple forms of violence, across multiple domains of life, negatively impacts adolescent outcomes, including substance use and suggest that the context in which exposure to violence occurs should be considered in future research.

137 citations


Journal ArticleDOI
TL;DR: Qualitative data from professionals reveal general acceptance of the emerging temporal organization of professional work, including rising time demands and blurred boundaries around work/nonwork times and places and time work as strategic responses to work intensification, overloads, and boundarylessness.
Abstract: How are professionals responding to the time strains brought on by the stress of their higher status jobs? Qualitative data from professionals reveal (a) general acceptance of the emerging temporal organization of professional work, including rising time demands and blurred boundaries around work/nonwork times and places, and (b) time work as strategic responses to work intensification, overloads, and boundarylessness. We detected four time-work strategies: prioritizing time, scaling back obligations, blocking out time, and time shifting of obligations. These strategies are often more work-friendly than family-friendly, but "blocking out time" and "time shifting" suggest promising avenues for work-time policy and practice.

Journal ArticleDOI
TL;DR: The Systems Biology Markup Language (SBML) Qualitative Models Package (qual) as discussed by the authors is an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks.
Abstract: Background: Qualitative frameworks, especially those based on the logical discrete formalism, are increasingly used to model regulatory and signalling networks. A major advantage of these frameworks is that they do not require precise quantitative data, and that they are well-suited for studies of large networks. While numerous groups have developed specific computational tools that provide original methods to analyse qualitative models, a standard format to exchange qualitative models has been missing. Results: We present the Systems Biology Markup Language (SBML) Qualitative Models Package (“qual”), an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks. We demonstrate the interoperability of models via SBML qual through the analysis of a specific signalling network by three independent software tools. Furthermore, the collective effort to define the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyse qualitative models. Conclusions: SBML qual allows the exchange of qualitative models among a number of complementary software tools. SBML qual has the potential to promote collaborative work on the development of novel computational approaches, as well as on the specification and the analysis of comprehensive qualitative models of regulatory and signalling networks.

Journal ArticleDOI
TL;DR: Using ab initio quantum transport simulation, it is revealed for the first time that the fT of a graphene transistor still increases as Lgate is reduced, even when Lgate scales down to a few nanometres, reaching an astonishing few tens of THz.
Abstract: Sub-10 nm Gate Length Graphene Transistors: Operating at Terahertz Frequencies with Current Saturation

Journal ArticleDOI
TL;DR: The authors compared experts, quasi-experts, and novices in evaluating an engineering product (a mousetrap design) and found that quasi-experts seemed to be appropriate raters for short stories, yet results were mixed for the engineering quasi-experts.
Abstract: What is the role of expertise in evaluating creative products? Novices and experts do not assess creativity similarly, indicating domain-specific knowledge’s role in judging creativity. We describe two studies that examined how “quasi-experts” (people who have more experience in a domain than novices but also lack recognized standing as experts) compared with novices and experts in rating creative work. In Study 1, we compared different types of quasi-experts with novices and experts in rating short stories. In Study 2, we compared experts, quasi-experts, and novices in evaluating an engineering product (a mousetrap design). Quasi-experts (regardless of type) seemed to be appropriate raters for short stories, yet results were mixed for the engineering quasi-experts. Some domains may require more expertise than others to properly evaluate creative work.

Journal ArticleDOI
TL;DR: In this paper, the authors analyse the causality between past trading volume and index returns in the Pacific Basin countries and reveal strong nonlinear causality: positive for high return quantiles and negative for low ones.

Posted Content
TL;DR: The collective effort to define the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyse qualitative models.
Abstract: Background: Qualitative frameworks, especially those based on the logical discrete formalism, are increasingly used to model regulatory and signalling networks. A major advantage of these frameworks is that they do not require precise quantitative data, and that they are well-suited for studies of large networks. While numerous groups have developed specific computational tools that provide original methods to analyse qualitative models, a standard format to exchange qualitative models has been missing. Results: We present the Systems Biology Markup Language (SBML) Qualitative Models Package ("qual"), an extension of the SBML Level 3 standard designed for computer representation of qualitative models of biological networks. We demonstrate the interoperability of models via SBML qual through the analysis of a specific signalling network by three independent software tools. Furthermore, the cooperative development of the SBML qual format paved the way for the development of LogicalModel, an open-source model library, which will facilitate the adoption of the format as well as the collaborative development of algorithms to analyze qualitative models. Conclusion: SBML qual allows the exchange of qualitative models among a number of complementary software tools. SBML qual has the potential to promote collaborative work on the development of novel computational approaches, as well as on the specification and the analysis of comprehensive qualitative models of regulatory and signalling networks.

Journal ArticleDOI
TL;DR: These data represent the largest behavioral database on semantic priming and are available to researchers to aid in selecting stimuli, testing theories, and reducing potential confounds in their studies.
Abstract: Speeded naming and lexical decision data for 1,661 target words following related and unrelated primes were collected from 768 subjects across four different universities. These behavioral measures have been integrated with demographic information for each subject and descriptive characteristics for every item. Subjects also completed portions of the Woodcock–Johnson reading battery, three attentional control tasks, and a circadian rhythm measure. These data are available at a user-friendly Internet-based repository (http://spp.montana.edu). This Web site includes a search engine designed to generate lists of prime–target pairs with specific characteristics (e.g., length, frequency, associative strength, latent semantic similarity, priming effect in standardized and raw reaction times). We illustrate the types of questions that can be addressed via the Semantic Priming Project. These data represent the largest behavioral database on semantic priming and are available to researchers to aid in selecting stimuli, testing theories, and reducing potential confounds in their studies.

Journal ArticleDOI
TL;DR: By using the density functional theory with dispersion correction, the interfacial properties of bilayer (BLG) and trilayer graphene (TLG) on metal substrates are investigated for the first time and three categories of interfacial structures are revealed.
Abstract: One popular approach to prepare graphene is to grow it on transition metal substrates via chemical vapor deposition. Using density functional theory with dispersion correction, we systematically investigate for the first time the interfacial properties of bilayer (BLG) and trilayer graphene (TLG) on metal substrates. Three categories of interfacial structures are revealed. The adsorption of B(T)LG on Al, Ag, Cu, Au and Pt substrates is a weak physisorption, but a band gap can be opened. The adsorption of B(T)LG on Ti, Ni and Co substrates is a strong chemisorption, and a stacking-insensitive band gap is opened for the two uncontacted layers of TLG. The adsorption of B(T)LG on a Pd substrate is a weaker chemisorption, with a band gap opened for the uncontacted layers. This fundamental study also informs B(T)LG device research, since graphene/metal contacts are unavoidable in devices.

Journal ArticleDOI
TL;DR: In this article, the structural, electronic, and optical properties of several possible graphdiyne bulk structures were investigated using density functional theory plus van der Waals (vdW) density functional.
Abstract: Graphdiyne is a newly discovered 2D carbon allotrope with many special features. Using density functional theory plus van der Waals (vdW) density functional, we investigate the structural, electronic, and optical properties of several possible graphdiyne bulk structures. We find that bulk graphdiyne can be either a semiconductor or a metal, depending on its stacking configuration. The interlayer vdW force red shifts the optical absorption peaks of bulk graphdiyne relative to those of the monolayer, and spectra of different stackings display notable differences in the energy range below 1 eV. Finally, combining with previous electrical and optical experiments, we identify the structure of the recently synthesized graphdiyne film.

Journal ArticleDOI
TL;DR: In this article, the existence of herding in the global equity market is investigated and the authors apply a methodology which utilises cross-country dispersion in index returns to reveal price patterns indicative of traders' irrationality, especially in basic materials, consumer services, and oil and gas.

Journal ArticleDOI
TL;DR: A collaboration support system that combines a computer-assisted collaboration engineering platform for creating PSAs with a process support system runtime platform for executing PSAs and meets its design goals to reduce development cycles for collaboration systems.
Abstract: The potential benefits of collaboration technologies are typically realized only in groups led by collaboration experts. This raises the facilitator-in-the-box challenge: Can collaboration expertise be packaged with collaboration technology in a form that nonexperts can reuse with no training on either tools or techniques? We address that challenge with process support applications (PSAs). We describe a collaboration support system (CSS) that combines a computer-assisted collaboration engineering platform for creating PSAs with a process support system runtime platform for executing PSAs. We show that the CSS meets its design goals: (1) to reduce development cycles for collaboration systems, (2) to allow nonprogrammers to design and develop PSAs, and (3) to package enough expertise in the tools that nonexperts could execute a well-designed collaborative work process without training.

Journal ArticleDOI
TL;DR: Three techniques to speedup fundamental problems in data mining algorithms on the CUDA platform are proposed: scalable thread scheduling scheme for irregular pattern, parallel distributed top-k scheme, and parallel high dimension reduction scheme.
Abstract: Recent development in Graphics Processing Units (GPUs) has enabled inexpensive high-performance computing for general-purpose applications. The Compute Unified Device Architecture (CUDA) programming model provides programmers with C-like APIs to better exploit the parallel power of the GPU. Data mining is widely used and has significant applications in various domains. However, current data mining toolkits cannot meet the requirement of applications with large-scale databases in terms of speed. In this paper, we propose three techniques to speed up fundamental problems in data mining algorithms on the CUDA platform: a scalable thread scheduling scheme for irregular patterns, a parallel distributed top-k scheme, and a parallel high-dimension reduction scheme. They play a key role in our CUDA-based implementation of three representative data mining algorithms: CU-Apriori, CU-KNN, and CU-K-means. These parallel implementations outperform the other state-of-the-art implementations significantly on an HP xw8600 workstation with a Tesla C1060 GPU and a quad-core Intel Xeon CPU. Our results show that the GPU + CUDA parallel architecture is feasible and promising for data mining applications.

Journal ArticleDOI
22 Apr 2013-PLOS ONE
TL;DR: This study produced the most comprehensive genomic resources that have been derived from crucian carp, including thousands of genetic markers, which will not only lay a foundation for further studies on polyploidy origin and anoxic survival but will also facilitate selective breeding of this important aquaculture species.
Abstract: The crucian carp is an important aquaculture species and a potential model to study genome evolution and physiological adaptation. However, so far the genomics and transcriptomics data available for this species are still scarce. We performed de novo transcriptome sequencing of four cDNA libraries representing brain, muscle, liver and kidney tissues respectively, each with six specimens. The removal of low quality reads resulted in 2.62 million reads, which were assembled as 127,711 unigenes, including 84,867 isotigs and 42,844 singletons. A total of 22,273 unigenes were found with significant matches to 14,449 unique proteins. Around 14,398 unigenes were assigned at least one Gene Ontology (GO) category in 84,876 total assignments, and 6,382 unigenes were found in 237 predicted KEGG pathways. The gene expression analysis revealed more genes expressed in brain, more up-regulated genes in muscle and more down-regulated genes in liver as compared with gene expression profiles of other tissues. In addition, 23 enzymes in the glycolysis/gluconeogenesis pathway were recovered. Importantly, we identified 5,784 high-quality putative SNPs and 11,295 microsatellite markers, including 5,364 microsatellites with flanking sequences ≥50 bp. This study produced the most comprehensive genomic resources that have been derived from crucian carp, including thousands of genetic markers, which will not only lay a foundation for further studies on polyploidy origin and anoxic survival but will also facilitate selective breeding of this important aquaculture species.

Journal ArticleDOI
TL;DR: Emotional intelligence (EI) is a personality variable strongly associated with how individuals acknowledge and respond to social and emotional content; higher EI seems contrary to solving a problem with malevolent creativity.
Abstract: Malevolent creativity (MC), or intending to inflict harm in original ways, is an aspect of creativity that has received little empirical attention. We reason that generating malevolently creative products in response to a problem depends on individual differences and environmental factors, especially with regard to the social and emotional content of a particular problem. A personality variable strongly associated with how individuals acknowledge and respond to such social and emotional content is emotional intelligence (EI). Individuals with higher EI often solve problems in cooperative, beneficial, and positive ways, which seems contrary to solving a problem with MC. In addition to testing whether EI is negatively related to MC in general, we analyzed whether that negative relationship would persist even after controlling for cognitive ability and task effects. Those questions were examined across two studies. Results suggest that individuals with lower EI are more likely to respond to different types of problems with increased instances of MC, even when the social or emotional content of those problems is factored out. The implications and limitations of these studies, as well as future directions for the study of MC, are discussed.

Journal ArticleDOI
TL;DR: This proof-of-concept study shows that the gait of older adults may be manipulated using auditory stimuli; future work will investigate which structures of auditory stimuli lead to improvements in functional status in older adults.
Abstract: Gait variability in the context of a deterministic dynamical system may be quantified using nonlinear time series analyses that characterize the complexity of the system. Pathological gait exhibits altered gait variability. It can be either too periodic and predictable, or too random and disordered, as is the case with aging. While gait therapies often focus on restoration of linear measures such as gait speed or stride length, we propose that the goal of gait therapy should be to restore optimal gait variability, which exhibits chaotic fluctuations and is the balance between predictability and complexity. In this context, our purpose was to investigate how listening to different auditory stimuli affects gait variability. Twenty-seven young and 27 elderly subjects walked on a treadmill for 5 min while listening to white noise, a chaotic rhythm, a metronome, and with no auditory stimulus. Stride length, step width, and stride intervals were calculated for all conditions. Detrended Fluctuation Analysis was then performed on these time series. A quadratic trend analysis determined that an idealized inverted-U shape described the relationship between gait variability and the structure of the auditory stimuli for the elderly group, but not for the young group. This proof-of-concept study shows that the gait of older adults may be manipulated using auditory stimuli. Future work will investigate which structures of auditory stimuli lead to improvements in functional status in older adults.
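The Detrended Fluctuation Analysis applied above to the stride-interval series can be sketched as follows. This is a minimal first-order DFA (integrate the series, window it, linearly detrend each window, fit the log-log slope α), not the authors' exact implementation; the scale range is an assumption.

```python
import numpy as np

def dfa_alpha(x, scales=None):
    """First-order DFA: returns the scaling exponent alpha of a time series."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())                  # integrate the mean-centered series
    n = len(y)
    if scales is None:
        # log-spaced window sizes between 4 samples and a quarter of the series
        scales = np.unique(np.logspace(np.log10(4), np.log10(n // 4), 20).astype(int))
    flucts = []
    for s in scales:
        n_win = n // s
        segs = y[: n_win * s].reshape(n_win, s)  # non-overlapping windows of length s
        t = np.arange(s)
        # remove a least-squares line from each window, keep the RMS residual
        rms = [np.sqrt(np.mean((seg - np.polyval(np.polyfit(t, seg, 1), t)) ** 2))
               for seg in segs]
        flucts.append(np.mean(rms))
    # alpha is the slope of log F(s) versus log s
    alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
    return alpha

# Sanity check: uncorrelated noise should scale with alpha near 0.5;
# long-range-correlated ("chaotic"-like) gait series sit between 0.5 and 1
rng = np.random.default_rng(1)
alpha_noise = dfa_alpha(rng.standard_normal(2000))
```

In the framing of the abstract, stride intervals that are too metronome-like or too random would push α away from the healthy intermediate range.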

Journal ArticleDOI
TL;DR: Four distinct boundary strategies emerged from the data, with men and parents of young children having better alignment between preferred and enacted boundaries than women and those without these caregiving duties.

Journal ArticleDOI
TL;DR: Consistent with prior studies showing that spontaneous but not procedural myocardial infarction (MI) is related to subsequent mortality, the point estimate for reduced mortality with PCI compared with OMT paralleled the prevention of spontaneous MI with PCI.
Abstract: Background—Contemporary studies have shown that spontaneous but not procedural myocardial infarction (MI) is related to subsequent mortality. Whether percutaneous coronary intervention (PCI) reduces spontaneous (nonprocedural) MI is unknown. Methods and Results—PubMed, EMBASE, and Cochrane Central Register of Controlled Trials (CENTRAL) were searched for randomized clinical trials until October 2012 comparing PCI with optimal medical therapy (OMT) for stable ischemic heart disease and reporting MI outcomes: spontaneous nonprocedural MI, procedural MI, and all MI, including procedure-related MI. Given the varying length of follow-up between trials, a mixed-effect Poisson regression meta-analysis was used. From 12 randomized clinical trials with 37 548 patient-years of follow-up, PCI compared with OMT alone was associated with a significantly lower incident rate ratio (IRR) for spontaneous nonprocedural MI (IRR=0.76; 95% confidence interval [CI], 0.58–0.99) at the risk of a higher rate of procedural MI (IRR...
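As a simple illustration of the incident rate ratio (IRR) statistic reported above: for a single pair of trial arms, the IRR is the ratio of event rates per person-year, with a large-sample Wald interval on the log scale. The counts below are hypothetical, and this single-study sketch deliberately omits the mixed-effect Poisson regression the meta-analysis actually used to pool trials with varying follow-up.

```python
import math

# Hypothetical arm-level data (illustrative only, not the trial data):
# spontaneous MI events and person-years of follow-up per arm
events_pci, py_pci = 120, 19_000.0
events_omt, py_omt = 158, 18_500.0

# Incident rate ratio: ratio of events per person-year
irr = (events_pci / py_pci) / (events_omt / py_omt)

# Large-sample Wald 95% CI on the log scale;
# for Poisson counts, SE(log IRR) ~ sqrt(1/e1 + 1/e0)
se = math.sqrt(1 / events_pci + 1 / events_omt)
ci_lo = math.exp(math.log(irr) - 1.96 * se)
ci_hi = math.exp(math.log(irr) + 1.96 * se)
```

With these made-up counts the interval excludes 1, the same qualitative pattern as the reported IRR of 0.76 (95% CI, 0.58–0.99) for spontaneous MI.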

Journal ArticleDOI
TL;DR: In this article, an Embodied Conversational Agent (ECA) was used to maximize consistency and control in questioning, timing, and interviewer nonverbal behavior, thus eliminating potential confounds that may be introduced due to interaction adaptation.
Abstract: Trust is a critical component in effective collaboration, decision-making and negotiation. The goal of effective team leaders should be to send signals and messages that increase trust. We attempt to determine if signals can vary perceptions of trustworthiness and if nonverbal behaviors, such as the voice, contain indicators of trust. In order to investigate the relationship between trust and vocal dynamics, this article presents a study that explores how the voice, measured unobtrusively, reflects a person’s current level of perceived trust. We used an Embodied Conversational Agent (ECA) to maximize consistency and control in questioning, timing, and interviewer nonverbal behavior, thus eliminating potential confounds that may be introduced due to interaction adaptation. Participants (N = 88) completed a face-to-face interview with the ECA and reported their perceptions of the ECA’s trustworthiness. The results of the study revealed that vocal pitch was inversely related to perceived trust, but temporally variant; vocal pitch early in the interview reflected trust. The ECA was perceived as more trustworthy when smiling. While the results of this research suggest a relationship between vocal pitch and perceived levels of trust, more work needs to be done to clarify the causal relationship. Similarly, further work is needed to integrate additional behavioral measurements that account for variation across diverse situations, people, and cultures.

Journal ArticleDOI
TL;DR: In subjects with moderate-to-severe COPD, FF/VI 100/25 μg provides rapid and significant sustained bronchodilation at 24 weeks, and lung function is improved to a similar extent with FF/VI 50/25 μg and to a somewhat lesser extent with VI 25 μg.

Journal ArticleDOI
TL;DR: The results indicate that meaning exerts a larger influence in the reading aloud of multisyllabic words than of monosyllabic words, and that parallel-distributed-processing approaches provide a useful theoretical framework to explain the main effects and interactions.
Abstract: Imageability and age of acquisition (AoA) effects, as well as key interactions between these variables and frequency and consistency, were examined via multiple regression analyses for 1,936 disyllabic words, using reaction time and accuracy measures from the English Lexicon Project. Both imageability and AoA accounted for unique variance in lexical decision and naming reaction time performance. In addition, across both tasks, AoA and imageability effects were larger for low-frequency words than high-frequency words, and imageability effects were larger for later acquired than earlier acquired words. In reading aloud, consistency effects in reaction time were larger for later acquired words than earlier acquired words, but consistency did not interact with imageability in the reaction time analysis. These results provide further evidence that multisyllabic word recognition is similar to monosyllabic word recognition and indicate that AoA and imageability are valid predictors of word recognition performance. In addition, the results indicate that meaning exerts a larger influence in the reading aloud of multisyllabic words than monosyllabic words. Finally, parallel-distributed-processing approaches provide a useful theoretical framework to explain the main effects and interactions.
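The kind of multiple regression with interaction terms described above can be sketched with an ordinary least-squares fit on simulated data. The predictor names and effect sizes below are hypothetical stand-ins, not the English Lexicon Project measures; the point is only the mechanics of testing a main effect alongside an interaction (here, frequency × AoA).

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Hypothetical standardized predictors: stand-ins for log word frequency,
# age of acquisition (AoA), and imageability ratings
freq, aoa, imag = (rng.standard_normal(n) for _ in range(3))

# Simulate naming RT (ms): main effects plus a frequency x AoA interaction,
# so the AoA effect is larger for low-frequency items
rt = 600 - 30 * freq + 25 * aoa - 10 * imag + 8 * freq * aoa + rng.normal(0, 5, n)

# Design matrix: intercept, three main effects, one interaction term
X = np.column_stack([np.ones(n), freq, aoa, imag, freq * aoa])
beta, *_ = np.linalg.lstsq(X, rt, rcond=None)
```

A significant coefficient on the `freq * aoa` column is the regression analogue of the "AoA effects were larger for low-frequency words" finding; real analyses would of course use the observed item-level predictors rather than simulated ones.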