
Showing papers by "Wichita State University published in 2016"


Journal ArticleDOI
TL;DR: It can be stated that both the choice of the friction force model and friction parameters involved can significantly affect the simulated/modeled dynamic response of mechanical systems with friction.
Abstract: This study is aimed at examining and comparing several friction force models dealing with different friction phenomena in the context of multibody system dynamics. For this purpose, a comprehensive review of present literature in this field of investigation is first presented. In this process, the main aspects related to friction are discussed, with particular emphasis on the pure dry sliding friction, stick–slip effect, viscous friction and Stribeck effect. In a simple and general way, the friction force models can be classified into two main groups, namely the static friction approaches and the dynamic friction models. The former group mainly describes the steady-state behavior of friction force, while the latter allows capturing more properties by using extra state variables. In the present study, a total of 21 different friction force models are described, and their fundamental physical and computational characteristics are discussed and compared in detail. The application of those friction models in multibody system dynamic modeling and simulation is then investigated. Two multibody mechanical systems are utilized as demonstrative application examples with the purpose of illustrating the influence of the various frictional approaches on the dynamic response of the systems. From the results obtained, it can be stated that both the choice of the friction force model and the friction parameters involved can significantly affect the simulated/modeled dynamic response of mechanical systems with friction.
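The static friction formulations surveyed above typically superpose Coulomb, viscous, and Stribeck terms. As a rough illustration (not any specific one of the 21 reviewed models, and with made-up parameter values), a steady-state friction law can be sketched as:

```python
import math

def stribeck_friction(v, F_c=1.0, F_s=1.5, v_s=0.1, sigma_v=0.5):
    """Steady-state friction force for sliding velocity v.

    Combines Coulomb friction F_c, static friction F_s with an
    exponential Stribeck decay (characteristic velocity v_s), and a
    viscous term sigma_v * v. All parameter values are illustrative.
    """
    if v == 0.0:
        return 0.0  # stiction is handled separately in stick-slip models
    stribeck = F_c + (F_s - F_c) * math.exp(-(abs(v) / v_s) ** 2)
    return math.copysign(stribeck, v) + sigma_v * v

# Friction drops from the static level toward the Coulomb level as
# speed rises (the Stribeck effect), then grows again viscously.
print(stribeck_friction(0.01))  # near the static level F_s
print(stribeck_friction(0.30))  # Stribeck dip toward F_c
print(stribeck_friction(1.00))  # Coulomb + viscous regime
```

Dynamic models such as LuGre add an internal bristle-deflection state on top of a steady-state curve like this one, which is what lets them capture presliding displacement and stick–slip behavior.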

280 citations


Journal ArticleDOI
P. Adamson, C. Ader, M. P. Andrews, N. Anfimov, +255 more (38 institutions)
TL;DR: The first search for ν_{μ}→ν_{e} transitions by the NOvA experiment finds 6 events in the Far Detector, compared to a background expectation of 0.99±0.11(syst) events based on the Near Detector measurement.
Abstract: We report results from the first search for ν_{μ}→ν_{e} transitions by the NOvA experiment. In an exposure equivalent to 2.74×10^{20} protons on target in the upgraded NuMI beam at Fermilab, we observe 6 events in the Far Detector, compared to a background expectation of 0.99±0.11(syst) events based on the Near Detector measurement. A secondary analysis observes 11 events with a background of 1.07±0.14(syst). The 3.3σ excess of events observed in the primary analysis disfavors 0.1π<δ_{CP}<0.5π in the inverted mass hierarchy at the 90% C.L.
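As a rough cross-check of the quoted significance, a plain Poisson counting estimate (statistical uncertainty only, ignoring the ±0.11 background systematic) can be computed as follows:

```python
import math
from statistics import NormalDist

def poisson_pvalue(n_obs, b):
    """One-sided Poisson p-value for observing >= n_obs counts given
    an expected background b (counting statistics only)."""
    return 1.0 - sum(math.exp(-b) * b**k / math.factorial(k)
                     for k in range(n_obs))

p = poisson_pvalue(6, 0.99)          # 6 observed vs 0.99 expected
z = NormalDist().inv_cdf(1.0 - p)    # one-sided Gaussian equivalent
print(f"p = {p:.2e}, z = {z:.2f} sigma")
```

This comes out near 3.3σ, in line with the reported excess; the published value additionally folds in the background uncertainty and the full analysis details.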

242 citations


Journal ArticleDOI
TL;DR: In this paper, the existence and regularity of Kähler–Einstein metrics on a compact Kähler manifold M with edge singularities of cone angle 2πβ along a smooth divisor D were studied, and it was shown that solutions of this problem are polyhomogeneous, i.e., have a complete asymptotic expansion with smooth coefficients along D for all cone angles 2πβ < 2π.
Abstract: This article considers the existence and regularity of Kähler–Einstein metrics on a compact Kähler manifold M with edge singularities with cone angle 2πβ along a smooth divisor D. We prove existence of such metrics with negative, zero and some positive cases for all cone angles 2πβ ≤ 2π. The results in the positive case parallel those in the smooth case. We also establish that solutions of this problem are polyhomogeneous, i.e., have a complete asymptotic expansion with smooth coefficients along D for all 2πβ < 2π.

239 citations


Proceedings ArticleDOI
01 Jun 2016
TL;DR: Simulation results confirm that QAR outperforms the existing learning solution and provides fast convergence with QoS provisioning, facilitating practical implementations in large-scale software-defined networks.
Abstract: Software-defined networks (SDNs) have been recognized as the next-generation networking paradigm that decouples the data forwarding from the centralized control. To realize the merits of dedicated QoS provisioning and fast route (re-)configuration services over the decoupled SDNs, various QoS requirements in packet delay, loss, and throughput should be supported by an efficient transportation with respect to each specific application. In this paper, a QoS-aware adaptive routing (QAR) is proposed in the designed multi-layer hierarchical SDNs. Specifically, the distributed hierarchical control plane architecture is employed to minimize signaling delay in large SDNs via a three-level design of controllers, i.e., the super, domain (or master), and slave controllers. Furthermore, the QAR algorithm is proposed with the aid of reinforcement learning and a QoS-aware reward function, achieving time-efficient, adaptive, QoS-provisioning packet forwarding. Simulation results confirm that QAR outperforms the existing learning solution and provides fast convergence with QoS provisioning, facilitating practical implementations in large-scale software-defined networks.
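The reinforcement-learning core of such a scheme is a tabular Q-learning update driven by a QoS-aware reward. The sketch below is hypothetical: the paper's exact state/action encoding and reward weights are not given here, so the weights, states, and next-hop actions are illustrative placeholders.

```python
def qos_reward(delay_ms, loss, throughput_mbps,
               w_delay=0.5, w_loss=0.3, w_tput=0.2):
    """Hypothetical QoS-aware reward: penalize delay and loss,
    reward throughput. Weights would depend on the traffic class."""
    return -w_delay * delay_ms - w_loss * 100 * loss + w_tput * throughput_mbps

def q_update(Q, state, action, reward, next_state, actions,
             alpha=0.1, gamma=0.9):
    """One tabular Q-learning step:
    Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q.get((next_state, a), 0.0) for a in actions)
    key = (state, action)
    Q[key] = Q.get(key, 0.0) + alpha * (reward + gamma * best_next - Q.get(key, 0.0))

# Toy episode: at switch 'A', forwarding via next hop 'B' yielded
# 10 ms delay, 1% loss, and 50 Mbps throughput.
Q = {}
actions = ['B', 'C']
q_update(Q, 'A', 'B', qos_reward(10, 0.01, 50), 'B', actions)
print(Q[('A', 'B')])
```

Repeating such updates per forwarded flow is what lets the routing adapt online; the controller hierarchy in the paper mainly determines where this state lives and how quickly it can be consulted.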

200 citations


Journal ArticleDOI
TL;DR: It is found that antiviral treatment led to full recovery of cats when treatment was started at a stage of disease that would be otherwise fatal if left untreated, and results indicate that continuous virus replication is required for progression of immune-mediated inflammatory disease of FIP.
Abstract: Coronaviruses infect animals and humans causing a wide range of diseases. The diversity of coronaviruses in many mammalian species is contributed by relatively high mutation and recombination rates during replication. This dynamic nature of coronaviruses may facilitate cross-species transmission and shifts in tissue or cell tropism in a host, resulting in substantial change in virulence. Feline enteric coronavirus (FECV) causes inapparent or mild enteritis in cats, but a highly fatal disease, called feline infectious peritonitis (FIP), can arise through mutation of FECV to FIP virus (FIPV). The pathogenesis of FIP is intimately associated with immune responses and involves depletion of T cells, features shared by some other coronaviruses like Severe Acute Respiratory Syndrome Coronavirus. The increasing risks of highly virulent coronavirus infections in humans or animals call for effective antiviral drugs, but no such measures are yet available. Previously, we have reported the inhibitors that target 3C-like protease (3CLpro) with broad-spectrum activity against important human and animal coronaviruses. Here, we evaluated the therapeutic efficacy of our 3CLpro inhibitor in laboratory cats with FIP. Experimental FIP is 100% fatal once certain clinical and laboratory signs become apparent. We found that antiviral treatment led to full recovery of cats when treatment was started at a stage of disease that would be otherwise fatal if left untreated. Antiviral treatment was associated with a rapid improvement in fever, ascites, lymphopenia and gross signs of illness and cats returned to normal health within 20 days or less of treatment. Significant reduction in viral titers was also observed in cats. These results indicate that continuous virus replication is required for progression of immune-mediated inflammatory disease of FIP. 
These findings may provide important insights into devising therapeutic strategies and selection of antiviral compounds for further development for important coronaviruses in animals and humans.

198 citations


Journal ArticleDOI
TL;DR: The findings of this review highlight the importance of working to improve health care strategies for older adults with low health literacy and highlight the need for a standardized and validated clinical health literacy screening tool for older adults.
Abstract: Objective: The objective of this review was to assess published literature relating to health literacy and older adults. Method: The current review was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses. Results: Eight articles met inclusion criteria. All studies were conducted in urban settings in the United States. Study sample size ranged from 33 to 3,000 participants. Two studies evaluated health-related outcomes and reported significant associations between low health literacy and poorer health outcomes. Two other studies investigated the impact of health literacy on medication management, reporting mixed findings. Discussion: The findings of this review highlight the importance of working to improve health care strategies for older adults with low health literacy and highlight the need for a standardized and validated clinical health literacy screening tool for older adults.

188 citations


Journal ArticleDOI
TL;DR: In this article, the prevalence and determinants of overweight and obesity in youth with Down syndrome were reviewed, and the health consequences and the effectiveness of interventions were also examined.

186 citations


Journal ArticleDOI
TL;DR: In this paper, the first all-soluble all-iron flow battery based on iron as the same redox-active element but with different coordination chemistries in an alkaline aqueous system was introduced.
Abstract: The rapid growth of intermittent renewable energy (e.g., wind and solar) demands low-cost and large-scale energy storage systems for smooth and reliable power output, where redox-flow batteries (RFBs) could find their niche. In this work, we introduce the first all-soluble all-iron RFB based on iron as the same redox-active element but with different coordination chemistries in an alkaline aqueous system. The adoption of the same redox-active element largely alleviates the challenging problem of cross-contamination of metal ions in RFBs that use two redox-active elements. An all-soluble all-iron RFB is constructed by combining an iron–triethanolamine redox pair (i.e., [Fe(TEOA)OH]−/[Fe(TEOA)(OH)]2–) and an iron–cyanide redox pair (i.e., Fe(CN)63–/Fe(CN)64–), giving a formal cell voltage of 1.34 V. Good performance and stability have been demonstrated after addressing some challenges, including the crossover of the ligand agent. As exemplified by the all-soluble all-iron flow battery, combining redox pairs o...

182 citations


Journal ArticleDOI
TL;DR: The GUESS was developed and validated based on the assessments of over 450 unique video game titles across many popular genres and can be applied across many types of video games in the industry both as a way to assess what aspects of a game contribute to user satisfaction and as a tool to aid in debriefing users on their gaming experience.
Abstract: Objective:The aim of this study was to develop and psychometrically validate a new instrument that comprehensively measures video game satisfaction based on key factors.Background:Playtesting is of...

176 citations


Journal ArticleDOI
TL;DR: In this paper, a series of Co-Fe bimetallic catalysts were used for the hydrogenation of carbon dioxide in a fixed-bed catalytic microreactor operating in the temperature range of 200-270 °C and a pressure of 0.92 MPa.
Abstract: A series of Co–Fe bimetallic catalysts was prepared, characterized, and studied for the hydrogenation of carbon dioxide. The catalyst precursors were prepared via an oxalate coprecipitation method. Monometallic (Co or Fe) and bimetallic (Co–Fe) oxalate precursors were decomposed under a N2 flow at 400 °C and further pretreated under a CO flow at 250 °C. The catalysts (before decomposition of the oxalates or after activation) were characterized by BET, TGA-MS, X-ray diffraction, CO-TPR, SEM, HR-TEM, and Mössbauer spectroscopy techniques. The hydrogenation reaction of CO2 was performed using Co–Fe bimetallic catalysts pretreated in situ in a fixed-bed catalytic microreactor operating in the temperature range of 200–270 °C and a pressure of 0.92 MPa. With increasing Fe fraction, the selectivity to C2–C4 for Co–Fe catalyst increased under all operating conditions. The alcohol selectivity was found to increase with increasing iron content of the Co–Fe catalyst up to 50%, but then it dropped with further additi...

148 citations


Journal ArticleDOI
P. Adamson, C. Ader, M. P. Andrews, N. Anfimov, +255 more (38 institutions)
TL;DR: In this article, the first measurement using the NOvA detectors of ν_{μ} disappearance was reported using a 14 kton-equivalent exposure of 2.74×10^{20} protons on target from the Fermilab NuMI beam.
Abstract: This paper reports the first measurement using the NOvA detectors of ν_{μ} disappearance in a ν_{μ} beam. The analysis uses a 14 kton-equivalent exposure of 2.74×10^{20} protons on target from the Fermilab NuMI beam. Assuming the normal neutrino mass hierarchy, we measure Δm_{32}^{2}=(2.52_{−0.18}^{+0.20})×10^{−3} eV^{2} and sin^{2}θ_{23} in the range 0.38–0.65, both at the 68% confidence level, with two statistically degenerate best-fit points at sin^{2}θ_{23}=0.43 and 0.60. Results for the inverted mass hierarchy are also presented.
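The quoted parameters enter the standard two-flavor survival probability P(νμ→νμ) = 1 − sin²(2θ₂₃)·sin²(1.27·Δm²₃₂·L/E). A quick evaluation at the best-fit point, taking NOvA's 810 km baseline and a 2 GeV neutrino energy as assumed inputs (neither is stated in this abstract):

```python
import math

def numu_survival(L_km, E_GeV, sin2_theta23, dm2_eV2):
    """Two-flavor nu_mu survival probability:
    P = 1 - sin^2(2*theta23) * sin^2(1.27 * dm2 * L / E)."""
    sin2_2theta = 4.0 * sin2_theta23 * (1.0 - sin2_theta23)
    phase = 1.27 * dm2_eV2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Best-fit point sin^2(theta23) = 0.43, dm2 = 2.52e-3 eV^2; the
# 810 km baseline and 2 GeV energy are illustrative assumptions.
print(numu_survival(810.0, 2.0, 0.43, 2.52e-3))
```

The result is a deep deficit (survival probability around 10%), which is what makes this baseline/energy combination sensitive to the oscillation parameters.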

Journal ArticleDOI
TL;DR: A novel single-model quality assessment method DeepQA based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information is introduced.
Abstract: Protein quality assessment (QA) useful for ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method DeepQA based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves the state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful deep learning tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation, and training/test datasets of DeepQA for Linux are freely available to non-commercial users at http://cactus.rnet.missouri.edu/DeepQA/.

Journal ArticleDOI
TL;DR: This paper examined how social psychological factors, in particular, individuals' perceptions of schools with varying demographic attributes, affect their perceptions of racial segregation in U.S. schools.
Abstract: Racial segregation remains a persistent problem in U.S. schools. In this article, we examine how social psychological factors—in particular, individuals’ perceptions of schools with varying demogra...

Journal ArticleDOI
01 Nov 2016
TL;DR: A literature survey of engineering resilience from the design perspective, with a focus on engineering resilience metrics and their design implications, is provided in this paper. However, despite an increase in the usage of the engineering resilience concept, the diversity of its applications in various engineering sectors complicates a universal agreement on its quantification and associated measurement techniques.
Abstract: A resilient system is a system that possesses the ability to survive and recover from the likelihood of damage due to disruptive events or mishaps. The concept that incorporates resiliency into engineering practices is known as engineering resilience. To date, engineering resilience is still predominantly application-oriented. Despite an increase in the usage of the engineering resilience concept, the diversity of its applications in various engineering sectors complicates a universal agreement on its quantification and associated measurement techniques. There is a pressing need to develop a generally applicable engineering resilience analysis framework, which standardizes the modeling, assessment, and improvement of engineering resilience for a broader engineering discipline. This paper provides a literature survey of engineering resilience from the design perspective, with a focus on engineering resilience metrics and their design implications. The currently available engineering resilience quantification metrics are reviewed and summarized, the design implications toward the development of resilient-engineered systems are discussed, and further, the challenges of incorporating resilience into engineering design processes are evaluated. The presented study is expected to serve as a building block toward developing a generally applicable engineering resilience analysis framework that can be readily used for system design. [DOI: 10.1115/1.4034223]
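Many of the surveyed quantification metrics reduce to a ratio of delivered to target performance integrated over a disruption-and-recovery window. A minimal sketch of such an area-ratio metric (an illustrative formulation, not any specific metric reviewed in the paper):

```python
def resilience(performance, target, dt=1.0):
    """Area-ratio resilience over a disruption/recovery window:
    integral of delivered performance divided by integral of target
    performance, both approximated by a simple rectangle rule."""
    delivered = sum(performance) * dt
    nominal = sum(target) * dt
    return delivered / nominal

# Toy scenario: the system drops to 40% capability after a disruptive
# event at t=2, then recovers to full performance by t=5.
q = [1.0, 1.0, 0.4, 0.6, 0.8, 1.0, 1.0]
print(resilience(q, [1.0] * len(q)))
```

A value of 1.0 means no performance was lost over the window; faster recovery (a shorter, shallower dip in `q`) pushes the metric toward 1.0, which is exactly the design lever the surveyed papers argue over how to quantify.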

Journal ArticleDOI
TL;DR: The approach, namely cHRev, is presented, to automatically recommend reviewers who are best suited to participate in a given review, based on their historical contributions as demonstrated in their prior reviews, and is evaluated on three open source systems as well as a commercial codebase at Microsoft.
Abstract: Code review is an important part of the software development process. Recently, many open source projects have begun practicing code review through "modern" tools such as GitHub pull-requests and Gerrit. Many commercial software companies use similar tools for code review internally. These tools enable the owner of a source code change to request individuals to participate in the review, i.e., reviewers. However, this task comes with a challenge. Prior work has shown that the benefits of code review are dependent upon the expertise of the reviewers involved. Thus, a common problem faced by authors of source code changes is that of identifying the best reviewers for their source code change. To address this problem, we present an approach, namely cHRev, to automatically recommend reviewers who are best suited to participate in a given review, based on their historical contributions as demonstrated in their prior reviews. We evaluate the effectiveness of cHRev on three open source systems as well as a commercial codebase at Microsoft and compare it to the state of the art in reviewer recommendation. We show that by leveraging the specific information in previously completed reviews (i.e., quantification of review comments and their recency), we are able to improve dramatically on the performance of prior approaches, which (limitedly) operate on generic review information (i.e., reviewers of similar source code file and path names) or source code repository data. We also present insights into why cHRev outperforms the existing approaches.
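The central signal is the count and recency of a candidate's past review comments on the changed files. A simplified recency-weighted score in that spirit (the exponential half-life decay and all names below are illustrative assumptions; cHRev's published weighting model differs in detail):

```python
from datetime import date

def reviewer_scores(history, changed_files, today, half_life_days=90.0):
    """Rank candidate reviewers by recency-weighted counts of their
    past review comments on the changed files. `history` maps
    reviewer -> list of (file, review_date) pairs."""
    scores = {}
    for reviewer, comments in history.items():
        s = 0.0
        for f, d in comments:
            if f in changed_files:
                age = (today - d).days
                s += 0.5 ** (age / half_life_days)  # decay with age
        scores[reviewer] = s
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical history: alice reviewed the file recently and often,
# bob reviewed it once, long ago.
history = {
    "alice": [("net/router.c", date(2016, 5, 1)),
              ("net/router.c", date(2016, 5, 20))],
    "bob":   [("net/router.c", date(2015, 1, 10))],
}
print(reviewer_scores(history, {"net/router.c"}, date(2016, 6, 1)))
```

Both frequency and recency push alice to the top, which is the intuition the paper formalizes and evaluates against path-name and repository-history baselines.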

Journal ArticleDOI
TL;DR: The state-of-the-art in traffic engineering for SDN is discussed in detail, with attention to four core areas: flow management, fault tolerance, topology update, and traffic analysis.
Abstract: SDN is an emerging networking paradigm that separates the network control plane from the data forwarding plane with the promise to dramatically improve network resource utilization, simplify network management, reduce operating costs, and promote innovation and evolution. While traffic engineering techniques have been widely exploited for ATM and IP/MPLS networks for performance optimization in the past, the promising SDN networks require novel traffic engineering solutions that can exploit the global network view, network status, and flow patterns/characteristics in order to achieve better traffic control and management. This article discusses the state-of-the-art in traffic engineering for SDN with attention to four core areas: flow management, fault tolerance, topology update, and traffic analysis. Challenging issues for SDN traffic engineering solutions are discussed in detail.

Proceedings ArticleDOI
01 Jun 2016
TL;DR: The proposed framework jointly exploits deep packet inspection (DPI) and semi-supervised machine learning so that accurate traffic classification can be realized, while requiring minimal communications between the network controller and the SDN switches.
Abstract: In this paper, a QoS-aware traffic classification framework for software defined networks is proposed. Instead of identifying specific applications as in most of the previous work on traffic classification, our approach classifies the network traffic into different classes according to the QoS requirements, which provide the crucial information to enable fine-grained and QoS-aware traffic engineering. The proposed framework is fully located in the network controller so that real-time, adaptive, and accurate traffic classification can be realized by exploiting the superior computation capacity, the global visibility, and the inherent programmability of the network controller. More specifically, the proposed framework jointly exploits deep packet inspection (DPI) and semi-supervised machine learning so that accurate traffic classification can be realized, while requiring minimal communications between the network controller and the SDN switches. Based on a real Internet data set, the simulation results show that the proposed classification framework can provide good performance in terms of classification accuracy and communication costs.
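The semi-supervised component can be illustrated with a small self-training loop: DPI confidently labels a few seed flows per QoS class, and a statistical classifier propagates labels to the rest. The nearest-centroid classifier and the two flow features below are illustrative choices, not the paper's actual design:

```python
def centroid(points):
    """Component-wise mean of a list of equal-length feature vectors."""
    return tuple(sum(p[i] for p in points) / len(points)
                 for i in range(len(points[0])))

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def self_train(seeds, unlabeled, rounds=3):
    """Nearest-centroid self-training: DPI-labeled `seeds` (class ->
    feature vectors) stay fixed; unlabeled flows adopt the nearest
    class each round and centroids are re-estimated with them."""
    assigned = {c: [] for c in seeds}
    for _ in range(rounds):
        cents = {c: centroid(seeds[c] + assigned[c]) for c in seeds}
        assigned = {c: [] for c in seeds}
        for x in unlabeled:
            best = min(cents, key=lambda c: dist2(x, cents[c]))
            assigned[best].append(x)
    return assigned

# Hypothetical features: (mean packet size / 1000 B, mean inter-arrival / 10 ms)
seeds = {"voice": [(0.2, 2.0)], "bulk": [(1.4, 0.1)]}
flows = [(0.25, 1.8), (1.3, 0.2), (1.5, 0.15)]
result = self_train(seeds, flows)
print(result)
```

Only the few DPI-labeled seeds require switch-to-controller packet inspection; the remaining flows are classified from cheap flow statistics, which is the source of the communication savings the abstract claims.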

Posted Content
TL;DR: DeepQA as mentioned in this paper is a single-model quality assessment method based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information.
Abstract: Protein quality assessment (QA) by ranking and selecting protein models has long been viewed as one of the major challenges for protein tertiary structure prediction. Especially, estimating the quality of a single protein model, which is important for selecting a few good models out of a large model pool consisting of mostly low-quality models, is still a largely unsolved problem. We introduce a novel single-model quality assessment method DeepQA based on deep belief network that utilizes a number of selected features describing the quality of a model from different perspectives, such as energy, physio-chemical characteristics, and structural information. The deep belief network is trained on several large datasets consisting of models from the Critical Assessment of Protein Structure Prediction (CASP) experiments, several publicly available datasets, and models generated by our in-house ab initio method. Our experiments demonstrate that the deep belief network has better performance compared to Support Vector Machines and Neural Networks on the protein model quality assessment problem, and our method DeepQA achieves the state-of-the-art performance on the CASP11 dataset. It also outperformed two well-established methods in selecting good outlier models from a large set of models of mostly low quality generated by ab initio modeling methods. DeepQA is a useful tool for protein single-model quality assessment and protein structure prediction. The source code, executable, documentation, and training/test datasets of DeepQA for Linux are freely available to non-commercial users at this http URL.

Journal ArticleDOI
01 Aug 2016
TL;DR: A software-defined architecture, namely SoftWater, is first introduced to facilitate the development of the next-generation underwater communication systems and can easily incorporate new underwater communication solutions, accordingly maximize the network capacity, can achieve the network robustness and energy efficiency, as well as can provide truly differentiated and scalable networking services.
Abstract: Underwater communication systems have drawn the attention of the research community in the last 15 years. This growing interest can largely be attributed to new civil and military applications enabled by large-scale networks of underwater devices (e.g., underwater static sensors, autonomous underwater vehicles (AUVs), and autonomous robots), which can retrieve information from the aquatic and marine environment, perform in-network processing on the extracted data, and transmit the collected information to remote locations. Currently, underwater communication systems are inherently hardware-based and rely on closed and inflexible architectural designs. This imposes significant challenges on adopting new underwater communication and networking technologies, prevents the provision of truly differentiated services to highly diverse underwater applications, and creates great barriers to integrating heterogeneous underwater devices. Software-defined networking (SDN), recognized as the next-generation networking paradigm, relies on a highly flexible, programmable, and virtualizable network architecture to dramatically improve network resource utilization, simplify network management, reduce operating cost, and promote innovation and evolution. In this paper, a software-defined architecture, namely SoftWater, is first introduced to facilitate the development of the next-generation underwater communication systems. More specifically, by exploiting network function virtualization (NFV) and network virtualization concepts, the SoftWater architecture can easily incorporate new underwater communication solutions, maximize the network capacity, achieve network robustness and energy efficiency, and provide truly differentiated and scalable networking services.
Consequently, the SoftWater architecture can simultaneously support a variety of different underwater applications, and can enable the interoperability of underwater devices from different manufacturers that operate on different underwater communication technologies based on acoustic, optical, or radio waves. Moreover, the essential network management tools of SoftWater are discussed, including reconfigurable multi-controller placement, hybrid in-band and out-of-band control traffic balancing, and utility-optimal network virtualization. Furthermore, the major benefits of SoftWater architecture are demonstrated by introducing software-defined underwater networking solutions, including the throughput-optimal underwater routing, SDN-enhanced fault recovery, and software-defined underwater mobility management. The research challenges to realize the SoftWater are also discussed in detail.

Journal ArticleDOI
TL;DR: It is shown in rodents that acute pharmacological inhibition of the vesicular monoamine transporter (VMAT) blocks amphetamine-induced locomotion and self-administration without impacting cocaine-induced behaviours.
Abstract: Amphetamines elevate extracellular dopamine, but the underlying mechanisms remain uncertain. Here we show in rodents that acute pharmacological inhibition of the vesicular monoamine transporter (VMAT) blocks amphetamine-induced locomotion and self-administration without impacting cocaine-induced behaviours. To study VMAT's role in mediating amphetamine action in dopamine neurons, we have used novel genetic, pharmacological and optical approaches in Drosophila melanogaster. In an ex vivo whole-brain preparation, fluorescent reporters of vesicular cargo and of vesicular pH reveal that amphetamine redistributes vesicle contents and diminishes the vesicle pH-gradient responsible for dopamine uptake and retention. This amphetamine-induced deacidification requires VMAT function and results from net H(+) antiport by VMAT out of the vesicle lumen coupled to inward amphetamine transport. Amphetamine-induced vesicle deacidification also requires functional dopamine transporter (DAT) at the plasma membrane. Thus, we find that at pharmacologically relevant concentrations, amphetamines must be actively transported by DAT and VMAT in tandem to produce psychostimulant effects.

Journal ArticleDOI
TL;DR: In this mode, one or more chiral selectors acting as pseudostationary phases are added to the background electrolyte; as reviewed in this paper, such selectors are applied to enantioseparations in capillary electrokinetic chromatography.
Abstract: Capillary electrokinetic chromatography is generally recognized as a versatile and robust capillary electromigration technique for the separation of enantiomers. In this mode, one or more chiral selectors are added to the background electrolyte acting as pseudostationary phases. Within the various chiral selectors that have been applied to enantioseparations in capillary electrokinetic chromatography, cyclodextrins are by far the most often used selectors because of their versatility, structural variety and commercial availability. This is reflected in the large number of applications of cyclodextrins to analytical enantioseparations that have been reported between January 2012 and July 2016, the period of time covered by this review. Many of these applications cover aspects of life sciences such as drug analysis, bioanalysis or food analysis. Despite the large number of commercially available cyclodextrins, new derivatives have been developed in order to achieve altered enantioselectivities or to further broaden the application range. Cyclodextrins have also been used to demonstrate the validity of theoretical models of electromigration as well as complex formation equilibria in enantioseparations. Finally, recent studies for an understanding of the molecular basis of the chiral recognition between cyclodextrins and the analytes are discussed.

Journal ArticleDOI
TL;DR: Stem cell-mediated cell therapy demonstrates the potential to restore the function and structure of the NP, and many challenges need to be overcome before the application of these approaches can be successful clinically.

Journal ArticleDOI
TL;DR: QAcon, a novel single-model quality assessment method utilizing structural features, physicochemical properties, and residue contact predictions, is ranked as one of the top single-model QA methods.
Abstract: Motivation Protein model quality assessment (QA) plays a very important role in protein structure prediction. It can be divided into two groups of methods: single model and consensus QA method. The consensus QA methods may fail when there is a large portion of low quality models in the model pool. Results In this paper, we develop a novel single-model quality assessment method QAcon utilizing structural features, physicochemical properties, and residue contact predictions. We apply residue-residue contact information predicted by two protein contact prediction methods PSICOV and DNcon to generate a new score as feature for quality assessment. This novel feature and other 11 features are used as input to train a two-layer neural network on CASP9 datasets to predict the quality of a single protein model. We blindly benchmarked our method QAcon on CASP11 dataset as the MULTICOM-CLUSTER server. Based on the evaluation, our method is ranked as one of the top single model QA methods. The good performance of the features based on contact prediction illustrates the value of using contact information in protein quality assessment. Availability and implementation The web server and the source code of QAcon are freely available at: http://cactus.rnet.missouri.edu/QAcon. Contact chengji@missouri.edu. Supplementary information Supplementary data are available at Bioinformatics online.
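The model described is a two-layer neural network over 12 features. A minimal forward pass of that shape (the weights here are random placeholders, not QAcon's trained parameters, and the feature values are dummies):

```python
import math
import random

def two_layer_net(features, W1, b1, W2, b2):
    """Forward pass of a two-layer network: 12 input features ->
    hidden sigmoid layer -> single sigmoid output in (0, 1),
    interpreted as a model quality score."""
    sigmoid = lambda z: 1.0 / (1.0 + math.exp(-z))
    hidden = [sigmoid(sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(W1, b1)]
    return sigmoid(sum(w * h for w, h in zip(W2, hidden)) + b2)

random.seed(0)
n_in, n_hidden = 12, 8   # hidden size is an arbitrary choice here
W1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
b1 = [0.0] * n_hidden
W2 = [random.uniform(-1, 1) for _ in range(n_hidden)]
b2 = 0.0

features = [0.5] * 12    # placeholder normalized feature values
score = two_layer_net(features, W1, b1, W2, b2)
print(score)             # quality score in (0, 1)
```

In the paper, two of the 12 inputs come from PSICOV and DNcon contact predictions, and the weights are trained on CASP9 models against known model quality; the sketch only shows the architecture's shape.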

Journal ArticleDOI
TL;DR: In this article, the authors studied increasing stability in the interior inverse source problem for the Helmholtz equation from boundary Cauchy data for multiple wave numbers using the Fourier transform with respect to the wave number.

Journal ArticleDOI
TL;DR: This research enhances theoretical understanding about which dimensions of trust play more important roles in influencing satisfaction and purchase behavior, respectively and provides guidance to practitioners enabling them to focus on the development and training foci that best prepare customer relationship employees on the diverse aspects of trust most salient to customer needs.
Abstract: The three trusting beliefs have different effects on satisfaction and purchase. Benevolence belief is a stronger predictor of satisfaction than competence belief. Competence is a stronger predictor of purchase than integrity and benevolence. Future trust research should include both satisfaction and purchase behavior. Trust has been extensively studied in the buyer-seller context and typically operationalized according to the McKnight tripartite conception of trusting beliefs. The McKnight model identifies three beliefs (integrity, benevolence, and competence) as the key components of trust. However, limited research has examined the relative effect of these three individual trusting beliefs on satisfaction and purchase behavior in the buyer-seller context. To address this gap, we posit that a buyer's beliefs in a seller's integrity and benevolence have stronger influences on satisfaction than a belief in a seller's competence. In contrast, a buyer's belief in a seller's competence has a stronger influence on purchase behavior as compared to beliefs in a seller's integrity and benevolence. The results from a buyer-broker simulation study support that (1) a buyer's belief in a seller's benevolence is a stronger predictor of satisfaction than the belief in a seller's competence; (2) a buyer's belief in a seller's competence is a stronger predictor of purchase behavior than are beliefs in a seller's integrity and benevolence. This research enhances our theoretical understanding about which dimensions of trust play more important roles in influencing satisfaction and purchase behavior, respectively. This research also provides guidance to practitioners enabling them to focus on the development and training foci that best prepare customer relationship employees on the diverse aspects of trust most salient to customer needs, such as emphasizing competence over benevolence for infrequent purchases, or emphasizing benevolence for potentially frequent purchases.


Journal ArticleDOI
TL;DR: The effectiveness of sexual counselling interventions for adult cardiac patients with sexual problems was evaluated against usual care, and no significant differences were reported in favour of the intervention.
Abstract: BACKGROUND: Sexual problems are common among people with cardiovascular disease. Although clinical guidelines recommend sexual counselling for patients and their partners, there is little evidence ...

Journal ArticleDOI
TL;DR: How service members' psychological acceptance promotes family resilience and adaptation to the multiple contextual challenges and role transitions associated with military deployment is discussed.
Abstract: This research examined whether military service members' deployment-related trauma exposure, posttraumatic stress disorder (PTSD) symptoms, and experiential avoidance are associated with their observed levels of positive social engagement, social withdrawal, reactivity-coercion, and distress avoidance during postdeployment family interaction. Self-reports of deployment-related trauma, postdeployment PTSD symptoms, and experiential avoidance were collected from 184 men who had been deployed to the Middle East conflicts, were partnered, and had a child between 4 and 13 years of age. Video samples of parent-child and partner problem solving and conversations about deployment issues were collected, and were rated by trained observers to assess service members' positive engagement, social withdrawal, reactivity-coercion, and distress avoidance, as well as spouse and child negative affect and behavior. Service members' experiential avoidance was reliably associated with less observed positive engagement and more observed withdrawal and distress avoidance after controlling for spouse and child negative affect and behavior during ongoing interaction. Service members' experiential avoidance also diminished significant associations between service members' PTSD symptoms and their observed behavior. The results are discussed in terms of how service members' psychological acceptance promotes family resilience and adaptation to the multiple contextual challenges and role transitions associated with military deployment. Implications for parenting and marital interventions are described.

Journal ArticleDOI
TL;DR: A robust adaptive neural network controller is added to the feedback linearization control to reduce deviations in PEMFCs; it significantly enhances output performance and rejects disturbances.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed Twitter analysis as an alternative source of data for documenting weekly trends in emotion and stress, and used the method to estimate the work-recovery effect of weekends; the method is, however, limited to the US national level of analysis.
Abstract: We propose the use of Twitter analysis as an alternative source of data to document weekly trends in emotion and stress, and attempt to use the method to estimate the work recovery effect of weekends. On the basis of 2,102,176,189 Tweets, we apply Pennebaker's linguistic inquiry word count (LIWC) approach to measure daily Tweet content across 18 months, aggregated to the US national level of analysis. We derived a word count dictionary to assess work stress and applied p-technique factor analysis to the daily word count data from 19 substantively different content areas covered by the LIWC dictionaries. Dynamic factor analysis revealed two latent factors in day-level variation of Tweet content. These two factors are: (a) a negative emotion/stress/somatic factor, and (b) a positive emotion/food/friends/home/family/leisure factor, onto which elements of work, money, achievement, and health issues have strong negative loadings. The weekly trend analysis revealed a clear “Friday dip” for work stress and negative emotion expressed on Twitter. In contrast, positive emotion Tweets showed a “mid-week dip” for Tuesday-Wednesday-Thursday and “weekend peak” for Friday through Sunday, whereas work/money/achievement/health problem Tweets showed a small “weekend dip” on Fridays through Sundays. Results partially support the Effort-Recovery theory. Implications and limitations of the method are discussed.
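The LIWC-style word-count step described above can be sketched as follows. The category dictionaries here are tiny hypothetical stand-ins (LIWC itself is licensed, and the authors' work-stress dictionary is not reproduced in the abstract); the sketch only shows how per-tweet category fractions are computed before day-level aggregation.

```python
from collections import Counter
import re

# Hypothetical mini-dictionaries standing in for LIWC categories.
DICTIONARIES = {
    "negemo": {"stress", "tired", "angry", "sad"},
    "posemo": {"happy", "great", "relax", "love"},
    "work":   {"work", "boss", "office", "deadline"},
}

def liwc_scores(tweet):
    """Return the fraction of words in each category for one tweet."""
    words = re.findall(r"[a-z']+", tweet.lower())
    total = len(words) or 1          # avoid division by zero on empty text
    counts = Counter()
    for w in words:
        for cat, vocab in DICTIONARIES.items():
            if w in vocab:
                counts[cat] += 1
    return {cat: counts[cat] / total for cat in DICTIONARIES}

scores = liwc_scores("So tired of work stress, need to relax this weekend")
print(scores)  # {'negemo': 0.2, 'posemo': 0.1, 'work': 0.1}
```

Day-level series for the factor analysis would then be obtained by averaging these per-tweet fractions over all tweets posted on each day.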