
Showing papers by "Edinburgh Napier University" published in 2019


Proceedings Article
08 May 2019
TL;DR: UniLM as mentioned in this paper is a unified pre-trained language model that can be fine-tuned for both natural language understanding and generation tasks, achieving state-of-the-art results on five natural language generation datasets, including improving the CNN/DailyMail abstractive summarization ROUGE-L to 40.51 (2.04 absolute improvement).
Abstract: This paper presents a new Unified pre-trained Language Model (UniLM) that can be fine-tuned for both natural language understanding and generation tasks. The model is pre-trained using three types of language modeling tasks: unidirectional, bidirectional, and sequence-to-sequence prediction. The unified modeling is achieved by employing a shared Transformer network and utilizing specific self-attention masks to control what context the prediction conditions on. UniLM compares favorably with BERT on the GLUE benchmark, and the SQuAD 2.0 and CoQA question answering tasks. Moreover, UniLM achieves new state-of-the-art results on five natural language generation datasets, including improving the CNN/DailyMail abstractive summarization ROUGE-L to 40.51 (2.04 absolute improvement), the Gigaword abstractive summarization ROUGE-L to 35.75 (0.86 absolute improvement), the CoQA generative question answering F1 score to 82.5 (37.1 absolute improvement), the SQuAD question generation BLEU-4 to 22.12 (3.75 absolute improvement), and the DSTC7 document-grounded dialog response generation NIST-4 to 2.67 (human performance is 2.65). The code and pre-trained models are available at https://github.com/microsoft/unilm.
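The core mechanism described above, a single shared Transformer whose self-attention masks control what context each prediction conditions on, can be illustrated with a small sketch. This is an illustrative reconstruction, not the released implementation (see the linked repository); the function name and mask convention are mine.

```python
import numpy as np

def unilm_style_mask(src_len, tgt_len, mode):
    """Illustrative self-attention masks in the spirit of UniLM.
    1 = the query position may attend to the key position, 0 = blocked.
    Rows are query positions, columns are key positions."""
    n = src_len + tgt_len
    if mode == "bidirectional":           # BERT-like: every token sees every token
        return np.ones((n, n), dtype=int)
    if mode == "unidirectional":          # GPT-like: causal lower triangle
        return np.tril(np.ones((n, n), dtype=int))
    if mode == "seq2seq":
        mask = np.zeros((n, n), dtype=int)
        mask[:, :src_len] = 1             # every token sees the full source
        tgt = np.tril(np.ones((tgt_len, tgt_len), dtype=int))
        mask[src_len:, src_len:] = tgt    # target tokens see only their past
        mask[:src_len, src_len:] = 0      # source never peeks at the target
        return mask
    raise ValueError(mode)

print(unilm_style_mask(2, 3, "seq2seq"))
```

For a 2-token source and 3-token target, the printed seq2seq mask shows all positions attending to the source while target positions attend only causally among themselves, which is how one network can serve understanding and generation objectives.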

1,019 citations


Journal ArticleDOI
TL;DR: It is revealed that digital transformation is an ongoing process of using new digital technologies in everyday organizational life, which recognizes agility as the core mechanism for the strategic renewal of an organization's business model, collaborative approach, and eventually the culture.

760 citations


Journal ArticleDOI
TL;DR: This revision confirms that eukaryotes form at least two domains, notes the loss of monophyly in the Excavata, and provides robust support for the Haptista and Cryptista; suggested primer sets for DNA sequences from environmental samples, effective for each clade, are also provided.
Abstract: This revision of the classification of eukaryotes follows that of Adl et al., 2012 [J. Euk. Microbiol. 59(5)] and retains an emphasis on protists. Changes since have improved the resolution of many ...

750 citations


Journal ArticleDOI
TL;DR: There is a call to action for key stakeholders to create the infrastructure and cultural adaptations needed so that all people living with and beyond cancer can be as active as is possible for them.
Abstract: Multiple organizations around the world have issued evidence-based exercise guidance for patients with cancer and cancer survivors. Recently, the American College of Sports Medicine has updated its exercise guidance for cancer prevention as well as for the prevention and treatment of a variety of cancer health-related outcomes (eg, fatigue, anxiety, depression, function, and quality of life). Despite these guidelines, the majority of people living with and beyond cancer are not regularly physically active. Among the reasons for this is a lack of clarity on the part of those who work in oncology clinical settings of their role in assessing, advising, and referring patients to exercise. The authors propose using the American College of Sports Medicine's Exercise Is Medicine initiative to address this practice gap. The simple proposal is for clinicians to assess, advise, and refer patients to either home-based or community-based exercise or for further evaluation and intervention in outpatient rehabilitation. To do this will require care coordination with appropriate professionals as well as change in the behaviors of clinicians, patients, and those who deliver the rehabilitation and exercise programming. Behavior change is one of many challenges to enacting the proposed practice changes. Other implementation challenges include capacity for triage and referral, the need for a program registry, costs and compensation, and workforce development. In conclusion, there is a call to action for key stakeholders to create the infrastructure and cultural adaptations needed so that all people living with and beyond cancer can be as active as is possible for them.

392 citations


Journal ArticleDOI
TL;DR: The results of this best practice analysis offer a series of critical insights into what strategic principles drive smart city development in Europe and generate scientific knowledge which helps to overcome the dichotomous nature of smart city research.

207 citations


Journal ArticleDOI
TL;DR: The bespoke eMERGe Reporting Guidance, which incorporates new methodological developments and advances the methodology, can help researchers to report the important aspects of meta-ethnography and should raise reporting quality.
Abstract: The aim of this study was to provide guidance to improve the completeness and clarity of meta‐ethnography reporting. Evidence‐based policy and practice require robust evidence syntheses which can further understanding of people's experiences and associated social processes. Meta‐ethnography is a rigorous seven‐phase qualitative evidence synthesis methodology, developed by Noblit and Hare. Meta‐ethnography is used widely in health research, but reporting is often poor quality and this discourages trust in and use of its findings. Meta‐ethnography reporting guidance is needed to improve reporting quality. The eMERGe study used a rigorous mixed‐methods design and evidence‐based methods to develop the novel reporting guidance and explanatory notes. The study, conducted from 2015 to 2017, comprised: (1) a methodological systematic review of guidance for meta‐ethnography conduct and reporting; (2) a review and audit of published meta‐ethnographies to identify good practice principles; (3) international, multidisciplinary consensus‐building processes to agree guidance content; (4) innovative development of the guidance and explanatory notes. Recommendations and good practice for all seven phases of meta‐ethnography conduct and reporting were newly identified, leading to 19 reporting criteria and accompanying detailed guidance. The bespoke eMERGe Reporting Guidance, which incorporates new methodological developments and advances the methodology, can help researchers to report the important aspects of meta‐ethnography. Use of the guidance should raise reporting quality. Better reporting could make assessments of confidence in the findings more robust and increase use of meta‐ethnography outputs to improve practice, policy, and service user outcomes in health and other fields. This is the first tailored reporting guideline for meta‐ethnography.

188 citations


Journal ArticleDOI
TL;DR: In this article, the authors provide an overview of unsupervised learning in the domain of networking, and provide a comprehensive review of the current state of the art in this area, by synthesizing insights from previous survey papers.
Abstract: While machine learning and artificial intelligence have long been applied in networking research, the bulk of such works has focused on supervised learning. Recently, there has been a rising trend of employing unsupervised machine learning using unstructured raw network data to improve network performance and provide services, such as traffic engineering, anomaly detection, Internet traffic classification, and quality of service optimization. The growing interest in applying unsupervised learning techniques in networking stems from their great success in other fields, such as computer vision, natural language processing, speech recognition, and optimal control (e.g., for developing autonomous self-driving cars). In addition, unsupervised learning can unconstrain us from the need for labeled data and manual handcrafted feature engineering, thereby facilitating flexible, general, and automated methods of machine learning. The focus of this survey paper is to provide an overview of applications of unsupervised learning in the domain of networking. We provide a comprehensive survey highlighting recent advancements in unsupervised learning techniques, and describe their applications in various learning tasks, in the context of networking. We also provide a discussion on future directions and open research issues, while identifying potential pitfalls. While a few survey papers focusing on applications of machine learning in networking have previously been published, a survey of similar scope and breadth is missing in the literature. Through this timely review, we aim to advance the current state of knowledge, by carefully synthesizing insights from previous survey papers, while providing contemporary coverage of the recent advances and innovations.

182 citations


Journal ArticleDOI
TL;DR: A novel two-stage deep learning model based on a stacked auto-encoder with a soft-max classifier for efficient network intrusion detection that has the potential to serve as a future benchmark for deep learning and network security research communities.
Abstract: The network intrusion detection system is an important tool for protecting computer networks against threats and malicious attacks. Many techniques have recently been proposed; however, these techniques face significant challenges due to the continuous emergence of new threats that are not recognized by the existing detection systems. In this paper, we propose a novel two-stage deep learning model based on a stacked auto-encoder with a soft-max classifier for efficient network intrusion detection. The model comprises two decision stages: an initial stage responsible for classifying network traffic as normal or abnormal using a probability score value. This is then used in the final decision stage as an additional feature for detecting the normal state and other classes of attacks. The proposed model is able to learn useful feature representations from large amounts of unlabeled data and classifies them automatically and efficiently. To evaluate and test the effectiveness of the proposed model, several experiments are conducted on two public datasets: an older benchmark dataset, the KDD99, and a newer one, the UNSW-NB15. The comparative experimental results demonstrate that our proposed model significantly outperforms the existing models and methods and achieves high recognition rates, up to 99.996% and 89.134%, for the KDD99 and UNSW-NB15 datasets, respectively. We conclude that our model has the potential to serve as a future benchmark for deep learning and network security research communities.
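A minimal sketch of the two-stage idea follows, using synthetic data and small scikit-learn MLPs as stand-ins for the paper's stacked auto-encoder: stage 1 produces a normal/abnormal probability score, which stage 2 consumes as an extra feature for multi-class attack detection. All names, sizes, and data are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))              # stand-in network flow features
y_attack = rng.integers(0, 5, size=1000)     # 0 = normal, 1-4 = attack classes
y_binary = (y_attack > 0).astype(int)        # normal vs abnormal

# Stage 1: binary classifier yielding an abnormality probability score
stage1 = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
stage1.fit(X, y_binary)
p_abnormal = stage1.predict_proba(X)[:, [1]]

# Stage 2: the stage-1 score is appended as an additional input feature
X_stage2 = np.hstack([X, p_abnormal])
stage2 = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=500, random_state=0)
stage2.fit(X_stage2, y_attack)               # softmax output over all classes
print(stage2.score(X_stage2, y_attack))
```

On real data the two stages would be trained and evaluated on separate splits; the point here is only the flow of the probability score from the first decision stage into the second.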

177 citations


Journal ArticleDOI
TL;DR: Clinically diagnosed AF after a stroke or a transient ischemic attack is associated with significantly increased risk of recurrent stroke or systemic embolism, in particular, with additional stroke risk factors, and requires OAC rather than antiplatelet therapy.
Abstract: Cardiac thromboembolism attributed to atrial fibrillation (AF) is responsible for up to one-third of ischemic strokes. Stroke may be the first manifestation of previously undetected AF. Given the efficacy of oral anticoagulants in preventing AF-related ischemic strokes, strategies of searching for AF after a stroke using ECG monitoring followed by oral anticoagulation (OAC) treatment have been proposed to prevent recurrent cardioembolic strokes. This white paper by experts from the AF-SCREEN International Collaboration summarizes existing evidence and knowledge gaps on searching for AF after a stroke by using ECG monitoring. New AF can be detected by routine plus intensive ECG monitoring in approximately one-quarter of patients with ischemic stroke. It may be causal, a bystander, or neurogenically induced by the stroke. AF after a stroke is a risk factor for thromboembolism and a strong marker for atrial myopathy. After acute ischemic stroke, patients should undergo 72 hours of electrocardiographic monitoring to detect AF. The diagnosis requires an ECG of sufficient quality for confirmation by a health professional with ECG rhythm expertise. AF detection rate is a function of monitoring duration and quality of analysis, AF episode definition, interval from stroke to monitoring commencement, and patient characteristics including old age, certain ECG alterations, and stroke type. Markers of atrial myopathy (eg, imaging, atrial ectopy, natriuretic peptides) may increase AF yield from monitoring and could be used to guide patient selection for more intensive/prolonged poststroke ECG monitoring. Atrial myopathy without detected AF is not currently sufficient to initiate OAC. The concept of embolic stroke of unknown source is not proven to identify patients who have had a stroke benefitting from empiric OAC treatment. However, some embolic stroke of unknown source subgroups (eg, advanced age, atrial enlargement) might benefit more from non-vitamin K-dependent OAC therapy than aspirin. Fulfilling embolic stroke of unknown source criteria is an indication neither for empiric non-vitamin K-dependent OAC treatment nor for withholding prolonged ECG monitoring for AF. Clinically diagnosed AF after a stroke or a transient ischemic attack is associated with significantly increased risk of recurrent stroke or systemic embolism, in particular, with additional stroke risk factors, and requires OAC rather than antiplatelet therapy. The minimum subclinical AF duration required on ECG monitoring poststroke/transient ischemic attack to recommend OAC therapy is debated.

173 citations


Journal ArticleDOI
TL;DR: The resulting CNN is shown to provide better classification performance than more conventional learning machines; indeed, it achieves an average accuracy of 89.8% in binary classification and 83.3% in three-way classification.

172 citations


Journal ArticleDOI
TL;DR: In this paper, the authors identify barriers to and enablers for the circular economy within the built environment, where its constituting elements (buildings and infrastructure) are characterised by long lifespans, numerous stakeholders, and hundreds of components and ancillary materials that interact dynamically in space and time.

Journal ArticleDOI
TL;DR: The proposed taxonomy makes it easier to understand the existing attack landscape towards developing defence mechanisms, and is leveraged to identify open problems that can lead to new research areas within the field of adversarial machine learning.

Journal ArticleDOI
TL;DR: The development of effective interventions for CPTSD can build upon the success of PTSD interventions, and the benefits of flexibility in intervention selection, sequencing and delivery should be assessed, based on clinical need and patient preferences.
Abstract: Background The 11th revision to the WHO International Classification of Diseases (ICD-11) identified complex post-traumatic stress disorder (CPTSD) as a new condition. There is a pressing need to identify effective CPTSD interventions. Methods We conducted a systematic review and meta-analysis of randomised controlled trials (RCTs) of psychological interventions for post-traumatic stress disorder (PTSD), where participants were likely to have clinically significant baseline levels of one or more CPTSD symptom clusters (affect dysregulation, negative self-concept and/or disturbed relationships). We searched MEDLINE, PsycINFO, EMBASE and PILOTS databases (January 2018), and examined study and outcome quality. Results Fifty-one RCTs met inclusion criteria. Cognitive behavioural therapy (CBT), exposure alone (EA) and eye movement desensitisation and reprocessing (EMDR) were superior to usual care for PTSD symptoms, with effects ranging from g = −0.90 (CBT; k = 27, 95% CI −1.11 to −0.68; moderate quality) to g = −1.26 (EMDR; k = 4, 95% CI −2.01 to −0.51; low quality). CBT and EA each had moderate–large or large effects on negative self-concept, but only one trial of EMDR provided useable data. CBT, EA and EMDR each had moderate or moderate–large effects on disturbed relationships. Few RCTs reported affect dysregulation data. The benefits of all interventions were smaller when compared with non-specific interventions (e.g. befriending). Multivariate meta-regression suggested childhood-onset trauma was associated with a poorer outcome. Conclusions The development of effective interventions for CPTSD can build upon the success of PTSD interventions. Further research should assess the benefits of flexibility in intervention selection, sequencing and delivery, based on clinical need and patient preferences.
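For readers unfamiliar with the effect sizes reported above, g denotes a standardized mean difference (Hedges' g). One common formulation is shown here for reference; the review's exact pooling model may differ:

$d = \dfrac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad s_p = \sqrt{\dfrac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}, \qquad g \approx d\left(1 - \dfrac{3}{4(n_1 + n_2) - 9}\right)$

Negative values in this context indicate greater symptom reduction in the intervention arm than in the comparator.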

Journal ArticleDOI
TL;DR: This position paper provides a brief overview of currently existing digital health applications in different cardiovascular disease settings and provides the reader with the most relevant challenges for their large-scale deployment in Europe.
Abstract: Cardiovascular disease is one of the main causes of morbidity and mortality worldwide. Despite the availability of highly effective treatments, the contemporary burden of disease remains huge. Digital health interventions hold promise to further improve the quality and experience of cardiovascular care. This position paper provides a brief overview of currently existing digital health applications in different cardiovascular disease settings. It provides the reader with the most relevant challenges for their large-scale deployment in Europe. The potential role of different stakeholders and related challenges are identified, and key suggestions on how to proceed are given. This position paper was developed by the European Society of Cardiology (ESC) e-Cardiology working group, in close collaboration with the ESC Digital Health Committee, the European Association of Preventive Cardiology, the European Heart Rhythm Association, the Heart Failure Association, the European Association of Cardiovascular Imaging, the Acute Cardiovascular Care Association, the European Association of Percutaneous Cardiovascular Interventions, the Association of Cardiovascular Nursing and Allied Professions and the Council on Hypertension. It relates to the ESC's action plan and mission to play a pro-active role in all aspects of the e-health agenda in support of cardiovascular health in Europe and aims to be used as a guiding document for cardiologists and other relevant stakeholders in the field of digital health.

Journal ArticleDOI
TL;DR: An enhanced user privacy scheme through caching and spatial K-anonymity (CSKA) in continuous LBSs; it adopts multi-level caching to reduce the risk of exposure of users’ information to untrusted LSPs and can minimize the overhead of the LBS server.
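As an illustration of the spatial K-anonymity ingredient only (not the CSKA scheme itself, which additionally uses multi-level caching), the toy sketch below grows a cloaking region around the querying user until it covers at least K users, so the untrusted LSP sees a region rather than an exact location. All names and the distance choice are mine.

```python
import numpy as np

def cloaking_region(user_xy, others_xy, k):
    """Toy spatial K-anonymity: grow a square around the querying user
    until it contains at least k users (the querier plus k-1 others),
    then report only the region, never the exact location."""
    others = np.asarray(others_xy, dtype=float)
    d = np.max(np.abs(others - np.asarray(user_xy)), axis=1)  # Chebyshev distance
    r = np.sort(d)[k - 2] if k > 1 else 0.0   # radius covering k-1 nearest others
    x, y = user_xy
    return (x - r, y - r, x + r, y + r)       # bounding box sent to the LSP

rng = np.random.default_rng(1)
print(cloaking_region((0.5, 0.5), rng.random((50, 2)), k=5))
```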

Journal ArticleDOI
TL;DR: Telehealth interventions with a range of delivery modes could be offered to patients who cannot attend cardiac rehabilitation, or as an adjunct to cardiac rehabilitation for effective secondary prevention.
Abstract: Background:Coronary heart disease (CHD) is a major cause of death worldwide. Cardiac rehabilitation, an evidence-based CHD secondary prevention programme, remains underutilized. Telehealth may offe...

Journal ArticleDOI
TL;DR: This study provides the first translation and validation of the World Health Organization ACE-International Questionnaire (ACE-IQ) and highlights the importance of examining ACE exposure within local contexts, as children's adverse experiences may be idiosyncratic to geographic, social, and cultural norms.

Journal ArticleDOI
TL;DR: The current work is the first to articulate and differentiate the methodological variations and their application for different purposes and represents a significant advance in the understanding of the methodological application of meta-ethnography.
Abstract: Decision making in health and social care requires robust syntheses of both quantitative and qualitative evidence. Meta-ethnography is a seven-phase methodology for synthesising qualitative studies. Developed in 1988 by the sociologists in education Noblit and Hare, meta-ethnography has evolved since its inception; it is now widely used in healthcare research and is gaining popularity in education research. The aim of this article is to provide up-to-date, in-depth guidance on conducting the complex analytic synthesis phases 4 to 6 of meta-ethnography through analysis of the latest methodological evidence. We report findings from a methodological systematic review conducted from 2015 to 2016. Fourteen databases and five other online resources were searched. Expansive searches were also conducted, resulting in inclusion of 57 publications on meta-ethnography conduct and reporting from a range of academic disciplines published from 1988 to 2016. Current guidance on applying meta-ethnography originates from a small group of researchers using the methodology in a health context. We identified that researchers have operationalised the analysis and synthesis methods of meta-ethnography – determining how studies are related (phase 4), translating studies into one another (phase 5), synthesising translations (phase 6), and line-of-argument synthesis – to suit their own syntheses, resulting in variation in methods and their application. Empirical research is required to compare the impact of different methods of translation and synthesis. Some methods are potentially better at preserving links with the context and meaning of primary studies, a key principle of meta-ethnography. A meta-ethnography can and should include reciprocal and refutational translation and line-of-argument synthesis, rather than only one of these, to maximise the impact of its outputs. The current work is the first to articulate and differentiate the methodological variations and their application for different purposes and represents a significant advance in the understanding of the methodological application of meta-ethnography.

Journal ArticleDOI
TL;DR: In this article, Chandra and VLA observations of GW170817 at ~521-743 days post merger are presented, and a homogeneous analysis of the entire Chandra data set is performed.
Abstract: We present Chandra and VLA observations of GW170817 at ~521-743 days post merger, and a homogeneous analysis of the entire Chandra data set. We find that the late-time non-thermal emission follows the expected evolution from an off-axis relativistic jet, with a steep temporal decay $F_{\nu}\propto t^{-1.95\pm0.15}$ and a simple power-law spectrum $F_{\nu}\propto \nu^{-0.575\pm0.007}$. We present a new method to constrain the merger environment density based on diffuse X-ray emission from hot plasma in the host galaxy and we find $n\le 9.6 \times 10^{-3}\,\mathrm{cm^{-3}}$. This measurement is independent from inferences based on the jet afterglow modeling and allows us to partially solve for model degeneracies. The updated best-fitting model parameters with this density constraint are a fireball kinetic energy $E_0 = 1.5_{-1.1}^{+3.6}\times 10^{49}\,\mathrm{erg}$ ($E_{iso}= 2.1_{-1.5}^{+6.4}\times10^{52}\,\mathrm{erg}$), jet opening angle $\theta_{0}= 5.9^{+1.0}_{-0.7}\,\mathrm{deg}$ with characteristic Lorentz factor $\Gamma_j = 163_{-43}^{+23}$, expanding in a low-density medium with $n_0 = 2.5_{-1.9}^{+4.1} \times 10^{-3}\,\mathrm{cm^{-3}}$ and viewed $\theta_{obs} = 30.4^{+4.0}_{-3.4}\,\mathrm{deg}$ off-axis. The synchrotron emission originates from a power-law distribution of electrons with $p=2.15^{+0.01}_{-0.02}$. The shock microphysics parameters are constrained to $\epsilon_{\mathrm{e}} = 0.18_{-0.13}^{+0.30}$ and $\epsilon_{\mathrm{B}}=2.3_{-2.2}^{+16.0} \times 10^{-3}$. We investigate the presence of X-ray flares and find no statistically significant ($\ge2.5\sigma$) evidence of temporal variability at any time. Finally, we use our observations to constrain the properties of synchrotron emission from the deceleration of the fastest kilonova ejecta with energy $E_k^{KN}\propto (\Gamma\beta)^{-\alpha}$ into the environment, finding that shallow stratification indexes $\alpha\le6$ are disfavored.

Journal ArticleDOI
TL;DR: Wang et al. as mentioned in this paper reviewed the image enhancement and restoration methods that tackle typical underwater image impairments, including some extreme degradations and distortions, in terms of the underwater image formation model (IFM).
Abstract: Underwater images play a key role in ocean exploration but often suffer from severe quality degradation due to light absorption and scattering in water medium. Although major breakthroughs have been made recently in the general area of image enhancement and restoration, the applicability of new methods for improving the quality of underwater images has not specifically been captured. In this paper, we review the image enhancement and restoration methods that tackle typical underwater image impairments, including some extreme degradations and distortions. First, we introduce the key causes of quality reduction in underwater images, in terms of the underwater image formation model (IFM). Then, we review underwater restoration methods, considering both the IFM-free and the IFM-based approaches. Next, we present an experimental-based comparative evaluation of the state-of-the-art IFM-free and IFM-based methods, considering also the prior-based parameter estimation algorithms of the IFM-based methods, using both subjective and objective analyses (the used code is freely available at https://github.com/wangyanckxx/Single-Underwater-Image-Enhancement-and-Color-Restoration). Starting from this paper, we pinpoint the key shortcomings of existing methods, drawing recommendations for future research in this area. Our review of underwater image enhancement and restoration provides researchers with the necessary background to appreciate challenges and opportunities in this important field.
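For reference, the simplified underwater image formation model (IFM) that the review organizes restoration methods around is commonly written per colour channel as (generic notation, not necessarily the paper's):

$I_c(x) = J_c(x)\,t_c(x) + B_c\,\bigl(1 - t_c(x)\bigr), \qquad t_c(x) = e^{-\beta_c d(x)}, \quad c \in \{r, g, b\}$

where $I_c$ is the observed intensity, $J_c$ the scene radiance to be recovered, $B_c$ the background (veiling) light, $t_c$ the transmission, $\beta_c$ the wavelength-dependent attenuation coefficient, and $d(x)$ the scene-to-camera distance. IFM-based restoration estimates $B_c$ and $t_c$ (often via priors) and inverts the model for $J_c$; IFM-free enhancement adjusts pixel values directly.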

Journal ArticleDOI
TL;DR: This study ascertained cardiac rehabilitation (CR) availability, volumes and their drivers, and density globally, finding that capacity is grossly insufficient, such that most patients will not derive the benefits associated with participation.

Proceedings ArticleDOI
25 Mar 2019
TL;DR: In this article, a cyclical annealing schedule was proposed to learn more meaningful latent codes progressively by leveraging the results of previous learning cycles as warm restarts; the schedule is validated on a broad range of NLP tasks, including language modeling, dialog response generation, and semi-supervised text classification.
Abstract: Variational autoencoders (VAE) with an auto-regressive decoder have been applied to many natural language processing (NLP) tasks. The VAE objective consists of two terms, the KL regularization term and the reconstruction term, balanced by a weighting hyper-parameter 𝛽. One notorious training difficulty is that the KL term tends to vanish. In this paper we study different scheduling schemes for 𝛽, and show that KL vanishing is caused by the lack of good latent codes in training the decoder at the beginning of optimization. To remedy the issue, we propose a cyclical annealing schedule, which simply repeats the process of increasing 𝛽 multiple times. This new procedure allows us to learn more meaningful latent codes progressively by leveraging the results of previous learning cycles as warm restarts. The effectiveness of the cyclical annealing schedule is validated on a broad range of NLP tasks, including language modeling, dialog response generation and semi-supervised text classification.
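A minimal sketch of the schedule follows, assuming a linear increase of 𝛽 over the first half of each cycle followed by a hold at 1; the parameter names and defaults are mine, not the authors' released code.

```python
def cyclical_beta(step, total_steps, n_cycles=4, ramp=0.5):
    """Cyclical KL-weight schedule: within each cycle, beta ramps
    linearly from 0 to 1 over the first `ramp` fraction of the cycle,
    then stays at 1; the cycle then repeats (the 'warm restart')."""
    period = total_steps / n_cycles
    phase = (step % period) / period      # position within the current cycle
    return min(1.0, phase / ramp)

# beta rises and resets 4 times over training, instead of rising only once
print([round(cyclical_beta(s, 1000), 2) for s in range(0, 1000, 125)])
```

Each reset re-enters the low-𝛽 regime with a decoder that has already learned to use latent codes from the previous cycle, which is the mechanism the abstract credits for avoiding KL vanishing.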

Journal ArticleDOI
TL;DR: A novel cross-modality interactive attention network that takes full advantage of the interactive properties of multispectral input sources is proposed that achieves state-of-the-art performance with high efficiency.

Journal ArticleDOI
TL;DR: Individuals with CPTSD reported substantially higher psychiatric burden and lower levels of psychological well-being compared to those with PTSD and those with neither diagnosis.
Abstract: The primary aim of this study was to provide an assessment of the current prevalence rates of International Classification of Diseases (11th rev.) posttraumatic stress disorder (PTSD) and complex PTSD (CPTSD) among the adult population of the United States and to identify characteristics and correlates associated with each disorder. A total of 7.2% of the sample met criteria for either PTSD or CPTSD, and the prevalence rates were 3.4% for PTSD and 3.8% for CPTSD. Women were more likely than men to meet criteria for both PTSD and CPTSD. Cumulative adulthood trauma was associated with both PTSD and CPTSD; however, cumulative childhood trauma was more strongly associated with CPTSD than PTSD. Among traumatic stressors occurring in childhood, sexual and physical abuse by caregivers were identified as events associated with risk for CPTSD, whereas sexual assault by noncaregivers and abduction were risk factors for PTSD. Adverse childhood events were associated with both PTSD and CPTSD, and equally so. Individuals with CPTSD reported substantially higher psychiatric burden and lower levels of psychological well-being compared to those with PTSD and those with neither diagnosis.

Journal ArticleDOI
TL;DR: This bibliometric study offers a systematic review of the research on smart cities produced since 1992 and helps bridge the divide affecting this research area, showing that this divide stems from the dichotomous nature of the smart city development paths to which each thematic cluster relates and the strategic principles each in turn supports.

Journal ArticleDOI
TL;DR: Following the recently published 11th version of the WHO International Classification of Diseases (ICD‐11), this work sought to examine the risk factors and comorbidities associated with posttraumatic stress disorder and complex PTSD.
Abstract: BACKGROUND: Following the recently published 11th version of the WHO International Classification of Diseases (ICD-11), we sought to examine the risk factors and comorbidities associated with posttraumatic stress disorder (PTSD) and complex PTSD (CPTSD). METHOD: Cross-sectional and retrospective design. The sample consisted of 1,051 trauma-exposed participants from a nationally representative panel of the UK adult population. RESULTS: A total of 5.3% (95% confidence interval [CI] = 4.0-6.7%) met the diagnostic criteria for PTSD and 12.9% (95% CI = 10.9-15.0%) for CPTSD. Diagnosis of PTSD was independently associated with being female, being in a relationship, and the recency of traumatic exposure. CPTSD was independently associated with younger age, interpersonal trauma in childhood, and interpersonal trauma in adulthood. Growing up in an urban environment was associated with the diagnosis of PTSD and CPTSD. High rates of physical and mental health comorbidity were observed for PTSD and CPTSD. Those with CPTSD were more likely to endorse symptoms reflecting major depressive disorder (odds ratio [OR] = 21.85, 95% CI = 12.51-38.04) and generalized anxiety disorder (OR = 24.63, 95% CI = 14.77-41.07). Presence of PTSD (OR = 3.13, 95% CI = 1.81-5.41) and CPTSD (OR = 3.43, 95% CI = 2.37-4.70) increased the likelihood of suicidality by more than three times. Nearly half the participants with PTSD and CPTSD reported the presence of a chronic illness. CONCLUSIONS: CPTSD is a more common, comorbid, debilitating condition compared to PTSD. Further research is now required to identify effective interventions for its treatment.
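As a generic aside on the statistics reported above: odds ratios and their Wald confidence intervals are derived from 2×2 counts. The sketch below uses illustrative numbers only, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """OR for a 2x2 table [[a, b], [c, d]] with a Wald 95% CI:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of ln(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

print(odds_ratio_ci(40, 60, 25, 875))   # illustrative counts only
```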

Journal ArticleDOI
TL;DR: The 4 ‘A’s Test is a short, pragmatic tool which can help improve detection rates of delirium in routine clinical care; compared with the Confusion Assessment Method, the 4AT showed substantially higher sensitivity, though slightly lower specificity.
Abstract: Delirium affects > 15% of hospitalised patients but is grossly underdetected, contributing to poor care. The 4 ‘A’s Test (4AT, www.the4AT.com) is a short delirium assessment tool designed for routine use without special training. The primary objective was to assess the accuracy of the 4AT for delirium detection. The secondary objective was to compare the 4AT with another commonly used delirium assessment tool, the Confusion Assessment Method (CAM). This was a prospective diagnostic test accuracy study set in emergency departments or acute medical wards involving acute medical patients aged ≥ 70. All those without acutely life-threatening illness or coma were eligible. Patients underwent (1) reference standard delirium assessment based on DSM-IV criteria and (2) were randomised to either the index test (4AT, scores 0–12; prespecified score of > 3 considered positive) or the comparator (CAM; scored positive or negative), in a random order, using computer-generated pseudo-random numbers, stratified by study site, with block allocation. Reference standard and 4AT or CAM assessments were performed by pairs of independent raters blinded to the results of the other assessment. Eight hundred forty-three individuals were randomised: 21 withdrew, 3 lost contact, 32 indeterminate diagnosis, 2 missing outcome, and 785 were included in the analysis. Mean age was 81.4 (SD 6.4) years. 12.1% (95/785) had delirium by reference standard assessment, 14.3% (56/392) by 4AT, and 4.7% (18/384) by CAM. The 4AT had an area under the receiver operating characteristic curve of 0.90 (95% CI 0.84–0.96). The 4AT had a sensitivity of 76% (95% CI 61–87%) and a specificity of 94% (95% CI 92–97%). The CAM had a sensitivity of 40% (95% CI 26–57%) and a specificity of 100% (95% CI 98–100%). The 4AT is a short, pragmatic tool which can help improve detection rates of delirium in routine clinical care. International standard randomised controlled trial number (ISRCTN) 53388093. Date applied 30/05/2014; date assigned 02/06/2014.
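A sketch of 4AT scoring follows, with items paraphrased from the published tool (www.the4AT.com, cited above); consult the tool itself for exact wording and administration instructions, and treat this encoding as illustrative.

```python
def four_at_score(alertness_abnormal, amt4_errors, months_backward_ok,
                  months_untestable, acute_change):
    """Sketch of 4AT scoring (items paraphrased from the published tool
    at www.the4AT.com). Total ranges 0-12; a score of 4 or more
    (i.e. > 3, as prespecified in the study) suggests possible delirium."""
    score = 4 if alertness_abnormal else 0            # item 1: alertness
    score += 0 if amt4_errors == 0 else (1 if amt4_errors == 1 else 2)  # item 2: AMT4
    if months_untestable:                             # item 3: months backwards
        score += 2
    elif not months_backward_ok:                      # starts but < 7 months correct
        score += 1
    score += 4 if acute_change else 0                 # item 4: acute change/fluctuation
    return score, score >= 4

print(four_at_score(False, 1, False, False, True))   # (6, True)
```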

Journal ArticleDOI
TL;DR: The proposed MTM-EMBG structure is a cross-shaped microstrip transmission line on which are imprinted two outward facing E-shaped slits to suppress surface currents that would otherwise contribute towards mutual coupling between the array elements.
Abstract: This article presents a unique technique to enhance isolation between transmit/receive radiating elements in a densely packed array antenna by embedding a metamaterial (MTM) electromagnetic bandgap (EMBG) structure in the space between the radiating elements to suppress surface currents that would otherwise contribute towards mutual coupling between the array elements. The proposed MTM-EMBG structure is a cross-shaped microstrip transmission line on which are imprinted two outward facing E-shaped slits. Unlike other MTM structures, there is no short-circuit grounding using via-holes. With this approach, the maximum measured mutual coupling achieved is -60 dB @ 9.18 GHz between the transmit patches (#1 & #2) and receive patches (#3 & #4) in a four-element array antenna. Across the antenna’s measured operating frequency range of 9.12 to 9.96 GHz, the minimum measured isolation between each element of the array is 34.2 dB @ 9.48 GHz, and there is no degradation in radiation patterns. The average measured isolation over this frequency range is 47 dB. The results presented confirm the proposed technique is suitable for applications such as synthetic aperture radar (SAR) and multiple-input multiple-output (MIMO) systems.

Journal ArticleDOI
22 Mar 2019
TL;DR: An IoT-based forensic model is presented that supports the identification, acquisition, analysis, and presentation of potential artifacts of forensic interest from IoT devices and the underpinning infrastructure and uses the popular Amazon Echo as a use case to demonstrate how the proposed model can be used to guide forensics analysis of IoT devices.
Abstract: Internet of Things (IoT) devices are increasingly common in our society, and can be found in civilian settings as well as sensitive applications, such as battlefields and national security. Given the potential of these devices to be targeted by attackers, they are a valuable source in digital forensic investigations. In addition, incriminating evidence may be stored on an IoT device (e.g., Amazon Echo in a home environment and Fitbit worn by the victim or an accused person). In comparison to IoT security and privacy literature, IoT forensics is relatively under-studied. IoT forensics is also challenging in practice, particularly due to the complexity, diversity, and heterogeneity of IoT devices and ecosystems. In this paper, we present an IoT-based forensic model that supports the identification, acquisition, analysis, and presentation of potential artifacts of forensic interest from IoT devices and the underpinning infrastructure. Specifically, we use the popular Amazon Echo as a use case to demonstrate how our proposed model can be used to guide forensics analysis of IoT devices.

Journal ArticleDOI
TL;DR: A low-complexity unsupervised NILM algorithm is presented, which is inspired by a fuzzy clustering algorithm called entropy index constraints competitive agglomeration but adapted and improved for a practical load monitoring environment to produce a set of generalized appliance models for the detection of appliance usage within a household.
Abstract: Awareness of electric energy usage has both societal and economic benefits, which include reduced energy bills and stress on non-renewable energy sources. In recent years, there has been a surge in interest in the field of load monitoring, also referred to as energy disaggregation, which involves methods and techniques for monitoring electric energy usage and providing appropriate feedback on usage patterns to homeowners. The use of unsupervised learning in non-intrusive load monitoring (NILM) is a key area of study, with practical solutions having wide implications for energy monitoring. In this paper, a low-complexity unsupervised NILM algorithm is presented, which is designed toward practical implementation. The algorithm is inspired by a fuzzy clustering algorithm called entropy index constraints competitive agglomeration, but adapted and improved for a practical load monitoring environment to produce a set of generalized appliance models for the detection of appliance usage within a household. Experimental evaluation conducted using energy data from the reference energy data disaggregation dataset indicates that the algorithm outperforms recent state-of-the-art unsupervised NILM methods at event detection when considering common NILM metrics, such as accuracy, precision, recall, F-measure, and total energy correctly assigned.
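As a heavily simplified stand-in for the event-detection step (the paper's actual method is a fuzzy competitive-agglomeration clustering variant, not reproduced here), the sketch below flags step changes in aggregate power whose magnitudes could then be clustered into generalized appliance models; the threshold and all names are mine.

```python
import numpy as np

def detect_events(power, threshold=50.0):
    """Toy NILM event detector: flag samples where aggregate power
    changes by more than `threshold` watts, returning event indices
    and step sizes. Clustering these deltas is what yields
    generalized appliance models in unsupervised NILM."""
    deltas = np.diff(power)
    idx = np.nonzero(np.abs(deltas) > threshold)[0] + 1
    return idx, deltas[idx - 1]

# e.g. a 2 kW kettle switching on then off in an otherwise flat signal
signal = np.array([100.0] * 5 + [2100.0] * 5 + [100.0] * 5)
print(detect_events(signal))   # events at samples 5 and 10, +/-2000 W steps
```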