
Showing papers by "Simon Maskell published in 2018"


Journal ArticleDOI
TL;DR: The results clearly suggest that broad-ranging statistical signal detection in Twitter and Facebook, using currently available methods for adverse event recognition, performs poorly and cannot be recommended at the expense of other pharmacovigilance activities.
Abstract: Social media has been proposed as a possibly useful data source for pharmacovigilance signal detection. This study primarily aimed to evaluate the performance of established statistical signal detection algorithms in Twitter/Facebook for a broad range of drugs and adverse events. Performance was assessed using a reference set by Harpaz et al., consisting of 62 US Food and Drug Administration labelling changes, and an internal WEB-RADR reference set consisting of 200 validated safety signals. In total, 75 drugs were studied. Twitter/Facebook posts were retrieved for the period March 2012 to March 2015, and drugs/events were extracted from the posts. We retrieved 4.3 million and 2.0 million posts for the WEB-RADR and Harpaz drugs, respectively. Individual case reports were extracted from VigiBase for the same period. Disproportionality algorithms based on the Information Component or the Proportional Reporting Ratio and crude post/report counting were applied in Twitter/Facebook and VigiBase. Receiver operating characteristic curves were generated, and the relative timing of alerting was analysed. Across all algorithms, the area under the receiver operating characteristic curve for Twitter/Facebook varied between 0.47 and 0.53 for the WEB-RADR reference set and between 0.48 and 0.53 for the Harpaz reference set. For VigiBase, the ranges were 0.64–0.69 and 0.55–0.67, respectively. In Twitter/Facebook, at best, 31 (16%) and four (6%) positive controls were detected prior to their index dates in the WEB-RADR and Harpaz references, respectively. In VigiBase, the corresponding numbers were 66 (33%) and 17 (27%). Our results clearly suggest that broad-ranging statistical signal detection in Twitter and Facebook, using currently available methods for adverse event recognition, performs poorly and cannot be recommended at the expense of other pharmacovigilance activities.
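The disproportionality statistics evaluated above can be illustrated with a minimal sketch of the Proportional Reporting Ratio (PRR), one of the measures the study applied; the contingency-table counts below are invented for illustration, not taken from the paper.

```python
# Minimal sketch of the Proportional Reporting Ratio (PRR), one of the
# disproportionality statistics evaluated in the study. The counts are
# illustrative, not taken from the paper.

def prr(a, b, c, d):
    """PRR from a 2x2 contingency table:
    a: reports with drug and event       b: reports with drug, other events
    c: reports without drug, with event  d: reports without drug, other events
    """
    rate_drug = a / (a + b)    # event rate among reports for the drug
    rate_other = c / (c + d)   # event rate among all other reports
    return rate_drug / rate_other

# Example: the event is reported 4x more often with the drug than without.
print(prr(a=20, b=80, c=50, d=950))  # -> 4.0
```

A PRR well above 1 flags a drug/event pair for review; the study's point is that in Twitter/Facebook data such statistics performed close to chance (AUC ≈ 0.5).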

39 citations


Journal ArticleDOI
TL;DR: This study proposes a causality measure that can detect an adverse reaction that is caused by a drug rather than merely being a correlated signal, and obtains an ADR detection accuracy of 74% on a large-scale manually annotated dataset of tweets.
Abstract: Detecting adverse drug reactions (ADRs) is an important task that has direct implications for the use of that drug. If we can detect previously unknown ADRs as quickly as possible, then this information can be provided to the regulators, pharmaceutical companies, and health care organizations, thereby potentially reducing drug-related morbidity and saving lives of many patients. A promising approach for detecting ADRs is to use social media platforms such as Twitter and Facebook. A high level of correlation between a drug name and an event may be an indication of a potential adverse reaction associated with that drug. Although numerous association measures have been proposed by the signal detection community for identifying ADRs, these measures are limited in that they detect correlations but often ignore causality. This study aimed to propose a causality measure that can detect an adverse reaction that is caused by a drug rather than merely being a correlated signal. To the best of our knowledge, this was the first causality-sensitive approach for detecting ADRs from social media. Specifically, the relationship between a drug and an event was represented using a set of automatically extracted lexical patterns. We then learned the weights for the extracted lexical patterns that indicate their reliability for expressing an adverse reaction of a given drug. Our proposed method obtains an ADR detection accuracy of 74% on a large-scale manually annotated dataset of tweets, covering a standard set of drugs and adverse reactions. By using lexical patterns, we can accurately detect the causality between drugs and adverse reaction-related events.
<imports>
</imports>
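A hedged sketch of scoring a (drug, event) pair with weighted lexical patterns, in the spirit of the approach described; the patterns, weights, and threshold below are invented for illustration and do not reproduce the paper's learned model.

```python
# Hypothetical weighted lexical patterns: causal phrasings score high,
# weak co-occurrence cues score low. All values are illustrative.
import re

PATTERN_WEIGHTS = {
    r"\bDRUG\b.*\bgave me\b.*\bEVENT\b": 1.5,
    r"\bEVENT\b.*\bafter taking\b.*\bDRUG\b": 1.2,
    r"\bDRUG\b.*\band\b.*\bEVENT\b": 0.1,   # mere co-occurrence cue
}

def causality_score(post, drug, event):
    # Normalise the post, then abstract the drug/event mentions so the
    # same pattern set applies to any drug/event pair.
    text = post.lower().replace(drug.lower(), "DRUG").replace(event.lower(), "EVENT")
    return sum(w for pat, w in PATTERN_WEIGHTS.items() if re.search(pat, text))

post = "this ibuprofen gave me a terrible headache"
print(causality_score(post, "ibuprofen", "headache") > 1.0)  # -> True
```

The score aggregates pattern reliabilities, so a pair supported only by co-occurrence phrasing stays below a causal threshold.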

34 citations


Journal ArticleDOI
TL;DR: In this article, a general method of model selection from experimentally recorded time-trace data is discussed, which can be used to distinguish between quantum and classical dynamical models, and it can also be used in postselection as well as for real-time analysis.
Abstract: We discuss a general method of model selection from experimentally recorded time-trace data. This method can be used to distinguish between quantum and classical dynamical models. It can be used in postselection as well as for real-time analysis, and offers an alternative to statistical tests based on state-reconstruction methods. We examine the conditions that optimize quantum hypothesis testing, maximizing one's ability to discriminate between classical and quantum models. We set upper limits on the temperature and lower limits on the measurement efficiencies required to explore these differences, using an experiment in levitated optomechanical systems as an example.
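The model-selection idea can be sketched generically as a log-likelihood-ratio test between two candidate dynamical models of a recorded time trace; the Gaussian observation models below are placeholders for illustration, not the paper's optomechanical models.

```python
# Generic sketch: pick between two candidate models of a time trace by
# comparing log-likelihoods. The "classical" and "quantum" observation
# models here are hypothetical Gaussians, chosen only to illustrate the test.
import math

def log_likelihood(trace, mean, var):
    return sum(-0.5 * math.log(2 * math.pi * var) - (x - mean) ** 2 / (2 * var)
               for x in trace)

def select_model(trace):
    ll_classical = log_likelihood(trace, mean=0.0, var=4.0)  # broader noise model
    ll_quantum = log_likelihood(trace, mean=0.0, var=1.0)    # narrower noise model
    return "quantum" if ll_quantum > ll_classical else "classical"

print(select_model([0.2, -0.5, 0.8, -0.1, 0.3]))  # -> quantum
```

In the paper's setting, the discriminating power of such a test is what degrades with temperature and improves with measurement efficiency.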

18 citations


Journal ArticleDOI
TL;DR: A novel method through which local information about the target density can be used to construct an efficient importance sampler, whose configuration effectively depends only on the shape of the target and on a single free parameter representing pseudo-time.
Abstract: This work proposes a novel method through which local information about the target density can be used to construct an efficient importance sampler. The backbone of the proposed method is the incremental mixture importance sampling (IMIS) algorithm of Raftery and Bao (Biometrics 66(4):1162–1173, 2010), which builds a mixture importance distribution incrementally, by positioning new mixture components where the importance density lacks mass, relative to the target. The key innovation proposed here is to construct the mean vectors and covariance matrices of the mixture components by numerically solving certain differential equations, whose solution depends on the local shape of the target log-density. The new sampler has a number of advantages: (a) it provides an extremely parsimonious parametrization of the mixture importance density, whose configuration effectively depends only on the shape of the target and on a single free parameter representing pseudo-time; (b) it scales well with the dimensionality of the target; (c) it can deal with targets that are not log-concave. The performance of the proposed approach is demonstrated on two synthetic non-Gaussian densities, one being defined on up to eighty dimensions, and on a Bayesian logistic regression model, using the Sonar dataset. The Julia code implementing the importance sampler proposed here can be found at https://github.com/mfasiolo/LIMIS.
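The incremental-mixture backbone can be sketched in one dimension: repeatedly place a new component where the current importance density most under-weights the target. This toy uses the highest-weight sample as the new component's mean; the paper's actual construction derives means and covariances by solving differential equations, which is omitted here.

```python
# 1-D toy of the incremental mixture idea behind IMIS-style samplers.
# All numbers (target, initial component, iteration count) are illustrative.
import math, random
random.seed(0)

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def target(x):  # bimodal target density (unnormalised)
    return normal_pdf(x, -3, 1) + normal_pdf(x, 3, 1)

components = [(0.0, 3.0)]  # start from a single broad component (mean, sd)
for _ in range(5):
    mixture = lambda x: sum(normal_pdf(x, m, s) for m, s in components) / len(components)
    xs = [random.gauss(m, s) for m, s in components for _ in range(200)]
    ws = [target(x) / mixture(x) for x in xs]  # importance weights
    x_star = xs[ws.index(max(ws))]             # where mass is most lacking
    components.append((x_star, 1.0))           # add a component there

print(len(components))  # -> 6
```

The new components migrate toward the under-covered modes, which is the mechanism the paper's differential-equation construction makes both parsimonious and dimension-robust.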

13 citations


Journal ArticleDOI
TL;DR: This paper outlines an approach to the simulation of complex urban environments and demonstrates the viability of using this approach for the generation of simulated sensor data, corresponding to the use of wide area imaging systems for surveillance and reconnaissance applications.

7 citations


01 Jan 2018
TL;DR: This paper is an extension of previous works by the authors, where all uncertainties pertaining to the complete fusion life cycle are now jointly and comprehensively considered.
Abstract: © 2018 JAIF. In this paper, the uncertainties that enter through the life-cycle of an information fusion system are exhaustively and explicitly considered and defined. Addressing the factors that influence a fusion system is an essential step required before uncertainty representation and reasoning processes within a fusion system can be evaluated according to the Uncertainty Representation and Reasoning Evaluation Framework (URREF) ontology. The life cycle of a fusion system consists primarily of two stages, namely inception and design, as well as routine operation and assessment. During the inception and design stage, the primary flow is that of abstraction, through modelling and representation of real-world phenomena. This stage is mainly characterised by epistemic uncertainty. During the routine operation and assessment stage, aleatory uncertainty combines with epistemic uncertainty from the design phase as well as uncertainty about the effect of actions on the mission in a feedback loop (another form of epistemic uncertainty). Explicit and accurate internal modelling of these uncertainties, and the evaluation of how these uncertainties are represented and reasoned about in the fusion system using the URREF ontology, are the main contributions of this paper for the information fusion community. This paper is an extension of previous works by the authors, where all uncertainties pertaining to the complete fusion life cycle are now jointly and comprehensively considered. Also, uncertainties pertaining to the decision process are further detailed.

6 citations


Proceedings ArticleDOI
23 Jul 2018
TL;DR: In this paper, a text-guided deep convolutional neural network classifier is proposed for aerial vehicle recognition; the model is trained and tested on a synthetic aerial dataset, with desired classes formed from combinations of the class types and colors of the vehicles.
Abstract: This paper investigates the problem of aerial vehicle recognition using a text-guided deep convolutional neural network classifier. The network receives an aerial image and a desired class, and makes a yes or no output by matching the image and the textual description of the desired class. We train and test our model on a synthetic aerial dataset and our desired classes consist of the combination of the class types and colors of the vehicles. This strategy helps when considering more classes in testing than in training.
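The yes/no matching step can be sketched as scoring an image embedding against a text embedding of the desired class; the vectors and threshold below are invented stand-ins for the paper's trained CNN and text encoder.

```python
# Toy sketch of yes/no classification by image/text matching. In practice
# the embeddings would come from trained image and text encoders; the
# values here are hypothetical.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

image_embedding = [0.9, 0.1, 0.8]  # features of an observed vehicle
class_embeddings = {               # text embeddings of desired classes
    "red car": [1.0, 0.0, 0.9],
    "blue truck": [0.0, 1.0, 0.1],
}

def matches(image_vec, class_name, threshold=1.0):
    return "yes" if dot(image_vec, class_embeddings[class_name]) > threshold else "no"

print(matches(image_embedding, "red car"))     # -> yes
print(matches(image_embedding, "blue truck"))  # -> no
```

Because the decision is a match against a textual description rather than a fixed output layer, unseen class combinations can be queried at test time, which is the generalisation the abstract highlights.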

6 citations


Posted Content
TL;DR: An approach to the simulation of complex urban environments is outlined, and the viability of using this approach for the generation of simulated sensor data, corresponding to the use of wide area imaging systems for surveillance and reconnaissance applications, is demonstrated.
Abstract: The development, benchmarking and validation of aerial Persistent Surveillance (PS) algorithms requires access to specialist Wide Area Aerial Surveillance (WAAS) datasets. Such datasets are difficult to obtain and are often extremely large both in spatial resolution and temporal duration. This paper outlines an approach to the simulation of complex urban environments and demonstrates the viability of using this approach for the generation of simulated sensor data, corresponding to the use of wide area imaging systems for surveillance and reconnaissance applications. This provides a cost-effective method to generate datasets for vehicle tracking algorithms and anomaly detection methods. The system fuses the Simulation of Urban Mobility (SUMO) traffic simulator with a MATLAB controller and an image generator to create scenes containing uninterrupted door-to-door journeys across large areas of the urban environment. This `pattern-of-life' approach provides three-dimensional visual information with natural movement and traffic flows. This can then be used to provide simulated sensor measurements (e.g. visual band and infrared video imagery) and automatic access to ground-truth data for the evaluation of multi-target tracking systems.

2 citations


Posted Content
TL;DR: This paper investigates the problem of aerial vehicle recognition using a text-guided deep convolutional neural network classifier, and makes a yes or no output by matching the image and the textual description of the desired class.
Abstract: This paper investigates the problem of aerial vehicle recognition using a text-guided deep convolutional neural network classifier. The network receives an aerial image and a desired class, and makes a yes or no output by matching the image and the textual description of the desired class. We train and test our model on a synthetic aerial dataset and our desired classes consist of the combination of the class types and colors of the vehicles. This strategy helps when considering more classes in testing than in training.

1 citation


Proceedings ArticleDOI
10 Jul 2018
TL;DR: This paper presents a comparative study of several supervised spectral embedding techniques and their relationship with the feature space used to describe the exemplars that act as inputs to an embedding procedure, and formulates recommendations on the use of such methods for fusing multiple views of an object to recognize it under variable poses.
Abstract: Manifold embedding techniques have properties that render them attractive candidates to learn a compact and general representation of a three dimensional spatial object. In turn this representation can be used for object recognition through classification. This paper presents a comparative study of several supervised spectral embedding techniques and their relationship with the feature space used to describe the exemplars which act as inputs to an embedding procedure. By concentrating on this aspect, we are able to highlight preferential combinations between feature description and embedding, and we formulate recommendations on the use of such methods for fusing multiple views of an object to recognize it under variable poses.
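A minimal sketch of a spectral embedding of the kind the study compares, via an (unsupervised) graph-Laplacian eigendecomposition; the tiny similarity graph of four exemplar views is invented for illustration.

```python
# Laplacian-eigenmap-style embedding of four exemplar "views". Views 0-1
# are mutually similar, views 2-3 are mutually similar, and the two groups
# are weakly connected. All similarity values are hypothetical.
import numpy as np

W = np.array([[0.0, 1.0, 0.1, 0.1],
              [1.0, 0.0, 0.1, 0.1],
              [0.1, 0.1, 0.0, 1.0],
              [0.1, 0.1, 1.0, 0.0]])

D = np.diag(W.sum(axis=1))
L = D - W                      # unnormalised graph Laplacian
vals, vecs = np.linalg.eigh(L) # eigenvalues in ascending order
embedding = vecs[:, 1]         # Fiedler vector: a 1-D embedding

# The two groups of views separate by sign in the embedding coordinate.
print(np.sign(embedding[0]) == np.sign(embedding[1]))  # -> True
print(np.sign(embedding[0]) == np.sign(embedding[2]))  # -> False
```

The paper's comparison concerns how the choice of feature space used to build such a similarity graph interacts with the embedding technique chosen.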

1 citation


Proceedings ArticleDOI
10 Jul 2018
TL;DR: This paper considers the problem of tracking a manoeuvring target when some of the measurements are delayed by the time taken to propagate through some medium, and uses a particle filter, which handles out-of-sequence measurements by storing a history of hypothesised target states and measurement emission times for each particle.
Abstract: This paper considers the problem of tracking a manoeuvring target when some of the measurements are delayed by the time taken to propagate through some medium. We are especially interested in bearing-only measurements, since it is possible to extract range information by fusing measurements which have negligible propagation delay (such as from electro-optical sensors) and measurements which have a propagation delay proportional to the range to the target (such as from acoustic sensors). This requires us to handle measurements which appear out of sequence, and with emission times unknown to the tracker. Unlike previous approaches, a particle filter is used, which handles out-of-sequence measurements by storing a history of hypothesised target states and measurement emission times for each particle. This allows new target states and times to be inserted into the trajectory of each target by interpolating between adjacent states in the history.
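The key device, inserting a hypothesised state at a delayed measurement's emission time by interpolating within a particle's stored history, can be sketched as follows; the 1-D linear interpolation and the example trajectory are illustrative only.

```python
# Schematic sketch: each particle stores a sorted history of (time, state)
# pairs, so an out-of-sequence measurement emitted at time t can be handled
# by interpolating a state into the trajectory. Details are illustrative.

def insert_state(history, t):
    """Insert an interpolated (time, state) pair into a sorted history."""
    for i in range(len(history) - 1):
        (t0, x0), (t1, x1) = history[i], history[i + 1]
        if t0 <= t <= t1:
            frac = (t - t0) / (t1 - t0)
            x = x0 + frac * (x1 - x0)   # linear interpolation between states
            history.insert(i + 1, (t, x))
            return x
    raise ValueError("emission time outside stored history")

# One particle's stored trajectory: 1-D position at times 0, 1, 2.
particle_history = [(0.0, 0.0), (1.0, 2.0), (2.0, 5.0)]
x_at_emission = insert_state(particle_history, 1.5)  # measurement emitted at t=1.5
print(x_at_emission)          # -> 3.5
print(len(particle_history))  # -> 4
```

The interpolated state can then be updated against the delayed measurement without reprocessing the whole measurement sequence, which is what makes the per-particle history practical.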