Author

Oscar Valdez

Bio: Oscar Valdez is an academic researcher from the University of Texas at Austin. The author has contributed to research in the topics Gravitational wave and Frequency domain. The author has an h-index of 1 and has co-authored 1 publication receiving 4 citations.

Papers
Journal ArticleDOI
TL;DR: In this article, a multilayer signal estimation (MuLaSE) pipeline was proposed to search for GW signals from CCSNe, based on multistage, high-accuracy spectral estimation that effectively achieves a higher detection signal-to-noise ratio.
Abstract: In the post-detection era of gravitational wave (GW) astronomy, core collapse supernovae (CCSN) are one of the most interesting potential sources of signals arriving at the Advanced LIGO detectors. Mukherjee et al. have developed and implemented a new method to search for GW signals from CCSNe, based on a multistage, high-accuracy spectral estimation to effectively achieve a higher detection signal-to-noise ratio (SNR). The study has been further enhanced by the incorporation of a convolutional neural network (CNN) to significantly reduce false alarm rates (FAR). The combined pipeline, termed multilayer signal estimation (MuLaSE), works in an integrative manner with the coherent wave burst (cWB) pipeline. In order to compare the performance of this new search pipeline, termed ``MuLaSECC'', with cWB, an extensive analysis has been performed with two families of core collapse supernova waveforms corresponding to two different three dimensional (3D) general relativistic CCSN explosion models, viz. Kuroda 2017 and Ott 2013. The performance of this pipeline has been characterized through receiver operating characteristics (ROC) and the reconstruction of the detected signals. MuLaSECC is found to have higher efficiency in the low false alarm range, a higher detection probability for weak signals, and improved reconstruction, especially in the lower frequency domain.
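The core pattern the abstract describes, a refined spectral-estimation stage feeding a CNN that vetoes false alarms, can be sketched in a few lines. The following is a minimal illustration of that pattern, not the authors' MuLaSE/cWB code; the sampling rate, Welch spectrogram stand-in, layer sizes, and the `GlitchVetoCNN` name are all assumptions.

```python
# Minimal sketch (not the authors' MuLaSE code): a high-resolution
# spectrogram feature stage followed by a small CNN classifier, the
# general "spectral estimation + CNN veto" pattern described above.
import numpy as np
from scipy import signal
import torch
import torch.nn as nn

fs = 4096  # Hz, typical LIGO strain sampling rate (assumption)

def spectrogram_features(strain: np.ndarray) -> torch.Tensor:
    """Welch-style spectrogram as a stand-in for the multistage,
    high-accuracy spectral estimation stage."""
    f, t, Sxx = signal.spectrogram(strain, fs=fs, nperseg=256, noverlap=192)
    logS = np.log1p(Sxx)                                  # compress dynamic range
    logS = (logS - logS.mean()) / (logS.std() + 1e-12)    # normalize
    return torch.tensor(logS, dtype=torch.float32)[None, None]  # (1,1,F,T)

class GlitchVetoCNN(nn.Module):
    """Tiny CNN scoring a spectrogram as signal vs. noise;
    the layer sizes are illustrative only."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(16, 1),
        )
    def forward(self, x):
        return torch.sigmoid(self.net(x))  # P(signal) in [0, 1]

# Toy usage: score one second of simulated white detector noise.
x = spectrogram_features(np.random.randn(fs))
print(GlitchVetoCNN()(x).item())
```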

7 citations


Cited by
Journal Article
TL;DR: The first direct detection of gravitational waves and the first observation of a binary black hole merger were reported in this paper, with a false alarm rate estimated to be less than 1 event per 203,000 years, equivalent to a significance greater than 5.1σ.
Abstract: On September 14, 2015 at 09:50:45 UTC the two detectors of the Laser Interferometer Gravitational-Wave Observatory simultaneously observed a transient gravitational-wave signal. The signal sweeps upwards in frequency from 35 to 250 Hz with a peak gravitational-wave strain of 1.0 × 10⁻²¹. It matches the waveform predicted by general relativity for the inspiral and merger of a pair of black holes and the ringdown of the resulting single black hole. The signal was observed with a matched-filter signal-to-noise ratio of 24 and a false alarm rate estimated to be less than 1 event per 203,000 years, equivalent to a significance greater than 5.1σ. The source lies at a luminosity distance of 410 (+160, −180) Mpc corresponding to a redshift z = 0.09 (+0.03, −0.04). In the source frame, the initial black hole masses are 36 (+5, −4) M⊙ and 29 (+4, −4) M⊙, and the final black hole mass is 62 (+4, −4) M⊙, with 3.0 (+0.5, −0.5) M⊙c² radiated in gravitational waves. All uncertainties define 90% credible intervals. These observations demonstrate the existence of binary stellar-mass black hole systems. This is the first direct detection of gravitational waves and the first observation of a binary black hole merger.
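The matched-filter SNR of 24 quoted above comes from correlating the data against a waveform template and reading off the peak. A toy version under a white-noise assumption; the chirp shape, injected amplitude, and sampling rate below are illustrative choices, not the LIGO analysis:

```python
# Toy matched filter: the statistic behind the quoted SNR of 24.
import numpy as np

fs = 4096                                   # Hz, assumed sampling rate
t = np.arange(0, 1.0, 1 / fs)

# Toy upward-sweeping chirp (~35 -> 235 Hz) standing in for the
# general-relativity merger template described in the abstract.
template = np.sin(2 * np.pi * (35 * t + 100 * t ** 2))
template /= np.linalg.norm(template)        # unit-norm template

# Inject the template at amplitude 24 into unit-variance white noise,
# so the recovered peak should land near the paper's quoted SNR.
data = 24 * template + np.random.randn(len(t))

# For white noise, matched filtering reduces to correlation with the
# unit-norm template; the SNR is the peak over all time shifts.
snr_series = np.correlate(data, template, mode="same")
print("peak matched-filter SNR ~", np.abs(snr_series).max())
```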

4,375 citations

01 Jan 2016
Applied Missing Data Analysis

1,924 citations

Journal Article
TL;DR: In this paper, the authors studied the potential sensitivity of prospective detection scenarios for GWs from CCSNe within 5 Mpc, using realistic noise at the predicted sensitivity of the Advanced LIGO and Advanced Virgo detectors for 2015, 2017, and 2019.
Abstract: The next galactic core-collapse supernova (CCSN) has already exploded, and its electromagnetic (EM) waves, neutrinos, and gravitational waves (GWs) may arrive at any moment. We present an extensive study on the potential sensitivity of prospective detection scenarios for GWs from CCSNe within 5 Mpc, using realistic noise at the predicted sensitivity of the Advanced LIGO and Advanced Virgo detectors for 2015, 2017, and 2019. We quantify the detectability of GWs from CCSNe within the Milky Way and Large Magellanic Cloud, for which there will be an observed neutrino burst. We also consider extreme GW emission scenarios for more distant CCSNe with an associated EM signature. We find that a three-detector network at design sensitivity will be able to detect neutrino-driven CCSN explosions out to ∼5.5 kpc, while rapidly rotating core collapse will be detectable out to the Large Magellanic Cloud at 50 kpc. Of the phenomenological models for extreme GW emission scenarios considered in this study, such as long-lived bar-mode instabilities and disk fragmentation instabilities, all will be detectable out to M31 at 0.77 Mpc, while the most extreme will be detectable out to M82 at 3.52 Mpc and beyond.
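The horizon distances quoted above follow from the 1/d falloff of GW strain: the matched-filter SNR scales inversely with distance, so the horizon is simply where the SNR drops to the detection threshold. A back-of-the-envelope sketch; the reference SNR, distance, and threshold below are placeholder numbers, not values from the paper:

```python
# Distance scaling behind detection horizons: SNR ∝ 1/d for GW strain.

def horizon_distance(snr_ref: float, d_ref_kpc: float,
                     snr_threshold: float = 8.0) -> float:
    """Distance (kpc) at which a source with SNR `snr_ref` at
    `d_ref_kpc` falls to `snr_threshold`, assuming SNR ∝ 1/d."""
    return d_ref_kpc * snr_ref / snr_threshold

# E.g. a hypothetical model giving SNR 44 at 1 kpc would be
# detectable out to ~5.5 kpc at a threshold SNR of 8.
print(horizon_distance(snr_ref=44.0, d_ref_kpc=1.0))  # -> 5.5
```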

8 citations

Journal ArticleDOI
26 Jul 2022
TL;DR: In this paper, the authors employ generative adversarial networks (GAN) to learn the underlying distribution of blip glitches and to generate artificial populations for large-scale studies, such as stress testing detection pipelines.
Abstract: The noise of gravitational-wave (GW) interferometers limits their sensitivity and impacts the data quality, hindering the detection of GW signals from astrophysical sources. For transient searches, the most problematic are transient noise artifacts, known as glitches, which occur at a rate of around 1 min⁻¹ and can mimic GW signals. Because of this, there is a need for better modeling and inclusion of glitches in large-scale studies, such as stress testing the pipelines. In this proof-of-concept work we employ generative adversarial networks (GAN), a state-of-the-art deep learning algorithm inspired by game theory, to learn the underlying distribution of blip glitches and to generate artificial populations. We reconstruct the glitch in the time domain, providing a smooth input that the GAN can learn. With this methodology, we can create distributions of ∼10³ glitches from the Hanford and Livingston detectors in less than one second. Furthermore, we employ several metrics to measure the performance of our methodology and the quality of its generations. This investigation will be extended in the future to different glitch classes with the final goal of creating an open-source interface for mock data generation.
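The adversarial setup the abstract describes pairs a generator that maps latent noise to time-domain glitch waveforms with a discriminator that scores real versus generated segments. A minimal PyTorch sketch of that pattern; the architectures, sizes, and the dummy "real" batch are assumptions, not the paper's model:

```python
# Minimal GAN sketch: generator produces time-domain "glitches",
# discriminator distinguishes them from real segments.
import torch
import torch.nn as nn

LATENT, GLITCH_LEN = 64, 512   # latent size and samples per glitch (assumed)

G = nn.Sequential(nn.Linear(LATENT, 256), nn.ReLU(),
                  nn.Linear(256, GLITCH_LEN), nn.Tanh())
D = nn.Sequential(nn.Linear(GLITCH_LEN, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))            # raw logit: real vs. fake

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(32, GLITCH_LEN)   # stand-in for whitened blip glitches

# One adversarial step; in practice this loop runs over the glitch dataset.
fake = G(torch.randn(32, LATENT))
loss_d = (bce(D(real), torch.ones(32, 1)) +
          bce(D(fake.detach()), torch.zeros(32, 1)))
opt_d.zero_grad(); loss_d.backward(); opt_d.step()

loss_g = bce(D(fake), torch.ones(32, 1))        # generator tries to fool D
opt_g.zero_grad(); loss_g.backward(); opt_g.step()

# Sampling is a single forward pass, which is why generating ~10^3
# artificial glitches takes well under a second.
with torch.no_grad():
    population = G(torch.randn(1000, LATENT))   # (1000, GLITCH_LEN)
```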

4 citations
