
Showing papers by "Jeffrey H. Shapiro" published in 2015


Journal ArticleDOI
TL;DR: The error probability of this microwave quantum-illumination system, or quantum radar, is shown to be superior to that of any classical microwave radar of equal transmitted energy.
Abstract: Quantum illumination is a quantum-optical sensing technique in which an entangled source is exploited to improve the detection of a low-reflectivity object that is immersed in a bright thermal background. Here, we describe and analyze a system for applying this technique at microwave frequencies, a more appropriate spectral region for target detection than the optical, due to the naturally occurring bright thermal background in the microwave regime. We use an electro-optomechanical converter to entangle microwave signal and optical idler fields, with the former being sent to probe the target region and the latter being retained at the source. The microwave radiation collected from the target region is then phase conjugated and upconverted into an optical field that is combined with the retained idler in a joint-detection quantum measurement. The error probability of this microwave quantum-illumination system, or quantum radar, is shown to be superior to that of any classical microwave radar of equal transmitted energy.

391 citations
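For context on the claimed advantage, the sketch below compares Chernoff-type error-probability bounds in the standard Gaussian-state quantum-illumination regime (weak signal, low-reflectivity target, bright background); the parameter values are assumptions chosen for illustration, and the 3 dB / 6 dB error-exponent gains noted in the comments come from that general analysis rather than from this paper's specific microwave system.

import numpy as np

# Illustrative sketch (assumed parameters, not this paper's): Chernoff-type bounds on
# target-detection error probability in the weak-signal, bright-background regime.
def qi_error_bounds(M, kappa, N_S, N_B):
    """M signal-idler mode pairs, target reflectivity kappa, signal photons/mode N_S,
    background photons/mode N_B (kappa, N_S << 1 and N_B >> 1)."""
    classical = 0.5 * np.exp(-M * kappa * N_S / (4.0 * N_B))    # coherent-state radar benchmark
    phase_conj = 0.5 * np.exp(-M * kappa * N_S / (2.0 * N_B))   # phase-conjugate receiver: 3 dB exponent gain
    optimum_qi = 0.5 * np.exp(-M * kappa * N_S / N_B)           # optimum joint measurement: 6 dB exponent gain
    return classical, phase_conj, optimum_qi

print(qi_error_bounds(M=1e7, kappa=0.01, N_S=0.01, N_B=20.0))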


Journal ArticleDOI
TL;DR: This work experimentally demonstrates an entanglement-enhanced sensing system that is resilient to quantum decoherence and suggests that advantageous quantum-sensing technology could be developed for practical situations.
Abstract: Nonclassical states are essential for optics-based quantum information processing, but their fragility limits their utility for practical scenarios in which loss and noise inevitably degrade, if not destroy, nonclassicality. Exploiting nonclassical states in quantum metrology yields sensitivity advantages over all classical schemes delivering the same energy per measurement interval to the sample being probed. These enhancements, almost without exception, are severely diminished by quantum decoherence. Here, we experimentally demonstrate an entanglement-enhanced sensing system that is resilient to quantum decoherence. We employ entanglement to realize a 20% signal-to-noise ratio improvement over the optimum classical scheme in an entanglement-breaking environment plagued by 14 dB of loss and a noise background 75 dB stronger than the returned probe light. Our result suggests that advantageous quantum-sensing technology could be developed for practical situations.

227 citations
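For a sense of scale, the quoted figures convert to linear units as follows (simple unit conversions, not additional results from the paper):

\[
10^{-14/10} \approx 0.04 \ (\sim 4\%\ \text{transmissivity}), \qquad
10^{75/10} \approx 3.2\times 10^{7}\ (\text{background-to-probe power ratio}), \qquad
10\log_{10}(1.2) \approx 0.79\ \text{dB (SNR gain)}.
\]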


Journal ArticleDOI
TL;DR: A robust method for estimating depth and reflectivity using a fixed dwell time per pixel and on the order of one detected photon per pixel averaged over the scene; it increases photon efficiency 100-fold over traditional processing and also modestly improves upon first-photon imaging under a total acquisition-time constraint in raster-scanned operation.
Abstract: Capturing depth and reflectivity images at low light levels from active illumination of a scene has wide-ranging applications. Conventionally, even with detectors sensitive to individual photons, hundreds of photon detections are needed at each pixel to mitigate Poisson noise. We develop a robust method for estimating depth and reflectivity using fixed dwell time per pixel and on the order of one detected photon per pixel averaged over the scene. Our computational image formation method combines physically accurate single-photon counting statistics with exploitation of the spatial correlations present in real-world reflectivity and 3-D structure. Experiments conducted in the presence of strong background light demonstrate that our method is able to accurately recover scene depth and reflectivity, while traditional imaging methods based on maximum likelihood (ML) estimation or approximations thereof lead to noisier images. For depth, performance compares favorably to signal-independent noise removal algorithms such as median filtering or block-matching and 3-D filtering (BM3D) applied to the pixelwise ML estimate; for reflectivity, performance is similar to signal-dependent noise removal algorithms such as Poisson nonlocal sparse PCA and BM3D with variance-stabilizing transformation. Our framework increases photon efficiency 100-fold over traditional processing and also improves, somewhat, upon first-photon imaging under a total acquisition time constraint in raster-scanned operation. Thus, our new imager will be useful for rapid, low-power, and noise-tolerant active optical imaging, and its fixed dwell time will facilitate parallelization through use of a detector array.

164 citations
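To make the baseline concrete, here is a minimal sketch of the pixelwise maximum-likelihood reflectivity estimate under Poisson counting statistics that the paper's regularized method improves on; the grid size, efficiency, and background level are illustrative assumptions, not the authors' experimental parameters or code.

import numpy as np

rng = np.random.default_rng(0)

# Assumed toy scene: reflectivity values in [0.1, 0.9] on a small grid.
alpha_true = rng.uniform(0.1, 0.9, size=(64, 64))

N_pulses = 10     # fixed dwell time: illumination pulses per pixel
eta = 0.1         # assumed detection efficiency times per-pulse signal strength
b = 0.05          # assumed mean background/dark counts per pixel per dwell

# Poisson forward model: detected counts ~ Poisson(N_pulses * eta * reflectivity + b).
counts = rng.poisson(N_pulses * eta * alpha_true + b)

# Pixelwise ML estimate (the noisy baseline the paper improves on): invert the mean rate.
alpha_ml = np.clip((counts - b) / (N_pulses * eta), 0.0, None)

print("mean detected photons per pixel:", counts.mean())
print("pixelwise-ML RMS reflectivity error:", np.sqrt(np.mean((alpha_ml - alpha_true) ** 2)))
# The paper's estimator replaces this pixelwise inversion with one that also exploits
# spatial correlations in real-world reflectivity and 3-D structure.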


Journal ArticleDOI
TL;DR: The proposed SPGD algorithm based on Zernike polynomials improves the quality of a turbulence-distorted OAM beam and simultaneously corrects multiple OAM beams propagating through the same turbulence, reducing the crosstalk among these modes by more than 5 dB.
Abstract: A stochastic-parallel-gradient-descent algorithm (SPGD) based on Zernike polynomials is proposed to generate the phase correction pattern for a distorted orbital angular momentum (OAM) beam. The Zernike-polynomial coefficients for the correction pattern are obtained by monitoring the intensity profile of the distorted OAM beam through an iteration-based feedback loop. We implement this scheme and experimentally show that the proposed approach improves the quality of the turbulence-distorted OAM beam. Moreover, we apply phase correction patterns derived from a probe OAM beam through emulated turbulence to correct other OAM beams transmitted through the same turbulence. Our experimental results show that the patterns derived this way simultaneously correct multiple OAM beams propagating through the same turbulence, and the crosstalk among these modes is reduced by more than 5 dB.

98 citations
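The following is a minimal sketch of the stochastic-parallel-gradient-descent loop over Zernike coefficients described above; the scalar quality metric and the simulated feedback are stand-ins (assumptions) for the experiment's measured intensity profile.

import numpy as np

rng = np.random.default_rng(1)

N_ZERNIKE = 10                            # number of Zernike coefficients being optimized
c_turb = rng.normal(0.0, 1.0, N_ZERNIKE)  # assumed (unknown) turbulence expansion coefficients

def quality_metric(c_corr):
    """Stand-in for the measured intensity-profile metric (higher is better):
    negative residual aberration power after applying the correction pattern."""
    residual = c_turb + c_corr
    return -np.sum(residual ** 2)

# SPGD loop: apply +/- random perturbations to all coefficients in parallel,
# measure the metric change, and step along the estimated gradient.
c = np.zeros(N_ZERNIKE)
gain, delta = 0.5, 0.05
for _ in range(2000):
    p = delta * rng.choice([-1.0, 1.0], N_ZERNIKE)
    dJ = quality_metric(c + p) - quality_metric(c - p)
    c += gain * dJ * p

print("residual aberration power:", -quality_metric(c))  # approaches 0 as the correction converges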


Journal ArticleDOI
TL;DR: This work investigates the sensing of a data-carrying Gaussian beacon on a separate wavelength as a means to provide the information necessary to compensate for the effects of atmospheric turbulence on orbital angular momentum (OAM) and polarization-multiplexed beams in a free-space optical link.
Abstract: We investigate the sensing of a data-carrying Gaussian beacon on a separate wavelength as a means to provide the information necessary to compensate for the effects of atmospheric turbulence on orbital angular momentum (OAM) and polarization-multiplexed beams in a free-space optical link. The influence of the Gaussian beacon's wavelength on the compensation of the OAM beams at 1560 nm is experimentally studied. It is found that the compensation performance degrades slowly with the increase in the beacon's wavelength offset, in the 1520-1590 nm band, from the OAM beams. Using this scheme, we experimentally demonstrate a 1 Tbit/s OAM and polarization-multiplexed link through emulated dynamic turbulence with a data-carrying beacon at 1550 nm. The experimental results show that the turbulence effects on all 10 data channels, each carrying a 100 Gbit/s signal, are mitigated efficiently, and the power penalties after compensation are below 5.9 dB for all channels. The results of our work might be helpful for the future implementation of a high-capacity OAM, polarization and wavelength-multiplexed free-space optical link that is affected by atmospheric turbulence.

44 citations
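One reason a beacon at a modestly offset wavelength still works is that, in the usual thin-phase-screen picture, the turbulence-induced optical path difference is nearly achromatic, so a correction phase measured at the beacon wavelength simply rescales to the OAM wavelength; this is a textbook approximation added here for context, not a derivation from the paper:

\[
\phi_{\mathrm{OAM}}(\mathbf{r}) = \frac{2\pi}{\lambda_{\mathrm{OAM}}}\,\mathrm{OPD}(\mathbf{r})
= \frac{\lambda_{\mathrm{beacon}}}{\lambda_{\mathrm{OAM}}}\,\phi_{\mathrm{beacon}}(\mathbf{r}),
\qquad \frac{1520\ \mathrm{nm}}{1560\ \mathrm{nm}} \approx 0.974 .
\]

Across the 1520-1590 nm band this scale factor stays within a few percent of unity, consistent with the slow degradation with beacon-wavelength offset reported above.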


Journal ArticleDOI
TL;DR: This work introduces a novel framework for accurate depth imaging using a small number of detected photons in the presence of an unknown, possibly spatially varying amount of background light; it outperforms conventional pixelwise log-matched filtering by a factor of 6.1 in absolute depth error.
Abstract: Light detection and ranging systems reconstruct scene depth from time-of-flight measurements. For low light-level depth imaging applications, such as remote sensing and robot vision, these systems use single-photon detectors that resolve individual photon arrivals. Even so, they must detect a large number of photons to mitigate Poisson shot noise and reject anomalous photon detections from background light. We introduce a novel framework for accurate depth imaging using a small number of detected photons in the presence of an unknown amount of background light that may vary spatially. It employs a Poisson observation model for the photon detections plus a union-of-subspaces constraint on the discrete-time flux from the scene at any single pixel. Together, they enable a greedy signal-pursuit algorithm to rapidly and simultaneously converge on accurate estimates of scene depth and background flux, without any assumptions on spatial correlations of the depth or background flux. Using experimental single-photon data, we demonstrate that our proposed framework recovers depth features with 1.7 cm absolute error, using 15 photons per image pixel and an illumination pulse with 6.7-cm scaled root-mean-square length. We also show that our framework outperforms the conventional pixelwise log-matched filtering, which is a computationally-efficient approximation to the maximum-likelihood solution, by a factor of 6.1 in absolute depth error.

20 citations
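For intuition, here is a minimal single-pixel sketch of estimating depth from a photon-arrival-time histogram corrupted by uniform background counts; the pulse shape, bin width, photon budget, and the simple background-subtract-then-match step are illustrative assumptions and are much simpler than the paper's union-of-subspaces greedy algorithm.

import numpy as np

rng = np.random.default_rng(2)
c = 3e8                                   # speed of light, m/s

# Assumed toy parameters: 100 ps bins over a 100 ns range gate, Gaussian pulse.
bin_w = 100e-12
n_bins = 1000
t = (np.arange(n_bins) + 0.5) * bin_w
true_depth = 4.2                          # metres
t0 = 2 * true_depth / c                   # round-trip time
pulse_sigma = 200e-12

# On average 15 signal photons and 15 background photons in this pixel's histogram.
signal_rate = np.exp(-(t - t0) ** 2 / (2 * pulse_sigma ** 2))
signal_rate *= 15 / signal_rate.sum()
background_rate = np.full(n_bins, 15 / n_bins)
counts = rng.poisson(signal_rate + background_rate)

# Crude two-step estimate: (1) estimate the flat background from the median bin,
# (2) correlate the background-subtracted histogram with the pulse shape.
bkg_hat = np.median(counts)
kernel = np.exp(-(t - t.mean()) ** 2 / (2 * pulse_sigma ** 2))
score = np.correlate(counts - bkg_hat, kernel, mode="same")
t_hat = t[np.argmax(score)]
print("depth estimate (m):", c * t_hat / 2, " true depth (m):", true_depth)
# For this noise level the estimate lands within a few centimetres of the true depth.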



Posted Content
TL;DR: In this paper, the authors propose a two-way secure-communication protocol in which Alice uses an amplified spontaneous emission source while Bob employs binary phase-shift keying and an optical amplifier.
Abstract: We propose a two-way secure-communication protocol in which Alice uses an amplified spontaneous emission source while Bob employs binary phase-shift keying and an optical amplifier. Against an eavesdropper who captures all the light lost in fibers linking Alice and Bob, this protocol is capable of 3.5 Gbps quantum-secured direct communication at 50 km range. If Alice augments her terminal with a spontaneous parametric downconverter and both Alice and Bob add channel monitors, they can realize 2 Gbps quantum key distribution at that range against an eavesdropper who injects her own light into Bob's terminal. Compared with prevailing quantum key distribution methods, this protocol has the potential to significantly increase secure key rates at long distances by employing many ultrabroadband photons per key bit to mitigate channel loss.

6 citations
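To illustrate why a broadband, noise-like carrier with many modes per bit can be decoded reliably by the legitimate receiver, here is a toy baseband simulation of BPSK imprinted on the correlation between Alice's retained and returned fields; the mode counts, gains, and noise levels are assumptions chosen for illustration and do not model the actual fiber hardware or the security analysis.

import numpy as np

rng = np.random.default_rng(3)

M = 2000          # assumed modes (time-bandwidth slots) per bit
n_bits = 1000
bits = rng.integers(0, 2, n_bits)

# Alice's broadband ASE-like source: she transmits one copy and retains a correlated reference.
signal = rng.normal(0.0, 1.0, (n_bits, M))
reference = signal + 0.3 * rng.normal(0.0, 1.0, (n_bits, M))   # imperfect retained copy

# Bob applies BPSK (+1/-1 per bit) and amplifies; the return channel adds strong excess noise.
bpsk = (1 - 2 * bits)[:, None]
returned = 3.0 * bpsk * signal + 10.0 * rng.normal(0.0, 1.0, (n_bits, M))

# Alice decodes each bit by correlating the returned light with her retained reference;
# combining M modes per bit gives near-zero error even though the per-mode SNR is below 1.
decision = np.sum(returned * reference, axis=1) < 0
print("bit error rate:", np.mean(decision != bits))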


Posted Content
TL;DR: This work designs and demonstrates the first PPM-QKD protocol, whose security against collective attacks is established through continuous-variable entanglement measurements that also enable a novel decoy-state protocol performed conveniently in post-processing.
Abstract: The binary (one-bit-per-photon) encoding that most existing quantum key distribution (QKD) protocols employ puts a fundamental limit on their achievable key rates, especially under high channel loss conditions associated with long-distance fiber-optic or satellite-to-ground links. Inspired by the pulse-position-modulation (PPM) approach to photon-starved classical communications, we design and demonstrate the first PPM-QKD, whose security against collective attacks is established through continuous-variable entanglement measurements that also enable a novel decoy-state protocol performed conveniently in post processing. We achieve a throughput of 8.0 Mbit/s (2.5 Mbit/s for loss equivalent to 25 km of fiber) and secret-key capacity up to 4.0 bits per detected photon, thus demonstrating the significant enhancement afforded by high-dimensional encoding. These results point to a new avenue for realizing high-throughput satellite-based or long-haul fiber-optic quantum communications beyond their photon-reception-rate limits.

2 citations
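As a back-of-the-envelope illustration of why pulse-position modulation yields multiple raw bits per detected photon, the sketch below maps each detected photon's time-bin index to log2(M) bits; the frame size M = 16 and the loss model are assumptions, and the protocol's security processing (decoy states, entanglement checks, reconciliation, privacy amplification) is not modeled.

import numpy as np

rng = np.random.default_rng(4)

M = 16                                   # assumed PPM frame size: 16 time bins
bits_per_symbol = int(np.log2(M))        # -> 4 raw bits per detected photon
n_frames = 100_000
transmissivity = 0.01                    # assumed channel loss (-20 dB)

symbols = rng.integers(0, M, n_frames)              # Alice's random time-bin choices
detected = rng.random(n_frames) < transmissivity    # which single-photon frames Bob detects

detected_symbols = symbols[detected]
raw_key = (detected_symbols[:, None] >> np.arange(bits_per_symbol - 1, -1, -1)) & 1
print(f"{detected.sum()} detected photons -> {raw_key.size} raw key bits "
      f"({bits_per_symbol} bits per detected photon, before reconciliation and privacy amplification)")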


Proceedings ArticleDOI
10 May 2015
TL;DR: In this article, the authors report the first experimental demonstration of an entanglement-enhanced sensing system that is resilient to environmental loss and noise.
Abstract: We report the first experimental demonstration of an entanglement-enhanced sensing system that is resilient to environmental loss and noise.

2 citations


Posted Content
TL;DR: Floodlight quantum key distribution (FL-QKD) is introduced, which breaks the one photon per bit barrier and overcomes loss by exploiting a huge number of optical modes per bit.
Abstract: Existing quantum key distribution protocols typically transmit at most one photon per bit, so that the no-cloning theorem ensures their security. As a result, their key rates suffer dramatically in long-distance transmission because of channel loss. We introduce floodlight quantum key distribution (FL-QKD), which breaks the one photon per bit barrier and overcomes loss by exploiting a huge number of optical modes per bit. FL-QKD is capable of 2 Gbps secret-key rates over a 50 km fiber link. Its security follows from employing less than one photon per mode and using photon-coincidence channel monitoring.
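A rough consistency check on the "many modes per bit" idea, using assumed round numbers rather than parameters from the paper: an ultrabroadband source of optical bandwidth W run at bit rate R occupies about W/R temporal modes per bit, so the photons per bit can be large even while the photons per mode stay below one,

\[
\frac{\text{modes}}{\text{bit}} \approx \frac{W}{R} = \frac{2\ \text{THz}}{2\ \text{Gbit/s}} = 10^{3},
\qquad
\frac{\text{photons}}{\text{bit}} = N_S \times \frac{W}{R} = 0.5 \times 10^{3} = 500
\quad \text{with } N_S = 0.5 < 1 \ \text{photon per mode}.
\]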

Proceedings ArticleDOI
07 Jun 2015
TL;DR: This paper surveys the physics and computational aspects of ghost imaging, which originated as correlation-based image formation using entangled light beams and has grown to encompass a wide range of related computational approaches.
Abstract: Ghost imaging, originally correlation-based image formation using entangled light beams, has grown to encompass a wide range of related computational approaches. This paper surveys the physics and computational aspects of ghost imaging.
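As a concrete example of the computational end of this spectrum, here is a minimal computational-ghost-imaging reconstruction that correlates known random illumination patterns with bucket-detector measurements; the scene, pattern count, and noise level are illustrative assumptions rather than any specific experiment from the survey.

import numpy as np

rng = np.random.default_rng(5)

# Assumed toy scene: a bright square on a dark background.
N = 32
scene = np.zeros((N, N))
scene[10:22, 10:22] = 1.0

n_patterns = 4000
patterns = rng.random((n_patterns, N, N))                        # known random illumination patterns
bucket = np.tensordot(patterns, scene, axes=([1, 2], [0, 1]))    # single-pixel (bucket) measurements
bucket += 0.01 * bucket.std() * rng.normal(size=n_patterns)      # small measurement noise

# Ghost image: correlate bucket-signal fluctuations with pattern fluctuations at each pixel.
ghost = np.tensordot(bucket - bucket.mean(),
                     patterns - patterns.mean(axis=0), axes=(0, 0)) / n_patterns

print("correlation of reconstruction with true scene:",
      np.corrcoef(ghost.ravel(), scene.ravel())[0, 1])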

Proceedings ArticleDOI
10 May 2015
TL;DR: In this article, a framework is established for evaluating CPHASE gates that use single-photon Kerr nonlinearities in which one pulse overtakes another, and it is shown that causality-induced phase noise precludes the possibility of high-fidelity π-radian conditional phase shifts.
Abstract: We establish a framework for evaluating CPHASE gates that use single-photon Kerr nonlinearities in which one pulse overtakes another. We show that causality-induced phase noise precludes the possibility of high-fidelity π-radian conditional phase shifts.

Journal Article
TL;DR: In this paper, the authors demonstrate two high-dimensional QKD protocols, secure against collective Gaussian attacks, yielding up to 8.6 secure bits per photon and 6.7 Mb/s throughput.
Abstract: We demonstrate two high-dimensional QKD protocols - secure against collective Gaussian attacks - yielding up to 8.6 secure bits per photon and 6.7 Mb/s throughput, with 6.9 bits per photon after transmission through 20 km of fiber.


Proceedings ArticleDOI
07 Jun 2015
TL;DR: Theory and proof-of-principle demonstrations are reviewed for two recent computational imaging approaches that are capable of forming high-quality range and reflectivity images from ~1 detected photon per pixel.
Abstract: Theory and proof-of-principle demonstrations are reviewed for two recent computational imaging approaches that are capable of forming high-quality range and reflectivity images from ~1 detected photon per pixel.