scispace - formally typeset

Sho Yaida

Researcher at Duke University

Publications -  41
Citations -  2515

Sho Yaida is an academic researcher at Duke University. He has contributed to research on the topics Slowdown and Artificial neural network, has an h-index of 17, and has co-authored 38 publications receiving 2,228 citations. His previous affiliations include the Massachusetts Institute of Technology and Stanford University.

Papers
Journal ArticleDOI

Viscosity Bound Violation in Higher Derivative Gravity

TL;DR: The authors consider the shear viscosity to entropy density ratio in conformal field theories dual to Einstein gravity with curvature-squared corrections, and find that its value can be adjusted to any positive value, from infinity down to zero.
Journal ArticleDOI

Viscosity Bound and Causality Violation

TL;DR: It is argued, in the context of the same model, that tuning eta/s below (16/25)(1/(4 pi)) induces microcausality violation in the CFT, rendering the theory inconsistent and supporting the idea of a possible universal lower bound on eta/s for all consistent theories.
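The two TL;DRs above can be summarized in one line; this is a sketch in natural units (hbar = k_B = 1), where the 1/(4 pi) value is the widely conjectured KSS bound and the 16/25 prefactor is the causality-derived value quoted above:

```latex
\frac{\eta}{s} \;\ge\; \frac{1}{4\pi}
\quad \text{(KSS conjecture)},
\qquad
\frac{\eta}{s} \;\ge\; \frac{16}{25}\cdot\frac{1}{4\pi}
\quad \text{(causality bound in the curvature-squared model)} .
```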
Journal ArticleDOI

Holographic lattices, dimers, and glasses

TL;DR: In this article, a periodic lattice of localized fermionic impurities within a plasma medium is holographically engineered by placing an array of probe D5-branes in the background produced by N D3-branes.
Journal ArticleDOI

Configurational entropy measurements in extremely supercooled liquids that break the glass ceiling.

TL;DR: The colossal gap between experiments and simulations is closed, and in silico configurations that have no experimental analog yet are created; measurements consistently confirm that the steep entropy decrease observed in experiments is also found in simulations, even beyond the experimental glass transition.
Posted Content

Robust Learning with Jacobian Regularization

TL;DR: The stabilizing effect of the Jacobian regularizer leads to significant improvements in robustness, as measured against both random and adversarial input perturbations, without severely degrading generalization properties on clean data.
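The regularizer described above penalizes the Frobenius norm of the network's input-output Jacobian. A minimal sketch of the idea, assuming a toy linear model f(x) = W x and a hypothetical finite-difference helper (not the authors' implementation): for a linear map the Jacobian is just W, so the penalty (lambda/2)||J||_F^2 can be checked directly.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))  # weights of the toy model f(x) = W x

def f(x):
    return W @ x

def jacobian_fd(fn, x, eps=1e-6):
    """Finite-difference estimate of the Jacobian df_i/dx_j at x."""
    y0 = fn(x)
    J = np.zeros((y0.size, x.size))
    for j in range(x.size):
        dx = np.zeros_like(x)
        dx[j] = eps
        J[:, j] = (fn(x + dx) - y0) / eps
    return J

x = rng.normal(size=5)
J = jacobian_fd(f, x)

# Jacobian penalty added to the training loss: (lambda/2) * ||J||_F^2,
# here with lambda = 1 for illustration.
penalty = 0.5 * np.sum(J**2)

# Sanity check: for a linear model the input-output Jacobian equals W.
assert np.allclose(J, W, atol=1e-4)
```

In practice the Jacobian would be obtained by automatic differentiation rather than finite differences, and the penalty would be added to the task loss with a small weight lambda before backpropagation.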