Nikola Ciganović

Researcher at Imperial College London

Publications - 6
Citations - 143

Nikola Ciganović is an academic researcher from Imperial College London. The author has contributed to research in topics: Hair cell & Cochlea. The author has an h-index of 4 and has co-authored 6 publications receiving 112 citations.

Papers
Journal ArticleDOI

Smooth Max-Information as One-Shot Generalization for Mutual Information

TL;DR: It is demonstrated that different smoothed definitions of the max-information are essentially equivalent, which allows new chain rules for the max-information to be derived in terms of min- and max-entropies, thus extending the smooth entropy formalism to mutual information.
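
For orientation, one common form of these definitions from the one-shot information theory literature is sketched below; the precise conventions (which marginal is optimized over, and how the smoothing ball is chosen) vary between variants, and the paper's result is precisely that such variants essentially coincide.

\[
D_{\max}(\rho \,\|\, \sigma) = \min\{\lambda \in \mathbb{R} : \rho \le 2^{\lambda}\sigma\},
\qquad
I_{\max}(A:B)_{\rho} = \min_{\sigma_B} D_{\max}(\rho_{AB} \,\|\, \rho_A \otimes \sigma_B),
\]
\[
I_{\max}^{\varepsilon}(A:B)_{\rho} = \min_{\bar{\rho}_{AB} \in \mathcal{B}^{\varepsilon}(\rho_{AB})} I_{\max}(A:B)_{\bar{\rho}},
\]
where \(\mathcal{B}^{\varepsilon}(\rho_{AB})\) is a ball of states \(\varepsilon\)-close to \(\rho_{AB}\), for example in purified distance.
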
Journal ArticleDOI

Minimal basilar membrane motion in low-frequency hearing

TL;DR: It is shown that low-frequency sound moves a small portion of the basilar membrane, and that the motion declines in an exponential manner across the basilar membrane. Hence, the response of the hearing organ to speech-frequency sounds differs from that evident in high-frequency cochlear regions.
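
As a purely illustrative sketch (the notation is assumed here, not taken from the paper), such an exponential decline can be written as

\[
A(x) \approx A_0 \, e^{-x/\lambda},
\]
where \(A(x)\) is the vibration amplitude at transverse position \(x\) on the basilar membrane, \(A_0\) a reference amplitude, and \(\lambda\) a characteristic decay length.
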
Journal ArticleDOI

Hair bundles of cochlear outer hair cells are shaped to minimize their fluid-dynamic resistance

TL;DR: Analytical and computational methods are used to study the fluid flow across rows of differently shaped hair bundles; the results show that rows of V-shaped hair bundles have a considerably reduced resistance to cross-flow, and that the biologically observed shapes of the hair bundles of outer hair cells are near-optimal in this regard.
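
For context, fluid-dynamic resistance in the low-Reynolds-number regime relevant at these length scales can be understood as a drag coefficient relating force to flow velocity; the notation below is an illustrative assumption rather than the paper's exact formulation:

\[
R = \frac{F_{\mathrm{drag}}}{U},
\]
so a row of bundles with smaller \(R\) obstructs a cross-flow of velocity \(U\) with less drag force \(F_{\mathrm{drag}}\).
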
Journal ArticleDOI

Static length changes of cochlear outer hair cells can tune low-frequency hearing.

TL;DR: A geometry-based kinematic model of the apical organ of Corti is developed that reproduces salient, yet counter-intuitive features of the organ’s motion and uncovers a mechanism by which a static length change of the outer hair cells can sensitively tune the signal transmitted to the sensory inner hair cells.
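
In schematic terms (the symbols are assumed here for illustration and are not the model's own notation), a kinematic model of this kind maps basilar-membrane displacement to the stimulus reaching the inner hair cells through the instantaneous geometry, so a static change in outer-hair-cell length shifts the local gain:

\[
S = g(d;\, L_{\mathrm{OHC}}), \qquad \text{gain} = \left.\frac{\partial g}{\partial d}\right|_{L_{\mathrm{OHC}}},
\]
where \(d\) is the basilar-membrane displacement, \(L_{\mathrm{OHC}}\) the static outer-hair-cell length, and \(S\) the signal transmitted to the inner hair cells; a change \(\Delta L_{\mathrm{OHC}}\) alters the gain and thereby the tuning.
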
Journal ArticleDOI

Smooth Max-Information as One-Shot Generalization for Mutual Information

TL;DR: In this paper, the authors studied the formal properties of the smooth max-information, a one-shot generalization of the von Neumann mutual information derived from the max-relative entropy, and showed that the different smoothed definitions are essentially equivalent (up to logarithmic terms in the smoothing parameters).