Andy Shih

Researcher at University of California, Los Angeles

Publications - 17
Citations - 375

Andy Shih is an academic researcher from the University of California, Los Angeles. The author has contributed to research in topics including Probabilistic logic and Graphical models. The author has an h-index of 8 and has co-authored 17 publications receiving 253 citations. Previous affiliations of Andy Shih include Stanford University.

Papers
Posted Content

On the Opportunities and Risks of Foundation Models.

Rishi Bommasani, +113 more
16 Aug 2021
TL;DR: The authors provide a thorough account of the opportunities and risks of foundation models, ranging from their capabilities (e.g., language, vision, robotics, reasoning, human interaction) and technical principles (e.g., model architectures, training procedures, data, systems, security, evaluation, theory) to their applications.
Proceedings ArticleDOI

A Symbolic Approach to Explaining Bayesian Network Classifiers.

TL;DR: In this article, the authors propose an approach for explaining Bayesian network classifiers, which is based on compiling such classifiers into decision functions that have a tractable and symbolic form.
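A compiled, tractable decision function is what makes explanation queries such as minimal sufficient reasons answerable. As a rough illustration only (not the paper's compilation method), the sketch below brute-forces such a query on a hand-written toy decision function over three binary features; classify and sufficient_reason are hypothetical names.

```python
# Illustrative sketch only: brute-force a "sufficient reason" query on a toy
# decision function. The compiled symbolic form in the paper answers such
# queries tractably; this code merely shows what the query asks.
from itertools import combinations, product

def classify(x):
    # Hypothetical toy decision function over 3 binary features.
    return int(x[0] and (x[1] or x[2]))

def sufficient_reason(x):
    """Smallest set of fixed features that forces classify's output on x."""
    label, n = classify(x), len(x)
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            free = [i for i in range(n) if i not in subset]
            # Keep the subset fixed and check every completion of the rest.
            if all(
                classify(tuple(x[i] if i in subset else v[free.index(i)]
                               for i in range(n))) == label
                for v in product([0, 1], repeat=len(free))
            ):
                return {i: x[i] for i in subset}

print(sufficient_reason((1, 1, 0)))  # {0: 1, 1: 1}: fixing x0=1, x1=1 suffices
```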
Journal ArticleDOI

Compiling Bayesian Network Classifiers into Decision Graphs.

TL;DR: An algorithm is proposed for compiling Bayesian network classifiers into decision graphs that mimic the input-output behavior of the classifiers; these graphs are tractable and can be exponentially smaller than decision trees.
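For illustration, the sketch below builds a reduced ordered decision graph for a black-box Boolean function by Shannon expansion with node merging. This only shows the target representation and its reductions; the paper's algorithm compiles the graph directly from the Bayesian network classifier rather than enumerating the function. The name compile_odd and the parity example are hypothetical.

```python
# Illustrative sketch only: reduced ordered decision graph via Shannon
# expansion with a unique table (merging) and redundant-test elimination.
def compile_odd(f, n):
    FALSE, TRUE = 0, 1                    # terminal node ids
    unique = {}                           # (var, lo, hi) -> internal node id

    def build(prefix):
        var = len(prefix)
        if var == n:
            return TRUE if f(prefix) else FALSE
        lo = build(prefix + (0,))         # branch x_var = 0
        hi = build(prefix + (1,))         # branch x_var = 1
        if lo == hi:                      # redundant test: skip the node
            return lo
        key = (var, lo, hi)
        if key not in unique:             # merge isomorphic sub-graphs
            unique[key] = len(unique) + 2
        return unique[key]

    return build(()), unique

# Hypothetical toy target: odd parity of 3 bits (5 internal nodes, 2 terminals).
root, nodes = compile_odd(lambda x: sum(x) % 2 == 1, 3)
print(root, nodes)
```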
Book ChapterDOI

Verifying Binarized Neural Networks by Angluin-Style Learning

TL;DR: An Angluin-style learning algorithm is proposed for compiling a binarized neural network, on a given input region, into an Ordered Binary Decision Diagram (OBDD), using a SAT solver as the equivalence oracle; the resulting OBDD can then be used to verify the network's behavior.
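As a rough illustration of the counterexample-guided loop (not the paper's OBDD learner, and with brute-force enumeration standing in for the SAT-based equivalence oracle), the sketch below learns an exact lookup-table model of a toy binarized network; target_bnn, equivalence_oracle, and learn are hypothetical names.

```python
# Illustrative sketch only: learn an exact model of a target function by
# repeatedly asking an equivalence oracle for a counterexample. In the paper
# the oracle is a SAT solver and the hypothesis is an OBDD.
from itertools import product

def target_bnn(x):
    # Hypothetical stand-in for a binarized neural network on 3 binary inputs.
    return int(sum(x) >= 2)

def equivalence_oracle(hypothesis, target, n):
    for x in product([0, 1], repeat=n):
        if hypothesis(x) != target(x):
            return x                        # counterexample found
    return None                             # hypothesis is exact

def learn(target, n):
    table = {}                              # remembered input/output pairs
    hypothesis = lambda x: table.get(x, 0)
    while True:
        cex = equivalence_oracle(hypothesis, target, n)
        if cex is None:
            return table                    # exact model of the target
        table[cex] = target(cex)            # refine on the counterexample

print(learn(target_bnn, 3))                 # all inputs the target maps to 1
```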
Proceedings ArticleDOI

On Tractable Representations of Binary Neural Networks.

TL;DR: A more efficient approach to compiling neural networks is considered, based on a pseudo-polynomial time algorithm for compiling a single neuron, and it is shown to be feasible to obtain compact representations of neural networks as Sentential Decision Diagrams (SDDs).
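The pseudo-polynomial idea for a single neuron can be illustrated with dynamic programming over reachable partial weighted sums: the compiled size is bounded by the number of distinct partial sums per layer rather than by 2^n. The sketch below (hypothetical names; a plain layered decision graph rather than an SDD as the target; integer weights assumed) only illustrates that bound, not the paper's algorithm.

```python
# Illustrative sketch only: compile a single neuron [w . x >= t] over binary
# inputs into a layered decision graph, memoizing on the partial sum so the
# graph size is pseudo-polynomial in the weight magnitudes.
def compile_neuron(w, t):
    n = len(w)
    TRUE, FALSE = "T", "F"
    layers = [dict() for _ in range(n + 1)]     # partial sum -> node per layer

    def node(i, s):
        """Node deciding the neuron given that the first i inputs sum to s."""
        if i == n:
            return TRUE if s >= t else FALSE
        if s not in layers[i]:
            lo = node(i + 1, s)                 # x_i = 0
            hi = node(i + 1, s + w[i])          # x_i = 1
            layers[i][s] = lo if lo == hi else (i, lo, hi)
        return layers[i][s]

    return node(0, 0)

# Hypothetical example: a majority neuron over 3 inputs, x1 + x2 + x3 >= 2.
print(compile_neuron([1, 1, 1], 2))
```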