
Showing papers on "Softmax function published in 1995"


Proceedings ArticleDOI
27 Nov 1995
TL;DR: The max-min propagation neural network model is considered as a hierarchical mixture of experts by replacing the max (min) units with softmax functions, and a gradient ascent algorithm and an expectation-maximization algorithm are presented.
Abstract: The max-min propagation neural network model is considered as a hierarchical mixture of experts by replacing the max (min) units with softmax functions. The resulting mixture differs from the model of Jordan and Jacobs, but we exploit the similarities between the two models to derive a probability model. Learning is treated as a maximum-likelihood problem; in particular, we present a gradient ascent algorithm and an expectation-maximization algorithm. Simulation results on the parity problem and the majority problem are reported.

17 citations
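The key substitution described in this abstract is replacing a hard max unit with a softmax, which is differentiable and so admits gradient-based maximum-likelihood training. A minimal sketch of that idea in numpy (the `beta` temperature parameter and the example expert outputs are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def softmax(z, beta=1.0):
    """Numerically stable softmax; as beta grows it approaches a hard max."""
    z = beta * np.asarray(z, dtype=float)
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Hypothetical outputs of three experts in a mixture
outputs = np.array([0.2, 1.5, 0.9])

# Soft gating weights over the experts (sum to 1, all differentiable)
weights = softmax(outputs, beta=5.0)

# Smooth surrogate for max(outputs): a weighted blend of expert outputs
blended = float(np.dot(weights, outputs))
```

With a large `beta` the weights concentrate on the largest output, recovering max-unit behaviour in the limit, while any finite `beta` keeps the model smooth enough for gradient ascent or EM.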


Proceedings Article
27 Nov 1995
TL;DR: The benchmarks present evidence that soft assign has clear advantages in accuracy, speed, parallelizability, and algorithmic simplicity over softmax with a penalty term in optimization problems with two-way constraints.
Abstract: A new technique, termed soft assign, is applied for the first time to two classic combinatorial optimization problems, the traveling salesman problem and graph partitioning. Soft assign, which has emerged from the recurrent neural network/statistical physics framework, enforces two-way (assignment) constraints without the use of penalty terms in the energy functions. Soft assign can also be generalized from two-way winner-take-all constraints to the multiple membership constraints required for graph partitioning. The soft assign technique is compared to the softmax (Potts glass). Within the statistical physics framework, softmax combined with a penalty term has been a widely used method for enforcing the two-way constraints common to many combinatorial optimization problems. The benchmarks present evidence that soft assign has clear advantages in accuracy, speed, parallelizability, and algorithmic simplicity over softmax with a penalty term in optimization problems with two-way constraints.

12 citations
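The contrast drawn in this abstract is that softmax normalizes only one way (e.g. each row sums to 1), so the other assignment constraint must be enforced with a penalty term, whereas soft assign enforces both row and column constraints directly by alternating normalizations (Sinkhorn balancing). A minimal sketch of that mechanism, assuming an illustrative 3x3 benefit matrix and temperature (none of these values come from the paper):

```python
import numpy as np

def softassign(M, beta=10.0, n_iter=50):
    """Push a benefit matrix toward a doubly stochastic (soft assignment) matrix.

    Exponentiate the benefits, then alternately normalize rows and columns
    (Sinkhorn balancing), enforcing both two-way constraints without any
    penalty term in an energy function.
    """
    A = np.exp(beta * (M - M.max()))       # positive entries, stable scaling
    for _ in range(n_iter):
        A = A / A.sum(axis=1, keepdims=True)   # each row sums to 1
        A = A / A.sum(axis=0, keepdims=True)   # each column sums to 1
    return A

# Hypothetical benefit matrix: entry (i, j) is the benefit of assigning i to j.
M = np.array([[0.9, 0.1, 0.2],
              [0.2, 0.8, 0.1],
              [0.1, 0.3, 0.7]])

A = softassign(M, beta=20.0)
```

At high `beta` the balanced matrix approaches a permutation matrix (here the diagonal assignment), which is what makes the technique usable inside deterministic-annealing schemes for problems like TSP; a plain softmax over rows would satisfy only one of the two constraints.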