
Showing papers on "Softmax function published in 1996"


01 Aug 1996
TL;DR: The benchmarks present evidence that softassign has clear advantages in accuracy, speed, parallelizability and algorithmic simplicity over softmax with a penalty term in optimization problems with two-way constraints.
Abstract: A new technique, termed softassign, is applied to three combinatorial optimization problems: weighted graph matching, the traveling salesman problem and graph partitioning. Softassign, which has emerged from the recurrent neural network/statistical physics framework, enforces two-way (assignment) constraints without the use of penalty terms in the energy functions. The softassign can also be generalized from two-way winner-take-all constraints to the multiple membership constraints required for graph partitioning. The softassign technique is compared to softmax (Potts glass) dynamics. Within the statistical physics framework, softmax with a penalty term has been a widely used method for enforcing the two-way constraints common to many combinatorial optimization problems. The benchmarks present evidence that softassign has clear advantages in accuracy, speed, parallelizability and algorithmic simplicity over softmax with a penalty term in optimization problems with two-way constraints.

104 citations
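The mechanism behind softassign is worth seeing concretely: a benefit matrix is exponentiated, as in softmax, and then alternately row- and column-normalized (Sinkhorn balancing) until it is nearly doubly stochastic, satisfying both assignment constraints without a penalty term. Below is a minimal NumPy sketch; the function name, the fixed inverse temperature beta, and the iteration cap are illustrative assumptions, not the paper's exact formulation.

import numpy as np

def softassign(Q, beta=1.0, n_iters=50, tol=1e-6):
    # Exponentiate the benefit matrix (softmax-like positivity step).
    M = np.exp(beta * Q)
    # Sinkhorn balancing: alternate row and column normalization until
    # M is (approximately) doubly stochastic.
    for _ in range(n_iters):
        M_prev = M.copy()
        M = M / M.sum(axis=1, keepdims=True)  # rows sum to one
        M = M / M.sum(axis=0, keepdims=True)  # columns sum to one
        if np.abs(M - M_prev).max() < tol:
            break
    return M

# Toy usage: a 3x3 benefit matrix that favors the identity assignment.
Q = np.array([[2.0, 0.1, 0.0],
              [0.0, 2.0, 0.1],
              [0.1, 0.0, 2.0]])
print(softassign(Q, beta=5.0).round(3))

In the softassign literature, beta is typically annealed upward during optimization so that the balanced matrix hardens toward a 0/1 permutation matrix.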


Proceedings Article
03 Dec 1996
TL;DR: A new reinforcement learning architecture for nonlinear control is proposed that enables both efficient use of the value function and simple computation for real-time implementation.
Abstract: A new reinforcement learning architecture for nonlinear control is proposed. A direct feedback controller, or the actor, is trained by a value-gradient based controller, or the tutor. This architecture enables both efficient use of the value function and simple computation for real-time implementation. Good performance was verified in multi-dimensional nonlinear control tasks using Gaussian softmax networks.

42 citations
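A Gaussian softmax network, as used for the controllers above, is a normalized Gaussian basis-function network: Gaussian unit activations are normalized to sum to one (a softmax across units) and combined by a linear readout, giving a smooth, local function approximator for the actor or the value function. Here is a minimal sketch under that reading; the shapes, names, and parameter values are illustrative assumptions rather than the paper's setup.

import numpy as np

def gaussian_softmax(x, centers, widths, weights):
    # Gaussian activation of each basis unit around its center.
    d2 = np.sum((centers - x) ** 2, axis=1) / (2.0 * widths ** 2)
    a = np.exp(-d2)
    # Softmax-style normalization: activations sum to one.
    g = a / a.sum()
    # Linear readout, e.g. the actor's control output.
    return weights @ g

# Toy usage: five basis functions tiling a 1-D state space.
centers = np.linspace(-1.0, 1.0, 5).reshape(-1, 1)
widths = np.full(5, 0.5)
weights = np.random.default_rng(0).normal(size=(1, 5))
print(gaussian_softmax(np.array([0.2]), centers, widths, weights))

The normalization makes the units partition the state space softly, so the readout interpolates smoothly between locally learned values, which suits real-time control.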


Proceedings Article (DOI)
03 Jun 1996
TL;DR: The benchmarks present evidence that softassign has clear advantages in accuracy, speed, and algorithmic simplicity over softmax with a penalty term in this weighted graph matching problem.
Abstract: A new technique, termed softassign, is applied to weighted graph matching. Softassign, which has emerged from the recurrent neural network/statistical physics framework, enforces two-way (assignment) constraints without the use of penalty terms in the energy functions. The softassign technique is compared to softmax (Potts glass) dynamics. Within the statistical physics framework, softmax with a penalty term has been a widely used method for enforcing the two-way constraints common to many combinatorial optimization problems. The benchmarks present evidence that softassign has clear advantages in accuracy, speed, and algorithmic simplicity over softmax with a penalty term in this weighted graph matching problem.

11 citations
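For contrast with the softassign sketch above, plain softmax (Potts glass) dynamics normalizes each row independently: only the one-way winner-take-all constraint is satisfied, and the second (column) constraint must be pushed into the energy function as a penalty term. A minimal sketch of that baseline, again with illustrative names and values:

import numpy as np

def softmax_rows(Q, beta=1.0):
    # Row-wise softmax: each row sums to one, but columns are unconstrained.
    E = np.exp(beta * Q)
    return E / E.sum(axis=1, keepdims=True)

# Two rows prefer the same column, which a row-wise softmax cannot prevent;
# softassign's column normalization would resolve the conflict directly.
Q = np.array([[2.0, 0.1, 0.0],
              [2.0, 0.0, 0.1],
              [0.1, 0.0, 2.0]])
M = softmax_rows(Q, beta=5.0)
print(M.round(3))
print("column sums:", M.sum(axis=0).round(3))  # not all 1: column constraint violated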