
Alberto Prieto

Researcher at University of Granada

Publications: 248
Citations: 4450

Alberto Prieto is an academic researcher from the University of Granada. The author has contributed to research in topics: Artificial neural network & Fuzzy logic. The author has an h-index of 34 and has co-authored 248 publications receiving 4285 citations. Previous affiliations of Alberto Prieto include the Royal Institute of Technology & Cisco Systems, Inc.

Papers
Journal ArticleDOI

Use of Phase in Brain-Computer Interfaces based on Steady-State Visual Evoked Potentials

TL;DR: This paper presents an experiment, based on the amplitude modulation (AM) of flickering stimuli, demonstrating first that the phase shifts of different stimuli can be recovered from those of the corresponding SSVEPs without the need for a real-time system, and second that this information can be used efficiently to develop a BCI based on the classification of the phase shifts of the SSVEPs.
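A minimal sketch of the phase-classification idea described above (illustrative only, not the authors' code): all targets flicker at the same frequency but with distinct phase shifts, and a response is classified by comparing the phase of its spectral peak against the known target phases. The sampling rate, flicker frequency, and target phases below are assumptions.

```python
import numpy as np

FS = 250.0          # sampling rate in Hz (assumed)
F_STIM = 15.0       # common flicker frequency in Hz (assumed)
TARGET_PHASES = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]  # four hypothetical targets

def extract_phase(signal, fs, f_stim):
    """Phase of the spectral component at the stimulation frequency."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    bin_idx = int(np.argmin(np.abs(freqs - f_stim)))
    return np.angle(spectrum[bin_idx])

def classify_by_phase(signal, fs=FS, f_stim=F_STIM, targets=TARGET_PHASES):
    """Pick the target whose phase is circularly closest to the SSVEP phase."""
    phase = extract_phase(signal, fs, f_stim)
    # circular distance, so that e.g. -0.1 rad is close to 2*pi - 0.1 rad
    dists = [np.abs(np.angle(np.exp(1j * (phase - t)))) for t in targets]
    return int(np.argmin(dists))

# Simulate a noisy SSVEP phase-locked to target 2 (phase pi)
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
ssvep = np.cos(2 * np.pi * F_STIM * t + np.pi) + 0.3 * rng.standard_normal(t.size)
print(classify_by_phase(ssvep))  # → 2
```

With 2 s of data at 250 Hz the 15 Hz component falls exactly on an FFT bin, so the stimulus phase survives the transform intact even under moderate noise.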
Journal ArticleDOI

Online global learning in direct fuzzy controllers

TL;DR: A novel approach achieves real-time global learning in direct fuzzy controllers using a one-step algorithm; the global learning performed leads to an enhanced control policy while avoiding overfitting.
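One way to picture one-step global learning in a fuzzy system is recursive least squares over all rule consequents, so each incoming sample updates every rule jointly rather than only the locally active one. The sketch below is an illustration of that idea, not the paper's algorithm; the membership layout and learning task are assumptions.

```python
import numpy as np

class OnlineFuzzyApproximator:
    """Zero-order Takagi-Sugeno system with RLS-updated consequents."""

    def __init__(self, centres, width):
        self.centres = np.asarray(centres)
        self.width = width
        self.theta = np.zeros(len(self.centres))    # rule consequents
        self.P = np.eye(len(self.centres)) * 1e3    # RLS inverse covariance

    def firing(self, x):
        """Normalised triangular membership degrees."""
        mu = np.maximum(0.0, 1.0 - np.abs(x - self.centres) / self.width)
        s = mu.sum()
        return mu / s if s > 0 else np.full_like(mu, 1.0 / len(mu))

    def predict(self, x):
        return self.firing(x) @ self.theta

    def update(self, x, target):
        """One RLS step: a global update of every consequent."""
        phi = self.firing(x)
        k = self.P @ phi / (1.0 + phi @ self.P @ phi)     # Kalman-style gain
        self.theta = self.theta + k * (target - phi @ self.theta)
        self.P = self.P - np.outer(k, phi @ self.P)

# Learn a nonlinear mapping online from streaming samples
rng = np.random.default_rng(0)
fz = OnlineFuzzyApproximator(centres=np.linspace(-1, 1, 9), width=0.25)
for _ in range(3000):
    x = rng.uniform(-1, 1)
    fz.update(x, np.sin(np.pi * x))
print(f"{fz.predict(0.5):.3f}")
```

Because the RLS gain couples all consequents through the covariance matrix, samples landing in one region still refine the global least-squares solution, which is the intuition behind "global" rather than purely local adaptation.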
Journal ArticleDOI

Analysis of the Functional Block Involved in the Design of Radial Basis Function Networks

TL;DR: In the present contribution, the relevance and relative importance of the parameters involved in such a design are investigated using a statistical tool, the ANalysis Of VAriance (ANOVA), and various problems of classification, functional approximation and time-series estimation are analysed.
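To make the ANOVA-based methodology concrete, here is a toy illustration (not the paper's experiment): treat the number of RBF centres as a factor, run replicate trainings per level on a small regression task, and test whether the factor has a significant effect on test error.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(1)

def rbf_fit_error(n_centres, width=0.5, n_train=80, n_test=40):
    """Train an RBF net by linear least squares; return test RMSE on sin(x)."""
    x_tr = rng.uniform(0, 2 * np.pi, n_train)
    x_te = rng.uniform(0, 2 * np.pi, n_test)
    y_tr = np.sin(x_tr) + 0.1 * rng.standard_normal(n_train)
    centres = rng.uniform(0, 2 * np.pi, n_centres)
    phi = lambda x: np.exp(-((x[:, None] - centres[None, :]) ** 2)
                           / (2 * width ** 2))
    w, *_ = np.linalg.lstsq(phi(x_tr), y_tr, rcond=None)
    return float(np.sqrt(np.mean((phi(x_te) @ w - np.sin(x_te)) ** 2)))

# 10 replicate runs per level of the "number of centres" factor
errors = {k: [rbf_fit_error(k) for _ in range(10)] for k in (3, 10, 25)}
f_stat, p_value = f_oneway(*errors.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4g}")
```

A large F (small p) indicates that the chosen design parameter matters for performance; the paper applies the same logic to several RBF design parameters at once, which this single-factor sketch does not attempt.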
Book ChapterDOI

Solving Master Mind Using GAs and Simulated Annealing: A Case of Dynamic Constraint Optimization

TL;DR: This paper shows that algorithms following the optimal strategy behave similarly, finding the correct combination in roughly the same number of guesses; between them, GA examines fewer combinations, and this advantage grows with the size of the search space, while SA is much faster (around two orders of magnitude) and gives a good-enough answer.
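As a rough illustration of the GA side of the comparison (not the paper's algorithm, which plays against feedback from an optimal strategy), here is a minimal genetic algorithm searching the 6-colour, 4-peg Mastermind space; the simplified fitness below counts exact-position matches against the secret, which a real player would only learn indirectly from guesses.

```python
import random

COLOURS, PEGS = 6, 4
random.seed(42)
secret = [random.randrange(COLOURS) for _ in range(PEGS)]

def fitness(code):
    """Simplified fitness: exact-position matches with the secret."""
    return sum(a == b for a, b in zip(code, secret))

def evolve(pop_size=40, generations=200, p_mut=0.2):
    pop = [[random.randrange(COLOURS) for _ in range(PEGS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == PEGS:
            return pop[0]
        parents = pop[: pop_size // 2]          # truncation selection
        children = [pop[0][:]]                  # elitism: keep the best
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, PEGS)     # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < p_mut:         # point mutation
                child[random.randrange(PEGS)] = random.randrange(COLOURS)
            children.append(child)
        pop = children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

The GA in the paper is credited with examining fewer candidate combinations than SA as the space grows; in this sketch that corresponds to the number of fitness evaluations before the loop terminates.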
Journal ArticleDOI

Improving the tolerance of multilayer perceptrons by minimizing the statistical sensitivity to weight deviations

TL;DR: The proposed modified backpropagation algorithm uses the statistical sensitivity of the network to changes in the weights as a quantitative measure of network tolerance, and attempts to reduce this sensitivity while keeping the usual training performance close to that obtained with the standard backpropagation algorithm.
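A hedged sketch of the idea (not the authors' exact algorithm): for a single sigmoid unit, a convenient sensitivity proxy is the expected squared gradient of the output with respect to the weights, E[σ'(z)² ‖x‖²], which can be added to the loss as a penalty. The task, penalty weight, and drift measure below are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
y = (X[:, 0] - X[:, 1] > 0).astype(float)   # toy binary target

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def train(lam, lr=0.5, epochs=500):
    """Gradient descent on MSE + lam * sensitivity-proxy penalty."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        z = X @ w
        out = sigmoid(z)
        d_out = out * (1 - out)                          # sigma'(z)
        grad_mse = 2 * X.T @ ((out - y) * d_out) / len(X)
        # penalty S = mean(sigma'(z)^2 * ||x||^2); dS/dw = mean(2 s' s'' ||x||^2 x)
        d2 = d_out * (1 - 2 * out)                       # sigma''(z)
        xsq = (X ** 2).sum(axis=1)
        grad_sens = X.T @ (2 * d_out * d2 * xsq) / len(X)
        w -= lr * (grad_mse + lam * grad_sens)
    return w

def drift(w, sigma=0.05, trials=200):
    """Mean absolute output change under Gaussian weight perturbations."""
    base = sigmoid(X @ w)
    deltas = [np.abs(sigmoid(X @ (w + sigma * rng.standard_normal(w.size)))
                     - base).mean() for _ in range(trials)]
    return float(np.mean(deltas))

w_plain, w_tol = train(lam=0.0), train(lam=0.05)
print(f"plain drift={drift(w_plain):.4f}  penalised drift={drift(w_tol):.4f}")
```

The `drift` check is a crude Monte Carlo stand-in for the tolerance experiments in the paper: a weight-deviation-tolerant network should change its outputs less when the weights are perturbed.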