Journal ArticleDOI

Multilayer perceptron neural networks to compute quasistatic parameters of asymmetric coplanar waveguides

01 Dec 2004-Neurocomputing (Elsevier)-Vol. 62, pp 349-365
TL;DR: The results of the MLPNNs trained with the Levenberg-Marquardt algorithm for the quasistatic parameters of the ACPWs were in very good agreement with results available in the literature obtained using the conformal mapping technique.
About: This article is published in Neurocomputing. The article was published on 2004-12-01. It has received 48 citations to date. The article focuses on the topics: Multilayer perceptron & Artificial neural network.
Citations
Journal ArticleDOI
TL;DR: An artificial neural network with k-fold cross-validation is trained on manually classified mineral samples, based on their pixel values, to classify five different minerals, namely quartz, muscovite, biotite, chlorite, and opaque.

75 citations

Journal ArticleDOI
TL;DR: The empirical defect-per-million-opportunities (DPMO) measurements demonstrate that the two hybrid intelligence methods can provide satisfactory performance for the stencil printing optimization problem.

68 citations

Journal ArticleDOI
TL;DR: An artificial neural network-based synthesis model is proposed for the design of single-feed circularly-polarized square microstrip antenna (CPSMA) with truncated corners and is validated by comparing its results with the electromagnetic simulation and measurement.
Abstract: An artificial neural network-based synthesis model is proposed for the design of single-feed circularly-polarized square microstrip antenna (CPSMA) with truncated corners. To obtain the training data sets, the resonant frequency and Q-factor of square microstrip antennas are calculated by empirical formulae. Then the size of the truncated corners and the operation frequency with the best axial ratio are obtained. Using the Levenberg-Marquardt (LM) algorithm, a three-hidden-layer network is trained to achieve an accurate synthesis model. Finally, the model is validated by comparing its results with electromagnetic simulation and measurement. It is extremely useful to antenna engineers for directly obtaining the patch's physical dimensions of the single-feed CPSMA with truncated corners.

63 citations


Cites background from "Multilayer perceptron neural networks to compute quasistatic parameters of asymmetric coplanar waveguides"

  • ...An important class of ANNs is the multilayer perceptron (MLP) [21], which is suitable for modeling high-dimensional and highly nonlinear problems....


  • ...The success of MLPs for a particular problem depends on the adequacy of the training algorithm regarding the necessities of the problem....


  • ...An MLP consists of three types of layers: an input layer, an output layer and one or more hidden layers....


  • ...In the study, the well-known Levenberg-Marquardt (LM) algorithm [22] is adopted to train the MLPs for obtaining high-precision synthesis models....

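The excerpts above describe the MLP layout: an input layer, one or more hidden layers, and an output layer. A minimal forward-pass sketch, where the layer sizes, tanh activation, and random weights are illustrative assumptions rather than anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 3 inputs, 8 hidden units, 2 outputs (not from the paper).
W1, b1 = rng.normal(size=(8, 3)), np.zeros(8)   # input -> hidden
W2, b2 = rng.normal(size=(2, 8)), np.zeros(2)   # hidden -> output

def mlp_forward(x):
    """Forward pass: one tanh hidden layer, linear output layer."""
    h = np.tanh(W1 @ x + b1)
    return W2 @ h + b2

y = mlp_forward(np.array([0.1, -0.5, 0.3]))
print(y.shape)  # (2,)
```

Adding hidden layers simply chains more weight/activation stages between input and output.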

Journal ArticleDOI
23 Dec 2013
TL;DR: In this paper, experiments are conducted with an artificial neural network model to predict severe thunderstorms that occurred over Kolkata during May 3, 11, and 15, 2009, using thunderstorm-affected meteorological parameters.
Abstract: Forecasting thunderstorms is one of the most difficult tasks in weather prediction, due to their rather small spatial and temporal extension and the inherent nonlinearity of their dynamics and physics. Accurate forecasting of severe thunderstorms is critical for a large range of users in the community. In this paper, experiments are conducted with an artificial neural network model to predict severe thunderstorms that occurred over Kolkata during May 3, 11, and 15, 2009, using thunderstorm-affected meteorological parameters. The capabilities of six learning algorithms, namely, Step, Momentum, Conjugate Gradient, Quick Propagation, Levenberg-Marquardt, and Delta-Bar-Delta, in predicting thunderstorms and their usefulness for advance prediction were studied, and their performances were evaluated by a number of statistical measures. The results indicate that the Levenberg-Marquardt algorithm predicted thunderstorm-affected surface parameters well, and that the 1, 3, and 24 h advance prediction models are able to predict hourly temperature and relative humidity adequately, with the sudden fall and rise during the thunderstorm hour. This demonstrates its distinct capability and advantages in identifying meteorological time series comprising nonlinear characteristics. The developed model can be useful in decision making for meteorologists and others who work with real-time thunderstorm forecasts.

54 citations

Journal ArticleDOI
TL;DR: In this paper, the activation energies of carbon-reinforced carbon composites were obtained by thermogravimetric analysis, and a nonlinear fitting method based on the Levenberg-Marquardt approach was used to fit the weight-loss curves at five heating rates.

31 citations

References
Book
16 Jul 1998
TL;DR: Thorough, well-organized, and completely up to date, this book examines all the important aspects of this emerging technology, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks.
Abstract: From the Publisher: This book represents the most comprehensive treatment available of neural networks from an engineering perspective. Thorough, well-organized, and completely up to date, it examines all the important aspects of this emerging technology, including the learning process, back-propagation learning, radial-basis function networks, self-organizing systems, modular networks, temporal processing and neurodynamics, and VLSI implementation of neural networks. Written in a concise and fluid manner, by a foremost engineering textbook author, to make the material more accessible, this book is ideal for professional engineers and graduate students entering this exciting field. Computer experiments, problems, worked examples, a bibliography, photographs, and illustrations reinforce key concepts.

29,130 citations


"Multilayer perceptron neural networ..." refers methods in this paper

  • ...of training algorithms used to train a MLPNN and a frequently used one is called the BP training algorithm [1,4,17]....


Journal ArticleDOI
01 Jan 1988-Nature
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
Abstract: We describe a new learning procedure, back-propagation, for networks of neurone-like units. The procedure repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector. As a result of the weight adjustments, internal ‘hidden’ units which are not part of the input or output come to represent important features of the task domain, and the regularities in the task are captured by the interactions of these units. The ability to create useful new features distinguishes back-propagation from earlier, simpler methods such as the perceptron-convergence procedure1.
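The weight-adjustment procedure described in this abstract can be sketched for a tiny two-layer network. The network size, tanh activation, learning rate, and single training pair below are illustrative assumptions, not the cited paper's setup:

```python
import numpy as np

def backprop_step(W1, W2, x, t, lr=0.1):
    """One back-propagation update for a two-layer net (tanh hidden units,
    linear output) minimising squared error between output y and target t."""
    h = np.tanh(W1 @ x)                 # forward pass: hidden activations
    y = W2 @ h
    e = y - t                           # output error
    dW2 = np.outer(e, h)                # gradient for output weights
    dh = (W2.T @ e) * (1.0 - h**2)      # error propagated through tanh
    dW1 = np.outer(dh, x)               # gradient for hidden weights
    return W1 - lr * dW1, W2 - lr * dW2

rng = np.random.default_rng(1)
W1, W2 = rng.normal(size=(4, 2)), rng.normal(size=(1, 4))
x, t = np.array([0.5, -0.2]), np.array([1.0])
for _ in range(500):
    W1, W2 = backprop_step(W1, W2, x, t)
print((W2 @ np.tanh(W1 @ x)).item())    # approaches the target 1.0
```

The hidden units are driven entirely by the propagated error signal, which is how internal representations of the task emerge.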

23,814 citations


"Multilayer perceptron neural networ..." refers background in this paper

  • ...Therefore, a lot of variations to improve the convergence of the BP were proposed such as DBD, EDBD, QP [6,19,24,27]....


Journal ArticleDOI
TL;DR: The Marquardt algorithm for nonlinear least squares is presented and is incorporated into the backpropagation algorithm for training feedforward neural networks and is found to be much more efficient than either of the other techniques when the network contains no more than a few hundred weights.
Abstract: The Marquardt algorithm for nonlinear least squares is presented and is incorporated into the backpropagation algorithm for training feedforward neural networks. The algorithm is tested on several function approximation problems, and is compared with a conjugate gradient algorithm and a variable learning rate algorithm. It is found that the Marquardt algorithm is much more efficient than either of the other techniques when the network contains no more than a few hundred weights.

6,899 citations


"Multilayer perceptron neural networ..." refers background in this paper

  • ...In Eq. (23), Jk is the Jacobian of g(wk), evaluated by taking the derivative of g(wk) with respect to wk, μ is the Marquardt parameter, and I is the identity matrix [8,13,30]....


  • ...In particular, it generally does not suffer from the problem of slow convergence [8,13,30]....

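The quoted excerpts describe the Levenberg-Marquardt update built from the Jacobian of the residuals, the Marquardt damping parameter, and the identity matrix. A minimal sketch on a toy linear least-squares problem, where the residual function, data, and fixed damping value are illustrative assumptions:

```python
import numpy as np

def lm_step(w, residual_fn, jacobian_fn, mu):
    """One Levenberg-Marquardt update:
    w_next = w - (J^T J + mu * I)^(-1) J^T e,
    where e are the residuals and J is their Jacobian at w."""
    e = residual_fn(w)
    J = jacobian_fn(w)
    H = J.T @ J + mu * np.eye(len(w))   # damped Gauss-Newton matrix
    return w - np.linalg.solve(H, J.T @ e)

# Toy problem: fit y = a*x + b to three points (exact answer a=2, b=1).
xs = np.array([0.0, 1.0, 2.0])
ys = np.array([1.0, 3.0, 5.0])
residuals = lambda w: w[0] * xs + w[1] - ys
jacobian = lambda w: np.column_stack([xs, np.ones_like(xs)])

w = np.zeros(2)
for _ in range(50):
    w = lm_step(w, residuals, jacobian, mu=0.01)
print(np.round(w, 3))  # close to [2. 1.]
```

Large mu makes the step resemble gradient descent; small mu makes it resemble Gauss-Newton, which is why the method avoids the slow convergence noted in the excerpt.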

Journal ArticleDOI
TL;DR: A bird's-eye review of the various types of ANNs and the related learning rules is presented, with special emphasis on backpropagation ANN theory and design, and a generalized methodology for developing successful ANN projects from conceptualization, to design, to implementation is described.

2,622 citations


"Multilayer perceptron neural networ..." refers methods in this paper

  • ...of training algorithms used to train a MLPNN and a frequently used one is called the BP training algorithm [1,4,17]....


Book
30 Aug 2004
TL;DR: Artificial neural networks.
Abstract: Artificial neural networks, Centre for Agricultural Information Technology and Information Dissemination.

2,254 citations