Journal Article
Machine learning-based constitutive model for J2-plasticity
TL;DR
In this article, an artificial neural network (ANN) was constructed to replace the nonlinear stress-integration scheme used in the conventional theoretical constitutive model under isotropic hardening and an associated flow rule.
About
This article was published in the International Journal of Plasticity on 2021-03-01 and has received 44 citations to date. The article focuses on the topic: constitutive equations.
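The conventional stress-integration scheme that the ANN replaces is, in the standard setting, a radial-return (return-mapping) update for J2 plasticity. A minimal sketch with linear isotropic hardening follows; the material parameters are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Illustrative material parameters (assumed, not taken from the paper):
E, nu = 200e3, 0.3                   # Young's modulus [MPa], Poisson's ratio
sigma_y0, H = 250.0, 1e3             # initial yield stress, linear hardening modulus
mu = E / (2 * (1 + nu))              # shear modulus
kappa = E / (3 * (1 - 2 * nu))       # bulk modulus

def radial_return(eps, eps_p, p_bar):
    """J2 radial-return update: total strain eps (3x3), plastic strain eps_p,
    accumulated plastic strain p_bar -> (stress, eps_p, p_bar)."""
    eps_e = eps - eps_p                                # trial elastic strain
    vol = np.trace(eps_e)
    s_trial = 2 * mu * (eps_e - vol / 3 * np.eye(3))   # trial deviatoric stress
    q_trial = np.sqrt(1.5 * np.sum(s_trial ** 2))      # von Mises equivalent stress
    f = q_trial - (sigma_y0 + H * p_bar)               # yield function
    if f <= 0:                                         # elastic step: accept trial state
        return s_trial + kappa * vol * np.eye(3), eps_p, p_bar
    dp = f / (3 * mu + H)                              # plastic increment from consistency
    n = 1.5 * s_trial / q_trial                        # associated flow direction
    s = s_trial * (1 - 3 * mu * dp / q_trial)          # radial return to the yield surface
    return s + kappa * vol * np.eye(3), eps_p + dp * n, p_bar + dp
```

After a plastic step the updated von Mises stress sits exactly on the hardened yield surface, sigma_y0 + H * p_bar.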
Citations
Journal Article
Machine learning-based modeling of the coupling effect of strain rate and temperature on strain hardening for 5182-O aluminum alloy
TL;DR: In this paper, a polynomial model and an artificial neural network (ANN) model are used to describe the highly nonlinear, coupled strain-hardening behavior of the aluminum alloy.
Counterexample-trained Neural Network Model of Rate and Temperature Dependent Hardening with Dynamic Strain Aging
Journal Article
A State-of-the-Art Review on Machine Learning-Based Multiscale Modeling, Simulation, Homogenization and Design of Materials
Journal Article
Machine learning for extending capability of mechanical characterization to improve springback prediction of a quenching and partitioning steel
TL;DR: In this article, a data-driven method was proposed to calibrate the parameters of a kinematic hardening model (the Yoshida-Uemori model) using all TC and CT stress vs. strain curves from physical tests together with machine learning.
Journal Article
From CP-FFT to CP-RNN: Recurrent Neural Network Surrogate Model of Crystal Plasticity
TL;DR: In this article, the authors apply the same methodology to 2D and 3D datasets corresponding to the effective mechanical behavior of an aluminum alloy as obtained through crystal plasticity simulations, obtaining reasonable approximations of the behavior with RNN models ranging in size from 5,000 to 100,000 parameters.
References
Proceedings Article
Adam: A Method for Stochastic Optimization
Diederik P. Kingma, Jimmy Ba
TL;DR: This work introduces Adam, an algorithm for first-order gradient-based optimization of stochastic objective functions, based on adaptive estimates of lower-order moments, and provides a regret bound on the convergence rate that is comparable to the best known results under the online convex optimization framework.
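As a sketch of the algorithm described above (not the authors' implementation), the Adam update maintains exponential moving averages of the gradient and its square, applies bias correction, and scales the step accordingly. The step size here is chosen for the toy problem, not the paper's default:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.1, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam step: biased first (m) and second (v) moment estimates,
    corrected by 1 - beta**t (t starts at 1), scale the parameter update."""
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)                  # bias-corrected first moment
    v_hat = v / (1 - b2 ** t)                  # bias-corrected second moment
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

# Toy usage: minimize f(x) = x**2 starting from x = 5 (gradient is 2x).
x, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    x, m, v = adam_step(x, 2 * x, m, v, t)
```

Because the update is normalized by the second-moment estimate, early steps move at roughly the learning rate regardless of gradient scale.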
Journal Article
Learning representations by back-propagating errors
TL;DR: Back-propagation repeatedly adjusts the weights of the connections in the network so as to minimize a measure of the difference between the actual output vector of the net and the desired output vector, which helps to represent important features of the task domain.
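A minimal illustration of the idea on a toy two-layer network (an assumed setup, not the paper's): the gradients produced by propagating the output error backward through the chain rule match a finite-difference estimate of the loss derivative.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))            # toy inputs
y = rng.normal(size=(5, 1))            # toy targets
W1 = rng.normal(size=(3, 4))           # input -> hidden weights
W2 = rng.normal(size=(4, 1))           # hidden -> output weights
sig = lambda z: 1 / (1 + np.exp(-z))

def loss_and_grads(W1, W2):
    h = sig(X @ W1)                    # forward pass: hidden activations
    out = h @ W2                       # linear output layer
    err = out - y
    L = 0.5 * np.sum(err ** 2)         # squared-error loss
    # backward pass: chain rule, layer by layer
    dW2 = h.T @ err
    d_h = (err @ W2.T) * h * (1 - h)   # sigmoid derivative is h * (1 - h)
    dW1 = X.T @ d_h
    return L, dW1, dW2

# finite-difference check on a single weight
L, dW1, dW2 = loss_and_grads(W1, W2)
fd = 1e-6
W1p = W1.copy(); W1p[0, 0] += fd
num = (loss_and_grads(W1p, W2)[0] - L) / fd
```

The analytic entry `dW1[0, 0]` and the numerical estimate `num` agree to several decimal places, which is the standard sanity check for a hand-written backward pass.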
Journal Article
Multilayer feedforward networks are universal approximators
TL;DR: It is rigorously established that standard multilayer feedforward networks with as few as one hidden layer using arbitrary squashing functions are capable of approximating any Borel measurable function from one finite dimensional space to another to any desired degree of accuracy, provided sufficiently many hidden units are available.
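The approximation capacity of a single hidden layer of squashing units can be sketched numerically: fix random sigmoid hidden units and fit only the output weights by least squares to a smooth target. This random-feature setup is an illustrative assumption, not the paper's construction, but the small residual it achieves reflects the same capacity the theorem guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)
n_hidden = 100
W = rng.normal(scale=10.0, size=(1, n_hidden))   # random hidden weights
b = rng.uniform(-10.0, 10.0, size=n_hidden)      # random hidden biases
sig = lambda z: 1 / (1 + np.exp(-z))             # a squashing function

x = np.linspace(0.0, 1.0, 200)[:, None]
target = np.sin(2 * np.pi * x[:, 0])             # smooth target on [0, 1]
Hmat = sig(x @ W + b)                            # hidden-layer activations
c, *_ = np.linalg.lstsq(Hmat, target, rcond=None)  # fit output weights only
max_err = np.max(np.abs(Hmat @ c - target))      # worst-case error on the grid
```

With 100 hidden units the maximum error over the sampled grid is already small; adding units drives it down further, consistent with the density result.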
Journal Article
A logical calculus of the ideas immanent in nervous activity
Warren S. McCulloch, Walter Pitts
TL;DR: In this article, it is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under another and gives the same results, although perhaps not in the same time.
Proceedings Article
Rectified Linear Units Improve Restricted Boltzmann Machines
Vinod Nair, Geoffrey E. Hinton
TL;DR: Replacing the binary stochastic hidden units of restricted Boltzmann machines with rectified linear units yields features that are better for object recognition on the NORB dataset and face verification on the Labeled Faces in the Wild dataset.
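The rectified linear unit itself is the simple piecewise-linear activation max(0, z); its derivative is 0 for negative inputs and 1 for positive ones, so it does not saturate on the positive side the way a sigmoid does. A minimal sketch:

```python
import numpy as np

def relu(z):
    """Rectified linear unit: max(0, z), applied elementwise."""
    return np.maximum(0.0, z)

z = np.array([-2.0, -0.5, 0.0, 1.5])
out = relu(z)                       # negative inputs are clipped to zero
grad = (z > 0).astype(float)        # subgradient: 0 for z < 0, 1 for z > 0
```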