Open Access · Posted Content
Structure and Performance of Fully Connected Neural Networks: Emerging Complex Network Properties.
TLDR
In this article, the authors propose a complex network (CN) approach to analyze the structure and performance of fully connected neural networks, finding that CN centrality measures are highly related to the networks' classification performance.
Abstract
Understanding the behavior of Artificial Neural Networks has become a central topic in the field, as black-box approaches have grown common with the widespread adoption of deep learning. Such high-dimensional models may exhibit instabilities and unusual properties that resemble complex systems. We therefore propose Complex Network (CN) techniques to analyze the structure and performance of fully connected neural networks. To this end, we build a dataset of four thousand models and their respective CN properties, employed in a supervised classification setup across four vision benchmarks. Each neural network is treated as a weighted, undirected graph of neurons and synapses, and centrality measures are computed after training. Results show that these measures are highly related to classification performance. We also propose the Bag-of-Neurons (BoN), a CN-based approach for finding topological signatures that link similar neurons. Results suggest that six neuronal types emerge in such networks, independently of the target domain, and are distributed differently according to classification accuracy. We also identify specific CN properties related to performance, such as higher subgraph centrality in lower-performing models. Our findings suggest that CN properties play a critical role in the performance of fully connected neural networks, with topological patterns emerging independently across a wide range of models.
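The graph construction described in the abstract (neurons as nodes, synapses as undirected weighted edges, centrality computed after training) can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's exact pipeline: the layer sizes and random weights are placeholders, absolute weight values are assumed as edge weights, and node strength plus subgraph centrality stand in for the full set of measures the authors compute.

```python
import numpy as np

# Hypothetical trained 3 -> 4 -> 2 fully connected network (random stand-in weights).
rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 2))   # hidden -> output weights

n = 3 + 4 + 2                  # one graph node per neuron
A = np.zeros((n, n))
A[0:3, 3:7] = np.abs(W1)       # absolute weights as undirected edge weights (assumption)
A[3:7, 7:9] = np.abs(W2)
A = A + A.T                    # symmetrize: weighted, undirected graph

# Node strength: weighted degree of each neuron.
strength = A.sum(axis=1)

# Subgraph centrality SC(i) = [e^A]_ii, via the spectral form
# SC(i) = sum_j (v_ij)^2 * exp(lambda_j) for the symmetric adjacency matrix.
lam, V = np.linalg.eigh(A)
sc = (V ** 2) @ np.exp(lam)

print(strength.round(3))
print(sc.round(3))
```

In a setup like the paper's, such per-neuron measures would then be aggregated into features per model and related to that model's test accuracy.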
Citations
Journal Article (DOI)
A physics-aware deep learning model for energy localization in multiscale shock-to-detonation simulations of heterogeneous energetic materials
Phong Nguyen, Yen Thi Nguyen, Pradeep Kumar Seshadri, Joseph B. Choi, H. S. Udaykumar, Stephen Baek +5 more
TL;DR: In this paper, a physics-aware recurrent convolutional neural network (PARC) is used to model the mesoscale energy localization of shock-initiated heterogeneous EM microstructures.
Journal Article (DOI)
Machine Learning-Based Label Quality Assurance for Object Detection Projects in Requirements Engineering
Neven Pičuljan, Zeljka Car +1 more
TL;DR: In this paper, a machine learning-based method for automatic label quality assurance is proposed, targeted especially at object detection use cases; it aims to support both annotators and computer vision project stakeholders while reducing the time and resources needed for label QA activities.
Posted Content
Characterizing Learning Dynamics of Deep Neural Networks via Complex Networks.
TL;DR: In this paper, the authors interpret deep neural networks with complex network theory and introduce metrics for nodes/neurons and layers, namely Nodes Strength and Layers Fluctuation.
References
Journal Article (DOI)
The human connectome: a complex network
TL;DR: Reviews current empirical efforts toward generating a network map of the human brain, the human connectome, and explores how the connectome can provide new insights into the organization of the brain's structural connections and their role in shaping functional dynamics.
Journal Article (DOI)
Subgraph centrality in complex networks.
TL;DR: Proposes a new centrality measure, C_S(i), that characterizes the participation of each node in all subgraphs of a network and discriminates the nodes of a network better than alternative measures such as degree, closeness, betweenness, and eigenvector centrality.
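Since the paper relates higher subgraph centrality to lower-performing models, it may help to see the measure concretely. For a symmetric adjacency matrix, C_S(i) is the i-th diagonal entry of the matrix exponential e^A, i.e. a weighted count of closed walks starting and ending at node i. A toy four-node graph (my own illustration, not taken from the reference) computed via the spectral form:

```python
import numpy as np

# Toy undirected graph: a triangle 0-1-2 plus a pendant node 3 attached to 0.
A = np.array([
    [0, 1, 1, 1],
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
], dtype=float)

# C_S(i) = [e^A]_ii = sum_j (v_ij)^2 * exp(lambda_j),
# using the eigendecomposition of the symmetric adjacency matrix.
lam, V = np.linalg.eigh(A)
c_s = (V ** 2) @ np.exp(lam)

print(c_s.round(3))
```

Node 0 scores highest (it participates in the triangle and the pendant edge), nodes 1 and 2 tie by symmetry, and the pendant node 3 scores lowest, which is exactly the discrimination the measure is designed for.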
Posted Content
The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks
Jonathan Frankle, Michael Carbin +1 more
TL;DR: In this paper, the lottery ticket hypothesis is proposed: dense networks contain subnetworks ("winning tickets") that can reach test accuracy comparable to the original network in a similar number of iterations, because their connections have initial weights that make training particularly effective.
Proceedings Article (DOI)
Large-scale deep unsupervised learning using graphics processors
TL;DR: It is argued that modern graphics processors far surpass the computational capabilities of multicore CPUs, and have the potential to revolutionize the applicability of deep unsupervised learning methods.
Journal Article (DOI)
Analyzing and modeling real-world phenomena with complex networks: a survey of applications
Luciano da Fontoura Costa, Osvaldo N. Oliveira, Gonzalo Travieso, Francisco A. Rodrigues, Paulino Ribeiro Villas Boas, Lucas Antiqueira, Matheus P. Viana, Luis E. C. Rocha +7 more
TL;DR: Argues that the success of new scientific areas can be assessed by their potential for contributing new theoretical approaches and applications to real-world problems, and that complex networks have fared extremely well in both respects, with a sound theoretical basis developed over the years and a wide variety of applications.