Institution

Indian Institute of Technology Ropar

Education · Ropar, India
About: Indian Institute of Technology Ropar is an education organization based in Ropar, India. It is known for its research contributions in the topics of Catalysis & Computer science. The organization has 1014 authors who have published 2878 publications receiving 35715 citations.


Papers
Book ChapterDOI
02 Dec 2018
TL;DR: The proposed framework customizes CapsNet by adding multi-level capsules and replacing the standard convolutional layers with densely connected convolutions, so that the discriminative feature maps learned by different layers are combined to form the primary capsules.
Abstract: The past few years have witnessed an exponential growth of interest in deep learning methodologies, with rapidly improving accuracy and reduced computational complexity. In particular, architectures using Convolutional Neural Networks (CNNs) have produced state-of-the-art performances for image classification and object recognition tasks. Recently, Capsule Networks (CapsNets) achieved a significant increase in performance by addressing an inherent limitation of CNNs in encoding pose and deformation. Inspired by such an advancement, we propose Multi-level Dense Capsule Networks (multi-level DCNets). The proposed framework customizes CapsNet by adding multi-level capsules and replacing the standard convolutional layers with densely connected convolutions. A single-level DCNet essentially adds a deeper convolution network, so that discriminative feature maps learned by different layers are combined to form the primary capsules. Additionally, multi-level capsule networks use a hierarchical architecture to learn new capsules from former capsules that represent spatial information in a fine-to-coarse manner, which makes them more efficient for learning complex data. Experiments on the image classification task using benchmark datasets demonstrate the efficacy of the proposed architectures. DCNet achieves state-of-the-art performance (99.75%) on the MNIST dataset with an approximately twenty-fold decrease in total training iterations over the conventional CapsNet. Furthermore, multi-level DCNet performs better than CapsNet on the SVHN dataset (96.90%), and outperforms the ensemble of seven CapsNet models on CIFAR-10 by +0.31% with a seven-fold decrease in the number of parameters. Source code, models and figures are available at https://github.com/ssrp/Multi-level-DCNet. A minimal illustrative sketch of the dense-convolution-to-primary-capsule idea is given after this entry.

26 citations
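The abstract above rests on two ingredients: DenseNet-style connectivity between convolutional layers, and primary capsules formed from the concatenated feature maps. The PyTorch sketch below only illustrates that general idea under assumed settings (the layer count, growth rate, capsule dimension and 28x28 grayscale input are arbitrary choices, not the paper's configuration); the authors' actual implementation is at the GitHub link above.

```python
# Illustrative sketch only, not the released Multi-level DCNet code.
# Dense connectivity lets each layer see all earlier feature maps, and the
# concatenated maps are reshaped into primary capsule vectors with a squash.
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Small densely connected convolution block (assumed sizes)."""
    def __init__(self, in_channels, growth_rate=8, num_layers=4):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for _ in range(num_layers):
            self.layers.append(nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, growth_rate, kernel_size=3, padding=1),
            ))
            channels += growth_rate
        self.out_channels = channels

    def forward(self, x):
        features = [x]
        for layer in self.layers:
            out = layer(torch.cat(features, dim=1))  # dense connectivity
            features.append(out)
        return torch.cat(features, dim=1)            # maps from all depths

class PrimaryCapsules(nn.Module):
    """Turns concatenated feature maps into capsule vectors and squashes them."""
    def __init__(self, in_channels, capsule_dim=8, num_maps=4):
        super().__init__()
        self.conv = nn.Conv2d(in_channels, num_maps * capsule_dim, kernel_size=3, stride=2)
        self.capsule_dim = capsule_dim

    def forward(self, x):
        u = self.conv(x)
        u = u.view(u.size(0), -1, self.capsule_dim)   # (batch, num_capsules, capsule_dim)
        norm = u.norm(dim=-1, keepdim=True)
        return (norm ** 2 / (1 + norm ** 2)) * u / (norm + 1e-8)  # squash non-linearity

# Example with a 28x28 grayscale input (MNIST-sized), assumed for this sketch.
x = torch.randn(2, 1, 28, 28)
dense = DenseBlock(in_channels=1)
caps = PrimaryCapsules(dense.out_channels)
print(caps(dense(x)).shape)   # torch.Size([2, 676, 8]) for this configuration
```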

Journal ArticleDOI
TL;DR: In this article, the surface of ZnO nanoparticles is decorated with dipodal receptors having imine linkages, and two spectroscopic techniques are used to study the cation recognition behavior of the receptors.

26 citations

Journal ArticleDOI
29 Jul 2011-Wear
TL;DR: In this article, the effects of sliding velocity on the wear rate, the coefficient of friction and the nature of the tribolayers formed on the discs are studied using a pin-on-disc set-up.

26 citations

Journal ArticleDOI
TL;DR: In this paper, a pulse-compression-favorable frequency modulated thermal wave imaging scheme was proposed for active infrared thermography, allowing moderate peak-power heat sources to be used over a limited time span while improving test resolution and sensitivity (a minimal pulse-compression sketch follows this entry).

26 citations
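The pulse-compression step referenced above amounts to cross-correlating the measured thermal response with the frequency modulated (chirp) excitation: the correlation concentrates the swept energy into a narrow peak, which is what improves resolution and signal-to-noise ratio without high peak power. The NumPy sketch below is a generic matched-filtering illustration of that step; the sweep band, duration, sampling rate, delay and noise level are assumptions for demonstration, not values from the paper.

```python
# Generic pulse-compression demo for a linear frequency-modulated (chirp) excitation.
# All parameters below are assumed for illustration only.
import numpy as np

fs = 10.0                        # sampling rate in Hz (assumed)
t = np.arange(0, 100.0, 1 / fs)  # 100 s excitation window (assumed)
f0, f1 = 0.01, 0.1               # swept band in Hz, typical of slow thermal waves (assumed)

# Linear chirp reference: instantaneous frequency rises from f0 to f1.
k = (f1 - f0) / t[-1]
reference = np.cos(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

# Toy "measured" thermal response: delayed, attenuated chirp plus noise.
delay_samples = 50
response = 0.3 * np.roll(reference, delay_samples) + 0.05 * np.random.randn(t.size)

# Pulse compression = cross-correlation with the reference (matched filtering).
compressed = np.correlate(response, reference, mode="full")
lag = np.argmax(compressed) - (t.size - 1)
print(f"estimated delay: {lag / fs:.1f} s")   # close to delay_samples / fs = 5.0 s
```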


Authors

Showing all 1056 results

Name                     H-index  Papers  Citations
Rajesh Kumar             149      4439    140830
Rajeev Ahuja             85       1072    32325
Surya Prakash Singh      55       736     12989
Christopher C. Berndt    54       257     9941
S. Sitharama Iyengar     53       776     13751
Sarit K. Das             52       273     17410
R.P. Chhabra             50       288     8299
Narinder Singh           45       452     9028
Rajendra Srivastava      44       192     7153
Shirish H. Sonawane      44       224     5544
Dharmendra Tripathi      37       188     4298
Partha Pratim Roy        36       404     5505
Harpreet Singh           35       238     4090
Namita Singh             34       219     4217
Javed N. Agrewala        32       112     3073
Network Information
Related Institutions (5)
Nanyang Technological University: 112.8K papers, 3.2M citations (91% related)
Royal Institute of Technology: 68.4K papers, 1.9M citations (91% related)
Indian Institute of Science: 62.4K papers, 1.2M citations (91% related)
University of Science and Technology of China: 101K papers, 2.4M citations (90% related)
École Polytechnique Fédérale de Lausanne: 98.2K papers, 4.3M citations (90% related)

Performance Metrics
No. of papers from the Institution in previous years
Year  Papers
2023  27
2022  92
2021  541
2020  468
2019  402
2018  355