Institution

National Institute of Technology, Kurukshetra

Education · Kurukshetra, Haryana, India
About: National Institute of Technology, Kurukshetra is an education organization based in Kurukshetra, Haryana, India. It is known for research contributions in the topics of Control theory and Cloud computing. The organization has 2449 authors who have published 5174 publications, receiving 53995 citations. The organization is also known as NIT Kurukshetra.


Papers
Journal ArticleDOI
TL;DR: A survey of IoT and Cloud Computing is presented, focusing on the security issues of both technologies and showing how Cloud Computing improves the functioning of the IoT.

894 citations

Journal ArticleDOI
TL;DR: Support vector machines (SVM) are attractive for the classification of remotely sensed data, with some claims that the method is insensitive to the dimensionality of the data and therefore does not require a dimensionality-reduction analysis in preprocessing; however, it is shown that the accuracy of a classification by an SVM does vary as a function of the number of features used.
Abstract: Support vector machines (SVM) are attractive for the classification of remotely sensed data, with some claims that the method is insensitive to the dimensionality of the data and, therefore, does not require a dimensionality-reduction analysis in preprocessing. Here, a series of classification analyses with two hyperspectral sensor data sets reveals that the accuracy of a classification by an SVM does vary as a function of the number of features used. Critically, it is shown that the accuracy of a classification may decline significantly (at the 0.05 level of statistical significance) with the addition of features, particularly if a small training sample is used. This highlights a dependence of the accuracy of classification by an SVM on the dimensionality of the data and, therefore, the potential value of undertaking a feature-selection analysis prior to classification. Additionally, it is demonstrated that, even when a large training sample is available, feature selection may still be useful. For example, the accuracy derived from the use of a small number of features may be noninferior (at the 0.05 level of significance) to that derived from the use of a larger feature set, providing potential advantages in relation to issues such as data storage and computational processing costs. Feature selection may, therefore, be a valuable analysis to include in preprocessing operations for classification by an SVM.
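The effect described above can be sketched with scikit-learn: train an SVM on a small sample and score it with few, moderate, and all features. This is a minimal illustration, not the paper's method; the synthetic dataset, sample sizes, and feature counts are assumptions standing in for the hyperspectral data used in the study.

```python
# Sketch: SVM accuracy as a function of the number of features retained,
# with a deliberately small training sample (the paper's critical case).
# All dataset parameters below are illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=60, n_informative=8,
                           n_redundant=4, random_state=0)
# Small training sample; the rest is held out for testing.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=40, random_state=0)

scores = {}
for k in (5, 20, 60):  # few features, a moderate subset, the full set
    clf = make_pipeline(SelectKBest(f_classif, k=k), SVC(kernel="rbf"))
    scores[k] = clf.fit(X_tr, y_tr).score(X_te, y_te)
print(scores)
```

With a training sample this small, the full 60-feature model often scores no better than the reduced ones, which is the motivation the abstract gives for feature selection before SVM classification.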

708 citations

Journal ArticleDOI
TL;DR: The main methodologies used in electricity price forecasting are reviewed in this paper, and the various price-influencing factors used by different researchers are classified and provided for reference.

492 citations

Journal ArticleDOI
TL;DR: The proposed solutions for collecting and managing sensors’ data in a smart building could lead to an energy-efficient smart building, and thus a Green Smart Building.

460 citations

Proceedings ArticleDOI
21 Jul 2003
TL;DR: Results obtained by the random forests classifier, another technique for generating an ensemble of classifiers, are presented and compared with ensembles of decision tree classifiers; the study also suggests that bagging performs better than boosting when the training data are noisy.
Abstract: In recent years, a number of works have reported the use of a combination of multiple classifiers to produce a single classification and demonstrated significant performance improvement. The resulting classifier, referred to as an ensemble classifier, is a set of classifiers whose individual decisions are combined by weighted or unweighted voting to classify new examples. An ensemble is often more accurate than the individual classifiers that make it up. In remote sensing, Giacinto and Roli (1997) and Roli et al. (1997) report the use of ensembles of neural networks and the integration of classification results from different types of classifiers. Studies growing an ensemble of decision trees and allowing them to vote for the most popular class reported a significant improvement in classification accuracy for land cover classification. This paper presents results obtained by the random forests classifier, another technique for generating an ensemble of classifiers, and compares its performance with ensembles of decision tree classifiers. A classification accuracy of 88.32% is achieved by the random forest classifier, in comparison with 87.38% and 87.28% by decision tree ensembles created using boosting and bagging techniques. Further, the study also suggests that bagging performs better than boosting when there is noise in the training data.
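The three ensembles compared in the abstract can be set up side by side with scikit-learn. This is a hedged sketch of the comparison, not a reproduction of the study: synthetic data replaces the land-cover imagery, and the accuracies it prints are illustrative, not the 88.32%/87.38%/87.28% figures reported above.

```python
# Sketch: random forest vs. decision-tree ensembles built with bagging
# and boosting, on synthetic data (illustrative assumption, not the
# paper's land-cover dataset).
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

models = {
    # Random forest: bagged trees that also randomize feature choice per split.
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=1),
    # Bagging: full decision trees trained on bootstrap samples, majority vote.
    "bagging": BaggingClassifier(DecisionTreeClassifier(), n_estimators=100,
                                 random_state=1),
    # Boosting: shallow trees fitted sequentially, reweighting hard examples.
    "boosting": AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),
                                   n_estimators=100, random_state=1),
}
accuracy = {name: m.fit(X_tr, y_tr).score(X_te, y_te)
            for name, m in models.items()}
print(accuracy)
```

The paper's noise observation can be probed the same way by flipping a fraction of the training labels before fitting: boosting reweights toward the mislabeled points, while bagging's bootstrap averaging dilutes them.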

437 citations


Authors

Showing all 2503 results

Name                  | H-index | Papers | Citations
Praveen Kumar         | 88      | 1339   | 35718
Santosh Kumar         | 80      | 1196   | 29391
Ashwani Kumar         | 66      | 703    | 18099
Amit Singh            | 57      | 640    | 13795
Brij B. Gupta         | 51      | 368    | 9332
Rajiv Kumar           | 51      | 561    | 15404
Sunil Luthra          | 45      | 162    | 6485
Pramod Kumar          | 39      | 170    | 4248
Abid Haleem           | 39      | 304    | 7178
Amit Mishra           | 38      | 401    | 5735
Mahesh Pal            | 36      | 105    | 7081
Ashutosh Kumar Singh  | 35      | 397    | 9381
Vikas Mittal          | 34      | 310    | 5182
Jitendra Kumar        | 32      | 127    | 3359
Suresh Kumar          | 29      | 407    | 3580
Network Information
Related Institutions (5)
National Institute of Technology, Rourkela
10.7K papers, 150.1K citations

93% related

National Institute of Technology, Tiruchirappalli
8K papers, 111.9K citations

93% related

Thapar University
8.5K papers, 130.3K citations

93% related

Indian Institute of Technology Roorkee
21.4K papers, 419.9K citations

92% related

Jadavpur University
27.6K papers, 422K citations

90% related

Performance
Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 36
2022 | 96
2021 | 690
2020 | 615
2019 | 605
2018 | 636