Author

Yadwinder Singh Brar

Bio: Yadwinder Singh Brar is an academic researcher from Punjab Technical University. The author has contributed to research in the topics of AC power and particle swarm optimization. The author has an h-index of 12, has co-authored 70 publications, and has received 482 citations. Previous affiliations of Yadwinder Singh Brar include Guru Nanak Dev Engineering College, Ludhiana & Yahoo!.


Papers
Journal ArticleDOI
TL;DR: This study summarises the research trends in SEE based upon a corpus of 1178 articles and identifies the core research areas and trends, which may help researchers understand and discern research patterns in a large literature dataset.
Abstract: Context: Software effort estimation (SEE) is one of the most crucial activities in the field of software engineering. Vast research has been conducted in SEE, resulting in a tremendous increase in literature. It is therefore of utmost importance to identify the core research areas and trends in SEE, which may help researchers understand and discern the research patterns in a large literature dataset. Objective: To identify unobserved research patterns through natural language processing from a large set of research articles on SEE published during the period 1996 to 2016. Method: A generative statistical method, Latent Dirichlet Allocation (LDA), was applied to a literature dataset of 1178 articles published on SEE. Results: Twelve core research areas and sixty research trends were revealed, and the identified research trends were semantically mapped to their associated core research areas. Conclusions: This study summarises the research trends in SEE based upon a corpus of 1178 articles. The patterns and trends identified through this research can help in finding potential research areas.
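The method here is LDA topic modelling over article abstracts. A minimal sketch of that kind of pipeline, using scikit-learn rather than the authors' (unstated) toolchain, is shown below; the toy abstracts are placeholders, and n_components=12 simply mirrors the twelve core areas reported.

```python
# Minimal LDA topic-modelling sketch (illustrative; not the authors' exact pipeline).
# Assumes a list of article abstracts is available as plain strings.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

abstracts = [
    "software effort estimation using analogy and regression models",
    "neural network based prediction of development effort",
    "fuzzy logic model for early effort estimation",
    # ... in the paper, 1178 SEE abstracts from 1996-2016 would go here
]

# Bag-of-words representation; stop-word removal keeps topics interpretable.
vectorizer = CountVectorizer(stop_words="english", max_df=0.95)
dtm = vectorizer.fit_transform(abstracts)

# The paper reports twelve core research areas; n_components=12 mirrors that choice.
lda = LatentDirichletAllocation(n_components=12, random_state=0)
lda.fit(dtm)

# Print the top words of each inferred topic.
terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top = [terms[i] for i in weights.argsort()[-8:][::-1]]
    print(f"Topic {k}: {', '.join(top)}")
```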

77 citations

Journal ArticleDOI
TL;DR: An evolutionary optimization technique is employed in which the 'preferred' weightage pattern is searched, and the non-inferior solution that attains the maximum satisfaction level from the membership functions of the participating objectives is adjudged the 'best' solution.

55 citations

Journal ArticleDOI
TL;DR: Fuzzy Analytic Hierarchy Process is proposed to rectify the subjectivity and imprecision of AHP, and it can be used for selecting the type of model best suited for estimating the effort for a given problem type or environment.
Abstract: Effort estimation has always been a challenging task for project managers. Many researchers have tried to help them by creating different types of models. It has already been shown that no single model is successful for all types of projects and every type of environment. The Analytic Hierarchy Process (AHP) has been identified as a tool that helps in multi-criteria decision making, and researchers have shown that it can be used to compare the effort estimates of different models and techniques. The problem with the traditional Analytic Hierarchy Process, however, is its inability to deal with the imprecision and subjectivity of the pairwise comparison process. The motive of this paper is to propose the Fuzzy Analytic Hierarchy Process, which can be used to rectify the subjectivity and imprecision of AHP and to select the type of model best suited for estimating the effort for a given problem type or environment. Instead of a single crisp value, the Fuzzy Analytic Hierarchy Process uses a range of values to incorporate the decision maker's uncertainty. From this range, the decision maker can select the value that reflects his confidence and can also specify his attitude, such as optimistic, pessimistic or moderate. In this work, the comparison of the Analytic Hierarchy Process and the Fuzzy Analytic Hierarchy Process is carried out using a case study on the selection of an effort estimation model.
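The abstract describes replacing crisp pairwise judgments with ranges of values. One common way to realise this is with triangular fuzzy numbers and Buckley's geometric-mean method; the sketch below assumes that formulation (the paper may use a different FAHP variant), and the 3x3 comparison matrix and criteria are invented for illustration.

```python
# Minimal fuzzy-AHP weighting sketch (Buckley's geometric-mean method).
# The 3x3 pairwise matrix of triangular fuzzy numbers (l, m, u) is illustrative only.
import numpy as np

# Criteria could be, e.g., accuracy, data availability, ease of calibration.
M = [
    [(1, 1, 1),       (2, 3, 4),     (4, 5, 6)],
    [(1/4, 1/3, 1/2), (1, 1, 1),     (1, 2, 3)],
    [(1/6, 1/5, 1/4), (1/3, 1/2, 1), (1, 1, 1)],
]

n = len(M)
# Fuzzy geometric mean of each row, component-wise over (l, m, u).
geo = np.array([[np.prod([row[j][k] for j in range(n)]) ** (1 / n)
                 for k in range(3)] for row in M])

# Fuzzy weights: divide each row's geometric mean by the column sums
# (upper sum for the lower bound and vice versa, as usual for fuzzy division).
total = geo.sum(axis=0)          # (sum_l, sum_m, sum_u)
fuzzy_w = np.array([[g[0] / total[2], g[1] / total[1], g[2] / total[0]] for g in geo])

# Defuzzify (simple centroid) and normalise to obtain crisp weights.
crisp = fuzzy_w.mean(axis=1)
weights = crisp / crisp.sum()
print(np.round(weights, 3))
```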

37 citations

Journal ArticleDOI
25 Jun 2017-Energies
TL;DR: This article makes a preliminary attempt to correlate developments in this arena in the Asian region, as well as the developed world, and to explore the possibilities of harnessing this resource in a better manner.
Abstract: Modern economies run on the backbone of electricity as one of the major factors behind industrial development. India is endowed with plenty of natural resources, and the majority of electricity within the country is generated from thermal and hydro-electric plants. A few nuclear plants assist in meeting the national requirements for electricity, but many rural areas still remain unserved. As India is primarily a rural agrarian economy, providing electricity to the remote, undeveloped regions of the country remains a top priority of the government. A vital, untapped source is livestock-generated biomass, which to some extent has been utilized to generate electricity in small-scale biogas-based plants under the government's thrust on rural development. This study is a preliminary attempt to correlate developments in this arena in the Asian region, as well as the developed world, to explore the possibilities of harnessing this resource in a better manner. The current potential of 2600 million tons of livestock dung generated per year, capable of yielding 263,702 million m3 of biogas, remains largely untapped. Our estimates suggest that if this resource is utilized judiciously, it possesses the potential of generating 477 TWh (terawatt hours) of electrical energy per annum.
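As a rough sanity check, the abstract's headline figures are mutually consistent under typical biogas conversion assumptions. The yield and efficiency factors below are generic literature-style values, not numbers taken from the paper itself.

```python
# Back-of-the-envelope check of the abstract's figures.
# Energy-content and conversion factors are assumptions, not values from the paper.
dung_million_tonnes = 2600            # livestock dung generated per year (from abstract)
biogas_million_m3   = 263_702         # biogas potential per year (from abstract)

biogas_m3_per_tonne = biogas_million_m3 / dung_million_tonnes
print(f"Implied biogas yield: {biogas_m3_per_tonne:.0f} m3 per tonne of dung")

# Assume ~6 kWh of thermal energy per m3 of biogas and ~30% electrical efficiency.
kwh_thermal_per_m3 = 6.0
electrical_efficiency = 0.30

electricity_twh = biogas_million_m3 * 1e6 * kwh_thermal_per_m3 * electrical_efficiency / 1e9
print(f"Estimated electricity: {electricity_twh:.0f} TWh per year")  # about 475 TWh, close to the 477 TWh reported
```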

31 citations

01 Jan 2011
TL;DR: The aim of this study is to analyze soft computing techniques in the existing models and to provide an in-depth review of software and project estimation techniques existing in industry and the literature, based on different test datasets, along with their strengths and weaknesses.
Abstract: The effort invested in a software project is probably one of the most important and most analyzed variables in recent years in the process of project management. The limitation of algorithmic effort prediction models is their inability to cope with the uncertainties and imprecision surrounding software projects at the early development stage. More recently, attention has turned to a variety of machine learning methods, and to soft computing in particular, to predict software development effort. Soft computing is a consortium of methodologies centering on fuzzy logic, artificial neural networks, and evolutionary computation. It is important to mention here that these methodologies are complementary and synergistic, rather than competitive. They provide, in one form or another, flexible information processing capability for handling real-life ambiguous situations. These methodologies are currently used for reliable and accurate estimates of software development effort, which has always been a challenge for both the software industry and academia. The aim of this study is to analyze soft computing techniques in the existing models and to provide an in-depth review of software and project estimation techniques existing in industry and the literature, based on different test datasets, along with their strengths and weaknesses.
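As one concrete instance of the machine-learning approaches the study surveys, the sketch below fits a small neural-network regressor to synthetic project features using scikit-learn; the features, effort values, and network size are all illustrative assumptions, not data discussed in the paper.

```python
# Tiny illustration of a machine-learning effort estimator (not from the paper).
# Features and effort values below are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical project features: size (KLOC), team experience (years), complexity (1-5).
X = np.array([[10, 3, 2], [50, 5, 4], [25, 2, 3], [5, 8, 1], [80, 4, 5], [15, 6, 2]])
y = np.array([120, 900, 400, 40, 1600, 150])   # effort in person-hours (synthetic)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X, y)

# Predict effort for a new (hypothetical) project.
print(model.predict([[30, 4, 3]]))
```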

28 citations


Cited by
Journal ArticleDOI
TL;DR: A review of more than 90 published papers is presented here to analyze the applicability of various methods discussed and it is observed that Analytical Hierarchy Process is the most popular technique followed by outranking techniques PROMETHEE and ELECTRE.
Abstract: Multi-Criteria Decision Making (MCDM) techniques are gaining popularity in sustainable energy management. The techniques provide solutions to problems involving conflicting and multiple objectives. Several methods based on weighted averages, priority setting, outranking, fuzzy principles and their combinations are employed for energy planning decisions. A review of more than 90 published papers is presented here to analyze the applicability of the various methods discussed. A classification by application area and year of application is presented to highlight the trends. It is observed that the Analytical Hierarchy Process is the most popular technique, followed by the outranking techniques PROMETHEE and ELECTRE. Validation of results with multiple methods, development of interactive decision support systems, and application of fuzzy methods to tackle uncertainties in the data are observed in the published literature.
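At the core of the AHP technique this review highlights is the derivation of a priority vector from a pairwise comparison matrix, typically via the principal eigenvector together with a consistency check. A minimal sketch, with an invented 3x3 comparison matrix:

```python
# Classical AHP priority derivation via the principal eigenvector.
# The 3x3 pairwise comparison matrix (Saaty scale) is illustrative only.
import numpy as np

A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                     # normalised priority weights

# Consistency check: CI = (lambda_max - n) / (n - 1), compared against Saaty's RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58                                   # RI = 0.58 for n = 3
print(np.round(w, 3), f"CR = {CR:.3f}")
```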

1,715 citations

Journal ArticleDOI
TL;DR: The state of the art of data mining and analytics is reviewed through eight unsupervised learning and ten supervised learning algorithms, as well as the application status of semi-supervised learning algorithms.
Abstract: Data mining and analytics have played an important role in knowledge discovery and decision making/support in the process industry over the past several decades. As a computational engine for data mining and analytics, machine learning serves as a basic tool for information extraction, data pattern recognition and prediction. From the perspective of machine learning, this paper provides a review of existing data mining and analytics applications in the process industry over the past several decades. The state of the art of data mining and analytics is reviewed through eight unsupervised learning and ten supervised learning algorithms, as well as the application status of semi-supervised learning algorithms. Several perspectives are highlighted and discussed for future research on data mining and analytics in the process industry.
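To make the two algorithm families concrete, the toy sketch below pairs an unsupervised step (PCA for pattern extraction) with a supervised one (a soft-sensor-style regression); the process data are random placeholders rather than anything from the reviewed applications.

```python
# Toy illustration of the two learning families the review covers:
# unsupervised (PCA) and supervised (regression). Data are random placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                           # e.g. 10 process measurements, 200 samples
y = X[:, :3].sum(axis=1) + 0.1 * rng.normal(size=200)    # a quality variable to predict

# Unsupervised: compress correlated measurements into a few latent components.
scores = PCA(n_components=3).fit_transform(X)

# Supervised: a soft-sensor-style regression from latent scores to the quality variable.
model = LinearRegression().fit(scores, y)
print(f"R^2 on training data: {model.score(scores, y):.2f}")
```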

657 citations

Journal Article
TL;DR: DES is based on a block cipher ("Lucifer") developed by Horst Feistel at IBM with a 128-bit key length; the growth of computing power has made its effective 56-bit key a security risk, and in 1998 a special-purpose machine developed by the "Electronic Frontier Foundation" (EFF), with 1,800 custom crypto processors working in parallel, recovered a DES key in record time.
Abstract: In 1977, the "Data Encryption Algorithm" (DEA) was declared the American encryption standard for federal agencies by the "National Bureau of Standards" (NBS, later the "National Institute of Standards and Technology", NIST) [NBS_77]. In 1981, the DEA specification was adopted as the ANSI standard "DES" [ANSI_81]. The recommendation of DES as the standard encryption method was limited to five years and was extended by a further five years in 1983, 1988 and 1993. A revised version of the NIST standard is currently available [NIST_99], in which DES is to remain approved on a transitional basis for a further five years, but the use of Triple-DES is recommended: a threefold application of DES with three different keys (effective key length: 168 bits) [NIST_99]. DES is based on a block cipher ("Lucifer") developed by Horst Feistel at IBM with a key length of 128 bits. Because the American "National Security Agency" (NSA) had ensured that DES received a key length of only 64 bits, of which only 56 bits are relevant, and special substitution boxes (the "cryptographic core" of the algorithm) whose design criteria were not published by the NSA, the algorithm was controversial from the beginning. Critics suspected a secret "trapdoor" in the algorithm that would allow the NSA online decryption even without knowledge of the key. Although this suspicion could not be substantiated, both the increase in computing power and the parallelisation of search algorithms make a key length of 56 bits a security risk today. Most recently, in 1998, a special-purpose machine developed by the "Electronic Frontier Foundation" (EFF), with 1,800 custom crypto processors working in parallel, was able to find a DES key in a record time of 2.5 days. To find a successor for DES, NIST announced the search for an "Advanced Encryption Standard" (AES) on 2 January 1997. The goal of this initiative is, in close cooperation with research and industry, to find a symmetric encryption algorithm suitable for effectively encrypting American government data well into the 21st century. To this end, an official "Call for Algorithms" was issued on 12 September 1997. The proposed symmetric encryption algorithms had to meet the following requirements: unclassified and published, available royalty-free worldwide, efficiently implementable in hardware and software, block ciphers with a block length of 128 bits, and support for key lengths of 128, 192 and 256 bits. At the first "AES Candidate Conference" (AES1) on 20 August 1998, NIST published a list of 15 proposed algorithms and invited the expert community to analyse them. The results were presented at the second "AES Candidate Conference" (22-23 March 1999 in Rome, AES2) and discussed among international cryptologists. The comment period ended on 15 April 1999. On the basis of the comments and analyses received, NIST selected five candidates, which it announced publicly on 9 August 1999: MARS (IBM), RC6 (RSA Lab.), Rijndael (Daemen, Rijmen), Serpent (Anderson, Biham, Knudsen), Twofish (Schneier, Kelsey, Whiting, Wagner, Hall, Ferguson).

624 citations

Book
30 Sep 2004
TL;DR: A review of evolutionary methods is presented for solving the problem of allocating customers' load demands among the available thermal power generating units in an economic, secure and reliable way.
Abstract: Electric power systems have experienced continuous growth in all three major sectors of the power system, namely generation, transmission and distribution. Electricity cannot be stored economically, so there has to be a continuous balance between demand and supply. The increase in load sizes and in operational complexity, such as generation allocation, non-utility generation planning and pricing, brought about by the widespread interconnection of transmission systems and inter-utility power transaction contracts, has introduced major difficulties into the operation of power systems. Allocating customers' load demands among the available thermal power generating units in an economic, secure and reliable way has been a subject of interest since 1920 or even earlier. In practice, however, the generating units have non-convex input-output characteristics due to prohibited operating zones, valve-point loadings and multi-fuel effects, treated as hard equality and inequality constraints, which cannot be solved directly by mathematical programming methods. Dynamic programming can treat such problems, but it suffers from the curse of dimensionality. Over the past decade, many prominent methods have been developed to solve these problems, such as hierarchical numerical methods, tabu search, neural network approaches, genetic algorithms, evolutionary programming, swarm optimisation, differential evolution and hybrid search methods. A review of evolutionary methods is presented.
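Among the evolutionary methods listed, particle swarm optimisation is a representative choice for dispatch problems with valve-point effects. The sketch below is a minimal PSO for a hypothetical three-unit system; the cost coefficients, limits, demand and PSO parameters are illustrative assumptions, not data from the book.

```python
# Minimal PSO sketch for economic load dispatch with valve-point loading.
# Unit cost coefficients, limits, and demand are illustrative placeholders.
import numpy as np

# a, b, c: quadratic cost coefficients; e, f: valve-point terms; Pmin, Pmax: limits (MW)
units = [
    dict(a=0.0016, b=7.92, c=561.0, e=300.0, f=0.0315, Pmin=100.0, Pmax=600.0),
    dict(a=0.0048, b=7.97, c=78.0,  e=150.0, f=0.063,  Pmin=50.0,  Pmax=200.0),
    dict(a=0.0019, b=7.85, c=310.0, e=200.0, f=0.042,  Pmin=100.0, Pmax=400.0),
]
demand = 850.0   # MW

def cost(P):
    total = 0.0
    for p, u in zip(P, units):
        total += u["a"] * p**2 + u["b"] * p + u["c"] \
                 + abs(u["e"] * np.sin(u["f"] * (u["Pmin"] - p)))   # valve-point effect
    total += 1000.0 * abs(P.sum() - demand)   # penalty for power-balance violation
    return total

rng = np.random.default_rng(1)
lo = np.array([u["Pmin"] for u in units])
hi = np.array([u["Pmax"] for u in units])
pos = rng.uniform(lo, hi, size=(30, 3))       # 30 particles, one dispatch each
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(300):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, lo, hi)          # keep each unit within its limits
    vals = np.array([cost(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("Dispatch (MW):", np.round(gbest, 1), "Cost:", round(cost(gbest), 1))
```

The penalty term is a simple way to handle the power-balance equality constraint; real implementations typically repair solutions or use adaptive penalties instead.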

384 citations