
Showing papers by "Ranjit Biswas published in 2019"


Journal ArticleDOI
TL;DR: This study uses a software-defined networking (SDN) framework for energy-efficient, optimal routing of data and requests between source and destination, and applies the ‘Twofish’ cryptographic technique to encrypt the information captured by the sensors.
Abstract: The technological advancements in the field of computing are giving rise to the generation of gigantic volumes of data which are beyond the handling capabilities of conventionally available tools…
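The energy-efficient routing described in the TL;DR can be illustrated as a shortest-path computation over per-link energy costs. The sketch below is a generic Dijkstra search, not the paper's actual SDN implementation; the topology and energy values are hypothetical.

```python
import heapq

def min_energy_path(graph, src, dst):
    """Dijkstra over per-link energy costs (hypothetical units per packet).

    graph: {node: [(neighbor, energy_cost), ...]}
    Returns (total_energy, path), or (inf, []) if dst is unreachable.
    """
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            # Reconstruct the route by walking predecessors back to src.
            path = [u]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return d, path[::-1]
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    return float('inf'), []

# Hypothetical sensor topology: edges weighted by transmission energy.
topology = {
    's': [('a', 2.0), ('b', 5.0)],
    'a': [('b', 1.0), ('sink', 6.0)],
    'b': [('sink', 1.0)],
}
energy, route = min_energy_path(topology, 's', 'sink')
```

An SDN controller with a global view of the network could run such a computation centrally and install the resulting forwarding rules on the switches.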

21 citations


Journal ArticleDOI
TL;DR: In this article, a small deformation micromorphic computational homogenization framework for matrix-inclusion composites is presented, where standard continuum models at the micro-scale are translated consistently onto the macro-scale to recover a micromorphic continuum.

13 citations


Journal ArticleDOI
TL;DR: This research provides a new compact clustering technique that prevents data loss by combining the SDES encryption technique to guard against security attacks, a new error control scheme that handles any number of transmission errors, and Huffman compression to control data size and extra security overheads.
Abstract: Analyzing multi-dimensional data quickly is a basic requirement of any clustering mechanism. At the same time, the clustering mechanism should provide a security structure that protects against data loss by preventing various security attacks and transmission errors. A data compression system is also needed to control data size, security overheads, and the data and time overheads incurred while clustering unstructured and uncertain big data. The current literature does not suggest any integrated technique that addresses these issues together. Hence, this research provides a new compact clustering technique that prevents data loss by combining the SDES encryption technique to guard against security attacks, a new error control scheme that handles any number of transmission errors, and Huffman compression to control data size and extra security overheads. The faster execution of the proposed integrated technique also reduces the time overhead. Experimental results show that it offers higher data integrity by producing a lower percentage of information loss and a higher SNR and compression ratio. Its higher throughput and low cyclomatic complexity demonstrate its time efficiency.
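The Huffman compression step mentioned in the abstract can be sketched in a few lines. This is a minimal, generic Huffman encoder for illustration, not the paper's implementation; the sample input is made up.

```python
import heapq
from collections import Counter

def huffman_codes(data: bytes) -> dict:
    """Build a Huffman code table (symbol -> bit string) for `data`."""
    freq = Counter(data)
    if len(freq) == 1:                       # degenerate single-symbol input
        return {next(iter(freq)): '0'}
    # Heap entries: (frequency, tiebreak, node); a node is a leaf symbol
    # (an int, since iterating bytes yields ints) or a (left, right) pair.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, a = heapq.heappop(heap)       # merge the two least frequent
        f2, _, b = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (a, b)))
        count += 1
    codes = {}
    def walk(node, prefix):
        if isinstance(node, tuple):
            walk(node[0], prefix + '0')
            walk(node[1], prefix + '1')
        else:
            codes[node] = prefix
    walk(heap[0][2], '')
    return codes

def compress(data: bytes) -> str:
    codes = huffman_codes(data)
    return ''.join(codes[b] for b in data)

sample = b'aaaabbc'
bits = compress(sample)
ratio = len(bits) / (8 * len(sample))   # compressed bits vs. raw bits
```

Frequent symbols receive shorter codes, which is what drives the compression ratio the abstract reports.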

11 citations


Book ChapterDOI
01 Jan 2019
TL;DR: This paper presents a comprehensive review of the various techniques available for efficiently handling small size files in Hadoop on the basis of certain performance parameters like access time, read/write complexity, scalability, and processing speed.
Abstract: Recent technological advancements in the field of computing have caused a voluminous generation of data which cannot be handled effectively by traditionally available tools, processes, and systems. To handle this big data effectively, new techniques and frameworks have emerged in recent times. Hadoop is a prominent framework for managing huge amounts of data. It provides efficient means for the storage, retrieval, processing, and analytics of big data. Although Hadoop works very well with large files, its performance tends to degrade when it is required to process hundreds or thousands of small files. This paper puts forward the challenges and opportunities that may arise while handling a large number of small files. It also presents a comprehensive review of the various techniques available for efficiently handling small files in Hadoop on the basis of performance parameters like access time, read/write complexity, scalability, and processing speed.
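A common idea underlying the surveyed techniques (e.g. Hadoop archives and sequence-file merging) is packing many small files into one large container with an index, since one large object is cheap for a block-oriented store while thousands of tiny ones are not. A minimal, Hadoop-independent sketch of that idea (file names and layout are illustrative):

```python
def pack(small_files: dict) -> tuple:
    """Pack {name: bytes} into one blob plus an index of (offset, length)."""
    blob, index, offset = bytearray(), {}, 0
    for name, data in small_files.items():
        index[name] = (offset, len(data))   # record where each file lives
        blob.extend(data)
        offset += len(data)
    return bytes(blob), index

def read(blob: bytes, index: dict, name: str) -> bytes:
    """Random access to one packed file via its index entry."""
    off, length = index[name]
    return blob[off:off + length]

files = {'a.log': b'alpha', 'b.log': b'bravo!', 'c.log': b'c'}
blob, index = pack(files)
```

In Hadoop terms, the blob plays the role of the merged large file stored in HDFS blocks, and the index replaces the per-file NameNode metadata that makes many small files expensive.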

9 citations


Book ChapterDOI
12 Apr 2019
TL;DR: In this article, the authors use the Auto Regressive Integrated Moving Average (ARIMA) model, a time-series technique that considers only historical data, to forecast the price of sunflower seed for the period 1st January 2011 to 31st December 2016, gathered from “data.gov.in”.
Abstract: Forecasting agricultural commodity prices plays an important role in a developing country like India, whose population largely depends, directly or indirectly, on farming. Several forecasting techniques exist, such as time-series analysis, regression techniques, and learning techniques. We used the Auto Regressive Integrated Moving Average (ARIMA) model, a time-series technique that considers only historical data, for forecasting. We selected the price of sunflower seed for the period 1st January 2011 to 31st December 2016, gathered from “data.gov.in” for the Kadiri market, Anantpur district, Andhra Pradesh, India. The data from 1st January 2011 to 31st December 2015 was used for training and the data from 1st January 2016 to 31st December 2016 for testing. Based on the training data, ARIMA(1, 1, 2) was selected as the best model. The Mean Average Percentage Error (MAPE) of the selected model is 2.30%, and the observed Root Mean Square Percentage Error (RMSPE) is 3.44%.
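The two error metrics reported in the abstract (MAPE and RMSPE) can be computed as below. This is a generic sketch with made-up actual/forecast series, not the paper's sunflower-seed data.

```python
import math

def mape(actual, forecast):
    """Mean Absolute Percentage Error, in percent."""
    n = len(actual)
    return 100.0 * sum(abs((a - f) / a) for a, f in zip(actual, forecast)) / n

def rmspe(actual, forecast):
    """Root Mean Square Percentage Error, in percent."""
    n = len(actual)
    return 100.0 * math.sqrt(
        sum(((a - f) / a) ** 2 for a, f in zip(actual, forecast)) / n
    )

# Hypothetical daily prices (actual) vs. model forecasts.
actual   = [100.0, 102.0, 98.0, 101.0]
forecast = [ 99.0, 103.0, 99.0, 100.0]
m = mape(actual, forecast)
r = rmspe(actual, forecast)
```

RMSPE squares the percentage errors before averaging, so it penalizes occasional large misses more heavily than MAPE does, which is why the paper's RMSPE (3.44%) exceeds its MAPE (2.30%).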

8 citations


Journal ArticleDOI
TL;DR: It is argued that intuitionistic fuzzy theory is a more appropriate tool than fuzzy theory for soft computing, and a set of necessary eligibility conditions is proposed, at least one of which must be satisfied by any fuzzy decision maker before using ‘Fuzzy Set Theory’ in soft computing.
Abstract: The entire work reported here is a philosophy-based theoretical presentation. It argues that intuitionistic fuzzy theory is a more appropriate tool than fuzzy theory for soft computing. Any association of computing methodologies centered on fuzzy set theory is regarded as one kind of soft computing. Soft computing with fuzzy theory involves fluent use of both μ(x) and its complement 1 − μ(x), i.e. ν(x), for the elements of all the universes of the concerned decision problem. Both μ(x) and ν(x) play fluent roles in solving decision problems in almost all application areas of fuzzy theory. Recent literature reveals that fuzzy set theory may not be an appropriate model for ill-defined, large decision problems. In this work the author unearths a weakness of fuzzy theory in a further dimension, caused not by μ(x) but by the other side of the coin, ν(x). Through a rigorous exercise with a few real-life examples, it is observed that fuzzy set theory is inappropriate not only for large decision problems but also for many real-life decision problems, irrespective of size. Consequently, a set of necessary eligibility conditions is proposed, at least one of which must be satisfied by any fuzzy decision maker before using ‘Fuzzy Set Theory’ in soft computing. The initial content of this paper deals with a basic question: is ‘Fuzzy Theory’ really always good for soft computing? In the Theory of CIFS it is argued that, while solving any decision-making problem, the selection of a suitable soft-computing set theory or crisp theory is made by the concerned decision maker by his own choice and own knowledge, which functions at the outer sphere of the decision maker's cognition system. It is analogous to the case of a computer programmer who solves a problem by writing codes in a…
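The μ/ν distinction discussed above can be made concrete: in an intuitionistic fuzzy set, membership μ(x) and non-membership ν(x) are assigned independently subject to μ(x) + ν(x) ≤ 1, and the slack π(x) = 1 − μ(x) − ν(x) is the hesitation margin. A minimal sketch (the numeric values are illustrative, not from the paper):

```python
class IFSElement:
    """One element of an intuitionistic fuzzy set: independent membership
    (mu) and non-membership (nu) degrees, constrained by mu + nu <= 1."""

    def __init__(self, mu: float, nu: float):
        if not (0.0 <= mu <= 1.0 and 0.0 <= nu <= 1.0 and mu + nu <= 1.0):
            raise ValueError("require 0 <= mu, nu <= 1 and mu + nu <= 1")
        self.mu, self.nu = mu, nu

    @property
    def hesitation(self) -> float:
        """pi = 1 - mu - nu; zero exactly when the element is ordinary fuzzy."""
        return 1.0 - self.mu - self.nu

# An ordinary fuzzy element forces nu = 1 - mu (no hesitation) ...
fuzzy_like = IFSElement(mu=0.7, nu=0.3)
# ... whereas an intuitionistic element may leave part of the judgment open.
hesitant = IFSElement(mu=0.5, nu=0.2)
```

Ordinary fuzzy set theory collapses ν(x) to 1 − μ(x); the extra hesitation degree is what the abstract's argument about ν(x) as "the other side of the coin" turns on.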

6 citations


Journal ArticleDOI
TL;DR: There is a genuine vacuum in the family of existing standard algebraic structures of Abstract Algebra when it comes to providing computational norms to giant subjects like mathematics, physics, and statistics, and this vacuum has so far remained hidden.
Abstract: The important algebraic structures, viz. group, ring, module, field, linear space, algebra over a field, associative algebra over a field, and division algebra, have made the subject ‘Abstract Algebra’ very rich and well equipped to deal with algebraic computations from the elementary level to higher mathematics. This work unearths that these algebraic structures are not sufficient (i.e. not capable in most cases) to support mathematics and the various branches of science and engineering: there are many computational problems and issues in mathematics which do not fall under the jurisdiction of any of these structures. It is thus observed that there is a genuine vacuum in the family of existing standard algebraic structures of Abstract Algebra when it comes to providing computational norms to giant subjects like mathematics, physics, and statistics, and this vacuum has so far remained hidden. Consequently, it is argued that this family (group, ring, module, field, linear space, algebra over a field, associative algebra over a field, division algebra, etc.) needs a new member capable of taking responsibility for all the types of computation practiced in mathematics, the sciences, and engineering studies, unlike the limited capabilities of the existing standard structures. An algebraist can, of course, introduce any number of new algebraic structures, but the question is whether doing so is necessary: a new algebraic structure should not be redundant, merely adding to the existing huge volume of algebra literature.
It must have some unique and advanced role in the mathematical computing of daily practice (at school, college, and research levels) which none of the existing…

5 citations


Journal ArticleDOI
TL;DR: In this paper a new type of number, called a compound number, is discovered as a generalization of the complex numbers, and a new theory entitled “Theory of Compound Numbers” is introduced into Number Theory, in which one can talk about various operations over the objects of a set, a new concept of infinity, the zero object, signed (positive and negative) objects, and categorizations such as prime objects, composite objects, and objects that are neither prime nor composite.
Abstract: In this paper a new type of number, called a ‘compound number’, is discovered as a generalization of the complex numbers. A new theory entitled “Theory of Objects” and the corresponding “Object Algebra” are first introduced, in which one can talk about various operations over the objects of a set, a new concept of ‘infinity’, the zero object, signed objects (positive and negative objects), and categorizations such as prime objects, composite objects, objects that are neither prime nor composite, etc. The traditional notion of ‘numbers’ used in classical arithmetic is a particular case of an ‘object’ of a ‘region’ algebraic structure. The introduction of the imaginary number i into the classical ‘Theory of Numbers’ is very interesting, but its history shows that it took a long span of years to convince mathematicians of its role and necessity in mathematics. Since then i has played extraordinary roles in various branches of science, engineering, and many other broad fields. In the newly proposed “Theory of Objects”, the notion of an ‘imaginary object’ is introduced for a region, of which the existing concept of an ‘imaginary number’ is a particular instance. It is unearthed that the region C (the set of complex numbers) also has imaginary objects. The adjective ‘imaginary’ is entirely local to the concerned region: something may be an imaginary object for a region A but a core internal member of another region B. For example, i is an imaginary object for the region R but a core internal member of the region C. In the “Theory of Objects” developed in this paper, two imaginary objects e and w are unearthed for the region C. These imaginary objects e and w of C are called ‘compound numbers’, and consequently a new number theory entitled ‘Theory of Compound Numbers’ is introduced into the subject Number Theory. In fact it is…

5 citations