About: Infosys is a technology company headquartered in Bengaluru, India. It is known for its research contributions in the topics: Cloud computing & Business process. The organization has 1880 authors who have published 1840 publications receiving 20595 citations. The organization is also known as: Infy & Infosys Limited.
Papers published on a yearly basis
01 Jan 2012
Eindhoven University of Technology, Queensland University of Technology, Capgemini, University of Rome Tor Vergata, Humboldt University of Berlin, Software AG, University of Padua, Polytechnic University of Catalonia, Hewlett-Packard, Ghent University, New Mexico State University, IBM, University of Milan, University of Tartu, University of Vienna, Technical University of Lisbon, Telecom SudParis, Rabobank, Infosys, University of Calabria, Fujitsu, Pennsylvania State University, University of Bari, University of Bologna, Vienna University of Economics and Business, Free University of Bozen-Bolzano, Stevens Institute of Technology, Indian Council of Agricultural Research, Pontifical Catholic University of Chile, University of Haifa, Ulsan National Institute of Science and Technology, Cranfield University, Katholieke Universiteit Leuven, Deloitte, Tsinghua University, University of Innsbruck, Hasso Plattner Institute
TL;DR: This manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users to increase the maturity of process mining as a new tool to improve the design, control, and support of operational business processes.
Abstract: Process mining techniques are able to extract knowledge from event logs commonly available in today’s information systems. These techniques provide new means to discover, monitor, and improve processes in a variety of application domains. There are two main drivers for the growing interest in process mining. On the one hand, more and more events are being recorded, thus providing detailed information about the history of processes. On the other hand, there is a need to improve and support business processes in competitive and rapidly changing environments. This manifesto was created by the IEEE Task Force on Process Mining and aims to promote the topic of process mining. Moreover, by defining a set of guiding principles and listing important challenges, this manifesto hopes to serve as a guide for software developers, scientists, consultants, business managers, and end-users. The goal is to increase the maturity of process mining as a new tool to improve the (re)design, control, and support of operational business processes.
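The discovery step described above can be illustrated with a directly-follows graph, a standard building block of process-mining algorithms. This is a generic sketch, not a method prescribed by the manifesto, and the event log below is a hypothetical example:

```python
from collections import Counter

def directly_follows(log):
    """Count how often activity a is directly followed by activity b
    across all traces (cases) in the event log."""
    df = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            df[(a, b)] += 1
    return df

# Hypothetical event log: each trace is the ordered list of activities of one case.
log = [
    ["register", "check", "approve", "notify"],
    ["register", "check", "reject", "notify"],
    ["register", "check", "approve", "notify"],
]

print(directly_follows(log)[("register", "check")])  # 3
print(directly_follows(log)[("check", "approve")])   # 2
```

The resulting counts form the edges of a directly-follows graph, from which discovery algorithms derive a process model.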
09 May 2005
TL;DR: This work identifies two families of resource allocation algorithms: task-based algorithms that greedily allocate tasks to resources, and workflow-based algorithms that search for an efficient allocation for the entire workflow.
Abstract: Grid applications require allocating a large number of heterogeneous tasks to distributed resources. A good allocation is critical for efficient execution. However, many existing grid toolkits use matchmaking strategies that do not consider overall efficiency for the set of tasks to be run. We identify two families of resource allocation algorithms: task-based algorithms, which greedily allocate tasks to resources, and workflow-based algorithms, which search for an efficient allocation for the entire workflow. We compare the behavior of workflow-based algorithms and task-based algorithms, using simulations of workflows drawn from a real application and with varying ratios of computation cost to data transfer cost. We observe that workflow-based approaches have the potential to work better for data-intensive applications even when estimates about future tasks are inaccurate.
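The task-based family can be sketched as a greedy list scheduler that sends each task, in order, to the resource that would finish it earliest. This is a minimal illustration under assumed task costs and resource speeds, not the paper's actual algorithms:

```python
def greedy_allocate(tasks, resources):
    """Task-based greedy allocation: each task (in the given order) is
    assigned to the resource with the earliest finish time for that task.

    tasks:     list of (name, work) pairs
    resources: dict mapping resource name -> speed (work units per second)
    """
    finish = {r: 0.0 for r in resources}   # current ready time of each resource
    plan = {}
    for task, work in tasks:
        # Pick the resource that would complete this task soonest.
        best = min(resources, key=lambda r: finish[r] + work / resources[r])
        finish[best] += work / resources[best]
        plan[task] = best
    return plan, max(finish.values())      # allocation and overall makespan

# Hypothetical workload: r2 is twice as fast as r1.
tasks = [("t1", 4.0), ("t2", 2.0), ("t3", 6.0)]
resources = {"r1": 1.0, "r2": 2.0}
plan, makespan = greedy_allocate(tasks, resources)
print(plan, makespan)  # {'t1': 'r2', 't2': 'r1', 't3': 'r2'} 5.0
```

Note that this greedy scheme ignores data-transfer costs and future tasks, which is exactly the limitation that motivates the workflow-based family in the abstract.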
TL;DR: Proposes a new architecture for the implementation of IoT to store and process scalable sensor data (big data) for health care applications, using a MapReduce-based prediction model to predict heart disease.
Abstract: Wearable medical devices with sensors continuously generate enormous volumes of data, often called big data, mixed with structured and unstructured data. Due to the complexity of this data, it is difficult to process and analyze it to find valuable information for decision-making. At the same time, data security is a key requirement in a healthcare big data system. To address these issues, this paper proposes a new architecture for the implementation of IoT to store and process scalable sensor data (big data) for health care applications. The proposed architecture consists of two main sub-architectures, namely Meta Fog-Redirection (MF-R) and Grouping and Choosing (GC). The MF-R architecture uses big data technologies such as Apache Pig and Apache HBase for the collection and storage of the sensor data (big data) generated by different sensor devices. The proposed GC architecture secures the integration of fog computing with cloud computing. It also uses a key management service and a data categorization function (Sensitive, Critical, and Normal) to provide security services. The framework also uses a MapReduce-based prediction model to predict heart disease. Performance evaluation parameters such as throughput, sensitivity, accuracy, and F-measure are calculated to demonstrate the efficiency of the proposed architecture as well as the prediction model.
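The MapReduce pattern the framework relies on can be illustrated with a toy map and reduce pass over hypothetical heart-rate readings. The record format and the 100 bpm threshold are illustrative assumptions, not the paper's prediction model:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit a (category, 1) pair for each (patient, heart_rate) reading."""
    for patient, heart_rate in records:
        yield ("high" if heart_rate > 100 else "normal", 1)

def reduce_phase(pairs):
    """Reduce: sum the counts for each category key."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

# Hypothetical sensor readings: (patient id, heart rate in bpm).
records = [("p1", 120), ("p2", 80), ("p3", 110), ("p4", 95)]
print(reduce_phase(map_phase(records)))  # {'high': 2, 'normal': 2}
```

In a real deployment the map and reduce functions would run in parallel across the cluster (e.g. via Hadoop), with the framework handling the shuffle of keyed pairs between the two phases.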
TL;DR: This special issue presents five articles covering various aspects of global software development, including knowledge management strategies, distributed software development, requirements engineering, distributed requirements, and managing offshore collaboration; a Point/Counterpoint department discusses whether global software development is indeed a business necessity.
Abstract: Global software development efforts have increased in recent years, and such development seems to have become a business necessity for various reasons, including cost, availability of resources, and the need to locate development closer to customers. However, there's still much to learn about global software development before the discipline becomes mature. This special issue aims to assess the gap between the state of the art and the state of the practice. It presents five articles that cover various aspects of global software development, including knowledge management strategies, distributed software development, requirements engineering, distributed requirements, and managing offshore collaboration. A Point/Counterpoint department discusses whether global software development is indeed a business necessity. This article is part of a special issue on Global Software Development.
TL;DR: Case studies from an Indian IT-services firm provide insights into the root causes of RE phase conflicts in client-vendor offshore-outsourcing relationships.
Abstract: With outsourcing on the rise, every relationship between an outsourcer and a vendor calls for collaboration between multiple organizations across multiple locations. As part of a global IT-services organization with high process maturity, we have had many opportunities to understand the requirements engineering life cycle related to global software development. RE is a software project's most critical phase; the RE phase's success is essential for the project's success. Case studies from an Indian IT-services firm provide insights into the root causes of RE phase conflicts in client-vendor offshore-outsourcing relationships.
| Avinash C. Kak | 51 | 254 | 25027 |
| Sanjoy Kumar Paul | 30 | 160 | 2847 |
Related Institutions (5)
86.9K papers, 4.1M citations
5.5K papers, 483.1K citations
Singapore Management University: 8.3K papers, 239.6K citations
Information Technology University: 13K papers, 236.4K citations
Carnegie Mellon University: 104.3K papers, 5.9M citations