
Showing papers in "International Journal of Cooperative Information Systems in 2017"


Journal ArticleDOI
TL;DR: A conformity particle swarm optimization with fireworks explosion operation (CPSO-FEO) is presented to solve large-scale HW/SW partitioning, and an improved FEO with a new initialization strategy is combined with it to enhance search accuracy and solution quality.
Abstract: In the co-design process of hardware/software (HW/SW) system, especially for large and complicated embedded systems, HW/SW partitioning is a challenging step. Among different heuristic approaches, particle swarm optimization (PSO) has the advantages of simple implementation and computational efficiency, which is suitable for solving large-scale problems. This paper presents a conformity particle swarm optimization with fireworks explosion operation (CPSO-FEO) to solve large-scale HW/SW partitioning. First, the proposed CPSO algorithm simulates the conformist mentality from biology research. The CPSO particles with psychological conformist always try to move toward a secure point and avoid being attacked by natural enemy. In this way, there is a greater possibility to increase population diversity and avoid local optimum in CPSO. Next, to enhance the search accuracy and solution quality, an improved FEO with new initialization strategy is presented and is combined with CPSO algorithm to search a better pos...

71 citations
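
To make the abstract above concrete, here is a minimal, self-contained sketch of a binary PSO for HW/SW partitioning combined with a fireworks-style explosion around the best partition found so far. It is an illustration only, not the authors' CPSO-FEO: the cost model, conformity probability and spark parameters are invented placeholders.

```python
# Illustrative sketch (not the authors' CPSO-FEO): binary PSO for HW/SW
# partitioning plus a fireworks-style "explosion" that scatters sparks
# around the best partition. All parameters and costs are placeholders.
import random

N_TASKS, N_PARTICLES, N_ITER = 20, 30, 100
hw_cost = [random.uniform(1, 5) for _ in range(N_TASKS)]   # assumed per-task HW area cost
sw_time = [random.uniform(1, 5) for _ in range(N_TASKS)]   # assumed per-task SW execution time

def fitness(part):
    """Lower is better: SW time of tasks left in software plus a penalty
    if an (assumed) hardware area budget is exceeded."""
    area = sum(h for h, p in zip(hw_cost, part) if p == 1)
    time = sum(s for s, p in zip(sw_time, part) if p == 0)
    return time + max(0.0, area - 0.5 * sum(hw_cost)) * 10.0

def explode(best, n_sparks=10, flips=3):
    """Fireworks-style local search: sparks are small random mutations of the best."""
    sparks = []
    for _ in range(n_sparks):
        s = best[:]
        for i in random.sample(range(N_TASKS), flips):
            s[i] ^= 1
        sparks.append(s)
    return sparks

swarm = [[random.randint(0, 1) for _ in range(N_TASKS)] for _ in range(N_PARTICLES)]
gbest = min(swarm, key=fitness)

for _ in range(N_ITER):
    for k, particle in enumerate(swarm):
        # "Conformity": each bit moves toward the global best with some probability.
        swarm[k] = [g if random.random() < 0.3 else b for b, g in zip(particle, gbest)]
        if fitness(swarm[k]) < fitness(gbest):
            gbest = swarm[k][:]
    # Fireworks explosion around the current best to refine the solution.
    for spark in explode(gbest):
        if fitness(spark) < fitness(gbest):
            gbest = spark
print("best partition:", gbest, "cost:", round(fitness(gbest), 2))
```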


Journal ArticleDOI
TL;DR: Enterprise architecture (EA) is a description of an enterprise from an integrated business and IT perspective intended to improve business and IT alignment, and is used in the majority of large companies.
Abstract: Enterprise architecture (EA) is a description of an enterprise from an integrated business and IT perspective intended to improve business and IT alignment, and is used in the majority of large com...

37 citations


Journal ArticleDOI
TL;DR: The correlation miner is introduced: a technique that facilitates discovery of business process models when events are not associated with a case identifier, not only enabling discovery of the process model but also detecting which events belong to the same case.
Abstract: Process discovery algorithms aim to capture process models from event logs. These algorithms have been designed for logs in which the events that belong to the same case are related to each other — and to that case — by means of a unique case identifier. However, in service-oriented systems, these case identifiers are rarely stored beyond request-response pairs, which makes it hard to relate events that belong to the same case. This is known as the correlation challenge. This paper addresses the correlation challenge by introducing a technique, called the correlation miner, that facilitates discovery of business process models when events are not associated with a case identifier. It extends previous work on the correlation miner, by not only enabling the discovery of the process model, but also detecting which events belong to the same case. Experiments performed on both synthetic and real-world event logs show the applicability of the correlation miner. The resulting technique enables us to observe a service-oriented system and determine — with high accuracy — which request-response pairs sent by different communicating parties are related to each other.

30 citations
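
The following toy sketch illustrates the general idea of correlating events without case identifiers by scoring request/response pairs on shared attribute values and temporal proximity. It is a simplified stand-in, not the correlation miner itself; the event fields and the scoring rule are assumptions.

```python
# Toy heuristic for correlating request/response events without case IDs.
requests = [
    {"id": "r1", "time": 1.0, "attrs": {"order": "A17"}},
    {"id": "r2", "time": 2.0, "attrs": {"order": "B42"}},
]
responses = [
    {"id": "s1", "time": 1.4, "attrs": {"order": "A17", "status": "ok"}},
    {"id": "s2", "time": 2.6, "attrs": {"order": "B42", "status": "ok"}},
]

def score(req, resp, max_gap=5.0):
    """Higher score = more likely the same case: shared attribute values
    plus a bonus for temporal proximity (response must follow request)."""
    if resp["time"] < req["time"] or resp["time"] - req["time"] > max_gap:
        return -1.0
    shared = sum(1 for k, v in req["attrs"].items() if resp["attrs"].get(k) == v)
    return shared + 1.0 / (1.0 + resp["time"] - req["time"])

# Greedily pair each response with the best-scoring request.
matched = {}
for resp in responses:
    best = max(requests, key=lambda req: score(req, resp))
    if score(best, resp) > 0:
        matched[resp["id"]] = best["id"]
print(matched)   # expected: {'s1': 'r1', 's2': 'r2'}
```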


Journal ArticleDOI
TL;DR: Identifier lexicon may have a direct impact on software understandability and reusability and, thus, on the quality of the final software product.
Abstract: Identifier lexicon may have a direct impact on software understandability and reusability and, thus, on the quality of the final software product. Understandability and reusability are two importan...

19 citations


Journal ArticleDOI
TL;DR: A comparative study of different lightweight authentication methods is presented, but the main focus of this work is on the different implementations of lightweight authentication in the vertical handoff.
Abstract: Lightweight authentication is one of the solutions proposed in order to reduce the time required for authentication during vertical handoff across heterogeneous networks. Reducing the handoff latency is considered to be a challenging issue. It arises when a user requires maintaining its service continuity while traveling across heterogeneous networks. For example, a mobile user may change access networks while being engaged in different scenarios, such as browsing the Internet, using real-time applications or collaborating in cooperative information systems. Delay in the vertical handoff creates many problems, e.g. packet loss, service interruption and security problems. Fast and lightweight authentication schemes are always attractive in such application domains because of many benefits, for instance seamless and efficient handoff, service continuity, guaranteed quality of service (QoS) and suitability for real-time applications while maintaining security. Various techniques have been proposed in this domain to reduce authentication delay. However, these methods do not fully address all the issues in the problem domain; for example, these methods have deficiencies in terms of security, monetary cost, signaling cost and packet latency. In this paper, a comparative study of different lightweight authentication methods is presented, with a focus on their different implementations in the vertical handoff. An overview of major problems and their solutions is presented along with their strengths and limitations. Different emerging research areas are also presented in the domain of lightweight authentication in the vertical handoff.

15 citations


PatentDOI
TL;DR: In this paper, a computer-implemented method according to one embodiment includes identifying a set of virtual machines to be placed within a system, receiving characteristics associated with the set of virtual machines, determining characteristics associated with a current state of the system, determining a placement of the virtual machines within the system, and determining an updated placement of all virtual machines.
Abstract: A computer-implemented method according to one embodiment includes identifying a set of virtual machines to be placed within a system, receiving characteristics associated with the set of virtual machines, determining characteristics associated with a current state of the system, determining a placement of the set of virtual machines within the system, based on the characteristics associated with the set of virtual machines and the characteristics associated with a current state of the system, determining an updated placement of all virtual machines currently placed within the system, based on the characteristics associated with the set of virtual machines and the characteristics associated with a current state of the system, and determining a migration sequence within the system in order to implement the updated placement of all virtual machines currently placed within the system.

14 citations
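
A minimal sketch of the kind of characteristics-driven placement the abstract describes, using a simple best-fit heuristic. The host/VM attributes and the fit criterion are assumptions for illustration, not the patented method.

```python
# Hedged sketch: best-fit placement of VMs onto hosts by requested vs.
# available capacity. Fields and the fit rule are assumptions.
vms = [{"name": "vm1", "cpu": 2, "mem": 4}, {"name": "vm2", "cpu": 4, "mem": 8}]
hosts = [{"name": "h1", "cpu": 8, "mem": 16, "vms": []},
         {"name": "h2", "cpu": 4, "mem": 8, "vms": []}]

def place(vm, hosts):
    """Pick the host with the least remaining slack that still fits (best fit)."""
    fitting = [h for h in hosts if h["cpu"] >= vm["cpu"] and h["mem"] >= vm["mem"]]
    if not fitting:
        return None
    best = min(fitting, key=lambda h: (h["cpu"] - vm["cpu"]) + (h["mem"] - vm["mem"]))
    best["cpu"] -= vm["cpu"]; best["mem"] -= vm["mem"]; best["vms"].append(vm["name"])
    return best["name"]

current = {vm["name"]: place(vm, hosts) for vm in vms}
print("placement:", current)
# A migration sequence would then be derived by diffing this mapping against
# an updated placement recomputed after the system state changes.
```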


Journal ArticleDOI
TL;DR: The trust notation can provide a creditable basis for access control decision-making for the resource pooling, dynamic, and multi-tenant cloud environment.
Abstract: Providing a creditable basis for access control decision-making is not an easy task for the resource pooling, dynamic, and multi-tenant cloud environment. The trust notation can provide this credit...

13 citations


Journal ArticleDOI
TL;DR: This paper synthesizes a generic conceptual model of EAM, providing a more realistic conceptualization of EAM that describes it as a decentralized network of independent but interacting processes, artifacts and actors.
Abstract: Enterprise architecture (EA) is a description of an enterprise from an integrated business and IT perspective. Enterprise architecture management (EAM) is a management practice embracing all the management processes related to EA aiming to improve business and IT alignment. EAM is typically described as a sequential four-step process: (i) document the current state, (ii) describe the desired future state, (iii) develop the transition plan and (iv) implement the plan. This traditional four-step approach to EAM essentially defines the modern understanding of EA. Based on a literature review, this paper demonstrates that this four-step approach to EAM, though practiced by some companies, is inadequate as a model explaining the EAM phenomenon in general. As a substitute, this paper synthesizes the generic conceptual model of EAM providing a more realistic conceptualization of EAM describing it as a decentralized network of independent but interacting processes, artifacts and actors.

12 citations


Journal ArticleDOI
TL;DR: This paper analyzes how IT service quality has been defined and managed over time, discusses how to manage SLAs in today’s multi-layer, multi-sourced Cloud environments, and considers what to expect going forward.
Abstract: Cloud providers offer services at different levels of abstraction from infrastructure to applications. The quality of Cloud services is a key determinant of the overall service level a provider offers to its customers. Service Level Agreements (SLAs) are (1) crucial for Cloud customers to ensure that promised levels of services are met, (2) an important sales instrument and (3) a differentiating factor for providers. Cloud providers and services are often selected more dynamically than in traditional IT services, and as a result, SLAs need to be set up and their monitoring implemented to match the same speed. In this context, managing SLAs is complex: different Cloud providers expose different management interfaces and SLA metrics differ from one provider to another. In this paper, we analyze how IT service quality has been defined and managed over time, discuss how to manage SLAs in today’s multi-layer, multi-sourced Cloud environments, and consider what to expect going forward. A particular focus is placed on the rSLA framework, which enables fast setup of SLA monitoring in dynamic and heterogeneous Cloud environments. The rSLA framework is made up of three main components: the rSLA language to formally represent SLAs, the rSLA Service, which interprets the SLAs and implements the behavior specified in them, and a set of Xlets: lightweight, dynamically bound adapters to monitoring and controlling interfaces. rSLA has been tested in the context of a real pilot and found to reduce the client on-boarding process from months to weeks.

7 citations
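
The snippet below sketches, in plain Python, the core idea of evaluating formally specified service-level objectives against metrics pulled from provider-specific adapters. It is not the rSLA language or API; the metric names, thresholds and the collect() stub are assumptions.

```python
# Toy SLA check standing in for the idea of interpreting declared SLAs
# against metrics gathered through provider-specific adapters.
sla = {
    "availability": {"target": 99.9, "comparator": ">="},    # percent, assumed
    "response_time_ms": {"target": 200, "comparator": "<="},
}

def collect():
    """Stub for an adapter (an 'Xlet' in rSLA terms) that would query a
    provider-specific monitoring interface."""
    return {"availability": 99.95, "response_time_ms": 180}

def evaluate(sla, metrics):
    ops = {">=": lambda a, b: a >= b, "<=": lambda a, b: a <= b}
    return {name: ops[rule["comparator"]](metrics[name], rule["target"])
            for name, rule in sla.items()}

print(evaluate(sla, collect()))   # {'availability': True, 'response_time_ms': True}
```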


Journal ArticleDOI
TL;DR: The scalability of the cloud infrastructure is essential to perform large-scale data processing using MapReduce programming model by automatically provisioning and de-provisioning the resources on the cloud.
Abstract: The scalability of the cloud infrastructure is essential to perform large-scale data processing using MapReduce programming model by automatically provisioning and de-provisioning the resources on ...

7 citations


Journal ArticleDOI
TL;DR: The Nexus metadata model provides a unique level of flexibility for on-the-fly data integration in a loosely coupled federation of autonomous data providers, thereby advancing the status quo in terms of flexibility and expressive power.
Abstract: On-the-fly data integration, i.e. at query time, happens mostly in tightly coupled, homogeneous environments where the partitioning of the data can be controlled or is known in advance. During the process of data fusion, the information is homogenized and data inconsistencies are hidden from the application. Beyond this, we propose in this paper the Nexus metadata model and a processing approach that support on-the-fly data integration in a loosely coupled federation of autonomous data providers, thereby advancing the status quo in terms of flexibility and expressive power. It is able to represent data and schema inconsistencies like multi-valued attributes and multi-typed objects. In an open environment, this best suits the application needs where the data processing infrastructure is not able to decide which attribute value is correct. The Nexus metadata model provides the foundation for integration schemata that are specific to a given application domain. The corresponding processing model provides four complementary query semantics in order to account for the subtleties of multi-valued and missing attributes. In this paper we show that these query semantics are sound, easy to implement, and build upon existing query processing techniques. Thus the Nexus metadata model provides a unique level of flexibility for on-the-fly data integration.
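
A small illustration of how multi-valued attributes and multi-typed objects from conflicting providers might be represented and queried. The dictionary layout and the "any value matches" semantics shown here are assumptions, standing in for the Nexus model's four query semantics.

```python
# Sketch only: per-provider attribute values are kept side by side instead of
# being fused, so inconsistencies remain visible to the query layer.
hotel = {
    "type": {"ProviderA": "Hotel", "ProviderB": "Restaurant"},           # multi-typed object
    "name": {"ProviderA": "Blue Lion", "ProviderB": "Blue Lion Inn"},    # conflicting values kept
    "stars": {"ProviderA": 4},                                           # missing at ProviderB
}

def matches_any(obj, attr, predicate):
    """One possible semantics: the object qualifies if ANY reported value
    for the attribute satisfies the predicate (missing attribute -> False)."""
    return any(predicate(v) for v in obj.get(attr, {}).values())

print(matches_any(hotel, "name", lambda v: "Lion" in v))   # True
print(matches_any(hotel, "stars", lambda v: v >= 5))       # False
```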

Journal ArticleDOI
TL;DR: In the process of recommending review experts to projects, in order to effectively make use of the relevance among topics and the relationship among experts, a new method is proposed for review exp...
Abstract: In the process of recommending review experts to projects, in order to effectively make use of the relevance among topics and the relationship among experts, a new method is proposed for review exp...

Journal ArticleDOI
TL;DR: This paper presents a comprehensive survey of hardware attack mitigation techniques, which are matched to the hardware attacks and attack criteria they can counter, which helps security personnel choose appropriate mitigation techniques to protect their systems against hardware attacks.
Abstract: The goal of a hardware attack is to physically access a digital system to obtain secret information or modify the system behavior. These attacks can be classified as covert or overt based on the awareness of the attack. Each hardware attack has capabilities as well as objectives. Some employ hardware trojans, which are inserted during manufacture, while others monitor system emissions. Once a hardware attack has been identified, mitigation techniques should be employed to protect the system. There is now a wide variety of techniques that can be used against hardware attacks. In this paper, a comprehensive survey of hardware attack mitigation techniques is presented. These techniques are matched to the hardware attacks and attack criteria they can counter, which helps security personnel choose appropriate mitigation techniques to protect their systems against hardware attacks. An example is presented to illustrate the choice of appropriate countermeasures.
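
The kind of attack-to-countermeasure lookup such a survey enables can be pictured as a simple mapping. The entries below are generic, commonly cited examples rather than the paper's actual matrix.

```python
# Illustrative mapping from attack class to candidate mitigations; the
# entries are generic examples, not the survey's matching table.
mitigations = {
    "hardware trojan":     ["logic testing", "side-channel fingerprinting", "split manufacturing"],
    "emission monitoring": ["shielding", "noise injection", "constant-time logic"],
    "physical tampering":  ["tamper-evident packaging", "sensors that zeroize keys"],
}

def countermeasures(attack):
    return mitigations.get(attack, ["no entry; consult the full survey"])

print(countermeasures("hardware trojan"))
```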

Journal ArticleDOI
Zhiqiang Zhang, Xiaoyan Wei, Xiaoqin Xie, Haiwei Pan, Yu Miao
TL;DR: An efficient optimization approach was proposed that can anticipate the tuples most likely to become Top-k result based on dominant relationship analysis, greatly reducing the amount of data in query processing.
Abstract: Uncertain data is inherent in various important applications, and Top-k query on uncertain data is an important query type for many applications. To tackle the performance issue of evaluating Top-k query on uncertain data, an efficient optimization approach was proposed in this paper. This method can anticipate the tuples most likely to become Top-k result based on dominant relationship analysis, greatly reducing the amount of data in query processing. When the database is updated, this method could determine whether the change affects the current query result, and help us to avoid unnecessary re-query. The experimental results prove the feasibility and effectiveness of this method.
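
A minimal sketch of dominance-based pruning for Top-k candidates, in the spirit of the approach described above: a tuple dominated by at least k others can never appear in the Top-k result and is skipped. The two-dimensional score/probability model is a simplification of the paper's setting.

```python
# Dominance-based pruning sketch for Top-k candidates over uncertain tuples.
tuples = [
    {"id": 1, "score": 0.9, "prob": 0.8},
    {"id": 2, "score": 0.7, "prob": 0.9},
    {"id": 3, "score": 0.4, "prob": 0.3},
    {"id": 4, "score": 0.8, "prob": 0.5},
]

def dominates(a, b):
    """a dominates b if it is at least as good on both dimensions and strictly better on one."""
    return (a["score"] >= b["score"] and a["prob"] >= b["prob"]
            and (a["score"] > b["score"] or a["prob"] > b["prob"]))

def candidates(tuples, k):
    # Keep only tuples dominated by fewer than k others; the rest cannot reach the Top-k.
    return [t for t in tuples
            if sum(dominates(o, t) for o in tuples if o is not t) < k]

print([t["id"] for t in candidates(tuples, k=2)])   # -> [1, 2, 4]; tuple 3 is pruned
```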

Journal ArticleDOI
TL;DR: The paper makes use of a utility-driven approach to solve the interaction among the private cloud user, the hybrid cloud service provider and the public cloud provider in a hybrid cloud environment.
Abstract: The paper presents a hybrid cloud service provisioning and selection optimization scheme, and proposes a hybrid cloud model which consists of hybrid cloud users, a private cloud and a public cloud. This scheme aims to effectively provide cloud services and allocate cloud resources, such that the system utility can be maximized subject to public cloud resource constraints and hybrid cloud user constraints. The paper makes use of a utility-driven approach to solve the interaction among the private cloud user, the hybrid cloud service provider and the public cloud provider in a hybrid cloud environment. The paper presents a hybrid cloud service provisioning and selection algorithm for the hybrid cloud. The hybrid cloud market consists of a hybrid cloud user agent, a hybrid cloud service agent and a hybrid cloud agent, which represent the interests of the different roles. Experiments are designed to compare the performance of the proposed algorithm with other related work.
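
A toy, greedy illustration of utility-driven provisioning under capacity constraints. The utilities, demands and the private-before-public rule are invented for the example and do not reproduce the paper's scheme.

```python
# Greedy, utility-driven allocation between private and public clouds under
# a capacity constraint; all numbers are made up for illustration.
requests = [{"user": "u1", "demand": 4, "utility": 10},
            {"user": "u2", "demand": 6, "utility": 9},
            {"user": "u3", "demand": 3, "utility": 8}]
private_capacity, public_capacity = 5, 6

allocation, utility = [], 0
# Serve high utility-per-unit-demand requests first, private cloud before public.
for r in sorted(requests, key=lambda r: r["utility"] / r["demand"], reverse=True):
    if r["demand"] <= private_capacity:
        private_capacity -= r["demand"]; allocation.append((r["user"], "private"))
    elif r["demand"] <= public_capacity:
        public_capacity -= r["demand"]; allocation.append((r["user"], "public"))
    else:
        continue   # request cannot be served within remaining capacity
    utility += r["utility"]
print(allocation, "total utility:", utility)
# -> [('u3', 'private'), ('u1', 'public')] total utility: 18
```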


Journal ArticleDOI
TL;DR: The goal of this paper is to present a new enhanced consensus algorithm that is built on the architecture of the Mostefaoui-Raynal (MR) consensus algorithm and integrates new features and some known techniques in order to enhance the performance of consensus in situations where process crashes are present in the system.
Abstract: The consensus problem has become a key issue in the field of collaborative telemedicine systems because of the need to guarantee the consistency of shared data. In this paper, we focus on the performance of consensus algorithms. First, we studied, in the literature, the most well-known algorithms in the domain. Experiments on these algorithms allowed us to propose a new algorithm that enhances the performance of consensus in different situations. In 2014, we presented our initial thoughts on enhancing the performance of consensus algorithms, but the proposed solution gave very moderate results. The goal of this paper is to present a new enhanced consensus algorithm, named Fouad, Lionel and J.-Christophe (FLC). This new algorithm was built on the architecture of the Mostefaoui-Raynal (MR) consensus algorithm and integrates new features and some known techniques in order to enhance the performance of consensus in situations where process crashes are present in the system. The results from our experiments running on the simulation platform Neko show that the FLC algorithm gives the best performance when using a multicast network model in different scenarios: in the first scenario, where there are no process crashes or wrong suspicions, and in the second one, where multiple simultaneous process crashes take place in the system.

Journal ArticleDOI
TL;DR: In the proposed method, a goal model is used to estimate the operational costs of business processes, and goal scenarios in the goal model of the desired information system are applied as a basis for estimating the design cost.
Abstract: In this paper, a novel business process engineering method based on quality assessment is proposed. In the proposed method, a goal model is used to estimate the operational costs of business processes. Goal scenarios in the goal model of the desired information system are applied as a basis for estimating the design cost. The quality of business requirements models and business process models is also estimated. Based on the quality metrics, the process of business process modeling is examined. Then, using the XOR operator in the goal model, a simple and direct mapping of the goal model to the business process model is introduced. Common activities in the business process model are further factored and summarized using pre- and post-factoring operations. The proposed business process modeling method is language-independent. An ICT office in Mazandaran Power Distribution Company is used as a case study to exemplify QABPEM. Our evaluation results demonstrate the capability of the proposed method compared with the ...

Journal ArticleDOI
TL;DR: “Sexual abuse of children is inexcusable, so why is there such a fuss about a state intervention? Should the authors shut up and do nothing just because there is racism? No child or woman must be molested, irrespective of who the perpetrator is!”
Abstract: “Sexual abuse of children is inexcusable. So why is there such a fuss about a state intervention? Should we shut up and do nothing just because there is racism? No child or woman must be molested, irrespective of who the perpetrator is!” Thus my recollection of what one of my Scottish colleagues said in an informal conversation about the 2007 Northern Territory Intervention, a set of legal and political measures intended to curtail domestic violence in Indigenous Australian communities. “Yes”, I replied, “race should not be an issue when talking about crime”. Not least because domestic violence happens everywhere, including Scotland. I would not have heard anyone talking about a specifically Scottish, White or European propensity for domestic violence. Yet there is abundant talk about Black violence. Generalisation is the hallmark of racialisation. Blackness is scripted as inherently violent—a tenacious trope deriving from colonial concepts of ferocious animalism (e.g. Eze 2000; Nederveen-Pieterse 1990).

Journal ArticleDOI
TL;DR: This paper shows that Tsai et al.’s scheme is vulnerable to password guessing attack and has computational overhead, and proposes an enhanced password authentication scheme.
Abstract: Password authentication with smart card is one of the simplest and most efficient authentication mechanisms to ensure secure communication over insecure network environments. Recently, Tsai et al. proposed an improved password authentication scheme for smart card. Their scheme is more secure than the other previous schemes. In this paper, we show Tsai et al.’s scheme is vulnerable to password guessing attack and has computational overhead. Furthermore, we propose an enhanced password authentication scheme to eliminate the security vulnerability and reduce the overhead. By presenting concrete analysis of security and performance, we show that the proposed scheme can not only resist various well-known attacks, but is also more efficient than the other related works, and thus is feasible for practical applications.
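
For orientation, here is a generic hash-based challenge-response sketch of smart-card style password authentication, where the password never travels in clear and a fresh nonce defeats replay. It is neither Tsai et al.'s scheme nor the enhanced scheme proposed in the paper; all names and the protocol split between card and server are assumptions.

```python
# Generic challenge-response sketch, not the schemes discussed in the paper.
import hashlib, hmac, os

def h(*parts):
    return hashlib.sha256(b"|".join(parts)).hexdigest()

# Registration: the server stores a salted verifier, the card stores the salt.
password = b"correct horse"
salt = os.urandom(16)
server_verifier = h(password, salt)

# Login: the server sends a fresh nonce; the card rebuilds the verifier from
# the user's password plus its stored salt and proves knowledge of it.
nonce = os.urandom(16)
card_proof = h(h(password, salt).encode(), nonce)
server_expected = h(server_verifier.encode(), nonce)
print("authenticated:", hmac.compare_digest(card_proof, server_expected))
```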

Journal ArticleDOI
TL;DR: The tenuous relationship between Indigenous rights, state responsibilities and business expectations is examined in relation to extractive industries and business activities.
Abstract: The United Nations Declaration on the Rights of Indigenous Peoples was hailed as a triumph among Indigenous peoples, signalling a long-awaited recognition of their fundamental human rights. Despite this, many violations of these basic rights continue, particularly in relation to extractive industries and business activities. In response, a business reference guide seeks to inform industries of their responsibilities. This article examines the tenuous relationship between Indigenous rights, state responsibilities and business expectations.

Journal ArticleDOI
TL;DR: This article explored nine regional Aboriginal women's experiences of culture and identity by a process of deeply listening to each woman: Ngara Dyin (Dharawal language), and seven interdependent overarching themes were developed: walking and talking black; it’s not easy growing up in a white society; we sit down and listen; connection to country; strong black women; the way forward; and, wanting that magic.
Abstract: The marginalised position and unequal health status of Aboriginal people in Australia are a direct consequence of the trauma and dispossession of colonisation. Aboriginal women experience even greater levels of distress and ill health than Aboriginal men, and are more disadvantaged than any other group of women in Australia. While strength of cultural identity leads to increased social and emotional wellbeing (SEWB) and reduced socioeconomic hardship, Aboriginal people in urban and regional areas suffer greater discrimination and resultant psychological stress than those in remote areas; they are additionally subjected to accusations of inauthenticity. Improving Aboriginal women’s SEWB is pivotal in advancing Aboriginal SEWB overall. This research has explored nine regional Aboriginal women’s experiences of culture and identity by a process of deeply listening to each woman: Ngara Dyin (Dharawal language). The aim was to discern means to strengthen cultural attachment and enhance positive cultural identity for this group of women, and consequently their community. Through the process of interpretive phenomenological analysis, seven interdependent overarching themes were developed: walking and talking black; it’s not easy growing up in a white society; we sit down and listen; connection to Country; strong black women; the way forward; and, wanting that magic. Decolonising approaches to increasing Aboriginal women’s SEWB dictate that understandings of culture and identity must be informed and guided by the very people whose experience is being sought, and these women clearly indicate the need for strengthened cultural connection through funded gatherings and connections with senior women from remote areas.

Journal ArticleDOI
TL;DR: Model-driven engineering provides formal models to be analyzed, and human-centric design is based on the psychological and physical needs of the human users.
Abstract: Human-centric design is based on the psychological and physical needs of the human users. Model-driven engineering provides formal models to be analyzed. How to combine formal model with human-centr...

Journal ArticleDOI
TL;DR: The need for a comprehensive review of all those factors that have an impact on the teaching and learning of Hawaiian, including, in particular, curriculum design and teacher training, is suggested.
Abstract: In the late 19th century, when the United States began its illegal occupation of the Hawaiian Kingdom, the teaching of languages was dominated by an approach— grammar translation—that has been associated with elitism and cultural dominance. Since then, there have been major developments in language teaching. Among these has been the development of “communicative language teaching” (CLT), an approach intended to encourage learners to use the target language for genuine communication in culturally appropriate contexts. However, analysis of a sample of Hawaiian language lessons taught in the second decade of the 21st century revealed little evidence of any of these. Instead, an approach reminiscent of aspects of grammar translation was very much in evidence, with teacher talk, often in English, occupying over half of the lesson in each case, and with considerable evidence of confusion, frustration and minimal participation on the part of many of the students. What this suggests is the need for a comprehensive review of all those factors that have an impact on the teaching and learning of Hawaiian, including, in particular, curriculum design and teacher training. It is no longer possible to accept that while language teachers talk, often in the language/s of colonisers, language death continues to stalk those indigenous languages that have so far failed to succumb.

Journal ArticleDOI
TL;DR: A number of the problematic concerns are discussed, and what would be involved in designing more effective textbooks for Indigenous languages, in line with current research findings, is outlined.
Abstract: As part of a recent study of the teaching and learning of te reo Māori (the Māori language) in English-medium secondary schools in Aotearoa/New Zealand, I asked a sample of teachers which textbooks they used. I then analysed some of those textbooks that were referred to most often, using focus points derived from a review of literature on the design of textbooks for the teaching of additional languages. What I found was that the textbooks analysed were inconsistent with the relevant curriculum guidelines document and were also problematic in a number of other ways. This article discusses a number of the problematic concerns and outlines what would be involved in designing more effective textbooks for Indigenous languages, and textbooks that are in line with current research findings.

Journal ArticleDOI
TL;DR: This paper proposes a new Substitution-Permutation network system with randomization expansion of 240 bits of input data, which uses 16 S-Boxes selected randomly based on the sub-key values throughout 64 rounds of substitution steps.
Abstract: This paper proposes a new system of Substitution-Permutation network along with Randomization Expansion of 240 bits of input data. The system uses 16 S-Boxes which are selected randomly based on the sub-key values throughout 64 rounds of substitution steps. 64 sub-keys are generated during the Substitution-Permutation process. The middletext is transposed based on the decimal value of the sub-key generated during each round. CBC mode is the best mode to associate with this system.
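
A toy round of a substitution-permutation network in which the sub-key value selects the S-box, echoing the idea described above. The block size, S-boxes and key schedule are drastically reduced placeholders; this is not the proposed 240-bit, 64-round cipher and is not secure.

```python
# Toy SP-network round with sub-key-selected S-boxes (illustration only).
import random

random.seed(1)
# Four 4-bit S-boxes (random permutations of 0..15) standing in for the 16 S-Boxes.
SBOXES = [random.sample(range(16), 16) for _ in range(4)]
PERM = [2, 0, 3, 1]                       # nibble permutation for a 16-bit block

def round_fn(block16, subkey):
    sbox = SBOXES[subkey % len(SBOXES)]               # sub-key value picks the S-box
    nibbles = [(block16 >> (4 * i)) & 0xF for i in range(4)]
    nibbles = [sbox[n ^ ((subkey >> (4 * i)) & 0xF)] for i, n in enumerate(nibbles)]
    permuted = [nibbles[PERM[i]] for i in range(4)]   # permutation step
    return sum(n << (4 * i) for i, n in enumerate(permuted))

state, subkeys = 0xBEEF, [0x3A7C, 0x91D2, 0x5E08, 0xC4B6]
for k in subkeys:                                     # a few rounds instead of 64
    state = round_fn(state, k)
print(hex(state))
```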