
Showing papers on "Interoperability" published in 2013


Patent
23 Sep 2013
TL;DR: In this article, mobile data processing systems (MSs) interact with systems in their vicinity, and with each other, for communications and interoperability, for example to present content to a user.
Abstract: Mobile data processing Systems (MSs) interact with systems in their vicinity, and with each other, for communications and interoperability. Information that is transmitted inbound to, transmitted outbound from, in process at, or modified by an application at a mobile data processing system triggers the processing of actions in accordance with user configurations, for example to present content to a user.

459 citations


Journal ArticleDOI
TL;DR: The design and implementation of OpenSlide is presented, a vendor-neutral C library for reading and manipulating digital slides of diverse vendor formats that is extensible and easily interfaced to various programming languages.
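For a concrete flavour of the library, here is a minimal sketch using its Python bindings (openslide-python); the slide file name is a hypothetical placeholder, and the bindings are assumed to be installed:

    # Minimal OpenSlide sketch via the openslide-python bindings.
    # "slide.svs" is a hypothetical file name, not from the paper.
    import openslide

    slide = openslide.OpenSlide("slide.svs")           # open a vendor-format slide
    print(slide.dimensions)                            # (width, height) at level 0
    print(slide.level_count)                           # number of resolution levels
    region = slide.read_region((0, 0), 0, (512, 512))  # RGBA PIL image of a region
    region.convert("RGB").save("tile.png")
    slide.close()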

384 citations


Proceedings ArticleDOI
20 Jun 2013
TL;DR: This research examines the potential for the new Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR, pronounced “fire”) standard to help achieve healthcare systems interoperability.
Abstract: This research examines the potential for the new Health Level 7 (HL7) Fast Healthcare Interoperability Resources (FHIR, pronounced “fire”) standard to help achieve healthcare systems interoperability. HL7 messaging standards are widely implemented by the healthcare industry and have been deployed internationally for decades. HL7 Version 2 (“v2”) health information exchange standards are a popular choice of local hospital communities for the exchange of healthcare information, including electronic medical record information. In development for 15 years, HL7 Version 3 (“v3”) was designed to be the successor to Version 2, addressing Version 2's shortcomings. HL7 v3 has been heavily criticized by the industry for being internally inconsistent even in its own documentation and too complex and expensive to implement in real-world systems, and it has been accused of contributing to many failed and stalled systems implementations. HL7 is now experimenting with a new approach to the development of standards with FHIR. This research provides a chronicle of the evolution of the HL7 messaging standards, an introduction to HL7 FHIR and a comparative analysis between HL7 FHIR and previous HL7 messaging standards.
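To make the contrast with HL7 v2/v3 messaging concrete: FHIR exposes clinical data as web resources over plain HTTP with JSON (or XML) representations. A hedged sketch follows; the server base URL and patient id are hypothetical placeholders:

    # Sketch of FHIR's REST style: each resource lives at its own URL.
    # Base URL and patient id are hypothetical, not from the paper.
    import requests

    base = "https://fhir.example.org/base"   # hypothetical FHIR server
    resp = requests.get(f"{base}/Patient/123",
                        headers={"Accept": "application/fhir+json"})
    resp.raise_for_status()
    patient = resp.json()
    print(patient["resourceType"])           # "Patient"
    print(patient.get("name"))               # list of name structures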

379 citations


Journal ArticleDOI
16 Apr 2013
TL;DR: The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control.
Abstract: The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today’s and tomorrow’s clinically motivated research.

359 citations


Journal ArticleDOI
TL;DR: The aim of this paper is to offer a comprehensive review of state-of-the-art research on SG communications, including standards interoperability, cognitive access to unlicensed radio spectra, and cyber security.
Abstract: The necessity to promote smart grid (SG) has been recognized with a strong consensus. The SG integrates electrical grids and communication infrastructures and forms an intelligent electricity network working with all connected components to deliver sustainable electricity supplies. Many advanced communication technologies have been identified for SG applications with a potential to significantly enhance the overall efficiency of power grids. In this paper, the challenges and applications of communication technologies in SG are discussed. In particular, we identify three major challenges to implement SG communication systems, including standards interoperability, cognitive access to unlicensed radio spectra, and cyber security. The issues to implement SG communications on an evolutional path and its future trends are also addressed. The aim of this paper is to offer a comprehensive review of state-of-the-art research on SG communications.

350 citations


Journal ArticleDOI
01 Nov 2013
TL;DR: The proposed security scheme is based on RSA, the most widely used public key cryptography algorithm, and is designed to work over standard communication stacks that offer UDP/IPv6 networking for Low power Wireless Personal Area Networks (6LoWPANs).
Abstract: In this paper, we introduce the first fully implemented two-way authentication security scheme for the Internet of Things (IoT) based on existing Internet standards, specifically the Datagram Transport Layer Security (DTLS) protocol. By relying on an established standard, existing implementations, engineering techniques and security infrastructure can be reused, which enables easy security uptake. Our proposed security scheme is therefore based on RSA, the most widely used public key cryptography algorithm. It is designed to work over standard communication stacks that offer UDP/IPv6 networking for Low power Wireless Personal Area Networks (6LoWPANs). Our implementation of DTLS is presented in the context of a system architecture and the scheme's feasibility (low overheads and high interoperability) is further demonstrated through extensive evaluation on a hardware platform suitable for the Internet of Things.
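The core of the scheme is a mutually authenticated handshake in which both peers present certificates. Python's standard library offers no DTLS, so the sketch below shows the equivalent two-way certificate authentication with the ssl module over TLS, the stream-oriented counterpart of DTLS; host, port, and file paths are hypothetical:

    # Two-way (mutual) certificate authentication, illustrated with TLS
    # because the Python stdlib has no DTLS. Paths and host are placeholders.
    import socket
    import ssl

    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
    ctx.load_verify_locations("ca.pem")              # trust anchor for the server
    ctx.load_cert_chain("client.pem", "client.key")  # our own credentials
    with socket.create_connection(("iot.example.org", 5684)) as sock:
        with ctx.wrap_socket(sock, server_hostname="iot.example.org") as tls:
            print(tls.version(), tls.getpeercert()["subject"])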

344 citations


Book
23 May 2013
TL;DR: This comprehensive overview describes the underlying principles, implementation details and key enhancing features of 802.11n and 802.11ac, with new material on 802.11ac throughput, revised chapters on MAC and interoperability, and new chapters on the 802.11ac PHY and multi-user MIMO.
Abstract: If you've been searching for a way to get up to speed on IEEE 802.11n and 802.11ac WLAN standards without having to wade through the entire specification, then look no further. This comprehensive overview describes the underlying principles, implementation details and key enhancing features of 802.11n and 802.11ac. For many of these features the authors outline the motivation and history behind their adoption into the standard. A detailed discussion of key throughput, robustness, and reliability enhancing features (such as MIMO, multi-user MIMO, 40/80/160 MHz channels, transmit beamforming and packet aggregation) is given, plus clear summaries of issues surrounding legacy interoperability and coexistence. Now updated and significantly revised, this 2nd edition contains new material on 802.11ac throughput, including revised chapters on MAC and interoperability, plus new chapters on 802.11ac PHY and multi-user MIMO. An ideal reference for designers of WLAN equipment, network managers, and researchers in the field of wireless communications.

267 citations


Book
28 Oct 2013
TL;DR: The Architectural Reference Model (ARM), presented in this book by the members of the IoT-A project team driving this harmonization effort, makes it possible to connect vertically closed systems, architectures and application areas so as to create open interoperable systems and integrated environments and platforms.
Abstract: The Internet of Things (IoT) is an emerging network superstructure that will connect physical resources and actual users. It will support an ecosystem of smart applications and services bringing hyper-connectivity to our society by using augmented and rich interfaces. Whereas in the beginning IoT referred to the advent of barcodes and Radio Frequency Identification (RFID), which helped to automate inventory, tracking and basic identification, today IoT is characterized by a dynamic trend toward connecting smart sensors, objects, devices, data and applications. The next step will be cognitive IoT, facilitating object and data re-use across application domains and leveraging hyper-connectivity, interoperability solutions and semantically enriched information distribution. The Architectural Reference Model (ARM), presented in this book by the members of the IoT-A project team driving this harmonization effort, makes it possible to connect vertically closed systems, architectures and application areas so as to create open interoperable systems and integrated environments and platforms. It constitutes a foundation from which software companies can capitalize on the benefits of developing consumer-oriented platforms including hardware, software and services. The material is structured in two parts. Part A introduces the general concepts developed for and applied in the ARM. It is aimed at end users who want to use IoT technologies, managers interested in understanding the opportunities generated by these novel technologies, and system architects who are interested in an overview of the underlying basic models. It also includes several case studies to illustrate how the ARM has been used in real-life scenarios. Part B then addresses the topic at a more detailed technical level and is targeted at readers with a more scientific or technical background. It provides in-depth guidance on the ARM, including a detailed description of a process for generating concrete architectures, as well as reference manuals with guidelines on how to use the various models and perspectives presented to create a concrete architecture. Furthermore, best practices and tips on how system engineers can use the ARM to develop specific IoT architectures for dedicated IoT solutions are illustrated and exemplified in reverse mapping exercises of existing standards and platforms.

255 citations


Book ChapterDOI
21 Oct 2013
TL;DR: It is found that only one-third of endpoints make descriptive meta-data available, making it difficult to locate or learn about their content and capabilities, and that support is patchy for established SPARQL features like ORDER BY as well as for new SPARQL 1.1 features.
Abstract: Hundreds of public SPARQL endpoints have been deployed on the Web, forming a novel decentralised infrastructure for querying billions of structured facts from a variety of sources on a plethora of topics. But is this infrastructure mature enough to support applications? For 427 public SPARQL endpoints registered on the DataHub, we conduct various experiments to test their maturity. Regarding discoverability, we find that only one-third of endpoints make descriptive meta-data available, making it difficult to locate or learn about their content and capabilities. Regarding interoperability, we find patchy support for established SPARQL features like ORDER BY as well as (understandably) for new SPARQL 1.1 features. Regarding efficiency, we show that the performance of endpoints for generic queries can vary by up to 3-4 orders of magnitude. Regarding availability, based on a 27-month monitoring experiment, we show that only 32.2% of public endpoints can be expected to have (monthly) "two-nines" uptimes of 99-100%.
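In the spirit of the paper's interoperability tests, one can probe an endpoint for a specific feature and treat an error as lack of support. A sketch with the SPARQLWrapper package (an assumption, not the authors' tooling); the endpoint URL is just an example:

    # Probe a public endpoint for ORDER BY support; an exception is taken
    # as "unsupported or unavailable". SPARQLWrapper is assumed installed.
    from SPARQLWrapper import SPARQLWrapper, JSON

    endpoint = SPARQLWrapper("https://dbpedia.org/sparql")
    endpoint.setReturnFormat(JSON)
    endpoint.setQuery("SELECT ?s WHERE { ?s ?p ?o } ORDER BY ?s LIMIT 5")
    try:
        rows = endpoint.query().convert()["results"]["bindings"]
        print("ORDER BY supported,", len(rows), "rows returned")
    except Exception as exc:
        print("feature unsupported or endpoint down:", exc)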

250 citations


Journal ArticleDOI
TL;DR: An EHR system - cloud health information systems technology architecture (CHISTAR) - is proposed that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes.
Abstract: We present a cloud-based approach for the design of interoperable electronic health record (EHR) systems. Cloud computing environments provide several benefits to all the stakeholders in the healthcare ecosystem (patients, providers, payers, etc.). Lack of data interoperability standards and solutions has been a major obstacle in the exchange of healthcare data between different stakeholders. We propose an EHR system - cloud health information systems technology architecture (CHISTAR) - that achieves semantic interoperability through the use of a generic design methodology which uses a reference model that defines a general purpose set of data structures and an archetype model that defines the clinical data attributes. CHISTAR application components are designed using the cloud component model approach that comprises loosely coupled components that communicate asynchronously. In this paper, we describe the high-level design of CHISTAR and the approaches for semantic interoperability, data integration, and security.
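The two-level idea, a generic reference model constrained by archetypes that carry the clinical semantics, can be sketched in a few lines; the class and attribute names below are illustrative toys, not CHISTAR's actual schema:

    # Toy sketch of two-level modelling: a generic reference model plus an
    # archetype naming required clinical attributes. Names are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class Entry:                          # reference model: generic structure
        name: str
        items: dict = field(default_factory=dict)

    BLOOD_PRESSURE_ARCHETYPE = {"systolic_mmHg", "diastolic_mmHg"}

    def conforms(entry: Entry, archetype: set) -> bool:
        """True if the entry carries every attribute the archetype demands."""
        return archetype <= entry.items.keys()

    bp = Entry("blood pressure", {"systolic_mmHg": 120, "diastolic_mmHg": 80})
    print(conforms(bp, BLOOD_PRESSURE_ARCHETYPE))   # True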

209 citations


Proceedings ArticleDOI
Nicolas Ferry, Alessandro Rossini, Franck Chauvel, Brice Morin, Arnor Solberg
28 Jun 2013
TL;DR: A classification of the state-of-the-art of cloud solutions is provided, and the need for model-driven engineering techniques and methods facilitating the specification of provisioning, deployment, monitoring, and adaptation concerns of multi-cloud systems at design-time and their enactment at run-time is argued.
Abstract: In the landscape of cloud computing, the competition between providers has led to an ever growing number of cloud solutions offered to consumers. The ability to run and manage multi-cloud systems (i.e., applications on multiple clouds) allows exploiting the peculiarities of each cloud solution and hence optimising the performance, availability, and cost of the applications. However, these cloud solutions are typically heterogeneous and the provided features are often incompatible. This diversity hinders the proper exploitation of the full potential of cloud computing, since it prevents interoperability, promotes vendor lock-in, and increases the complexity of development and administration of multi-cloud systems. This problem needs to be addressed promptly. In this paper, we provide a classification of the state-of-the-art of cloud solutions, and argue for the need for model-driven engineering techniques and methods facilitating the specification of provisioning, deployment, monitoring, and adaptation concerns of multi-cloud systems at design-time and their enactment at run-time.

Journal ArticleDOI
TL;DR: The objective of linking building data in the cloud is to create an integrated, well-connected graph of relevant information for managing a building, helping to solve data interoperability problems.

Journal ArticleDOI
TL;DR: The Group on Earth Observation Model Web initiative utilizes a Model as a Service approach to increase model access and sharing; a flexible architecture, capable of integrating different existing distributed computing infrastructures, is required to address the performance requirements.
Abstract: The Group on Earth Observation (GEO) Model Web initiative utilizes a Model as a Service approach to increase model access and sharing. It relies on gradual, organic growth leading towards dynamic webs of interacting models, analogous to the World Wide Web. The long term vision is for a consultative infrastructure that can help address "what if" and other questions that decision makers and other users have. Four basic principles underlie the Model Web: open access, minimal barriers to entry, service-driven, and scalability; any implementation approach meeting these principles will be a step towards the long term vision. Implementing a Model Web encounters a number of technical challenges, including information modelling, minimizing interoperability agreements, performance, and long term access, each of which has its own implications. For example, a clear information model is essential for accommodating the different resources published in the Model Web (model engines, model services, etc.), and a flexible architecture, capable of integrating different existing distributed computing infrastructures, is required to address the performance requirements. Architectural solutions, in keeping with the Model Web principles, exist for each of these technical challenges. There are also a variety of other key challenges, including difficulties in making models interoperable; calibration and validation; and social, cultural, and institutional constraints. Although the long term vision of a consultative infrastructure is clearly an ambitious goal, even small steps towards that vision provide immediate benefits. A variety of activities are now in progress that are beginning to take those steps.

Journal ArticleDOI
TL;DR: This article presents a novel concept, based on the Model Driven Architecture (MDA), implemented under the Interoperable Manufacturing Knowledge Systems (IMKS) project in order to understand the extent to which manufacturing system interoperability can be supported using radically new methods of knowledge sharing.

Book ChapterDOI
26 May 2013
TL;DR: A significant update is described that increases the overall quality of RDFized datasets generated from open scripts, powered by an API to generate registry-validated IRIs, dataset provenance and metrics, SPARQL endpoints, and downloadable RDF and database files.
Abstract: Bio2RDF currently provides the largest network of Linked Data for the Life Sciences. Here, we describe a significant update to increase the overall quality of RDFized datasets generated from open scripts powered by an API to generate registry-validated IRIs, dataset provenance and metrics, SPARQL endpoints, downloadable RDF and database files. We demonstrate federated SPARQL queries within and across the Bio2RDF network, including semantic integration using the Semanticscience Integrated Ontology (SIO). This work forms a strong foundation for increased coverage and continuous integration of data in the life sciences.
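Federated querying of the kind demonstrated here relies on the SPARQL 1.1 SERVICE keyword, which lets one endpoint delegate a graph pattern to another. A hedged sketch follows; the endpoint URLs and the query are illustrative rather than the paper's own examples, and SPARQLWrapper is assumed:

    # Federated SPARQL 1.1 query via SERVICE; endpoints are illustrative.
    from SPARQLWrapper import SPARQLWrapper, JSON

    q = SPARQLWrapper("https://bio2rdf.org/sparql")
    q.setReturnFormat(JSON)
    q.setQuery("""
        SELECT ?s ?label WHERE {
          ?s ?p ?o .
          SERVICE <https://drugbank.bio2rdf.org/sparql> {
            ?s <http://www.w3.org/2000/01/rdf-schema#label> ?label .
          }
        } LIMIT 5
    """)
    print(q.query().convert()["results"]["bindings"])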

Proceedings Article
09 Jul 2013
TL;DR: This work examines ways for extending the JDL model from 1998 to support exploitation functions and information management for situation awareness, massive data analytics for contextual awareness, and domain-specific needs for mission awareness.
Abstract: The original Joint Directors of Laboratories (JDL) model was developed in the early 1990s, with revisits in 1998 and 2004. Today, with new technologies of big data, cloud computing, and machine analytics, there is an ever increasing need for integration of people and machines. The original JDL model focused on the data fusion (correlation, filtering, and association) issues, while today there is an increasing emphasis on an integrated approach to information exploitation over sensors, users, and missions using enterprise architectures, interoperability standards, and intelligence to the edge. Given these recent changes to computation and distributed access, we examine ways for extending the JDL model from 1998 to support exploitation functions and information management for situation awareness, massive data analytics for contextual awareness, and domain-specific needs for mission awareness.

Book
18 Jun 2013
TL;DR: Examining numerous attacks in detail, the authors look at the tools that intruders use and show how to use this knowledge to protect networks.
Abstract: With the rapid rise in the ubiquity and sophistication of Internet technology and the accompanying growth in the number of network attacks, network intrusion detection has become increasingly important. Anomaly-based network intrusion detection refers to finding exceptional or nonconforming patterns in network traffic data compared to normal behavior. Finding these anomalies has extensive applications in areas such as cyber security, credit card and insurance fraud detection, and military surveillance for enemy activities. Network Anomaly Detection: A Machine Learning Perspective presents machine learning techniques in depth to help you more effectively detect and counter network intrusion. In this book, you'll learn about:
- Network anomalies and vulnerabilities at various layers
- The pros and cons of various machine learning techniques and algorithms
- A taxonomy of attacks based on their characteristics and behavior
- Feature selection algorithms
- How to assess the accuracy, performance, completeness, timeliness, stability, interoperability, reliability, and other dynamic aspects of a network anomaly detection system
- Practical tools for launching attacks, capturing packet or flow traffic, extracting features, detecting attacks, and evaluating detection performance
- Important unresolved issues and research challenges that need to be overcome to provide better protection for networks
Examining numerous attacks in detail, the authors look at the tools that intruders use and show how to use this knowledge to protect networks. The book also provides material for hands-on development, so that you can code on a testbed to implement detection methods toward the development of your own intrusion detection system. It offers a thorough introduction to the state of the art in network anomaly detection using machine learning approaches and systems.
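As a minimal taste of the machine-learning approach the book teaches, the sketch below runs an off-the-shelf anomaly detector over synthetic flow features; the feature choice and data are invented for illustration, and scikit-learn is assumed:

    # Unsupervised anomaly detection on toy flow features (bytes, packets,
    # duration) with scikit-learn's IsolationForest. Data is synthetic.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    rng = np.random.default_rng(0)
    normal = rng.normal([500, 10, 1.0], [50, 2, 0.2], size=(200, 3))
    attack = rng.normal([50000, 400, 0.1], [5000, 50, 0.05], size=(5, 3))
    X = np.vstack([normal, attack])

    clf = IsolationForest(contamination=0.05, random_state=0).fit(X)
    labels = clf.predict(X)            # +1 = normal, -1 = anomaly
    print((labels == -1).sum(), "flows flagged as anomalous")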

Journal ArticleDOI
TL;DR: The results presented here provide the synthesis of the current state of play regarding the work developed by the Enterprise Interoperability (EI) at the European Commission's Future Internet Enterprise Systems (FInES) cluster.
Abstract: The recently posed challenge of developing an Enterprise Interoperability Science Foundation (EISF) prompted some academic agents to attempt a systematisation of the Interoperability Body of Knowledge (IBoK). Still in their embryonic stages, these efforts have sought to organise and aggregate information from very fragmented and disparate sources, and with different granularities of detail, distinct epistemology origins, separate academic fields, etc. This paper aims to distinguish between levels of specificity of the Interoperability academic work, which are often confused, by considering Models, Theories, and Frameworks. The paper revises these concepts within the context of the EISF's recent work. The results presented here, reflecting consultation with the expert community, provide the synthesis of the current state of play regarding the work developed by the Enterprise Interoperability (EI) at the European Commission's Future Internet Enterprise Systems (FInES) cluster.

Journal ArticleDOI
TL;DR: This paper provides a structured assessment and classification of existing challenges and approaches, serving as potential guideline for researchers and practitioners in the field of TEL.
Abstract: Purpose - Research in the area of technology-enhanced learning (TEL) throughout the last decade has largely focused on sharing and reusing educational resources and data. This effort has led to a fragmented landscape of competing metadata schemas, or interface mechanisms. More recently, semantic technologies were taken into account to improve interoperability. The linked data approach has emerged as the de facto standard for sharing data on the web. To this end, it is obvious that the application of linked data principles offers a large potential to solve interoperability issues in the field of TEL. This paper aims to address this issue. Design/methodology/approach - In this paper, approaches are surveyed that are aimed towards a vision of linked education, i.e. education which exploits educational web data. It particularly considers the exploitation of the wealth of already existing TEL data on the web by allowing its exposure as linked data and by taking into account automated enrichment and interlinking techniques to provide rich and well-interlinked data for the educational domain. Findings - So far web-scale integration of educational resources is not facilitated, mainly due to the lack of take-up of shared principles, datasets and schemas. However, linked data principles increasingly are recognized by the TEL community. The paper provides a structured assessment and classification of existing challenges and approaches, serving as potential guideline for researchers and practitioners in the field. Originality/value - Being one of the first comprehensive surveys on the topic of linked data for education, the paper has the potential to become a widely recognized reference publication in the area.

Journal ArticleDOI
TL;DR: A comprehensive approach is introduced, including a set of tools as well as methodological guidelines, to deal with the interoperability of CDSSs and EHRs based on archetypes, which are used to build a conceptual layer, in the manner of a virtual health record, over the EHR whose contents need to be integrated and used in the CDSS.

Journal ArticleDOI
29 Apr 2013
TL;DR: A comprehensive survey is conducted on the state-of-the-art efforts to facilitate cloud interoperability, with a focus on interoperability among different IaaS (infrastructure as a service) cloud platforms.
Abstract: Cloud computing is a new computing paradigm that allows users with different computing demands to access a shared pool of configurable computing resources (e.g., servers, network, storage, database, applications and services). Many commercial cloud providers have emerged in the past 6-7 years, and each typically provides its own cloud infrastructure, APIs and application description formats to access the cloud resources, as well as support for service level agreements (SLAs). Such vendor lock-in has seriously limited the flexibility that cloud end users would like to have when it comes to deploying applications over different infrastructures in different geographic locations, or migrating a service from one provider's cloud to another. To enable seamless sharing of resources from a pool of cloud providers, efforts have recently emerged, from both industry and academia, to facilitate cloud interoperability, i.e., the ability for multiple cloud providers to work together. In this article, we conduct a comprehensive survey on the state-of-the-art efforts, with a focus on interoperability among different IaaS (infrastructure as a service) cloud platforms. We investigate the existing studies on taxonomies and standardization of cloud interoperability, as well as practical cloud technologies from both the cloud provider's and user's perspectives to enable interoperation. We pose issues and challenges to advance the topic area, and hope to pave a way for the forthcoming research.
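One practical interoperation tactic in this space is a provider-neutral client API that hides each vendor's native interface. As an illustration (not necessarily one of the surveyed systems), Apache Libcloud fronts many IaaS providers with a single driver interface; the credentials below are placeholders:

    # Provider-neutral IaaS access with Apache Libcloud: the same driver
    # interface fronts different clouds. Credentials are placeholders.
    from libcloud.compute.types import Provider
    from libcloud.compute.providers import get_driver

    for provider in (Provider.EC2, Provider.OPENSTACK):
        cls = get_driver(provider)
        print(provider, "->", cls.__name__)   # same API, different clouds

    # e.g. driver = get_driver(Provider.EC2)("ACCESS_KEY", "SECRET_KEY")
    #      nodes  = driver.list_nodes()       # identical call on any provider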

Journal ArticleDOI
TL;DR: A future where regardless of geographical location, scientists will be able to use their Web browsers to seamlessly access data, software, and processing resources that are managed by diverse systems in separate administration domains via Virtual Research Environments is envisaged.
Abstract: Virtual Research Environments are innovative, web-based, community-oriented, comprehensive, flexible, and secure working environments conceived to serve the needs of modern science. We overview the existing initiatives developing these environments by highlighting the major distinguishing features. We envisage a future where regardless of geographical location, scientists will be able to use their Web browsers to seamlessly access data, software, and processing resources that are managed by diverse systems in separate administration domains via Virtual Research Environments. We identify and discuss the major challenges that should be resolved to fully achieve the proposed vision, i.e., large-scale integration and interoperability, sustainability, and adoption.

Proceedings ArticleDOI
07 Jan 2013
TL;DR: The role of standards in cloud-computing interoperability is explored: standards-related efforts are covered, several cloud interoperability use cases are discussed, and some recommendations are provided for moving forward with cloud-computing adoption regardless of the maturity of standards for the cloud.
Abstract: In cloud computing, interoperability typically refers to the ability to easily move workloads and data from one cloud provider to another or between private and public clouds. A common tactic for enabling interoperability is the use of open standards, so there is currently a large amount of active work in standards development for the Cloud. This paper explores the role of standards in cloud-computing interoperability. It covers standards-related efforts, discusses several cloud interoperability use cases, and provides some recommendations for moving forward with cloud-computing adoption regardless of the maturity of standards for the cloud.

Journal Article
TL;DR: The paper suggests that the adoption of a standardized healthcare terminology, an education strategy, the design of usable interfaces for ICT tools, attention to privacy and security issues, and the connection of legacy systems to the health network are ways of achieving complete interoperability of electronic Health Information Systems in healthcare.
Abstract: Information and Communication Technologies (ICTs) play significant roles in the improvement of patient care and the reduction of healthcare cost by facilitating the seamless exchange of vital information among healthcare providers. Thus, clinicians can have easy access to patients' information in a timely manner, medical errors are reduced, and health related records are easily integrated. However, as beneficial as data interoperability is to healthcare, at present, it is largely an unreached goal. This is chiefly because the electronic Health Information Systems used within healthcare organizations have been developed independently with diverse and heterogeneous ICT tools, methods, processes and procedures, which results in a large number of heterogeneous and distributed proprietary models for representing and recording patients' information. Consequently, the seamless, effective and meaningful exchange of patients' information is yet to be achieved across healthcare systems. This paper therefore appraises the concepts of interoperability in the context of healthcare, its benefits and its attendant challenges. The paper suggests that the adoption of a standardized healthcare terminology, an education strategy, the design of usable interfaces for ICT tools, attention to privacy and security issues, and the connection of legacy systems to the health network are ways of achieving complete interoperability of electronic Health Information Systems in healthcare.

Journal ArticleDOI
TL;DR: This paper provides a detailed overview of the security challenges related to the deployment of smart objects, including security protocols at network, transport, and application layers, together with lightweight cryptographic algorithms proposed to be used instead of conventional and demanding ones, in terms of computational resources.
Abstract: The Internet of Things (IoT) refers to the Internet-like structure of billions of interconnected constrained devices, denoted as “smart objects”. Smart objects have limited capabilities, in terms of computational power and memory, and might be battery-powered devices, thus raising the need to adopt particularly energy efficient technologies. Among the most notable challenges that building interconnected smart objects brings about, there are standardization and interoperability. The use of IP has been foreseen as the standard for interoperability for smart objects. As billions of smart objects are expected to come to life and IPv4 addresses have eventually reached depletion, IPv6 has been identified as a candidate for smart-object communication. The deployment of the IoT raises many security issues coming from (i) the very nature of smart objects, e.g., the adoption of lightweight cryptographic algorithms, in terms of processing and memory requirements; and (ii) the use of standard protocols, e.g., the need to minimize the amount of data exchanged between nodes. This paper provides a detailed overview of the security challenges related to the deployment of smart objects. Security protocols at network, transport, and application layers are discussed, together with lightweight cryptographic algorithms proposed to be used instead of conventional and demanding ones, in terms of computational resources. Security aspects, such as key distribution and security bootstrapping, and application scenarios, such as secure data aggregation and service authorization, are also discussed.
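For a flavour of the lightweight, authenticated symmetric cryptography the paper discusses, the sketch below uses AES-CCM (the mode employed, for example, at the IEEE 802.15.4 link layer) via the third-party cryptography package; key, nonce, and payload are throwaway illustrations:

    # Authenticated encryption with AES-CCM via the "cryptography" package.
    # Key, nonce, and payload are throwaway values for illustration.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESCCM

    key = AESCCM.generate_key(bit_length=128)
    aead = AESCCM(key)
    nonce = os.urandom(13)                             # 13-byte nonce, as in 802.15.4
    ct = aead.encrypt(nonce, b"21.5 C", b"sensor-42")  # payload + authenticated header
    print(aead.decrypt(nonce, ct, b"sensor-42"))       # b'21.5 C'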

Journal ArticleDOI
TL;DR: In this article, the authors present potential challenges and opportunities for using thermal simulation tools to optimize building performance, and outline major criteria for the evaluation of building thermal simulation tools based on specifications and capabilities in interoperability.
Abstract: This paper describes potential challenges and opportunities for using thermal simulation tools to optimize building performance. After reviewing current trends in thermal simulation, it outlines major criteria for the evaluation of building thermal simulation tools based on specifications and capabilities in interoperability. Details are discussed, including the workflow of data exchange in multiple thermal analyses such as BIM-based applications. The present analysis focuses on selected thermal simulation tools that provide functionalities to exchange data with other tools, in order to obtain a picture of their basic working principles and to identify selection criteria for generic thermal tools in BIM. The significance of, and barriers to, integrated design with BIM and building thermal simulation tools are also discussed.

Journal ArticleDOI
TL;DR: There are currently many mHealth initiatives in Brazil, but some areas remain largely unexplored, such as solutions for treatment compliance and awareness raising, as well as decision support systems.

Journal ArticleDOI
TL;DR: The recent Brokering approach is introduced; this solution aims at interconnecting the heterogeneous disciplinary and domain service buses, avoiding the imposition of any federated or common specification.
Abstract: For disciplinary and domain applications, systems interoperability largely deals with the adoption of agreed technologies, standards, specifications and interfaces with a disciplinary/domain service bus or means of information exchange, if available. However, multi-disciplinary efforts make more complex demands on the type of systems and arrangements needed to support cross-domain activities. Thus, interoperability among diverse disciplinary and domain systems must be pursued adopting more flexible and sustainable approaches. This paper discusses the challenges for multi-disciplinary interoperability. The recent Brokering approach is introduced; this solution aims at interconnecting the heterogeneous disciplinary and domain service buses, avoiding the imposition of any federated or common specification. It can deliver a range of services such as discovery and access through a Broker Framework. The Brokering approach has been successfully introduced by the EuroGEOSS research project and recently adopted by the GEOSS Common Infrastructure (GCI). The US NSF EarthCube initiative has also recognized the importance of brokering for its reference architecture. The GI-* technology, which powers the EuroGEOSS and GCI brokering frameworks, is presented and discussed.

Book ChapterDOI
28 Aug 2013
TL;DR: This paper considers the crucial design challenges that smart spaces face when deployed in the IoT, namely interoperability, information processing, and security and privacy, and considers solutions to cope with these challenges.
Abstract: The smart spaces paradigm and the M3 concept have already showed their potential for constructing advanced service infrastructures. The Internet of Things (IoT) provides the possibility to make any “thing” a user or component of such a service infrastructure. In this paper, we consider the crucial design challenges that smart spaces face when deployed in the IoT: (1) interoperability, (2) information processing, (3) security and privacy. The paper makes a step toward a systematized view of smart spaces as a computing paradigm for IoT applications. We summarize the groundwork from pilot M3 implementations and discuss solutions to cope with the challenges. The considered solutions can already be used in advanced service infrastructures.

Journal ArticleDOI
TL;DR: The new framework offers ecosystem modelers unprecedented capabilities to include spatial–temporal time series in food web analysis with a minimal set of required steps; it is a promising step toward integrating species distribution models and food web dynamics, and toward future implementations of interdisciplinary model interoperability.