
Showing papers on "Standardization" published in 2019


Journal ArticleDOI
TL;DR: Standards and consensus recommendations are presented for manufacturers, clinicians, operators, and researchers with the aims of increasing the accuracy, precision, and quality of spirometric measurements and improving the patient experience.
Abstract: Background: Spirometry is the most common pulmonary function test. It is widely used in the assessment of lung function to provide objective information used in the diagnosis of lung diseases and monitoring lung health. In 2005, the American Thoracic Society and the European Respiratory Society jointly adopted technical standards for conducting spirometry. Improvements in instrumentation and computational capabilities, together with new research studies and enhanced quality assurance approaches, have led to the need to update the 2005 technical standards for spirometry to take full advantage of current technical capabilities. Methods: This spirometry technical standards document was developed by an international joint task force, appointed by the American Thoracic Society and the European Respiratory Society, with expertise in conducting and analyzing pulmonary function tests, laboratory quality assurance, and developing international standards. A comprehensive review of published evidence was performed. A patient survey was developed to capture patients' experiences. Results: Revisions to the 2005 technical standards for spirometry were made, including the addition of factors that were not previously considered. Evidence to support the revisions was cited when applicable. The experience and expertise of task force members were used to develop recommended best practices. Conclusions: Standards and consensus recommendations are presented for manufacturers, clinicians, operators, and researchers with the aims of increasing the accuracy, precision, and quality of spirometric measurements and improving the patient experience. A comprehensive guide to aid in the implementation of these standards was developed as an online supplement.

1,481 citations


Journal ArticleDOI
TL;DR: The main developments and technical aspects of this ongoing standardization effort for compactly representing 3D point clouds, the 3D equivalent of the well-known 2D pixels, are introduced.
Abstract: Due to the increased popularity of augmented and virtual reality experiences, the interest in capturing the real world in multiple dimensions and in presenting it to users in an immersible fashion has never been higher. Distributing such representations enables users to freely navigate in multi-sensory 3D media experiences. Unfortunately, such representations require a large amount of data, which is not feasible for transmission on today’s networks. Efficient compression technologies well adopted in the content chain are in high demand and are key components to democratize augmented and virtual reality applications. The Moving Picture Experts Group, as one of the main standardization groups dealing with multimedia, identified this trend and recently started the process of building an open standard for compactly representing 3D point clouds, which are the 3D equivalent of the very well-known 2D pixels. This paper introduces the main developments and technical aspects of this ongoing standardization effort.

470 citations


Journal ArticleDOI
01 Jun 2019-Cities
TL;DR: In this article, the authors compare seven indicator standards for smart sustainable cities and provide guidance for city managers and policy makers to select the indicators and standard that best correspond to their assessment needs and goals, and align with their stage of smart sustainable city implementation.

250 citations


Book ChapterDOI
01 Jan 2019
TL;DR: The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems (A/IS) is a program of the IEEE initiated to address ethical issues raised by the development and dissemination of these systems.
Abstract: The IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems (A/IS) is a program of the IEEE initiated to address ethical issues raised by the development and dissemination of these systems. It identified over one hundred and twenty key issues and provided candidate recommendations to address them. In addition, it has provided the inspiration for fourteen approved standardization projects that are currently under development with the IEEE Standards Association.

154 citations


Journal ArticleDOI
08 Jan 2019
TL;DR: This paper discusses aspects of the framework such as its TI architecture, including the elements, functions, interfaces, and other considerations therein, as well as the novel aspects and differentiating factors compared with, e.g., 5G Ultra-Reliable Low-Latency Communication.
Abstract: The IEEE “Tactile Internet” (TI) Standards working group (WG), designated IEEE 1918.1, undertakes pioneering work on the development of standards for the TI. This paper describes the WG, its intentions, and its developing baseline standard and the associated reasoning behind it, and touches on a further standard already initiated under its scope: IEEE 1918.1.1 on “Haptic Codecs for the TI.” IEEE 1918.1 and its baseline standard aim to set the framework and act as the foundations for the TI, thereby also serving as a basis for further standards on the TI developed within the WG. This paper discusses aspects of the framework such as its TI architecture, including the elements, functions, interfaces, and other considerations therein, as well as the novel aspects and differentiating factors compared with, e.g., 5G Ultra-Reliable Low-Latency Communication, where it is noted that the TI will likely operate as an overlay on other networks or combinations of networks. Key foundations of the WG and its baseline standard are also highlighted, including the intended use cases and associated requirements that the standard must serve, and the TI’s fundamental definition and assumptions as understood by the WG, among other aspects.

113 citations


Journal ArticleDOI
TL;DR: It was discovered that the Reference Architecture Model Industry 4.0 was mature with respect to communication and information sharing in the scope of the connected world, that further standardization enabling interoperability of different vendors’ technology is still under development, and that technology standardization enabling executable business processes between networked enterprises was lacking.

110 citations


Journal ArticleDOI
TL;DR: It is found that securing technical capability is only a part of the RPA implementation process, and that organizations benefit from automating only certain processes: those that are structured, repeated, rules-based, and with digital inputs.

96 citations


Journal ArticleDOI
TL;DR: Key aspects of standardization are discussed, including the equipment used, the number of chest zones assessed, the method of quantifying B‐lines, the presence and timing of additional investigations, and the impact of therapy, and a checklist is included to provide standardization in the preparation, review and analysis of manuscripts.
Abstract: Lung ultrasound is a useful tool for the assessment of patients with both acute and chronic heart failure, but the use of different image acquisition methods, inconsistent reporting of the technique employed and variable quantification of ‘B‐lines,’ have all made it difficult to compare published reports. We therefore need to ensure that future studies utilizing lung ultrasound in the assessment of heart failure adopt a standardized approach to reporting the quantification of pulmonary congestion. Strategies to improve patient care by use of lung ultrasound in the assessment of heart failure have been difficult to develop. In the present document, key aspects of standardization are discussed, including equipment used, number of chest zones assessed, the method of quantifying B‐lines, the presence and timing of additional investigations (e.g. natriuretic peptides and echocardiography) and the impact of therapy. This consensus report includes a checklist to provide standardization in the preparation, review and analysis of manuscripts. This will serve as a guide for investigators and clinicians and enhance the quality and transparency of lung ultrasound research.

85 citations


Journal ArticleDOI
TL;DR: A standardized vocabulary is proposed that can be used for storing and sharing ecological trait data and may ease data integration and use of trait data for a broader ecological research community and enable global syntheses across a wide range of taxa and ecosystems.
Abstract: Trait-based approaches are widespread throughout ecological research as they offer great potential to achieve a general understanding of a wide range of ecological and evolutionary mechanisms. Accordingly, a wealth of trait data is available for many organism groups, but this data is underexploited due to a lack of standardization and heterogeneity in data formats and definitions. We review current initiatives and structures developed for standardizing trait data and discuss the importance of standardization for trait data hosted in distributed open-access repositories. In order to facilitate the standardization and harmonization of distributed trait datasets by data providers and data users, we propose a standardized vocabulary that can be used for storing and sharing ecological trait data. We discuss potential incentives and challenges for the wide adoption of such a standard by data providers. The use of a standard vocabulary allows for trait datasets from heterogeneous sources to be aggregated more easily into compilations and facilitates the creation of interfaces between software tools for trait-data handling and analysis. By aiding decentralized trait-data standardization, our vocabulary may ease data integration and use of trait data for a broader ecological research community and enable global syntheses across a wide range of taxa and ecosystems.
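
The harmonization step described above can be pictured with a small sketch: two trait tables with different layouts are mapped onto one shared set of column terms. This is illustrative only; the term names used (scientificName, traitName, traitValue, traitUnit) are hypothetical stand-ins for the kind of terms such a vocabulary defines, not necessarily the exact terms proposed by the authors.

    # Illustrative sketch of the harmonization a shared trait vocabulary enables:
    # two differently structured source tables mapped onto one set of column
    # terms. The term names here are hypothetical, not necessarily the paper's.
    import pandas as pd

    source_a = pd.DataFrame({"species": ["Carabus auratus"], "bodylength_mm": [24.0]})
    source_b = pd.DataFrame({"taxon": ["Formica rufa"], "trait": ["body length"],
                             "value_cm": [0.9]})

    standardized = pd.concat([
        pd.DataFrame({"scientificName": source_a["species"],
                      "traitName": "body length",
                      "traitValue": source_a["bodylength_mm"],
                      "traitUnit": "mm"}),
        pd.DataFrame({"scientificName": source_b["taxon"],
                      "traitName": source_b["trait"],
                      "traitValue": source_b["value_cm"] * 10,   # convert cm to mm
                      "traitUnit": "mm"}),
    ], ignore_index=True)
    print(standardized)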

72 citations


Journal ArticleDOI
TL;DR: The status of standardization efforts in immersive media with a focus on video signals is addressed, the development timelines are indicated, the main technical details are summarized, and pointers to further points of reference are provided.
Abstract: With the increasing availability of capture and display devices dedicated to immersive media, the coding and transmission of these media have recently become a high-priority subject of standardization. Different levels of immersiveness are defined with respect to an increasing degree of freedom in terms of movements of the observer within the immersive media scene. The level ranges from three degrees of freedom, allowing the user to look around in all directions from a fixed point of view, to six degrees of freedom, where the user can freely alter the viewpoint within the immersive media scene. The Moving Picture Experts Group (MPEG) of ISO/IEC is developing a standards suite on “Coded Representation of Immersive Media,” called MPEG-I, to provide technical solutions for building blocks of the media transmission chain, ranging from architecture, systems tools, and coding of video and audio signals, to point clouds and timed text. In this paper, an overview of recent and ongoing standardization efforts in this area is presented. While some specifications, such as high efficiency video coding or version 1 of the omnidirectional media format, are already available, other activities are under development or in the exploration phase. This paper addresses the status of these efforts with a focus on video signals, indicates the development timelines, summarizes the main technical details, and provides pointers to further points of reference.

71 citations


Journal ArticleDOI
TL;DR: The few existing ontologies for I4.0 are presented, along with the current state of the standardization effort in the factory 4.0 domain and examples of real-world scenarios for I4.0.
Abstract: The current fourth industrial revolution, or ‘Industry 4.0’ (I4.0), is driven by digital data, connectivity, and cyber systems, and it has the potential to create impressive new business opportunities. With the arrival of I4.0, the scenario of various intelligent systems interacting reliably and securely with each other becomes a reality that technical systems need to address. One major aspect of I4.0 is to adopt a coherent approach for semantic communication between multiple intelligent systems, which include human and artificial (software or hardware) agents. For this purpose, ontologies can provide the solution by formalizing smart manufacturing knowledge in an interoperable way. Hence, this paper presents the few existing ontologies for I4.0, along with the current state of the standardization effort in the factory 4.0 domain and examples of real-world scenarios for I4.0.

Journal ArticleDOI
TL;DR: Guidance and recommendations to harmonize and support regionalization in LCIA are given for developers of LCIA methods, LCI databases, and LCA software; aggregated CFs should be provided and calculated as the weighted averages of constituent CFs, using annual flow quantities as weights whenever available.
Abstract: Regionalized life cycle impact assessment (LCIA) has rapidly developed in the past decade, though its widespread application, robustness, and validity still face multiple challenges. Under the umbrella of the UNEP/SETAC Life Cycle Initiative, a dedicated cross-cutting working group on regionalized LCIA aims to provide an overview of the status of regionalization in LCIA methods. We give guidance and recommendations to harmonize and support regionalization in LCIA for developers of LCIA methods, LCI databases, and LCA software. A survey of current practice among regionalized LCIA method developers was conducted. The survey included questions on the chosen method’s spatial resolution and scale, the spatial resolution of input parameters, the choice of native spatial resolution and its limitations, operationalization and alignment with life cycle inventory data, methods for spatial aggregation, the assessment of uncertainty from input parameters and model structure, and the variability due to spatial aggregation. Recommendations are formulated based on the survey results and extensive discussion by the authors. Survey results indicate that the majority of regionalized LCIA models have global coverage. Native spatial resolutions are generally chosen based on the availability of global input data. Annual modeled or measured elementary flow quantities are mostly used for aggregating characterization factors (CFs) to larger spatial scales, although some use proxies, such as population counts. Aggregated CFs are mostly available at the country level. Although uncertainties due to input parameters, model structure, and spatial aggregation are available for some LCIA methods, they are rarely implemented in LCA studies. So far, there is no agreement on whether a finer native spatial resolution is the best way to reduce overall uncertainty. When spatially differentiated model CFs are not easily available, archetype models are sometimes developed. Regionalized LCIA methods should be provided as a transparent and consistent set of data and metadata using standardized data formats. Regionalized CFs should include both uncertainty and variability. In addition to the native-scale CFs, aggregated CFs should always be provided and should be calculated as the weighted averages of constituent CFs using annual flow quantities as weights whenever available. This paper is an important step forward for increasing transparency, consistency, and robustness in the development and application of regionalized LCIA methods.
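
To make the aggregation recommendation above concrete, the following minimal Python sketch computes a coarser-scale CF as the flow-weighted average of constituent CFs, falling back to an unweighted mean when no flow data exist. The region names, CF values, and flow quantities are illustrative assumptions, not data from the paper.

    # Minimal sketch (illustrative, not from the paper): aggregate regionalized
    # characterization factors (CFs) to a coarser spatial scale as a flow-weighted
    # average, following the recommendation quoted above.

    def aggregate_cf(cfs, annual_flows):
        """Weighted average of constituent CFs using annual flow quantities as weights."""
        total_flow = sum(annual_flows.values())
        if total_flow == 0:
            # Fall back to an unweighted mean if no flow data are available.
            return sum(cfs.values()) / len(cfs)
        return sum(cfs[region] * annual_flows[region] for region in cfs) / total_flow

    # Hypothetical watershed-level CFs aggregated to a country-level CF.
    cfs = {"basin_A": 0.42, "basin_B": 1.10, "basin_C": 0.05}                 # impact per kg emitted
    annual_flows = {"basin_A": 1200.0, "basin_B": 300.0, "basin_C": 4500.0}   # kg emitted per year
    print(aggregate_cf(cfs, annual_flows))                                    # ~0.18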

Journal ArticleDOI
TL;DR: The tensions that emerge between the universal and the local in a global world require continuous negotiation in medical education, but standardization and contextual diversity tend to operate as separate philosophies, with little attention to the interplay between them.
Abstract: CONTEXT The tensions that emerge between the universal and the local in a global world require continuous negotiation. However, in medical education, standardization and contextual diversity tend to operate as separate philosophies, with little attention to the interplay between them. METHODS The authors synthesise the literature related to the intersections and resulting tensions between standardization and contextual diversity in medical education. In doing so, the authors analyze the interplay between these competing concepts in two domains of medical education (admissions and competency-based medical education), and provide concrete examples drawn from the literature. RESULTS Standardization offers many rewards: its common articulations and assumptions promote patient safety, foster continuous quality improvement, and enable the spread of best practices. Standardization may also contribute to greater fairness, equity, reliability and validity in high stakes processes, and can provide stakeholders, including the public, with tangible reassurance and a sense of the stable and timeless. At the same time, contextual variation in medical education can afford myriad learning opportunities, and it can improve alignment between training and local workforce needs. The inevitable diversity of contexts for learning and practice renders any absolute standardization of programs, experiences, or outcomes an impossibility. CONCLUSIONS The authors propose a number of ways to examine the interplay of contextual diversity and standardization and suggest three ways to move beyond an either/or stance. In reconciling the laudable goals of standardization and the realities of the innumerable contexts in which we train and deliver care, we are better positioned to design and deliver a medical education system that is globally responsible and locally engaged.

Journal ArticleDOI
03 Oct 2019-Sensors
TL;DR: An Internet of Medical Things (IoMT) platform for pervasive healthcare that ensures interoperability, quality of the detection process, and scalability in an M2M-based architecture, and provides functionalities for the processing of high volumes of data, knowledge extraction, and common healthcare services is proposed.
Abstract: Pervasive healthcare services have undergone a great evolution in recent years. The technological development of communication networks, including the Internet, sensor networks, and M2M (Machine-to-Machine) have given rise to new architectures, applications, and standards related to addressing almost all current e-health challenges. Among the standards, the importance of OpenEHR has been recognized, since it enables the separation of medical semantics from data representation of electronic health records. However, it does not meet the requirements related to interoperability of e-health devices in M2M networks, or in the Internet of Things (IoT) scenarios. Moreover, the lack of interoperability hampers the application of new data-processing techniques, such as data mining and online analytical processing, due to the heterogeneity of the data and the sources. This article proposes an Internet of Medical Things (IoMT) platform for pervasive healthcare that ensures interoperability, quality of the detection process, and scalability in an M2M-based architecture, and provides functionalities for the processing of high volumes of data, knowledge extraction, and common healthcare services. The platform uses the semantics described in OpenEHR for both data quality evaluation and standardization of healthcare data stored by the association of IoMT devices and observations defined in OpenEHR. Moreover, it enables the application of big data techniques and online analytic processing (OLAP) through Hadoop Map/Reduce and content-sharing through fast healthcare interoperability resource (FHIR) application programming interfaces (APIs).
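
As one illustration of the content-sharing layer described above, the following sketch posts a vital-sign reading to a FHIR server over the standard REST interface. It is a minimal sketch under stated assumptions, not the paper's implementation: the server URL and patient reference are placeholders.

    # Minimal sketch (placeholder endpoint and identifiers, not the paper's code):
    # sharing a vital-sign observation through a FHIR REST API, the kind of
    # content-sharing interface the platform exposes.
    import json
    import urllib.request

    FHIR_BASE = "https://example.org/fhir"  # placeholder FHIR server

    observation = {
        "resourceType": "Observation",
        "status": "final",
        "code": {"coding": [{"system": "http://loinc.org", "code": "8867-4",
                             "display": "Heart rate"}]},
        "subject": {"reference": "Patient/123"},  # placeholder patient
        "valueQuantity": {"value": 72, "unit": "beats/minute",
                          "system": "http://unitsofmeasure.org", "code": "/min"},
    }

    request = urllib.request.Request(
        f"{FHIR_BASE}/Observation",
        data=json.dumps(observation).encode("utf-8"),
        headers={"Content-Type": "application/fhir+json"},
        method="POST",
    )
    # urllib.request.urlopen(request)  # uncomment to send against a real FHIR server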

Journal ArticleDOI
TL;DR: The E-Mobility Systems Architecture (EMSA) Model is proposed, a three-dimensional systems architecture model for the e-mobility sector that fulfills all requirements regarding the management of complexity and ensuring interoperability.
Abstract: The future of e-mobility will consist of a large number of connected electric vehicles, smart charging stations and information systems at the intersection of the electricity and mobility sectors. When engineering and integrating the multitude of systems into even more complex systems-of-systems for e-mobility, interoperability and complexity handling are vital. Model-based system architectures support the engineering process of information systems with the concepts of abstraction, reduction and separation of concerns. In this paper, we contribute to the research body by extracting requirements for managing complexity and interoperability of these systems. Further, a comparative analysis of the state of the art in existing architecture models and frameworks for e-mobility is conducted. Based on the identified gaps in existing research, we propose the E-Mobility Systems Architecture (EMSA) Model, a three-dimensional systems architecture model for the e-mobility sector. Its structure originates from the well-established Smart Grid Architecture Model. We further allocate all relevant entities from the e-mobility sector to the EMSA dimensions, including a harmonized role model, a functional reference architecture, component and systems allocation, as well as a mapping of data standards and communication protocols. The model is then validated qualitatively and quantitatively against the requirements using a case study approach. Our evaluation shows that the EMSA Model fulfills all requirements regarding the management of complexity and ensuring interoperability. From the case study, we further identify gaps in current data model standardization for e-mobility.

Journal ArticleDOI
TL;DR: The purpose of this paper is twofold: first, to integrate lean tools in the analysis of customer satisfaction and, second, to examine its implications for research and practice; a real application of QFD and Hoshin Kanri shows how these tools may help service organizations with future development.
Abstract: Purpose: Customer satisfaction refers to the extent to which customers are happy and satisfied with the products and services provided by a business. The purpose of this paper is twofold: first, to integrate lean tools in the analysis of customer satisfaction and, second, to examine its implications for research and practice. Design/methodology/approach: The author proposes the combination of three lean tools in order to design a service quality system that has customer expectations (CEs) as the first input. These tools are quality function deployment (QFD), the Hoshin Kanri planning process (HKPP) and benchmarking. The author uses a case study to show the functionality of these tools and the final design of a service quality system for a medical center. Findings: Interaction between the service provider and the customer is the primary core activity for service-oriented businesses of different natures. A key relationship between trust in service quality and customer satisfaction cannot be ignored in interpersonal-based service encounters. However, there is a gap in the literature in terms of standardized lean-based procedures or methodologies that lead to improved customer satisfaction and that are based directly on CEs. Research limitations/implications: Given the variety of the population, the authors developed several methodologies to standardize the customer responses. Using several total quality management tools, the standardization allows the authors to separate the different CEs. The gathering of customers’ expectations (voice of the customer) allows companies to focus on the real problems expressed by the users of the service, increasing their loyalty and, most importantly in the field under study, the customers’ satisfaction with the service received. Practical implications: For practitioners, this study helps with the use of lean tools such as QFD, benchmarking and HKPP, and attempts to bridge such a gap with an evidence-based real case. Social implications: With the incorporation of all the customer needs, additional elements must be considered in the design of new services. Availability for all and sustainability play an important part in the CEs. Originality/value: This paper presents a real application of QFD and Hoshin Kanri and how they may help service organizations with future development.

Journal ArticleDOI
TL;DR: The history of global activities on ICN since 2010 is described, with references to various projects, and the recent progress in the standardization of ICN component technologies in ITU-T, as well as the various documents produced by ICNRG, is also described.
Abstract: Information-centric networking (ICN) is a new approach to networking contents rather than devices that hold the contents. It has recently attracted much attention of network research and standardization communities. National and multi-national funded research projects have progressed worldwide. International Telecommunication Union - Telecommunication Standardization Sector (ITU-T) started ICN standardization activities in 2012. In parallel, the standards-oriented research cooperation is progressing in the Information-Centric Networking Research Group (ICNRG) of the Internet Research Task Force (IRTF). All these global efforts have been collectively advancing the novel network architecture of ICN. However, there are very few surveys and discussions on the detailed ICN standardization status. To update the reader with information about the ICN research and standardization related activities, this paper starts with the history of global activities on ICN from 2010, giving references to various projects. It then describes the recent progress in the standardization of ICN component technologies in ITU-T and various documents produced by ICNRG. Lastly, it discusses the future directions for progressing ICN.


Book ChapterDOI
16 Aug 2019
TL;DR: The second edition of Studies has been updated to conform to the latest available Ada manual, and all of its references to the Ada manual come from this document.
Abstract: Since the initial publication of Studies, some parts of the Ada language have been changed in response to the ANSI standardization process. The second edition of Studies has been updated to conform to the latest available Ada manual [Department of Defense 82]. All of our references to the Ada manual come from this document (see page 36).

Posted Content
TL;DR: A tutorial on Segment Routing technology, with a focus on the novel SRv6 solution, is presented, together with a comprehensive survey of SR technology analyzing standardization efforts, patents, research activities and implementation results.
Abstract: Fixed and mobile telecom operators, enterprise network operators and cloud providers strive to face the challenging demands coming from the evolution of IP networks (e.g. huge bandwidth requirements, integration of billions of devices and millions of services in the cloud). Proposed in the early 2010s, Segment Routing (SR) architecture helps face these challenging demands, and it is currently being adopted and deployed. SR architecture is based on the concept of source routing and has interesting scalability properties, as it dramatically reduces the amount of state information to be configured in the core nodes to support complex services. SR architecture was first implemented with the MPLS dataplane and then, quite recently, with the IPv6 dataplane (SRv6). IPv6 SR architecture (SRv6) has been extended from the simple steering of packets across nodes to a general network programming approach, making it very suitable for use cases such as Service Function Chaining and Network Function Virtualization. In this paper we present a tutorial and a comprehensive survey on SR technology, analyzing standardization efforts, patents, research activities and implementation results. We start with an introduction on the motivations for Segment Routing and an overview of its evolution and standardization. Then, we provide a tutorial on Segment Routing technology, with a focus on the novel SRv6 solution. We discuss the standardization efforts and the patents providing details on the most important documents and mentioning other ongoing activities. We then thoroughly analyze research activities according to a taxonomy. We have identified 8 main categories during our analysis of the current state of play: Monitoring, Traffic Engineering, Failure Recovery, Centrally Controlled Architectures, Path Encoding, Network Programming, Performance Evaluation and Miscellaneous...
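
The core idea summarized above, steering a packet through an ordered list of segments carried in the packet itself, can be pictured with a toy model of how an SRv6 segment list is consumed. This is a conceptual sketch only, not a protocol implementation; the IPv6 addresses are documentation-prefix placeholders.

    # Toy model of SRv6 source routing (conceptual only, not a protocol
    # implementation): the ingress encodes an ordered segment list in the packet;
    # each segment endpoint decrements "Segments Left" and forwards towards the
    # next segment identifier (SID). Addresses use the IPv6 documentation prefix.
    from dataclasses import dataclass, field

    @dataclass
    class SRv6Packet:
        segment_list: list            # SIDs stored in reverse order, as in the SRH
        segments_left: int = field(init=False)

        def __post_init__(self):
            self.segments_left = len(self.segment_list) - 1

        @property
        def active_segment(self):
            return self.segment_list[self.segments_left]

        def process_at_segment_endpoint(self):
            """Run at each node whose SID is the packet's active segment."""
            if self.segments_left == 0:
                return "deliver"                      # final segment reached
            self.segments_left -= 1                   # advance to the next segment
            return f"forward towards {self.active_segment}"

    # Steer a packet through two service SIDs before the final destination.
    packet = SRv6Packet(["2001:db8::d", "2001:db8::svc2", "2001:db8::svc1"])
    action = packet.process_at_segment_endpoint()
    while action != "deliver":
        print(action)
        action = packet.process_at_segment_endpoint()
    print(action)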

Journal ArticleDOI
TL;DR: This paper envisions a future deep feature coding standard for AI-oriented large-scale video management and discusses existing techniques, standards, and possible solutions to the open problems.
Abstract: Deep learning has achieved substantial success in intelligent video analysis. However, practically deploying deep neural network models for large-scale video analysis still faces unprecedented challenges. Deep feature coding, instead of video coding, provides a practical solution for handling large-scale video surveillance data. To enable interoperability in the context of deep feature coding, standardization is urgent and important. This paper envisions a future deep feature coding standard for AI-oriented large-scale video management and discusses existing techniques, standards, and possible solutions to the open problems.

Proceedings ArticleDOI
16 Apr 2019
TL;DR: The current state of standardization of UWB technology and the proposed changes are described, and the new enhancements are compared with the existing standards.
Abstract: This paper focuses on the new IEEE 802.15.4z standard, which seeks to enhance the already existing standards for Impulse Radio Ultra-Wideband (UWB) technology. We describe the current state of standardization of UWB technology and the proposed changes to be made. In the last part, we compare the new enhancements with the existing standards and describe the proposed improvements in ranging capabilities, power consumption and security for both HRP and LRP UWB PHYs, while also naming several practical applications where these new enhancements will be used.

Journal ArticleDOI
TL;DR: A toolbox for interpreting current and future regulatory restrictions is provided, and an integrated method for design planning, validation and clinical testing is proposed; this method should be evaluated in the future in order to assess its potentially positive impact on fostering innovation and on ensuring timely development.
Abstract: Medical devices are designed, tested, and placed on the market in a highly regulated environment. Wearable sensors are crucial components of various medical devices: the design and validation of wearable sensors, if managed according to international standards, can foster innovation while respecting regulatory requirements. The purpose of this paper is to review the upcoming European Union (EU) Medical Device Regulations 2017/745 and 2017/746, and the current and future International Electrotechnical Commission (IEC) and International Organization for Standardization (ISO) standards that set methods for the design and validation of medical devices, with a focus on wearable sensors. Risk classification according to the regulation is described. The international standards IEC 62304, IEC 60601, ISO 14971, and ISO 13485 are reviewed to define regulatory restrictions during the design, pre-clinical validation and clinical validation of devices that include wearable sensors as crucial components. This paper is not about any specific innovation; rather, it provides a toolbox for interpreting current and future regulatory restrictions, and an integrated method for design planning, validation and clinical testing is proposed. Application of this method to the design of wearable sensors should be evaluated in the future in order to assess its potentially positive impact on fostering innovation and on ensuring timely development.

Journal ArticleDOI
TL;DR: A series of standards on pre-analytical sample processing has been published by the International Organization for Standardization (ISO) and the European Committee for Standardization (CEN), and they are of relevance for IVD product developers in the context of (re)certification under the IVDR.

Proceedings ArticleDOI
01 Oct 2019
TL;DR: This paper is the first to analytically evaluate the performance of the collective perception service when operated using the ad hoc mode of cellular V2X, known as Mode 4, as defined by the 3GPP in Release 14 and aims at supporting the ongoing standardization efforts.
Abstract: One of the major challenges on the way towards full traffic automation is the incomplete environmental awareness of the traffic participants. The most promising approach to overcome the physical performance limitations due to the restricted range of current vehicle on-board sensors such as cameras, LIDARs and radars is to extend the vehicle’s perception using vehicle-to-everything (V2X) communication. It allows traffic participants to share information gathered by their sensors among each other. This application of V2X-communication is commonly referred to as collective perception and will be mainly supported in Europe by the collective perception service currently under standardization at the European Telecommunications Standards Institute (ETSI). This paper is the first to analytically evaluate the performance of the service when operated using the ad hoc mode of cellular V2X, known as Mode 4, as defined by the 3GPP in Release 14 and aims at supporting the ongoing standardization efforts. To this end, several performance metrics are investigated on the example of a highway scenario under varying conditions and the usability of the LTE-V based service for two vehicle safety applications is discussed.

Journal ArticleDOI
TL;DR: In this article, the authors examine the professionalism and professionalization of sustainability assurance providers based on the experiences and perceptions of auditors involved in this activity, and highlight the division of this professional activity between accounting and consulting firms, each of which questions the professionalism of the other.
Abstract: The purpose of this paper is to examine the professionalism and professionalization of sustainability assurance providers based on the experiences and perceptions of auditors involved in this activity. The empirical study was based on 38 semi-directed interviews conducted with assurance providers from accounting and consulting firms. The findings highlight the division of this professional activity between accounting and consulting firms, each of which questions the professionalism of the other. The main standards in this area tend to be used as legitimizing tools to enhance the credibility of the assurance process rather than effective guidelines to improve the quality of the verification process. Finally, the complex and multifaceted skills required to conduct sound sustainability assurance and the virtual absence of recognized and substantial training programs in this area undermine the professionalization of assurance providers. This work has important practical implications for standardization bodies, assurance providers and stakeholders concerned by the quality and the reliability of sustainability disclosure. This study shows how practitioners in this area construct and legitimize their professional activity in terms of identity, standardization and competences. The work contributes to the literatures on the assurance of sustainability reports, self-regulation through standardization and professionalization.

Book ChapterDOI
01 Sep 2019
TL;DR: In this paper, the authors argue that international standards bring about and solidify technological evolution, innovation and diffusion of knowledge, and they play a decisive role as to whether the business and market environment will be conducive to increased innovation and trade.
Abstract: Increased international standardization by the private sector results from an ever-increasing demand of consumers for better and safer products, technological advances, the expansion of global trade and the ever-increasing focus on social and sustainability issues. International standards affect our everyday life in multiple ways. Standards bring about and solidify technological evolution, innovation and diffusion of knowledge. In that respect, they have an important impact on consumer wellbeing. They play a decisive role as to whether the business and market environment will be conducive to increased innovation and trade. They form an important condition for doing business and affect access to markets, determining the profitability, growth and ultimately the survival of entrepreneurs and economic operators alike. Hence, standards have a crucial trade facilitation function.

Journal ArticleDOI
TL;DR: The Portuguese reality regarding the relevance given to OHS is characterized, independently of the companies' sector of activity or size, and 98% of these companies are aware of the benefits that an OHSMS provides or could provide.

Journal ArticleDOI
TL;DR: It is analytically derived that global standardization generally yields inconsistent (asymptotically biased) estimates for the estimand when between-person differences in within-person standard deviations exist and the average within- person relation is nonzero.
Abstract: Person-mean centering has been recommended for disaggregating between-person and within-person effects when modeling time-varying predictors. Multilevel modeling textbooks recommended global standardization for standardizing fixed effects. An aim of this study is to evaluate whether and when person-mean centering followed by global standardization can accurately estimate fixed-effects within-person relations (the estimand of interest in this study) in multilevel modeling. We analytically derived that global standardization generally yields inconsistent (asymptotically biased) estimates for the estimand when between-person differences in within-person standard deviations exist and the average within-person relation is nonzero. Alternatively, a person-mean-SD standardization (P-S) approach yields consistent estimates. Our simulation results further revealed (1) how misleading the results from global standardization were under various circumstances and (2) the P-S approach had accurate estimates and satisfactory coverage rates of fixed-effects within-person relations when the number of occasions is 30 or more (in many conditions, performance was satisfactory with 10 or 20 occasions). A daily diary data example, focused on emotional complexity, was used to empirically illustrate the approaches. Researchers should choose standardization approaches based on theoretical considerations and should clearly describe the purpose and procedure of standardization in research articles.
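
To illustrate the two standardization approaches being compared, the following sketch applies person-mean centering and then either global standardization or a per-person scaling by each person's own within-person SD (one plausible reading of the P-S approach) to simulated long-format diary data. The column names and data are illustrative assumptions, not the authors' variables or procedure.

    # Sketch of the two standardization approaches compared above, applied to
    # simulated long-format diary data. Column names and data are illustrative
    # assumptions, not the authors' variables or procedure.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "person": np.repeat(np.arange(20), 30),   # 20 persons x 30 occasions
        "x": rng.normal(size=20 * 30),
    })

    # Person-mean centering isolates within-person fluctuation.
    df["x_within"] = df["x"] - df.groupby("person")["x"].transform("mean")

    # Global standardization: one overall SD for everyone.
    df["x_global_std"] = df["x_within"] / df["x_within"].std()

    # Person-mean-SD (P-S) standardization: divide by each person's own
    # within-person SD instead of a single global SD.
    df["x_ps_std"] = df["x_within"] / df.groupby("person")["x_within"].transform("std")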

Proceedings ArticleDOI
09 Jun 2019
TL;DR: The C++ Library for Responsibility Sensitive Safety is an open-source executable that implements a subset of the Responsibility Sensitive Safety (RSS) model, and it can be used to explore the usefulness of the RSS model through parameter exploration and analysis of the minimum safe longitudinal distance.
Abstract: The need for safety in Automated Driving (AD) is becoming increasingly critical with the accelerating deployment of this technology. Beyond functional safety, industry must guarantee the operational safety of automated vehicles. Towards that end, Mobileye introduced Responsibility Sensitive Safety (RSS), a model-based approach to safety [1]. In this paper we expand upon this work by introducing the C++ Library for Responsibility Sensitive Safety, an open source executable that implements a subset of RSS. We provide architectural details to integrate the C++ Library for Responsibility Sensitive Safety with AD software pipelines as a safety module overseeing the decision making of driving policies. We illustrate this application with an example integration with the Baidu Apollo AD stack and simulator, [2] and [3], that provides safety validation of the planning module. Furthermore, we show how the C++ Library for Responsibility Sensitive Safety can be used to explore the usefulness of the RSS model through parameter exploration and analysis of the minimum safe longitudinal distance ($d_{min}$), considering different weather conditions. We also compare these results with the half-of-speed rule followed in some parts of the world. We expect that the C++ Library for Responsibility Sensitive Safety will become a critical component of future tools for formal verification, testing and validation of AD safety and that it will help bootstrap AD research efforts towards standardization of safety.
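
The parameter exploration of the minimum safe longitudinal distance mentioned above can be illustrated with the published RSS formula for two vehicles driving in the same direction. The sketch below is based on that formula and is not taken from the C++ library itself; the response time, acceleration and braking parameters are illustrative assumptions standing in for the kind of weather-dependent values the paper explores.

    # Sketch of the RSS minimum safe longitudinal distance d_min for two vehicles
    # driving in the same direction, based on the published RSS model [1]; this is
    # not code from the C++ library, and all parameter values are illustrative.

    def rss_min_longitudinal_distance(v_rear, v_front, rho,
                                      a_accel_max, b_brake_min, b_brake_max):
        """Worst case: the rear vehicle accelerates during the response time rho,
        then brakes gently, while the front vehicle brakes as hard as possible."""
        rear_stop = (v_rear * rho
                     + 0.5 * a_accel_max * rho ** 2
                     + (v_rear + rho * a_accel_max) ** 2 / (2 * b_brake_min))
        front_stop = v_front ** 2 / (2 * b_brake_max)
        return max(0.0, rear_stop - front_stop)

    # Both vehicles at 20 m/s; assumed lower braking capability on a wet road
    # increases d_min, the kind of weather-dependent exploration described above.
    dry = rss_min_longitudinal_distance(20, 20, rho=1.0, a_accel_max=2.0,
                                        b_brake_min=4.0, b_brake_max=8.0)
    wet = rss_min_longitudinal_distance(20, 20, rho=1.0, a_accel_max=2.0,
                                        b_brake_min=3.0, b_brake_max=6.0)
    print(f"d_min dry ~ {dry:.1f} m, wet ~ {wet:.1f} m")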