
Showing papers on "Quality (business)" published in 2017


Proceedings Article
04 Dec 2017
TL;DR: The authors propose penalizing the norm of the gradient of the critic with respect to its input, improving the training stability of Wasserstein GANs and enabling stable training of a wide variety of GAN architectures with almost no hyperparameter tuning.
Abstract: Generative Adversarial Networks (GANs) are powerful generative models, but suffer from training instability. The recently proposed Wasserstein GAN (WGAN) makes progress toward stable training of GANs, but sometimes can still generate only poor samples or fail to converge. We find that these problems are often due to the use of weight clipping in WGAN to enforce a Lipschitz constraint on the critic, which can lead to undesired behavior. We propose an alternative to clipping weights: penalize the norm of the gradient of the critic with respect to its input. Our proposed method performs better than standard WGAN and enables stable training of a wide variety of GAN architectures with almost no hyperparameter tuning, including 101-layer ResNets and language models with continuous generators. We also achieve high-quality generations on CIFAR-10 and LSUN bedrooms.
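As an illustration of the penalty described above, here is a minimal PyTorch sketch of the gradient-penalty term; the two-sided penalty toward unit norm and the default weight of 10 follow the paper, while the critic interface and the 4D image-tensor shapes are assumptions made for the example.

```python
import torch

def gradient_penalty(critic, real, fake, lambda_gp=10.0):
    # Random interpolates between real and fake samples (assumes 4D image batches).
    eps = torch.rand(real.size(0), 1, 1, 1, device=real.device)
    x_hat = (eps * real + (1 - eps) * fake).requires_grad_(True)
    scores = critic(x_hat)
    # Gradient of the critic's output with respect to its input.
    grads, = torch.autograd.grad(
        outputs=scores, inputs=x_hat,
        grad_outputs=torch.ones_like(scores),
        create_graph=True)
    grad_norm = grads.flatten(start_dim=1).norm(2, dim=1)
    # Penalize deviation of the per-sample gradient norm from 1.
    return lambda_gp * ((grad_norm - 1.0) ** 2).mean()
```

The returned term is added to the critic's loss on each update, replacing weight clipping.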

3,622 citations


Book ChapterDOI
03 Nov 2017
TL;DR: Schemata are active computational devices capable of evaluating the quality of their own fit to the available data, as discussed by the authors; they are employed in interpreting sensory data, retrieving information from memory, organizing actions, determining goals and subgoals, allocating resources, and, generally, guiding the flow of processing in the system.
Abstract: Schemata are employed in the process of interpreting sensory data, in retrieving information from memory, in organizing actions, in determining goals and subgoals, in allocating resources, and, generally, in guiding the flow of processing in the system. Perhaps the central function of schemata is in the construction of an interpretation of an event, object, or situation—that is, in the process of comprehension. Schemata are active computational devices capable of evaluating the quality of their own fit to the available data. Schemata consist of subschemata as procedures consist of subprocedures. A schema is said to be activated from the bottom-up whenever a subschema that has been somehow activated causes the various schemata of which it is a part to be activated. One of the central problems of a schema theory is a specification of the process whereby new schemata are developed.

2,061 citations


Journal ArticleDOI
TL;DR: Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.
Abstract: A prototype decision support IT-tool for the food industry was developed in the STARTEC project. Typical processes and decision steps were mapped using real-life production scenarios of participating food companies manufacturing complex ready-to-eat foods. Companies looked for a more integrated approach when making food safety decisions that would align with existing HACCP systems. The tool was designed with shelf life assessments and data on safety, quality, and costs, using a pasta salad meal as a case product. The process flow chart was used as the starting point, with simulation options at each process step. Key parameters like pH, water activity, costs of ingredients and salaries, and default models for calculations of Listeria monocytogenes, quality scores, and vitamin C were placed in an interactive database. Customization of the models and settings was possible in the user interface. The simulation module outputs were provided as detailed curves or categorized as "good", "sufficient", or "corrective action needed" based on threshold limit values set by the user. Possible corrective actions were suggested by the system. The tool was tested and approved by end-users based on selected ready-to-eat food products. Compared to other decision support tools, the STARTEC-tool is product-specific and multidisciplinary and includes interpretation and targeted recommendations for end-users.
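A small sketch of the threshold-based categorization described above; the parameter names and example thresholds are hypothetical, not taken from the STARTEC tool itself.

```python
def categorize(value, good_limit, sufficient_limit):
    # Lower values are better here (e.g., predicted log CFU/g of
    # L. monocytogenes at end of shelf life); limits are user-set.
    if value <= good_limit:
        return "good"
    if value <= sufficient_limit:
        return "sufficient"
    return "corrective action needed"

# Hypothetical thresholds for one simulated output of a pasta salad run
print(categorize(1.7, good_limit=1.0, sufficient_limit=2.0))  # "sufficient"
```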

1,187 citations


Posted Content
TL;DR: This survey presents a comprehensive review of detecting fake news on social media, including fake news characterizations on psychology and social theories, existing algorithms from a data mining perspective, evaluation metrics and representative datasets, and future research directions for fake news detection on socialMedia.
Abstract: Social media for news consumption is a double-edged sword. On the one hand, its low cost, easy access, and rapid dissemination of information lead people to seek out and consume news from social media. On the other hand, it enables the wide spread of "fake news", i.e., low quality news with intentionally false information. The extensive spread of fake news has the potential for extremely negative impacts on individuals and society. Therefore, fake news detection on social media has recently become an emerging research topic that is attracting tremendous attention. Fake news detection on social media presents unique characteristics and challenges that make existing detection algorithms from traditional news media ineffective or not applicable. First, fake news is intentionally written to mislead readers to believe false information, which makes it difficult and nontrivial to detect based on news content; therefore, we need to include auxiliary information, such as user social engagements on social media, to help make a determination. Second, exploiting this auxiliary information is challenging in and of itself as users' social engagements with fake news produce data that is big, incomplete, unstructured, and noisy. Because the issue of fake news detection on social media is both challenging and relevant, we conducted this survey to further facilitate research on the problem. In this survey, we present a comprehensive review of detecting fake news on social media, including fake news characterizations on psychology and social theories, existing algorithms from a data mining perspective, evaluation metrics and representative datasets. We also discuss related research areas, open problems, and future research directions for fake news detection on social media.

887 citations


Journal ArticleDOI
TL;DR: In this paper, the authors proposed a framework to evaluate sustainable supplier selection by using an integrated analytical hierarchy process (AHP) and VIKOR (ViseKriterijumska Optimizacija I Kompromisno Resenje), a multi-criteria optimization and compromise solution approach.
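A generic sketch of the VIKOR ranking step under its standard formulation; the decision matrix and the criteria weights (of the kind AHP would supply) are hypothetical, not the paper's data.

```python
import numpy as np

def vikor(F, w, v=0.5):
    # F: alternatives x criteria (benefit criteria, higher is better);
    # w: criteria weights summing to 1; v: weight on group utility.
    f_best, f_worst = F.max(axis=0), F.min(axis=0)
    regret = w * (f_best - F) / (f_best - f_worst)  # weighted normalized distance
    S, R = regret.sum(axis=1), regret.max(axis=1)   # group utility / max regret
    Q = (v * (S - S.min()) / (S.max() - S.min())
         + (1 - v) * (R - R.min()) / (R.max() - R.min()))
    return Q  # lower Q = better compromise alternative

# Hypothetical suppliers scored on quality, delivery, and CSR criteria
F = np.array([[7.0, 8.0, 6.0],
              [9.0, 6.0, 7.0],
              [6.0, 9.0, 8.0]])
w = np.array([0.5, 0.3, 0.2])   # weights as AHP might produce them
print(np.argsort(vikor(F, w)))  # supplier indices from best to worst
```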

610 citations


Journal ArticleDOI
TL;DR: An exhaustive evaluation of 24 identical units of a commercial low-cost sensor platform against CEN (European Committee for Standardization) reference analyzers, assessing their measurement capability over time and across a range of environmental conditions, shows that their performance varies spatially and temporally.

607 citations


Journal ArticleDOI
TL;DR: This review focuses on recent important challenges in the quality evaluation of medicinal plants with respect to authenticity, efficacy, toxicity and consistency.
Abstract: Human societies have been in close contact with their environments since the beginning of their formation and have used ingredients from the environment to obtain food and medicine. Awareness and application of plants to prepare food and medicine were achieved through trial and error, and humans gradually became able to meet their needs from their surroundings. Information about medicinal plants has long been transmitted from generation to generation, and human knowledge has gradually become more complete with the formation of civilizations and the provision of more facilities. Medicinal plants are used as a medical resource in almost all cultures. Ensuring the safety, quality and effectiveness of medicinal plants and herbal drugs has only recently become a key issue in industrialized and developing countries. By standardizing and evaluating the health effects of active plant-derived compounds, herbal drugs can help usher in a new era of the healthcare system to treat human diseases in the future. Awareness of traditional knowledge and medicinal plants can play a key role in the exploitation and discovery of natural plant resources. In order to maintain this knowledge, a comprehensive approach and collaboration are needed to maintain historical records on medicinal plants and to use these resources in favour of human beings, before they are destroyed forever. Therefore, this review was conducted to investigate and describe the process of using medicinal plants throughout history. It focuses on recent important challenges in the quality evaluation of medicinal plants with respect to authenticity, efficacy, toxicity and consistency.

419 citations


Journal ArticleDOI
TL;DR: This paper reviews how the Water Framework Directive (WFD) has been interpreted, focusing on its intentions and how they were applied; it identifies the absence of the paradigm shift towards the systems (integrated) thinking on which the WFD was grounded as a fundamental problem with its implementation.

399 citations


Journal ArticleDOI
TL;DR: The COS-STAD project has established 11 minimum standards to be followed by COS developers when planning their projects and by users when deciding whether a COS has been developed using reasonable methods.
Abstract: Background The use of core outcome sets (COS) ensures that researchers measure and report those outcomes that are most likely to be relevant to users of their research. Several hundred COS projects have been systematically identified to date, but there has been no formal quality assessment of these studies. The Core Outcome Set-STAndards for Development (COS-STAD) project aimed to identify minimum standards for the design of a COS study agreed upon by an international group, while other specific guidance exists for the final reporting of COS development studies (Core Outcome Set-STAndards for Reporting [COS-STAR]). Methods and findings An international group of experienced COS developers, methodologists, journal editors, potential users of COS (clinical trialists, systematic reviewers, and clinical guideline developers), and patient representatives produced the COS-STAD recommendations to help improve the quality of COS development and support the assessment of whether a COS had been developed using a reasonable approach. An open survey of experts generated an initial list of items, which was refined by a 2-round Delphi survey involving nearly 250 participants representing key stakeholder groups. Participants assigned importance ratings for each item using a 1–9 scale. Consensus that an item should be included in the set of minimum standards was defined as at least 70% of the voting participants from each stakeholder group providing a score between 7 and 9. The Delphi survey was followed by a consensus discussion with the study management group representing multiple stakeholder groups. COS-STAD contains 11 minimum standards that are the minimum design recommendations for all COS development projects. The recommendations focus on 3 key domains: the scope, the stakeholders, and the consensus process. Conclusions The COS-STAD project has established 11 minimum standards to be followed by COS developers when planning their projects and by users when deciding whether a COS has been developed using reasonable methods.
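The consensus rule stated above (at least 70% of voters in every stakeholder group scoring 7-9) is directly computable; a small sketch with hypothetical group names and ratings:

```python
def meets_consensus(scores_by_group, lo=7, hi=9, threshold=0.70):
    # Consensus requires every stakeholder group to have >= 70% of its
    # voters rating the item between 7 and 9 on the 1-9 scale.
    return all(
        sum(lo <= s <= hi for s in scores) / len(scores) >= threshold
        for scores in scores_by_group.values()
    )

votes = {"clinical_trialists": [8, 9, 7, 6, 8],          # 4/5 = 80%
         "patient_representatives": [7, 7, 9, 8, 5, 9]}  # 5/6 ~ 83%
print(meets_consensus(votes))  # True
```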

380 citations


Journal ArticleDOI
TL;DR: OriginChain, described in this paper, is a real-world traceability system using a blockchain; it provides transparent tamper-proof traceability information, automates regulatory compliance checking, and enables system adaptability.
Abstract: Traceability allows tracking products through all stages of a supply chain, which is crucial for product quality control. To provide accountability and forensic information, traceability information must be secured. This is challenging because traceability systems often must adapt to changes in regulations and to customized traceability inspection processes. OriginChain is a real-world traceability system using a blockchain. Blockchains are an emerging data storage technology that enables new forms of decentralized architectures. Components can agree on their shared states without trusting a central integration point. OriginChain’s architecture provides transparent tamper-proof traceability information, automates regulatory compliance checking, and enables system adaptability.
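To illustrate the tamper-evidence property the abstract attributes to blockchains, here is a toy hash chain over traceability records; this is a sketch of the general mechanism only, not OriginChain's implementation, which the paper does not detail at this level.

```python
import hashlib
import json

GENESIS = "0" * 64

def block_hash(record, prev_hash):
    payload = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain, record):
    prev = chain[-1]["hash"] if chain else GENESIS
    chain.append({"record": record, "prev": prev, "hash": block_hash(record, prev)})

def verify(chain):
    prev = GENESIS
    for block in chain:
        if block["prev"] != prev or block_hash(block["record"], prev) != block["hash"]:
            return False  # editing any record invalidates all later hashes
        prev = block["hash"]
    return True

chain = []
append(chain, {"batch": "A1", "step": "harvest"})    # hypothetical trace records
append(chain, {"batch": "A1", "step": "packaging"})
chain[0]["record"]["step"] = "tampered"
print(verify(chain))  # False
```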

376 citations


Journal ArticleDOI
TL;DR: A theoretical model integrating theories of information systems' satisfaction and success in e-learning systems is proposed and empirically validated in higher education institutions and university centers in Brazil through the quantitative method of structural equation modeling.
Abstract: E-learning is a web-based learning ecosystem for the dissemination of information, communication, and knowledge for education and training. Understanding the impact of e-learning on society, as well as its benefits, is important to link e-learning systems to their success drivers. The aim of this study is to find the determinants of user perceived satisfaction, use, and individual impact of e-learning. This study proposes a theoretical model integrating theories of information systems' satisfaction and success in the e-learning systems. The model was empirically validated in higher education institutions and university centers in Brazil through a quantitative method of structural equation modeling. Collaboration quality, information quality, and user perceived satisfaction explain e-learning use. The drivers of user perceived satisfaction are information quality, system quality, instructor attitude toward e-learning, diversity in assessment, and learner perceived interaction with others. System quality, use, and user perceived satisfaction explain individual impact.

Journal ArticleDOI
13 Sep 2017 - Neuron
TL;DR: An automated clustering approach and associated software package that have the potential to enable reproducible, automated spike sorting of larger-scale recordings than is currently possible, with accuracy comparable to or exceeding that achieved using manual or semi-manual techniques.

Journal ArticleDOI
TL;DR: In this article, the authors present a review of recent optimization methods applied to solve the problem of placement and sizing of distributed generation units from renewable energy sources based on a classification of the most recent and highly cited papers.

Journal ArticleDOI
TL;DR: A growing evidence base supports the hypothesis that greener cities are healthier cities, and recommendations for further research are made.
Abstract: Currently half the world population lives in cities, and this proportion is expected to increase rapidly to 70% in the coming years. Over the years, we have created large, mostly grey cities with many high-rise buildings and little green space. Disease rates tend to be higher in urban areas than in rural areas. More green space in cities could reduce these rates. Here, we describe the importance of green space for health, and make recommendations for further research. Green space has been associated with many beneficial health effects, including reduced all-cause and cardiovascular mortality and improved mental health, possibly through mediators such as reduced air pollution, temperature, and stress, and increased physical activity, social contacts, and restoration. Additional studies are needed to strengthen the evidence base and provide further guidelines to transport planners, urban planners, and landscape architects. We need more longitudinal studies and intervention studies, further understanding of the contribution of various mechanisms toward health, and more information on susceptible populations and on where, when, how much, and what type of green space is needed. Also needed are standardized methods for green space quality assessments and evaluations of the effectiveness of green prescriptions in clinical practice. Many questions are ideally suited for environmental epidemiologists, who should work with other stakeholders to address the right questions and translate knowledge into action. In conclusion, a growing evidence base supports the hypothesis that greener cities are healthier cities.

01 Apr 2017
TL;DR: This first study to systematically and quantitatively analyze the links between healthcare provider burnout and healthcare quality and safety across disciplines shows consistent negative relationships with perceived quality, quality indicators, and perceptions of safety.
Abstract: Background: Healthcare provider burnout is considered a factor in quality of care, yet little is known about the consistency and magnitude of this relationship. This meta-analysis examined relationships between provider burnout (emotional exhaustion, depersonalization, and reduced personal accomplishment) and the quality (perceived quality, patient satisfaction) and safety of healthcare.

Proceedings ArticleDOI
01 Sep 2017
TL;DR: This work provides an architecture to enable robotic grasp planning via shape completion through the use of a 3D convolutional neural network trained on a new open source dataset of over 440,000 3D exemplars captured from varying viewpoints.
Abstract: This work provides an architecture to enable robotic grasp planning via shape completion. Shape completion is accomplished through the use of a 3D convolutional neural network (CNN). The network is trained on our own new open source dataset of over 440,000 3D exemplars captured from varying viewpoints. At runtime, a 2.5D pointcloud captured from a single point of view is fed into the CNN, which fills in the occluded regions of the scene, allowing grasps to be planned and executed on the completed object. Runtime shape completion is very rapid because most of the computational costs of shape completion are borne during offline training. We explore how the quality of completions varies based on several factors. These include whether or not the object being completed existed in the training data and how many object models were used to train the network. We also look at the ability of the network to generalize to novel objects, allowing the system to complete previously unseen objects at runtime. Finally, experimentation is done both in simulation and on actual robotic hardware to explore the relationship between completion quality and the utility of the completed mesh model for grasping.
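As a rough sketch of the voxel-in, voxel-out setup the abstract describes, here is a minimal 3D convolutional encoder-decoder in PyTorch; the layer sizes, the 40^3 grid resolution, and the occupancy output are assumptions for illustration, and the paper's actual architecture and training details differ.

```python
import torch
import torch.nn as nn

class CompletionNet(nn.Module):
    # Toy 3D CNN: partial occupancy grid in, completed grid out.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 16, 4, stride=2, padding=1),   # 40^3 -> 20^3
            nn.ReLU(),
            nn.Conv3d(16, 32, 4, stride=2, padding=1),  # 20^3 -> 10^3
            nn.ReLU(),
            nn.ConvTranspose3d(32, 16, 4, stride=2, padding=1),  # 10^3 -> 20^3
            nn.ReLU(),
            nn.ConvTranspose3d(16, 1, 4, stride=2, padding=1),   # 20^3 -> 40^3
            nn.Sigmoid(),  # per-voxel occupancy probability
        )

    def forward(self, x):
        return self.net(x)

net = CompletionNet()
partial = torch.zeros(1, 1, 40, 40, 40)  # voxelized single-view 2.5D capture
completed = net(partial)                 # same-size completed occupancy grid
```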

Journal ArticleDOI
TL;DR: The main limitations to promoting exercise through the patient-clinician interaction are the inadequate quality and scope of existing evidence, incomplete understanding of the mechanisms underlying the beneficial effects of exercise in people with multiple sclerosis, and the absence of a conceptual framework and toolkit for translating the evidence into practice.
Abstract: Exercise can be a beneficial rehabilitation strategy for people with multiple sclerosis to manage symptoms, restore function, optimise quality of life, promote wellness, and boost participation in activities of daily living. However, this population typically engages in low levels of health-promoting physical activity compared with adults from the general population, a fact which has not changed in the past 25 years despite growing evidence of the benefits of exercise. To overcome this challenge, the main limitations to promoting exercise through the patient–clinician interaction must be addressed. These limitations are the inadequate quality and scope of existing evidence, incomplete understanding of the mechanisms underlying the beneficial effects of exercise in people with multiple sclerosis, and the absence of a conceptual framework and toolkit for translating the evidence into practice. Future research to address those limitations will be essential to inform decisions about the inclusion of exercise in the clinical care of people with multiple sclerosis.

Journal ArticleDOI
TL;DR: In this article, the authors examined the relationship of sustainable manufacturing practices with sustainability performance, which considers the environmental, economic and social aspects, and concluded that manufacturing process is the manufacturing stage that gives the most impact on the improvement of sustainability performance.
Abstract: Sustainable manufacturing practices are one of the significant environmental initiatives taken by manufacturing industries to preserve the environment and improve the quality of human life while performing manufacturing activities. With the emergence of the value creation concept, economic value no longer counts as the single factor for measuring manufacturing performance. Within the sustainability context, the impact of manufacturing activities on the environmental and social aspects should be taken into account as the basis for assessing manufacturing performance, which is called sustainability performance. The purpose of this paper is to examine the relationship of sustainable manufacturing practices with sustainability performance, which considers the environmental, economic and social aspects. A questionnaire survey was carried out among 443 ISO 14001 certified manufacturing companies in Malaysia, and structural equation modelling was used to evaluate the relationship of sustainable manufacturing practices with sustainability performance. The findings indicate that the manufacturing process is the manufacturing stage that has the greatest impact on the improvement of sustainability performance; hence, it is concluded that manufacturing companies in Malaysia are highly focussed on the production stage when implementing sustainable manufacturing practices. Although this study indicates a good estimation of the proposed model, additional variables might be added to improve its predictive strength, such as type of industry, economic scale, or ownership; comparing sustainable manufacturing practices between different countries would also be valuable research. The framework proposed here can assist manufacturing industries in conducting sustainability assessments by providing elements of sustainability performance, and can serve as a guideline for selecting appropriate sustainable manufacturing practices and determining to what level the practices need to be improved to leverage companies' sustainability performance.

Journal ArticleDOI
TL;DR: This article outlines common types of mixed methods designs and provides examples of how nursing researchers can apply different mixed methods designs to answer important nursing practice questions.
Abstract: ‘Mixed methods’ is a research approach whereby researchers collect and analyse both quantitative and qualitative data within the same study.1 2 Growth of mixed methods research in nursing and healthcare has occurred at a time of internationally increasing complexity in healthcare delivery. Mixed methods research draws on potential strengths of both qualitative and quantitative methods,3 allowing researchers to explore diverse perspectives and uncover relationships that exist between the intricate layers of our multifaceted research questions. As providers and policy makers strive to ensure quality and safety for patients and families, researchers can use mixed methods to explore contemporary healthcare trends and practices across increasingly diverse practice settings. This article will outline common types of mixed methods designs and provide examples of how nursing researchers can apply different mixed methods designs in order to answer important nursing practice questions. Mixed methods research requires a purposeful mixing of methods in data …

Journal ArticleDOI
TL;DR: The present review aims to provide a detailed description and critique of WB procedures and technicalities, from sample collection through preparation, blotting and detection to analysis of the collected data, in order to help researchers produce reproducible and reliable blots.
Abstract: The applications of Western/immunoblotting (WB) techniques have reached multiple layers of the scientific community and are now considered routine procedures in the field of physiology. Nowhere is this more apparent than in skeletal muscle physiology (i.e., resolving the mechanisms underpinning adaptations to exercise). Indeed, the inclusion of WB data is now considered an essential aspect of many such physiological publications to provide mechanistic insight into regulatory processes. Despite this popularity, and due to the ubiquitous and relatively inexpensive availability of WB equipment, the quality of WB in publications and the subsequent analysis and interpretation of the data can be variable, perhaps resulting in spurious conclusions. This may be due to poor laboratory technique and/or lack of comprehension of the critical steps involved in WB and what quality control procedures should be in place to ensure robust data generation. The present review aims to provide a detailed description and critique of WB procedures and technicalities, from sample collection through preparation, blotting and detection, to analysis of the data collected. We aim to provide the reader with improved expertise to critically conduct, evaluate, and troubleshoot the WB process, to produce reproducible and reliable blots.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the impact of AR on retail user experience and its subsequent influence on user satisfaction and user's willingness to buy, and found that AR significantly shapes user experience by impinging on various characteristics of product quality.

Journal ArticleDOI
TL;DR: This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation; DoE is presented in detail since it represents the first choice for rational pharmaceutical development.
Abstract: At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it makes it possible to build quality into the product, by adopting Deming's profound knowledge approach, comprising systems thinking, understanding of variation, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach, and the statistical toolset for its implementation; DoE is presented in detail since it represents the first choice for rational pharmaceutical development.
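As a small illustration of the DoE workflow the review advocates, the sketch below builds a 2^3 full factorial design and fits a main-effects model by least squares; the factor names and response values are hypothetical, chosen only to show the mechanics.

```python
import itertools
import numpy as np

# Three hypothetical CPPs at coded low/high levels (-1, +1): 8 runs.
factors = ["temperature", "mixing_time", "binder_pct"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Illustrative CQA measurements, one per run (e.g., % dissolution).
y = np.array([78.0, 81.0, 74.0, 83.0, 76.0, 85.0, 72.0, 88.0])

# Fit y = b0 + sum_i b_i * x_i; each b_i estimates a factor's main effect.
X = np.column_stack([np.ones(len(design)), design])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name}: {b:+.2f}")
```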

Journal ArticleDOI
TL;DR: Three different promising approaches from the perspective of dynamic systems and neural networks in tactical performance analysis revealed inter-player coordination, inter-team and inter-line coordination before critical events, as well as team-team interaction and compactness coefficients.
Abstract: Successful tactical match performance depends on the quality of the actions of individual players or teams in space and time during match-play. Technological innovations have led to new possibilities to capture accurate spatio-temporal information on all players and unravel the dynamics and complexity of soccer matches. The main aim of this article is to give an overview of the current state of development of the analysis of position data in soccer. Based on the same single set of position data from a high-level 11 versus 11 match (Bayern Munich against FC Barcelona), three different promising approaches from the perspective of dynamic systems and neural networks are presented. Tactical performance analysis revealed inter-player coordination, inter-team and inter-line coordination before critical events, as well as team-team interaction and compactness coefficients. This could lead to a multi-disciplinary discussion on match analysis in sport science and new avenues for theoretical and practical implications in soccer.
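One common way to operationalize a compactness coefficient from position data is the mean distance of players to their team centroid; the paper's exact definition may differ, so treat this as a generic sketch.

```python
import numpy as np

def team_compactness(positions):
    # positions: (n_players, 2) array of pitch x, y coordinates (m) at one frame.
    centroid = positions.mean(axis=0)
    return float(np.linalg.norm(positions - centroid, axis=1).mean())

# Hypothetical frame: ten outfield players on a 105 x 68 m pitch
frame = np.random.default_rng(0).uniform([0, 0], [105, 68], size=(10, 2))
print(team_compactness(frame))  # mean distance to centroid, in metres
```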

Journal ArticleDOI
TL;DR: The study identifies that system quality and information quality are key to enhancing business value and FPER in a big data environment, and proposes that the relationship between quality and FPER is mediated by the business value of big data.
Abstract: Big data analytics have become an increasingly important component for firms across advanced economies. This paper examines the quality dynamics in a big data environment that are linked with enhancing business value and firm performance (FPER). The study identifies that system quality (i.e. system reliability, accessibility, adaptability, integration, response time and privacy) and information quality (i.e. completeness, accuracy, format and currency) are key to enhancing business value and FPER in a big data environment. The study also proposes that the relationship between quality and FPER is mediated by the business value of big data. Drawing on the resource-based theory and the information systems success literature, this study extends knowledge in this domain by linking system quality, information quality, business value and FPER.

Posted Content
TL;DR: A literature review of quality issues and attributes as they relate to the contemporary issue of chatbot development and implementation is presented, and a quality assessment method based on these attributes and the Analytic Hierarchy Process is proposed and examined.
Abstract: Chatbots are one class of intelligent, conversational software agents activated by natural language input (which can be in the form of text, voice, or both). They provide conversational output in response, and if commanded, can sometimes also execute tasks. Although chatbot technologies have existed since the 1960s and have influenced user interface development in games since the early 1980s, chatbots are now easier to train and implement. This is due to plentiful open source code, widely available development platforms, and implementation options via Software as a Service (SaaS). In addition to enhancing customer experiences and supporting learning, chatbots can also be used to engineer social harm - that is, to spread rumors and misinformation, or attack people for posting their thoughts and opinions online. This paper presents a literature review of quality issues and attributes as they relate to the contemporary issue of chatbot development and implementation. Finally, quality assessment approaches are reviewed, and a quality assessment method based on these attributes and the Analytic Hierarchy Process (AHP) is proposed and examined.
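A minimal sketch of the AHP weighting step referenced above: priority weights are the normalized principal eigenvector of a pairwise comparison matrix. The three attributes compared here are hypothetical placeholders, not the paper's full attribute set.

```python
import numpy as np

def ahp_weights(pairwise):
    # Normalized principal eigenvector of the reciprocal comparison matrix.
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

# Hypothetical 1-9 scale comparisons among three chatbot quality attributes
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
print(ahp_weights(A))  # roughly [0.65, 0.23, 0.12]
```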

Journal ArticleDOI
TL;DR: In this article, the authors used the best worst method to rank and prioritize service quality attributes identified through an extensive literature review, and the VIKOR (ViseKriterijumska Optimizacija I Kompromisno Resenje) methodology to identify the best airline with respect to these attributes.

Journal ArticleDOI
TL;DR: It is found that multicomponent interventions addressing both patient and clinician roles in overuse have the greatest potential to reduce low-value care.
Abstract: The effectiveness of different types of interventions to reduce low-value care has been insufficiently summarized to allow for translation to practice. This article systematically reviews the literature on the effectiveness of interventions to reduce low-value care and the quality of those studies. We found that multicomponent interventions addressing both patient and clinician roles in overuse have the greatest potential to reduce low-value care. Clinical decision support and performance feedback are promising strategies with a solid evidence base, and provider education yields changes by itself and when paired with other strategies. Further research is needed on the effectiveness of pay-for-performance, insurer restrictions, and risk-sharing contracts to reduce use of low-value care. While the literature reveals important evidence on strategies used to reduce low-value care, meaningful gaps persist. More experimentation, paired with rigorous evaluation and publication, is needed.

Journal ArticleDOI
TL;DR: In this paper, the authors provide a discourse on the relative significance of various factors that determine quality in fruits and vegetables, with emphasis on intrinsic factors pertaining to the preharvest period and also on extrinsic factors shaping quality for supply chain stakeholders and consumers.

Journal ArticleDOI
TL;DR: This paper shows the suitability of the partial least squares approach to SEM (PLS-SEM) for estimating a complex model, drawing on the philosophy of verisimilitude and the methodology of soft modelling assumptions.
Abstract: The emergence of multivariate analysis techniques transforms empirical validation of theoretical concepts in social science and business research. In this context, structural equation modelling (SEM) has emerged as a powerful tool to estimate conceptual models linking two or more latent constructs. This paper shows the suitability of the partial least squares (PLS) approach to SEM (PLS-SEM) in estimating a complex model drawing on the philosophy of verisimilitude and the methodology of soft modelling assumptions. The results confirm the utility of PLS-SEM as a promising tool to estimate a complex, hierarchical model in the domain of big data analytics quality.

Journal ArticleDOI
TL;DR: In this article, the authors analyzed the relationship between customer satisfaction and consumer spending in e-commerce and found that higher e-satisfaction results in more spending in online retailing.