Journal ArticleDOI

Low-rank tensors applications for dimensionality reduction of complex hydrocarbon reservoirs

20 Nov 2021 - Energy (Elsevier BV) - pp. 122680
TL;DR: In this paper, the concept of low-rank tensor decomposition is introduced for unconventional reservoir modeling to target issues such as huge dataset management and missing data generation. But it is not suitable for large-scale data sets.
About: This article was published in Energy on 2021-11-20 and is currently open access. It has received 4 citations to date. The article focuses on the topics: Rank (graph theory) & Petrophysics.
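The full text is not reproduced on this page, but the core idea named in the TL;DR, low-rank tensor decomposition as a compression of a gridded reservoir property, can be sketched briefly. The snippet below is a minimal illustration using the open-source tensorly library (the paper does not state which implementation it relies on); the grid size, rank, and synthetic data are assumptions.

```python
# Minimal sketch: low-rank CP decomposition of a synthetic 3-D "property cube".
# Assumptions: tensorly is installed; grid size (40 x 30 x 20) and rank 5 are arbitrary.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)

# Synthetic low-rank-plus-noise volume standing in for a gridded reservoir property.
factors_true = [rng.standard_normal((dim, 5)) for dim in (40, 30, 20)]
cube = tl.cp_to_tensor((np.ones(5), factors_true)) + 0.01 * rng.standard_normal((40, 30, 20))

# Fit a rank-5 CP model and rebuild the cube from the compressed factors.
cp_model = parafac(tl.tensor(cube), rank=5, init="random", random_state=0)
cube_hat = tl.cp_to_tensor(cp_model)

rel_error = tl.norm(cube - cube_hat) / tl.norm(cube)
n_full = cube.size
n_compressed = sum(f.size for f in cp_model.factors) + cp_model.weights.size
print(f"relative reconstruction error: {rel_error:.3e}")
print(f"storage: {n_full} values -> {n_compressed} values")
```

In a missing-data setting the same factorization is typically fitted only to the observed entries (for example with a masked ALS), which is the usual route to the "missing data generation" the TL;DR mentions.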
Citations
Journal ArticleDOI
TL;DR: In this article, the contribution of different gas adsorption phenomena in Marcellus shale for different fractured well configurations is investigated, and the authors determined that a fractured horizontal well is a viable option that allows high gas desorption.
Abstract: Gas adsorption onto the shale system carries significant importance in accurately forecasting gas production and estimating underground reserves. For the Marcellus shale system, the adsorption of gas also plays a critical role in ultimate recovery and overall reserves quantification. Yet, the effect of different adsorption characteristics in different Marcellus shale regions is not systematically analyzed together. In this study, the contribution of different gas adsorption phenomena in Marcellus shale for different fractured well configurations is investigated. The objective is to understand the reservoir production responses under various adsorption characteristics and well design. For this analysis, a mechanistic Marcellus shale model under confining stresses is numerically simulated with the available literature data. After that, six samples containing adsorption characteristics of different Marcellus shale regions are taken from the literature and specified in the model for accurately defining the adsorption physics in the shale system. In the end, two different well configurations including the fractured vertical and horizontal well are specified in the model separately to analyze the impact of gas desorption on production response. The analysis indicates that the gas desorption improves the overall gas production by a maximum of 5% in a single-stage multi-clustered fractured horizontal well. In addition, the effect of desorption is found to be minimal during initial flow periods, and considerable at longer flow periods. Additionally, the gas desorption is found to be more responsive towards high surface area and large fracture networks. Finally, it is determined that a fractured horizontal well is a viable option that allows high gas desorption in Marcellus shale. This study, hence, aids widely in deciding better production strategies based on adsorption characteristics for producing Marcellus shale.

8 citations
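Gas adsorption in shale simulation studies like the one above is most commonly described with a Langmuir isotherm; the abstract does not state which model was used, so the sketch below is purely illustrative, and the Langmuir volume and pressure values are hypothetical placeholders rather than Marcellus data.

```python
# Hedged sketch: Langmuir isotherm for adsorbed gas content, and the incremental
# volume released as reservoir pressure declines. All parameter values are
# hypothetical placeholders, not measured Marcellus shale properties.
import numpy as np

def langmuir_adsorbed_volume(p, v_l, p_l):
    """Adsorbed gas content (scf/ton) at pressure p (psia) for Langmuir
    volume v_l (scf/ton) and Langmuir pressure p_l (psia)."""
    return v_l * p / (p_l + p)

v_l, p_l = 100.0, 500.0           # assumed Langmuir constants
p_initial, p_abandon = 5000.0, 500.0

v_initial = langmuir_adsorbed_volume(p_initial, v_l, p_l)
v_final = langmuir_adsorbed_volume(p_abandon, v_l, p_l)
desorbed = v_initial - v_final    # gas released per ton of rock over the drawdown

print(f"adsorbed at {p_initial:.0f} psia: {v_initial:.1f} scf/ton")
print(f"adsorbed at {p_abandon:.0f} psia:  {v_final:.1f} scf/ton")
print(f"desorbed over drawdown:            {desorbed:.1f} scf/ton")
```

Because the isotherm flattens at high pressure, most of the incremental desorbed gas appears late in the drawdown, which is consistent with the abstract's observation that desorption effects are minimal early and considerable at longer flow periods.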

Journal ArticleDOI
01 Oct 2022
TL;DR: This study presents a methodology to select an adequate DR method to deal with high-dimensional spatial attributes with more than 10⁵ dimensions, build an FOFE, and avoid overfitting when massive amounts of data are used.
Abstract: One of the challenges related to reservoir engineering studies is working with essential high-dimensional inputs, such as porosity and permeability, which govern fluid flow in porous media. Dimensionality reduction (DR) methods have enabled handling spatial variability in constructing a fast objective function estimator (FOFE). This study presents a methodology to select an adequate DR method to deal with high-dimensional spatial attributes with more than 10⁵ dimensions. We investigated 18 methods of DR commonly applied in the literature. The proposed workflow accomplished (1) definition of the adequate number of dimensions; (2) evaluation of the time spent for each data set generated, using the elapsed computational time; (3) training using the automated machine learning (AutoML) technique; (4) validation using the root mean square logarithmic error (RMSLE) and a confidence interval (CI) of 95%; (5) a score equation using elapsed computational time and RMSLE; and (6) a consistency check to evaluate whether the FOFE is reliable enough to mimic the simulator output. As an application, we used the FOFE to generate risk curves at the final forecast period (10,957 days). We obtained methods that reduced the high-dimensional spatial attributes with a computational time lower than 10 minutes, enabling us to consider them in the FOFE building. Those selected approaches allowed us to deal with high-dimensional spatial variability. Moreover, the selected DR methods can be used on highly complex problems to build an FOFE and avoid overfitting when massive amounts of data are used.

1 citation
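The selection workflow above combines elapsed computational time with RMSLE in a single score. The exact score equation is not given in the abstract, so the sketch below only illustrates the ingredients: it times one candidate DR method (PCA, standing in for any of the 18 methods evaluated), computes RMSLE on a surrogate's predictions, and forms a simple weighted score; the weighting and all data are assumptions.

```python
# Hedged sketch of the scoring ingredients: time a DR method, compute RMSLE,
# and combine both into one score. The score formula and weights are assumptions;
# the paper's actual equation is not reproduced in the abstract above.
import time
import numpy as np
from sklearn.decomposition import PCA
from sklearn.metrics import mean_squared_log_error

rng = np.random.default_rng(0)
X = rng.random((500, 2000))             # stand-in for high-dimensional spatial attributes

# 1) Time the dimensionality reduction step.
start = time.perf_counter()
Z = PCA(n_components=20, random_state=0).fit_transform(X)
elapsed = time.perf_counter() - start

# 2) RMSLE between simulator output and surrogate (FOFE) prediction, both non-negative.
y_true = rng.random(500) * 1e6
y_pred = y_true * (1 + 0.05 * rng.standard_normal(500)).clip(min=0.0)
rmsle = np.sqrt(mean_squared_log_error(y_true, y_pred))

# 3) Toy combined score (lower is better); the relative weights are arbitrary.
score = 0.5 * rmsle + 0.5 * (elapsed / 600.0)   # 600 s = the 10-minute budget mentioned above
print(f"elapsed: {elapsed:.2f} s, RMSLE: {rmsle:.4f}, score: {score:.4f}")
```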

Proceedings ArticleDOI
31 Oct 2022
TL;DR: In this paper, the authors presented an effective workflow to characterize the petrophysical properties of unconventional shales at different maturation stages. However, the work was limited to the use of X-ray diffraction (XRD) analysis, Rock-Eval pyrolysis, helium porosity, NMR, and dielectric experiments.
Abstract: Multiple challenges are associated with the characterization and development of unconventional shale reservoirs. The petrophysical properties play significant roles in hydrocarbon production from unconventional reservoirs. Several techniques can be used to determine the petrophysical properties, such as routine core analysis, nuclear magnetic resonance (NMR), and dielectric techniques. This study presents an effective workflow to characterize the petrophysical properties of unconventional shales at different maturation stages. The measurements conducted in this study are X-ray diffraction (XRD) analysis, Rock-Eval pyrolysis, helium porosity, NMR, and dielectric experiments. The rock samples were prepared for the measurements by drying them under vacuum. In addition, the samples were artificially matured using a muffle furnace at different temperatures and heating times. The impact of shale maturation on the petrophysical properties was captured by evaluating the rock properties after each maturation stage. Results show that the shale samples have a TOC of 17.5 wt.% on average and a hydrogen index (HI) of 809, indicating that the samples belong to kerogen type I. The mineralogical analysis indicates that the shale samples used have a calcite percentage of around 59.9%. Moreover, the artificial maturation reduced the total organic content, due to the conversion of organic matter into hydrocarbon fluids. NMR and dielectric measurements showed that the shale porosity system was altered by artificial maturation. The real dielectric constant was reduced, indicating a reduction in the kerogen percentage. The cumulative probability density increased after maturation, revealing that the shale porosity increased, which could be attributed to the dissolution of kerogen during the maturation process. Ultimately, this study improves our understanding of characterizing unconventional shale formations. A reliable workflow is also proposed for better characterization of unconventional formations by integrating routine core analysis, Rock-Eval, NMR, and dielectric techniques. Such a workflow can pave the way for a downhole technique to characterize unconventional resources more effectively.
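The hydrogen index quoted in the abstract is a standard Rock-Eval quantity, HI = 100 × S2 / TOC (mg HC per g TOC). The abstract does not report S2, so the sketch below simply back-calculates the S2 value that would be consistent with the reported TOC of 17.5 wt.% and HI of 809, and applies commonly quoted approximate HI thresholds for kerogen typing; those thresholds are not taken from the paper.

```python
# Hedged sketch: the standard Rock-Eval hydrogen index relation,
#   HI = 100 * S2 / TOC   [mg HC / g TOC],
# back-calculating S2 from the TOC (17.5 wt.%) and HI (809) quoted in the abstract.
# The kerogen-type thresholds below are commonly quoted approximations, not values
# taken from the paper.

def hydrogen_index(s2_mg_per_g_rock: float, toc_wt_pct: float) -> float:
    """Hydrogen index in mg HC per g TOC."""
    return 100.0 * s2_mg_per_g_rock / toc_wt_pct

def kerogen_type_from_hi(hi: float) -> str:
    """Rough kerogen typing from HI alone (approximate, for illustration only)."""
    if hi > 600:
        return "Type I (oil-prone, lacustrine/algal)"
    if hi > 300:
        return "Type II (oil-prone, marine)"
    if hi > 50:
        return "Type III (gas-prone, terrestrial)"
    return "Type IV (inert)"

toc, hi_reported = 17.5, 809.0
s2_implied = hi_reported * toc / 100.0        # ~141.6 mg HC / g rock
print(f"implied S2: {s2_implied:.1f} mg HC/g rock")
print(f"HI check:   {hydrogen_index(s2_implied, toc):.0f} -> {kerogen_type_from_hi(hi_reported)}")
```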
References
Journal ArticleDOI
TL;DR: This survey provides an overview of higher-order tensor decompositions, their applications, and available software.
Abstract: This survey provides an overview of higher-order tensor decompositions, their applications, and available software. A tensor is a multidimensional or $N$-way array. Decompositions of higher-order tensors (i.e., $N$-way arrays with $N \geq 3$) have applications in psychometrics, chemometrics, signal processing, numerical linear algebra, computer vision, numerical analysis, data mining, neuroscience, graph analysis, and elsewhere. Two particular tensor decompositions can be considered to be higher-order extensions of the matrix singular value decomposition: CANDECOMP/PARAFAC (CP) decomposes a tensor as a sum of rank-one tensors, and the Tucker decomposition is a higher-order form of principal component analysis. There are many other tensor decompositions, including INDSCAL, PARAFAC2, CANDELINC, DEDICOM, and PARATUCK2, as well as nonnegative variants of all of the above. The N-way Toolbox, Tensor Toolbox, and Multilinear Engine are examples of software packages for working with tensors.

9,227 citations
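The survey above treats CP and Tucker as the two higher-order analogues of the matrix SVD. As a companion to the CP sketch earlier on this page, the snippet below fits a Tucker model with the open-source tensorly library (one package of the kind the survey lists, not one it names); the tensor size and multilinear ranks are arbitrary assumptions.

```python
# Hedged sketch: Tucker decomposition (higher-order PCA) with tensorly.
# Tensor size and multilinear ranks are arbitrary; tensorly stands in for the
# kinds of packages the survey mentions (N-way Toolbox, Tensor Toolbox, ...).
import numpy as np
import tensorly as tl
from tensorly.decomposition import tucker

rng = np.random.default_rng(0)
X = tl.tensor(rng.random((30, 40, 50)))

# Core of size (5, 5, 5) plus one factor matrix per mode.
core, factors = tucker(X, rank=[5, 5, 5])
X_hat = tl.tucker_to_tensor((core, factors))

print("core shape:    ", core.shape)
print("factor shapes: ", [f.shape for f in factors])
print("relative error:", float(tl.norm(X - X_hat) / tl.norm(X)))
```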

Journal ArticleDOI
TL;DR: In this paper, an individual differences model for multidimensional scaling is outlined in which individuals are assumed differentially to weight the several dimensions of a common "psychological space" and a corresponding method of analyzing similarities data is proposed, involving a generalization of Eckart-Young analysis to decomposition of three-way (or higher-way) tables.
Abstract: An individual differences model for multidimensional scaling is outlined in which individuals are assumed differentially to weight the several dimensions of a common “psychological space”. A corresponding method of analyzing similarities data is proposed, involving a generalization of “Eckart-Young analysis” to decomposition of three-way (or higher-way) tables. In the present case this decomposition is applied to a derived three-way table of scalar products between stimuli for individuals. This analysis yields a stimulus by dimensions coordinate matrix and a subjects by dimensions matrix of weights. This method is illustrated with data on auditory stimuli and on perception of nations.

4,520 citations
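The procedure described above first converts each individual's (dis)similarity data into a matrix of scalar products by double centering and then decomposes the resulting three-way array. The sketch below reproduces only those two mechanical steps on synthetic data, using a plain CP fit from tensorly in place of the original algorithm; the data, sizes, and rank are assumptions, and a dedicated INDSCAL routine would additionally constrain the two stimulus factor matrices to be equal.

```python
# Hedged sketch of the two mechanical steps behind an INDSCAL-style analysis:
# (1) double-center each subject's squared-distance matrix into scalar products,
# (2) decompose the resulting subjects x stimuli x stimuli array.
# Synthetic data, sizes, and rank are assumptions; a real INDSCAL fit would also
# constrain the two stimulus factor matrices to be equal.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
n_subjects, n_stimuli, n_dims = 8, 12, 2

# Common stimulus configuration, weighted differently by each subject.
stimuli = rng.standard_normal((n_stimuli, n_dims))
weights = rng.random((n_subjects, n_dims)) + 0.5

def double_center(d_squared):
    """Convert a squared-distance matrix to scalar products: B = -0.5 * J D2 J."""
    n = d_squared.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * j @ d_squared @ j

slices = []
for s in range(n_subjects):
    conf = stimuli * np.sqrt(weights[s])                       # subject-weighted space
    d2 = ((conf[:, None, :] - conf[None, :, :]) ** 2).sum(-1)  # squared distances
    slices.append(double_center(d2))

scalar_products = tl.tensor(np.stack(slices))                  # subjects x stimuli x stimuli

# Rank-2 CP fit: one factor matrix of subject weights, two of stimulus coordinates.
cp = parafac(scalar_products, rank=n_dims, init="random", random_state=0, n_iter_max=500)
subject_factor, stim_factor_a, stim_factor_b = cp.factors
print("subject-weight factor shape:", subject_factor.shape)    # (8, 2)
print("stimulus factor shape:      ", stim_factor_a.shape)     # (12, 2)
```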

Journal ArticleDOI
TL;DR: Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints which match data properties and extract more general latent components in the data than matrix-based methods.
Abstract: The widespread use of multisensor technology and the emergence of big data sets have highlighted the limitations of standard flat-view matrix models and the necessity to move toward more versatile data analysis tools. We show that higher-order tensors (i.e., multiway arrays) enable such a fundamental paradigm shift toward models that are essentially polynomial, the uniqueness of which, unlike the matrix methods, is guaranteed under very mild and natural conditions. Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints which match data properties and extract more general latent components in the data than matrix-based methods.

1,250 citations
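The "flexibility in the choice of constraints" highlighted above is easy to see in software: the sketch below, using the open-source tensorly library (not something referenced by the paper itself), fits the same synthetic tensor with an unconstrained CP model and with a nonnegativity-constrained one; the data and rank are assumptions.

```python
# Hedged sketch: the same CP model with and without a nonnegativity constraint,
# illustrating the "choice of constraints" point. Data and rank are arbitrary.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac, non_negative_parafac

rng = np.random.default_rng(0)
# Nonnegative ground truth (e.g., spectra-like data), plus small noise.
factors_true = [rng.random((dim, 4)) for dim in (20, 25, 30)]
X = tl.cp_to_tensor((np.ones(4), factors_true)) + 0.01 * rng.random((20, 25, 30))
X = tl.tensor(X)

cp_plain = parafac(X, rank=4, init="random", random_state=0)
cp_nonneg = non_negative_parafac(X, rank=4, init="random", random_state=0)

err = lambda cp: float(tl.norm(X - tl.cp_to_tensor(cp)) / tl.norm(X))
min_factor = lambda cp: min(float(f.min()) for f in cp.factors)

print(f"unconstrained CP: error {err(cp_plain):.3e}, smallest factor entry {min_factor(cp_plain):+.3f}")
print(f"nonnegative CP:   error {err(cp_nonneg):.3e}, smallest factor entry {min_factor(cp_nonneg):+.3f}")
```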

Journal ArticleDOI
TL;DR: In this article, a comprehensive introduction to tensor decompositions is provided from a signal processing perspective, starting from the algebraic foundations, via basic Canonical Polyadic and Tucker decomposition, through to advanced cause-effect and multi-view data analysis schemes.
Abstract: The widespread use of multi-sensor technology and the emergence of big datasets has highlighted the limitations of standard flat-view matrix models and the necessity to move towards more versatile data analysis tools. We show that higher-order tensors (i.e., multiway arrays) enable such a fundamental paradigm shift towards models that are essentially polynomial and whose uniqueness, unlike the matrix methods, is guaranteed under very mild and natural conditions. Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints that match data properties, and to find more general latent components in the data than matrix-based methods. A comprehensive introduction to tensor decompositions is provided from a signal processing perspective, starting from the algebraic foundations, via basic Canonical Polyadic and Tucker models, through to advanced cause-effect and multi-view data analysis schemes. We show that tensor decompositions enable natural generalizations of some commonly used signal processing paradigms, such as canonical correlation and subspace techniques, signal separation, linear regression, feature extraction and classification. We also cover computational aspects, and point out how ideas from compressed sensing and scientific computing may be used for addressing the otherwise unmanageable storage and manipulation problems associated with big datasets. The concepts are supported by illustrative real-world case studies illuminating the benefits of the tensor framework, as efficient and promising tools for modern signal processing, data analysis and machine learning applications; these benefits also extend to vector/matrix data through tensorization. Keywords: ICA, NMF, CPD, Tucker decomposition, HOSVD, tensor networks, Tensor Train.

369 citations
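The abstract above notes that the benefits of the tensor framework "also extend to vector/matrix data through tensorization." The sketch below illustrates that idea on the textbook case of a sampled exponential, which folds into an exactly rank-1 tensor; the signal, sizes, and library (tensorly) are assumptions, not material from the paper.

```python
# Hedged sketch of "tensorization": folding a long sampled signal into a 3-way
# array so that low-rank tensor structure can be exploited. The exponential
# signal is a textbook example (it folds into an exactly rank-1 tensor); the
# sizes are arbitrary assumptions.
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

n = 16 ** 3
signal = 0.999 ** np.arange(n)           # x[k] = a^k, a sampled exponential decay

# Fold the length-4096 vector into a 16 x 16 x 16 tensor (Fortran order so that
# flat index = i + 16*j + 256*k). Because a^(i + 16j + 256k) = a^i * a^(16j) * a^(256k),
# the folded tensor has rank 1.
X = tl.tensor(signal.reshape(16, 16, 16, order="F"))

cp = parafac(X, rank=1, init="random", random_state=0)
rel_err = float(tl.norm(X - tl.cp_to_tensor(cp)) / tl.norm(X))
stored = sum(f.size for f in cp.factors) + cp.weights.size

print(f"relative error of rank-1 fit: {rel_err:.2e}")
print(f"stored values: {stored} instead of {n}")
```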

Journal ArticleDOI
TL;DR: It is found that the ALS-estimated models are generally of better quality than any of the alternatives, even when overfactoring the model, but ALS is also found to be significantly slower.

253 citations
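The comparison above concerns ALS-based fitting of PARAFAC models. As a minimal illustration rather than a reproduction of that study's comparison, the sketch below uses tensorly's ALS-based parafac with return_errors=True to watch the per-iteration reconstruction error and the wall-clock time; data, rank, and iteration budget are assumptions.

```python
# Hedged sketch: monitoring an ALS-based CP (PARAFAC) fit. This only illustrates
# how ALS iterations trade time for fit quality; it does not reproduce the
# paper's comparison. Data, rank, and iteration budget are assumptions.
import time
import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac

rng = np.random.default_rng(0)
factors_true = [rng.standard_normal((dim, 3)) for dim in (40, 40, 40)]
X = tl.tensor(tl.cp_to_tensor((np.ones(3), factors_true))
              + 0.05 * rng.standard_normal((40, 40, 40)))

start = time.perf_counter()
cp, errors = parafac(X, rank=3, init="random", random_state=0,
                     n_iter_max=200, tol=1e-8, return_errors=True)
elapsed = time.perf_counter() - start

print(f"ALS iterations: {len(errors)}, wall-clock time: {elapsed:.2f} s")
print(f"first/last relative error: {errors[0]:.3e} -> {errors[-1]:.3e}")
```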