Showing items 1 - 10 of 41
  • Publication
    Open access
    Exploring substitution random functions composed of stationary multi-Gaussian processes
    Simulation of random fields is widely used in Earth sciences for modeling and uncertainty quantification. The spatial features of these fields may have a strong impact on the forecasts made using these fields. For instance, in flow and transport problems the connectivity of the permeability fields is a crucial aspect. Multi-Gaussian random fields are the most common tools to analyze and model continuous fields. Their spatial correlation structure is described by a covariance or variogram model. However, these types of spatial models are unable to represent highly or poorly connected structures even if a broad range of covariance models can be employed. With this type of model, the regions with values close to the mean are always well connected whereas the regions of low or high values are isolated. Substitution random functions (SRFs) belong to another broad class of random functions that are more flexible. SRFs are constructed by composing (Z = Y∘T) two stochastic processes: the directing function T (latent field) and the coding process Y (modifying the latent field in a stochastic manner). In this paper, we study the properties of SRFs obtained by combining stationary multi-Gaussian random fields for both T and Y with bounded variograms. The resulting SRFs Z are stationary, but as T has a finite variance Z is not ergodic for the mean and the covariance. This means that single realizations behave differently from each other. We propose a simple technique to control which values (low, intermediate, or high) are connected. It consists of adding a control point on the process Y to guide every single realization. The conditioning to local values is obtained using a Gibbs sampler.
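The composition at the heart of an SRF can be sketched in a few lines of NumPy. This is an illustrative toy, not the authors' code: it simulates a 1D directing function T and a coding process Y as stationary multi-Gaussian processes with an exponential (bounded) covariance, then evaluates Z = Y∘T by interpolating Y at the values of T.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_field(x, length_scale, rng):
    """Sample a stationary multi-Gaussian process with an exponential
    covariance (bounded variogram) at the points x, via Cholesky."""
    d = np.abs(x[:, None] - x[None, :])
    cov = np.exp(-d / length_scale) + 1e-8 * np.eye(len(x))
    return np.linalg.cholesky(cov) @ rng.standard_normal(len(x))

# Directing function T: latent field on the simulation grid
x = np.linspace(0.0, 10.0, 200)
T = gaussian_field(x, length_scale=2.0, rng=rng)

# Coding process Y: defined on the range of T, evaluated by interpolation
t_grid = np.linspace(T.min(), T.max(), 300)
Y = gaussian_field(t_grid, length_scale=0.5, rng=rng)

# Substitution random function Z = Y ∘ T
Z = np.interp(T, t_grid, Y)
```

Because Y is only sampled on a discrete grid, the interpolation approximates the continuous composition; a finer `t_grid` tightens it.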
  • Publication
    Open access
    Comparison of three recent discrete stochastic inversion methods and influence of the prior choice
    Groundwater flow depends on subsurface heterogeneity, which often calls for categorical fields to represent different geological facies. Knowledge about the subsurface is, however, limited and often provided indirectly by state variables, such as hydraulic heads or contaminant concentrations. In such cases, solving a categorical inverse problem is an important step in subsurface modeling. In this work, we present and compare three recent inverse frameworks: Posterior Population Expansion (PoPEx), Ensemble Smoother with Multiple Data Assimilation (ESMDA), and DREAM-ZS (a Markov chain Monte Carlo sampler). PoPEx and ESMDA are used with multiple-point statistics (MPS) as geostatistical engines, and DREAM-ZS is used with a Wasserstein generative adversarial network (WGAN). The three inversion methods are tested on a synthetic example of a pumping test in a fluvial channelized aquifer. Moreover, the inverse problem is solved three times with each method, each time using a different training image, to check the performance of the methods with different geological priors. To assess the quality of the results, we propose a framework based on the continuous ranked probability score (CRPS), which compares single true values with predictive distributions. All methods performed well when using the training image used to create the reference, but their performances were degraded with the alternative training images. PoPEx produced the least geological artifacts but presented a rather slow convergence. ESMDA initially showed a very fast convergence which then reached a plateau, contrary to the remaining methods. DREAM-ZS was overly confident in placing some incorrect geological features but outperformed the other methods in terms of convergence.
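The CRPS-based quality assessment can be illustrated with the standard ensemble identity CRPS = E|X − y| − ½E|X − X′|, which compares a predictive distribution (here a small ensemble) with a single true value. A minimal sketch, independent of the three inversion codes:

```python
import numpy as np

def crps_ensemble(ensemble, truth):
    """Continuous ranked probability score of an ensemble forecast against
    a single true value (lower is better), using the identity
    CRPS = E|X - y| - 0.5 * E|X - X'|."""
    ens = np.asarray(ensemble, dtype=float)
    term1 = np.mean(np.abs(ens - truth))
    term2 = 0.5 * np.mean(np.abs(ens[:, None] - ens[None, :]))
    return term1 - term2

# A sharp ensemble centred on the truth scores better than a biased one
sharp  = crps_ensemble([0.9, 1.0, 1.1], truth=1.0)
biased = crps_ensemble([2.9, 3.0, 3.1], truth=1.0)
```

In the paper's setting the score would be computed at each prediction location and aggregated; the function above is the per-value building block.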
  • Publication
    Open access
    Automated Hierarchical 3D Modeling of Quaternary Aquifers: The ArchPy Approach
    When modeling groundwater systems in Quaternary formations, one of the first steps is to construct a geological and petrophysical model. This is often cumbersome because it requires multiple manual steps which include geophysical interpretation, construction of a structural model, and identification of geostatistical model parameters, facies, and property simulations. Those steps are often carried out using different software, which makes the automation intractable or very difficult. A non-automated approach is time-consuming and makes the model updating difficult when new data are available or when some geological interpretations are modified. Furthermore, conducting a cross-validation procedure to assess the overall quality of the models and quantifying the joint structural and parametric uncertainty are tedious. To address these issues, we propose a new approach and a Python module, ArchPy, to automatically generate realistic geological and parameter models. One of its main features is that the modeling operates in a hierarchical manner. The input data consist of a set of borehole data and a stratigraphic pile. The stratigraphic pile describes how the model should be constructed formally and in a compact manner. It contains the list of the different stratigraphic units and their order in the pile, their conformability (eroded or onlap), the surface interpolation method (e.g., kriging, sequential Gaussian simulation (SGS), and multiple-point statistics (MPS)), the filling method for the lithologies (e.g., MPS and sequential indicator simulation (SIS)), and the petrophysical properties (e.g., MPS and SGS). Then, the procedure is automatic. In a first step, the stratigraphic unit boundaries are simulated. Second, they are filled with lithologies, and finally, the petrophysical properties are simulated inside the lithologies. All these steps are straightforward and automated once the stratigraphic pile and its related parameters have been defined. 
Hence, this approach is extremely flexible. The automation provides a framework to generate end-to-end stochastic models; the proposed method therefore allows for uncertainty quantification at any level and may be used for full inversion. In this work, ArchPy is illustrated using data from an alpine Quaternary aquifer in the upper Aare plain (southeast of Bern, Switzerland).
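The hierarchical order described above (unit surfaces first, then lithologies, then petrophysical properties) can be mimicked with a deliberately simplified loop. All names, values, and the `pile` structure below are hypothetical stand-ins, not the ArchPy API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a stratigraphic pile: ordered units, each with
# a surface-interpolation method and the lithologies that can fill it.
pile = [
    {"unit": "U1", "surface": "kriging", "litho": ["gravel", "sand"]},
    {"unit": "U2", "surface": "SGS",     "litho": ["silt", "clay"]},
]

nx = 50
model = {}
top = np.full(nx, 10.0)                      # elevation of the domain top
for layer in pile:                           # step 1: simulate unit surfaces
    base = top - rng.uniform(1.0, 3.0, nx)   # toy surface "simulation"
    litho = rng.choice(layer["litho"], nx)   # step 2: fill with lithologies
    # step 3: petrophysical property per lithology (toy log-normal K)
    K = np.where(litho == layer["litho"][0], 1e-3, 1e-6) * rng.lognormal(0, 0.5, nx)
    model[layer["unit"]] = {"top": top, "base": base, "litho": litho, "K": K}
    top = base                               # next unit stacks below
```

The point is only the ordering: each step conditions the next, so once the pile is defined, the whole chain runs without manual intervention.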
  • Publication
    Open access
    A parsimonious parametrization of the Direct Sampling algorithm for multiple-point statistical simulations
    Multiple-point statistics algorithms allow modeling spatial variability from training images. Among these techniques, the Direct Sampling (DS) algorithm has advanced capabilities, such as multivariate simulations, treatment of non-stationarity, multi-resolution capabilities, and conditioning by inequality or connectivity data. However, finding the right trade-off between computing time and simulation quality requires tuning three main parameters, which can be complicated since simulation time and quality are affected by these parameters in a complex manner. To facilitate the parameter selection, we propose the Direct Sampling Best Candidate (DSBC) parametrization approach. It consists of setting the distance threshold to 0. The other two parameters (the number of neighbors and the scan fraction) are kept, as well as all the advantages of DS. We present three test cases showing that the DSBC approach allows parameters to be identified efficiently, leading to quality and computational time comparable to or better than the standard DS parametrization. We conclude that the DSBC approach could be used as a default mode when using DS, and that the standard parametrization should only be used when the DSBC approach is not sufficient.
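Setting the distance threshold to 0 turns the acceptance test of Direct Sampling into a best-candidate search over the scanned fraction of the training image. A simplified 1D sketch of that single step (illustrative only, not the DeeSse implementation):

```python
import numpy as np

def best_candidate_scan(ti, pattern, scan_fraction, rng):
    """Simplified 1D direct-sampling step with distance threshold 0:
    scan a random fraction of the training image and keep the candidate
    whose neighbourhood pattern is closest to the conditioning pattern."""
    n, m = len(ti), len(pattern)
    n_starts = max(1, int(scan_fraction * (n - m + 1)))
    starts = rng.permutation(n - m + 1)[:n_starts]
    best_start, best_dist = None, np.inf
    for s in starts:
        dist = np.mean(np.abs(ti[s : s + m] - pattern))
        if dist < best_dist:
            best_start, best_dist = s, dist
        if best_dist == 0.0:          # exact match: cannot do better
            break
    return best_start, best_dist

rng = np.random.default_rng(42)
ti = rng.integers(0, 2, size=500).astype(float)   # binary training image
pattern = ti[100:105].copy()                      # pattern known to exist
start, dist = best_candidate_scan(ti, pattern, scan_fraction=1.0, rng=rng)
```

With a positive threshold the scan would stop at the first candidate below it; with threshold 0 the early exit only triggers on an exact match, which is what makes the remaining two parameters the only ones to tune.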
  • Publication
    Open access
    Efficiency of template matching methods for Multiple-Point Statistics simulations
    (2021-8)
    Sharifzadeh Lari, Mansoureh
    Almost all Multiple-Point Statistics (MPS) methods internally use a template matching method to select patterns that best match conditioning data. The purpose of this paper is to analyze the performances of ten of the most frequently used template matching techniques in the framework of MPS algorithms. Performance is measured in terms of computing efficiency, accuracy, and memory usage. The methods were tested with both categorical and continuous training images (TI). The analysis considers the ability of those methods to locate rapidly, and with minimum error, a data event with a specific proportion of known pixels and a certain amount of noise. Experiments indicate that the Coarse to Fine using Entropy (CFE) method is the fastest in all configurations. Skipping methods are efficient as well. In terms of accuracy, without noise all methods except CFE and cross correlation (CC) perform well. CC is the least accurate in all configurations if the TI is not normalized; it performs better when normalized training images are used. The Binary Sum of Absolute Difference is the most robust against noise. Finally, in terms of memory usage, CFE is the worst among the ten methods that were tested; the other methods are not significantly different.
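As a baseline for the techniques compared in the paper, a plain Sum of Absolute Differences (SAD) template matcher can be written in a few lines; methods such as CFE and skipping accelerate exactly this kind of exhaustive scan. A toy 2D example, not the paper's benchmark code:

```python
import numpy as np

def sad_match(image, template):
    """Exhaustive template matching by Sum of Absolute Differences:
    returns the top-left position of the best-matching window and its score."""
    ih, iw = image.shape
    th, tw = template.shape
    best, best_pos = np.inf, (0, 0)
    for i in range(ih - th + 1):
        for j in range(iw - tw + 1):
            score = np.abs(image[i : i + th, j : j + tw] - template).sum()
            if score < best:
                best, best_pos = score, (i, j)
    return best_pos, best

rng = np.random.default_rng(1)
img = rng.random((30, 30))
tpl = img[12:17, 8:13].copy()   # template cut from the image itself
pos, score = sad_match(img, tpl)
```

The exhaustive double loop is what makes accuracy trivial and speed the real differentiator, which is why the paper's comparison focuses on acceleration strategies.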
  • Publication
    Open access
    Ice volume and basal topography estimation using geostatistical methods and GPR measurements: Application on the Tsanfleuron and Scex Rouge glacier, Swiss Alps
    Ground Penetrating Radar (GPR) is nowadays widely used for determining glacier thickness. However, this method provides thickness data only along the acquisition lines, and therefore interpolation has to be made between them. Depending on the interpolation strategy, calculated ice volumes can differ and can lack an accurate error estimation. Furthermore, glacial basal topography is often characterized by complex geomorphological features, which can be hard to reproduce using classical interpolation methods, especially when the conditioning data are sparse or when the morphological features are too complex. This study investigates the applicability of multiple-point statistics (MPS) simulations to interpolate glacier bedrock topography using GPR measurements. In 2018, a dense GPR data set was acquired on the Tsanfleuron Glacier (Switzerland). The results obtained with the direct sampling MPS method are compared against those obtained with kriging and sequential Gaussian simulations (SGS) on both a synthetic data set – with known reference volume and bedrock topography – and the real data underlying the Tsanfleuron glacier. Using the MPS modelled bedrock, the ice volume for the Scex Rouge and Tsanfleuron Glacier is estimated to be 113.9 ± 1.6 million m³. The direct sampling approach, unlike the SGS and the kriging, allowed not only an accurate volume estimation but also the generation of a set of realistic bedrock simulations. The complex karstic geomorphological features are reproduced and can be used, for example, to significantly improve the precision of subglacial flow estimation.
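For contrast with the MPS results, the classical baseline can be sketched as ordinary kriging in 1D, treating a handful of profile positions as a stand-in for GPR acquisition lines. The length scale and all data values below are invented for illustration:

```python
import numpy as np

def ordinary_kriging(x_obs, z_obs, x_new, length_scale=200.0):
    """Ordinary kriging with an exponential covariance: interpolates ice
    thickness between observations (toy 1D stand-in for GPR profiles)."""
    cov = lambda a, b: np.exp(-np.abs(a[:, None] - b[None, :]) / length_scale)
    n = len(x_obs)
    # Kriging system with the unbiasedness constraint (Lagrange multiplier)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = cov(x_obs, x_obs)
    A[n, n] = 0.0
    b = np.ones((n + 1, len(x_new)))
    b[:n] = cov(x_obs, x_new)
    w = np.linalg.solve(A, b)
    return w[:n].T @ z_obs

x_obs = np.array([0.0, 100.0, 300.0, 600.0])   # profile positions (m)
z_obs = np.array([20.0, 45.0, 60.0, 10.0])     # ice thickness (m)
z_hat = ordinary_kriging(x_obs, z_obs, np.array([0.0, 200.0, 600.0]))
```

Kriging honors the data exactly but smooths between lines, which is precisely the behavior the MPS approach is meant to improve on for complex karstic bedrock.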
  • Publication
    Open access
    Conditioning Multiple-Point Statistics Simulation to Inequality Data
    Stochastic modeling is often employed in environmental sciences for the analysis and understanding of complex systems. For example, random fields are key components in uncertainty analysis or Bayesian inverse modeling. Multiple-point statistics (MPS) provides efficient tools for simulating fields that reproduce the spatial statistics depicted in a training image (TI), while accounting for local or block conditioning data. Among MPS methods, the direct sampling algorithm is a flexible pixel-based technique that consists in first assigning the conditioning data values (so-called hard data) in the simulation grid, and then in populating the rest of the simulation domain in a random order by successively pasting a value from a TI cell sharing a similar pattern. In this study, an extension of the direct sampling method is proposed to account for inequality data, that is, constraints in given cells consisting of lower and/or upper bounds for the simulated values. Indeed, inequality data are often available in practice. The new approach involves adapting the distance used to compare and evaluate the match between two patterns to account for such constraints. The proposed method, implemented in the DeeSse code, allows generating random fields both reflecting the spatial statistics of the TI and honoring the inequality constraints. Finally, examples of topography simulations illustrate the capabilities of the proposed method.
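The key modification is the pattern distance: a node carrying an inequality constraint contributes zero when the candidate value honors the bounds, and a penalty otherwise. A minimal sketch of such a mixed distance (hypothetical and simpler than the DeeSse implementation):

```python
import numpy as np

def pattern_distance(candidate, hard, bounds):
    """Distance between a candidate TI pattern and a conditioning pattern
    mixing hard data with inequality constraints (lower/upper bounds).
    hard[k] is a value or np.nan; bounds[k] is (lo, hi), with ±inf allowed."""
    d, n = 0.0, 0
    for v, h, (lo, hi) in zip(candidate, hard, bounds):
        if not np.isnan(h):              # hard datum: usual absolute difference
            d += abs(v - h)
            n += 1
        elif np.isfinite(lo) or np.isfinite(hi):
            # inequality datum: zero if the bound is honoured, otherwise
            # the distance to the nearest admissible value
            d += max(lo - v, 0.0) + max(v - hi, 0.0)
            n += 1
    return d / n if n else 0.0

inf = np.inf
hard   = [2.0, np.nan, np.nan]
bounds = [(-inf, inf), (1.0, inf), (-inf, 3.0)]
ok  = pattern_distance([2.0, 1.5, 2.5], hard, bounds)  # honours everything
bad = pattern_distance([2.0, 0.2, 4.0], hard, bounds)  # violates both bounds
```

With such a distance, the usual pattern search automatically prefers TI cells whose neighbourhoods satisfy the bounds, so conditioning to inequalities needs no separate mechanism.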
  • Publication
    Open access
    3D multiple-point statistics simulations of the Roussillon Continental Pliocene aquifer using DeeSse
    (2020-10)
    Issautier, Benoît
    ;
    Cabellero, Yvan
    This study introduces a novel workflow to model the heterogeneity of complex aquifers using the multiple-point statistics algorithm DeeSse. We illustrate the approach by modeling the Continental Pliocene layer of the Roussillon aquifer in the region of Perpignan (southern France). When few direct observations are available, statistical inference from field data is difficult if not impossible, and traditional geostatistical approaches cannot be applied directly. By contrast, multiple-point statistics simulations can rely on one or several alternative conceptual geological models provided as training images (TIs). But since the spatial arrangement of geological structures is often non-stationary and complex, there is a need for methods that can describe and account for the non-stationarity in a simple but efficient manner. The main aim of this paper is therefore to propose a workflow, based on the direct sampling algorithm DeeSse, for these situations. The conceptual model is provided by the geologist as a 2D non-stationary training image in map view, displaying the possible organization of the geological structures and their spatial evolution. To control the non-stationarity, a 3D trend map is obtained by solving numerically the diffusivity equation as a proxy to describe the spatial evolution of the sedimentary patterns, from the sources of the sediments to the outlet of the system. A 3D continuous rotation map is estimated from inferred paleo-orientations of the fluvial system. Both trend and orientation maps are derived from geological insights gathered from outcrops and general knowledge of processes occurring in these types of sedimentary environments. Finally, the 3D model is obtained by stacking 2D simulations following the paleotopography of the aquifer. The vertical facies transition between successive 2D simulations is controlled partly by the borehole data used for conditioning and by a sampling strategy.
This strategy accounts for vertical transition probabilities, which are derived from the borehole observations, and works by simulating a set of conditional data points from one layer to the next. This process allows us to bypass the creation of a 3D training image, which may be cumbersome, while honoring the observed vertical continuity.
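The layer-to-layer conditioning step can be sketched as follows: given the facies of layer k and a vertical transition-probability matrix estimated from boreholes, sparse conditioning points for layer k+1 are drawn accordingly. All names and values below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

facies = ["channel", "floodplain"]
# Hypothetical vertical transition probabilities: P[i][j] is the probability
# of facies j directly above facies i, as could be estimated from boreholes
P = np.array([[0.7, 0.3],
              [0.2, 0.8]])

layer_k = rng.integers(0, 2, size=(20, 20))   # previous 2D simulation (facies indices)

# Draw sparse conditioning points for layer k+1 from the transition probabilities
n_points = 30
idx = rng.choice(layer_k.size, size=n_points, replace=False)
rows, cols = np.unravel_index(idx, layer_k.shape)
cond = [(r, c, rng.choice(2, p=P[layer_k[r, c]])) for r, c in zip(rows, cols)]
```

The sampled triplets (row, column, facies) would then serve as hard conditioning data for the 2D simulation of the next layer, propagating vertical continuity without any 3D training image.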