# 1
Jatnieks, Janis • Sips, Mike • De Lucia, Marco • Dransch, Doris
Abstract: Geochemical models are used to seek answers about the composition and evolution of groundwater, spill remediation, the viability of geothermal resources and other important geoscientific applications. To understand these processes, it is useful to evaluate the geochemical model response to different input parameter combinations. Running the model with varying input parameters creates a large amount of output data, and it is a challenge to screen these data to identify the significant relationships between input parameters and output variables. To address this problem we developed a Visual Analytics approach in an ongoing collaboration between the Geoinformatics and Hydrogeology sections of the GFZ German Research Centre for Geosciences. We implement our approach as an interactive data exploration tool called GCex. GCex supports interactive exploration of geochemical models and encodes many-to-many input/output relationships with a simple yet effective approach called Stacked Parameter Relation (SPR). GCex assists in the setup of simulations, model runs, data collection and result exploration, greatly enhancing the user experience in tasks such as uncertainty and sensitivity analysis, inverse modeling and risk assessment. While in principle model-agnostic, the prototype currently supports and is tied to the popular geochemical code PHREEQC; modifying it to support other models would not be complicated. The GCex prototype was originally written by Janis Jatnieks at GFZ Potsdam. It relies on Rphree (an R interface to the PHREEQC geochemical simulation code) written by Marco De Lucia at GFZ Potsdam. A compatible version of Rphree is bundled with this installation.
# 2
Jatnieks, Janis • De Lucia, Marco • Sips, Mike • Dransch, Doris
Abstract: Surrogate Playground is an automated machine learning approach for rapidly screening a large number of different models to serve as surrogates for a slow-running simulator. This code was written for a reactive transport application in which a fluid flow model (hydrodynamics) is coupled to a geochemistry simulator (reactions in time and space) to simulate scenarios such as underground storage of CO2 or hydrogen storage of excess energy from wind farms. The challenge for such applications is that the geochemistry simulator is typically slow compared to the fluid dynamics and constitutes the main bottleneck for producing highly detailed simulations of such scenarios. This approach attempts to find machine learning models that can replace the slow-running simulator when trained on input-output data from the geochemistry simulator. The code may be of more general interest, as the prototype can be used to screen many different machine learning models for any regression problem. To illustrate this, it also includes a demonstration example using the standard Boston housing data set.
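The screening idea described above can be sketched in a few lines. This is an illustrative sketch using scikit-learn, not the actual Surrogate Playground code; a synthetic regression problem stands in for the simulator input-output data, and the candidate model list is an arbitrary example:

```python
# Sketch of the model-screening idea: train several candidate regressors
# on the same input-output data and rank them by held-out test error.
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for simulator training data
X, y = make_regression(n_samples=300, n_features=8, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

candidates = {
    "linear": LinearRegression(),
    "tree": DecisionTreeRegressor(random_state=0),
    "forest": RandomForestRegressor(n_estimators=50, random_state=0),
}
scores = {}
for name, model in candidates.items():
    model.fit(X_tr, y_tr)                              # train surrogate candidate
    scores[name] = mean_squared_error(y_te, model.predict(X_te))

best = min(scores, key=scores.get)                     # lowest test MSE wins
```

The same loop generalizes to any regressor exposing the scikit-learn `fit`/`predict` interface, which is what makes broad automated screening cheap.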
# 3
Muñoz, Gerard • Ritter, Oliver • Weckmann, Ute • Meqbel, Naser • Becken, Michael
Abstract: The Integrated Geophysical Exploration Technologies for Deep Fractured Geothermal Systems project (I-GET) was aimed at developing an innovative strategy for geophysical exploration, particularly to exploit the full potential of seismic and electromagnetic exploration methods in detecting permeable zones and fluid-bearing fractures. The proposed geothermal exploration approach was applied in selected European geothermal systems with different geological and thermodynamic reservoir characteristics: in Italy (high enthalpy reservoir in metamorphic rocks), in Iceland (high enthalpy reservoir in volcanic rocks) and in Germany and Poland (low to middle enthalpy reservoir in sedimentary rocks). The Groß Schönebeck in-situ geothermal laboratory, located 40 km north of Berlin in northeastern Germany, is a key site for testing the geothermal potential of deep sedimentary basins. The target reservoir is located in Lower Permian sandstones and volcanic strata, which host deep aquifers throughout the Northeast German Basin (NEGB). The laboratory consists of two 4.3-km-deep boreholes. The electrical conductivity of the subsurface is a very important parameter for characterizing geothermal systems, as hot and mineralized (saline) fluids of deep aquifers can be imaged as regions of high electrical conductivity. In the first phase of the I-GET project, carried out in summer 2006, MT data were recorded at 55 stations along a 40-km-long profile. In order to reduce the effect of cultural noise, four remote reference stations located at distances of about 100 km from the profile were used. This profile is spatially coincident with a seismic tomography profile (Bauer et al., 2010). The main objective of the geophysical site characterization experiments was to derive combined electrical conductivity and P- and S-velocity tomographic models for a joint interpretation in high resolution. The data are provided in EMERALD format (Ritter et al., 2015).
The folder structure and content is described in detail in Ritter et al., 2019. The project specific description is available in the associated data description file including information on the experimental setup and data collection, the instrumentation, recording configuration and data processing. Scientific outcomes of this project were published by Muñoz et al., (2010a, 2010b).
# 4
Mikhailova, Natalya • Poleshko, N.N. • Aristova, I.L. • Mukambayev, A.S. • Kulikova, G.O.
Abstract: Version History:
11 Sep 2019: Release of Version 1.1 with the following changes: (1) new licence: CC BY-SA 4.0; modification of the title (removal of file name and version); (2) addition of ORCIDs when available. The metadata of the first version 1.0 is available in the download folder. Data and file names remain unchanged.
The EMCA (Earthquake Model Central Asia) catalogue (Mikhailova et al., 2015) includes information for 33620 earthquakes that occurred in Central Asia (Kazakhstan, Kyrgyzstan, Tajikistan, Uzbekistan and Turkmenistan). For each event the catalogue provides the estimated magnitude on the MLH (surface wave magnitude) scale, widely used in former USSR countries. MLH magnitudes range from 1.5 to 8.3. Although the catalogue spans the period from 2000 BC to 2009 AD, most of the entries (33378) describe earthquakes that occurred after 1900. The catalogue includes the standard parametric information required for seismic hazard studies (i.e., time, location and magnitude values). It has been composed by integrating different sources (using different magnitude scales) and harmonised in terms of the MLH scale. The MLH magnitude is determined from the horizontal component of surface waves (Rautian and Khalturin, 1994) and is reported in most of the seismic bulletins issued by seismological observatories in Central Asia. For the instrumental period, when MLH was not directly measured it was estimated from the body wave magnitude (Mb), the energy class (K) or Mpva (a regional magnitude determined from P waves recorded by short-period instruments) using empirical regression analyses. The following relationships were used to estimate MLH (see Mikhailova, internal EMCA report, 2014):
(1) MLH = 0.47 K - 1.15
(2) MLH = 1.34 Mb - 1.89
(3) MLH = 1.14 Mpva - 1.45
When multiple scales were available for the same earthquake, priority was given to the conversion from K class.
For the historical period, the MLH values were obtained from macroseismic information (Kondorskaya and Ulomov, 1996).
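The conversion relationships and the K-class priority rule quoted above translate directly into code. A minimal sketch follows; the function names are illustrative and not part of the catalogue distribution:

```python
# Magnitude harmonisation rules from Mikhailova (internal EMCA report, 2014)
def mlh_from_k(k):
    """MLH from energy class K: MLH = 0.47 K - 1.15"""
    return 0.47 * k - 1.15

def mlh_from_mb(mb):
    """MLH from body-wave magnitude: MLH = 1.34 Mb - 1.89"""
    return 1.34 * mb - 1.89

def mlh_from_mpva(mpva):
    """MLH from regional P-wave magnitude: MLH = 1.14 Mpva - 1.45"""
    return 1.14 * mpva - 1.45

def harmonised_mlh(k=None, mb=None, mpva=None):
    """Apply the stated priority: K class first, then Mb, then Mpva."""
    if k is not None:
        return mlh_from_k(k)
    if mb is not None:
        return mlh_from_mb(mb)
    if mpva is not None:
        return mlh_from_mpva(mpva)
    return None  # no convertible magnitude available
```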
The catalogue is distributed as an ASCII file in CSV (Comma Separated Values) format with UTF-8 encoding. A separate .csvt file is provided for column type specification (useful for importing the .csv file in QGIS and similar environments). For each event the estimated location is provided as longitude and latitude in the following spatial reference system: +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs. When possible, a precise indication of the event's time in UTC is provided.
Distribution file: "EMCA_SeismoCat_v1.0.csv"
Version: v1.0
Release date: 2015-07-30
Header of CSV file:
id: (int) serial ID of the event
year: (int) Year of the event. Negative years refer to BCE (Before Common Era / Before Christ) events
month: (int, 1-12) Month of the year
day: (int, 1-31) Day of the month
hour: (int, 0-23) Hour of the day
min: (int, 0-59) Minute of the hour
sec: (int, 0-59) Second (and hundredth of a second, if available) of the minute
lat: (float) Latitude of the event
lon: (float) Longitude of the event
fdepth: (int) Focal depth of the event in km
mlh: (float) Surface wave magnitude (see e.g. Rautian and Khalturin, 1994)
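A minimal sketch of reading the catalogue with Python's standard csv module, using the column names and types from the header above; the sample row is invented for illustration:

```python
import csv
import io

# Invented two-line sample with the documented header and column order
sample = io.StringIO(
    "id,year,month,day,hour,min,sec,lat,lon,fdepth,mlh\n"
    "1,1911,1,3,23,25,45,42.9,76.9,25,8.2\n"
)

events = []
for row in csv.DictReader(sample):
    events.append({
        "id": int(row["id"]),
        "year": int(row["year"]),      # negative years denote BCE events
        "lat": float(row["lat"]),
        "lon": float(row["lon"]),
        "fdepth": int(row["fdepth"]),  # focal depth in km
        "mlh": float(row["mlh"]),      # surface-wave magnitude
    })
```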
# 5
Ullah, Shahid • Abdrakhmatov, Kanat • Sadykova, Alla • Ibragimov, Roman • Ishuk, Anatoly • (et al.)
Abstract: Version History:
11 Sep 2019: Release of Version 1.1 with the following changes: (1) new licence: CC BY-SA 4.0; modification of the title (removal of file name and version); (2) addition of ORCIDs when available; (3) update of affiliations for some authors. The metadata of the first version 1.0 is available in the download folder. Data and file names remain unchanged.
Area Source Model for Central Asia
The area sources for Central Asia within the EMCA model are defined mainly by considering the pattern of crustal seismicity down to 50 km depth, although tectonic and geological information, such as the position and strike distribution of known faults, has also been taken into account where available. Large area sources (see, for example, source_id 1, 2, 5, 45 and 52; sources are identified by the parameter "source_id" in the related shapefile) are defined where seismicity is scarce and there are no tectonic or geological features that would justify further subdivision. Smaller area sources (e.g., source_id values 36 and 53) have been designed where the seismicity can be assigned to known fault zones.
To obtain a robust estimation of the parameters needed for PSHA from statistical analysis of the seismicity, despite the scarcity of data in some of the areas covered by the model, super zones are introduced. These super zones are defined by combining area sources based on similarities in their tectonic regime, taking into account local experts' judgments. The super zones are used to estimate: (1) the completeness time of the earthquake catalogue, (2) the depth distribution of seismicity, (3) the tectonic regime through focal mechanism analysis, (4) the maximum magnitude and (5) the b values via the Gutenberg-Richter (GR) relationship.
The earthquake catalogue for focal mechanisms is extracted from the Harvard Global Centroid Moment Tensor Catalog (Ekström and Nettles, 2013). For the focal mechanism classification, the Boore et al. (1997) convention is used: an event is considered strike-slip if the absolute value of the rake angle is <=30 or >=150 degrees, normal if the rake angle is between -150 and -30 degrees, and reverse (thrust) if it is between 30 and 150 degrees. The distribution of source mechanisms and their weights are estimated for the super zones.
For area sources, the maximum magnitude is usually taken from the historical seismicity, but due to uncertainties in the magnitudes of the largest events, the opinions of local experts are also included in assigning the maximum magnitude to each super zone. Super zones 2 and 3, which belong to stable regions, are each assigned a maximum magnitude of 6, after Mooney et al. (2012), who conclude from analyses of modern datasets that an event of at least magnitude 6 can occur anywhere in the world. For hazard calculations, each area source is assigned the maximum magnitude of its respective super zone.
To estimate the GR parameters (a and b values) for the area sources, the completeness analysis results estimated for the super zones are assigned to the respective smaller area sources. If an individual area source has at least 20 events, the GR parameters are estimated for that area source. Otherwise, the b value is adopted from the super zone to which the smaller area source belongs, and the a value is estimated based on the Weichert (1980) method. This ensures stability of the b value as well as variation of the activity rate between sources.
The hypocentral depth distribution is estimated from the seismicity inside each super zone, with up to three depth values considered. Based on the number of events, weights are assigned to each depth value. These depth distributions, along with their weights, are assigned to the area sources within the same super zones.
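The rake-angle convention described above translates directly into code. A minimal sketch follows; the function name is illustrative:

```python
# Focal mechanism classification after the Boore et al. (1997) convention:
# strike-slip if |rake| <= 30 or |rake| >= 150 degrees,
# normal if -150 < rake < -30, reverse (thrust) if 30 < rake < 150.
def classify_mechanism(rake):
    """Classify a focal mechanism from its rake angle in degrees."""
    if abs(rake) <= 30 or abs(rake) >= 150:
        return "strike-slip"
    if -150 < rake < -30:
        return "normal"
    if 30 < rake < 150:
        return "reverse"
    return "undefined"  # defensive fallback; not reachable for real angles
```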
Distribution file: "EMCA_seismozonesv1.0_shp.zip"
Version: v1.0
Release date: 2015-07-30
Format: ESRI Shapefile
Geometry type: polygons
Number of features: 63
Spatial Reference System: +proj=longlat +ellps=WGS84 +datum=WGS84 +no_defs
Distribution file: "EMCA_seismozonesv1.0_nrml.zip"
Version: v1.0
Release date: 2015-07-30
Format: NRML (XML), compatible with the GEM OpenQuake platform (http://www.globalquakemodel.org/openquake/about/platform/)
Feature attributes:
src_id: ID of the seismic source
src_name: Name of the seismic source
tect_reg: Tectonic regime of the seismic source
upp_seismo: Upper level of the seismogenic depth (km)
low_seismo: Lower level of the seismogenic depth (km)
mag_scal_r: Magnitude scaling relationship
rup_asp_ra: Rupture aspect ratio
mfd_type: Magnitude frequency distribution type
min_mag: Minimum magnitude of the magnitude frequency relationship
max_mag: Maximum magnitude of the magnitude frequency relationship
a_value: a value of the magnitude frequency relationship
b_balue: b value of the magnitude frequency relationship
num_npd: Number of nodal plane distributions
weight_1: Weight of 1st nodal plane distribution
strike_1: Strike of the seismic source (degrees)
rake_1: Rake of the seismic source (degrees)
dip_1: Dip of the seismic source (degrees)
num_hdd: Number of hypocentral depth distributions
hdd_d_1: Depth of 1st hypocentral depth distribution (km)
hdd_w_1: Weight of 1st hypocentral depth distribution
# 6
Encarnacao, Joao • Visser, Pieter • Jaeggi, Adrian • Bezdek, Ales • Mayer-Gürr, Torsten • (et al.)
Abstract: Although the knowledge of the gravity of the Earth has improved considerably with the CHAMP, GRACE and GOCE satellite missions, the geophysical community has identified the need for continued monitoring of its time-variable component in order to estimate the hydrological and glaciological yearly cycles and long-term trends. Currently, the GRACE-FO satellites are the sole provider of these data, while previously the GRACE mission collected them for 15 years. Between the GRACE and GRACE-FO data periods lies a gap spanning from July 2017 to May 2018, while the Swarm satellites have collected gravimetric data with their GPS receivers since December 2013. This project aims at providing high-quality gravity field models from Swarm data that constitute an alternative and independent source of gravimetric data, which could help alleviate the consequences of the 10-month gap between GRACE and GRACE-FO, as well as the short gaps in the existing GRACE and GRACE-FO monthly time series. The geodetic community has realized that a combination of different gravity field solutions is superior to any individual model. This project exploits this fact and delivers high-quality independent monthly gravity field models, resulting from the combination of four different gravity field estimation approaches. All solutions are unconstrained and estimated independently from month to month. Preliminary comparison with GRACE data has demonstrated that the signal in the Swarm gravity field models is restricted to degrees 12-15 and below, while the temporal correlations decrease considerably above degree 10. The 750 km smoothed models are suitable for retrieving the global annual temporal variations of Earth's gravity field, and the agreement with GRACE over large basins (e.g. Amazon, Congo-Zambezi, Ganges-Brahmaputra) is within 1 cm RMS in terms of Equivalent Water Height.
The global RMS relative to a bias, trend, annual and semi-annual model derived from GRACE over deep ocean areas (roughly 1000 km from shorelines) is under 1 mm geoid height during periods of low ionospheric activity. More information about this project can be found at https://www.researchgate.net/project/Multi-approach-gravity-field-models-from-Swarm-GPS-data and at the website of ESA's Swarm DISC (Data, Innovation and Science Cluster) (https://earth.esa.int/web/guest/missions/esa-eo-missions/swarm/activities/scientific-projects/disc#MAGF). This project is funded by ESA via the Swarm DISC, Sub-Contract No. SW-CO-DTU-GS-111.
# 7
Schleicher, Anja Maria • Jurado, Maria-Jose
Abstract: This data publication uses XRD bulk rock analyses carried out on cuttings aboard D/V Chikyu during the International Ocean Discovery Program (IODP) Expeditions 338 and 348 of the Nankai Trough Seismogenic Zone Experiment (NanTroSEIZE) project (Strasser et al., 2014; Tobin et al., 2015). More data on clay minerals in the C0002F and C0002P holes are published by Underwood and Song (2016a, 2016b) and Underwood (2017). These data are supplementary material for Schleicher and Jurado (2019). XRD data of the clay size fraction were analyzed at the University of Michigan, USA, and the GFZ Potsdam, Germany. All XRD analyses of the random powder and texture (oriented) preparations followed the analytical methods described in Moore and Reynolds (1997). Oriented clay size samples were measured under air-dried and glycolated conditions; the latter treatment causes interlayer expansion of swelling clays, allowing the recognition of discrete smectite and mixed-layer smectitic phases. In order to compare the clay mineral content and the mineral amount relative to the adjacent material, exactly 45 μg of the material was mixed with 1.5 ml deionized water and dropped onto a round glass slide (diameter 32 mm). All air-dried samples were measured at a relative humidity (RH) of ~30% and afterward stored in a desiccator filled with ethylene glycol, in order to investigate the final swelling stage of the smectitic phases.
The data are provided as a tab-delimited table (2019-002_Schleicher-Jurado_XRD-data.txt, see also Table 1 in Schleicher and Jurado, 2019) with the following columns:
- Hole: name of the C0002 subhole
- Depth (mbsf): depth in meters below sea floor (mbsf)
- Sample (SMW): sample number SMW (solid cuttings taken from drilling mud)
- Smectite (int./cps): intensity of smectite in counts per second (cps)
- Illite (int./cps): intensity of illite in counts per second (cps)
In addition, the original XRD measurements are provided in raw and text formats (2019-002_Schleicher-Jurado_original-XRD-measurements.zip). All science data from these expeditions are also accessible via the database of science data acquired by International Ocean Discovery Program and Integrated Ocean Drilling Program expeditions of D/V Chikyu (http://sio7.jamstec.go.jp/).
# 8
Warsitzka, Michael • Závada, Prokop • Pohlenz, Andre • Rosenau, Matthias
Abstract: This dataset provides friction data from ring-shear tests (RST) for a quartz sand used in analogue experiments at the Institute of Geophysics of the Czech Academy of Sciences (IGCAS) (Kratinová et al., 2006; Zavada et al., 2009; Lehmann et al., 2017; Krýza et al., 2019). The material is characterized by means of internal friction coefficients µ and cohesion C. According to our analysis the material shows Mohr-Coulomb behaviour characterized by a linear failure envelope. The peak friction coefficient µP of the tested material is ~0.75, the dynamic friction coefficient µD is ~0.60 and the reactivation friction coefficient µR is ~0.64. Cohesion ranges between 90 and 130 Pa. The material shows a minor rate-weakening of <1% per ten-fold change in shear velocity v.
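The linear Mohr-Coulomb failure envelope reported above can be written as tau = C + µ * sigma_n. A minimal sketch using the peak values from the abstract follows; the mid-range cohesion of 110 Pa is an illustrative choice, not a measured value:

```python
# Linear Mohr-Coulomb failure envelope: tau = C + mu * sigma_n
# mu = 0.75 (peak friction coefficient) and C in the 90-130 Pa range
# are the values reported for this sand; C = 110 Pa is a mid-range pick.
def shear_strength(normal_stress_pa, mu=0.75, cohesion_pa=110.0):
    """Shear strength (Pa) at a given effective normal stress (Pa)."""
    return cohesion_pa + mu * normal_stress_pa
```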
# 9
Schimmel, Mariska • Hangx, Suzanne • Spiers, Chris
Abstract: We studied the effect of pore fluid chemistry on compaction creep in quartz sand aggregates, as an analogue for clean, highly porous, quartz-rich reservoir sands and sandstone. Creep is specifically addressed, because it is not yet well understood and can potentially cause reservoir compaction even after production has ceased. Going beyond previous work, we focused on fluids typically considered for pressure maintenance or for permanent storage, e.g. water, wastewater, CO2 and N2, as well as agents, such as AlCl3, a quartz dissolution inhibitor, and scaling inhibitors used in water treatment facilities and geothermal energy production. Uniaxial (oedometer) compaction experiments were performed on cylindrical sand samples at constant effective stress (35 MPa) and constant temperature (80 °C), simulating typical reservoir depths of 2-4 km. Insight into the deformation mechanisms operating at the grain scale was obtained via acoustic emission (AE) counting, and by means of microstructural study and grain size analysis applied before and after individual compaction tests.
Data logging and output:
The present data were obtained using an Instron loading frame equipped with a uniaxial (oedometer) compaction vessel located in the HPT laboratory at Utrecht University. A complete description of the machine is provided by Schimmel et al. (2019). Mechanical and acoustic emission (AE) data were recorded at 1 Hz using National Instruments (NI) VI Logger software; an overview is presented in Table 1.
Table 1. Overview of recorded data (Name | Unit | Description)
Row | - | -
Instron load | V | Load externally measured by the Instron loading frame
Instron position | V | Position of the Instron loading ramp measured by the Instron LVDT
Local load | V | Load internally measured by the local load cell
Local position | V | Position of the top measured by the local LVDT
Temperature | V | Sample temperature measured close to the sample
Count A | - | Number of AE counts from counter A
Count B | - | Number of AE counts from counter B
Data processing:
All measured quantities were converted to physical units using the following conversions:
- Time [s] = row * 1
- Instron load [kN] = Instron load [V] * 10
- Instron position [mm] = Instron position [V] * 5
- Local load [kN] = local load [V] * 33.3
- Local position [mm] = local position [V] * -0.100684133
- Temperature [°C] = temperature [V] * 100
The displacement data were calculated from the Instron and local position, which were corrected for apparatus distortion and thermal expansion using calibrations carried out in an empty vessel at pressure and temperature conditions covering the present experiments. The displacement data (D) were corrected according to
Dsample = Dtotal - Ddistortion
where
Ddistortion = 1.126e-09 * x^8 - 7.744e-08 * x^7 + 2.059e-06 * x^6 - 2.5e-05 * x^5 + 9.109e-05 * x^4 + 0.0009916 * x^3 - 0.01238 * x^2 + 0.066 * x
and x is the applied load (Instron load).
Microstructural data:
Grain size analysis was performed on one undeformed and several deformed samples using a Malvern laser diffraction particle sizer.
This allowed determination of the average grain size and grain size distribution before and after deformation. Laser particle size analysis systematically overestimates grain size by approximately 25 %, due to fines adhering to coarse grains. Stitched micrographs are given for one sample that was only pre-compacted and several samples that were allowed to creep after pre-compaction. Portions of these micrographs were used for crack density analysis.
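The channel conversions and the distortion correction listed in the data processing description above can be sketched as follows; the dictionary layout is illustrative, while the conversion factors and polynomial coefficients are those given in the description:

```python
# Convert raw logger channels (volts) to physical units, using the
# conversion factors stated in the data description.
def to_physical(raw):
    return {
        "instron_load_kN": raw["instron_load_V"] * 10,
        "instron_pos_mm": raw["instron_pos_V"] * 5,
        "local_load_kN": raw["local_load_V"] * 33.3,
        "local_pos_mm": raw["local_pos_V"] * -0.100684133,
        "temperature_C": raw["temperature_V"] * 100,
    }

def distortion_correction(x):
    """Apparatus distortion D_distortion as an 8th-order polynomial of the
    applied Instron load x, with the coefficients given in the description."""
    return (1.126e-09 * x**8 - 7.744e-08 * x**7 + 2.059e-06 * x**6
            - 2.5e-05 * x**5 + 9.109e-05 * x**4 + 0.0009916 * x**3
            - 0.01238 * x**2 + 0.066 * x)

def corrected_displacement(d_total_mm, load):
    """D_sample = D_total - D_distortion."""
    return d_total_mm - distortion_correction(load)
```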
# 10
Ziegler, Moritz O. • Ziebarth, Malte • Reiter, Karsten
Abstract: In geosciences the discretization of complex 3D model volumes into finite elements can be a time-consuming task and often requires experience with professional software. Especially outcropping or out-pinching geological units, i.e. geological layers that are represented in the model volume, pose serious challenges. Changes in the geometry of a model may occur well into a project, at a point when re-meshing is no longer an option or would require a significant amount of additional time. In order to speed up and automate the process of discretization, Apple PY (Automatic Portioning Preventing Lengthy manual Element assignment for PYthon) separates the processes of mesh generation and unit assignment. It requires an existing uniform mesh together with separate information on the depths of the interfaces between geological units (herein called horizons). These two pieces of information are combined and used to assign the individual elements to different units. The uniform mesh is created with standard meshing software and contains no or only very few and simple structures. The mesh has to be available as an Abaqus input file. The information on the horizon depths and their lateral variations is provided in a text file. Apple PY compares the element location and depth with those of the horizons in order to assign each element to the corresponding geological unit below or above a certain horizon.
Version History:
Version 1.01 (29 August 2019): Bug fixes, no change in functionality; the manual for Version 1.0 remains valid.
- elems_exclude now works as designed and described in the manual.
- Commenting out elems_exclude no longer crashes the script.
- create_horizon_file no longer creates two instances of the uppermost horizon.
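The element-assignment idea can be sketched as a depth comparison against an ordered list of horizon depths. This is an illustrative sketch of the concept, not the actual Apple PY code; the data structures and function name are assumptions:

```python
# Assign an element to a geological unit by comparing its centroid depth
# with the horizon depths at that location (depths positive downward,
# horizons ordered top to bottom). Unit i lies above horizon i.
def assign_unit(element_depth, horizon_depths):
    """Return the index of the unit the element centroid falls in."""
    for i, horizon in enumerate(horizon_depths):
        if element_depth < horizon:   # element lies above this horizon
            return i
    return len(horizon_depths)        # below the deepest horizon
```

Because the mesh and the horizon description are independent inputs, the horizons can change late in a project without re-meshing, which is the point of separating the two steps.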