# 1
Zaccarelli, Riccardo
Abstract: The task of downloading comprehensive datasets of event-based seismic waveforms has been made easier by the development of standardised web services, but it remains highly non-trivial: the likelihood of temporary network failures or, worse, subtle data errors naturally increases when the requested data amount to millions of relatively short segments. The challenge grows further because the typical workflow is not restricted to a single massive download but consists of fetching all available input data (e.g., over several repeated download executions) for a processing stage that produces any desired user-defined output. Here we present stream2segment, a highly customisable Python 2+3 package that guides the user through the whole workflow of downloading, inspecting and processing event-based seismic data, using a relational database management system as archive storage, which has clear performance and usability advantages. Stream2segment provides an integrated processing implementation able to produce any kind of user-defined output based on a configuration file and a user-defined Python function. Stream2segment can also produce diagnostic maps or user-defined plots which, unlike existing tools, do not require external software dependencies and are not static images but interactive browser-based applications ideally suited for data inspection or annotation tasks.
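To illustrate the processing stage described above, here is a minimal sketch of the kind of user-defined processing function the package accepts. The `main(segment, config)` signature, the `segment.stream()` accessor and the config keys are assumptions for illustration and may differ from the package's actual interface; the stand-in segment exists only so the sketch runs on its own.

```python
import numpy as np
from obspy import Trace, Stream

def main(segment, config):
    """User-defined processing: return one output row per downloaded segment."""
    stream = segment.stream()                      # waveform as an ObsPy Stream (assumed accessor)
    stream.detrend('linear')
    stream.filter('bandpass',
                  freqmin=config['freq_min'],      # hypothetical keys read from the configuration file
                  freqmax=config['freq_max'])
    tr = stream[0]
    return {'peak_amplitude': float(np.abs(tr.data).max()),
            'sampling_rate': tr.stats.sampling_rate}

# Stand-in segment for local testing only; in stream2segment the framework
# supplies real segments read from the database.
class _FakeSegment:
    def stream(self):
        return Stream([Trace(data=np.random.randn(2000),
                             header={'sampling_rate': 100.0})])

print(main(_FakeSegment(), {'freq_min': 0.1, 'freq_max': 10.0}))
```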
# 2
Ziegler, Moritz • Heidbach, Oliver
Abstract: The distribution of data records for the maximum horizontal stress orientation S_Hmax in the Earth’s crust is sparse and very uneven. To analyse the stress pattern and its wavelength and to predict the mean S_Hmax orientation on regular grids, statistical interpolation as conducted e.g. by Coblentz and Richardson (1995), Müller et al. (2003), Heidbach and Höhne (2008), Heidbach et al. (2010) or Reiter et al. (2014) is necessary. Based on their work we wrote the Matlab® script Stress2Grid, which provides several features to analyse the mean S_Hmax pattern. The script facilitates and speeds up this analysis and extends the functionality compared to the publications mentioned before. This script is the update of Stress2Grid v1.0 (Ziegler and Heidbach, 2017). It provides two different concepts to calculate the mean S_Hmax orientation on regular grids. The first uses a fixed search radius around each grid point and computes the mean S_Hmax orientation if sufficient data records lie within the search radius. The larger the search radius, the larger the filtered wavelength of the stress pattern. The second approach uses variable search radii and determines the search radius for which the standard deviation of the mean S_Hmax orientation is below a given threshold. This approach delivers mean S_Hmax orientations with a user-defined degree of reliability. It resolves local stress perturbations and provides no estimate in areas with conflicting information that results in a large standard deviation. Furthermore, the script can also estimate the deviation between the plate motion direction and the mean S_Hmax orientation. The script is fully documented in the accompanying WSM Technical Report 19/02 (Ziegler and Heidbach, 2019), which includes a changelog at the beginning.
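The fixed-search-radius concept can be sketched as follows (in Python rather than Matlab, using planar distances, unweighted records and the angle-doubling circular mean for axial data; the published script is more elaborate, so treat this only as an illustration of the idea):

```python
import numpy as np

def mean_shmax(azimuths_deg):
    """Mean orientation of axial S_Hmax data (0-180 deg) via angle doubling."""
    phi = np.radians(2.0 * np.asarray(azimuths_deg, dtype=float))
    C, S = np.cos(phi).mean(), np.sin(phi).mean()
    return 0.5 * np.degrees(np.arctan2(S, C)) % 180.0

def grid_mean_fixed_radius(grid_xy, data_xy, azimuths_deg, radius_km, min_records=3):
    """Mean S_Hmax at each grid point from all records within a fixed search radius."""
    azimuths_deg = np.asarray(azimuths_deg, dtype=float)
    results = []
    for gx, gy in grid_xy:
        d = np.hypot(data_xy[:, 0] - gx, data_xy[:, 1] - gy)   # planar distance proxy
        sel = d <= radius_km
        results.append(mean_shmax(azimuths_deg[sel]) if sel.sum() >= min_records else np.nan)
    return np.array(results)

# Example: three grid points, five data records (coordinates in km, azimuths in degrees)
grid_xy = [(0.0, 0.0), (100.0, 0.0), (500.0, 0.0)]
data_xy = np.array([(10.0, 5.0), (20.0, -8.0), (30.0, 12.0), (90.0, 3.0), (480.0, 0.0)])
print(grid_mean_fixed_radius(grid_xy, data_xy, [150., 160., 170., 155., 40.], radius_km=50.0))
```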
# 3
Ziegler, Moritz • Heidbach, Oliver
Abstract: The distribution of data records for the maximum horizontal stress orientation SHmax in the Earth’s crust is sparse and very uneven. In order to analyse the stress pattern and its wavelength or to predict the mean SHmax orientation on a regular grid, statistical interpolation as conducted e.g. by Coblentz and Richardson (1995), Müller et al. (2003), Heidbach and Höhne (2008), Heidbach et al. (2010) or Reiter et al. (2014) is necessary. Based on their work we wrote the Matlab® script Stress2Grid, which provides several features to analyse the mean SHmax pattern. The script facilitates and speeds up this analysis and extends the functionality compared to the aforementioned publications. The script is complemented by a number of example and input files as described in the WSM Technical Report (Ziegler and Heidbach, 2017, http://doi.org/10.2312/wsm.2017.002). The script provides two different concepts to calculate the mean SHmax orientation on a regular grid. The first uses a fixed search radius around each grid point and computes the mean SHmax orientation if sufficient data records lie within the search radius. The larger the search radius, the larger the filtered wavelength of the stress pattern. The second approach uses variable search radii and determines the search radius for which the variance of the mean SHmax orientation is below a given threshold. This approach delivers mean SHmax orientations with a user-defined degree of reliability. It resolves local stress perturbations and provides no estimate in areas with conflicting information that results in a large variance. Furthermore, the script can also estimate the deviation between the plate motion direction and the mean SHmax orientation.
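The variable-search-radius concept can be sketched in the same spirit (again Python with planar distances and unweighted records; the circular standard deviation is used as the spread measure, and the rule of keeping the smallest qualifying radius is an assumption about how the selection could be done, not necessarily the script's exact rule):

```python
import numpy as np

def shmax_mean_std(azimuths_deg):
    """Circular mean and standard deviation of axial S_Hmax data via angle doubling."""
    phi = np.radians(2.0 * np.asarray(azimuths_deg, dtype=float))
    C, S = np.cos(phi).mean(), np.sin(phi).mean()
    R = max(float(np.hypot(C, S)), 1e-12)                   # mean resultant length
    return (0.5 * np.degrees(np.arctan2(S, C)) % 180.0,
            0.5 * np.degrees(np.sqrt(-2.0 * np.log(R))))

def variable_radius_mean(grid_pt, data_xy, azimuths_deg,
                         radii_km=(1000., 500., 250., 100., 50.),
                         std_threshold_deg=25.0, min_records=3):
    """Shrink the search radius and keep the smallest radius whose standard
    deviation stays below the threshold; return NaNs if no radius qualifies."""
    azimuths_deg = np.asarray(azimuths_deg, dtype=float)
    d = np.hypot(data_xy[:, 0] - grid_pt[0], data_xy[:, 1] - grid_pt[1])
    result = (np.nan, np.nan, np.nan)                       # mean, std, radius
    for r in sorted(radii_km, reverse=True):                # from large to small radii
        sel = d <= r
        if sel.sum() < min_records:
            break                                           # too few records: stop shrinking
        mean_deg, std_deg = shmax_mean_std(azimuths_deg[sel])
        if std_deg <= std_threshold_deg:
            result = (mean_deg, std_deg, r)
    return result
```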
# 4
Dreiling, Jennifer • Tilmann, Frederik
Abstract: BayHunter is an open-source Python tool to perform a Markov chain Monte Carlo (McMC) transdimensional Bayesian inversion of receiver functions and/or surface wave dispersion. It inverts for the velocity-depth structure, the number of layers and the noise parameters (noise correlation and amplitude). Forward modelling codes are provided within the package but can easily be replaced with the user's own codes. It is also possible to add (completely different) data sets. The BayWatch module can be used to live-stream the inversion while it is running: this makes it easy to see how each chain explores the parameter space, how the data fits and models change, and in which direction the inversion progresses.
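The transdimensional aspect (the number of layers is itself a free parameter) can be illustrated with a toy birth/death proposal for a layered velocity model. This is a conceptual sketch only, not BayHunter's internal API; the prior bounds and step size are made-up numbers, and the reversible-jump acceptance step is omitted.

```python
import numpy as np

rng = np.random.default_rng(42)

def propose(depths, velocities, z_max=60.0, dv=0.1):
    """Randomly add ('birth') or remove ('death') one layer of the model.
    depths and velocities are equal-length arrays (layer-top depth in km, vs in km/s)."""
    if rng.random() < 0.5 and len(depths) > 1:              # death: drop a random layer
        k = rng.integers(len(depths))
        return np.delete(depths, k), np.delete(velocities, k)
    z_new = rng.uniform(0.0, z_max)                         # birth: insert a new layer top
    k = int(np.searchsorted(depths, z_new))
    v_new = velocities[min(k, len(velocities) - 1)] + rng.normal(0.0, dv)
    return np.insert(depths, k, z_new), np.insert(velocities, k, v_new)

depths = np.array([0.0, 5.0, 20.0, 35.0])                   # toy starting model
vels = np.array([3.0, 3.2, 3.6, 4.3])
print(propose(depths, vels))
```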
# 5
Radosavljevic, Boris
Abstract: This publication contains tools for statistical evaluation and exploration of data published by Radosavljevic et al. (2016). These data contain bulk geochemistry data (total organic carbon, nitrogen, stable carbon isotope) and granulometry of nearshore samples in the vicinity of Herschel Island, Yukon, Canadian Beaufort Sea. In addition, the functions of the script herein provide a means for summaries and comparison with terrestrial (Couture, 2010; Tanski et al., 2017; Obu et al., 2016) and marine (a subset of Naidu et al., 2000) data. The tools are contained in a script written for the R software environment for statistical computing and graphics. The script (sediments_geochemistry_plots_and_summaries.r) is richly documented and explains the functionality. Each data file also contains a description of the data in a comma-separated file (csv). The functions of the script are:
- myinteract() - interactive mode
- mysum() - provides numerical summaries for WBP and TB, a box plot, and runs a two-sided Mann-Whitney-Wilcoxon test
- myloc() - provides numerical summaries and comparisons among the current study, marine, and terrestrial samples, a box plot, and runs a two-sided Mann-Whitney-Wilcoxon test
- myseds() - provides numerical summaries and comparisons of grain size data within the current study
- mycums() - plots cumulative frequency curves of grain size distributions by transect

The package contains (included in the zip folder):
- sediments_geochemistry_plots_and_summaries.r - script file
- geochemistry_data_including_other_studies.csv - contains data by Radosavljevic et al. (2016) and other studies in the region
- VolFrequenciesCoordsTransects.csv - contains volumetric grain size frequencies
- granulometry_stats.csv - contains summary statistics of grain size data
- TransectSampleIndex.csv - provides an index of transects
- TransectMap.png - an overview map of sample transects
# 6
Ziegler, Moritz O.
Abstract: The 3D geomechanical-numerical modelling of the in-situ stress state requires observed stress information at reference locations within the model area for comparison with the modelled stress state. This comparison of stress states and the ensuing adaptation of the displacement boundary conditions provide a best-fit stress state in the entire model region based on the available stress information. This process is also referred to as calibration. Depending on the amount of available information and the complexity of the model, the calibration is a lengthy process of trial-and-error modelling and analysis. The Fast Automatic Stress Tensor Calibration (FAST Calibration) is a method and a Matlab script, developed in the framework of the World Stress Map (WSM, Heidbach et al., 2010; 2016), that facilitates and speeds up the calibration process. The method requires only three model scenarios with different boundary conditions. The modelled stress states at the locations of the observed stress states are extracted and then used to compute the displacement boundary conditions required to achieve the best fit of the modelled to the observed stress state. Furthermore, the influence of the individual observed stress information on the resulting stress state can be weighted. FAST Calibration controls the statistical calibration of a 3D geomechanical-numerical model of the stress state following the approach described by Reiter and Heidbach (2014), Hergert et al. (2015), and Ziegler et al. (2016). It is mainly designed to support the multi-stage modelling procedure presented by Ziegler et al. (2016), but it can also be used for the calibration of a single-stage model. The tools run in Matlab 2017a and higher and are meant to work with the visualization software Tecplot 360 EX 2015 R2 and higher (https://www.tecplot.com/products/tecplot-360/) in conjunction with the Tecplot 360 Add-on GeoStress (Stromeyer and Heidbach, 2017). The user should be familiar with 3D geomechanical-numerical modelling, Matlab, Tecplot 360 EX (including a basic knowledge of Tecplot 360 EX macro functions), and the Tecplot 360 EX Add-on GeoStress. This FAST Calibration manual provides an overview of the scripts and is designed to help the user adapt the scripts to their own needs.
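The core idea, that three scenarios with different boundary conditions suffice to determine the best-fit boundary displacements, can be sketched as follows. This Python sketch assumes the modelled stress value at each observation point depends affinely on two boundary displacement parameters and uses a single scalar stress quantity per observation; it is an illustration of the concept, not the published Matlab tool.

```python
import numpy as np

def best_fit_boundary_conditions(bc_scenarios, modelled, observed, weights=None):
    """
    bc_scenarios : (3, 2) boundary displacements (bx, by) of the three test scenarios
    modelled     : (n_obs, 3) modelled stress value at each observation point per scenario
    observed     : (n_obs,)   observed stress values at the same points
    Returns the boundary displacements (bx, by) minimising the weighted misfit.
    """
    modelled = np.asarray(modelled, dtype=float)
    observed = np.asarray(observed, dtype=float)
    w = np.ones(len(observed)) if weights is None else np.asarray(weights, dtype=float)
    # Fit stress_i(bx, by) = a_i + b_i*bx + c_i*by exactly from the three scenarios.
    G = np.column_stack([np.ones(3), np.asarray(bc_scenarios, dtype=float)])   # (3, 3)
    coeffs = np.linalg.solve(G, modelled.T).T                                  # rows: (a_i, b_i, c_i)
    # Weighted least squares for (bx, by): minimise ||w * (A @ x - d)||
    A = coeffs[:, 1:] * w[:, None]
    d = (observed - coeffs[:, 0]) * w
    bx_by, *_ = np.linalg.lstsq(A, d, rcond=None)
    return bx_by

# Toy example: two observation points, three calibration scenarios
bc = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
modelled = [[10.0, 4.0, 14.0],      # stress at point 1 for the three scenarios
            [ 2.0, 8.0, 10.0]]      # stress at point 2
print(best_fit_boundary_conditions(bc, modelled, observed=[12.0, 6.0]))        # -> [1.  0.5]
```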
# 7
Ziegler, Moritz O. • Ziebarth, Malte • Reiter, Karsten
Abstract: In the geosciences, the discretization of complex 3D model volumes into finite elements can be a time-consuming task and often requires experience with professional software. Outcropping or out-pinching geological units, i.e. geological layers represented in the model volume, pose particular challenges. Changes in the geometry of a model may occur well into a project, at a point when re-meshing is no longer an option or would require a significant amount of additional time. In order to speed up and automate the discretization, Apple PY (Automatic Portioning Preventing Lengthy manual Element assignment for PYthon) separates the processes of mesh generation and unit assignment. It requires an existing uniform mesh together with separate information on the depths of the interfaces between geological units (herein called horizons). These two pieces of information are combined and used to assign the individual elements to the different units. The uniform mesh is created with standard meshing software and contains no or only very few and simple structures. The mesh has to be available as an Abaqus input file. The information on the horizon depths and their lateral variations is provided in a text file. Apple PY compares the element location and depth with those of the horizons in order to assign each element to the corresponding geological unit below or above a certain horizon.
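The assignment step can be sketched roughly as follows, assuming element centroids and scattered horizon points are already available as arrays (the actual Apple PY input formats, an Abaqus input file plus a text file of horizon depths, are parsed separately and are not reproduced here):

```python
import numpy as np
from scipy.interpolate import griddata

def assign_units(centroids, horizons):
    """
    centroids : (n_elem, 3) element centroid coordinates (x, y, z), z increasing upwards
    horizons  : list of (m_i, 3) scattered horizon points, ordered shallowest to deepest
    Returns an integer unit index per element (0 = above the first horizon).
    """
    centroids = np.asarray(centroids, dtype=float)
    units = np.zeros(len(centroids), dtype=int)
    for i, pts in enumerate(horizons, start=1):
        pts = np.asarray(pts, dtype=float)
        # interpolate the horizon depth at each element's (x, y) position
        z_horizon = griddata(pts[:, :2], pts[:, 2], centroids[:, :2], method='linear')
        # an element whose centroid lies below this horizon belongs at least to unit i
        below = centroids[:, 2] < z_horizon                 # NaN (outside horizon) -> False
        units[below] = i
    return units

# Toy example: two elements, one flat horizon at z = -10
centroids = [(0.5, 0.5, -5.0), (0.5, 0.5, -15.0)]
horizon = [(0, 0, -10.0), (1, 0, -10.0), (0, 1, -10.0), (1, 1, -10.0)]
print(assign_units(centroids, [horizon]))                   # -> [0 1]
```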
# 8
Nooshiri, Nima • Heimann, Sebastian • Tilmann, Frederik • Dahm, Torsten • Saul, Joachim
Abstract: We present SCOTER, an open-source Python package designed to relocate multiple seismic events by using direct P- and S-wave station correction terms. The package implements the static and shrinking-box source-specific station terms techniques, extended to regional and teleseismic distances and adapted for probabilistic, non-linear, global-search location of large-scale multiple-event data sets. The program provides robust relocation results for seismic event sequences over a wide range of spatial and temporal scales by applying empirical corrections for the biasing effects of 3-D velocity structure. Written in the Python programming language, SCOTER runs as a stand-alone command-line tool (requiring no knowledge of Python) and also provides a set of sub-commands to prepare the required input files (e.g. phase files, travel-time grid files, configuration) and export relocation results (such as hypocenter parameters and travel-time residuals) in different formats -- routine but non-trivial tasks that can consume much user time. The package can be used for relocating data sets at local, regional, and teleseismic scales.
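For orientation, the two kinds of correction terms mentioned above can be sketched conceptually: a static station term is the mean travel-time residual of a station-phase pair over all events, while a source-specific station term averages only residuals from events near the source, with the neighbourhood shrinking over iterations. The sketch below is a generic illustration of these ideas, not SCOTER's code or command-line interface.

```python
import numpy as np

def static_terms(residuals):
    """residuals: dict {(station, phase): residuals over all events} -> mean per pair."""
    return {key: float(np.mean(r)) for key, r in residuals.items()}

def ssst_term(event_xyz, neighbour_xyz, neighbour_residuals, radius_km):
    """Source-specific term: mean residual of neighbouring events within radius_km."""
    d = np.linalg.norm(np.asarray(neighbour_xyz, float) - np.asarray(event_xyz, float), axis=1)
    inside = d <= radius_km
    return float(np.mean(np.asarray(neighbour_residuals)[inside])) if inside.any() else 0.0

# Toy example: one station-phase pair, three neighbouring events
print(static_terms({('STA1', 'P'): [0.4, 0.6, 0.5]}))
print(ssst_term(event_xyz=(0, 0, 10),
                neighbour_xyz=[(1, 0, 10), (2, 1, 12), (50, 40, 8)],
                neighbour_residuals=[0.4, 0.6, -1.0],
                radius_km=10.0))                            # averages only the two nearby events
```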
# 9
Quinteros, Javier
Abstract: This service provides routing information for distributed data centres in the case where multiple different seismic data centres offer access to data and products using compatible types of services. Examples of such data and product objects are seismic timeseries waveforms, station inventory, or quality parameters derived from the waveforms. The European Integrated Data Archive (EIDA) is an example of a set of distributed data centres (the EIDA "nodes"). EIDA has offered Arclink and Seedlink services for many years and now offers FDSN web services for accessing its holdings. In keeping with the distributed nature of EIDA, these services could run at different nodes or elsewhere, even on the computers of ordinary users. Depending on the type of service, a given deployment may only provide information about a reduced subset of all the available waveforms. To be effective, the Routing Service must know the locations of all services integrated into a system and serve this information in order to support the development of smart clients and/or higher-level services that can offer the user an integrated view of the entire system (EIDA), hiding the complexity of its internal structure. The service is intended to be open and queryable by anyone without the need for credentials or authentication.
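A client-side query might look roughly like the sketch below. The endpoint URL, parameter names and response structure shown here are placeholders and assumptions chosen to illustrate the idea of asking the routing service which data centre serves a given request; consult the service documentation for the actual interface.

```python
import requests

# Placeholder endpoint; the deployed routing service URL may differ.
ROUTING_URL = "https://example.org/eidaws/routing/1/query"

params = {
    "network": "GE",           # data we are interested in
    "service": "dataselect",   # which type of web service should be routed
    "format": "json",
}
response = requests.get(ROUTING_URL, params=params, timeout=30)
response.raise_for_status()

# Assumed response structure: a list of routes, each with the serving endpoint URL
for route in response.json():
    print(route.get("url"), route.get("params"))
```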
# 10
Heimann, Sebastian • Isken, Marius • Kühn, Daniela • Sudhaus, Henriette • Steinberg, Andreas • (et al.)
Abstract: Grond is an open source software tool for robust characterization of earthquake sources. Moment tensors and finite fault rupture models can be estimated from a combination of seismic waveforms, waveform attributes and geodetic observations like InSAR and GNSS. It helps you to investigate diverse magmatic, tectonic, and other geophysical processes at all scales. It delivers meaningful model uncertainties through a Bayesian bootstrap-based probabilistic joint inversion scheme. The optimisation explores the full model space and maps model parameter trade-offs with a flexible design of objective functions. Rapid forward modelling is enabled by using pre-computed Green's function databases, handled through the Pyrocko software library. They serve synthetic near-field surface displacements and synthetic seismic waveforms for arbitrary earthquake source models and geometries.
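The role of the pre-computed Green's function databases can be illustrated with a short Pyrocko forward-modelling sketch. The store name 'global_2s', the store directory and the source/receiver coordinates are assumptions for illustration; a matching Green's function store must exist locally for the sketch to run.

```python
from pyrocko import gf

# Engine that looks up pre-computed Green's function stores on disk
engine = gf.LocalEngine(store_superdirs=['gf_stores'])      # assumed store location

# A simple double-couple point source ...
source = gf.DCSource(lat=37.0, lon=-3.5, depth=8e3,
                     strike=120., dip=60., rake=-90., magnitude=5.5)

# ... and one receiver, tied to an assumed store named 'global_2s'
target = gf.Target(lat=38.0, lon=-2.0,
                   store_id='global_2s',
                   codes=('', 'STA', '', 'Z'))

# Synthetic waveforms are assembled from the pre-computed Green's functions
response = engine.process(source, [target])
for trace in response.pyrocko_traces():
    print(trace.nslc_id, trace.tmin, trace.tmax)
```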