# 1
Christian Willmes • Daniel Becker • Sebastian Brocks • Christoph Hütt • Georg Bareth
Abstract: Python-based pyGRASS scripts implementing Köppen-Geiger climate classifications from CMIP5 and PMIP3 climate model simulation datasets, together with a script for computing area statistics of the classification data, as well as documentation for using the scripts and accessing the source climate simulation data.
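The core of such a classification can be sketched as a decision function over monthly climate data. The following is a simplified, hypothetical illustration only: the actual pyGRASS scripts operate on GRASS GIS raster maps, and the real Köppen-Geiger scheme uses additional sub-classes and a seasonality-dependent aridity threshold that is omitted here.

```python
def koppen_main_class(t_monthly, p_monthly):
    """Return the Köppen-Geiger main class (A, B, C, D or E) from
    12 monthly mean temperatures (deg C) and precipitation sums (mm).
    Simplified sketch; thresholds follow the classic scheme."""
    t_max, t_min = max(t_monthly), min(t_monthly)
    t_ann = sum(t_monthly) / 12.0
    p_ann = sum(p_monthly)
    # E: polar -- warmest month below 10 deg C
    if t_max < 10:
        return "E"
    # B: arid -- annual precipitation below a dryness threshold.
    # Simplified here to 20 * T_ann; the full scheme adds a
    # precipitation-seasonality offset.
    if p_ann < 20 * t_ann:
        return "B"
    # A: tropical -- coldest month at or above 18 deg C
    if t_min >= 18:
        return "A"
    # C: temperate -- coldest month above -3 deg C (classic threshold)
    if t_min > -3:
        return "C"
    # D: continental -- otherwise
    return "D"
```

Applied per raster cell over the 12 monthly temperature and precipitation layers of a model simulation, a function like this yields the classification maps described above.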
# 2
Christian Willmes • Daniel Becker • Sebastian Brocks • Christoph Hütt • Georg Bareth
Abstract: This geospatial dataset, in raster and vector format, is a Köppen-Geiger climate classification of the MPI-ESM-P Last Glacial Maximum (21k yBP) r1i1p1 model simulations according to the PMIP III 21k experiment. The classifications were computed using the Python pyGRASS library and GRASS GIS.
# 3
Christian Willmes • Daniel Becker • Christoph Hütt • Sebastian Brocks • Georg Bareth
Abstract: This geospatial dataset, in raster and vector format, is a Köppen-Geiger climate classification of the MPI-ESM-P Mid-Holocene (6k yBP) r1i1p1 model simulations according to the PMIP III 6k experiment. The classifications were computed using the Python pyGRASS library and GRASS GIS.
# 4
Christian Willmes • Daniel Becker • Sebastian Brocks • Christoph Hütt • Georg Bareth
Abstract: This geospatial dataset, in raster and vector format, is a Köppen-Geiger climate classification of the MPI-ESM-P PreIndustrial r1i1p1 model simulations according to the PMIP III pre-industrial control experiment. The classifications were computed using the Python pyGRASS library and GRASS GIS.
# 5
Martina Klose • Stefan Zellmann • Yaping Shao • Ulrich Lang
Abstract: The video shows results of a large-eddy simulation of Convective Turbulent Dust Emission (CTDE) described by Klose and Shao (2013), "Large-eddy simulation of turbulent dust emission", Aeolian Research, 8 (doi:10.1016/j.aeolia.2012.10.010). Modeled dust concentration is visualized with a volume rendering approach. Particle tracing is applied to illustrate a dust devil occurring in the simulation. Colors indicate the direction of particle movement. The visualization was done with DeskVOX, a visualization package developed at the Regional Computing Center Cologne. This video is licensed under CC BY 3.0 (https://creativecommons.org/licenses/by/3.0/).
# 6
T. Hauck • M. Domnina • C. Molden • J. Cetinkaya
Abstract: This dataset comprises the techno-typological analysis of the lithic artefact assemblage of Yabroud rock-shelter II – Layer 4. This assemblage is part of a much larger collection of lithic artefacts and bone tools excavated at different localities in Yabroud by Alfred Rust between 1930 and 1933. The Yabroud collection is housed at the Institute of Prehistoric Archaeology of Cologne University, Germany.
# 7
Daniel Becker
Abstract: The inhabitants of an archaeological site had to procure resources regularly, and it is assumed that the area available to them for this purpose was limited by their pedestrian mobility. The speed at which a human can move through the environment is, apart from physiology, largely determined by topographic factors. This thesis examines the possibility of modelling the catchment areas (site catchments) of archaeological sites by computing slope-dependent walking speeds, and presents a method that makes this possible. In addition to slope, further factors such as river networks, vegetation and snow cover are incorporated into the modelling, in order to demonstrate the feasibility of the approach and their influence on the catchments. As a final result of the model computations, least cost paths are modelled alongside the site catchments, to show the effect of the various factors on a potential connection network between several archaeological sites.
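A common choice for slope-dependent walking speed in site-catchment and least-cost-path modelling is Tobler's hiking function; whether the thesis uses exactly this function is an assumption, but it illustrates the kind of cost surface involved:

```python
import math

def walking_speed_kmh(slope):
    """Tobler's hiking function: walking speed in km/h for a given
    slope (rise over run, dimensionless). Speed peaks on a slight
    downhill (slope = -0.05) and decays exponentially with steepness."""
    return 6.0 * math.exp(-3.5 * abs(slope + 0.05))

def traversal_time_h(distance_km, slope):
    """Time in hours to cross a cell of given length at given slope;
    accumulating such costs outward from a site yields its catchment."""
    return distance_km / walking_speed_kmh(slope)
```

On flat ground this gives roughly 5 km/h; summing `traversal_time_h` over a raster until a time budget is exhausted delineates the reachable area around a site.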
# 8
Daniel Kürner
Abstract: This thesis describes the implementation of metadata management for geodata from the projects of the SFB806. The goal is a web-based information system that enables project members to publish data in a central place and thus make it accessible to the project-internal research community. The information system is intended to ensure secure and sustainable archiving of the research results of the SFB806 projects and, at the same time, to support the research work of the SFB806 project members as an integrated data basis.
# 9
Christian Willmes • Yasa Yener • Anton Gilgenberg • Georg Bareth
Abstract: This poster contribution for the 2nd Research Data Management Workshop, held on November 27th and 28th at the University of Cologne, describes the advancements of the new CRC806-Database frontend. It was decided to update the system with some major changes to the overall architecture, while preserving the current API functionality and the URLs of the datasets in the database. This paper describes the system architecture of the upcoming version of the CRC806-Database. The SDI part of the system is migrated from the current MapServer, GeoServer, MapProxy and pyCSW based implementation to a GeoNode based system. Additionally, the Typo3 based frontend of the web portal is changed to use mostly server-side Extbase & Fluid based content handling and rendering, instead of the current AngularJS based frontend. Due to stability and consistency difficulties with client-side rendering, we decided to build a more robust system and move to server-side rendering.
# 10
Finn Viehberg • Christian Willmes • Sarah Esteban • Ralf Vogelsang
Abstract: Our research interest in palaeo-studies in East Africa remains high, partly because the Out-of-Africa hypotheses predict that Homo sapiens originated in East Africa. For several decades, archaeologists and geoscientists have explored suitable sites to answer related questions. At the same time, the analytical methods applied to the archives have improved in sensitivity and resolution. The amount of published scientific data is enormous, but it has to be carefully checked for robustness against modern standards. It is therefore necessary to compile datasets and excerpt the data that are the source of scientific interpretations (e.g. palaeoenvironment, palaeoclimate, evolution patterns, time models). In addition, the names of the study sites in East Africa are often transcribed from different languages or carry several synonyms for various reasons. We therefore decided to use a semantic wiki, which combines the advantage of queryable structured data with a web-based frontend for collaborative editing of the content. The presented application is based on Semantic MediaWiki (SMW), an extension of the well-known MediaWiki software, the widely used wiki system developed and maintained by the Wikimedia Foundation as the software base of Wikipedia. The SMW extension allows structured semantic data to be entered on wiki pages. This data can then be queried through several interfaces within the wiki, through the MediaWiki API, and through a SPARQL endpoint for access from external applications. Query results can be exported in several well-known formats, such as CSV, XML and JSON. It is also possible to display query results directly in the wiki, using a number of provided display formats such as tables, data graphs or maps. The presented system allows a data basis to be developed and maintained collaboratively, and thus implements a collaborative research environment (CRE).
The data can be exported for use in scientific software packages, e.g. for statistical or GIS analysis. A further outcome of the approach is a domain data model of the structured information stored in the system, which can be formalised and mapped to existing databases to allow data integration between applications. Details of published and unpublished archaeological and geological sites/localities in East Africa are collected in the presented wiki, including their bibliographical references. For example, from sediment records, results from available sedimentological/chemical/biological proxy data (e.g. grain size, total organic carbon, stable isotopes, diatoms, ostracods, magnetic susceptibility) are copied into the database, including their spatial resolution. Related dating samples (i.e. 14C, OSL, TSL) are also included, with their metadata and lab codes.
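External access to such a wiki typically goes through SMW's `ask` action of the MediaWiki API. The sketch below only builds the request URL (no network call); the wiki address, category and property names are hypothetical placeholders, not the actual schema of the presented system:

```python
from urllib.parse import urlencode

def build_ask_url(api_base, conditions, printouts, fmt="json"):
    """Build a MediaWiki API request URL for SMW's 'ask' action.
    `conditions` become [[...]] filters, `printouts` become |?...
    properties to return for each matching page."""
    query = "".join(f"[[{c}]]" for c in conditions)
    query += "".join(f"|?{p}" for p in printouts)
    params = {"action": "ask", "query": query, "format": fmt}
    return api_base + "?" + urlencode(params)

# Hypothetical example: all pages in a "Site" category located in
# East Africa, returning two (assumed) semantic properties.
url = build_ask_url(
    "https://example.org/w/api.php",
    ["Category:Site", "Located in::East Africa"],
    ["Has coordinates", "Has dating method"],
)
```

Fetching such a URL would return the query result in the requested export format (here JSON), which can then be loaded into statistical or GIS software.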