
Job advertisement: Heidelberg University – GIScience

Research Associate (Wissenschaftliche:r Mitarbeiter:in) in Geoinformatics - GeCO Project

GeCO: Generating high-resolution CO2 maps by Machine Learning-based geodata fusion

Are you interested in climate change, greenhouse gas emissions, and innovative geoinformatics methods?
As part of the cooperative project GeCO, funded by the Heidelberg Center for the Environment (HCE) through the Excellence Strategy, we are looking for a research associate (m/f/d) to start as soon as possible. Within GeCO, the GIScience group develops methods for generating spatially high-resolution CO2 emission inventories using methods from Spatial Data Science and Machine Learning (especially Deep Learning). The goal is to produce input data for the subsequent modeling of atmospheric transport by the project partners from environmental physics. To this end, several geodata sets on land use and other relevant sources of greenhouse gases (industry, transport, housing, waste, agriculture, etc.) will be used. An important data source is OpenStreetMap (OSM), which contains geodata on buildings, industrial facilities, transport infrastructure, and land use that provide the basis for the localization of emissions. The question of data quality will be investigated and used for the evaluation of OSM objects. See: https://www.geog.uni-heidelberg.de/gis/geoco.html

We offer an attractive (part-time) position in an interdisciplinary, dynamic team working in a highly topical research field. The group is, among other things, a member of the university's Interdisciplinary Center for Scientific Computing (IWR) and a founding member of the Heidelberg Center for the Environment (HCE). The affiliated institute HeiGIT gGmbH transfers the research results into practical applications. As a University of Excellence located in one of Germany's most attractive cities, Heidelberg University offers a particularly stimulating interdisciplinary research environment with many development opportunities and attractive further training offers.

We expect a university degree completed with above-average results, or a doctorate, in geoinformatics, computer science, geography, or a related discipline. In addition to strong team spirit and high motivation, we require excellent and broad methodological competence and research experience in geoinformatics, in particular in some of the areas mentioned above (Spatial Data Science, geodata fusion, spatial disaggregation, machine learning, deep learning, programming, geodatabases), effective and efficient handling of very large heterogeneous geodata sets, in-depth knowledge of OpenStreetMap, the ability to carry out independent scientific work and project management, and excellent communication and presentation skills.

The position is to be filled as soon as possible and is initially limited until 08/2023 (part-time). Remuneration follows TV-L E13. Please send your complete application documents (certificates, references, etc.) as soon as possible, and by 10 January 2022 at the latest (or until the position is filled), digitally to bettina.knorr@uni-heidelberg.de. There is the option of writing a doctoral thesis on the topic. Severely disabled applicants will be given preference in the case of equal suitability.

We look forward to receiving your application!

GIScience Research Group

Heidelberg University

Prof. Dr. Alexander Zipf

Institute of Geography · INF 368 · 69120 Heidelberg

PDF: ausschreibunggisciencegeco2021.pdf

This week, on 29 November, it was announced that our team member Nina Krašovec received the Nahtigal Award from the Faculty of Arts of the University of Ljubljana (UL) for her master's thesis "Detection of standing dead trees using leaf-on and leaf-off UAV-borne laser scanning point cloud data in mixed forests". The research was conducted under the supervision of Assist. Prof. Dr. Blaž Repe (UL) in collaboration with the 3DGeo group and was co-supervised by Prof. Dr. Bernhard Höfle.

The committee of the Department of Geography acknowledged her outstanding research and nominated her for the award. The Faculty received 30 theses from its departments. The committee thoroughly reviewed all submitted works, the evaluations from the supervisors, and the justifications from the departmental committees, and selected four recipients of the Nahtigal Award. All four were also nominated for the prestigious Prešeren Prize for Students of the University of Ljubljana (one of whom received it).

The Nahtigal Award is named in memory of Rajko Nahtigal, a Slovenian Slavicist, philologist, academic, and pedagogue. He was the first dean of the Faculty of Arts as well as the first president of the Slovenian Academy of Sciences and Arts.

We also want to congratulate the other awardees for their outstanding work.

The next lecture of the Heidelberg Geographical Society (HGG) will take place ONLINE on Tuesday, 30 November 2021, at 7:15 pm:

Tuesday, 30 November 2021, 7:15 pm
Prof. Dr. Alexander Brenning (University of Jena)

Landslide modeling under the influence of climate and land-use change using data science methods

Regional-scale empirical hazard assessments for natural hazards use innovations from data science to identify at-risk terrain units more precisely and efficiently. In the context of landslides, machine learning methods are used in particular for the automated mapping of landslide inventories and the creation of susceptibility maps. The estimation of the effects of climate and land-use change is also increasingly in focus. Challenges arise from biases in input data, model overfitting, and the need for extrapolation. Compared with pure black-box models, hybrid modeling approaches that integrate process-based elements promise improved plausibility, interpretability, and transferability. The talk gives an overview of methodological approaches based on case studies from current research projects.

HGG program for the winter semester 2021/22
(PDF flyer)

On Friday, November 26th, Dr. Carolin Klonner (GIScience Research Group) and Melanie Eckle-Elze (HeiGIT) will be supporting the Urban Context Unit of German Red Cross (GRC) in the “Data and Digitalisation in Urban Humanitarian Action” session at the Virtual Conference of the Red Cross Red Crescent Urban Collaboration Platform 2021.

The focus of the session is on the potentials and limitations of digital solutions in the context of humanitarian assistance. Participants will have the chance to learn about different developments and deployments of digital solutions in urban humanitarian action but also the related complexities and challenges.

The panel and audience will discuss and share experiences about the (dis) advantages of digital solutions in humanitarian assistance and potential approaches to collaboratively overcome current hindrances.

Sound interesting? If you want to join the discussion, please follow the link and join us tomorrow at 1 pm CET. We are very much looking forward to a lively discussion.

Registration for our Innsbruck Summer School of Alpine Research 2022 is now open (until 15 January 2022). If you want to learn innovative practical and methodological skills to characterize complex terrain and object features using close-range and remote sensing techniques - apply now!

The Summer School in the Ötztal Alps in Austria will be the fourth edition after three successful implementations in 2015, 2017, and 2019.

All details regarding learning objectives, keynote speakers, contents and program, registration and deadlines are given on the Summer School website: https://www.uibk.ac.at/geographie/summerschool/2022/

The 3DGeo Research Group is co-organizing the summer school and will lead assignments about 4D monitoring of high-mountain phenomena (4D rocks!).

At long last, welcome back to a new blog post in the How to become ohsome series. As it has been quite a while since you got an introduction to how to access the ohsome API, we would like to pick up this topic one more time this month. The former post with different ways to access the ohsome API can be found here. Some new tools are available to help you analyze OpenStreetMap data. Below is a brief overview of the ohsome-py package, the ohsome R package, the ohsome QGIS plug-in ohsomeTools, and ohsome2x. For each of the clients, we provide an example of how to query for monthly counts of school buildings (→ building=school) which are mapped as building outlines (→ type:way) within a given boundary ("yourboundary.geojson").
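All of the clients below wrap the same HTTP POST request to the ohsome API. As a minimal sketch of what they send under the hood (parameter names follow the public ohsome API documentation; the empty boundary is just a placeholder), the payload for our example query can be assembled by hand:

```python
import json

OHSOME_COUNT_ENDPOINT = "https://api.ohsome.org/v1/elements/count"

def build_count_request(boundary_geojson: dict) -> dict:
    """Assemble the POST parameters for the /elements/count endpoint."""
    return {
        # boundary polygon(s), passed as a GeoJSON string
        "bpolys": json.dumps(boundary_geojson),
        # monthly snapshots from January 2010 to November 2021
        "time": "2010-01-01/2021-11-01/P1M",
        # school buildings mapped as building outlines (ways)
        "filter": "type:way and building=school",
    }

# Example with an empty placeholder boundary:
payload = build_count_request({"type": "FeatureCollection", "features": []})
```

Posting this payload with any HTTP client to the endpoint above returns the same monthly counts that the clients below retrieve and parse for you.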

Access via python with the ohsome-py package:

If you want to analyze data with Python, the ohsome-py package can be your tool of choice; it can be easily installed using pip. Installation instructions and an explanation of how to use it are listed here. For an even more in-depth introduction to using the ohsome-py package, you can check out this use case about public green spaces.

A request with ohsome-py could look like this:

import ohsome
import geopandas as gpd
client = ohsome.OhsomeClient()

bpolys = gpd.read_file("/yourpath/yourboundary.geojson")
example = client.elements.count.post(
    bpolys=bpolys,
    time="2010-01-01/2021-11-01/P1M",
    filter="type:way and building=school",
)
example_df = example.as_dataframe()


Access via R with the ohsome R package:

If you prefer to work with R, don't worry, we have got you covered! Another of our projects is the ohsome R package, which allows you to send requests to the ohsome API via R. Again, you can find a very detailed explanation of how to install and use it here.

The ohsome R package is currently at an experimental stage. You can install it from GitHub. While all kinds of queries to the ohsome API are possible with the package, it currently works most comfortably for OSM element aggregation. A new version with full functionality is expected to be submitted to CRAN in the near future.

A request with the ohsome R package could look like this:


library(ohsome)
library(sf)

htbo_example <- read_sf("yourboundary.geojson")
query <- ohsome_elements_count(
    boundary = htbo_example,
    filter = "type:way and building=school",
    time = "2010-01-01/2021-11-01/P1M"
)
example <- ohsome_post(query)


Access via ohsomeTools:

If you want to analyze your data with QGIS and visualize it on a map, then the ohsome QGIS Plug-In (ohsomeTools) is the right tool for you. With it you can access the ohsome API directly in QGIS instead of sending a separate request and loading the data into your GIS. Please note that only QGIS v3.14 or newer is supported!

A very convenient feature of this tool is the automatic activation of the QGIS native temporal controller, if the geometry is suitable. Note that ohsomeTools has not yet been released in a public repository, but this will happen as soon as a suitable version of it is ready.

Again, you can find a short introduction on how to install and use the tool here. As you will surely notice, sending a request and working with the output file takes only a few clicks and little time, which makes this tool a great addition for QGIS-based analyses.

A request with ohsomeTools could look like this:


Access via ohsome2x:

Last but not least, our Node.js client "ohsome2x" must not be left out. It allows you to access the ohsome API and store the output using either a command-line tool called "ohsome2x-cli" or as a library in your Node.js scripts (JavaScript and TypeScript). For a small number of boundaries, you can query the ohsome API in one go and store the results in a simple GeoJSON output file, but the strength of ohsome2x is batch processing of thousands or millions of boundaries coming from a PostGIS database. ohsome2x then queries the ohsome API step by step with parts of your input data and also takes care of storing and usefully indexing the results in a PostGIS output table.

Of course, you can find more information about installation and usage in the npm registry and its repository.

A request with ohsome2x could look like this:

{
  "ohsomeQuery": {
    "queryType": "elements/count/groupBy/boundary",
    "filter": "building=school and type:way",
    "time": "2010-01-01/2021-11-01/P1M"
  },
  "source": {
    "geometryId": "id",
    "name": "yourboundary.geojson",
    "store": { "path": "yourboundary.geojson", "type": "geojson" }
  },
  "target": {
    "horizontalTimestampColumns": false,
    "createGeometry": true,
    "transformToWebmercator": false,
    "storeZeroValues": true,
    "computeValuePerArea": true,
    "name": "htbo_ohsome2x_example_output.geojson",
    "store": { "path": "htbo_ohsome2x_example_output.geojson", "type": "geojson" }
  }
}



Below you can see visualizations of the output dataset from our example request (count of school-buildings):

This first visualization was created with the OSM boundaries for the arrondissements of Paris; the request was sent with ohsome2x, and the visualization was generated with QGIS.

The second visualization used the OSM boundaries of Paris as well; here, the request and plot were both generated with R.

Thank you for reading our new blog post in the "How to become ohsome" series! We hope it was a helpful addition to the previous posts.

Background info: the aim of the ohsome OpenStreetMap History Data Analytics Platform is to make OpenStreetMap’s full-history data more easily accessible for various kinds of OSM data analytics tasks, such as data quality analysis, on a regional, country-wide, or global scale. The ohsome API is one of its components, providing free and easy access to some of the functionalities of the ohsome platform via HTTP requests. Some intro can be found here:

This Thursday, 25 November 2021, we will present our cooperative project SocialMedia2Traffic at the Fachaustausch Geoinformation on the topic of "Smarte Region Rhein-Neckar".

Up-to-date traffic information is a prerequisite for navigation solutions to determine the best route and accurate travel times. However, such information is currently not openly available. In the SocialMedia2Traffic project, we are developing a method to extract traffic information from georeferenced social media data and make it openly available. At the Fachaustausch Geoinformation, we will present our concept, parts of the methodology, and first results in the session "Smarte Beiträge aus dem Netzwerk" at 3:30 pm.

The event is organized by GeoNet.MRN e.V. and takes place online. Participation is free of charge. The program and access links are available here.

The project SocialMedia2Traffic (SM2T) by HeiGIT and GIScience Heidelberg is funded by the Federal Ministry of Transport and Digital Infrastructure within the mFUND ("Modernitätsfonds") funding program.

Recently, a new project has started in the context of Climate Change Action research:

GeCO: Generating high-resolution CO2 maps by Machine Learning-based geodata fusion and atmospheric transport modelling

The spatiotemporal distribution of greenhouse gases and their sources on Earth has so far been considered mainly at relatively coarse resolutions. There is a lack of sound information on local emissions and their spatiotemporal distribution. However, these data are urgently needed to design local options for action in climate change mitigation and to validate mitigation efforts.

In the GeCO project, significant progress will be achieved by very high-resolution detection and modeling of the sources of selected greenhouse gases, as well as their turbulent dispersion in the atmosphere. Both the modeling of the distribution of the sources and the dispersion of the gases will be done by means of computer-aided, data-intensive methods through a collaboration of expertise from environmental physics and geoinformatics. Current methods such as machine learning (ML) and geospatial data fusion are used. Despite the high resolution, computation time is kept low by using a so-called catalog approach, which enables the dispersion calculation of greenhouse gases over longer time periods.

Partners from geoinformatics and environmental physics are working in a team on data-driven methods to analyze the sources of greenhouse gases on the Earth’s surface and their turbulent dispersion in the atmosphere. The heterogeneous database for the sources of greenhouse gases on the Earth’s surface consists of diverse sensor readings, citizen science observations, the social web, and official and other statistical data. The discovered patterns are validated by measured values. Machine Learning (ML) methods are used and further developed for the analysis and fusion of the heterogeneous input data and generation of spatially (and temporally) highly resolved emission inventories of relevant greenhouse gases.

The GIScience Research Group (IWR, HCE) develops the methods for data preparation for the emission inventories using methods of Spatial Data Science and ML (esp. deep learning). For this purpose, different geodata sets, in particular on land use and other relevant sources of greenhouse gases (industry, transport, housing, waste, agriculture, etc.) will be analyzed. An important data source is OpenStreetMap (OSM), which contains high-resolution spatial information on buildings, industrial facilities, transport infrastructure and also land use information, which provides the basis for the localization of emissions.
Challenges include the spatially heterogeneous data quality of OSM on the one hand, and the spatially heterogeneous emission factors on the other hand. The question of data quality is investigated and used for the evaluation of the OSM objects.

The team at the Institute of Environmental Physics (IWR, HCE) uses the resulting high-resolution emission maps to set up turbulent dispersion calculations for the dispersion of greenhouse gases in the atmosphere on top of them, and to compare them at the end with in-situ concentration measurements for validation purposes.

Funded by the HCE (Heidelberg Center for the Environment), supported by the DFG Excellence Strategy.
Funding Period: 2021-2023
Principal investigators:
Alexander Zipf, GIScience HD Heidelberg University; Sven Lautenbach (HeiGIT gGmbH).
Sanam Vardag, Institut für Umweltphysik; André Butz, Institut für Umweltphysik; Heidelberg University.

The project builds upon earlier work in the context of the HCE project “Klimahandeln fundiert gestalten, Vergleichende Analyse Ba-Wü / Californien”

This year, HeiGIT is again participating in the HOT Summit, with the topic "The Evolution of Local Humanitarian Open Mapping Ecosystems: Understanding Community, Collaboration, and Contribution". The conference will again be virtual and is organized in three blocks across several time zones to allow global participation.

Benjamin Herfort from HeiGIT will give a talk about MapSwipe and how the app can be used to go beyond mapping buildings. In the talk, he will show examples where MapSwipe has been used to map damage or solid waste. Furthermore, he will briefly present ongoing work funded by the American Red Cross to introduce 'building footprint review' projects in MapSwipe. This project type will be applied to investigate the quality of OpenStreetMap building footprints mapped through the HOT Tasking Manager. If you want to know how MapSwipe started back in 2016, check out our talk at the HOT Summit 2016 on YouTube: https://www.youtube.com/watch?v=pRZ_mWn0Lmc&t=1118s

Sami Petricola will give a lightning talk about his internship work at HeiGIT. He will present a method to assess the impact of flood disasters on road network criticality for healthcare access, using the case of Cyclone Idai in Mozambique in March 2019. The analysis provides an indicator of network criticality that can support risk preparedness and mitigation by prioritizing the road segments that ensure physical access to health facilities.

Figure 1: MapSwipe Damage Mapping Results

It’s our 6th consecutive HOT summit. You can find out more about our past contributions here:

On Tuesday, 16 November 2021, Dr. Katharina Anders successfully defended her PhD research.

In her dissertation, Katharina developed 4D change analysis methods to extract natural surface changes from near-continuous LiDAR time series of dynamic landscapes.

Congratulations, Katharina, for this excellent research - we are very proud to have you in our team and wish you all the best for the future!

The defense took place in a hybrid format with Prof. Olaf Bubenzer, Jun.-Prof. Anna Growe, Prof. Bernhard Höfle and Prof. Maggi Kelly as members of the committee.

4D PhD hat.

Using time series-based change analysis, the developed approach takes advantage of the history of surface change by performing spatiotemporal segmentation with the concept of 4D objects-by-change (4D-OBCs). The method identifies areas in the scene where surfaces change similarly over time in sub-periods of the full time series at neighboring locations.

Key features of time series-based change analysis are:

  • Removing the requirement to select and predefine periods for the analysis of changes
  • Extracting change forms at different (unknown) timing, change rates, durations of change processes, and persistence of (temporary) change forms
  • Separating spatially overlapping changes, which might be aggregated in bitemporal change information of a scene

This enables a generic extraction of surface changes in their varying spatial and temporal extents from large and dense 4D geospatial data.

Concept of research in the dissertation.

More details on Katharina’s research can be found e.g. in the publications below and on the project website:
