
GIScience News Blog

News of Heidelberg University’s GIScience Research Group.


A Multi-Sensor Fusion Framework Based on Coupled Residual Convolutional Neural Networks

Jun 30th, 2020 by Hao Li

Multi-sensor remote sensing image classification has been considerably improved by deep learning feature extraction and classification networks. In this recent paper, we propose a novel multi-sensor fusion framework (CResNet-AUX) for the fusion of diverse remote sensing data sources. The novelty of this paper is grounded in three important design innovations:

  • A unique adaptation of the coupled residual networks to address multi-sensor data classification;
  • An auxiliary training strategy that adjusts the loss function to handle classification with limited training samples;
  • A unique design of the residual blocks to reduce the computational complexity while preserving the discriminative characteristics of multi-sensor features.
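The auxiliary-training idea in the second bullet can be illustrated as a weighted sum of a main loss on the fused prediction and down-weighted losses on each sensor branch. This is only a minimal sketch of the general concept, not the paper's exact formulation; the helper names and the `aux_weight` hyperparameter are illustrative assumptions.

```python
import numpy as np

def cross_entropy(probs, labels, eps=1e-12):
    """Mean cross-entropy given class probabilities and integer labels."""
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + eps))

def auxiliary_loss(fused_probs, branch_probs, labels, aux_weight=0.3):
    """Main loss on the fused prediction plus down-weighted auxiliary
    losses on each per-sensor branch; the auxiliary terms keep each
    branch discriminative when labeled samples are scarce.
    `aux_weight` is an illustrative hyperparameter, not a paper value."""
    main = cross_entropy(fused_probs, labels)
    aux = sum(cross_entropy(p, labels) for p in branch_probs)
    return main + aux_weight * aux

# Toy example: 2 samples, 3 classes, two sensor branches (e.g. HSI + LiDAR).
labels = np.array([0, 2])
fused = np.array([[0.8, 0.1, 0.1], [0.1, 0.1, 0.8]])
hsi   = np.array([[0.6, 0.2, 0.2], [0.2, 0.2, 0.6]])
lidar = np.array([[0.5, 0.3, 0.2], [0.3, 0.2, 0.5]])
loss = auxiliary_loss(fused, [hsi, lidar], labels)
```

During training, the gradient of the auxiliary terms flows back through each branch separately, so every sensor stream receives a direct supervision signal in addition to the fused one.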

The proposed classification framework is evaluated on three remote sensing datasets: the urban Houston University datasets (Houston 2013 and the training portion of Houston 2018) and the rural Trento dataset. It achieves high overall accuracies of 93.57%, 81.20%, and 98.81% on the Houston 2013, Houston 2018 (training portion), and Trento datasets, respectively. Moreover, the experimental results demonstrate considerable improvements in classification accuracy over existing state-of-the-art methods.

More importantly, the proposed CResNet-AUX is designed as a fully automatic, generalized multi-sensor fusion framework: the network architecture is largely independent of the input data types and not limited to specific sensor systems. Our framework is therefore applicable to a wide range of multi-sensor datasets in an end-to-end, wall-to-wall manner.

Future work on intelligent and robust multi-sensor fusion methods may benefit from the insights presented in this paper. In further research, we plan to test the performance of our framework in a large-scale application (e.g., continental and/or planetary land use land cover classification) and to include additional types of remote sensing data. Find more details in the paper:

Li, H.; Ghamisi, P.; Rasti, B.; Wu, Z.; Shapiro, A.; Schultz, M.; Zipf, A. (2020) A Multi-Sensor Fusion Framework Based on Coupled Residual Convolutional Neural Networks. Remote Sensing. 12, 2067. https://doi.org/10.3390/rs12122067

Tags: classification, deep learning, LULC, machine-learning, multi-sensor fusion, residual neural networks

Posted in Digital Earth, Land use, Publications, Research, VGI Group
