Hierarchical joint classification models for multi-resolution, multi-temporal and multi-sensor remote sensing images: Application to natural disasters

20 December 2017

10 - 11am

Venue: PLT2 (303-G02), Building 303

Location: 38 Princes Street, City Campus

Host: Department of Computer Science

Cost: Free - all welcome

Professor Josiane Zerubia

Monitoring the Earth's surface, for example to protect against environmental disasters such as floods or earthquakes, plays an important role in socioeconomic activity.

Rapid and reliable assessment of the changes and damage caused by a disaster calls for accurate and timely classification of large volumes of heterogeneous very-high-resolution data from latest-generation satellites such as Pleiades, GeoEye, COSMO-SkyMed, or RadarSat-2. The main difficulty in exploiting multi-band, multi-resolution, multi-date, and multi-sensor imagery lies in developing classifiers that are sufficiently powerful and flexible.

Professor Josiane Zerubia and colleagues propose to fuse multi-date, multi-sensor, and multi-resolution data through explicit joint probability models of multi-sensor and multi-temporal images, based on novel hierarchical Markov random fields. The resulting efficient supervised classifiers use the maximum posterior marginal (MPM) criterion to extract multi-temporal and/or multi-sensor information from images taken over the same area at different times, with different sensors, and/or at different spatial resolutions.
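To give a flavour of the underlying idea, the sketch below illustrates MPM-style decision making in a deliberately simplified setting: per-pixel classification fusing two co-registered synthetic "sensors" with Gaussian class-conditional likelihoods assumed conditionally independent given the class label. This is a toy illustration, not the hierarchical MRF model presented in the talk (which couples labels across scales and dates); all class means, noise levels, and array sizes here are invented for the example.

```python
import numpy as np

# Toy illustration (NOT the speakers' implementation): per-pixel maximum
# posterior marginal (MPM) classification fusing two co-registered sensors.
# Assumptions: Gaussian class-conditional likelihoods per sensor, sensors
# conditionally independent given the class, uniform class prior.

rng = np.random.default_rng(0)

K = 3                                   # number of land-cover classes (assumed)
H = W = 8                               # toy image size
means_opt = np.array([0.2, 0.5, 0.8])   # hypothetical optical class means
means_sar = np.array([0.7, 0.4, 0.1])   # hypothetical SAR class means
sigma = 0.1                             # common noise level (assumed)

# Simulate a ground-truth label map and one noisy image per sensor.
truth = rng.integers(0, K, size=(H, W))
optical = means_opt[truth] + sigma * rng.standard_normal((H, W))
sar = means_sar[truth] + sigma * rng.standard_normal((H, W))

def gaussian_loglik(img, means, sigma):
    """log p(x | class k) for each pixel and class; returns shape (H, W, K)."""
    diff = img[..., None] - means[None, None, :]
    return -0.5 * (diff / sigma) ** 2

# With a uniform prior, the joint log-posterior is the sum of the
# per-sensor log-likelihoods (conditional independence assumption).
logpost = (gaussian_loglik(optical, means_opt, sigma)
           + gaussian_loglik(sar, means_sar, sigma))

# MPM decision: choose, per pixel, the class maximising the posterior marginal.
labels = logpost.argmax(axis=-1)

accuracy = (labels == truth).mean()
print(f"fused classification accuracy: {accuracy:.2f}")
```

In the actual models discussed in the seminar, the posterior marginals are not computed pixel by pixel but propagated through a hierarchical Markov random field, so that spatial context and multi-scale, multi-temporal structure regularise the decision.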

These classifiers have been experimentally validated on complex optical multi-spectral (Pleiades), X-band SAR (COSMO-SkyMed), and C-band SAR (RadarSat-2) imagery acquired after the Haiti earthquake. Comparisons with state-of-the-art methods confirmed the effectiveness of the techniques developed by Professor Zerubia's team in fusing multiple data sources for classification: in particular, higher accuracy and spatial regularity of the output classification maps, fewer spatial artifacts, and a promising tradeoff with computation time.