Tutorials

Registration Deadline: June 30, 2018

Tutorial Pricing

Tutorials (each), Sunday, July 22:

                      Advance Rate    Regular Rate
                      (by May 18)     (after May 18)
Regular, Full-Day     USD $225        USD $250
Regular, Half-Day     USD $125        USD $150
Student, Full-Day     USD $125        USD $150
Student, Half-Day     USD $100        USD $125

A Full-Day Tutorial includes tutorial materials, a morning coffee/tea break, lunch, and an afternoon coffee/tea break.

A Half-Day Tutorial includes tutorial materials and a coffee/tea break.

Summary

Full-Day Tutorials: Sunday, July 22, 09:30 - 18:00
FD-1: Remote Sensing with GNSS Reflectometry (GNSS-R) and Signals of Opportunity (SoOp) James L. Garrison, Purdue University; Estel Cardellach, Institute of Space Sciences (ICE-CSIC/IEEC); and Adriano Camps, Universitat Politècnica de Catalunya (UPC)
FD-2: DART 3D radiative transfer model: an efficient tool for remote sensing studies J.P. Gastellu-Etchegorry, CESBIO, Toulouse University (Paul Sabatier University, CNRS, CNES, IRD), France; T. Yin, NASA GSFC, Washington, USA, CESBIO, Toulouse University, France; and Jianbo Qi, Beijing Normal University, China
FD-3: Open Data Cube - A new way to manage satellite data utilizing an open source platform Brian Killough, NASA; Syed Rizvi and Sanjay Gowda, Analytical Mechanics and Associates Inc.
FD-4: High Performance and Cloud Computing for Remote Sensing Data Gabriele Cavallaro, Ahmed Shiraz Memon, Jülich Supercomputing Centre; Ernir Erlingsson, University of Iceland; Juan Mario Haut, Mercedes Paoletti, Antonio Plaza, and Javier Plaza, University of Extremadura
FD-5: Machine Learning in Remote Sensing - Best practices and recent solutions Devis Tuia, Alexandre Boulch, Yuliya Tarabalka, Ronny Hänsch; Laboratory of GeoInformation Science and Remote Sensing - Wageningen University and Research - The Netherlands, Information Processing Dpt at ONERA - The French Aerospace Laboratory, TITANE team of Inria Sophia-Antipolis Méditerranée, Computer Vision & Remote Sensing group of the Technische Universität Berlin
FD-6: SAR and optical data fusion with hands-on session using the ESA Toolbox SNAP Lorenzo Bruzzone, Universitá di Trento; Miguel Castro Gómez, Serco SpA; Cécile Cazals, CS-SI; Francesco Sarti, ESA; Mickaël Savinaud, CS-SI; Tereza Šmejkalová, Serco SpA
FD-7: Earth Observation Big Data Intelligence: theory and practice of deep learning and big data mining Mihai Datcu, DLR; Feng Xu, Fudan University
FD-8: SAR Polarimetry & Applications for Current (Sentinel 1) & New (GF3, Biomass, SAOCOM, RCM) Missions Carlos Lopez-Martinez, Luxembourg Institute of Science and Technology; Eric Pottier, Universite de Rennes-1
FD-10: Spectroscopic, fluorescence & thermal observations for SIF & physiological processes Alasdair MacArthur, University of Edinburgh; Luis Alonso and Juan Carlos Jiménez-Muñoz, University of Valencia
Morning Half-Day Tutorials: Sunday, July 22, 09:30 - 11:00
HD-1: Satellite based L-Band observation of land surfaces Ahmad AL Bitar, CESBIO/CNRS; Susanne Mecklenburg, ESA; Arnaud Mialon and Nemesio Rodriguez-Fernandez, CESBIO/CNRS
Afternoon Half-Day Tutorials: Sunday, July 22, 14:30 - 18:00
HD-4: Spectrum Management, Detection and Mitigation of RFI in Microwave Remote Sensing Jasmeet Judge, University of Florida; Jeffrey Piepmeier, NASA Goddard Space Flight Center; Michael Inggs, University of Cape Town; and Paolo de Matthaeis, NASA Goddard Space Flight Center
HD-6: Classification of satellite image time series with the Orfeo ToolBox and QGIS Stéphane May, CNES
HD-7: Introduction to the ARTMO radiative transfer models and retrieval toolboxes Jochem Verrelst and Juan Pablo Rivera-Caicedo, Image Processing Laboratory (IPL), Laboratory of Earth Observation, University of Valencia

Full-Day Tutorials

FD-1: Remote Sensing with GNSS Reflectometry (GNSS-R) and Signals of Opportunity (SoOp)

Presented by

James L. Garrison, Purdue University; Estel Cardellach, Institute of Space Sciences (ICE-CSIC/IEEC); and Adriano Camps, Universitat Politècnica de Catalunya (UPC)

Description

GNSS reflectometry (GNSS-R) methods enable the use of small, low-power, passive instruments. The power and mass of GNSS-R instruments can be made low enough to allow deployment on small satellites, balloons and UAVs. Early satellite-based GNSS-R data sets were collected by the UK-DMC satellite (2003), TechDemoSat-1 (2014) and the 8-satellite CYGNSS constellation (2016). GEROS-ISS (GNSS REflectometry, Radio-Occultation and Scatterometry on the International Space Station) will demonstrate GNSS-R altimetry. The availability of spaceborne GNSS-R data, and the development of new applications from these measurements, are expected to increase significantly following the launch of these new satellite missions.

Recently, GNSS-R methods have been applied to satellite transmissions at other frequencies, ranging from P-band (230 MHz) to K-band (18.5 GHz). These so-called "Signals of Opportunity" (SoOp) methods enable microwave remote sensing outside of protected bands, using frequencies allocated to satellite communications. Measurements of sea surface height, wind speed, snow water equivalent, and soil moisture have been demonstrated with SoOp.

This all-day tutorial will summarize the current state of the art in physical modeling, signal processing and application of GNSS-R and SoOp measurements from fixed, airborne and satellite-based platforms.

FD-2: DART 3D radiative transfer model: an efficient tool for remote sensing studies

Presented by

J.P. Gastellu-Etchegorry, CESBIO, Toulouse University (Paul Sabatier University, CNRS, CNES, IRD), France; T. Yin, NASA GSFC, Washington, USA, CESBIO, Toulouse University, France; and Jianbo Qi, Beijing Normal University, China

Description

Objective: to learn the DART 3D radiative transfer model (http://www.cesbio.ups-tlse.fr/dart) and how to use it on your own machine:

  • to better understand the radiometric mechanisms that lead to spectro-radiometer images, LiDAR data, fluorescence and the radiative budget of complex 3D landscapes;
  • for sensitivity studies and inversion approaches (e.g., creation of LUTs).

Agenda

  • 60 minutes: Overview of scientific questions related to DART, including the major radiometric terms used in optical remote sensing.
  • 30 minutes: Interactive presentation of DART's major functionalities and Graphical User Interface. From this stage on, participants work on their laptops; DART is installed if needed.
  • 90 minutes: Exercises about reflectance in the short waves and thermal infrared emission, for 2D landscapes.
  • 30 minutes: Creation of 3D mock-up: trees, crops, houses, atmosphere, topography.

Then, depending on participant expectations (e.g., spectroradiometer, LiDAR, fluorescence, radiative budget, ...), several options are possible:

  1. Some or all participants focus on parts of DART of direct interest for them
  2. Participants focus their work on one or two case studies proposed by the participants
  3. Participants continue to work with the general program:
    • 60 minutes: crop field: spectroradiometer, LiDAR, fluorescence
    • 45 minutes: forest plot: spectroradiometer, LiDAR, radiative budget
    • 30 minutes: urban landscape: spectroradiometer, DART calibration with satellite images (Landsat, Sentinel)

Requirements

In order to make the tutorial more efficient, participants are advised:

FD-3: Open Data Cube - A new way to manage satellite data utilizing an open source platform

Presented by

Brian Killough, NASA; Syed Rizvi and Sanjay Gowda, Analytical Mechanics and Associates Inc.

Description

The Open Data Cube (ODC), created and facilitated by the Committee on Earth Observation Satellites (CEOS), is an open source data architecture that allows analysis-ready satellite data to be packaged in "cubes" to minimize data preparation complexity and take advantage of modern computing for increased value and impact of Earth observation data. The ODC is a common analytical framework that includes API development, cloud integration, a web-based user interface, and data analytics to facilitate the organization and analysis of large, gridded data collections. Based on analysis-ready data from current CEOS satellite systems, the ODC is a technological solution that removes the burden of data preparation, yields rapid results, and draws on an international community of contributors. The ODC is currently operating in Australia, Colombia and Switzerland, with plans to expand operations to more than 20 countries by the year 2020.

Objective

This workshop will provide a hands-on introduction to the Open Data Cube including the topics of data cube creation and application analyses. The target audience is scientists, researchers, and students with knowledge or experience in satellite imagery interpretation and a desire to learn new and innovative data analysis techniques. Participants will be required to bring a laptop computer with internet access. The course will utilize online cloud-based computing resources for all course demonstrations and participant tasks.
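
To give a flavour of the workflow, the following is a minimal sketch using the ODC Python API; it assumes a configured Data Cube instance, and the product name, extent, CRS and band names are placeholders rather than the datasets used in the workshop.

```python
# Minimal Open Data Cube sketch: load analysis-ready Landsat data and compute NDVI.
# The product name, extent, CRS and band names below are placeholders and will
# differ for each Data Cube deployment.
import datacube

dc = datacube.Datacube(app="odc-tutorial-demo")

ds = dc.load(
    product="ls8_usgs_sr_scene",           # hypothetical indexed product
    x=(35.0, 35.3), y=(-0.6, -0.3),        # longitude/latitude extent
    time=("2017-01-01", "2017-12-31"),
    measurements=["red", "nir"],
    output_crs="EPSG:32636", resolution=(-30, 30),
)

# The result is an xarray.Dataset, so band maths works directly on it.
ndvi = (ds.nir - ds.red) / (ds.nir + ds.red)
print(ndvi.mean(dim=["x", "y"]).values)    # mean NDVI per time step
```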

Materials

The following websites may be of interest in preparation for the tutorial:

FD-4: High Performance and Cloud Computing for Remote Sensing Data

Presented by

Gabriele Cavallaro, Ahmed Shiraz Memon, Jülich Supercomputing Centre; Ernir Erlingsson, University of Iceland; Juan Mario Haut, Mercedes Paoletti, Antonio Plaza, and Javier Plaza, University of Extremadura

Description

The development of latest-generation sensors mounted on board Earth observation platforms has forced a re-definition of the challenges across the entire lifecycle of remote sensing data. The acquisition, processing and application phases face problems that are well described by the "Vs" of big data:

  • Volume - the increasing scale of archived data (hundreds of terabytes per day) raises not only storage but also massive analysis issues;
  • Variety - the data are delivered by sensors operating at different spatial, spectral, radiometric and temporal resolutions;
  • Velocity - processing and analysis must keep pace with the rapidly growing rate of data generation;
  • Veracity - the massive amount of data sensed at high speed comes with uncertainty and limited measurement accuracy;
  • Value - the acquired data are not straightforward to interpret and require powerful yet highly accurate processing schemes to extract reliable and valuable information.

Traditional serial methods (desktop approaches such as MATLAB, R, SAS, etc.) have several limitations and become ineffective in the face of these challenges. Even though modern desktop computers and laptops are increasingly multi-core and ever more powerful, they remain limited in memory and core availability. Parallel architectures such as many-core systems (e.g., GPUs) continue to expand to satisfy the growing demands of domain-specific, computationally intensive applications. For large-scale remote sensing applications, High Performance Computing (HPC) and Cloud Computing can overcome the limitations of serial algorithms: clusters, grids and clouds provide tremendous computation capacity and outstanding scalability, underpinned by strong and stable standards that have been used for decades, e.g. the Message Passing Interface (MPI).

The tutorial provides a complete overview for an audience that is not familiar with these topics. It follows a twofold approach: selected background lectures (morning session) followed by practical hands-on exercises (afternoon session), so that participants can go on to perform their own research. The tutorial discusses the fundamentals of supercomputing as well as cloud computing, and how such systems can be used to solve remote sensing problems that require fast and highly scalable solutions.
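
Purely to illustrate the programming model mentioned above, the sketch below distributes a toy per-tile statistic across MPI processes with mpi4py; it is not an excerpt from the tutorial material.

```python
# Minimal MPI sketch with mpi4py: each process handles its own tile of a larger
# scene, computes a local statistic, and rank 0 gathers the results.
# Run with, e.g.:  mpirun -n 4 python mpi_demo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

# Each process generates (or, in practice, would read) its own tile.
tile = np.random.rand(256, 1024).astype(np.float32)
local_mean = tile.mean()

# Collect the per-tile means on the root process.
means = comm.gather(local_mean, root=0)
if rank == 0:
    print("global mean over", size, "tiles:", float(np.mean(means)))
```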

Materials

The morning session provides the general background material, presented by the speakers with their slides. The afternoon focuses on particular remote sensing datasets and on using them in a hands-on fashion: each participant has to bring a laptop. Attendees who use Windows should install a free SSH client (e.g., MobaXterm, PuTTY).

FD-5: Machine Learning in Remote Sensing - Best practices and recent solutions

Presented by

Devis Tuia, Alexandre Boulch, Yuliya Tarabalka, Ronny Hänsch; Laboratory of GeoInformation Science and Remote Sensing - Wageningen University and Research - The Netherlands, Information Processing Dpt at ONERA - The French Aerospace Laboratory, TITANE team of Inria Sophia-Antipolis Méditerranée, Computer Vision & Remote Sensing group of the Technische Universität Berlin

Description

Despite the wide and often successful application of machine learning techniques to the analysis and interpretation of remotely sensed data, the complexity, special requirements and selective applicability of these methods often prevent them from being used to their full potential. The gap between sensor- and application-specific expertise on the one hand, and deep insight into existing machine learning methods on the other, often leads to suboptimal results, unnecessary or even harmful optimizations, and biased evaluations. The aim of this tutorial is twofold. First, to spread good practices for data preparation: inform about common mistakes and how to avoid them (e.g. dataset bias, non-i.i.d. samples), give recommendations on proper preprocessing and initialization (e.g. data normalization), and point to available sources of data and benchmarks. Second, to present efficient and advanced machine learning tools: give an overview of standard machine learning techniques and when to use them (e.g. standard regression and classification techniques, clustering, etc.), and introduce the most modern methods (such as random fields, ensemble learning, and deep learning).
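
As a small example of one of the good practices mentioned above (fitting the normalization inside a pipeline so that test pixels never influence the scaling), here is a scikit-learn sketch on synthetic placeholder data; it is not taken from the tutorial material.

```python
# Data normalization without leakage: the scaler is part of the pipeline, so it
# is re-fitted on the training fold of every cross-validation split.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                       # e.g. 500 pixels, 10 features
y = (X[:, 0] + 0.1 * rng.normal(size=500) > 0).astype(int)

model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

scores = cross_val_score(model, X, y, cv=5)          # scaler fitted per fold
print("accuracy per fold:", scores)
```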

FD-6: SAR and optical data fusion with hands-on session using the ESA Toolbox SNAP

Presented by

Lorenzo Bruzzone, Universitá di Trento; Miguel Castro Gómez, Serco SpA; Cécile Cazals, CS-SI; Francesco Sarti, ESA; Mickaël Savinaud, CS-SI; Tereza Šmejkalová, Serco SpA

Seats are limited to 50 participants on a first-come first-served basis.

Description

A wide range of different and complementary data (radar, optical, IR) from Sentinel-1, -2, -3, and now also from Sentinel-5P, are available under an open and free policy. For some applications, such as agriculture, the synergy between these data has already been demonstrated, and for many others there is growing interest in using radar and optical data jointly to compensate for the limitations of single data products. Combining the weather and illumination independence of SAR sensors and their sensitivity to the size, density, orientation and dielectric properties of targets with the multi-spectral information on leaf structure, pigmentation and moisture captured by optical sensors can provide greater insight and context in many application areas.

The tutorial will open with a theoretical lecture by ESA speakers and experts from academia, introducing the Sentinel missions and providing an overall background on image fusion methods and applications. A hands-on session with data processing exercises on Sentinel-1 and Sentinel-2 data fusion methods and applications will follow, based on open tools developed within the ESA SEOM Programme (with a focus on the ESA SNAP Toolbox and the STEP platform) in the RUS environment.

RUS - Research and User Support for Sentinel core products - is an initiative funded by the European Commission and managed by ESA with the objective of promoting the uptake of Copernicus Sentinel data and supporting R&D activities. It is a free and open service offering not only a powerful computational environment - in the form of customized Virtual Machines (VMs) preinstalled with a wide variety of open source toolboxes - but also EO expertise and support for application-specific data selection, data processing and visualization, as well as algorithm development and scaling-up to large amounts of Sentinel products.

This all-day tutorial will demonstrate the usage of the open tools (ESA SNAP, Orfeo ToolBox, QGIS, etc.) available within the RUS environment for optical and SAR data fusion and introduce three different approaches: 1) composite creation and analysis; 2) temporal resolution increase (complementing optical imagery with SAR for higher temporal resolution monitoring); 3) pixel-based fusion methods (IHS, PCA, HPF, etc.). This will be followed by a practical demonstration of the processing steps related to each of the three approaches, including the necessary pre-processing steps and potential issues, in the form of a hands-on exercise using the RUS environment.
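
To make the third family of approaches concrete, below is a generic sketch of pixel-based fusion by PCA component substitution on synthetic placeholder arrays; the actual RUS exercises rely on the SNAP/OTB/QGIS tools rather than hand-written code.

```python
# PCA component-substitution fusion: project the optical bands into principal
# components, replace the first component with a statistics-matched SAR band,
# and transform back. Arrays here are random stand-ins for co-registered data.
import numpy as np
from sklearn.decomposition import PCA

h, w, bands = 256, 256, 4
optical = np.random.rand(h, w, bands).astype(np.float32)   # e.g. Sentinel-2 bands
sar = np.random.rand(h, w).astype(np.float32)              # e.g. Sentinel-1 backscatter

pixels = optical.reshape(-1, bands)
pca = PCA(n_components=bands)
pcs = pca.fit_transform(pixels)

# Match the SAR band to the mean/std of the first principal component.
pc1, sar_flat = pcs[:, 0], sar.reshape(-1)
pcs[:, 0] = (sar_flat - sar_flat.mean()) / sar_flat.std() * pc1.std() + pc1.mean()

fused = pca.inverse_transform(pcs).reshape(h, w, bands)
print(fused.shape)
```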

Tutorial Learning Objectives

Participants will gain a good overview of, and hands-on experience with, selected methods for optical and SAR data fusion. They will become familiar with the RUS environment and will learn how to request a RUS VM for other EO projects and activities exploiting Copernicus Sentinel data.

FD-7: Earth Observation Big Data Intelligence: theory and practice of deep learning and big data mining

Presented by

Mihai Datcu, DLR; Feng Xu, Fudan University

Description

In the big data era of Earth observation, deep learning and other data mining technologies have become critical to successful end applications. Over the past several years, interest in deep learning techniques applied to remote sensing has grown exponentially, covering not only hyperspectral imagery but also synthetic aperture radar (SAR) imagery.

This tutorial has two parts. The first half introduces the basic principles of machine learning and the evolution towards deep learning paradigms. It presents the methods of stochastic variational and Bayesian inference, focusing on the methods and algorithms of deep learning and generative adversarial networks. Since data sets are an organic part of the learning process, EO dataset biases pose new challenges. The tutorial addresses open questions on relative data bias and cross-dataset generalization for specific EO cases such as multispectral and SAR observations with a large variability of imaging parameters and semantic content.

The second half introduces the theory of deep neural networks and the practices of deep learning-based remote sensing applications. It introduces the major types of deep neural networks, the backpropagation algorithms, programming toolboxes, and several examples of deep learning-based remote sensing imagery processing.
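
For readers new to the toolboxes mentioned, the following toy example sketches the design/compile/train cycle with tf.keras on random placeholder data; it is illustrative only and not the network used in the tutorial.

```python
# Toy convolutional network for patch classification: network design, a standard
# optimizer, cross-entropy loss and backpropagation via model.fit().
# Patch size, class count and data are placeholders.
import numpy as np
import tensorflow as tf

num_classes, patch, channels = 5, 32, 4
x_train = np.random.rand(200, patch, patch, channels).astype("float32")
y_train = np.random.randint(0, num_classes, size=200)

model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(16, 3, activation="relu", input_shape=(patch, patch, channels)),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])

model.compile(optimizer="adam", loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.fit(x_train, y_train, epochs=2, batch_size=32, verbose=0)
```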

Tutorial Learning Objectives

The first part of the tutorial is expected to build a joint understanding of “classical” machine learning and generative adversarial networks, indicating integrated optimal solutions for complex EO applications, including the choice or generation of labeled data sets and the influence of biases in validation or benchmarking. Through the second half of the tutorial, participants are expected to understand the basic theory of deep neural networks, including convolutional neural networks and the backpropagation algorithm, and to learn the relevant skills, including network design, hyper-parameter tuning, training algorithms, dataset preparation, toolbox usage, and result analysis and diagnosis.

FD-8: SAR Polarimetry & Applications for Current (Sentinel 1) & New (GF3, Biomass, SAOCOM, RCM) Missions

Presented by

Carlos Lopez-Martinez, Luxembourg Institute of Science and Technology; Eric Pottier, Universite de Rennes-1

Description

Several spaceborne Polarimetric Synthetic Aperture Radar (PolSAR) systems are currently in operation, such as TerraSAR-X (X-band) launched in June 2007, RADARSAT-2 (C-band) launched in December 2007, Sentinel-1A and -1B (C-band) launched in April 2014 and April 2016, and ALOS-2 (L-band) launched in May 2014. In addition, new and upcoming missions such as BIOMASS (P-band), SAOCOM (L-band), RCM (C-band) and GF-3 (C-band) are designed with polarimetric capabilities.

The availability of spaceborne PolSAR data provides an unprecedented opportunity for applying advanced PolSAR information processing techniques to the important tasks of environmental monitoring and risk management. PolSAR remote sensing offers an efficient and reliable means of collecting the information required to extract quantitative geophysical and biophysical parameters of the Earth's surface. The technique has found many successful applications in crop monitoring and damage assessment, forestry clear-cut mapping, deforestation and burn mapping, land surface structure (geology), land cover (biomass) and land use, hydrology (soil moisture, flood delineation), sea ice monitoring, and ocean and coastal monitoring (oil spill detection). The scope of applications continues to grow thanks to the availability of multitemporal and polarimetric acquisitions.

SAR polarimetry is today a very active area of research in radar remote sensing, and polarimetric applications are starting to become operational, for instance in the frame of Sentinel-1. It is therefore important to train and prepare the next generation of researchers in this important topic.

FD-10: Spectroscopic, fluorescence & thermal observations for SIF & physiological processes

Presented by

Alasdair MacArthur, University of Edinburgh; Luis Alonso and Juan Carlos Jiménez-Muñoz, University of Valencia

Description

SIF is a by-product of photosynthesis: the photon flux emitted by chlorophyll molecules after excitation by sunlight, and the most directly measurable reporter of photosynthetic efficiency. It is therefore a key indicator of the carbon fixation and stress-limited state of photosynthesizing organisms. However, measurements of remotely sensed spectral reflectance and surface temperature (TIR) are also required to fully understand and use the SIF signal. This data combination (reflectance, SIF and TIR) has the potential to greatly advance our understanding of the dynamics of Earth surface photosynthesis, gross primary productivity, and ecosystem change. Retrieving information from these measurements requires a deeper understanding of the interaction of sunlight with the Earth's surface biophysical and biochemical constituents, and of field optical and TIR measurement techniques, than is required for more common approaches, e.g. the use of broadband spectral indices. In recognition of the importance of SIF, the European Space Agency (ESA) is supporting a space-based Explorer mission (FLEX) to measure Earth surface SIF from space and exploit that unique vantage point to study the dynamics of natural processes and the impact of human activity and climate change at local, regional and global scales. There is now an imperative to train the next generation of environmental scientists in these forms of Earth observation, so they can make use of this new capability and understand the physics of photosynthesis, how these measurements are made, how uncertainties are quantified and how data are analyzed.
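
For orientation, one classical way to retrieve SIF from such measurements is Fraunhofer Line Discrimination (FLD), which exploits the in-filling of telluric absorption features such as the O2-A band; the expression below is the standard textbook form and not necessarily the method taught in this course.

```latex
% Standard FLD estimate of fluorescence F (sketch only):
% E = downwelling irradiance, L = upwelling radiance, with "in"/"out" taken at
% wavelengths inside and just outside an absorption line (e.g. O2-A at ~760 nm).
\[
  F \;=\; \frac{E_{\mathrm{out}}\, L_{\mathrm{in}} \;-\; E_{\mathrm{in}}\, L_{\mathrm{out}}}
               {E_{\mathrm{out}} \;-\; E_{\mathrm{in}}}
\]
```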

The aim of this one-day course is to provide those attending IGARSS 2018 with a unique opportunity to gain both theoretical and practical 'hands-on' experience in near-ground hyperspectral, sun-induced fluorescence and thermal infra-red field measurements for the investigation of vegetation sun-induced fluorescence (SIF). The course will be based at the University of Valencia and will include the theory behind SIF, field work using cutting-edge scientific instruments developed by the trainer for SIF research, and data processing and analysis.

Requirements

A basic understanding of optical and thermal remote sensing of vegetated surfaces; study of the online course lecture series; ability to carry out field work in the natural environment (within the U. of Valencia campus).

To make the best use of time in Valencia, trainees should review these lectures before attending the course.

Morning Half-Day Tutorials

HD-1: Satellite based L-Band observation of land surfaces

Presented by

Ahmad AL Bitar, CESBIO/CNRS; Susanne Mecklenburg, ESA; Arnaud Mialon and Nemesio Rodriguez-Fernandez, CESBIO/CNRS

Description

This tutorial aims at providing participants with the principles of Earth observation in the passive microwave domain, with a focus on L-band (1.4 GHz). The practical sessions will consist of data analysis of observations provided by the currently operational missions (SMOS from ESA and SMAP from NASA). Participants will get an overview of the products from the SMOS mission, including radiometric products, geophysical soil moisture products, and high-end products from data fusion/merging.
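
As background, the expression below is the zeroth-order "tau-omega" radiative transfer model commonly used as the forward model in L-band soil moisture retrieval (e.g. for SMOS and SMAP); the exact formulation discussed in the tutorial may differ.

```latex
% Tau-omega model for the brightness temperature at polarization p:
% T_s, T_c = soil and canopy temperatures, r_p = rough-soil reflectivity,
% omega_p = single-scattering albedo, tau_c = vegetation optical depth,
% theta = incidence angle.
\[
  T_{B,p} \;=\; T_s\,(1 - r_p)\,\gamma_p
          \;+\; T_c\,(1 - \omega_p)\,(1 - \gamma_p)\,(1 + r_p\,\gamma_p),
  \qquad \gamma_p \;=\; e^{-\tau_c/\cos\theta}
\]
```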

This tutorial will offer participants a unique insight into fundamental elements that will be discussed in three sessions during IGARSS 2018 on SMOS, SMAP and L-band radiometry.

Afternoon Half-Day Tutorials

HD-4: Spectrum Management, Detection and Mitigation of RFI in Microwave Remote Sensing

Presented by

Jasmeet Judge, University of Florida; Jeffrey Piepmeier, NASA Goddard Space Flight Center; Michael Inggs, University of Cape Town; and Paolo de Matthaeis, NASA Goddard Space Flight Center

Description

The demand for frequency bands is continually increasing. In this context, both active and passive spaceborne instruments are experiencing problems with Radio Frequency Interference (RFI) more and more often. This also happens in frequency bands allocated exclusively to passive services, due to illegal transmitters and out-of-band emissions from adjacent services. The presence of RFI is always detrimental to scientific missions. When detected, RFI causes information loss and reduces measurement accuracy; when not detected, it leads to inaccurate measurements that are not recognized as such, and therefore to potentially wrong conclusions. In some cases, the presence of RFI can jeopardize the objectives of the mission. RFI represents a significant threat to microwave remote sensing sensors and will need proper attention in all future missions.

This tutorial will provide an overall review of RFI management for microwave remote sensing, starting from the processes that lead to frequency allocations and to the enforcement of the Radio Regulations, and then focusing on state-of-the-art techniques to detect and mitigate RFI in both passive and active sensors. This tutorial can be very useful for anyone interested in learning about RFI, from recent engineering graduates seeking career development in the remote sensing community to mission managers looking for possible ways to mitigate the presence of RFI.
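
To illustrate one widely used detection technique for radiometers, the kurtosis test (thermal noise is Gaussian with a kurtosis of 3, so significant departures flag interference), here is a toy numpy/scipy sketch; the block length, injected signal and threshold are illustrative only and not mission values.

```python
# Kurtosis-based RFI flagging: blocks whose sample kurtosis departs markedly
# from the Gaussian value of 3 are flagged as contaminated.
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(1)
samples = rng.normal(size=100_000)                       # clean radiometer samples
samples[40_000:41_000] += 3.0 * np.sin(np.arange(1_000)) # injected sinusoidal RFI

block, threshold = 1_000, 0.5
flags = []
for i in range(0, samples.size, block):
    k = kurtosis(samples[i:i + block], fisher=False)     # Pearson kurtosis (Gaussian = 3)
    flags.append(abs(k - 3.0) > threshold)

print("flagged blocks:", [i for i, f in enumerate(flags) if f])
```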

Tutorial Learning Objectives

Attendees will become familiar with how RFI can affect the measurements of spaceborne instruments and with the extent of the RFI issue in some satellite missions. They will also receive insights into the different types of RFI observed so far and into the latest RFI detection and mitigation techniques.

Attendees will also be able to better understand the different needs and the different actors that are involved in the frequency allocation process.

HD-6: Classification of satellite image time series with the Orfeo ToolBox and QGIS

Presented by

Stéphane May, CNES

Description

This tutorial is a step-by-step guide to the classification tools available in the Orfeo ToolBox. Tailored for large reference data sets and images, these tools interoperate easily with GIS software such as QGIS, allowing for an in-depth analysis of the supervised classification process.

After a short introduction to the Orfeo ToolBox, we will start from a Landsat-8 and Sentinel-2 time series and the real-world reference data used to produce national land cover maps.

We will review all the steps of the classification framework: sample selection, feature extraction, learning with different machine learning algorithms, feature selection, classification, and accuracy assessment.
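
For a sense of what the train/classify steps look like in practice, below is a hedged sketch using the Orfeo ToolBox Python bindings; application and parameter names follow recent OTB releases but may vary with the installed version, and all file paths are placeholders.

```python
# Train a classifier from an image and reference polygons, then classify the
# image. Parameter keys and types may differ between OTB versions.
import otbApplication

train = otbApplication.Registry.CreateApplication("TrainImagesClassifier")
train.SetParameterStringList("io.il", ["sentinel2_stack.tif"])
train.SetParameterStringList("io.vd", ["reference_polygons.shp"])
train.SetParameterString("sample.vfn", "class")   # attribute holding class labels
train.SetParameterString("classifier", "rf")      # random forest
train.SetParameterString("io.out", "model.rf")
train.ExecuteAndWriteOutput()

classify = otbApplication.Registry.CreateApplication("ImageClassifier")
classify.SetParameterString("in", "sentinel2_stack.tif")
classify.SetParameterString("model", "model.rf")
classify.SetParameterString("out", "landcover.tif")
classify.ExecuteAndWriteOutput()
```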

After attending this tutorial, participants will also be able to discover and use other tools from the toolkit on their own (such as remote sensing preprocessing, segmentation or change detection).

Tutorial Learning Objectives

At the end of the tutorial, it is expected that the participants will be able to perform the full classification framework with Orfeo ToolBox and QGIS on their own data.

Materials

An installation guide will be provided beforehand to each participant to install QGIS and the Orfeo ToolBox. All major operating systems are supported (Linux/Mac/Windows). The organizer will be available for support, should participants need help during the installation process.

HD-7: Introduction to the ARTMO radiative transfer models and retrieval toolboxes

Presented by

Jochem Verrelst and Juan Pablo Rivera-Caicedo, Image Processing Laboratory (IPL), Laboratory of Earth Observation, University of Valencia

Description

This tutorial aims to demonstrate the ARTMO toolboxes for running leaf and canopy radiative transfer models (RTMs) and for mapping vegetation properties from optical data. First, an overview of the various options for forward RTM runs will be given. Then simulations will be run for specific sensor settings. In the Graphics tool, the various options for plotting and exporting simulations will be shown.

The second part covers the retrieval toolboxes. ARTMO consists of four retrieval toolboxes: (1) a vegetation indices toolbox, where all kinds of possible indices can be evaluated; and (2) a machine learning toolbox, which includes over 15 machine learning algorithms, dimensionality reduction methods and band selection methods. Each of these toolboxes can develop retrieval models based on experimental data or based on RTMs. The RTM-based retrieval toolboxes further consist of (3) a LUT-based inversion toolbox, which includes various optimization options such as over 50 cost functions, and (4) a numerical inversion toolbox, where per-pixel inversion takes place against an RTM through a spectral fitting function.
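
The LUT-based inversion idea can be summarized in a few lines; the sketch below uses a simple RMSE cost on synthetic stand-in data, whereas the toolbox offers many alternative cost functions and options.

```python
# Generic LUT inversion: compare a measured spectrum against RTM-simulated LUT
# entries with an RMSE cost and keep the best-matching parameter set.
import numpy as np

rng = np.random.default_rng(0)
n_entries, n_bands = 5000, 60
lut_spectra = rng.random((n_entries, n_bands))       # RTM-simulated reflectances
lut_params = rng.uniform(0.0, 8.0, size=n_entries)   # e.g. leaf area index per entry

measured = lut_spectra[1234] + 0.01 * rng.normal(size=n_bands)  # "observed" spectrum

rmse = np.sqrt(np.mean((lut_spectra - measured) ** 2, axis=1))  # cost per LUT entry
best = int(np.argmin(rmse))
print("retrieved parameter:", lut_params[best], "cost:", rmse[best])
```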

Further, if time allows, ARTMO's global sensitivity analysis toolbox, scene generation toolbox and the emulator toolbox will be demonstrated.

Requirements

Students need to bring their own laptop (Windows) with MATLAB installed (version 2013 or more recent). Preferably also have MySQL 5.5 installed (and remember your password): https://dev.mysql.com/downloads/mysql/5.5.html#downloads. See also http://ipl.uv.es/artmo/ - register, and go to the download page.