Community-Contributed Sessions

CCS.1: Accelerating progress towards the Sustainable Development Goals: the contribution of international standards for land cover and land use and guidelines for SDG reporting.
SP.8: Special Scientific Themes — GIS development and remote sensing
Food security is a pressing global issue, with the world population projected to reach 9.7 billion by 2050. Agriculture and natural resources are essential for ensuring food security, affecting both local communities and the global population. In 2015, Sustainable Development Goal 2 (SDG 2) was adopted as part of the 2030 Agenda for Sustainable Development, with the aim to end hunger, achieve food security, improve nutrition, and promote sustainable agriculture by 2030. However, despite the urgency, world hunger is expected to affect 670 million people by 2030—the same figure as when the Goals were first established. With only five years remaining, there is an urgent need to accelerate progress toward achieving the Sustainable Development Goals (SDGs).

The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) recognizes Land Cover and Land Use (LCLU) as fundamental data themes, which are critical to achieving various SDGs. Given the interconnected nature of the SDGs, accurate data on land and water resources is essential for fostering sustainable and resilient agriculture. Adopting LCLU standards, along with following guidelines for SDG reporting, is crucial to ensure that local data effectively addresses global challenges.

One example of adopting LCLU standards is the assessment and monitoring of land degradation through Land Degradation Neutrality, as reported under the United Nations Convention to Combat Desertification (UNCCD). The UNCCD plays a vital role in advancing progress toward several SDGs, particularly SDG 15, which focuses on life on land and aims to halt and reverse land degradation by 2030.

Numerous national, regional, and international organizations leverage recent advances in Earth Observation and Information and Communication Technologies to support the monitoring of natural resources and agriculture. A notable example is the European Space Agency's Copernicus Program, which provides comprehensive Earth Observation data and information. This program has developed various applications that enhance technical capacities by offering science-based data to monitor land degradation, restore ecosystems, and support sustainable agriculture and food security, recognizing the interconnectivity of multiple SDGs.

This session will explore the essential role of international standards and guidelines in ensuring the consistency, accuracy, and interoperability of land cover data used in national and regional plans, particularly in relation to SDG 15.3.1. By combining international standards with tools like SEPAL and following the Good Practice Guidance Version 2 (GPGv2) for SDG 15.3, this session will illustrate how effective land cover mapping can accelerate progress toward the Sustainable Development Goals.
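To make the reporting workflow concrete, the minimal sketch below illustrates the one-out-all-out rule that GPGv2 uses to combine the three SDG 15.3.1 sub-indicators (land cover change, land productivity, and soil organic carbon stock change) into a single degraded/not-degraded layer. The arrays and values are hypothetical placeholders; operational assessments would use sub-indicator rasters produced with standards-based tools such as SEPAL.

```python
import numpy as np

# Minimal sketch of the SDG 15.3.1 one-out-all-out rule from the UNCCD
# Good Practice Guidance v2: a pixel counts as degraded if any of the three
# sub-indicators (land cover change, land productivity, soil organic carbon)
# flags degradation. Inputs are hypothetical -1/0/+1 rasters
# (degraded / stable / improved).

def sdg_1531_proportion_degraded(land_cover, productivity, soc):
    """Return the proportion of valid land area classed as degraded."""
    degraded = (land_cover == -1) | (productivity == -1) | (soc == -1)
    valid = ~(np.isnan(land_cover) | np.isnan(productivity) | np.isnan(soc))
    return degraded[valid].sum() / valid.sum()

# Toy example: three 2x2 sub-indicator rasters.
lc   = np.array([[0., -1.], [1., 0.]])
prod = np.array([[0.,  0.], [0., -1.]])
soc  = np.array([[0.,  0.], [0.,  0.]])
print(sdg_1531_proportion_degraded(lc, prod, soc))  # 0.5 (2 of 4 pixels)
```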

Key topics covered in this session will include:
 - Recent advances in Earth Observation for monitoring land cover, land use, agriculture, and land degradation.
 - The importance of integrating standardized mapping guidelines into land cover, land degradation, and crop mapping activities.
 - Case studies demonstrating the use of standardized mapping methods to assess land cover and land degradation and to advance sustainable land management practices.
 - The significance of adhering to these standards within global initiatives such as the SDGs, UNCCD, and GEO-LDN.
 - The applicability of standards alongside machine learning, big data analytics, and cloud-based platforms for processing extensive datasets.
CCS.2: Advanced and Emerging Technologies for Microwave Radiometers
S/I.15: Sensors, Instruments and Calibration — Advanced Future Instrument Concepts
Microwave radiometers are crucial instruments in geoscience and remote sensing, providing essential data for monitoring Earth's environment. These instruments measure key parameters such as temperature, water vapor, clouds and precipitation, ocean winds, soil moisture, and sea ice, which are critical for weather nowcasting and forecasting, climate studies, and disaster management. As environmental monitoring demands become more complex, advancements in microwave radiometer technology are necessary to meet the growing need for accurate and reliable data.
This session will explore the latest advancements in microwave radiometry, addressing the following key areas:
Nascent Technologies and Novel Architectures: Advances in technologies and system architectures are essential for developing new and revolutionary approaches to remote sensing. For example, significant recent advances have occurred in submillimeter-wave and Terahertz systems as well as in the use of microwave photonics.
Advanced Calibration Techniques: Ensuring precise calibration is fundamental to obtaining reliable measurements. The session will present innovative calibration methods, including new approaches for maintaining long-term stability and accuracy in radiometer readings.
Miniaturization and Efficiency: With the rise of smaller, cost-effective satellite platforms, there is a need for compact and efficient radiometers. We will examine the latest developments in miniaturized radiometer designs that maintain high performance while reducing size and power consumption.
Machine Learning and Artificial Intelligence: The integration of AI and machine learning is transforming data processing for microwave radiometry. This session will explore how AI-driven techniques are improving data accuracy, automating calibration processes, and reducing measurement noise.
Interference Mitigation: As the radio frequency spectrum becomes increasingly crowded, addressing radio frequency interference (RFI) is essential. The session will showcase new technologies and strategies that mitigate RFI, enabling clearer and more reliable measurements.
Multi-frequency and Multi-platform Systems: Radiometers that operate at multiple frequencies and are deployed across various platforms (satellites, aircraft, ground stations) offer more comprehensive environmental monitoring. This session will discuss advances in multi-frequency systems and their role in improving spatial and temporal resolution for Earth observations.
Importance to Geoscience and Remote Sensing:
Microwave radiometers provide critical data for understanding global environmental changes. As climate change, extreme weather events, and natural disasters increase, the ability to obtain precise measurements through advanced radiometry becomes even more important. The session will highlight the role of emerging technologies in ensuring that microwave radiometers continue to deliver high-quality, reliable data for scientific and practical applications. This data is crucial for decision-making in areas such as climate change mitigation, disaster response, and environmental protection.
Bringing together leading experts from academia, industry, and government, this session will offer insights into the future of microwave radiometry and how it is evolving to meet the challenges of modern Earth observation. Participants will gain a deeper understanding of the technological advancements that are shaping the next generation of microwave radiometers and their critical role in advancing geoscience and remote sensing efforts.
CCS.3: Advanced polarimetric SAR methods and applications
T/S.4: SAR Imaging and Processing Techniques — PolSAR and PolInSAR
Polarimetry is entering a new operational era with the recent successful launch of ALOS-4 (1 July 2024) by JAXA and the upcoming polarimetric satellite SAR missions (NISAR (NASA-ISRO) and ROSE-L (ESA)) planned for launch in 2025 and 2026. All these satellite SAR missions are equipped with the new technology of digital beamforming, which overcomes the polarimetric swath limitation caused by the doubling of the PRF and ensures high-resolution, wide-swath polarimetric coverage. Equipped with digital beamforming, ALOS-4 will permit polarimetric SAR imaging at 3 m resolution over a 100 km swath. This is a significant advance compared with the existing polarimetric satellite SAR missions (RADARSAT-2, ALOS-2, and TerraSAR-X), which are limited to a 50 km swath.

With the initiation of this new era of operational use of polarimetric SAR in support of key applications, it is important to reconsider the state of the art in the methodology and tools currently adopted for polarimetric information extraction. This special session will allow us to learn more about the most advanced tools recently developed for optimum polarimetric information extraction (such as target scattering decomposition, speckle filtering, image classification, and polarimetric SAR modeling) and their use in support of key applications. The session, which will gather leading-edge guest scientists, will also provide the opportunity to discuss the gaps that remain to be filled to fully exploit the polarimetric information provided by satellite and airborne SAR in support of key applications.

If accepted, this session would be the 13th session organized by Dr. R. Touzi on advanced polarimetric methods at IGARSS since IGARSS 2008. The eleven IGARSS sessions he previously organized and co-chaired with Dr. J.S. Lee from 2008 to 2019 (just before the COVID-19 pandemic), and the one he organized and co-chaired with Prof. Pottier in 2024, were highly successful, with 70 to 100 in-person attendees at each session.

Dr. R. Touzi, the Chairman, was elevated to IEEE Fellow in 2015 for his outstanding contributions to the design and calibration of polarimetric satellite SAR missions and to the development of advanced methods and their validation through applications.
He received the 1999 IEEE Transactions on Geoscience and Remote Sensing Prize Paper Award for the study on "Coherence Estimation for SAR Imagery" and has served as an Associate Editor of the IEEE Transactions on Geoscience and Remote Sensing for more than six years.

Dr. M. Arii, the Co-chairman, is an IEEE Senior Member and a Senior Expert at Mitsubishi Electric Corporation. Since 2023, he has served as an Associate Editor of TGRS, focusing on PolSAR theory, data analysis, and applications. He was also recognized as a GRSL Best Reviewer for 2012. He has published 18 peer-reviewed papers and 84 conference papers, with more than 1300 citations.
CCS.4: Advanced Satellite Remote Sensing Techniques for Coastal Hazard Monitoring and Risk Prediction
O.4: Oceans — Coastal Zones
Coastal subsidence significantly contributes to relative sea-level rise, wetland loss, infrastructure damage, and flood risk in low-lying coastal regions. The interplay of rapid urbanization, aggressive human activities, extreme climate change, and rising sea levels exacerbates risks for socio-economic growth and impacts residents in major coastal cities worldwide. Inaccurate assessments of vertical land motion along coastlines can distort sea-level projections, heightening vulnerability for coastal populations and infrastructure. To address these challenges, a multidisciplinary approach that integrates multi-sensor remote sensing data with ground-based observations is essential for mitigating coastal hazard risks linked to changing environmental and climatic conditions. Advances in satellite radar remote sensing—characterized by improved acquisition times, varying spatial resolutions, and extensive coverage—have enhanced our ability to monitor and manage coastal hazards effectively. The diverse array of Earth Observation (EO) data from multi-resolution Synthetic Aperture Radar (SAR) sensors, such as ESA’s Sentinel-1 A/B, JAXA’s ALOS-1&2, Canada’s Radarsat-1&2, DLR’s TerraSAR-X, ASI’s COSMO-SkyMed, and the upcoming NASA-ISRO NISAR mission, combined with ground-based measurements like GPS, tidal data, high-resolution topography, land use/land cover (LULC), rainfall, and groundwater levels, presents a robust framework for analysis. An integrated approach that combines high-precision geodetic techniques, particularly advanced multi-track SAR imagery, can greatly enhance our understanding of vertical land motion and improve inundation risk assessments for coastal cities. This session aims to bridge analytical gaps and develop an effective physics-based numerical predictive model for inundation risk assessment and mitigation strategies in coastal metropolises vulnerable to rising sea levels and subsidence.
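As a simple illustration of why vertical land motion matters for these assessments, the sketch below combines a hypothetical altimetry-derived sea-level trend with a hypothetical InSAR-derived subsidence rate to obtain the relative sea-level rise experienced at the coast; the rates are illustrative only and are not taken from any particular study.

```python
# Minimal sketch of why vertical land motion (VLM) matters for coastal risk:
# the sea-level rise felt at the coast is the geocentric (satellite-altimetry)
# rate minus the land motion rate, where subsidence is a negative VLM.
# All rates below are illustrative.

def relative_slr_mm_per_yr(absolute_slr, vlm):
    """Relative sea-level rise = absolute sea-level rise - vertical land motion."""
    return absolute_slr - vlm

altimetry_rate = 3.5     # mm/yr, hypothetical geocentric sea-level trend
insar_vlm_rate = -8.0    # mm/yr, hypothetical InSAR-derived subsidence

print(relative_slr_mm_per_yr(altimetry_rate, insar_vlm_rate))  # 11.5 mm/yr
```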

This session invites abstracts that showcase recent advancements in scientific and technical developments employing satellite remote sensing strategies for coastal hazard monitoring. We welcome submissions that explore the opportunities and challenges in the following areas:

1.	Monitoring Coastal Hazards Phenomena: Analyzing coastal subsidence at local to regional scales and estimating relative sea-level rise for inundation models in vulnerable regions worldwide. This should integrate multi-sensor satellite remote sensing data with ground-based observations, tidal gauge measurements, high-resolution topography, storm surge data, and other relevant information.
2.	Physics-Based Predictive Modeling: Developing predictive models using artificial intelligence (AI) and deep learning to understand the underlying physical processes affecting coastal risk. This includes factors such as sediment compaction, groundwater flow, land deformation, rainfall, climate change-induced sea-level rise, and storm surges.
3.	Enhancing Coastal Hazard Risk Assessments: Improving the accuracy and timeliness of risk assessments while bridging the gap between scientific research and practical management strategies.

We welcome contributions from a range of interdisciplinary fields, such as remote sensing, geomatics, geoscience, geodesy, ocean science, hydrology, atmospheric science, and other related disciplines.
CCS.5: Advanced Signal Processing Methods for Geoscience and Remote Sensing Applications
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
The field of geoscience and remote sensing is undergoing a transformative phase, driven by the development of advanced signal processing techniques. As the volume, variety, modality, and velocity of geospatial data continue to grow, there is an increasing demand for innovative methods that can efficiently and accurately process these data to extract more meaningful information. This session focuses on the role of state-of-the-art signal processing methodologies in addressing these challenges.

The focus of this session will be on innovative methods, algorithms, and applications that enhance the analysis, interpretation, and utilization of remotely sensed and geospatial data through advanced signal processing. We particularly welcome contributions that explore, but are not limited to, the following topics:

Neural Networks and Deep Learning: neural networks, complex-valued neural networks, quaternion neural networks, convolutional neural networks (CNNs), recurrent neural networks (RNNs), generative adversarial networks (GANs), and other deep learning (DL) models to process and interpret remote sensing data.

Compressive Sensing (CS) and Sparse Reconstruction: CS and sparse reconstruction methods in GRS, CS radar, CS for subsurface imaging and devices, sparse representation and dictionary learning, applications in hyperspectral imaging and Synthetic Aperture Radar (SAR), and CS reconstruction algorithms for geophysical signals (a minimal sparse-recovery sketch follows this topic list).

Quantum Computing in Geoscience and Remote Sensing: quantum algorithms for geoscience and remote sensing, quantum neural networks and their applications in GRS, and quantum machine learning for processing geospatial data.

Super-Resolution Techniques: Image super-resolution using deep learning methods, resolution enhancement in optical and radar imagery, applications in land cover mapping and environmental monitoring

Emerging Signal Processing Techniques: Complex and hyper-complex signal and image processing for geoscience and remote sensing, Innovative signal processing frameworks and algorithms not covered in the above topics but with promising applications in geoscience and remote sensing.
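As a minimal, generic illustration of the sparse-reconstruction topic above (not tied to any specific sensor or paper), the sketch below implements the classical iterative soft-thresholding algorithm (ISTA) for the l1-regularized least-squares problem that underlies many compressive-sensing radar and hyperspectral formulations. The measurement matrix and parameters are illustrative.

```python
import numpy as np

# Minimal ISTA (iterative soft-thresholding) sketch for the sparse
# reconstruction problem min_x 0.5*||y - A x||^2 + lam*||x||_1.
# A, y, and the parameters are illustrative placeholders.

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, y, lam=0.05, n_iter=200):
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x

# Toy example: recover a 5-sparse vector from 40 random projections.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100)) / np.sqrt(40)
x_true = np.zeros(100)
x_true[rng.choice(100, 5, replace=False)] = rng.standard_normal(5)
y = A @ x_true
x_hat = ista(A, y)
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))  # relative error
```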
CCS.6: Advanced well-posed forest height inversion techniques using microwave remote sensing
L.3: Land Applications — Forest and Vegetation: Application and Modelling
Accurate forest height inversion is a crucial technique for monitoring forest structure and biomass. Compared to traditional methods, such as field surveys and lidar, microwave remote sensing techniques, particularly interferometric synthetic aperture radar (InSAR), offer a promising alternative for forest height inversion due to their ability to penetrate canopy structures and provide detailed information about forest stands.
However, several long-standing problems remain to be solved, e.g., the ill-posed nature of the inversion equations, the coarse resolution of terrain information available under forest, and the requirement for fully polarimetric data. With the development of artificial intelligence (AI), well-posed phase signal processing, and other advanced techniques, it is becoming possible to overcome such limitations.
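To give a concrete sense of the simplest, well-posed limiting case of such inversions, the hedged sketch below inverts the textbook volume-decorrelation model for a uniform canopy with negligible extinction and no ground contribution; real PolInSAR inversions (e.g., RVoG-based) must additionally handle ground scattering, extinction, and temporal decorrelation, which is where the ill-posedness discussed above arises. The vertical wavenumber and coherence values are illustrative only.

```python
import numpy as np
from scipy.optimize import brentq

# Hedged sketch of the simplest limiting case of InSAR forest-height
# inversion: for a uniform volume with negligible extinction and no ground
# contribution, the volume decorrelation magnitude is |gamma| = |sin(x)/x|
# with x = k_z * h_v / 2, which is monotone and invertible for x in (0, pi).

def invert_height(gamma_mag, kz):
    """Invert |gamma| = sin(x)/x, x = kz*h/2, for canopy height h (m)."""
    f = lambda x: np.sin(x) / x - gamma_mag
    x = brentq(f, 1e-6, np.pi - 1e-6)      # sin(x)/x decreases monotonically on (0, pi)
    return 2.0 * x / kz

kz = 0.10          # rad/m, hypothetical vertical wavenumber
gamma_mag = 0.7    # observed volume coherence magnitude (illustrative)
print(invert_height(gamma_mag, kz))   # canopy height estimate in metres
```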
The session will address the latest developments in forest height inversion using microwave remote sensing techniques and welcomes submissions in, but not limited to, the following areas:
1.	Theoretical Foundations: Discussions on the theoretical frameworks of microwave remote sensing for forest height inversion, including AI model development and scattering and absorption mechanisms within forest canopies.
2.	Methodological Developments: Presentations on recent advancements in microwave remote sensing methodologies for forest height inversion, including well-posed phase signal processing algorithms, training dataset generation methods for AI, as well as interpretability of AI models used in microwave remote sensing forest height inversion.
3.	Practical Applications: Case studies about the application of microwave remote sensing for forest height inversion in diverse forest types and environments, showcasing the practical utility of the advanced inversion methods.
4.	Challenges and Limitations: Discussion of the current challenges and limitations of microwave remote sensing for forest height inversion, including data quality, spatial resolution, and the impact of environmental factors (e.g., soil moisture, vegetation type). 
CCS.7: Advancements in Deep Learning for SAR Remote Sensing Applications
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Synthetic Aperture Radar (SAR) remote sensing is one of the most advanced techniques for Earth observation, offering continuous day-and-night data collection. SAR captures both phase and amplitude information, providing detailed insights into surface and sub-surface features. Techniques like Polarimetry (PolSAR) and Interferometry (InSAR) leverage this data for monitoring land, sea, and atmospheric conditions. Recently, deep learning has become a powerful tool for enhancing the capabilities of satellite remote sensing and has been widely applied to SAR data. This session will explore the development of deep learning models to improve the efficiency and effectiveness of SAR satellites across a range of applications.
The topics to be covered in this special session include, but are not limited to: 
1.	Robust 2D CNN and 3D CNN architectures for use in SAR remote sensing
2.	SAR image-space transformations and their use with large pre-trained models such as Vision Transformers, BERT, Llama, and T5 for SAR remote sensing applications.
3.	InSAR stack noise removal; estimation of persistent and distributed scatterers.
4.	Image-to-image translation deep learning models for SAR image despeckling (see the sketch following this list).
5.	Deep learning for semantic segmentation, urban feature extraction, and land use and land cover dynamics modeling using SAR remote sensing
6.	Cryosphere monitoring, including snow parameter retrieval and glacier mapping.
7.	Forest biomass estimation using SAR and deep learning models
8.	Disaster monitoring, focusing on landslides, avalanches, and flood assessment.
9.	Planetary remote sensing with the Mini-SAR on Chandrayaan-1 and the Chandrayaan-2 SAR sensors.
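As an illustration of the image-to-image despeckling topic in item 4, the sketch below outlines a small residual CNN operating on log-transformed SAR intensity patches, in the spirit of SAR-CNN-style despecklers; the architecture, layer sizes, and data are placeholders rather than a published model.

```python
import torch
import torch.nn as nn

# Minimal sketch of an image-to-image despeckling network, assuming a
# log-transformed single-channel SAR intensity patch as input and a residual
# (speckle) prediction as output. Sizes and depth are illustrative only.

class DespeckleCNN(nn.Module):
    def __init__(self, channels=48, depth=6):
        super().__init__()
        layers = [nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(channels, channels, 3, padding=1),
                       nn.BatchNorm2d(channels), nn.ReLU(inplace=True)]
        layers += [nn.Conv2d(channels, 1, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # Predict the speckle component and subtract it (residual learning).
        return x - self.body(x)

# Toy forward pass on a random "log-intensity" patch.
model = DespeckleCNN()
patch = torch.randn(1, 1, 128, 128)
clean = model(patch)
print(clean.shape)   # torch.Size([1, 1, 128, 128])
```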
Join us to explore the latest advancements in SAR remote sensing.
CCS.8: Advances in Hydrology Using the Surface Water and Ocean Topography Mission
L.10: Land Applications — Inland Waters
The Surface Water and Ocean Topography (SWOT) satellite mission, launched in December 2022, is a groundbreaking effort to observe inland, ocean, and coastal waters worldwide. It marks a significant advancement in hydrological sciences as the first satellite designed to investigate surface water in the global water cycle, and it provides an unprecedented, comprehensive view of Earth's freshwater bodies from space. Using Ka-band radar interferometry, SWOT delivers, for the first time, simultaneous, high-resolution maps of water surface elevation and inundation extent in rivers, lakes, reservoirs, and wetlands globally. The mission has a 21-day repeat cycle, offering multiple revisits, ranging from two to more than seven, depending on latitude.

For over a decade, the hydrologic remote sensing community has developed new methodologies and scientific frameworks to fully leverage the potential of SWOT data, which are now advancing our understanding of global water fluxes and fundamentally changing how we perceive and analyze surface water dynamics. Early investigations with this new dataset are already uncovering valuable insights into hydrologic processes across rivers, lakes, reservoirs, and wetlands, as well as the terrestrial cryosphere, with performance exceeding the mission's science requirements for some variables.

SWOT’s innovative capabilities, building on existing altimetry and imaging satellite missions, are unlocking new opportunities to deepen our understanding of water storage, river discharge, and flood dynamics.  These advances are expanding our ability to inform water resource management and assess the impacts of climate change. This session invites abstracts that present recent breakthroughs and novel discoveries, demonstrating how SWOT is advancing new frontiers in hydrology and enriching our understanding of Earth’s surface water systems.
CCS.9: Advances in Hyperspectral Remote Sensing Image Classification and Their Applications in Agriculture, Forestry, and Wetlands
T/D.12: Data Analysis — Classification and Clustering
With the continuous advancement of hyperspectral remote sensing technology, the volume of acquired image data is steadily increasing. Traditional hyperspectral classification methods often struggle to handle such large and complex datasets. Therefore, applying artificial intelligence and machine learning techniques for hyperspectral image classification not only improves the accuracy and efficiency of image analysis, but also enables the automation of large-scale data processing. These advancements allow us to better monitor and analyze changes on the Earth's surface, and utilize these methods in many applications, such as agriculture, forestry, and wetlands.
Timely awareness of the latest advancements in hyperspectral image classification can enable researchers to grasp cutting-edge technologies and innovative methods, thereby enhancing the efficiency and accuracy of their research and applications, and contributing to the in-depth development of scientific research. Furthermore, understanding these advancements also fosters collaboration between academia and industry, promoting the translation and application of technology, providing strong support for sustainable development, addressing global challenges, and driving innovation and progress in practical applications.
 The broad topics include (but are not limited to):
•	Methods for single-scene or cross-scene hyperspectral image classification
•	Methods for open-set hyperspectral image classification
•	Fusion of hyperspectral and multispectral images for classification
•	Hyperspectral image classification with data in other modalities (Lidar, SAR, etc)
•	Applications of hyperspectral image classification in agriculture
•	Applications of hyperspectral image classification in forestry 
•	Applications of hyperspectral image classification in wetlands
•	Applications of hyperspectral image classification in other fields
CCS.10: Advances in Image Analysis and Data Fusion for Earth Observation Towards Sustainable Development Goals
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
The sustainable development goals (SDGs) adopted by United Nations member states provide a framework for action on tackling climate change, promoting prosperity, and ensuring people's well-being for a better and sustainable future for all. Progress toward attaining the SDGs is monitored by analyzing data collected from multiple sources (surveys, government agencies, social media, etc.). In addition, the images acquired from sensors onboard Earth Observation (EO) satellites provide an opportunity to monitor the Earth's ecosystem and build a valuable information infrastructure. EO images provide continuous temporal information over the globe and can cover the most remote areas of the world. Moreover, satellite images can improve and complement conventional statistical in-situ data collection, as well as provide new types of environmental information. Image analysis and data fusion methods continue to impact EO applications, such as crop type mapping, mapping slums and urban areas, sustainable forest management, disaster monitoring and response, drought monitoring through analysis of water body conditions, and water quality monitoring through contaminant analysis, to give a few examples. All of these are well aligned with the SDGs, such as attaining Zero Hunger (SDG 2), Good Health and Well-Being (SDG 3), Clean Water and Sanitation for All (SDG 6), Sustainable Cities and Communities (SDG 11), Climate Action (SDG 13), Life on Land (SDG 15), and Secure Property Rights (multiple SDGs).

This session is directed to papers focusing on methods and techniques in image analysis and data fusion that address the SDGs. The session aims to highlight the pivotal role of image analysis and data fusion-based methods when applied to Earth Observation (EO) images in achieving SDGs. 
CCS.11: Advances in Machine Learning for Agricultural Land Use and Land Cover Classification
L.1: Land Applications — Land Use Applications
Introduction
The proposed session at IGARSS 2025 will explore innovative methodologies and applications in the field of land use and land cover (LULC) classification, with a focus on machine learning and data integration. Land use and land cover data play a very important role in policy making, sustainable development, and environmental management. Within this framework, and with the increasing challenges related to climate change, agriculture, and urbanization, the demand for accurate and reliable LULC data has never been higher. This session will review several examples of how machine learning techniques contribute to improving LULC mapping, data integration, and classification accuracy across a wide range of agricultural landscapes.
Land cover refers to the physical characteristics of the land, while land use describes the activities carried out on the land and the purposes it serves. The interdependence of these two dimensions demands comprehensive mapping and analysis to unravel their relationships, especially with respect to agricultural management and climate resilience. Recent developments in machine learning and data integration technologies have opened promising avenues for enhancing the accuracy and efficiency of LULC mapping so that agricultural landscapes may be better analysed and managed.
The session aims to present state-of-the-art research and practical applications that demonstrate advances in LULC classification with machine learning techniques and the integration of diverse datasets for better LULC mapping. We aim to provide a platform for sharing and discussion that takes LULC mapping forward, contributing to a broader understanding of land management and its implications for sustainable development.
Session Objectives
The session will cover a wide range of topics related to LULC classification and machine learning applications in agriculture. We encourage submissions that address, but are not limited to, the following themes:
1.	Mapping and Monitoring: Innovative approaches for mapping and monitoring LULC across different spatial and temporal scales, using machine learning techniques to improve the LULC detection and classification.
2.	Data Integration: Methods for data integration from multiple sources, including remote sensing, ground truth data, and socio-economic datasets, to create comprehensive LULC models.
3.	Change Detection: Advanced analytics for detecting changes in land use and land cover, with a focus on assessing the impacts of agricultural practices and climate change.
4.	Validation Techniques: Techniques to validate LULC datasets, ensuring reliability and credibility of LULC maps for stakeholders who deal with environmental management and policymaking.
5.	Data Communication: Effective reporting, visualization, and communication methods in presenting LULC data to diverse users, including policymakers, researchers, and the public.
6.	Evidence-based Decision Making: Case studies demonstrating the application of LULC data in evidence-based decision-making processes, and how machine learning was instrumental in enhancing policy outcomes.

Dr. Sabah Sabaghy, specializing in land cover mapping, remote sensing, spatial analytics, and their applications in agriculture and environmental monitoring, will chair the session. She will lead discussions, foster collaboration, and facilitate a productive exchange of ideas to advance understanding.

Dr. Kathryn Sheffield, specializing in land use and cover mapping, validation, remote sensing, and biosecurity with a focus on agricultural applications, will co-chair, supporting the exchange of knowledge.
CCS.12: Advances in Monitoring Precipitation, Clouds, and Aerosols Using Remote Sensing and AI Technologies for Weather Analysis and Extreme Events
M.1: Atmosphere Applications — Precipitation and Clouds
Research related to clouds and precipitation remains one of the most important challenges, ranging from high-resolution, short-term forecasting and monitoring to global, long-term climate prediction. Clouds and precipitation are important components of the Earth’s energy and water cycle, climate, and climate variability. In particular, extreme events are likely to increase in frequency and severity in the near future due to climate change. Therefore, monitoring the evolution of clouds and precipitation in severe weather systems through remote sensing measurements is of paramount importance for improving public safety.

In recent years, advancements in measuring the characteristics of clouds and precipitation have been significant due to new technologies in in-situ and remote sensing instruments. Global coverage and high-resolution observations have greatly improved our understanding of the formation and evolution of clouds and precipitation systems. These advancements now enable us to better study multi-scale motions, microphysical processes, and the role of aerosols in cloud and precipitation systems, thereby enhancing the accuracy of numerical weather and climate prediction models.

A comprehensive understanding of the physical processes that govern the formation, growth, and decay of clouds and precipitation is essential for improving both short- and long-term forecasting. Although much has been learned about clouds and precipitation in recent years, many research questions remain unanswered, and the ability to predict their location and intensity with the desired accuracy remains elusive. The development of new observation strategies and AI-enhanced algorithms for monitoring and predicting is urgently needed to address these challenges.

The aim of this special session is to present advances and new findings that enhance the monitoring and understanding of clouds, precipitation, and aerosols, as well as the accuracy of nowcasting and short-term forecasting. Advanced AI techniques can enhance data analysis and interpretation. By leveraging advanced algorithms and machine learning techniques, these technologies can analyze vast amounts of data collected from satellites and ground-based sensors, providing insights that were previously unattainable.
CCS.13: Advances in Multimodal Remote Sensing Image Processing and Interpretation
T/D.17: Data Analysis — Data Fusion
Recent advances in sensor and aircraft technologies allow us to acquire huge amounts of remote sensing data. Diverse information on Earth's surface can be derived from these multi-resolution and multimodal data, providing a much more comprehensive interpretation for Earth observation, e.g., spectral information from multi- and hyperspectral images can help to reveal the material composition, elevation information from LiDAR data helps to estimate the height of the observed objects, synthetic aperture radar (SAR) data can measure dielectric properties and the surface roughness, panchromatic data are instead focused on spatial features of the acquired landscape, and so forth.
 
State-of-the-art works have proven that the fusion of these multi-resolution and multimodal images provided better performance than those only using a single image source. However, challenges remain when applying these data for some applications (classification, target detection, geological mapping, etc.). For example, classical issues could be related to the misalignment of multimodal images, the presence of clouds or shadows (in particular, when optical data are involved), and spectral/spatial differences hampering the post-fusion of these data.
 
This session will focus on multi-resolution and multimodal image processing and interpretation, such as multimodal image alignment, restoration, sharpening of multi-spectral and hyperspectral images (e.g., pansharpening, hyperspectral pansharpening, and hypersharpening), use of machine learning approaches devoted to several tasks (e.g., feature extraction and classification) exploiting the multimodality of the data, and so forth. We will discuss the latest methods/techniques for multi-resolution and multimodal image processing, as well as how this can benefit our interpretation.
CCS.14: Advances in Remotely Piloted Aerial Systems (RPAS) hyperspectral data collection, processing and analysis for Earth Observation Applications.
T/D.15: Data Analysis — Hyperspectral Data Processing and Analysis
The use and development of Remotely Piloted Aerial Systems (RPAS) based hyperspectral imagers (HSI) has advanced significantly and rapidly over the last decade. The near-ubiquitous adoption of RPAS-HSI for Earth Observation applications, including biodiversity mapping, ecology, satellite calibration/validation, coral and freshwater studies, mining, and agriculture, has provided a firm basis for its wider use. RPAS-HSI systems generally acquire data from low altitude (below ~120 m), resulting in ultra-high spatial resolutions (e.g., 3-5 cm). Multiple sensors also allow for full-range (400-2500 nm) acquisition, increasing the fields of application. Even for small areas (~1 hectare), as is common in satellite validation applications or vegetation studies, the resulting data comprise millions of spectral signatures representing the various chemical and biophysical features on the ground. This vast amount of data presents new challenges in terms of data processing and analysis. Too often, complex and costly RPAS-HSI mission data are reduced to basic analytical techniques such as vegetation indices or other similar methods that do not exploit the wealth of information in spectral signatures. The use of more sophisticated techniques, such as wavelet analysis for modeling vegetation characteristics, is on the rise, and the recent renewed interest in machine learning algorithms for hyperspectral data offers new and exciting opportunities for maximizing the information in the data cubes. The goal of this CCS is to bring together state-of-the-art RPAS-HSI applications and analytical tools related to a broad range of EO topics, especially those focused on best practices and protocols aiming to produce reliable hyperspectral imagery. We are pleased to invite a broad spectrum of research work that showcases novel ways to collect, process, and analyze RPAS-HSI data and promotes advanced EO applications.
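For context on the data volumes involved, the short sketch below extracts a simple narrow-band index (NDVI) from a hypothetical RPAS hyperspectral cube; the wavelength grid, band choices, and cube values are illustrative, and the point is precisely that per-pixel spectra support far richer features (derivatives, wavelets, machine-learning embeddings) than a single index.

```python
import numpy as np

# Illustrative sketch: computing a narrow-band index from an RPAS
# hyperspectral cube shaped (rows, cols, bands). Wavelengths and band
# choices are hypothetical placeholders.

def band_index(wavelengths_nm, target_nm):
    return int(np.argmin(np.abs(np.asarray(wavelengths_nm) - target_nm)))

def ndvi(cube, wavelengths_nm, red_nm=670.0, nir_nm=800.0):
    r = cube[:, :, band_index(wavelengths_nm, red_nm)]
    n = cube[:, :, band_index(wavelengths_nm, nir_nm)]
    return (n - r) / (n + r + 1e-9)

# Toy cube: 100x100 pixels, 50 bands between 400 and 900 nm.
wl = np.linspace(400, 900, 50)
cube = np.random.default_rng(1).uniform(0.01, 0.6, size=(100, 100, 50))
print(ndvi(cube, wl).shape)   # (100, 100)
```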
CCS.15: Advances in SAR Image Quality Metrics
D/S.8: Societal Engagement and Impacts — Standardization in Remote Sensing
The proposed session is a joint effort between the P3397 Synthetic Aperture Radar (SAR) Image Quality Metrics standards working group and the GRSS Standards Committee.  This session will highlight the importance of standards in commerce and address the need for a standardized metric capable of rating the quality of SAR images.  Such a metric is highly sought after since there are multiple vendors in today’s marketplace that sell SAR imagery to global customers.  The images for sale are generated across a wide range of hardware platforms with capabilities that do not necessarily create imagery of comparable quality.  To a government or industry customer, objective methods for determining the quality of SAR images will add confidence when engaging in transactions within the emerging commercial space industry.

The scope of this session includes new ideas and concepts for performance metrics that can be used to assess the quality of imagery generated by airborne and space-based SAR systems.  The system components that determine image quality include the raw collected data as well as the back-end processing algorithms.

A sample list of quantitative metrics used to assess the quality of airborne and space-based SAR images includes peak signal-to-noise ratio, integrated sidelobe level, speckle level, signal-to-clutter ratio, ambiguity signal ratio (ASR), noise equivalent sigma zero (NESZ), and feature contrast.  Also relevant to this session is analysis of the impact of different SAR geometries and measurement modalities on image quality, and analysis of the impact of different focusing and autofocusing methods.  Further, the impact of different waveform modulations, platform position uncertainty, and quantization errors on image quality is within scope.  Calibration uncertainties and confidence intervals for objects in SAR images are also of interest.
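As a hedged illustration of two of the point-target metrics listed above, the sketch below computes the peak sidelobe ratio (PSLR) and integrated sidelobe ratio (ISLR) from a 1D cut through an impulse response, taking the mainlobe as the span between the first nulls; exact definitions vary between standards, so this is illustrative rather than normative.

```python
import numpy as np

# Illustrative PSLR/ISLR computation from a 1D impulse-response cut.
# The mainlobe is taken as the span between the first nulls around the peak;
# standards may define the mainlobe width differently.

def first_null(p, peak, step):
    i = peak
    while 0 < i + step < len(p) - 1 and p[i + step] <= p[i]:
        i += step
    return i

def pslr_islr_db(h):
    p = np.abs(h) ** 2
    peak = int(np.argmax(p))
    lo, hi = first_null(p, peak, -1), first_null(p, peak, +1)
    main = p[lo:hi + 1].sum()
    side = p.sum() - main
    peak_side = max(p[:lo].max(initial=0), p[hi + 1:].max(initial=0))
    pslr = 10 * np.log10(peak_side / p[peak])
    islr = 10 * np.log10(side / main)
    return pslr, islr

# Toy impulse response: a sinc, whose ideal PSLR is about -13.3 dB.
x = np.linspace(-16, 16, 2049)
print(pslr_islr_db(np.sinc(x)))
```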
CCS.16: Advances in yield and production estimation for agricultural policy and food security with Earth Observations
D/S.6: Societal Engagement and Impacts — Food security
The global imperative for food security, intensified by population growth and climate change, demands innovative approaches to crop yield estimation. This session explores cutting-edge methodologies leveraging Artificial Intelligence (AI), Machine Learning (ML), and Earth Observations (EO) to revolutionize agricultural yield prediction and production forecasting. We focus on addressing the unique challenges of small-holder regions and the Global South, where data scarcity and complex geographies often hinder traditional estimation techniques. 

This session welcomes submissions on AI and ML algorithms for crop yield estimation using satellite data, techniques for overcoming data limitations in small-holder areas, and integration of climate change impacts into prediction models. Research highlighting user-friendly tools that bridge the gap between space technology and on-farm applications is particularly encouraged. Special emphasis is placed on studies that demonstrate the utility of these approaches in real-world decision-making scenarios and assess the impact of climate change on crop yields in the Global South. We are especially interested in submissions that showcase how AI and EO-based yield estimation techniques can inform climate adaptation strategies and enhance resilience in agricultural systems. This session aims to facilitate knowledge exchange among remote sensing experts, data scientists, agricultural researchers, and policymakers. 

The session will explore state-of-the-art crop area estimation and yield prediction techniques, challenges specific to small-holder agriculture, and innovative approaches to translate yield predictions into actionable insights for stakeholders. We encourage presentations exploring the entire pipeline from data acquisition to practical application, including the development of robust AI models, integration of multiple data sources, creation of early warning systems, and design of user-friendly interfaces. By showcasing advancements in AI and satellite data-based technologies for yield estimation, we hope to inspire new collaborations and drive progress in addressing global food security challenges, ultimately contributing to a more food-secure and resilient future for all.
CCS.17: Advances on Polarimetric GNSS-R
S/M.3: Mission, Sensors and Calibration — Spaceborne GNSS-R Missions
Polarimetric GNSS-R (Global Navigation Satellite System-Reflectometry) represents the next phase in GNSS-R technology, with the potential to significantly advance land and cryosphere assessments. By incorporating polarimetry into forward-scattering measurements, this approach helps disentangle key variables such as vegetation, soil moisture, surface roughness, sea ice properties, and freeze/thaw state. One long-standing goal in GNSS-R has been to achieve soil moisture retrieval in a single pass, minimizing the dependence on supplementary data. Realizing this goal would enable small satellites with GNSS-R payloads to autonomously monitor land without needing additional dynamic measurements. However, previous research has revealed the complexities and lower accuracy of single-pass soil moisture retrieval without in-situ or ancillary data. A promising solution is to upgrade current GNSS-R instruments with polarimetric capabilities, providing a more versatile tool for Earth surface analysis, especially in areas where polarimetric sensitivity is crucial, such as icy terrain, bare soil, and vegetated regions.

Current advancements in Polarimetric GNSS-R focus on the use of either two circularly polarized antennas (RHCP/LHCP) or two linearly polarized antennas (H/V). Continued collaboration among experts in the field is crucial to establishing clear guidelines on when to favor one polarization scheme over the other. These discussions are expected to encourage valuable exchanges among experts from various teams and backgrounds. Key topics of interest include the potential impact of polarimetric GNSS-R, spanning from modeling and data simulation for future missions, drawing insights from existing non-polarimetric missions, to analyzing real polarimetric data.

This session is particularly relevant to forthcoming polarimetric GNSS-R missions like ESA's HydroGNSS, as well as other future missions requiring polarimetric capabilities to enhance GNSS-R products, such as soil moisture and cryosphere characterization, or even new products like vegetation opacity. Polarimetric GNSS-R retrievals could also enable the independent recovery of critical geophysical parameters like soil moisture, reducing reliance on ancillary data. Such retrievals would complement data from current missions, providing valuable training and validation resources for future algorithm development. Combining models with data from ongoing GNSS-R missions (e.g., CYGNSS, SMAP-R, BuFeng-1, FY-3E, Spire, Muon) will deepen understanding of the potential outcomes of a polarimetric GNSS-R mission. The session will feature presentations from international teams working on polarimetric GNSS-R, including efforts in model development, existing spaceborne measurements, and the anticipated launch of polarimetric GNSS-R missions.
CCS.18: Advancing the monitoring and assessment of climate change impacts on vegetation
L.3: Land Applications — Forest and Vegetation: Application and Modelling
This session will explore innovative methodologies and technologies for monitoring and assessing the impacts of climate change on vegetation physiology, dynamics, and structure. As climate variability intensifies, understanding how natural vegetation ecosystems respond is crucial for biodiversity conservation, agricultural productivity, and ecosystem services. Developments in remote sensing technology have opened up exciting new opportunities for vegetation monitoring. As we continue to explore these innovative tools, we can more effectively address the challenges posed by climate change and support the sustainability of vital ecosystems. In this session, we will discuss advancements in remote sensing techniques, including satellite imagery, drone technology, and ground-based observations, to enhance our capacity to detect changes in vegetation productivity, stress, distribution, and phenology. The session will also highlight interdisciplinary approaches that integrate ecological modelling and data analytics, fostering collaboration among geoscientists, vegetation ecologists, and remote sensing experts. By advancing our monitoring capabilities, we aim to provide critical insights into the susceptibility and resilience of vegetation in the face of climate change and weather extremes.
Ultimately, our goal is to support more effective conservation and management strategies that address the ongoing challenges posed by climate change, through advancements in remote sensing and geoscience. This topic is vital for geoscience and remote sensing, as it bridges the gap between technological innovation and ecological mechanisms, informing policy and action in an era of rapid environmental change and uncertainty. By enhancing our knowledge of these dynamics, we can better prepare for and mitigate the impacts of climate change on vital ecosystems, ensuring their sustainability for future generations.
CCS.19: AI Impact and On-board Hyperspectral Data Analytics
T/D.15: Data Analysis — Hyperspectral Data Processing and Analysis
We are standing at an important junction between developing better AI technology and providing sustainable solutions to humankind. The geoscience and remote sensing community has been leveraging AI/ML technology to build state-of-the-art solution frameworks for a wide range of application products. In particular, with the advent of greater computing power and access to public datasets, the hyperspectral research community has grown tremendously and put forward remarkable learning algorithms. However, there is a need to focus on on-board processing of hyperspectral information, where AI can play a crucial role in such resource-constrained scenarios.

The session invites innovative research and review papers focused on striking a strategic balance between AI-based technological development, which provides compact, fast, resource-efficient, and scalable learning algorithms for on-board data analytics of hyperspectral imagery, and its societal impact and sustainability. It includes the following major thrust areas (but is not limited to them):
1. Self-supervised methods for hyperspectral unmixing, classification, target detection
2. Limited-labelled data modeling for hyperspectral applications
3. Compact learning machines for on-board processing of hyperspectral data
4. Statistical modeling for simple, fast, resource-efficient processing of hyperspectral imagery
5. Multi-temporal data analysis of spaceborne hyperspectral imaging
6. AI-based hyperspectral 3D models 
7. Impact of AI in hyperspectral: review, challenges, opportunities, applications 
8. Societal impact of AI in hyperspectral: roles, responsibilities, counter measures, looking ahead
9. Generative AI for meeting demands of large dataset for hyperspectral analytics
10. Supervised, unsupervised, semi-supervised methods for on-board processing of hyperspectral data
11. Reinforcement learning for autonomy in on-board processing of hyperspectral imagery
CCS.20: ALOS Series Mission, Cal/Val, and Applications
S/M.1: Mission, Sensors and Calibration — Spaceborne SAR Missions
The Advanced Land Observing Satellite “Daichi” (ALOS) series of the Japan Aerospace Exploration Agency (JAXA) has been operated continuously since 2006; currently, ALOS-2 and ALOS-4 have been in operation since 2014 and 2024, respectively, performing precise observations. The L-band SAR mission of ALOS was succeeded by ALOS-2 and will be succeeded by ALOS-4, so tandem L-band SAR observations will be realized by ALOS-2 and ALOS-4. The ALOS optical mission was scheduled to be succeeded by ALOS-3, but the launch failed due to the failure of the H3 rocket Test Flight 1. Study of the next-generation high-resolution optical mission has been accelerated and is now underway within JAXA, with a launch expected in the near future; a follow-on mission to ALOS-4 is also under study within JAXA, and continuous, gap-free observations with L-band SAR are expected.
The main mission of the ALOS series is to contribute to the major research and application themes in the earth science and remote sensing fields: disaster monitoring and disaster prevention, updating national land and infrastructure information, and global forest and environmental monitoring.
In this Community-Contributed Session, we will discuss where we should focus regarding the mission, Cal/Val, science, and applications of the ALOS series. In addition, results achieved with ALOS and ALOS-2 will be presented, along with plans for their implementation in ALOS-4. Based on these results, we will also discuss the prospects for future ALOS series missions, including the continuity of the missions, the importance of international cooperation, and the advantages and disadvantages that should be reflected in future missions.
CCS.21: Applications of hyperspectral data for critical metals exploration
L.11: Land Applications — Geology and Geomorphology
The increased global demand for raw materials has led to heightened pressure on mining industries to re-evaluate brownfield sites—those previously used but now underutilised or abandoned—while simultaneously pursuing new opportunities through greenfield exploration programs. This theme focuses on the transformative impact of hyperspectral techniques and earth observation data in revolutionising ore-body knowledge and exploration discoveries. By presenting global case studies, the theme highlights how these advanced technologies are redefining our understanding of mineral deposits.
Hyperspectral imaging and earth observation data offer unprecedented insights into the composition and distribution of mineral resources. These techniques provide detailed spectral information that can reveal hidden or previously overlooked reserves, significantly improving the accuracy of resource assessments. This theme emphasises how integrating hyperspectral data into exploration strategies not only enhances the ability to identify new deposits but also refines the understanding of existing ones. By doing so, it contributes to more effective and targeted exploration efforts.
Furthermore, the theme illustrates how hyperspectral technologies can optimise exploration strategies, enabling more efficient and sustainable resource management. These advancements support better decision-making processes, reduce environmental impacts, and improve the overall sustainability of mining operations. By applying these techniques to both established brownfield sites and emerging greenfield areas, the session demonstrates their potential to address current challenges and opportunities in the mining industry.
In summary, this theme showcases the pivotal role of hyperspectral and earth observation technologies in advancing the field of mineral exploration and resource management. It underscores how these innovations can drive more efficient, sustainable, and responsible mining practices, ultimately contributing to the more effective utilisation of mineral resources in response to the growing global demand.
CCS.22: Applications of Remote Sensing in Urban Climate and Sustainability
L.6: Land Applications — Urban and Built Environment
The rapid urbanization of cities worldwide has introduced pressing challenges in managing urban climates and ensuring sustainable development. This session focuses on the critical role remote sensing technologies play in understanding, monitoring, and addressing the environmental and sustainability challenges faced by urban areas. In an era marked by big data and advanced observational systems, remote sensing provides a powerful tool for capturing high-resolution data across spatial and temporal scales, yet fully leveraging this potential requires sophisticated analytical techniques.

We welcome submissions from diverse fields, including remote sensing, urban climatology, environmental science, data science, artificial intelligence, and urban studies. The session aims to explore how remote sensing can be applied to monitor urban climate dynamics, assess environmental impacts, and promote sustainable practices in cities. Additionally, the session seeks to foster interdisciplinary collaboration to push the boundaries of urban climate research and sustainable planning in the context of evolving technologies and data sources.

Topics of interest include, but are not limited to:
- Remote sensing applications for urban heat island effect monitoring and mitigation,
- Satellite-based approaches for air quality and pollution assessments in urban areas,
- The use of remote sensing for sustainable urban planning and green infrastructure monitoring,
- Climate resilience and adaptation strategies using remote sensing data,
- Machine learning and AI techniques for analyzing urban sustainability from remote sensing datasets,
- Remote sensing for monitoring energy consumption and carbon footprint in cities,
- Integration of remote sensing with socio-economic and climate data for comprehensive urban sustainability assessments.
	
We also encourage discussions on the challenges and opportunities of integrating remote sensing with other environmental and socio-economic data sources to provide a more holistic view of urban climate challenges and sustainability efforts.
CCS.23: Applications of Very High Resolution X-Band SAR data
S/M.7: Mission, Sensors and Calibration — New Space Missions
We kindly request your consideration of our proposal for a community-contributed session at IGARSS 2025. This session will highlight research conducted by scientists using "Very High Resolution Synthetic Aperture Radar (SAR)" data.

Below is a brief overview of the proposed session:

New space SAR providers like Capella, Iceye, and Umbra have provided datasets to various researchers for a range of scientific studies. Last year, we organized a similar session that demonstrated how very high-resolution SAR imagery enhances the study of oceanography, geology, disaster management (such as in flood events), sea ice glaciology, and amplitude change detection through signal processing and machine learning techniques. 

For IGARSS 2025, we aim to showcase new studies that demonstrate the significant value of commercial high-resolution SAR data for the Earth observation and remote sensing community. The session will focus on two key tracks:
1. How small satellites and New Space SAR systems can enable advanced radar features such as Polarimetric SAR, Differential Interferometry, and other sophisticated techniques.
2. How the vast volume of open-source data from New Space SAR providers can be harnessed by the research community to train and implement cutting-edge algorithms, including those driven by machine learning and other innovative approaches.

These tracks will highlight the transformative potential of New Space SAR for advancing scientific research and operational applications.

Additionally, we hope to foster discussions on how commercial small SAR satellites can complement other systems such as TerraSAR-X, NISAR, Sentinel-1, and RADARSAT-2/RCM. The session will also explore the benefits of deploying SAR systems in mid-latitude inclined orbits or using small satellites in bistatic or multi-static configurations.

This session addresses many of the IGARSS 2025 themes, such as S/M.1: Spaceborne SAR Missions and S/M.7: New Space Missions; Earth science focus areas such as D/S.5: Risk and Disaster Management and C.3: Sea Ice; as well as methodological areas such as T/S.4: PolSAR and PolInSAR, T/S.2: Differential SAR Interferometry, and T/S.5: Bistatic SAR.
Show/Hide Description
CCS.24: Artificial Intelligence Advancements for Optimizing Soil Health Monitoring and Accessibility
L.8: Land Applications — Soils and Soil Moisture
Soil health is crucial for agricultural productivity, ecosystem health and biodiversity, land management, and climate change mitigation, aligning with numerous United Nations' Sustainable Development Goals. To safeguard this valuable resource, governments worldwide are enacting legislation that presupposes a robust monitoring system to continuously assess the health of soil ecosystems. This session aims to bring together experts, researchers, and practitioners to explore innovative applications of Earth Observation (EO) and remote sensing technologies in assessing and monitoring soil quality to support these objectives. 
The session will cover a range of key topics, focusing on advanced data analysis methods and algorithms for extracting valuable soil quality information from remote sensing data, as well as ways to simplify access to this information for the general public and relevant stakeholders. The transformative potential of artificial intelligence (AI) and big data in improving the accuracy, efficiency, and scalability of soil quality monitoring will be discussed, along with real-world case studies demonstrating remote sensing applications in various land use scenarios. Leading experts will offer insights into the use of multi- and hyperspectral data from airborne and spaceborne sensors for soil monitoring, as well as sensor integration, combining optical sensors with Radar and LiDAR for comprehensive assessments. Additionally, the session will highlight novel machine learning approaches, including synthetic bare soil composite generation, semi-supervised learning, and transformer-based unsupervised learning, all used for topsoil quality monitoring. Lastly, the session will also explore interpretable AI techniques that enhance understanding of the reasoning behind model predictions, as well as emerging trends such as generative AI and unsupervised methods. These advancements aim to support better predictive performance and broaden access for non-expert users, making soil quality information more accessible and actionable.
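As a concrete, hedged illustration of one of the topics above, the following minimal NumPy sketch shows a common way synthetic bare-soil composites are built: vegetated observations in a reflectance time series are masked with an NDVI threshold and the remaining values are composited per pixel. The array shapes, band ordering, and the 0.25 threshold are illustrative assumptions, not the method of any particular contribution to this session.

# Minimal sketch (not the session authors' method): build a synthetic bare-soil
# composite by masking vegetated observations with an NDVI threshold and taking
# a per-pixel median over time. Shapes, band indices, and thresholds are illustrative.
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stack: (time, band, row, col) surface reflectance in [0, 1];
# band 0 = red, band 1 = near-infrared.
stack = rng.uniform(0.05, 0.45, size=(12, 2, 50, 50))

red, nir = stack[:, 0], stack[:, 1]
ndvi = (nir - red) / (nir + red + 1e-6)

# Keep only observations that look like bare soil (low NDVI).
bare = np.where(ndvi[:, None, :, :] < 0.25, stack, np.nan)

# Per-pixel, per-band median over time yields the bare-soil composite.
composite = np.nanmedian(bare, axis=0)             # shape: (band, row, col)
coverage = np.mean(~np.isnan(bare[:, 0]), axis=0)  # fraction of usable dates per pixel

print(composite.shape, float(coverage.mean()))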
The session is expected to foster collaboration among researchers, practitioners, and policymakers to further the application of remote sensing in soil quality monitoring. Participants will gain valuable insights into the latest advancements in remote sensing techniques and their potential to support sustainable agriculture. Additionally, the session will serve as a platform for networking and knowledge exchange, encouraging future research and project collaborations. 
Show/Hide Description
CCS.25: Beyond Ice: NASA’s ICESat-2 spaceborne lidar mission for land and vegetation applications
S/M.5: Mission, Sensors and Calibration — Spaceborne LIDAR Missions
Launched in 2018, NASA’s ICESat-2 mission has been successfully collecting laser altimetry data over terrestrial ecosystems. The ATL08 Land and Vegetation data product is the most downloaded product from the National Snow and Ice Data Center (NSIDC). This session, led by members of the ICESat-2 Science Team, aims to showcase the latest research and applications of ICESat-2 technology for mapping terrain and vegetation structure. Presentations will delve into key topics including ICESat-2 data products, forest structure, biomass estimation, hydrology, disturbance detection, and gridded map products, highlighting the invaluable contributions of ICESat-2 in advancing our understanding of terrestrial ecosystems. Moreover, we aim to provide an overview of the NASA ICESat-2 mission, including its instrumentation, data collection, mission objectives, and data validation, with a particular emphasis on its relevance to land and vegetation applications.

Session Topics:

i) Forest Structure and Biomass Assessment with ICESat-2
ii) Disturbance Detection and Monitoring with ICESat-2 (e.g., fire, hurricanes, droughts)
iii) Synergies of ICESat-2 with Existing and Upcoming Earth Observation Missions for Land and Vegetation Applications (e.g., NASA’s GEDI, NISAR, SWOT, ESA’s Biomass)
iv) Gridded maps of terrain and/or vegetation parameters
v) Validation of ICESat-2 products over land and vegetation 
vi) Open Science - Tools for ICESat-2 Data Processing
Show/Hide Description
CCS.26: Calibration and Validation of Space-Based Imaging Spectrometers during a Growth Era of Hyperspectral Data Sources
S/M.4: Mission, Sensors and Calibration — Spaceborne Hyperspectral Missions
Earth observation is experiencing a growth period for spaceborne imaging spectrometers (hyperspectral imagers). National space agencies and private companies are providing reflectance products from these new sensors at a variety of spectral and spatial resolutions, for example NASA's EMIT (Earth Surface Mineral Dust Source Investigation) at 60 m ground sampling distance (GSD) and DLR's EnMAP (Environmental Mapping and Analysis Program) at 30 m GSD. With proper spectral, radiometric, and spatial calibration and post-launch evaluation, data from multiple sources might be used collectively to support scientific studies and extend applications over larger areas and at greater temporal frequency. This session will provide a forum for commercial, government, and academic organizations and other groups to share their mission-focused efforts and results in advancing imaging spectrometer calibration and validation (cal/val) of products. This session focuses on cal/val of radiance and reflectance products which are the basic geophysical data and the level of data interpreted for surface material composition and chemistry, respectively. This session also provides an opportunity to discuss international collaborations and multi-mission data comparison. Leveraging of hyperspectral data with multispectral data is a topic of interest, including presentations on advanced multispectral missions, such as the SBG-TIR (Surface Biology and Geology Thermal InfraRed) and Landsat Next. Airborne and field campaigns often collect fine scale data for cal/val; thus, contributions which show applications of airborne and field data to cal/val of one or more spaceborne imaging spectrometers are sought, in particular, those that focus on underserved communities. Discussion of engagement of the IEEE GRSS Geoscience Spaceborne Imaging Spectroscopy (GSIS) technical committee and working groups is highly encouraged.
Show/Hide Description
CCS.27: Challenges and Opportunities for leveraging FAIR Remote Sensing Data Systems
T/A.19: AI and Big Data — Data Management Systems and Computing Platforms in Remote Sensing
The exponential increase in Earth observation data necessitates systems that are not only scalable but also support FAIR (Findable, Accessible, Interoperable, and Reusable) principles and Open Science initiatives. This requires a multi-faceted approach, encompassing:

- Enhanced data search, discovery, and interoperability through the integration of semantics, ontologies, knowledge graphs, and emerging technologies like Generative AI and intelligent agents.
- Advancements in cloud-optimized data systems and products to improve accessibility, reusability, and scalability for the growing volume of remote sensing data.
- Innovative data storage and access frameworks designed to support geospatial analysis and machine learning workflows efficiently.
- Robust data provenance and lineage tracking through blockchain and related technologies to ensure trust and transparency in data handling.
- Showcasing the advantages of open data cube architectures and Analysis Ready Data (ARD) standards for streamlined interoperability, simplified data access, and accelerated analysis across a wider user base.
- Exploration of novel machine learning techniques, including super-resolution, federated learning, geospatial foundation models, and explainable AI, to enhance the utilization of remote sensing data and expand application possibilities.
- Development of interactive dashboards that leverage next-generation data systems to increase user engagement and maximize the value derived from remote sensing data.

These advancements will streamline data access, improve searchability, and manage the increasing volume and velocity of remote sensing data. By fostering broader data utilization and reuse, these efforts will benefit the remote sensing community, citizen scientists, and machine learning researchers. This session invites presentations that explore these and related technologies, showcasing their contributions to building more robust and accessible remote sensing data systems.
Show/Hide Description
CCS.28: Challenges and opportunities for Polar Earth Observation
C.2: Cryosphere — Ice Sheets and Glaciers
Remote sensing is critical for monitoring change across the remote land and ocean regions at the Earth’s poles. Earth observations have revealed rapid and concerning changes in glacier extent, ice sheet mass, and sea ice extent and thickness over recent decades. This includes the recent, unexpected decline of Antarctic sea-ice extent observed by satellites over the past two austral winters, and the continued decline in Arctic sea ice extent. Changes at the poles are impacting sensitive terrestrial and marine ecosystems and have far-reaching consequences for global climate, sea level rise, ocean circulation, carbon cycling and human populations that depend on these environments.

Remote sensing of the polar regions is more challenging than elsewhere. Image processing techniques require adaptation to low solar elevation angles, sensor oversaturation from ice and snow, unique atmospheric conditions, and polar projections. Furthermore, polar efforts can be disparate across nations, which makes it difficult to maximise opportunities to work at larger and continental scales. The inaccessibility of polar environments requires new and innovative remote sensing technologies to better observe, monitor and manage their rapid change.

This session invites presentations that: 
* demonstrate successful applications of remote sensing to polar environments 
* include the development of algorithms for repeatable large-scale analysis 
* detail the unique challenges experienced in the use of polar data 
* showcase innovative solutions and opportunities to progress polar remote sensing challenges, increasing data quality, accessibility and usability of data 
* highlight opportunities to maximise the integration of remote sensing data across multiple and new sensors 
* showcase new remote sensing technologies and upcoming missions and priorities for polar regions. 
Show/Hide Description
CCS.29: Challenges for Ocean Remote Sensing in the UN Decade of Ocean Science for Sustainable Development
SP.3: Special Scientific Themes — Remote sensing for sustainable development in the Asia-Pacific region
The UN Decade of Ocean Science for Sustainable Development (2021-2030) (DOS) aims to transform the way that ocean science and knowledge are generated and used. An Ocean Decade Conference, held in Barcelona in April 2024, together with other inputs, resulted in a UNESCO report that is effectively a midway progress report. The contributions of remote sensing are recognised in the DOS, and the identified gaps include monitoring marine pollution, water quality and biodiversity. The report points to gaps in understanding the land–ocean continuum and the need to adapt to coastal and ocean hazards.

This special session invites papers which address gaps in ocean science using remote sensing methods, and in particular, issues identified at the midway point of the UN DOS. 

UNESCO-IOC (2024). Ambition, Action, Impact: The Ocean Decade Pathway to 2030. Consolidated Outcomes of the Vision 2030 Process. UNESCO, Paris. (The Ocean Decade Series, 50)
Show/Hide Description
CCS.30: Close-range sensing of the environment
L.1: Land Applications — Land Use Applications
Over the past decade, near-surface geophysics and close-range remote sensing have become essential tools for environmental and urban monitoring. Researchers have invested considerable effort in digitally reconstructing built environments and forestry elements, including buildings, infrastructure, and trees. The resulting information models—such as Geographic Information Systems (GIS), Building Information Modeling (BIM), and Computer-Aided Design (CAD)—have gained growing importance in various applications, including urban planning, disaster management, sustainability assessment, and forest monitoring.
With the fast development of machine learning and deep learning, current analytical toolkits, spanning spatial to intelligent analysis, offer potential for deep multi-data integration, enhancing outcomes and enabling more efficient techniques for land cover and land use detection. However, in near-surface geophysics and close-range remote sensing, this integration remains at a basic level, often involving just pixel-by-pixel composition or simple analysis.
Typically, close-range sensing technologies like laser scanning and photogrammetry are harnessed for digitizing manmade objects. This necessitates the unsupervised interpretation of complex scenes and the automated extraction of parameters from a diverse array of domain-specific objects, such as heritage sites, structures, and individual trees. In the last few years, there has been intense research activity towards the automation of this process. However, there is still important work to be carried out involving (i) the collection and processing of close-range sensing data, (ii) scene interpretation, including semantic segmentation and object detection, and (iii) parameter extraction for the final information models.
This session will present new technologies and methodologies that target the above objectives. We welcome submissions that cover, but are not limited to, the following:

Close-range sensing systems;
Geometric evaluation of mapping systems;
Close-range sensing data structures and models;
Innovative solutions for close-range sensing data interpretation;
Scene interpretation including semantic segmentation, classification, and object detection;
Remote sensing data processing in information models such as BIM and GIS.

List of Contributions:

Prof. Naoto Yokoya, The University of Tokyo, https://naotoyokoya.com/

Prof. Wataru Takeuchi, The University of Tokyo, https://wtlab.iis.u-tokyo.ac.jp/en/index_e.html

Assoz. Prof. Mag. Dr. Martin Rutzinger, University of Innsbruck, Martin.Rutzinger@uibk.ac.at

Associate Professor Francesco Pirotti, University of Padova

Roberto Pierdicca, Università Politecnica delle Marche, r.pierdicca@staff.univpm.it

Show/Hide Description
CCS.31: Data Assimilation of Remote Sensing Data
M.2: Atmosphere Applications — Numerical Weather Prediction and Data Assimilation
Land surface processes play an important role in the earth system because all the physical, biochemical, and ecological processes occurring in the soil, vegetation, and hydrosphere influence the mass and energy exchanges during land–atmosphere interactions. Data assimilation (DA), through optimally combining both dynamical and physical mechanisms with real-time observations, can effectively reduce the estimation uncertainties caused by spatially and temporally sparse observations and poor observed data accuracy.

In recent decades, studies of land data assimilation have become very active, although this topic emerged later than the assimilation of atmospheric observations. Land data assimilation can incorporate both in situ observations and remotely sensed data, such as satellite observations of soil moisture, snow water equivalent (SWE), and land surface temperature, to constrain the physical parametrization and initialization of the land surface state. Notably, globally available satellite-derived data provide the observational basis for land DA.

Recent studies have shown that assimilating observed or remotely sensed data into land surface models to constrain the vegetation characteristics can improve the simulation ability for terrestrial flux exchanges. Most studies focusing on assimilation in terrestrial systems have tended to add multiple phenological observations to constrain and predict biome variables and further improve model performance. Joint assimilation of surface incident solar radiation, soil moisture, and vegetation dynamics (LAI) into land surface models or crop models is of great importance since it can improve the model results for national food policy and security assessments.

This session focuses on successful applications of remotely sensed terrestrial data (including vegetation parameters, land surface temperature, soil moisture, snow, etc.) in land surface models, coupled atmospheric models, and even Earth system forecasting systems to improve model performance. We welcome advances in land data assimilation that bring remotely sensed terrestrial data into land models, coupled atmospheric models, and Earth system models; studies comparing variational algorithms (such as 3DVAR and 4DVAR), sequential algorithms (such as the ensemble Kalman filter (EnKF), the ensemble adjustment Kalman filter (EAKF), and the particle filter (PF)), and their combinations (such as 4DEnKF and DrEnKF) to identify optimal assimilation algorithms; and work that identifies the key physical and biochemical processes and intermediate variables in land-atmosphere interaction and reveals the potential predictability of the coupled climate system.
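For orientation, the two algorithm families mentioned above can be written in generic textbook form; the notation below is standard and not tied to any particular system. The variational approach minimizes a cost function

J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b) + \tfrac{1}{2}\big(\mathbf{y}-H(\mathbf{x})\big)^{\mathsf{T}}\mathbf{R}^{-1}\big(\mathbf{y}-H(\mathbf{x})\big),

while the stochastic EnKF updates each ensemble member as

\mathbf{x}_i^{a} = \mathbf{x}_i^{f} + \mathbf{K}\big(\mathbf{y}_i - \mathbf{H}\mathbf{x}_i^{f}\big), \qquad \mathbf{K} = \mathbf{P}^{f}\mathbf{H}^{\mathsf{T}}\big(\mathbf{H}\mathbf{P}^{f}\mathbf{H}^{\mathsf{T}} + \mathbf{R}\big)^{-1},

where \mathbf{x}_b is the background state, \mathbf{B} and \mathbf{R} are the background and observation error covariances, H (\mathbf{H} when linearized) is the observation operator, \mathbf{y} the observations (perturbed as \mathbf{y}_i in the stochastic EnKF), and \mathbf{P}^{f} the forecast ensemble covariance.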

This session invites researchers to share novel data assimilation work that uses remotely sensed terrestrial data to improve model predictions and to gain insight into the causes and mechanisms of atmosphere-land interaction.
Show/Hide Description
CCS.32: Data-centric Artificial Intelligence for Geospatial Applications
T/A.19: AI and Big Data — Data Management Systems and Computing Platforms in Remote Sensing
Building on last year’s success, this session continues exploring data-centric AI’s role in geoscience applications. With AI traditionally focusing on model improvements, there's a growing need to center efforts around data quality and management throughout the AI process. Data-centric AI, which prioritizes the handling and curation of data, ensures AI systems are driven by high-quality data, especially in fields like Earth science that rely on large datasets from satellites and models.

This approach is particularly relevant for geospatial AI as missions grow in complexity, making it essential to extract more value from the data. While progress has been slower in this field, a strong focus on data is critical to optimizing these expensive Earth observation missions.

Relevance

Incorporating data-centric AI into geospatial applications addresses current gaps in handling the vast amounts of data produced by Earth observation technologies. This session invites researchers and practitioners to share methods for improving data governance, managing bias in data-driven models, advancing data engineering, and ensuring better outcomes in geospatial AI applications.

This session will feature discussions on theory, methodology, use cases, and applications from areas including, but not limited to:

1. Techniques to enhance the quality of datasets for geospatial AI.
2. Governance strategies for managing Earth observation data for AI effectively.
3. Approaches to managing bias and variance in geospatial data for AI.
4. Tools that improve the quality and accuracy of data used in geospatial AI.
5. New benchmark datasets including assessments and evaluations.

We welcome all contributions that are relevant to the geospatial AI community, with a focus on developing, iterating, and maintaining data to drive its advancement and help build better geospatial AI models.

Show/Hide Description
CCS.33: Datasets and Benchmarking for Rapid Disaster Response: Multimodal Approaches for Post-Disaster Damaged Building Detection
SP.1: Special Scientific Themes — Natural disasters and disaster management
In the wake of natural disasters like earthquakes, hurricanes, and tsunamis, the collapse of buildings is often the leading cause of fatalities. The ability to swiftly detect and assess damaged structures is critical for saving lives and coordinating recovery efforts. Remote sensing technologies, particularly those using high-resolution optical and Synthetic Aperture Radar (SAR) satellite imagery, have proven to be essential tools in post-disaster scenarios. While many studies have developed algorithms for identifying damaged buildings, leveraging both pre- and post-event imagery or post-event data alone, much of the research remains event-specific, limiting the broader applicability of these methods.
Several datasets have been developed to address these challenges. Notable examples include xBD for satellite imagery-based building damage assessment [1], SpaceNet 8 for flooded buildings and roads [2], MEDIC for disaster image classification using social media data [3], RescueNet for UAV-based damage assessment [4], and QuickQuakeBuildings, a dataset combining optical and SAR satellite imagery for earthquake damage detection [5]. These resources are paving the way for more robust algorithm development, but there's still much to be done. The increasing interest in multimodal data for disaster response highlights the importance of integrating diverse data sources to enhance detection accuracy and response times.
A key challenge is that many disaster datasets, though valuable, are not widely known or utilized across the numerous sectors involved in disaster management. Equally important is understanding what kinds of datasets are truly needed for rapid response. Are the current efforts from the remote sensing and GIS communities meeting these needs, or is there more ground to cover?
This session provides a platform for experts across disciplines to engage in discussions about multimodal disaster datasets and their critical role in improving post-disaster response. By fostering collaboration between researchers, emergency responders, and data providers, we aim to drive the development of standardized benchmarks and more accessible datasets. Ultimately, this will enable the creation of algorithms that can be swiftly deployed when disaster strikes, ensuring that detection and response efforts are as effective as possible.

References:
[1] Gupta, Ritwik, et al. "Creating xBD: A dataset for assessing building damage from satellite imagery." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops. 2019.
[2] Hänsch, Ronny, et al. "SpaceNet 8 - the detection of flooded roads and buildings." Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition. 2022.
[3] Alam, Firoj, et al. "MEDIC: a multi-task learning dataset for disaster image classification." Neural Computing and Applications 35.3 (2023): 2609-2632.
[4] Rahnemoonfar, Maryam, Tashnim Chowdhury, and Robin Murphy. "RescueNet: a high resolution UAV semantic segmentation dataset for natural disaster damage assessment." Scientific Data 10.1 (2023): 913.
[5] Sun, Yao, Yi Wang, and Michael Eineder. "QuickQuakeBuildings: Post-earthquake SAR-Optical Dataset for Quick Damaged-building Detection." IEEE Geoscience and Remote Sensing Letters (2024).
Show/Hide Description
CCS.34: Datasets and evaluation protocols for benchmarking remote sensing algorithms
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
In an era of escalating environmental challenges, from climate change and deforestation to urbanisation and biodiversity loss, the call for effective solutions has never been more urgent. As remote sensing and machine learning applications expand across disciplines, the need for comprehensive datasets and standardised evaluation protocols becomes increasingly important. Well-curated datasets not only facilitate the development of more effective algorithms, but also increase the comparability and transparency of research results.
Building on discussions from previous IGARSS conferences, the session invites researchers, practitioners and stakeholders to come together to share insights, experiences and best practices in the development and use of Earth observation datasets and evaluation protocols for algorithm benchmarking and validation.
This year's theme, "One Earth", highlights the need for collaborative approaches to address environmental challenges. As such, this CCS will focus on fostering dialogue on the collective efforts required to curate and share remote sensing datasets that reflect different geographical regions and different environmental contexts. Participants are encouraged to present case studies that showcase successful collaborative initiatives, share lessons learned, and discuss the role of data sharing in promoting sustainability.
In addition to exploring existing datasets, the session will address the critical aspects of benchmarking methodologies. It will provide a platform to discuss evaluation protocols that can be adopted across the remote sensing community to ensure consistency and reliability. Topics of interest will include:

1. Large-scale datasets: highlighting existing datasets or presenting new strategies for collecting, annotating and curating remote sensing datasets, addressing large coverage and specific environmental phenomena. 

2. Standardised evaluation protocols: the session will present frameworks and best practices for the evaluation of remote sensing algorithms and discuss how these protocols can improve the robustness of algorithm performance assessments and promote reproducibility in research.

3. Impact of emerging methodologies: the role of machine learning and artificial intelligence in advancing remote sensing analysis and data fusion to enable the combined use of different available constellations for improved accuracy, coverage and temporal response. 
Show/Hide Description
CCS.35: Deep learning and remote sensing for rapid disaster response
SP.1: Special Scientific Themes — Natural disasters and disaster management
Every year, millions of people around the world are affected by natural and man-made disasters. Floods, heat waves, droughts, wildfires, tropical cyclones and tornadoes are causing increasing damage. In addition, civil wars and regional conflicts in different parts of the world lead to growing numbers of refugees and major changes in population dynamics. Emergency responders and aid organizations need timely, comprehensive and accurate information about the extent of hazards, exposed assets and damage in order to respond quickly and effectively.

For decades, emergency mapping has used remotely sensed data to support rescue operations. However, providing information in a rapid, scalable and reliable manner remains a major challenge, as it often relies on manual interpretation. Recent advances in machine learning have opened up new possibilities for automating the analysis of remote sensing data to cope with the increasing volume and complexity of data, as well as the inherent spatio-temporal dynamics of disaster situations.

In this session, we aim to provide a platform for research groups to present their latest research activities aimed at addressing the problem of automatic, rapid, large-scale and accurate information retrieval from remotely sensed data to support disaster response. More specifically, the focus lies on:
•	Deep learning-based approaches for extracting relevant information on hazard extent (e.g. flood mapping), exposed assets (e.g. road condition assessment) and impacts (e.g. building damage assessment) acquired from satellite, airborne or drone sensors. 
•	On-board deep learning-based approaches for infrastructure or damage detection from airborne platforms.
•	Challenges and limitations of current AI methods in the context of disaster response. 
•	New public benchmark datasets for disaster response, for the training, evaluation, and comparison of AI models.

Show/Hide Description
CCS.36: Deep Learning for Multi-channel SAR processing: current state of the art and future trends in the application of DL for three-dimensional reconstruction
T/S.1: SAR Imaging and Processing Techniques — Interferometry: Along and Across
Thanks to its coherent nature and its ability to function in all weather conditions, Synthetic Aperture Radar (SAR) has become one of the most important tools for Earth Observation in recent decades. The development of new sensors and acquisition campaigns has significantly increased both the volume and availability of data. In particular, multi-channel SAR data are crucial for monitoring ecosystems and urban areas. The spatial and temporal diversity provided by multi-baseline and multi-temporal acquisitions offers a vast amount of information, facilitating the creation of applications for monitoring natural and built environments.

After more than two decades of dedicated research and experimentation, multi-channel SAR processing techniques (such as InSAR, PolInSAR, TomoSAR, and PolTomoSAR) have reached a high level of maturity and operational readiness. Various research teams have demonstrated their effectiveness in different application domains. A key achievement has been the improvement of height resolution, which relies on advanced digital signal processing techniques to process stacks of SAR images taken from different angles or over different time periods.
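For readers new to the topic, the multi-baseline (TomoSAR) measurement model is commonly written, up to a sign convention and for repeat-pass geometry, as

g_n = \int_{\Delta s} \gamma(s)\,\exp\!\Big(-j\,\frac{4\pi}{\lambda r}\, b_{\perp,n}\, s\Big)\, \mathrm{d}s, \qquad n = 1,\dots,N,

where g_n is the focused complex pixel value in the n-th acquisition, \gamma(s) the reflectivity profile along elevation s, b_{\perp,n} the perpendicular baseline, \lambda the wavelength, and r the slant range; the achievable Rayleigh elevation resolution scales as \rho_s \approx \lambda r / (2\,\Delta b_{\perp}), with \Delta b_{\perp} the total baseline aperture. This generic form is included only for orientation and does not follow the notation of any specific mission or paper.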

Other notable advancements include the ability to analyze the internal structure of natural environments, such as forests and snow, through extensive airborne and ground-based studies. Multi-channel SAR processing has also proven highly effective for tracking deformation and detecting multiple Persistent Scatterers (PS) using satellite-based SAR sensors.

The future of multi-channel SAR processing is highly promising. On one hand, the focus is on developing more advanced and high-performance SAR satellites, with several upcoming spaceborne missions planned—particularly those using low-frequency sensors, such as BIOMASS, NISAR-ISRO, and Tandem-L. These missions are expected to significantly improve height resolution capabilities for natural environments from space. On the other hand, despite these advancements, tomographic processing still requires complex pre- and post-processing steps. Challenges such as noise, temporal decorrelation in SAR data stacks, and baseline distribution continue to limit the resolution and accuracy of 3D imaging and scatterer detection in multi-channel SAR applications.

Considering the success of artificial intelligence (AI) and deep learning in various areas of SAR data processing, there have been recent efforts to develop AI-based methods specifically for multi-channel SAR. Although the initial results are encouraging, several challenges remain, including the lack of clear ground truth data, the search for optimal AI architectures, and the integration of domain-specific knowledge into AI frameworks.

This session will not only present the current state of space-based tomography but also focus on the potential of upcoming spaceborne missions and AI advancements in multi-channel SAR. The goal is to encourage an open discussion on the findings, addressing the aforementioned challenges and identifying promising areas for future research, especially regarding AI and upcoming SAR missions.
Show/Hide Description
CCS.37: Detection and tracking of marine animals using spaceborne remote sensing data
T/D.11: Data Analysis — Object Detection and Recognition
IEEE GRSS is an international professional society that seeks to engage students and young professionals in contributing to the solution of complex engineering problems within the scope of the Student Grand Challenges. IEEE GRSS has sponsored three previous Student Grand Challenges on remote sensing with drones or remotely piloted aircraft systems, nanosatellite payloads, and marine plastic litter monitoring. In this fourth challenge, student teams will develop image processing and classification techniques for the detection and tracking of whales using spaceborne remote sensing data. Such techniques are important for assessing the populations and travel habits of these mammals, which are endangered by human activities, and represent a tangible contribution of IEEE GRSS to society, supporting informed decision-making.

Show/Hide Description
CCS.38: Disaster Remote Sensing: Algorithms and Applications
D/S.5: Societal Engagement and Impacts — Risk and Disaster Management (Extreme Weather, Earthquakes, Volcanoes, etc)
Enhanced monitoring capabilities have allowed the analysis of numerous global disasters attributed to climate change. Remote sensing (RS), with its macroscopic, rapid, and accurate Earth observation (EO) advantages, is crucial in disaster prevention and management. To provide an overview of RS technologies in disaster prevention and management, an integrated and systematic session on disaster RS applications is proposed. This session will cover sudden-onset disasters such as floods, wildfires, geohazards, snow and ice disasters, and marine disasters, as well as slow-onset disasters such as droughts, air pollution, and biological disasters. It outlines current applications of ultraviolet, visible-light, infrared, microwave, and multi-source RS in areas such as potential hazard identification, monitoring and early warning, emergency response, and recovery and reconstruction. Furthermore, the session will highlight the crucial role of artificial intelligence (AI) methods such as deep learning, ensemble learning, and transfer learning in disaster management. This integrated session will provide guidance for choosing suitable RS and intelligent processing methods in disaster scenarios, helping to develop targeted and adaptable prevention and mitigation strategies that support the United Nations (UN) Sustainable Development Goals. The session brings together scientists from different areas of research who are interested in the latest algorithmic developments or who have implemented similar approaches under alternative scenarios in a remote sensing and GIS context. It will offer presentations from local, national, and international sources and will enable attendees to network with others working in similar fields.
Show/Hide Description
CCS.39: Distributed Spaceborne SAR Systems, Algorithms, and Applications
T/S.5: SAR Imaging and Processing Techniques — Bistatic SAR
This session covers the latest research on distributed spaceborne Synthetic Aperture Radar (SAR) systems, algorithms, and applications, especially important progress in bistatic/multi-static and formation-flying spaceborne SARs.
Spaceborne SAR has become an indispensable means of Earth information acquisition in fields such as remote sensing surveying and mapping, and disaster prevention and mitigation. Nevertheless, a single satellite has proven insufficient to meet users’ requirements for timely observation and high-accuracy measurement of land, ocean, and atmosphere.
With the development of aerospace and electronic technology, the size, weight, and cost of satellites are decreasing while the capability of their payloads is improving; as a result, distributed spaceborne SAR systems are coming into focus.
Distributed spaceborne SARs offer the advantages of shorter revisit times, larger imaging coverage, and a wider remote sensing application scope, due to their system flexibility and inter-satellite collaboration. Diverse configurations of distributed spaceborne SAR have been proposed to achieve different observation advantages. Loosely coupled configurations, such as the ICEYE constellations and the Capella Space constellation, composed of identical payloads distributed on different orbital planes or phases, can shorten the revisit time for observing specific targets. Closely coupled formations, represented by the German Aerospace Center (DLR) TanDEM-X and the Chinese LuTan-1, are generally used for high-accuracy topographic mapping and urban building tomography. Some relatively close formation-flying concepts have also been proposed for image resolution enhancement and speckle reduction, of which SwarmSAR and the Advanced Radar Geosynchronous Observation System (ARGOS) are typical examples. Another innovative configuration is the multi-angle formation, such as the European Space Agency (ESA) Earth Explorer 10 (EE-10) candidate mission Harmony, which will play an important role in studying the dynamic changes between different geospheres. Moreover, the spaceborne-airborne/-ground bistatic SAR configuration further extends spaceborne SAR's capability, such as providing more information about scattering properties and achieving forward-looking SAR imaging. Therefore, distributed spaceborne SARs have great potential to improve performance across a wide range of geoscience and remote sensing applications.
With the theme of distributed spaceborne SARs, this session covers broad topics including but not limited to system design and analysis, imaging algorithms, related applications such as topographic mapping, deformation measurements, tomography, moving target detection, etc.
Show/Hide Description
CCS.40: Earth Observation data and Smart Early Warning based Decision Support Systems (DSS) for Disaster Management and Risk Reduction
SP.1: Special Scientific Themes — Natural disasters and disaster management
In recent years, the frequency and intensity of natural disasters have increased, leading to devastating socio-economic impacts on communities globally. As climate change amplifies these risks, there is an urgent need for advanced, data-driven approaches to disaster management and risk reduction. This special session on "Earth Observation Data and Smart Early Warning-based Decision Support Systems (DSS) for Disaster Management and Risk Reduction" aims to explore the transformative potential of remote sensing and geospatial technology in early warning, risk assessment, and decision-making processes. By integrating multi-source Earth observation (EO) data, this session will focus on the development of intelligent DSS that empower communities and decision-makers with actionable insights to mitigate disaster risks effectively.

Scope and Description
The session invites contributions that leverage EO data, such as Synthetic Aperture Radar (SAR), optical remote sensing, and LiDAR, to develop models that predict, assess, and manage disaster risks. Earth observation provides high-resolution, multi-temporal datasets that enable monitoring of environmental conditions, detecting anomalies, and assessing vulnerabilities in areas susceptible to natural hazards like floods, landslides, earthquakes, and droughts. When coupled with machine learning, artificial intelligence, and predictive analytics, these datasets enhance the capabilities of DSS, providing timely and location-specific warnings.

Key areas of focus will include:

Multi-sensor Data Integration: Approaches to integrating SAR, optical, and multispectral data for accurate, real-time environmental monitoring and disaster mapping.
AI-driven Predictive Models: Development and application of AI/ML models to forecast and analyze disaster risks based on historical EO data patterns.
Cloud-based DSS Platforms: Insights into scalable DSS platforms that deliver real-time information to stakeholders through accessible and user-friendly interfaces.
Community-Centric Early Warning Systems: Case studies on the implementation and effectiveness of early warning systems (EWS) tailored to local socio-economic conditions, especially in vulnerable communities.
Importance to Geoscience and Remote Sensing
This session is highly relevant to geoscience and remote sensing because it underscores the role of EO data in understanding and mitigating environmental hazards. The accurate monitoring capabilities of remote sensing technology allow for detailed assessments of dynamic natural processes, helping scientists and policymakers to predict the onset of disasters and plan accordingly. By focusing on the integration of advanced DSS with EO data, the session will highlight innovations that make disaster response more responsive, reducing the reliance on traditional post-disaster interventions. These developments not only provide life-saving warnings but also contribute to resilience-building in communities at risk.
Show/Hide Description
CCS.41: Earth Observation for Enhancing Water Resource Sustainability in Terrestrial Ecosystems
D/S.7: Societal Engagement and Impacts — Remote Sensing for Sustainable Development
As global warming and drying intensify, water resources are becoming increasingly important for plant growth in terrestrial ecosystems. Water availability regulates the fundamental physiological processes of plants, including transpiration and photosynthesis, which directly affect the ecological, economic, and social benefits terrestrial ecosystems provide. Therefore, accurate and timely monitoring of water resources, along with a comprehensive understanding of their interplay with plants, is crucial for mitigating the impacts of climate change on terrestrial ecosystems and promoting a sustainable society.

The improving availability and sophistication of Earth observation (EO) data offer unprecedented opportunities to monitor, manage, and preserve water resources in terrestrial ecosystems. Numerous satellite missions have been dedicated to monitoring different components of the water cycle, such as Landsat, MODIS, and ECOSTRESS for evapotranspiration, SMOS and SMAP for soil moisture, GRACE and GRACE-FO for groundwater, GPM for precipitation, and SWOT for surface water. With the advancement in satellite EO technologies, including improved spatial and temporal resolutions, new process-based and data-driven models (e.g., machine learning) along with innovative applications have been developed to fully harness the wealth of these observations, contributing to more effective and sustainable water resource management. 

This 2025 IGARSS session focuses on 1) synthesizing the latest advancements in satellite measurements and methods for monitoring water resources in terrestrial ecosystems using EO data, and 2) deepening the understanding of interactions between the components of the hydrologic cycle and between water resources and plants. The session will cover a wide range of topics centered around water resource sustainability in terrestrial ecosystems, including 1) current and future high-resolution thermal missions and their role in water resources management, 2) applications and improvements in surface energy balance modeling, 3) downscaling of soil moisture products, 4) groundwater data modeling and assimilation in hydrological modeling, and 5) ecohydrological applications (e.g., Groundwater Dependent Ecosystems), among others.

Keywords: 
Satellite Remote Sensing, Evapotranspiration, Soil moisture, Groundwater, Precipitation, Ecohydrology
Show/Hide Description
CCS.42: Earth Observation Foundation Models
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
EO Foundation Models: Unlocking New Horizons in Geoscience and Remote Sensing.

The increasing volume of data in Earth Observation (EO) necessitates the development of sophisticated analytical frameworks. Foundation models (FMs) have emerged as powerful tools capable of harnessing vast, often unlabeled EO datasets. By learning generalized representations from extensive datasets, these models have the potential to transform our understanding of complex Earth systems and significantly enhance predictive accuracy for various environmental phenomena.

The proposed session will focus on the application and advancement of EO foundation models, placing a strong emphasis on their design features, methodologies, and implications for the field. Presentations will encompass the essential characteristics of an ideal EO FM, which include multi-modality, scale awareness, and spatio-temporal modeling. Attention will also be given to addressing domain shifts in both space and time, ensuring the physical consistency of models, and facilitating task-agnostic functionality.

Furthermore, the session aims to explore methodological advancements in self-supervised learning techniques that enable FMs to efficiently analyze extensive datasets and to develop downstream applications, even in scenarios where labeled data is scarce or noisy. The integration of various data modalities, including multispectral, Synthetic Aperture Radar (SAR), and hyperspectral satellite imagery, along with meteorological and ground-based measurements, will also be explored.

This session invites researchers to present their work on EO foundation models, encompassing innovative methodologies, benchmarking approaches, downstream applications, and relevant case studies. Collaborative efforts are encouraged, with contributions aimed at underscoring the significant impact of these models on geoscience and remote sensing. This is particularly important in addressing pressing environmental challenges and informing sustainable management strategies. Ultimately, the session aspires to foster a deeper understanding of the transformative potential of foundation models in advancing Earth and climate sciences.

References
Xiong, Z., Wang, Y., Zhang, F., Stewart, A.J., Hanna, J., Borth, D., Papoutsis, I., Saux, B.L., Camps-Valls, G. and Zhu, X.X., 2024. Neural plasticity-inspired foundation model for observing the earth crossing modalities. arXiv preprint arXiv:2403.15356.

Zhu, X.X., Xiong, Z., Wang, Y., Stewart, A.J., Heidler, K., Wang, Y., Yuan, Z., Dujardin, T., Xu, Q. and Shi, Y., 2024. On the Foundations of Earth and Climate Foundation Models. arXiv preprint arXiv:2405.04285.

Lacoste, A., Lehmann, N., Rodriguez, P., Sherwin, E., Kerner, H., Lütjens, B., Irvin, J., Dao, D., Alemohammad, H., Drouin, A., Gunturkun, M., Huang, G., Vazquez, D., Newman, D., Bengio, Y., Ermon, S., & Zhu, X. (2024). Geo-bench: Toward foundation models for earth monitoring. Advances in Neural Information Processing Systems, 36.

Bountos, N.I., Ouaknine, A., Papoutsis, I. and Rolnick, D., 2023. FoMo-Bench: a multi-modal, multi-scale and multi-task Forest Monitoring Benchmark for remote sensing foundation models. arXiv preprint arXiv:2312.10114.
Show/Hide Description
CCS.43: Earth observations and the Earth science state of profession: building an equitable workforce to support mitigation, adaptation, and response to climate change impacts
D/E.3: Education and Policy — Education and Remote Sensing
This session recognizes the unprecedented opportunity presented by the current "golden age" of Earth observations, where satellites provide vast volumes of data revealing intricate Earth processes across the atmosphere, oceans, and land. There have been significant advances in how to harness Earth observations for understanding climate change, with successful instances of applying these data to support response, mitigation, and adaptation to climate impacts. However, there remains significant opportunity to maximize the potential of these observations to address pressing climate change impacts by building a diverse and skilled workforce, particularly in areas where resources and data availability are limited. We aim to foster a dialogue on how to cultivate such a workforce and invite submissions that discuss the following topic areas:
•	Building successful Earth science missions and collaborative projects
•	Providing examples of how advances in Earth science can support societal responses to climate change, including in the areas of freshwater, natural and compound hazards, coastal resilience, and biodiversity
•	The blend of technical, scientific, and social science expertise needed to build a resilient and adaptable Earth science profession
•	How to embed principles of inclusive innovation into climate solutions
•	Strategies to mobilize the Earth science community and its partners to foster a truly flexible, equitable, and robust profession capable of meeting the challenges of a changing planet
This session seeks to inspire action and collaboration within the IGARSS community to build a future where diverse expertise and perspectives are harnessed to fully utilize Earth observations for the benefit of society and the planet. We will also focus on cultivating inclusive, interdisciplinary partnerships that leverage the unique strengths of different regions and communities. Through these efforts, we hope to catalyze long-term, systemic change in the Earth science profession to better address global challenges.
Show/Hide Description
CCS.44: Earth System Science and Applications Based on a Decade of NASA Soil Moisture Active Passive (SMAP) Satellite Mission Science Data Products
L.8: Land Applications — Soils and Soil Moisture
The NASA Soil Moisture Active Passive (SMAP) satellite mission is in its tenth year of measurement acquisition. The science data products complete global coverage in 2 to 3 days depending on latitude, and the multi-year record allows insights into year-to-year and season-to-season differences across Earth System scientific disciplines. The global low-frequency microwave radiometry is used to derive surface soil moisture, vegetation optical depth (an indicator of integrated above-ground vegetation water content), and landscape freeze/thaw status over land regions. Over the oceans the data are used to estimate sea surface salinity and surface wind speed. Over the cryosphere the data are used to estimate snow, ice, and firn properties. The soil moisture estimates are the basis for understanding water stress in vegetation and hence are used in carbon cycle studies. They are also used in the applied sciences related to natural hazards and food security. Agricultural drought is defined as a deficit in soil moisture, and SMAP data are used in monitoring droughts. The runoff ratio (the ratio of runoff to storm precipitation totals) is also a function of available soil storage, so SMAP data are used in flood hazard applications. The vegetation optical depth information allows monitoring of crop growth and above-ground biomass growth and rehydration. This session brings together studies on the role of landscape hydrology in the Earth system and applications to natural hazards. The session includes presentations on the status of the SMAP observatory, enhancements in the science data processing, and new applications of the decade-long SMAP data record.
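To make two of the quantities above concrete, the short sketch below computes a percentile-based soil-moisture drought indicator and a storm runoff ratio from toy numbers. It is purely illustrative: the values, array names, and percentile formulation are assumptions, not official SMAP product algorithms.

# Illustrative sketch only (not an official SMAP algorithm): a percentile-based
# soil-moisture drought indicator and the runoff ratio described above.
import numpy as np

rng = np.random.default_rng(1)

# Ten years of weekly surface soil moisture for one pixel (m^3/m^3), synthetic.
climatology = rng.uniform(0.10, 0.35, size=(10, 52))
current_week = 30
current_sm = 0.12

# Drought indicator: percentile of the current value within the multi-year
# record for the same week of the year (low percentile = drier than usual).
history = climatology[:, current_week]
percentile = 100.0 * np.mean(history <= current_sm)
print(f"soil moisture percentile: {percentile:.0f}%")

# Runoff ratio for one storm: event runoff divided by event precipitation total.
storm_precip_mm = 42.0
storm_runoff_mm = 9.5
runoff_ratio = storm_runoff_mm / storm_precip_mm
print(f"runoff ratio: {runoff_ratio:.2f}")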
Show/Hide Description
CCS.45: Enhancing crop yield and suitability mapping by integrating crop growth models and remote sensing
SP.8: Special Scientific Themes — GIS development and remote sensing
Agricultural productivity is crucial to meeting food security challenges at the global level amidst climate variability, increasing populations, and limited resources. Precise crop yield information and suitability mapping are key tools for enhancing sustainable agriculture.
This session targets recent advances in improving crop yield estimation, prediction, and suitability mapping by integrating crop growth models with remote sensing in different geographic areas of the world.
Crop growth models simulate plant development and yield as a function of biophysical processes, using weather, soil, and crop management as inputs. However, most of the models have been developed based on a limited set of data or site-specific data, which reduces their applicability at larger geographic scales. With the recent increased access to satellite data, remote sensing can generate data for various agricultural indicators, such as vegetation health, soil moisture, and environmental conditions, at higher spatio-temporal resolution than before. 
When combined, crop models and remote sensing offer powerful solutions to enhance the precision and scalability of agricultural assessments.
With the objective of better understanding the future of crop productivity at larger spatial scales, the agro-ecological zoning (AEZ) framework was developed by integrating various input datasets that drive crop yield. These include soil, terrain, climate and many others, with a total of 300,000 global layers in the Global Agro-Ecological Zoning. By better capturing the spatial variability influencing crop production, agro-ecological zones form a framework to support agricultural management strategies that consider local present and future conditions. This holistic perspective not only supports improved yield predictions but also fosters the development of climate-resilient farming systems, ultimately contributing to food security and sustainable land use.
This session will discuss recent advances and innovations in the use of satellite-based remote sensing data for crop monitoring, such as optical, thermal, and radar observations of the temperature, water conditions, and vegetation indices critical to crop growth. The session will then introduce the use of crop models to assess yield under different conditions, and will present the integration of remote sensing and crop models to provide estimates at larger spatio-temporal scales. Integrated with soil, terrain, and climate information, the results can contribute to agricultural land evaluation frameworks such as agro-ecological zoning and support sustainable agriculture and development. The session will also discuss advanced methods and techniques, such as machine learning, data assimilation, and artificial intelligence, to improve crop estimation and prediction. Finally, the session will discuss the adoptability and adaptability of complex and advanced scientific models for practical applications in different contexts.
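As a hedged illustration of the kind of coupling discussed above, the following toy sketch drives a Monteith-type light-use-efficiency crop model with a satellite-like NDVI time series. The coefficients, the NDVI-to-fAPAR relation, and the harvest index are illustrative assumptions rather than calibrated values from any operational system.

# Toy sketch of coupling remote sensing to a simple crop growth estimate:
# a light-use-efficiency model in which satellite NDVI constrains the fraction
# of absorbed radiation. All coefficients below are illustrative assumptions.
import numpy as np

days = 120
par = np.full(days, 8.0)                                  # incident PAR, MJ m-2 day-1 (illustrative)
ndvi = 0.2 + 0.6 * np.sin(np.linspace(0, np.pi, days))    # stand-in for a satellite NDVI time series

fapar = np.clip(1.24 * ndvi - 0.17, 0.0, 0.95)            # simple linear NDVI-to-fAPAR proxy
lue = 2.5                                                 # g dry matter per MJ APAR (illustrative)

biomass = np.cumsum(lue * fapar * par)                    # g m-2, accumulated over the season
harvest_index = 0.45
yield_t_ha = biomass[-1] * harvest_index * 0.01           # g m-2 -> t ha-1

print(f"end-of-season biomass: {biomass[-1]:.0f} g/m2, yield: {yield_t_ha:.1f} t/ha")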
Show/Hide Description
CCS.46: Enhancing Safety and Security through Earth Observation
SP.5: Special Scientific Themes — Global warming, climate records and climate change analysis
Earth observation technologies have proven invaluable for advancing safety and security in a variety of sectors, from disaster response to infrastructure protection. As our world becomes increasingly interconnected, these technologies offer the ability to monitor and mitigate risks in real-time, enhancing resilience to both natural and human-made threats. This session will explore recent advancements in the application of Earth observation in safety and security, focusing on how satellite data, remote sensing techniques, and geospatial analytics are being used to identify vulnerabilities, predict disasters, and respond to crises.
Topics may include the use of remote sensing for early detection of wildfires, floods, or landslides, monitoring of critical infrastructure such as power grids and transportation networks, and security applications like tracking illicit activities or ensuring border security. By integrating these technologies with artificial intelligence, we are entering a new era of predictive analytics that can offer early warnings and actionable insights for policymakers, emergency responders, and security experts.
This session will explore the intersection of technology and societal needs, contributing to the broader discussion of how remote sensing can help address some of the most pressing challenges of our time, including climate change, disaster resilience, and infrastructure protection. Furthermore, it will be aligned with the DEIAB values of IGARSS 2025 by emphasizing global collaboration, inclusivity in data accessibility, and solutions that benefit diverse and vulnerable populations.
Show/Hide Description
CCS.47: Enhancing the Interpretation of Complex Earth Materials Using Spectra of Simple Materials Over Various Spatial Scales
T/D.15: Data Analysis — Hyperspectral Data Processing and Analysis
Earth scientists can characterise rocks and soils cost-effectively and at unprecedented detail using a wide range of hyperspectral optical remote sensing systems, such as those operated from satellites, aircraft and drones, complemented by proximal sensing systems such as field spectroradiometers, drill core sensors and lab spectrometers. In the mineral resources sector, applications range from fundamental geological mapping to critical metals exploration, mine site and tailings dam monitoring and evaluating the environmental impact of mining operations. But hyperspectral sensing has many more applications, such as in agriculture, urban planning, dust source investigation and solar system exploration.
Successful interpretation of hyperspectral remote and proximal sensing data of rocks and complex earth materials hinges on understanding the spectra of materials present in the scene. Typically, complex rock and soil spectra are 'unmixed' or deconvolved using libraries of environmental and mineral spectra with known chemistry and crystallinity. However, the validation of reference samples for spectral libraries is time-consuming and often incomplete. Accurate spectral libraries are not only essential for spectral unmixing methods but also improve our understanding of mineral-specific information recorded in hyperspectral data (e.g. mineral chemistry and crystallinity). This session invites presentations on enhancing spectral unmixing techniques and improving spectral libraries. Furthermore, we welcome presentations about how spectral libraries can improve our understanding of, for example, mineral chemistry and mineral systems analysis. We welcome talks on UV, Vis, NIR, SWIR, MIR, and TIR hyperspectral data collected across various scales (10⁻⁹ m to 10³ m), as well as their validation with, for example, mineralogical and geochemical analyses.
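As a minimal illustration of the unmixing concept described above, the sketch below solves a linear mixing model with non-negative least squares. The endmember reflectances and mineral names are synthetic placeholders, not values from any real spectral library.

# Linear spectral unmixing sketch with synthetic endmembers and a synthetic mixed pixel.
import numpy as np
from scipy.optimize import nnls

wavelengths = np.linspace(2.0, 2.45, 10)   # SWIR band centres (micrometres), illustrative
endmembers = np.array([                     # rows: kaolinite-like, white-mica-like, quartz-like (synthetic)
    [0.55, 0.52, 0.40, 0.35, 0.48, 0.50, 0.49, 0.47, 0.46, 0.45],
    [0.60, 0.58, 0.57, 0.45, 0.38, 0.52, 0.55, 0.54, 0.53, 0.52],
    [0.30, 0.30, 0.31, 0.31, 0.32, 0.32, 0.33, 0.33, 0.34, 0.34],
])

# Synthetic "mixed pixel": 50 % endmember 1, 30 % endmember 2, 20 % endmember 3.
pixel = 0.5 * endmembers[0] + 0.3 * endmembers[1] + 0.2 * endmembers[2]

# Non-negative least squares recovers abundance estimates from the mixture.
abundances, residual = nnls(endmembers.T, pixel)
abundances /= abundances.sum()              # normalise for interpretation as fractions
print(np.round(abundances, 3))              # approximately [0.5, 0.3, 0.2]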
Show/Hide Description
CCS.48: EO Data Cubes: Challenges, Innovations, and Opportunities
T/A.19: AI and Big Data — Data Management Systems and Computing Platforms in Remote Sensing
Earth Observation (EO) Data Cubes lower the barriers that researchers, policymakers and industry face in accessing and processing large-scale EO data. These platforms facilitate the development of environmental monitoring, sustainable development, and resource management capabilities.

Implementations such as Digital Earth Australia, Digital Earth Africa, Swiss Data Cube and Brazil Data Cube have demonstrated the value of national and regional data cubes in providing localised solutions while contributing to global sustainability initiatives, supporting both national policy frameworks and international development agendas. 

Regional implementations can better cater to local needs and requirements for processing and products, which may not be covered by global approaches. This session aims to bring together regional initiatives at an international level, to facilitate global thinking combined with local development. With the increasing scale of data from upcoming EO satellite constellations, it is increasingly important to take a collaborative approach to Analysis Ready Data, to serve both local and global needs.
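For readers unfamiliar with the data cube abstraction, the following minimal sketch builds a small synthetic cube with xarray and performs typical cube operations (band math, temporal aggregation, spatial subsetting). Operational platforms such as the Open Data Cube software underlying the national implementations above expose analysis-ready data through similar labelled-array interfaces; the data here are random placeholders.

# Synthetic EO data cube with labelled time/y/x dimensions.
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2024-01-01", periods=12, freq="MS")
y = np.arange(0, 50)          # pixel rows (placeholder coordinates)
x = np.arange(0, 50)          # pixel columns (placeholder coordinates)

cube = xr.Dataset(
    {
        "red": (("time", "y", "x"), np.random.rand(12, 50, 50)),
        "nir": (("time", "y", "x"), np.random.rand(12, 50, 50)),
    },
    coords={"time": times, "y": y, "x": x},
)

# Typical cube operations: band math, temporal aggregation, spatial subsetting.
ndvi = (cube.nir - cube.red) / (cube.nir + cube.red)
annual_mean_ndvi = ndvi.mean(dim="time")
subset = annual_mean_ndvi.sel(y=slice(10, 20), x=slice(10, 20))
print(subset.shape)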

The session invites experts from the various international Digital Earth and Data Cube initiatives to come together to:

- Discuss the challenges and opportunities of deploying EO Data Cubes at different scales and in different environments
- Showcase the technical development and implementation of EO Data Cubes at national, continental, and global scales
- Examine challenges and opportunities in scaling EO Data Cubes and aligning local efforts with global sustainability goals
- Foster dialogue between academic, governmental, and industry stakeholders on the future of EO Data Cube technology
- Explore technical advances in (cloud-native) technologies for delivering EO data at large scale
Show/Hide Description
CCS.49: Exploration and Exploitation of New Earth-Observing Satellite Applications for Weather and Climate Science
T/D.15: Data Analysis — Hyperspectral Data Processing and Analysis
The Community Contributed Session, “Exploration and Exploitation of New Earth-Observing Satellite Applications for Weather and Climate Science”, will highlight innovative approaches to leveraging data from both new and especially existing satellite instruments. Already, a vast array of Earth-observing sensors in Geostationary (GEO) and Low Earth Orbit (LEO), managed by numerous international organizations, provide extensive datasets that capture Earth’s weather and climate dynamics on both regional and global scales.

However, this wealth of information is still ripe for additional exploitation. This session seeks to explore novel ways to enhance current capabilities and develop new applications. These possibilities include extreme weather prediction and monitoring, early alerts of rapidly developing storm systems in the Atlantic and Pacific, satellite-based marine and ocean debris monitoring, and the use of deep learning techniques to measure the Planetary Boundary Layer with remote sensing instruments. Additional applications may involve glacier mass balance measurements, soil carbon and soil moisture estimations through hyperspectral observations, and the development of other new environmental and ecological data products.

Achieving these advances may require improved integration of sensors across the electromagnetic spectrum as well as methods to meet the demands for more frequent or higher resolution data collection and transmission. Other approaches that are already beginning to be demonstrated include the use of novel machine learning algorithms for enhanced environmental classification and prediction, improved atmospheric composition analysis, and more effective integration with ground-based systems.

These advancements, alongside emerging innovations in how Earth and the atmosphere are measured and characterized, have the potential to transform and elevate our understanding and capabilities to meet weather and climate science goals. This can lead to more accurate predictions, earlier warnings for localized and extreme weather events, enhanced data quality, and deeper insights into the global environment.
Show/Hide Description
CCS.50: Foundation Models for Geospatial Artificial Intelligence
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Foundation models (FMs), particularly Multimodal Large Language Models (MLLMs), are poised to revolutionize the field of remote sensing by significantly enhancing data interpretation and analysis. These models excel in automating feature extraction from complex satellite imagery, which can facilitate groundbreaking advancements in environmental research. Despite their proven capabilities across a wide range of applications, the specific potential of FMs in geographic and geospatial contexts remains largely untapped. 
By efficiently processing vast datasets, foundation models can support critical tasks such as land cover classification, change detection, and environmental monitoring. Their advanced natural language understanding capabilities enable users to extract meaningful insights from remote sensing data, allowing for intuitive querying and visualization of complex datasets. Moreover, Multimodal Large Language Models can automate the generation of descriptive reports and summaries, effectively bridging the gap between technical data and actionable information.
As these models continue to evolve, their integration with remote sensing technologies is expected to drive innovation in various applications, including disaster response, urban planning, and climate change studies. Recent research has demonstrated that MLLMs are not only remarkably sample-efficient but also rich in geospatial knowledge, showcasing robustness across diverse global scenarios. 
This session aims to explore the opportunities and challenges presented by foundation models in geospatial AI, fostering discussions that can lead to impactful solutions in environmental monitoring and management.

The scope of this session includes, but is not limited to:
- Pre-training strategies for FMs
- Novel FM downstream use cases
- Design of geospatial foundational AI models
- Efficient and scalable FM design approaches
- Multi-modal FMs: design and architecture
- Validating FMs for various scientific use cases
- Explainability for FMs
- Applications of geospatial foundation models in use cases such as land cover classification, urban planning, etc.
- Integrating MLLMs with geospatial data analysis tools
- Geophysical data visualization using remote sensing and FMs
- Ethics, fairness and bias in geospatial AI applications
- Data security in MLLMs
- Future directions for research in foundation models and remote sensing
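As a hedged illustration of one common downstream pattern in this scope ("linear probing" a frozen foundation model), the sketch below trains a small classifier on embeddings. The embeddings and class labels are random placeholders standing in for features produced by a pretrained geospatial FM, so the reported accuracy is only structural, not meaningful.

# Linear probe on (placeholder) frozen foundation-model embeddings.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_tiles, embed_dim = 500, 256
embeddings = rng.normal(size=(n_tiles, embed_dim))   # stand-in for frozen FM features per image tile
labels = rng.integers(0, 4, size=n_tiles)            # e.g. water / crop / forest / urban (hypothetical)

X_train, X_test, y_train, y_test = train_test_split(
    embeddings, labels, test_size=0.25, random_state=0
)
probe = LogisticRegression(max_iter=1000).fit(X_train, y_train)
# On random features this stays near chance; with real FM embeddings the same
# probe is the usual lightweight benchmark for downstream land cover tasks.
print(f"Linear-probe accuracy: {probe.score(X_test, y_test):.2f}")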
Show/Hide Description
CCS.51: From Space to Surface: Satellite Insights into Aerosol Properties and Their Radiative Impact
M.4: Atmosphere Applications — Aerosols and Atmospheric Chemistry
This proposal addresses the critical role of satellite remote sensing in characterizing aerosol properties and assessing their radiative impact on Earth's climate. Aerosols, as a significant component of the Earth's atmosphere, influence climate through scattering and absorption of solar radiation, as well as by modifying cloud properties. The session will focus on the latest advancements in satellite technology and algorithms that enable the quantification of aerosol optical depth, size distribution, and composition. It will also explore how these data are utilized to understand aerosol-radiation interactions and their implications for climate modeling and prediction.
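As a point of reference for the retrieval topics above, a simplified Beer-Lambert relation underlies direct-sun aerosol optical depth (AOD) estimation; this schematic form is only indicative, since satellite algorithms over land and ocean instead invert full radiative-transfer calculations with surface-reflectance assumptions:

$$ I(\lambda) = I_0(\lambda)\,\exp\!\big(-m(\theta)\,\tau_{\mathrm{total}}(\lambda)\big), \qquad \tau_{\mathrm{aer}}(\lambda) = -\frac{\ln\!\big(I(\lambda)/I_0(\lambda)\big)}{m(\theta)} - \tau_{\mathrm{Ray}}(\lambda) - \tau_{\mathrm{gas}}(\lambda), \qquad m(\theta) \approx \frac{1}{\cos\theta}, $$

where \(I\) is the measured direct solar irradiance, \(I_0\) its top-of-atmosphere value, \(\theta\) the solar zenith angle, and the Rayleigh and gaseous terms are subtracted to isolate the aerosol contribution.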

The session aims to cover a range of topics, including the development of new satellite-based methods for aerosol detection, the validation of satellite-derived aerosol products against ground-based observations, and the integration of aerosol data into climate models. Particular emphasis will be placed on the challenges associated with aerosol remote sensing, such as the differentiation between aerosols and clouds, the impact of aerosols on surface observations, and the limitations of current satellite sensors.

The importance of this session to geoscience and remote sensing is profound. Aerosols are a key variable in the Earth's climate system, and their accurate monitoring is essential for understanding climate change. Satellite remote sensing provides a unique global perspective, allowing for the continuous observation of aerosols over large spatial and temporal scales. This session will contribute to the development of more accurate and robust methods for aerosol characterization, which is vital for improving our understanding of their role in the climate system and for informing policy decisions related to air quality and climate change mitigation. We invite contributions from researchers and practitioners working at the intersection of satellite remote sensing, atmospheric science, and climate research to share their insights and findings.
Show/Hide Description
CCS.52: Geo(AI) and Earth Observation for Informed Humanitarian Emergency Response
D/S.5: Societal Engagement and Impacts — Risk and Disaster Management (Extreme Weather, Earthquakes, Volcanoes, etc)
Natural and human-made disasters (conflict, floods, wildfires, geohazards such as earthquakes and landslides, droughts, air pollution and others) often result in significant loss of life and property damage. Coupled with the increasing frequency and severity of climate extremes, they are causing large-scale forced displacement and migration. Effective short-term humanitarian support and long-term socio-economic recovery planning hinge on understanding the extent and nature of these disasters and the risks that follow. Earth Observation (EO) technology, with its ability to deliver spatially detailed and temporally precise information, presents new opportunities for monitoring objects and phenomena in the wake of such events. EO data allows for real-time assessments of climate hazards, providing critical insights into environmental triggers behind forced migrations, such as droughts, floods, and coastal erosion. When combined with advancements in artificial intelligence (AI), particularly deep learning for computer vision, and with other socio-economic data, these technologies enable the development of automated and semi-automated information retrieval processes. AI models further enhance predictive capabilities, forecasting migration events under various climate scenarios by integrating multi-sensor data and socio-economic context. Despite the potential for increased automation, challenges remain due to the complex nature of objects in Earth observation imagery and the intricate dynamics of disasters. Moreover, time-sensitive humanitarian responses require models that are both efficient and capable of generalizing across different locations and periods, which can be difficult given the data-intensive nature of current AI models. This session aims to foster discussions and share developments from the broader scientific and geo-humanitarian communities on the role of Geospatial Artificial Intelligence (GeoAI) in enhancing informed humanitarian emergency responses, ranging from algorithmic development to hazard identification, monitoring and early warning, emergency response, and recovery and reconstruction operations. We seek contributions related to the following topics:
Fundamental research on technological and algorithmic advancement, including but not limited to:
•	Development of algorithms for localization, detection, counting, and prediction of objects and phenomena.
•	Strategies for improving the transferability and generalization of Geo(AI) models, such as ensemble learning, domain adaptation, weakly-supervised learning, self-supervised learning, unsupervised learning, learning from incomplete annotations, generative approaches, and machine learning for the complementary use of radar and optical data.
•	Creation of language and vision models tailored to emergency response scenarios, e.g. foundation models and visual language models.
 Applications of EO and (Geo)AI in Humanitarian Emergency Response: focused on thematic applications designed for emergency response, including:
•	Predicting the occurrence of disasters and assessing susceptibility and vulnerability to them, which is relevant for early warning and emergency response
•	Integration of socio-economic and earth observation data for understanding patterns, risk mapping and predictive modeling of climate induced migration
•	Use of multi-source earth observation data for rapid mapping of post-disaster events such as floods, landslides, building change and damage assessment, fires, and more
•	Other real-world use cases and success stories from humanitarian organizations, companies, and large multinational institutions that demonstrate how (Geo)AI and EO support short-term humanitarian emergency responses and long-term socioeconomic recovery and reconstruction planning.
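To make the rapid-mapping theme above concrete, the following minimal sketch flags likely flood pixels from the drop in SAR backscatter between a pre-event and a post-event acquisition. The arrays are synthetic placeholders for calibrated, terrain-corrected backscatter in dB, and the thresholds are illustrative values, not validated operational settings.

# Change-detection flood mask from synthetic pre/post SAR backscatter.
import numpy as np

rng = np.random.default_rng(1)
pre_db = rng.normal(-8.0, 1.5, size=(400, 400))     # pre-event backscatter (dB), synthetic
post_db = pre_db.copy()
post_db[100:200, 100:300] -= 10.0                    # simulate smooth open water after a flood

change = post_db - pre_db
flood_mask = (change < -6.0) & (post_db < -15.0)     # strong decrease AND low absolute backscatter

flooded_fraction = flood_mask.mean()
print(f"Flagged flood area: {flooded_fraction:.1%} of the scene")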
Show/Hide Description
CCS.53: Give Earth a Chance: AI Algorithms for Environmental Monitoring
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Artificial Intelligence (AI) applied to remotely sensed data, particularly satellite images (such as hyperspectral and multispectral images, HSI and MSI), plays a crucial role in environmental monitoring. It enables timely actions to mitigate gas emissions, identify unexpected emitters (e.g., methane sources), monitor inland water quality, extract soil parameters, and estimate pollutant data for large areas. These applications extend to scenarios requiring rapid responses and present unique challenges. Detecting and assessing floods is complex due to the environmental effects of water in affected regions. Wildfires, on the other hand, are natural threats that devastate ecosystems and have far-reaching socio-economic consequences. Fire behavior depends on factors such as fuel type, flammability, quantity, climate change, topography, and wind conditions. Traditionally, fire area estimation relied on field methods using GPS data, but this approach could only determine the perimeter of the hazardous area and faced challenges due to inaccessibility and changing fire dynamics over time. Remote sensing technology, particularly MSI/HSI, provides a more comprehensive solution. It can be exploited to extract changes over time, enabling the identification of active fires, burned areas (marked by decreased chlorophyll content), and potential fire resurgence (partial fuel burnout). This approach offers advantages over traditional techniques that only consider the perimeter, and it allows for continuous monitoring of fire evolution and early detection. Volcanic ash, a consequence of eruptions, poses various threats, including air and water pollution, climate impact, and aviation safety risks. Monitoring and forecasting volcanic ash clouds are of paramount importance, as dust from volcanic eruptions can lead to respiratory diseases and water contamination. Satellite remote sensing surpasses ground-based methods as it eliminates the need to install instruments in remote and hazardous areas. These examples illustrate the societal benefits of effectively leveraging AI for analyzing satellite data. Whether onboard satellites or on the ground, AI accelerates data analysis and facilitates the extraction of actionable insights from raw satellite data. There are, however, several important challenges directly related to the characteristics of the data, including ground-truth data availability, representativeness and generalizability for specific environmental applications. This session addresses the challenges in deploying AI for environmental monitoring and designing data-driven algorithms in this context, welcoming submissions on a range of related topics:

- Classic and AI algorithms for environmental applications,
- Data-level digital twins for synthesizing training data for environmental purposes,
- Few- and zero-shot learning for training AI algorithms,
- Leveraging Big Data and unlabeled data for environmental monitoring,
- Training AI algorithms from weakly-labeled remotely-sensed datasets,
- Greenhouse gas detection and monitoring from satellite images,
- Methane plume detection and segmentation,
- Monitoring floods and wildfires from satellite data,
- Bare soil detection and soil analysis from satellite data,
- Air quality analysis and monitoring from satellite data,
- Detecting and tracking industrially-induced pollution from satellite data,
- Multi-modal satellite data analysis for environmental applications,
- Validation of AI algorithms for environmental applications,
- Examples of industrial, scientific and societal real-world impact of AI for environmental monitoring.
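As a concrete illustration of the burned-area indicator alluded to in the description above (decreased chlorophyll after fire), the sketch below computes the Normalized Burn Ratio (NBR) and its pre/post-fire difference (dNBR) on synthetic reflectances. The 0.27 severity threshold is only one commonly cited guideline and would need local validation; none of the values are real measurements.

# dNBR burned-area sketch on synthetic NIR/SWIR reflectance.
import numpy as np

def nbr(nir, swir2):
    return (nir - swir2) / (nir + swir2 + 1e-6)

rng = np.random.default_rng(2)
nir_pre = rng.uniform(0.3, 0.4, (200, 200))
swir_pre = rng.uniform(0.1, 0.2, (200, 200))
nir_post, swir_post = nir_pre.copy(), swir_pre.copy()
nir_post[50:150, 50:150] *= 0.4      # chlorophyll/structure loss lowers NIR
swir_post[50:150, 50:150] *= 2.0     # char and dry soil raise SWIR

dnbr = nbr(nir_pre, swir_pre) - nbr(nir_post, swir_post)
burned = dnbr > 0.27                 # illustrative moderate-severity threshold
print(f"Mapped burned area: {burned.mean():.1%} of the scene")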
Show/Hide Description
CCS.54: Google Earth Engine: A new era of AI driven Earth Observation
T/A.19: AI and Big Data — Data Management Systems and Computing Platforms in Remote Sensing
Google Earth Engine (GEE) has become a powerful tool for numerous Earth observation applications, driven by the continuous development and integration of satellite datasets. As a cloud-based platform, GEE enables users to process, visualize, and analyze vast amounts of satellite imagery and geospatial data, supporting real-time insights and long-term trend analysis. Integrating AI with GEE allows for advanced image analysis techniques such as deep learning for image classification, anomaly detection, and predictive modeling, greatly enhancing the potential of geospatial analyses.

Moreover, the incorporation of advanced tools like TensorFlow with GEE facilitates predictive analysis for environmental monitoring, forest fire prediction, and real-time crop health monitoring. However, there is still a need to understand the potential and significance of GEE combined with AI-driven technology. Topics of our session include (a) GEE cloud computing for remote sensing; (b) AI-based GEE tools and technologies; (c) emerging applications of GEE, especially in water resource management, agriculture, forest activities, climate change, the cryosphere, and hydrological applications; and (d) challenges and future trends of GEE.

This session will provide effective guidance to participants exploring the potential of GEE in Earth observation using state-of-the-art Artificial Intelligence (AI) driven tools and technologies such as deep learning, machine learning, cloud computing, and big data analytics. With these tools and technologies associated with GEE, applicability will be enhanced across various scientific domains of remote sensing and geographic information systems (GIS), such as water resource management, agriculture mapping, forest cover, climate change, natural hazard assessment, and aquatic and hydrological applications. The session will also highlight the challenges and future AI-driven tools and technologies for Earth observation data analytics.
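For orientation, the following is a hedged sketch of a typical GEE Python API workflow: compositing Sentinel-2 surface reflectance over an area of interest and summarising NDVI. It assumes the earthengine-api package is installed and the user has already authenticated (ee.Authenticate()); the coordinates and cloud threshold are arbitrary examples, not session-specific choices.

# Median Sentinel-2 composite and mean NDVI over an example area of interest.
import ee

ee.Initialize()

aoi = ee.Geometry.Point([153.03, -27.47]).buffer(5000)   # ~5 km around Brisbane (example)

composite = (
    ee.ImageCollection("COPERNICUS/S2_SR_HARMONIZED")
    .filterBounds(aoi)
    .filterDate("2024-01-01", "2024-12-31")
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))
    .median()
)

ndvi = composite.normalizedDifference(["B8", "B4"]).rename("NDVI")
mean_ndvi = ndvi.reduceRegion(
    reducer=ee.Reducer.mean(), geometry=aoi, scale=10, maxPixels=1e9
).getInfo()
print(mean_ndvi)   # e.g. {'NDVI': ...}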
Show/Hide Description
CCS.55: Ground validation of optical Earth observation images
L.11: Land Applications — Geology and Geomorphology
The latest generation of geoscience-tuned hyperspectral satellite sensors, launched by international space agencies and the private sector, now enables the characterisation of minerals on the Earth’s surface at an unprecedented level of detail. New spaceborne hyperspectral sensors such as PRISMA, EnMAP and EMIT complement the already successfully applied, “geoscience-tuned” multispectral sensors (e.g. ASTER and WorldView-3) in geological remote sensing. A key part of mapping rocks, soils and specific minerals from multi- and hyperspectral remote sensing imagery is the validation of the products on the ground. Ground validation may include using field instruments, such as portable reflectance spectrometers or X-ray fluorescence spectrometers, along transects that can be identified in preliminary remote sensing products. Furthermore, soil and rock samples are commonly collected in the field for mineralogical and geochemical analyses in the lab. The field and lab measurements can be used to calibrate remote sensing imagery-derived maps, for example, to produce quantitative mineral maps. Publicly available ground validation data sets (e.g. Cuprite – Nevada, U.S.A.; Rocklea Dome – Western Australia, Australia) provide opportunities for the international research community to compare different remote sensing image processing and calibration methods. However, publicly available ground validation data sets that go beyond simple image classification are scarce.
This session invites presentations about workflows and methods for validating geoscience products derived from optical satellite imagery through case studies or conceptual approaches. Special emphasis will be placed on publicly available ground validation datasets that can be used by the international community, facilitating the comparison of various image processing workflows/algorithms and datasets.
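As a minimal sketch of the calibration step described above, the code below regresses a lab-measured quantity (e.g. mineral abundance from XRD/XRF samples) against an image-derived spectral index at the sample locations and then applies the fit to produce a quantitative map. All data are synthetic placeholders and the linear form is only an assumption for illustration.

# Calibrating an image-derived index against field/lab samples (synthetic data).
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(3)
index_at_samples = rng.uniform(0.1, 0.9, 40)                          # e.g. 2.2-um absorption depth at sample sites
lab_abundance = 55.0 * index_at_samples + 5.0 + rng.normal(0, 3, 40)  # wt% from lab analyses (synthetic)

fit = linregress(index_at_samples, lab_abundance)
print(f"slope={fit.slope:.1f}, intercept={fit.intercept:.1f}, r^2={fit.rvalue**2:.2f}")

index_map = rng.uniform(0.1, 0.9, (300, 300))        # placeholder image-derived index map
mineral_map = fit.slope * index_map + fit.intercept  # calibrated abundance map (wt%)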
Show/Hide Description
CCS.56: GRSS ESI TC / HDCRS WG - Hybrid Quantum-Classical Computing for Earth Observation
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
In recent years, significant advances in quantum technologies have shown immense potential to revolutionise geosciences and remote sensing. As these innovations develop, they promise to enhance environmental monitoring, climate modeling, and other critical geospatial applications. Although quantum computing is currently less mature than quantum communications and sensing, it holds immense economic potential and societal impact due to its promising capabilities in tasks like prime factorization and optimization algorithms. As quantum computing technology progresses, it could significantly transform geosciences and remote sensing, particularly in Earth Observation (EO) applications that leverage quantum machine learning and optimization techniques. These advances are essential for addressing Sustainable Development Goals like ‘Climate Action.’ By accelerating data processing and analysis, integrating data into scientific models, and running surrogate models – especially within future Digital Twin systems – the ability to meet these goals can be significantly enhanced.

One of the main challenges in practical quantum computing is mapping quantum algorithms to available hardware. Factors such as the number of qubits, noise levels, and available gate sets influence the applications that can be implemented and their problem sizes. This session will explore the development of novel hybrid quantum-classical algorithms for processing and analyzing EO data, offering practical solutions to current hardware limitations while advancing useful quantum computing.
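To make one of the topics below ("classical data embedding into quantum states") concrete without assuming any particular quantum SDK, the following library-free sketch performs amplitude encoding: a zero-padded, L2-normalised feature vector of length 2**n becomes the amplitudes of an n-qubit state. Real implementations would hand this vector to a state-preparation routine on quantum hardware or a simulator; here the encoding properties are only verified classically, and the pixel spectrum is synthetic.

# Classical preparation of an amplitude-encoded state vector for EO pixel data.
import numpy as np

def amplitude_encode(features):
    """Return normalised amplitudes for the smallest qubit register that fits."""
    n_qubits = int(np.ceil(np.log2(len(features))))
    padded = np.zeros(2**n_qubits)
    padded[: len(features)] = features
    return padded / np.linalg.norm(padded), n_qubits

pixel_spectrum = np.array([0.12, 0.18, 0.25, 0.31, 0.29, 0.22])  # synthetic 6-band pixel
state, n_qubits = amplitude_encode(pixel_spectrum)
print(n_qubits, np.isclose(np.sum(state**2), 1.0))               # 3 qubits, unit norm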

The topics of discussion for this session, although not exhaustive, will include:

Quantum Algorithms and Techniques:

- Quantum-assisted and quantum-inspired algorithms, including theoretical analyses, simulations, and preliminary results.
- Classical data embedding into quantum states, quantum state transformations, quantum circuit design, and quantum state measurement.
- Quantum machine learning techniques applied to EO data.
- Enhanced algorithm performance using distributed quantum computing.
- Training and benchmark EO data sets and procedures for quantum algorithms.

Hybrid Computing Environments:

- How quantum and classical algorithms in hybrid High-Performance Computing (HPC) environments can address novel EO challenges.
- Integration of HPC within cloud EO data platforms to improve processing efficiency.
- Optimizing AI models for EO using GPU-accelerated clusters within European HPC infrastructures.
- Enhancing EO algorithm execution using heterogeneous HPC systems (GPU-boosted, memory-, storage-, and compute-optimized clusters).
- Quantum “twin” algorithms of AI models for EO.

Applications and Impact:

- Leveraging quantum algorithms to improve the accuracy, efficiency, and scalability of large-scale EO data processing.
- Applying quantum machine learning for enhanced predictive modeling and decision-making in Earth systems.
- Physics-aware quantum generative models for EO modeling.
- Investigating quantum computing’s potential to revolutionize climate modeling and simulation for better climate change assessments, e.g., for running what-if scenarios in Earth System Digital Twins.

We particularly welcome contributions featuring implementations on actual quantum computers – such as the D-Wave quantum annealer or IBM Quantum systems – with benchmarking against classical counterparts. Submissions that align with IGARSS 2025’s theme – “to address the threats to our Earth and promote collaborative global solutions using remote sensing technology” – are highly encouraged.
Show/Hide Description
CCS.57: GRSS ESI TC / HDCRS WG - Parallel and Efficient computing for Remote Sensing
T/A.19: AI and Big Data — Data Management Systems and Computing Platforms in Remote Sensing
Recent advances in remote sensors, which capture data at high spectral, spatial, and temporal resolutions, have significantly increased data volumes, posing challenges for efficient processing and analysis in support of practical applications. Among the solutions for these challenges, deep learning (DL) has emerged as a powerful yet computationally demanding approach, known for its adaptability and effectiveness across remote sensing problems. To address these demands, the efficient design of DL algorithms and the use of parallel and scalable computing architectures are becoming fundamental in the field.

This community session on "Parallel and Efficient Computing for Remote Sensing" invites cutting-edge papers that explore innovative computational approaches. These include High-Performance Computing platforms—such as clusters, grids, and cloud computing—as well as accelerators like the GPU, which has evolved into a many-core processor with high computing power and memory bandwidth. The session will focus on the newest high-performance and distributed computing technologies and algorithms, aiming to make the processing and analysis of large-scale remote sensing data more efficient and scalable.
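As a simple, hedged illustration of the tiling pattern that underlies many of these scalable pipelines, the sketch below splits a large synthetic scene into tiles and processes them across worker processes. The per-tile "workload" is a placeholder NDVI computation; production systems would instead read tiles from cloud-optimised files and may rely on GPUs, MPI, or Dask.

# Embarrassingly parallel tile processing of a synthetic raster scene.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def process_tile(tile):
    red, nir = tile[0], tile[1]
    return (nir - red) / (nir + red + 1e-6)          # per-tile NDVI (placeholder workload)

def split_into_tiles(scene, tile_size=512):
    _, h, w = scene.shape
    return [scene[:, i:i + tile_size, j:j + tile_size]
            for i in range(0, h, tile_size)
            for j in range(0, w, tile_size)]

if __name__ == "__main__":
    scene = np.random.rand(2, 2048, 2048).astype("float32")   # bands x rows x cols
    tiles = split_into_tiles(scene)
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(process_tile, tiles))
    print(f"Processed {len(results)} tiles in parallel")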
Show/Hide Description
CCS.58: Harnessing the power of earth observation data and high-end computing for next generation land use and land cover mapping
SP.3: Special Scientific Themes — Remote sensing for sustainable development in the Asia-Pacific region
The advent of high-resolution remote sensing, coupled with frequent satellite revisits and global coverage, has ushered in a new era of satellite Earth Observation (EO). The proliferation of multispectral, hyperspectral, and all-weather sensors has dramatically increased the complexity and volume of remotely sensed data, presenting both unprecedented opportunities and significant challenges for land use and land cover (LULC) mapping applications. As we navigate this data-rich landscape, the geospatial community faces formidable hurdles in data collection, analysis, and interpretation. Traditional computing approaches, while still valuable, struggle to fully address the unique characteristics of EO big data, including its vast geographical spread, high dimensionality, and complex metadata structures. The sheer scale of the available data, with over 1,000 EO satellites in orbit, 200 of which feature multispectral and multitemporal sensors, demands innovative solutions for efficient data management and processing.
This session aims to explore cutting-edge approaches to LULC mapping that leverage the full potential of big data and advanced computing technologies. It will delve into novel methodologies for handling the volume, velocity, and variety of remote sensing data, with a particular focus on real-time processing and temporal analysis capabilities. Discussions will center on the development of robust models and computing platforms tailored to the unique demands of large-scale LULC applications.
We invite presentations on innovative research and practical implementations that address these challenges. Topics of interest include, but are not limited to, scalable models for LULC classification using satellite or aerial remote sensing data, deep learning approaches for feature extraction from high-cadence EO data, and cloud-based solutions for distributed processing of large-scale LULC datasets. The session will also explore the potential benefits of these advanced techniques for a wide range of application domains, including urban planning and climate change impact assessment. We encourage presentations that showcase the use of high-end computing for model training, testing, and deployment in operational LULC mapping systems.
By bringing together experts from remote sensing, computer science, and domain-specific fields, this session aims to foster interdisciplinary collaboration and drive innovation in LULC mapping. We seek to identify key research directions and technological advancements that will shape the future of land use and land cover monitoring, ultimately contributing to more informed decision-making and sustainable management of our planet's resources.
Show/Hide Description
CCS.59: High-Resolution Wind Vectors of Extreme Tropical Cyclones from Scatterometers and SAR: Methods and Applications
SP.1: Special Scientific Themes — Natural disasters and disaster management
Tropical cyclones have caused $1.4 trillion in economic losses and over 779,000 deaths in the past 50 years, according to the World Meteorological Organization (WMO). With climate change, these events are increasingly affecting higher latitudes, causing severe storms and heavy precipitation. Accurate wind vector information for tropical cyclones, especially those with sea surface wind speeds over 30 m/s, is in high demand. Scatterometers and synthetic aperture radar (SAR) are two widely applied instruments for ocean surface wind remote sensing; they share similarities while complementing each other through their differences in retrieval method development. Recent advancements in wind retrieval from both sensors have enhanced the accuracy and spatial resolution of wind vector information. However, challenges remain in accurate quantitative description and modelling due to incompletely understood small-scale momentum transfer in the boundary layer, evolving large-scale structures, and the scarcity of reliable in-situ measurements under extreme conditions.
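For readers new to wind retrieval, the following schematic sketch shows the general inversion idea: a geophysical model function (GMF) predicts backscatter from wind speed and wind-to-look direction, and the retrieval minimises the misfit over several looks. The toy_gmf below only mimics the general shape of operational GMFs (e.g. the CMOD family); its coefficients are not physically calibrated and the observations are simulated.

# Toy GMF inversion by grid search over wind speed and direction.
import numpy as np

def toy_gmf(speed, rel_dir_deg):
    """Backscatter (linear units) vs wind speed and wind-to-look angle (toy model)."""
    phi = np.deg2rad(rel_dir_deg)
    return 1e-3 * speed**1.6 * (1.0 + 0.4 * np.cos(phi) + 0.25 * np.cos(2.0 * phi))

look_angles = np.array([0.0, 45.0, 90.0])                 # three azimuth looks (deg)
true_speed, true_dir = 35.0, 60.0                         # extreme-wind "truth" (m/s, deg)
observed = toy_gmf(true_speed, true_dir - look_angles)

speeds = np.arange(1.0, 70.0, 0.5)
dirs = np.arange(0.0, 360.0, 1.0)
S, D = np.meshgrid(speeds, dirs, indexing="ij")
cost = sum((toy_gmf(S, D - a) - o) ** 2 for a, o in zip(look_angles, observed))
i, j = np.unravel_index(np.argmin(cost), cost.shape)
print(f"Retrieved wind: {speeds[i]:.1f} m/s from {dirs[j]:.0f} deg")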

In the proposed session, we will showcase state-of-the-art high spatial resolution wind vector remote sensing techniques for extreme tropical cyclones using scatterometers and SAR. We will address the efforts made in coping with wind retrieval and application challenges. The invited talk will provide a concrete description of the main concerns, followed by selected oral presentations on the following aspects: 

1) Recent developments in scatterometer or SAR remote sensing techniques for high spatial resolution and accurate quantitative descriptions of extreme tropical cyclones, 

2) Theory and application improvements using high resolution wind vector information from scatterometers or SAR for tropical cyclone tracking and intensity predictions.

The proposed session aims to provide high-quality wind field information from scatterometers and SAR, addressing our current efforts to face extreme tropical cyclones. It will also offer opportunities for interdisciplinary cooperation with related fields to mitigate the impact of increasing disasters under severe climate conditions. This aligns perfectly with the IGARSS 2025 theme of "addressing threats to our Earth and promoting collaborative global solutions using remote sensing technology."
 
Show/Hide Description
CCS.60: Hydrological Processes: The Role of AI in Unveiling Climate Change Patterns
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Climate change research requires the ability to discern the most influential features driving environmental shifts, not only within isolated systems but across interconnected ones. This session proposal focuses on the power of statistical analysis and machine learning in unraveling and interpreting the complexities of, and feedbacks between, different Earth systems in relation to climate change.
Specifically, the analysis of air-sea-snow interactions will allow us to estimate feature importance within and between these various systems. Furthermore, by differentiating natural variability from climate change we can advance NASA’s mission in modeling and tracking changes in hydrological processes on regional and global scales. As extreme weather events continue to increase, feature importance analysis, modeling, and prediction are essential for society and the development of management strategies. The goal of this session is to showcase how various machine learning techniques and statistical analyses can extract crucial information from Lagrangian, Eulerian, and remotely sensed datasets for estimating key parameters. These include neural networks, gradient descent, linear regression, feature importance, Pearson/Spearman correlation, residual interpretation, and other techniques that work towards demonstrating changes within these systems as a result of climate change.
Additionally, our session will highlight the pivotal role of feature importance analysis in determining which environmental factors have the most significant impact on climate change. By bringing together experts in the field, we will explore innovative research that demonstrates how various machine learning techniques can enhance our comprehension of climate processes and, in turn, inform more effective climate change mitigation strategies. Attendees will gain valuable insights into how machine learning and statistical analysis are contributing to a sustainable future in the face of climate change challenges.
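As a hedged illustration of two techniques named above, the sketch below applies Spearman correlation screening and permutation feature importance to synthetic data. The variable names mimic air-sea-snow drivers of a hydrological response, but the values are random stand-ins, not observations or session results.

# Spearman screening and permutation importance on synthetic hydrological drivers.
import numpy as np
from scipy.stats import spearmanr
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(4)
n = 600
sst = rng.normal(15, 3, n)              # sea-surface temperature (placeholder)
snow_cover = rng.uniform(0, 1, n)       # fractional snow cover (placeholder)
wind_speed = rng.normal(6, 2, n)        # near-surface wind (placeholder)
runoff = 0.6 * sst - 4.0 * snow_cover + 0.1 * wind_speed + rng.normal(0, 1, n)

X = np.column_stack([sst, snow_cover, wind_speed])
names = ["sst", "snow_cover", "wind_speed"]

for name, col in zip(names, X.T):
    rho, p = spearmanr(col, runoff)
    print(f"Spearman rho({name}, runoff) = {rho:+.2f} (p={p:.1e})")

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, runoff)
imp = permutation_importance(model, X, runoff, n_repeats=10, random_state=0)
for name, score in zip(names, imp.importances_mean):
    print(f"Permutation importance of {name}: {score:.2f}")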
Show/Hide Description
CCS.61: Hyperspectral geoscience mapping in developing countries.
SP.2: Special Scientific Themes — Geoscience and remote sensing in developing countries
The beginning of geological remote sensing coincided with the launch of the Landsat mission in the early 1970s. The field has since received massive boosts from the launch of the ASTER and Sentinel-2 sensors from 1999 onwards. Airborne systems have also been important since the 1990s for testing hyperspectral technology and for some limited large-scale mineral mapping surveys. The geoscience communities of developed and developing countries have since benefited equally, particularly from the freely available satellite data, to improve geological and soil characterisation and mineral exploration studies. However, these data sometimes have spatial and spectral resolution limitations, constraining progress in the geological remote sensing field.
Governmental agencies and the private sector have overcome this limitation by using commercially available datasets, such as those from airborne hyperspectral sensors. Institutions in developing countries, however, do not typically have access to these modern, high-quality datasets, as data providers often require expensive mobilisation costs from other countries, making the datasets available only to a select few. Therefore, the recently launched hyperspectral spaceborne sensors, including PRISMA, EnMAP, HISUI, and EMIT, represent a paradigm shift in this scenario. High spectral resolution data allows for identifying different mineral species, potentially enhancing local and regional geological and soil maps, as well as mineral deposit characterisation.
This session invites authors to share their research on using hyperspectral mineral mapping to improve the geological and soil characterisation of study areas in developing countries. The focus is on spectral mineral mapping with recent-generation spaceborne instruments, but studies using different resolution scales and acquisition platforms are also highly welcome.
Show/Hide Description
CCS.62: IEEE GRSS Data Fusion Contest
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
The Image Analysis and Data Fusion Technical Committee (IADF TC) organizes the IEEE GRSS Data Fusion Contest every year, an open competition based on freely provided multi-source data that aims to stimulate the development of novel and effective fusion and analysis methodologies for information extraction from remote sensing imagery. As in the last few years, the 2025 edition of the contest is planned to include a competition based on the performance of the developed approaches on the released data. The developed algorithms will be ranked based on their accuracy, and the winners will be awarded during IGARSS 2025, Brisbane, Australia.


Following the same approach as in the 2016-2024 editions of the DFC and IGARSS, the IADF TC is proposing the present CCS, which aims to present, in a timely manner, the most effective and novel contributions resulting from the competition. A session is proposed in which the best-ranking submitted papers will be presented, and one slot will be used by the contest organizers to summarize the outcome of the competition. According to the schedule of the contest, the session is currently proposed without explicitly naming speakers and tentative titles; these will be filled in after the competition is completed. It is worth noting that the corresponding papers would not go through the regular paper submission process but would be reviewed directly, in full paper format, by the Award Committee of the Contest. This process will ensure both thorough quality control and consistency with the timeline of the contest and the final paper submission deadline to IGARSS 2025.
Show/Hide Description
CCS.63: Image Analysis and Data Fusion: The AI Era
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
The continued success of remote sensing across various applications, from semantic map production to environmental and anthropic area monitoring, hazard management, and more, has driven the development and deployment of state-of-the-art sensors capable of providing diverse insights into the Earth's condition. These sensors offer data in various modalities, spanning different types of imagery such as optical, hyperspectral, and synthetic aperture radar, as well as other data formats like Lidar point clouds and precise positioning information via GNSS. While early remote sensing efforts and research predominantly focused on individual sensors, contemporary approaches, notably those rooted in machine learning, seek to integrate data from multiple sources as they often offer complementary insights. Today, an even broader array of data sources is available, including crowd-sourced photographs, oblique images, and data from social networks, opening up novel avenues for tackling the most complex challenges in Earth monitoring and comprehension.

Despite the abundant availability and advantages of multimodal data, which encompasses multi-sensor, multi-frequency, and multi-temporal data, the analysis and fusion of information from these sources remain a complex and continuously evolving research frontier. Modern AI strategies founded on deep learning are at the forefront of enabling effective image analysis and multi-sensor data fusion. As a result, image analysis and data fusion have emerged as vibrant and dynamic research domains, characterized by a substantial demand for knowledge exchange, discussion of ongoing challenges, introduction of new datasets, and the proposal of innovative solutions.
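As a hedged, minimal illustration of one of the simplest fusion strategies in this space, the sketch below performs decision-level fusion by averaging class probabilities from separately trained sensor-specific classifiers. The "optical" and "SAR" features are synthetic stand-ins used purely to show the pattern, not a recommended or state-of-the-art fusion method.

# Decision-level fusion of two sensor-specific classifiers on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)
n = 800
labels = rng.integers(0, 3, n)                              # three land cover classes
optical = rng.normal(labels[:, None], 1.0, (n, 4))          # 4 optical-like features (synthetic)
sar = rng.normal(labels[:, None], 1.5, (n, 2))              # 2 SAR-like features (noisier, synthetic)

train = slice(0, 600)
test = slice(600, n)
m_opt = RandomForestClassifier(random_state=0).fit(optical[train], labels[train])
m_sar = RandomForestClassifier(random_state=0).fit(sar[train], labels[train])

# Average per-class probabilities from the two models, then take the argmax.
fused_proba = 0.5 * m_opt.predict_proba(optical[test]) + 0.5 * m_sar.predict_proba(sar[test])
fused_pred = fused_proba.argmax(axis=1)
print(f"Fused accuracy: {(fused_pred == labels[test]).mean():.2f}")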

The Image Analysis and Data Fusion Technical Committee (IADF-TC) of the Geoscience and Remote Sensing Society is dedicated to addressing these challenges. Its mission is to facilitate connections among experts, provide educational resources for students and professionals, and promote best practices in the realm of image analysis and data fusion applications. Among its various activities, the IADF-TC organizes an annual community-contributed session held during IGARSS, where it assembles the latest and most cutting-edge contributions in research areas such as machine learning, pansharpening, decision fusion, multi-modal data fusion, data assimilation, and multi-temporal data analysis.

This proposed session boasts a long and successful history within IGARSS, consistently held for over a decade. As a well-established session, it garners the full attention of both senior researchers and young scientists. Furthermore, it addresses topics of increasing significance in remote sensing and geoscience, appealing to an interdisciplinary audience with interests spanning methodology and application domains.
Show/Hide Description
CCS.64: Imaging Spectroscopy Analysis and Needs for the Defence and Intelligence Community
T/D.15: Data Analysis — Hyperspectral Data Processing and Analysis
Imaging spectroscopy has always had tremendous potential for national security, humanitarian assistance in war zones, and defence and intelligence applications. Many new government and commercial hyperspectral systems are now available to this community, with more to come. While much information in this community is classified/secured, there are ways to build international bridges so that hyperspectral data can provide the most value to global security. Given the complexity of this type of imagery, users from uniformed staff to decision makers may not be exploiting it to its full potential. As well, the scientific community needs the opportunity to learn how to better support imaging spectroscopy in global security and humanitarian efforts. This session aims to engage the international, and largely science-driven, audience at IGARSS with defence and intelligence hyperspectral applications and calibration/validation needs within the global security community, and to present ways that hyperspectral imagery could be better consumed by worldwide armed forces personnel, first responders, and decision makers. Speakers for this session could be end users explaining needs for training, data access, and rapid application of hyperspectral data; decision makers explaining how they would like to better use imaging spectroscopy; those working in military and intelligence research labs explaining limitations to using hyperspectral data operationally; or first responders who need to deliver analysis rapidly. This session will be unclassified and open to all, with the goal of learning from different organizations and domains and exposing GRSS to a traditionally underserved segment of the IEEE GRSS world.
Show/Hide Description
CCS.65: Improved characterization and understanding of wildfires and their environmental impacts using satellite data and artificial intelligence
SP.1: Special Scientific Themes — Natural disasters and disaster management
The severity of wildfire damage is increasing due to dry weather and climate change around the world. While climate change is a contributing factor to the increasing incidence of wildfires, the consequences of these fires extend far beyond their initial outbreak. Wildfires not only contaminate soil, pollute groundwater, and saturate the atmosphere with harmful substances, but they also devastate ecosystems and release greenhouse gases, further exacerbating the long-term effects of global warming. Numerous challenges remain in understanding these complex relationships and the nature of wildfires. To improve understanding of wildfire behavior, various sources can be utilized, such as remote sensing, numerical models, and chemical transport models. Artificial intelligence is now actively used in environmental science; not only does it show better performance than traditional techniques in monitoring and forecasting, but it is also widely used to extract essential information and to uncover complex relationships underlying disasters.
Therefore, this session invites contributions providing new insights into wildfire behavior through satellite data and artificial intelligence, including extended applications to air quality or climate extremes related to wildfire. This session also welcomes case studies of large fire events. Expected topics include, but are not limited to, the following:
- Wildfire monitoring and forecasting
- Smoke and air quality modeling
- Carbon emission estimation
- Wildfire risk assessment
- Ecosystem recovery and rehabilitation
- Wildfire behavior analysis (e.g. fire spread)
- Climate change and wildfire trends
Show/Hide Description
CCS.66: Innovations and New Methods in Remote Sensing Instrument Design and Calibration
S/I.15: Sensors, Instruments and Calibration — Advanced Future Instrument Concepts
The Community Contributed Session, “Innovations and New Methods in Remote Sensing Instrument Design and Calibration”, will explore cutting-edge advancements in radiometric and spectral calibration hardware and techniques, as well as groundbreaking sensor designs that are critical for the next generation of weather and climate data. As demands for higher spectral and temporal resolution increase, novel approaches are also required to improve radiometric accuracy and performance. This session will also address innovations in calibration algorithms and supporting infrastructure, essential to manage the higher data volumes and enhanced performance capabilities of these future sensors. Equally important is the need for real-time, automated calibration systems to ensure the continuous flow of precise, actionable weather and environmental data.

Revolutionary advancements in radiometric, spectral, and spatial-temporal resolution and accuracy are being driven by emerging hardware such as quantum-based calibration targets and sensors, photonic integrated circuits, and the integration of Focal-Plane Array (FPA) systems in hyperspectral sounders. On the software side, innovations like deep learning for calibration (capable of identifying anomalies and correcting errors) are on the horizon. Autonomous calibration systems are also poised to streamline the recalibration process, ensuring continuous data integrity over time and rapid operational maturity. Along with these innovations, new sensor designs, including hyperspectral microwave sounders (in Low Earth Orbit (LEO) as well as Geostationary Earth Orbit (GEO)), hyperspectral CubeSat constellations, and instruments that cover under-exploited portions of the electromagnetic spectrum (such as VSWIR band and solar reflective band measurements), will enable unprecedented measurement capabilities.

These advancements in sensor design and calibration hold the potential to significantly transform weather and climate science. By providing more accurate and detailed observations across the electromagnetic spectrum, they will improve our ability to monitor key environmental variables such as atmospheric temperature, moisture profiles, and land and sea surface properties. This enhanced observational capacity will enable better forecasting of extreme weather events, contribute to more precise climate modeling, and support the development of early warning systems for natural disasters. Ultimately, the innovations discussed in this session will play a critical role in advancing our understanding of Earth’s climate systems, facilitating more informed decision-making in response to global environmental challenges.
Show/Hide Description
CCS.67: Innovative Sciences and Technologies Developed for Detecting Severe Storms and Their Devastating Impacts on Life and Properties
M.3: Atmosphere Applications — Atmospheric Sounding
As climate change accelerates, the frequency and intensity of severe storms are increasing, posing greater challenges for accurate detection and prediction. This session invites contributions focusing on innovative methodologies developed for detecting storms from satellite data, incorporating both physical retrieval techniques and artificial intelligence (AI) approaches. With the availability of high-resolution satellite data, both temporally and spatially, there is immense potential for enhancing our understanding of storm dynamics and improving early warning systems.

Submissions are encouraged to explore the use of data from geostationary meteorological satellites, such as GOES, AHI, FY-4, and MSG, polar-orbiting satellites like the Sentinel and GF series, as well as active and passive microwave sensors (SAR, microwave radiometers, etc.). These satellites provide crucial information through visible and infrared imaging that can be leveraged to monitor atmospheric and surface conditions. Derived products, such as atmospheric profiles and surface parameters, offer vital insights into phenomena like convective initiation (CI), severe precipitation, surface flooding, and landslides. The methodologies proposed should aim to refine the tracking of these events, ensuring timely and accurate detection.

Additionally, this session encourages contributions that focus on assessing the socioeconomic impacts of storms, particularly the aftermath, which can include damage to infrastructure, agriculture, and local economies. Studies that utilize satellite-derived data to evaluate these impacts and propose mitigation or recovery strategies are highly valued.

This session serves as a platform for interdisciplinary collaboration, inviting meteorologists, data scientists, remote sensing experts, and economists to present their work. By combining cutting-edge technology with in-depth analysis, the session aims to foster innovative solutions to enhance storm detection, impact assessment, and response strategies in the face of a rapidly changing climate.
Show/Hide Description
CCS.68: Integrated Multi-Satellite, GNSS and Observing Systems for monitoring and forecasting associated disasters from Cyclones/Typhoons/Hurricanes in Changing Climate
SP.1: Special Scientific Themes — Natural disasters and disaster management
The frequency and intensity of cyclones/typhoons/hurricanes are increasing globally due to strong land-ocean coupling and complex interactions between the land, ocean, and atmosphere. In the changing climate scenario, the extreme rainfall and strong winds associated with these storms impact coastal areas wherever landfall occurs, leading to extreme floods that affect roads, buildings, and the day-to-day life of people living in the area. These impacts are exacerbated by urbanization in coastal areas, which amplifies the socio-economic consequences of such extreme events. A comprehensive approach to storm monitoring and forecasting, utilizing integrated observation systems, including multi-satellite, GNSS, airborne, and oceanic platforms, is required for early warning and forecasting. Multi-satellite systems serve as the backbone of storm tracking, providing real-time data on storm development, movement, and intensity. Geostationary and polar-orbiting satellites enable continuous observation of storm patterns, capturing key data on sea surface temperature, precipitation, and atmospheric conditions essential for accurate forecasting. Ocean-based observing systems, such as buoy networks and ARGO floats, provide surface and subsurface ocean information that can be used to track spatial and temporal changes in ocean parameters influencing storm formation and strength. GNSS receivers positioned on land and at sea help monitor atmospheric parameters, such as water vapor content, which affect storm growth and predictability. Airborne and drone systems are also helpful, especially when storms approach land. Radar and meteorological networks on land add another critical layer, particularly in the final stages of storm monitoring. Thus, observing systems on land and ocean, together with multi-satellite, airborne, and drone systems, provide valuable information for disaster monitoring, forecasting, and early warning, helping people evacuate likely affected areas and saving lives.

This session aims to discuss recent developments and the understanding of the genesis and quantitative impact of these deadly events. It will feature contributions that deepen the scientific understanding of storm genesis and explore novel methodologies for quantifying their impact. Contributions are welcome that utilize all kinds of observing systems on land, ocean, satellite, and GNSS systems, covering all aspects of monitoring and mitigating socio-economic impacts, as well as safeguarding high-rise buildings from strong winds in vulnerable areas. By integrating advanced observing systems with disaster mitigation strategies, this session aims to explore innovative approaches to protect infrastructure and populations in high-risk regions. The session also welcomes contributions based on models, simulations, and the use of social media to reach people in storm-affected areas.
Show/Hide Description
CCS.69: Large-scale forest vertical structure, biophysical parameters and forest change mapping with the fusion of spaceborne radar and lidar/optical sensors
L.4: Land Applications — Forest and Vegetation: Biomass and Carbon Cycle
Since forest structure is of great value to terrestrial ecology, habitat biodiversity, and global carbon storage assessments, it is desirable to monitor and quantify the state of, and change in, the forest vertical structural profile, aboveground biomass and height, along with other forest biophysical characteristics (e.g., LAI). It is important to generate such products at large scale (e.g., regional and global) and moderate resolution (e.g., a few hectares down to sub-hectare) because they are useful not only for carbon storage accounting and carbon cycle dynamics modeling but also for supporting efforts aimed at quantifying biodiversity, particularly given the rapid declines and losses of many plant and animal species world-wide.

To address this scientific goal, the remote sensing community has been working towards combining multi-sensor measurements, such as Synthetic Aperture Radar (SAR) and lidar as well as optical sensors. For example, these include JAXA's ALOS/ALOS-2/ALOS-4 (single L-band SAR) and MOLI (lidar) missions, NASA's NISAR (single L-band SAR) and GEDI (lidar), DLR's TanDEM-X (twin X-band SAR), ESA's BIOMASS (single P-band SAR) as well as China's Lutan-1 (bistatic L-band SAR) and TECIS (lidar). Commercial SAR satellites have also been emerging in recent years, such as the X-band SAR constellation from Capella Space, and the multi-static X-band SAR constellation PIESAT-1 from PIESAT Information Technology. Other spaceborne passive optical sensors including NASA’s Landsat and ESA’s Sentinel-2 have also been successfully combined with lidar missions for large-scale forest parameter mapping.

Radar (and passive optical sensors) provide complete spatial coverage and good spatial resolution with moderate accuracy in measuring vertical structure, with microwave radar penetrating deeper into the canopy than passive optical sensors. In contrast, spaceborne lidar offers sparse coverage but much higher vertical confidence. This session will therefore present novel scientific algorithms that combine the complete spatial coverage of radar/optical data with the precise vertical measurements of lidar, so that large-scale (potentially global-scale) maps of forest vertical structure, biophysical parameters (e.g., aboveground biomass/height), and their time-series changes can be generated through the fusion of multiple spaceborne SAR techniques, including (but not limited to) Interferometric SAR (InSAR), Polarimetric InSAR (PolInSAR), and Tomographic SAR (TomoSAR), as well as lidar and/or passive optical data.
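As a purely illustrative sketch of this fusion idea (not an algorithm from any of the missions listed above), the following Python snippet calibrates a hypothetical wall-to-wall radar/optical height proxy against sparse lidar footprints with a linear regression, then applies a placeholder power-law allometry to map aboveground biomass; all arrays, footprint locations, and coefficients are synthetic assumptions.

# Illustrative only: calibrate a wall-to-wall radar/optical height proxy against
# sparse lidar footprints, then map biomass with a placeholder allometric model.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
proxy = rng.uniform(0.0, 40.0, size=(500, 500))          # radar/optical height proxy (m), full coverage
rows = rng.integers(0, 500, 2000)                         # sparse lidar footprint locations (synthetic)
cols = rng.integers(0, 500, 2000)
lidar_rh98 = 1.1 * proxy[rows, cols] + rng.normal(0.0, 2.0, 2000)  # lidar canopy height at footprints

# Fit proxy -> lidar height at footprint locations, then apply wall-to-wall
model = LinearRegression().fit(proxy[rows, cols].reshape(-1, 1), lidar_rh98)
height_map = model.predict(proxy.reshape(-1, 1)).reshape(proxy.shape)
height_map = np.clip(height_map, 0.0, None)               # heights cannot be negative

# Hypothetical power-law allometry AGB = a * H^b (coefficients are placeholders)
a, b = 0.25, 1.9
agb_map = a * height_map ** b                              # illustrative biomass map
print("mean calibrated height (m):", round(height_map.mean(), 2))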
Show/Hide Description
CCS.70: LiDAR for Sustainable Development Goals
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
LiDAR technology has emerged as a crucial tool for advancing the United Nations' Sustainable Development Goals (SDGs) by offering precise and scalable solutions for environmental monitoring, urban planning, and resource management. This is particularly important as high-resolution 3D data contributes to addressing global challenges such as climate change, biodiversity loss, and sustainable infrastructure development. This session focuses on the various sustainable development applications of LiDAR, with the purpose of supporting climate action, protecting ecosystems, developing renewable energy, and enabling the sustainable growth of cities. Participants will discuss state-of-the-art techniques in LiDAR data acquisition, processing, and analysis, including the integration of AI and machine learning for more accurate predictions and insights. By fostering interdisciplinary collaboration, LiDAR provides emerging technological support for the 2030 Agenda for Sustainable Development, promoting balanced growth across social, environmental, and economic dimensions.

This Chair session invites prospective authors to submit papers that explore the applications of LiDAR technology in advancing the Sustainable Development Goals (SDGs).

Topics To Be Covered

The chair session seeks contributions that may address, but are not limited to, the following topics:
1. LiDAR for Environmental Monitoring and Natural Resource Management
2. LiDAR for Climate Change Impact Assessment
3. LiDAR for Smart Cities and Infrastructure Development
4. LiDAR for Coastal zone and wetland management
5. LiDAR for Forest structure and carbon stock estimation
6. LiDAR for Disaster Management and Emergency Response
7. LiDAR Applications in Extreme Environments
8. LiDAR for Biodiversity and Ecosystem Services
9. Applications of LiDAR with LLMs
10. Integration of Socioeconomic Factors with LiDAR Data
11. LiDAR for Renewable Energy Site Assessments
12. Interdisciplinary Collaboration and LiDAR Potential
13. Policy Integration with LiDAR Technology
14. LiDAR-Derived Metrics for Quantifying Intelligent Processing
15. Novel case studies in object inventory and management
16. Real 3D Visualization in Urbanization
17. LiDAR retrieval algorithm benchmarking studies and validation protocols
Show/Hide Description
CCS.71: Low Earth Orbit (LEO) satellite missions and their contribution to Earth science applications
S/M.7: Mission, Sensors and Calibration — New Space Missions
Polar orbiting environmental satellites in Low Earth Orbit (LEO) are critical for global monitoring of Earth and its environment. With three operational satellites in orbit, NOAA's Joint Polar Satellite System is providing resilient and reliable Earth observations for operational meteorology and other mission-critical applications. Two more satellites in the series are under development, and the constellation is expected to provide backbone observations that support both short- and long-term weather forecast models well into the next decade. In collaboration with partner agencies such as NASA, ESA, EUMETSAT and JAXA, NOAA provides its stakeholders with a large set of global observations from LEO. The synergy between NOAA and its partner missions offers significant benefits to users, such as improved global refresh of observations and complementarity of measurements in multiple regions of the electromagnetic spectrum, enabling detailed monitoring of the Earth. In addition to providing timely and critical observations for extreme weather and disasters, the long historical data record from LEO satellites is also critical for climate monitoring. This session invites international agencies to present the status of their current and future LEO missions that make routine observations to monitor the Earth and its environment for applications that support decision makers. Subjects covered in this session will include applications and accomplishments from current operational missions as well as plans for new missions from space agencies. Presentations that demonstrate the societal and economic value of LEO observations to applications that are integral to decision making are also of high interest to this session.
Show/Hide Description
CCS.72: Microwave Radiometry Calibration
S/I.16: Sensors, Instruments and Calibration — Microwave Radiometer Calibration
Radiometry calibration plays a critical role in ensuring the accuracy and reliability of remote sensing data, which is vital for various applications in geoscience. This session will focus on the techniques, standards, and challenges associated with radiometric calibration, aiming to provide a platform for experts to discuss recent advancements, methodologies, and best practices. By calibrating the sensors used in remote sensing instruments, we can ensure that the measured radiance from the Earth's surface is consistent, comparable, and interpretable across different platforms and over time. This is essential for both qualitative and quantitative analyses of remotely sensed data in a wide range of scientific and industrial applications, including climate monitoring, agriculture, environmental management, disaster response, and natural resource exploration.
Scope and Session Description:
The session will cover a broad range of topics within radiometry calibration, including but not limited to:
•	Absolute and relative radiometric calibration: Discussion on the latest techniques for ensuring sensor accuracy, consistency, and precision in measuring radiance values.
•	Pre-launch and in-orbit calibration: Exploration of ground-based calibration techniques and the challenges of maintaining sensor accuracy once deployed in orbit.
•	Cross-calibration of sensors: A look at methodologies for harmonizing data from different sensors, platforms, and missions to ensure seamless data integration (a brief illustrative sketch follows this list).
•	Calibration traceability and standards: Ensuring that radiometric measurements can be traced back to a recognized reference, allowing for comparability over time and between sensors.
•	Challenges in calibration of multi- and hyperspectral sensors: Specific challenges related to the growing use of high-resolution spectral data in remote sensing.
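As a minimal, hedged illustration of the cross-calibration idea mentioned above (not a recommended operational procedure), the sketch below fits a linear gain/offset correction between synthetic coincident brightness temperatures of a hypothetical target radiometer and a reference; all values are invented for demonstration.

# Illustrative only: relative cross-calibration of a target radiometer against a
# reference using coincident (matched) brightness temperatures. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
tb_reference = rng.uniform(150.0, 300.0, 5000)                     # reference brightness temps (K)
tb_target = 1.02 * tb_reference - 3.5 + rng.normal(0.0, 0.5, 5000)  # target sensor with gain/offset bias

# Least-squares fit of reference vs. target gives gain and offset corrections
gain, offset = np.polyfit(tb_target, tb_reference, 1)
tb_target_corrected = gain * tb_target + offset

bias_before = np.mean(tb_target - tb_reference)
bias_after = np.mean(tb_target_corrected - tb_reference)
print(f"bias before: {bias_before:.2f} K, after: {bias_after:.2f} K")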
Importance to Geoscience and Remote Sensing:
The importance of accurate radiometry calibration cannot be overstated in geoscience and remote sensing. The increasing reliance on remote sensing data for monitoring environmental changes, climate models, and resource management requires that the data is accurate and reliable. Without proper calibration, inconsistencies between sensors, platforms, or even temporal measurements can lead to significant errors, rendering the data unsuitable for scientific analysis and decision-making. Moreover, with the rapid advancements in sensor technology and the proliferation of satellite missions, there is an urgent need for consistent and robust calibration practices to ensure the comparability of data from various sources.
This session will bring together researchers, practitioners, and stakeholders to discuss how to advance the current state of radiometry calibration. It will highlight the critical need for ongoing research and collaboration to address the evolving challenges in this field and ensure that remote sensing continues to provide accurate, actionable insights for geoscience applications.
Show/Hide Description
CCS.73: Mine site re-evaluation, monitoring and restoration
L.7: Land Applications — Topography, Geology and Geomorphology
This theme invites contributions that utilise hyperspectral technologies to advance the environmental management of mining sites. It emphasises the integration of hyperspectral data for a variety of innovative applications, including the identification of new resources of critical metals within existing mine waste and the enhancement of restoration efforts. Hyperspectral imaging is a sophisticated technique that captures and analyses a wide range of light wavelengths, providing detailed information that can significantly improve resource detection and environmental monitoring. By focusing on the use of these advanced techniques in real-time monitoring and restoration planning, this theme aims to highlight how hyperspectral technologies can substantially boost the sustainability and environmental stewardship of mining operations.
Moreover, this theme explores the development and implementation of site rehabilitation strategies, with a particular focus on how hyperspectral data can contribute to understanding and mitigating the ecological impacts of mining activities. Contributions are encouraged to investigate how hyperspectral imaging can be utilised to assess and monitor progress in site rehabilitation efforts, such as tracking soil health, evaluating vegetation recovery, and gauging overall ecosystem restoration. This approach seeks to provide a comprehensive view of how these technologies can aid in creating more effective and environmentally responsible mining practices.
By addressing these key areas, the theme aspires to demonstrate the transformative potential of hyperspectral technologies in advancing sustainable mining practices. It aims to illustrate how such technologies can lead to improved outcomes for both the environment and the communities affected by mining, ultimately fostering long-term ecological health and resilience. Through these contributions, the theme seeks to underscore the pivotal role of hyperspectral data in shaping the future of environmentally conscious mining and restoration efforts.

Show/Hide Description
CCS.74: ML and AI-Based Noise Reduction and Image Enhancement
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Noise reduction and image enhancement are fundamental steps in the pipeline that transforms remotely sensed measurements into valuable information. These families of operations have traditionally relied on statistics and signal processing techniques, but the advent of Artificial Intelligence is reshaping the way we think about, design, apply, and quantitatively assess them in Remote Sensing.

This call for submissions invites researchers, practitioners, and industry experts to explore the application of ML and AI in noise reduction and image enhancement. We are particularly interested in original papers addressing the following themes: 

Algorithm Development: Innovative ML and AI algorithms aimed at improving noise reduction and enhancing image understanding. 

Comparative Studies: Assessments of AI techniques against traditional methods, going beyond classical metrics such as Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index (SSIM), and visual quality (see the illustrative sketch at the end of this description).

Remote Sensing Applications: Case studies illustrating the effectiveness of AI-driven methods in fields like environmental monitoring, urban planning, and disaster management. 

Handling Data Diversity: Strategies addressing diverse noise patterns, resolutions, and sensor characteristics within image datasets. 

We aim to foster collaboration and knowledge sharing within the community, advancing the integration of AI and ML in remote sensing applications. 
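For the comparative-studies theme above, the following minimal sketch shows how PSNR and SSIM are typically computed against a clean reference using the scikit-image metric functions; the images are synthetic and the "denoiser" is a placeholder, so the numbers carry no significance.

# Illustrative only: comparing a denoised image against a clean reference with
# PSNR and SSIM. Images are synthetic placeholders.
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

rng = np.random.default_rng(2)
clean = rng.uniform(0.0, 1.0, (256, 256))                        # reference image
noisy = np.clip(clean + rng.normal(0.0, 0.1, clean.shape), 0, 1)  # degraded image
denoised = (noisy + clean) / 2                                    # stand-in for any denoiser output

for name, img in [("noisy", noisy), ("denoised", denoised)]:
    psnr = peak_signal_noise_ratio(clean, img, data_range=1.0)
    ssim = structural_similarity(clean, img, data_range=1.0)
    print(f"{name}: PSNR={psnr:.2f} dB, SSIM={ssim:.3f}")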

 
Show/Hide Description
CCS.75: Modeling in GNSS-R and other signal-of-opportunity systems
T/M.27: Modeling — EM Modeling for Signals of Opportunity (e.g. GNSS-R)
The session is aimed at collecting research on modeling in global navigation satellite system reflectometry (GNSS-R) and other signal-of-opportunity reflectometry (SoOp-R) systems. Both land and ocean applications are considered. The session is organized by the GNSS-R Working Group of the Modeling In Remote Sensing Technical Committee (MIRS TC) of the IEEE Geoscience and Remote Sensing Society (GRSS). It aims to give a picture of the state of the art of modeling research and its impact on data processing.
The use of GNSS signals reflected by land and sea surfaces has attracted strong scientific and industrial interest. The study of wind fields over the ocean, soil moisture, vegetation biomass, as well as the freeze/thaw state of soils at higher latitudes are nowadays consolidated applications of GNSS-R. Research groups from several countries are currently involved in research activity targeting the exploitation of data produced by the NASA CYGNSS mission, whilst preparation of the HydroGNSS mission is approaching launch, scheduled for 2025.
Great progress has been made in understanding the physics describing the interaction of navigation signals with natural surfaces and the corresponding impact on the estimation of bio-geophysical parameters. However, improving the understanding of the scattering demands further efforts, especially when different frequencies and system parameters (e.g., receiver configuration and antenna patterns) are considered. The exploitation of both coherent and incoherent reflections, and their generation across the illuminated scene, is also the object of ongoing research.
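For orientation, the incoherent GNSS-R forward model discussed in this session is often written schematically in the Zavorotny–Voronovich form (exact normalization conventions vary across the literature):

\[
\big\langle |Y(\tau, f)|^{2} \big\rangle \;=\;
\frac{P_t\,\lambda^{2}\,T_i^{2}}{(4\pi)^{3}}
\int_{A}
\frac{G_t\,G_r\;\sigma^{0}(\vec{r})\;
\Lambda^{2}\!\big(\tau-\tau(\vec{r})\big)\;
\big|S\big(f-f_D(\vec{r})\big)\big|^{2}}
{R_t^{2}(\vec{r})\,R_r^{2}(\vec{r})}\; d^{2}r
\]

where P_t is the transmitted power, \lambda the wavelength, T_i the coherent integration time, G_t and G_r the transmitter and receiver antenna gains, \sigma^{0} the bistatic scattering coefficient over the glistening zone A, \Lambda and S the code autocorrelation and Doppler filter functions, and R_t, R_r the transmitter-to-surface and surface-to-receiver ranges.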
Show/Hide Description
CCS.76: Monitoring of Snow/Glacier Lakes and Associated Natural Hazards Using Ground, Drone, Radar, Airborne, and Satellite Data
SP.1: Special Scientific Themes — Natural disasters and disaster management
Snow and glaciers in mountainous areas are sensitive indicators of climate change, which is causing more frequent natural hazards such as snow avalanches, landslides, rockslides, floods, subsidence, and earthquakes. Population growth in mountainous regions requires the development of infrastructure, which is itself subject to natural hazards and contributes to the growing threat to the population living in the area. Greenhouse gas emissions and the long-range transport of dust and black carbon directly impact snow and glaciers, thus increasing the frequency of natural hazards. In recent years, deadly disasters have occurred in the Himalayan region (Chamoli, Kedarnath, Joshimath), affecting the people living in the surrounding areas.
The proposed session invites papers utilizing ground, satellite, drone, and airborne data with AI/ML to provide early warning, mapping, and recovery plans before and after natural hazards, helping communities living in vulnerable and affected areas.
Show/Hide Description
CCS.77: Moving Target Indication Using Synthetic Aperture Radar
T/D.11: Data Analysis — Object Detection and Recognition
Synthetic Aperture Radar (SAR) has been widely studied for its robust, all-weather, periodic data acquisition. In particular, advances in radar sensors have enabled the acquisition of high-resolution SAR images, which are applied to monitoring ground and maritime targets. Identifying the motion of ground and maritime targets using SAR provides invaluable information on dark vessels, military vehicles, and aircraft, which can be directly used in decision making on national security issues. Moving target indication using SAR is therefore one of the most significant research topics in remote sensing, in terms of both its practical application and its development.
Conventional SAR moving target indication is performed using multi-channel radar, often by means of Displaced Phase Center Antenna (DPCA) or Along-Track Interferometry (ATI). The two algorithms exploit the amplitude and phase differences of multi-channel SAR images, thereby identifying targets in motion or estimating their line-of-sight velocity, respectively. Beyond technical advancements in DPCA and ATI, moving target indication with other sensor configurations, such as single-channel SAR, geosynchronous SAR, and multi-input-multi-output SAR, has also been investigated. Robust and precise moving target indication algorithms for vessels, vehicles, and aircraft can significantly advance practical military reconnaissance technologies.
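To make the DPCA/ATI principle concrete, here is a toy, self-contained sketch on synthetic single-look complex channels; the platform parameters, the simple phase-to-velocity scaling (which assumes a channel time lag of B/v; conventions differ), and the point target are all hypothetical, and real systems additionally require channel balancing, precise co-registration, and clutter statistics.

# Illustrative only: toy DPCA and ATI on two co-registered single-look complex channels.
import numpy as np

rng = np.random.default_rng(3)
shape = (512, 512)
clutter = (rng.normal(size=shape) + 1j * rng.normal(size=shape)) / np.sqrt(2)

# Stationary clutter appears identically in both channels after time alignment; a
# moving target adds an along-track interferometric phase set by its line-of-sight velocity.
wavelength, baseline, platform_v = 0.031, 2.0, 7600.0    # m, m, m/s (hypothetical X-band system)
v_los = 5.0                                              # target line-of-sight velocity (m/s)
target_phase = 4 * np.pi * baseline * v_los / (wavelength * platform_v)  # assumes lag = B/v

ch1 = clutter.copy()
ch2 = clutter.copy()
ch1[200, 300] += 5.0                                     # bright moving target in channel 1
ch2[200, 300] += 5.0 * np.exp(-1j * target_phase)        # same target, phase-shifted in channel 2

dpca = np.abs(ch1 - ch2)                                 # DPCA: stationary clutter cancels, movers remain
ati_phase = np.angle(ch1 * np.conj(ch2))                 # ATI: interferometric phase map
v_est = wavelength * platform_v * ati_phase[200, 300] / (4 * np.pi * baseline)
print(f"DPCA peak: {dpca.max():.2f}, estimated v_los: {v_est:.2f} m/s")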
This community-contributed session is dedicated to cutting-edge technologies for SAR moving target indication, including not only technical enhancements of conventional DPCA and ATI algorithms but also (i) moving target indication with various SAR sensors, (ii) application of precise moving target indication algorithms to different ground and maritime targets, (iii) estimation of complex information on target motion beyond its velocity, and (iv) their uses in military reconnaissance.
Show/Hide Description
CCS.78: Multifrequency Microwave Applications to Soil and Vegetation: Observations and Modeling
L.3: Land Applications — Forest and Vegetation: Application and Modelling
The increasing impacts of climate change, with ever more frequent extreme weather events and persistent droughts, strongly affect crop productivity, health, and yields. Recently, these risks have been expanding in several regions within the Mediterranean basin. Effective management of soil and water resources can therefore be considered a key factor in achieving sustainable food production. In this context, the monitoring of soil and vegetation from space can greatly enhance our understanding of Earth's surface processes such as the hydrological cycle and crop productivity. With the rapid advancements in remote sensing technology, the estimation of several natural parameters at high spatio-temporal resolution is becoming increasingly feasible.
The role of soil moisture and vegetation biomass in the climate system has been studied for a long time by the climate research community, showing an enhanced understanding after the addition of remotely sensed products. The essential role of soil moisture in the climate system motivated the Global Climate Observing System (GCOS) and ESA to endorse soil moisture as an Essential Climate Variable (ECV) and introduce it to their Climate Change Initiative programme. 
Remote sensing techniques support observations of the most important surface characteristics according to the sensors used: optical sensors, such as visible-near-infrared, hyperspectral, and thermal infrared instruments, have limited penetration capabilities, making it impossible to obtain soil information in the presence of dense vegetation layers, or to observe the entire vegetation structure in the case of well-developed plants. Nevertheless, the signal at these wavelengths is very sensitive to vegetation pigments and photosynthetic activity. Furthermore, the use of these sensors is hampered by weather conditions such as clouds and rainfall, and by solar illumination.
Microwave sensors, such as synthetic aperture radar (SAR) and microwave radiometers, instead penetrate deeper into agricultural and forest cover and can observe targets regardless of solar illumination and cloud cover. Microwave observations from space-borne sensors are ideal for soil moisture and vegetation biomass retrieval due to their high sensitivity to water in the observed bodies and their frequent revisit time. For this reason, a variety of microwave sensors, from both active and passive systems, have been observing the Earth's surface since the late 1970s with this aim. Microwave observations can consequently be used for the retrieval of these land surface parameters and may ultimately be integrated into existing (long-term) data products.
Due to the distinctive characteristics of optical and microwave satellite data, the integration of these two types of measurements enables a comprehensive approach to assessing crop health, water status, and soil conditions, overcoming the constraints of using each sensor type alone and thus obtaining more in-depth information on the canopy.
In this session, approaches based on microwave observations, also integrated with optical data, for estimating soil and vegetation parameters will be described. Both active and passive sensors will be considered, together with newly implemented algorithms and models.
Show/Hide Description
CCS.79: Multisensor data fusion and geospatial data intelligence
T/D.17: Data Analysis — Data Fusion
Recent developments in earth observation sensing technology have yielded new knowledge, methods, and tools for collecting and processing remotely sensed data. This session will provide a platform to discuss challenges and opportunities in multisource data fusion, AI methods for efficient and accurate remote sensing image classification, and 3D mapping, along with their applications such as change detection for disaster management and assessment, land use and land cover analysis, and environmental monitoring and assessment.

The variety of earth observation platforms (e.g., satellites, aircraft, and drones) and sensor types (e.g., LiDAR, optical, SAR, hyperspectral), along with in-situ sensors, has increased noticeably during the last decade. As a result, massive amounts of rich data are systematically produced. This increase in the number and heterogeneity of earth observation data sources presents both opportunities and challenges. Opportunities result from the refinement of sensors and sensing methods, the ability to capture increasingly granular data via those improved sensors and methods, and the improved data storage, manipulation, analysis, and interpretive capacity of modern computational hardware and software, cloud computing, and IoT. Challenges result from the difficulty of accurately fusing data so that a coherent, comprehensive, fully detailed, and integrated understanding of multisensor data can be achieved. This session brings together state-of-the-art research and development on AI algorithms, techniques, and software tools to address the above-mentioned challenges and to better exploit spatial and temporal features and dependencies in RS data. The focus of the session will be on data-driven models (such as CNN, GAN, RNN, etc.) that are highly adaptable to changes in large remote sensing datasets and can find unexpected patterns in multi-source and multi-temporal data.
Show/Hide Description
CCS.80: Multisource Remote Sensing for Enhanced Monitoring and Assessing Agricultural Land Applications
L.5: Land Applications — Agriculture
Multisource remote sensing is an advanced approach for monitoring and assessing agricultural lands to support decision making, covering applications for plants/crops and animals/livestock within the scope of large-scale One Health agricultural and ecological systems. Multisource remote sensing spans platforms (satellite, piloted aircraft, unmanned aerial vehicles, and ground-based systems) and sensors, typically optical multi- and hyperspectral (visible, visible-near infrared, short-wave infrared, and thermal), Synthetic Aperture Radar (SAR), and Light Detection And Ranging (LiDAR). It builds on single-source remote sensing, which alone may not provide acceptable performance for a given research task. Integrating single-source remote sensing and finding the optimal single sources for multisource remote sensing present a challenge for scientists and engineers seeking to enhance remote sensing monitoring and assessment. Topics are therefore welcome on how to integrate multisource remote sensing across platforms and sensors, building on standalone single-source remote sensing, for effective agricultural land monitoring and assessment. For example, image data fusion from different sources can be investigated at the low (input/pixel) level, the intermediate (feature) level, and the high (output/decision) level; a minimal sketch of feature-level fusion is given below. In the current age of artificial intelligence (AI), with its wide applications in information processing and analysis in geoscience and remote sensing, contributions are also expected to discuss whether AI is necessary for agricultural land applications through multisource remote sensing and to demonstrate how machine and deep learning, or new AI schemes such as Large Language Models, can help enhance monitoring and assessment. Reports on software tools and hardware systems that provide practical solutions for multisource remote sensing integration and AI enhancement for operational monitoring and assessment of agricultural land applications are especially encouraged. Overviews of future directions and perspectives on multisource remote sensing for agricultural land applications are also welcome. The contents of this session will contribute to geoscience and remote sensing both technically, through the integration of multisource remote sensing, and practically, through enhanced geospatial analyses in the context of agricultural land applications within global ecosystems.
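The feature-level fusion mentioned above can be sketched, for illustration only, as a simple concatenation of per-pixel optical and SAR features fed to a standard classifier; the features, labels, and resulting accuracy below are synthetic and carry no physical meaning.

# Illustrative only: feature-level (intermediate) fusion of optical and SAR
# observations for a crop-type classification task, on synthetic placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 3000
optical = rng.normal(size=(n, 6))       # e.g., multispectral reflectances or vegetation indices
sar = rng.normal(size=(n, 2))           # e.g., VV and VH backscatter (dB)
labels = rng.integers(0, 4, n)          # hypothetical crop classes

fused = np.hstack([optical, sar])       # concatenate features from both sensors
X_train, X_test, y_train, y_test = train_test_split(fused, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("overall accuracy (meaningless on random data):", clf.score(X_test, y_test))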
Show/Hide Description
CCS.81: Needs and Challenges for Separating Vegetation Structure from Bare Earth Topography for Mapping Earth’s Changing Surface and Overlying Vegetation Structure
S/M.7: Mission, Sensors and Calibration — New Space Missions
Earth's changing surface structure informs us about climate change, natural hazards, ecosystem habitats, and water availability. Separating overlying vegetation and the built environment from bare Earth topography reduces uncertainties in solid Earth, ecosystem, cryosphere, hydrological, and coastal processes, but is challenging. Radar, lidar, and stereoimaging are all methods for estimating surface topography and vegetation structure, and fusing data from each method should provide more robust solutions and reduce uncertainties. Radar provides excellent estimates of vegetation density distribution. Lidar provides accurate estimates of vegetation canopy height and the ground surface. Stereoimaging provides high-resolution digital surface models covering broad areas. Past processes are written in the landscape and enable forecasting of future processes and events; accurate measurement of bare Earth topography will improve their understanding and assessment. Current processes and events change topography and/or vegetation structure gradually over time or abruptly. What science challenges can a Surface Topography and Vegetation (STV) observing system meet? What coverage, accuracy, resolution, and repeat frequency are needed to address the science? What is the best approach to meeting the science and applications needs? This session encourages submissions that demonstrate how measurements of surface topography and vegetation structure advance scientific knowledge, including: algorithm development, data fusion, and modeling, analysis, or demonstration of science traceability to measurement needs; presentation of existing capabilities, data, and missions for estimating surface topography or vegetation; evaluations of orbital and suborbital capabilities and their applications driving the design of an STV observing system; advances in radar, lidar, or stereoimaging technology and methodology, individually or combined; and strengths and weaknesses of individual orbiting instruments, co-flyers, or constellations. All advances in mapping Earth's changing surface and overlying vegetation structure are welcome.
Show/Hide Description
CCS.82: New Frontiers in Satellite Remote Sensing for Geohazard Monitoring: Advanced Algorithms to Real-world Applications
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
In the era of global climate change and rapid urbanization, increasing natural and anthropogenic activities are raising the occurrence of geohazard phenomena across the globe. These include frequent earthquakes, volcanic eruptions, landslides, urban subsidence, floods, tsunamis, droughts, forest fires, etc. For instance, rising relative sea levels combined with local subsidence due to rapid coastal urbanization expose many low-lying coastal cities globally to inundation risk. To mitigate such geohazard phenomena at global to local scales, we require effective, efficient, and advanced monitoring tools and techniques. Satellite-based radar remote sensing has revolutionized the way we monitor coastal environments and geohazards, offering high-resolution data on inundation, land motion, and risk exposure that are crucial for disaster response and planning, thanks to its unique capability to provide imagery day and night and independently of weather conditions. Recent advancements in satellite technology, such as increasing spatial coverage, more frequent revisit times, high spatial resolution, and algorithm development, have broadened the scope of remote sensing applications, facilitating more accurate and timely assessment of coastal hazards and geohazard risks. However, there is a need for an integrated multi-sensor and multi-model approach combining various modern remote sensing, ground-based observation, and other geodetic tools for precisely measuring, monitoring, and predicting the underlying geohazard phenomena.

This session solicits abstracts focused on recent advances in scientific-technical development for satellite remote sensing strategies addressing the opportunities and challenges related to the following areas:

(1)	Effective monitoring frameworks and analytical mechanisms for various geophysical applications, such as landslides, land subsidence, groundwater monitoring, seismic and volcanic studies, flood mapping, inundation modeling, coastal instability assessment, urban infrastructure monitoring, etc.
(2)	Numerical data modeling with artificial intelligence (AI) and deep learning platforms, and integration techniques exploring multi-resolution satellite data (e.g., radar, multispectral, and hyperspectral imagery), bistatic and multistatic SAR systems, and geodetic data at local and global scales. This includes AI-based integration methods for cross-sensor data, aimed at generating Level 2 land change products, improving the accuracy and timeliness of geohazard risk assessments, and bridging the gap between scientific research and practical management strategies.

We welcome contributions from different interdisciplinary fields, including remote sensing, geomatics, geoscience, geodesy, ocean science, hydrology, atmospheric science, and other relevant areas.
Show/Hide Description
CCS.83: New Observations and Applications of Multidimensional SAR
T/S.1: SAR Imaging and Processing Techniques — Interferometry: Along and Across
Synthetic aperture radar (SAR) is an active earth observation system that utilizes microwave electromagnetic waves to detect and perform long-distance imaging, regardless of weather conditions. It significantly enhances the radar's information acquisition capabilities. SAR holds great significance in both military and civilian domains and finds widespread applications in target recognition, battlefield environment detection, terrain mapping, and resource prospecting, among other fields. The evolution of SAR systems has undergone a significant transition, progressing from single-band and single-polarization configurations to multi-polarization, multi-frequency, multi-angle setups. This transition has been driven by advancements in technology, increasing application requirements, and deeper scientific investigations. The development of SAR systems can be viewed as an ongoing exploration and utilization process of microwave electromagnetic resources.
With the rapid development of microwave imaging technology, traditional single-dimensional imaging is no longer able to meet increasingly complex demands. A traditional microwave imaging system, which treats each dimension in isolation, fails to integrate the acquired information organically and is thus unable to comprehensively present target characteristics. This results in poor visibility and discernibility of targets, making interpretation difficult and greatly limiting the promotion and application of microwave imaging technology. To address this problem, multi-dimensional joint observation SAR imaging technology has been developed, which constructs a multi-dimensional observational space and integrates the multi-dimensional observational signals to obtain additional information. With the development of SAR imaging technology and the growth of its application needs, data acquisition is becoming increasingly diversified, gradually developing from single polarization, single angle, and single band to multi-polarization, multi-frequency, multi-angle, and multi-temporal observation. Multidimensional SAR will provide theoretical and methodological support for high-precision mapping of complex terrain, three-dimensional ocean exploration, and quantitative monitoring of forest resources and ecological environment elements.
This session targets observations and applications of multi-dimensional SAR using two or more dimensions (resolution, frequency, polarization, angle, and time), with an emphasis on innovative approaches including the extraction of new observations, the development of new sensors, and the expansion to new applications.
Show/Hide Description
CCS.84: New Satellite Laser Data for Terrain Modeling
S/M.5: Mission, Sensors and Calibration — Spaceborne LIDAR Missions
In Earth observation missions, the emergence of advanced satellite-based laser measurement technologies has significantly enhanced the capability and accuracy of acquiring three-dimensional information of targets, paving the way for new technical approaches to high-precision global terrain modeling. 
Satellite-based laser measurements are subject to a variety of complex error sources, including those from the instrument, environment, and target. A thorough analysis of these error sources and their impact on measurement results, along with the development of corresponding calibration and correction methods, is crucial for improving the accuracy and precision of measurement data. This provides a reliable foundation for subsequent data applications. Considering the complexity and variability of global terrain, developing efficient filtering and classification techniques to remove noise and accurately identify target signals is essential for further enhancing the validity and reliability of the data. 
Integrating high-precision control data from satellite-based lasers with multi-source observation data, such as satellite imagery and synthetic aperture radar (SAR), can effectively improve the accuracy of existing digital elevation model (DEM) products. The accumulation of massive satellite-based laser data from long-term missions enables the generation of high-quality terrain models over large regions or globally.
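As a hedged illustration of how sparse, high-accuracy laser elevations can serve as control data for an existing DEM (one simple option among many), the sketch below fits and removes a planar bias estimated at synthetic footprint locations; the terrain, bias model, and noise levels are all assumptions.

# Illustrative only: use sparse laser-altimeter elevations as control points to
# remove a smooth (here planar) bias from an existing DEM. Data are synthetic.
import numpy as np

rng = np.random.default_rng(5)
ny, nx = 400, 400
yy, xx = np.mgrid[0:ny, 0:nx]
true_dem = 100 + 0.05 * xx + 10 * np.sin(yy / 60.0)          # "true" terrain (m)
dem = true_dem + (2.0 + 0.01 * xx - 0.005 * yy)              # DEM contaminated by a planar bias

# Sparse laser footprints provide near-truth elevations at scattered locations
r = rng.integers(0, ny, 1500)
c = rng.integers(0, nx, 1500)
laser_z = true_dem[r, c] + rng.normal(0.0, 0.1, 1500)

# Fit a plane to the (DEM - laser) differences and subtract it from the whole DEM
diff = dem[r, c] - laser_z
A = np.column_stack([np.ones(r.size), c, r])                 # [1, x, y] design matrix
coef, *_ = np.linalg.lstsq(A, diff, rcond=None)
bias_surface = coef[0] + coef[1] * xx + coef[2] * yy
dem_corrected = dem - bias_surface

print("RMSE before:", np.sqrt(np.mean((dem - true_dem) ** 2)))
print("RMSE after: ", np.sqrt(np.mean((dem_corrected - true_dem) ** 2)))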
Moreover, advancing intelligent processing and information extraction techniques for satellite laser data in specialized fields like forestry and oceanography provides vital data support for global carbon resource monitoring and nearshore terrain surveying. 
This topic will comprehensively showcase the latest developments in satellite laser data processing, product generation, and cutting-edge applications, aiming to foster deeper integration between scientific research and practical applications. By bringing together experts and scholars from various fields, we hope to collectively promote innovation and advancement in satellite laser mapping and remote sensing technology, to better address the environmental and resource challenges facing the world.
Show/Hide Description
CCS.85: Nighttime Remote Sensing for Sustainable Development Goals
D/S.7: Societal Engagement and Impacts — Remote Sensing for Sustainable Development
In recent years, nighttime remote sensing has gained prominence as a tool for monitoring human activities. Peer-reviewed studies in this field have surged significantly, particularly over the past five years, with more than 250 papers published annually on nighttime light. Much of this research focuses on the Asia-Pacific region, where urbanization—and consequently, nighttime light—has been rapidly increasing. Applications of nighttime remote sensing range from detecting power outages caused by natural disasters and analyzing the effects of Covid-19 lockdowns to assessing the impacts of wars, borders, poverty, urbanization, light pollution, skyglow, and cultural or social events. Additionally, nighttime remote sensing has been employed to investigate nighttime impacts on various environmental and societal factors, such as air pollution, urban heat islands, animal behavior, vegetation cycles, and human health. These research areas align with several Sustainable Development Goals (SDGs), including SDG-1 (No Poverty), SDG-3 (Good Health and Well-Being), SDG-7 (Affordable and Clean Energy), SDG-11 (Sustainable Cities and Communities), SDG-13 (Climate Action), SDG-14 (Life Below Water), and SDG-15 (Life on Land).

Nighttime remote sensing is fundamentally distinct from other types of remote sensing. While daytime remote sensing relies on sunlight as the primary illumination source, nighttime remote sensing mainly depends on artificial light generated by human activities on the Earth's surface at night. This unique light source requires specialized processing methods. For example, nighttime light remote sensing is more sensitive to observation angles, time-series analysis is often essential, and the detection and pre-processing of clouds become even more critical.

Recent advancements and the launch of new satellites and sensors have led to a proliferation of nighttime-capable instruments and products, such as DMSP-OLS, VIIRS-DNB (featuring NASA's Black Marble), Landsat-8/9, Luojia-1, Jilin-1, EROS-B, SDGSAT-1, and the International Space Station. These products offer a range of capabilities, from long-term daily time-series continuity to multispectral imaging and very high spatial resolutions, reaching up to 1 meter. Additionally, the integration of nighttime thermal imaging enables researchers to assess both artificial light and heat emissions at night, offering deeper insights into urban environments. Researchers in the community utilize these diverse data products based on their specific needs, frequently collaborating and exchanging insights. This session proposal aims to provide a dedicated platform for researchers to discuss nighttime remote sensing data, analysis methods, and societal applications, particularly in relation to the SDGs.

By bringing nighttime researchers together in this community-contributed session, rather than scattering them across various sessions, participants will have the opportunity to engage in in-depth discussions on sensor calibration, algorithm development, product validation, uncertainties, and nighttime remote sensing applications. This focused session will foster more effective use of nighttime remote sensing products and enhance communication within the field, ultimately advancing geoscience and remote sensing.
Show/Hide Description
CCS.86: Pacific Applications of Earth Observation
SP.3: Special Scientific Themes — Remote sensing for sustainable development in the Asia-Pacific region
The proposed session titled "Pacific Applications of Earth Observation" aims to explore the potential of Earth Observation (EO) data in addressing the challenges faced by Pacific Island Countries and Territories. This session will highlight the applications of EO in critical areas such as climate adaptation, disaster management, marine resource monitoring, the blue carbon economy and food security. By fostering collaboration among representatives of local communities, business, governments, and researchers, we aim to enhance the accessibility and usability of EO data tailored to the specific needs of the Pacific region. The importance of this session lies in its focus on empowering people from the Pacific through knowledge sharing, ensuring that EO tools are effectively integrated into decision-making processes.
Show/Hide Description
CCS.87: Physical Modeling in Microwave and Optical Remote Sensing
T/M.23: Modeling — Electromagnetic Modeling
Modeling the signal collected by a remote sensor, considering the characteristics of the acquisition system (sensor, platform) and the electromagnetic interaction with the target and propagation through the atmosphere, is a key topic in remote sensing. It provides answers to the direct (forward) problems, thus enabling the solutions of the inverse (retrieval) problems. This shall be done in the whole range of the electromagnetic spectrum used to sense the environment.
The session is intended to address the technical space between basic electromagnetic theory, data collected by remote sensing instruments, and analyses/retrieval products. It focuses on models and techniques used to take geometric, volumetric, and material composition descriptions of a scene along with their electromagnetic (e.g., scattering, absorption, emission, optical bidirectional reflectance, dielectric properties, etc.) attributes and then predict for a given remote sensing instrument the resulting observation. 
This session seeks papers that can contribute to 1) improving our understanding of the modeling of microwave and optical signals and their responses to earth surface and atmospheric components; 2) describing advances in data analyses and field experiments for the validation of models and inversion algorithms; and 3) highlighting the sensitivity of measurements by active and passive sensors to relevant bio-geophysical parameters and identifying new observational opportunities and applications. It will provide a forum for researchers and young scientists to share their findings, methods, and possibly software tools, thus enabling and facilitating model comparison and validation work. Modelling in both the microwave and the optical spectrum is encouraged, to foster a cross-fertilization of ideas and explore synergies between the two communities.
Show/Hide Description
CCS.88: Physics Aware AI Models for SAR Data Generation and Understanding: Towards Global Environmental Monitoring
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
The theme of this year's conference, "One Earth," highlights the urgent need for collaborative, global efforts to tackle the threats facing our planet, from climate change to deforestation, natural disasters, and the preservation of ecosystems. The Digital Twin Earth (DTE) provides capabilities to visualize, monitor, and forecast natural and human activity on the planet, contributing to sustainable development and environmental improvement. Global Earth Observation (EO) data play an important role in developing the DTE. By simulating various scenarios, digital twin earth systems can predict future states based on different inputs. This is essential for forecasting climate change, urban development, and natural disasters. Among EO sensors, Synthetic Aperture Radar (SAR), owing to its day-and-night observation capability and independence from atmospheric effects, is the only EO technology that can ensure global and continuous observations. Simulating and understanding SAR data allows for the creation of various scenarios to test hypotheses and understand the potential impact of different policies or environmental changes.

In the field of computer vision, the problem of image generation via generative AI (GenAI) models has been thoroughly discussed and explored. There are, however, some important differences when GenAI and SAR collide. Most existing GenAI models exhibit a deficiency in SAR knowledge and a lack of awareness of the intricate physical properties and complex nature of SAR data, such as polarimetric features, Doppler variations, coherency, spectrum, and different imaging modes including Stripmap, Spotlight, TOPSAR, etc. As a result, they are limited in their ability to produce a wide range of SAR images that possess appropriate physical characteristics, including polarimetry and interferometry.

In this session, we aim to call for papers to develop advanced physics-aware and trustworthy artificial intelligence methods for SAR data generation and understanding, fostering global environmental change monitoring. These technologies can provide more frequent and detailed SAR-simulated imagery used for disaster response planning, even in the absence of real-time data, and augment training datasets for machine learning models designed to predict and track natural disasters across different regions. Despite some initial accomplishments, the full potential of GenAI in the field of SAR remains largely unexplored.

This community-contributed session calls for papers on the following methods and applications.

AI methods include:
- Generative foundation models for SAR
- Applications of advanced generative AI methods for SAR/ISAR
- Physics-inspired generative AI methods for SAR/ISAR image generation
- AI-empowered physical modeling for SAR/ISAR image simulation
- Unification of generative and discriminative models for SAR
- Trustworthy generated SAR data assessment methods
Applications include:
- Climate change adaptation
- Urbanization monitoring, environmental monitoring
- Sustainable development
- Natural disaster warning and management
- What-if scenarios for Digital Twin Earth of land/ocean/polar regions
Show/Hide Description
CCS.89: Physics-Informed Machine Learning in Remote Sensing
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Machine Learning (ML) has transformed remote sensing by offering powerful data-driven tools capable of analyzing vast datasets and solving complex problems. However, traditional ML models may struggle to generalize across novel datasets or environmental conditions when underlying physical processes are poorly understood. Moreover, machine learning has often been criticized for the difficulty of model interpretation and the lack of physical consistency. 
The proposed CCS aims to explore the potential of Physical Model Informed Machine Learning (PIML) methods, which integrate the strengths of physical models (rooted in electromagnetic theory, sensing characteristics, and geophysics) with the data-driven flexibility of machine learning. By embedding fundamental physical constraints, PIML aims to develop robust and interpretable models that generalize well even with sparse data. This hybrid approach is particularly relevant in applications where remote sensing data may be limited or unrepresentative, such as climate monitoring and disaster prediction, as well as in cases involving the solution of forward and inverse problems. Topics include, but are not limited to, the following (a minimal sketch of a physics-regularized loss is given at the end of this description):
- Physics-Informed Neural Networks
- Hybrid Data and Physics-Based Modeling
- Physics-Regularized Deep Learning
- Data Augmentation with Physics-Based Simulations
- Physics-Guided Inversion Techniques
- Physics-Driven Surrogate Modeling
- Integration of Data Models and Assimilation Methods
- Explainable AI (XAI) with Physical Constraints
- Physics-Guided Uncertainty Quantification

The proposed CCS is supported by the GRSS Modelling in Remote Sensing (MIRS) Technical Committee and its Working Group on Physical Model Informed Machine Learning (PIML).
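As a minimal sketch of the physics-regularization idea (assuming PyTorch and a toy linear "physics" constraint rather than any real electromagnetic model), the loss below combines a data-fit term with a penalty on violations of the assumed physical relation.

# Illustrative only: physics-regularized training loss on synthetic data. The
# "physics" relation y ~ a*x1 + b*x2 and its coefficients are placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.rand(1024, 2)
a, b = 0.7, 0.3                                   # hypothetical physics coefficients
y = a * x[:, :1] + b * x[:, 1:] + 0.05 * torch.randn(1024, 1)

model = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
lam = 0.5                                         # weight of the physics penalty

for step in range(500):
    pred = model(x)
    data_loss = torch.mean((pred - y) ** 2)                        # fit to (possibly sparse) data
    physics_residual = pred - (a * x[:, :1] + b * x[:, 1:])        # violation of assumed constraint
    physics_loss = torch.mean(physics_residual ** 2)
    loss = data_loss + lam * physics_loss
    opt.zero_grad()
    loss.backward()
    opt.step()

print(f"data loss: {data_loss.item():.4f}, physics loss: {physics_loss.item():.4f}")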
Show/Hide Description
CCS.90: Planetary Mission Satellite Designs and Space Safety: Laws, Challenges and Future Scope
D/E.2: Education and Policy — Remote Sensing Data and Policy Decisions
Planetary Mission Satellite Designs and Space Safety: Laws, Challenges, and Future Scope focuses on the intricate interplay between satellite engineering for planetary missions and the regulations ensuring space safety. As space exploration expands beyond Earth, missions to planets like Mars, Venus, and outer celestial bodies require highly specialized satellite designs tailored to extreme environments. The session will explore recent advancements in satellite technologies, mission architectures, and innovations necessary for successful planetary exploration. In parallel, it will examine the growing importance of space safety protocols, legal frameworks, and the challenges posed by space debris, orbital congestion, and mission safety.

Scope:

This session will cover a broad spectrum of topics including:
- Satellite Design for Harsh Planetary Environments: The unique technical challenges associated with designing satellites capable of withstanding extreme temperatures, radiation, and atmospheric conditions on planets like Venus and Mars.
- Space Safety and Orbital Debris: Addressing the current and future issues of space debris and congestion in Earth’s orbit, as well as safe navigation of spacecraft in interplanetary space.
- Legal and Regulatory Frameworks: Exploration of international laws and treaties such as the Outer Space Treaty and how they govern planetary missions and ensure compliance with safety protocols.
- Future Trends and Technological Innovations: Anticipating the next generation of planetary mission designs, autonomous systems, and sustainable approaches to mitigate risks in increasingly crowded orbits.

Importance to Geoscience and Remote Sensing:

Planetary missions contribute directly to advancing geoscience by enabling detailed remote sensing of planetary surfaces, atmospheres, and subsurface structures. These missions rely on sophisticated satellite technologies to deliver crucial data about geological features, climate patterns, and surface compositions of planets. Such data enhance our understanding of Earth’s geology through comparative planetology and deepen insights into planetary evolution, climate change, and potential extraterrestrial life. Moreover, the challenges faced in satellite design and space safety, including mission failures due to harsh conditions or debris collisions, highlight the need for robust legal and regulatory frameworks.

As space exploration accelerates, this session will be essential for stakeholders in geoscience and remote sensing. It underscores the need for a harmonized approach, integrating cutting-edge technologies with stringent safety measures and international laws, ensuring the longevity and success of future planetary missions. By addressing these core issues, the session fosters collaboration between scientists, engineers, policymakers, and legal experts, driving forward safer, more efficient exploration of the solar system.
Show/Hide Description
CCS.91: Preparing for the Copernicus Imaging Microwave Radiometer (CIMR)
S/M.2: Mission, Sensors and Calibration — Spaceborne Passive Microwave Missions
The Copernicus Imaging Microwave Radiometer (CIMR) expansion mission is one of the six Copernicus Expansion Missions currently being implemented by the European Space Agency on behalf of the European Commission. CIMR will provide high-spatial-resolution microwave imaging radiometry measurements and derived products with global coverage and sub-daily revisit in the polar regions and adjacent seas to address Copernicus user needs. The primary instrument is a conically scanning, low-frequency, high-spatial-resolution multi-channel microwave radiometer. Two satellites are being implemented (to be launched sequentially), each with a design lifetime of 7.5 years and sufficient fuel to last for up to 12 years (thus providing up to ~20 years of continuous data), with a first launch anticipated in 2028/29. A dawn-dusk orbit has been selected to fly in coordination with MetOp-SG-B1, allowing collocated data from both missions to be obtained in the polar regions within +/-10 minutes. A conical scanning approach utilizing a large 8 m diameter deployable mesh reflector with an incidence angle of 55 degrees results in a large swath width of ~2000 km. This approach ensures 95% global coverage each day with a single satellite and no hole at the pole in terms of coverage. Channels centred at L-, C-, X-, Ku- and Ka-band are dual polarised, with effective spatial resolutions of < 60 km (L-band), ≤ 15 km (C- and X-band), and < 5 km (Ku- and Ka-band, both with a goal of 4 km). Measurements are obtained using both a forward scan and a backward scan arc. On-board processing is implemented to provide robustness against radio frequency interference and enables the computation of modified 3rd and 4th Stokes parameters for all channels. This solution allows many Level-2 geophysical products to be derived over all Earth surfaces, including sea ice (e.g. concentration, thickness, drift, ice type, ice surface temperature), sea surface temperature, sea surface salinity, wind vector over the ocean surface, snow parameters, soil moisture, land surface temperature, vegetation indices, and atmospheric water parameters, amongst others. In preparation for the CIMR mission, this session will focus on encouraging international and cross-disciplinary collaborative activities in the cryosphere, ocean, land and atmosphere domains.
Show/Hide Description
CCS.92: Probabilistic Machine Learning for Earth Observation
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Integrating machine learning (ML) models in Earth Observation (EO) applications has been crucial in progressing the fields of geoscience and remote sensing. However, many traditional ML approaches remain deterministic in their prediction, failing to consider the uncertainty, variability, and inherent noise within geospatial data. To address this limitation, probabilistic and statistical methods can be combined with ML models to enhance their ability to model complex heterogeneous data structures, quantify uncertainty, and increase interpretability.  

The proposed session will cover approaches using mathematical knowledge or mathematical formulation, such as embedding domain-specific knowledge (e.g., physical laws or statistical assumptions) directly into the objective function of ML models, or leveraging sampling techniques (e.g., Monte Carlo methods) to provide robust uncertainty estimations. These methodologies are particularly beneficial in EO applications related to disaster management, like flood mapping and wildfire detection, where understanding uncertainty in predictions can lead to more reliable decision-making. For instance, an ML model predicting multiple values for the same sample or predicting the target together with an uncertainty quantification enriches the outcome and illuminates the process by which the model made such a prediction. 
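As one minimal, hedged example of the sampling-based uncertainty estimation mentioned above (Monte Carlo dropout on a toy regression problem, assuming PyTorch and not tied to any specific EO dataset):

# Illustrative only: Monte Carlo dropout as a simple sampling-based uncertainty
# estimate. Data, architecture, and dropout rate are placeholders.
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.linspace(-3, 3, 400).unsqueeze(1)
y = torch.sin(x) + 0.1 * torch.randn_like(x)

model = nn.Sequential(nn.Linear(1, 64), nn.ReLU(), nn.Dropout(p=0.2), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(1000):
    opt.zero_grad()
    loss = torch.mean((model(x) - y) ** 2)
    loss.backward()
    opt.step()

# Keep dropout active at inference and sample repeatedly; the spread of the
# samples gives a (crude) per-location predictive uncertainty.
model.train()                       # leaves dropout "on"
with torch.no_grad():
    samples = torch.stack([model(x) for _ in range(100)])   # shape (100, 400, 1)
mean = samples.mean(dim=0)
std = samples.std(dim=0)            # larger std => less confident prediction
print("max predictive std:", std.max().item())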

This session will be open to ML approaches using both supervised and unsupervised learning, leaving authors free to focus on either predictive tasks or representation learning. Some key models and techniques in this domain include (but are not limited to):
* Graphical models  
* Latent variable models 
* Variational methods 
* Expectation-Maximization (EM) 
* Gibbs sampling 
* (Gaussian) Mixture models 
* Gaussian Processes (GPs) 
* Variational Autoencoders (VAEs) 
* Physics-Informed Neural Networks (PINNs) 
* Bayesian Neural Networks (BNNs) 
* Information Bottleneck (IB).

These approaches not only focus on enhancing predictive accuracy but also improve model interpretability and uncertainty quantification, making them invaluable for addressing the challenges posed by current EO applications and data sources. 
Show/Hide Description
CCS.93: Quantum Sensing: Revolutionizing Earth Remote Sensing
S/I.15: Sensors, Instruments and Calibration — Advanced Future Instrument Concepts
Quantum sensing, a field propelled by the fundamental principles of quantum mechanics, has emerged as a revolutionary paradigm, greatly enhancing measurement capabilities across various scientific domains. From fundamental physics to the development of cutting-edge remote sensing technologies, the impact of quantum sensors is profound and ever-expanding. In the vast landscape of quantum technology, a pivotal breakthrough was the ability to cool atoms down to a Bose-Einstein Condensate (BEC) using laser cooling and magnetic trap technology. Cold atoms have enabled the use of atoms as sensors for a diverse range of parameters, including gravity (gravity gradiometers), force (accelerometers), and magnetic and electric fields (Rydberg receivers), among others. The recent demonstration by the Cold Atom Laboratory (CAL) experiment of the ability to create a BEC in a microgravity environment on board the International Space Station (ISS) has opened new avenues for sensor development and exploration, especially in the realm of spaceborne remote sensing. BEC atoms in microgravity are more stable and allow for cooler temperatures and longer interrogation times, enabling experiments and measurements that are not feasible on the Earth's surface.

Critical challenges in this domain encompass technology maturation for achieving enhanced sensitivities and the reduction of Size, Weight, and Power consumption (SWaP) of these instruments. These advancements are essential for enabling future spaceborne remote sensing missions, whether they are hosted on the ISS or deployed as free-flying missions. Quantum technologies offer promising solutions, such as leveraging Rydberg atoms for receivers like radiometers, radars, or reflectometers, as well as implementing Quantum Gravity Gradiometers (QGGs) and quantum radars employing entangled photons, among other emerging quantum technologies applied to remote sensing.

This session aims to provide an in-depth exploration of theoretical studies, ongoing research, and practical implementations of quantum remote sensing technologies, offering a multifaceted perspective that advances both the theoretical and technological facets of this technique. As researchers delve into the theoretical underpinnings and experiment with practical applications, the potential for revolutionizing remote sensing capabilities becomes increasingly promising, pushing the state of the art forward.

The integration of quantum principles into remote sensing holds vast potential for scientific discovery and innovative applications. Quantum sensing enables a level of precision and sensitivity that was previously unattainable. This includes improving the temporal resolution of Earth gravity field measurements, including those used to infer underwater mass changes, with a precision that current missions such as GRACE and GRACE-FO cannot achieve, and enhancing geophysical remote sensing.

Looking forward, the field will push the boundaries of quantum remote sensing technology to further enhance its capabilities. This includes advancements in miniaturization, power efficiency, and the integration of quantum sensors with data processing algorithms and artificial intelligence.
Show/Hide Description
CCS.94: Quantum Technology for Remote Sensing
SP.9: Special Scientific Themes — Wave propagation and remote sensing
This session proposal is from the Chair of the Working Group of Active Optical and Lidar, IEEE-GRSS Technical Committee on Instrumentation and Future Technology (IFT). It is in line with the GRSS outreach to Space Agencies and Industry and the IGARSS 2024 special theme on “SP.9: Wave propagation and remote sensing.”

This session focuses on research and developments in an important topic, Quantum Technology for Remote Sensing. The unique capability to observe a diverse variety of geophysical phenomena from orbit around the Earth and other planets has stimulated new areas of remote sensing research that now attract the attention of scientists and engineers worldwide.

Quantum Sensing is an emerging and highly promising technology that leverages the quantum properties of matter and light - such as quantum superposition, measurement, and entanglement - to achieve unprecedented measurement sensitivity and performance. Quantum-enhanced methodologies may outperform their classical counterparts, making quantum sensors relevant across a wide range of applications, including precision navigation and timing, electromagnetic field sensing, attitude control, communications, and gravimetry. Quantum sensors exploit techniques involving quantized energy levels in atomic systems, entanglement between quantum systems, superposition of quantum states, quantum illumination methods, and the manipulation of photons and atoms. Advancements in our ability to generate, control, and manipulate quantum systems are driving the development of quantum sensing technologies, which promise unrivaled sensitivity, resolution, and precision. Significant gains include technologies important for a range of space-based remote sensing, in situ measurements, metrology, interferometry, quantum communication, ranging, imaging, radar and lidar receivers, and gravity measurements.

This community contributed session will focus on quantum sensing techniques, technologies and geoscience and remote sensing applications related to Earth Science, Astrophysics, and Planetary Science.


Show/Hide Description
CCS.95: Radio Frequency Interference and Spectrum Management Issues in Microwave Remote Sensing
T/D.16: Data Analysis — RFI Detection and Mitigation
Radio Frequency Interference (RFI) has had an increasingly detrimental impact on microwave remote sensing. Interference can corrupt measurements and reduce the ability to retrieve relevant geophysical parameters over many regions of the globe. In several cases, even a primary spectrum allocation to remote sensing does not guarantee that the frequency range of operation of the sensors is free of RFI, since illegal in-band transmissions or out-of-band emissions can still be present. Many innovative technical advances, both in software development and hardware design, have been made by the Earth observing microwave remote sensing community as well as by radio astronomers to improve the detection of interference and the mitigation of its negative effects.

The problem of increasing occurrence of interference in measurements is closely related to the ever growing spectrum needs of commercial interests, particularly those of the telecommunication industry. The development of 5G systems and constellations of satellites in low Earth orbit offering Internet services are some examples of how the demand for spectrum is placing enormous pressure on frequency bands utilized for microwave remote sensing. For this reason, the remote sensing community has also been working closely with spectrum managers to protect the frequencies of interest for science applications.

The session will present advances in interference detection and mitigation techniques developed within the passive and active remote sensing communities, report on observed cases of interference, discuss the status of current and upcoming missions with regard to dealing with RFI, and address spectrum management issues facing remote sensing frequencies.
Show/Hide Description
CCS.96: Rapid Building Damage Detection After Natural Hazards Using Satellite Optical Imagery and UAV Data
SP.1: Special Scientific Themes — Natural disasters and disaster management
Natural disasters often cause unpredictable, massive, and devastating impacts, resulting in the loss of human life and property worldwide every year. According to the Food and Agriculture Organization of the United Nations, natural disasters caused a total of 3.8 trillion US dollars in economic losses between 1991 and 2021, affected the lives of two billion people, and caused two million deaths. The most frequent disasters include earthquakes, tsunamis, volcanic eruptions, landslides, avalanches, floods, debris flows, extreme temperatures, droughts, wildfires, as well as tropical storms, hurricanes, sandstorms, and heavy rainfall.
Given the increasing frequency and intensity of natural disasters, there is an urgent need to develop more effective tools for monitoring, early detection, and risk mitigation. After a natural hazard occurs, the early detection of building damage is critical for guiding emergency response efforts, prioritizing resource allocation, and enhancing overall recovery efforts.
Recent advances in geospatial remote sensing technologies, such as Unmanned Aerial Vehicles (UAVs) equipped with sensors like LiDAR and high-resolution optical imagery systems, combined with the rapid development of advanced machine learning algorithms, have revolutionized the field of disaster monitoring and damage detection.
For this purpose, the session titled "Rapid Building Damage Detection After Natural Hazards Using Satellite Optical Imagery and UAV Data" will showcase cutting-edge research on innovative technologies for the early detection of building damage. This session will cover advances in satellite optical imagery and UAV data, as well as fusion with other sources such as radar data, highlighting how the combination of multisource data can improve accuracy and efficiency in damage detection. It will also explore the role of machine learning in automating damage assessments and present real-world case studies of successful implementations.
Through interdisciplinary dialogue involving researchers, practitioners, and policymakers, this session aims to advance the understanding of how geospatial technologies and machine learning approaches can reduce the impacts of natural disasters. By harnessing the power of remote sensing data and recent advances in machine learning technologies, we can build more resilient systems that protect vulnerable communities and ecosystems from future geohazards. These technologies enable faster, more precise monitoring and assessment, allowing for improved disaster preparedness and response.
Show/Hide Description
CCS.97: Remote detection of invasive weed species
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Invasive weed species affect urban, natural and agricultural environments, with wide-ranging impacts including loss of production, infrastructure damage, and biodiversity loss. Disturbance and climate change are key drivers of invasive weed species establishment and spread. There are major economic and social impacts associated with invasive weeds, with the cost-benefit ratio of management significantly higher in the early stages of invasion, where eradication and containment are the primary goals of management programs. To facilitate early detection, effective means of surveillance and monitoring across a range of spatial scales are paramount.

Weed detection and surveillance efforts are typically undertaken using a variety of approaches, including on-ground surveys and community reporting. While effective, these approaches do not provide systematic detection and surveillance, particularly across landscapes and regions. Site accessibility is also a confounding issue, as weed species may often establish in areas with restricted access, such as waterways with steep banks, or areas subject to disturbance through events such as bushfire and flood. The increasing integration of remote sensing technologies in weed management and biosecurity programs presents an opportunity to improve detection rates, the timeliness of data availability, the ability to survey potentially inaccessible areas, and the cost effectiveness of surveillance and eradication efforts.

This session is an opportunity to discuss leading-edge geospatial research in an invasive weed context, in both aquatic and terrestrial environments. We invite papers covering a range of topics, including but not limited to:
•	Remote sensing of invasive weed species case studies in natural, urban, and production agriculture environments, both aquatic and terrestrial
•	Applications of remote sensing technologies to detection and surveillance of invasive weed species including different platforms (e.g. satellite, airborne, drone or ground-based) and sensors (e.g. hyperspectral, multispectral or LiDAR)
•	Approaches to remote sensing and spatial data analytics in the context of weed species such as the use of machine learning, deep learning and/or artificial intelligence, or object-based/pixel-based analytical approaches
•	Calibration and validation requirements for remotely sensed data in this context
•	Integration of multiple sources of spatial data to support invasive species management, and
•	The use of spatial data to evaluate management activities that eradicate or control weed species.

Papers and discussions included in this session will further support the growth and collaboration within this expanding research and application area.
Show/Hide Description
CCS.98: Remote Sensing Data - Data Quality Assessment
S/I.13: Sensors, Instruments and Calibration — Passive Optical Multi- and Hyperspectral Sensors and Calibration
The growing number of remotely sensed data sources offers users more choices. Understanding the characteristics and capabilities of current and new data sources, along with the quality of the data they provide, is an important topic for the remote sensing community. This session covers data quality assessments from global civil and commercial remote sensing missions, along with examples of how data quality affects science applications and analytics. Particular attributes of image data quality of interest are radiometric, geometric, and spatial performance.
 
Radiometric quality assessment measures the accuracy of the radiance or reflectance output by the system under test. The accuracy is measured in two ways: 1) comparison of the test imagery to the radiance/reflectance of another image from a known high-accuracy system with similar spectral properties, and 2) assessment of the test imagery against known radiance/reflectance ground targets, such as instrumented ground sites or Pseudo-Invariant Calibration Sites (PICS).
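As a simple illustration of such a comparison, the sketch below (synthetic data standing in for a co-located test/reference pair over a PICS or instrumented-site region of interest; all names are hypothetical) computes the bias, RMSE, and a gain/offset fit between test and reference reflectance.

# Minimal sketch: radiometric comparison of test imagery against a reference over an ROI.
import numpy as np

def radiometric_stats(test_refl, ref_refl):
    """Bias, RMSE, and linear gain/offset of test vs. reference reflectance."""
    t = np.asarray(test_refl, dtype=float).ravel()
    r = np.asarray(ref_refl, dtype=float).ravel()
    valid = np.isfinite(t) & np.isfinite(r)
    t, r = t[valid], r[valid]
    bias = float(np.mean(t - r))
    rmse = float(np.sqrt(np.mean((t - r) ** 2)))
    gain, offset = np.polyfit(r, t, deg=1)  # test ~= gain * reference + offset
    return {"bias": bias, "rmse": rmse, "gain": float(gain), "offset": float(offset)}

rng = np.random.default_rng(0)
reference = rng.uniform(0.2, 0.4, size=(100, 100))  # synthetic reference reflectance
test = 1.02 * reference + 0.005 + rng.normal(0.0, 0.002, reference.shape)
print(radiometric_stats(test, reference))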
 
Geometric quality assessment independently assesses the geometric characteristics of the test imagery.  For a multispectral or hyperspectral remote sensing system, the assessment involves internal and external geometry.  The internal geometric assessment measures relative band-to-band and image-to-image registration.  The external geometric assessment evaluates absolute positional accuracy to a reference.
 
Spatial assessment provides a quantification of blur or lack of sharpness through the estimation of the Point Spread Function (PSF) or Modulation Transfer Function (MTF) in relation to pixel spacing or Ground Sample Distance (GSD). The PSF/MTF, GSD, spacecraft motion, and other factors all contribute to the spatial resolution of a sensor.
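For illustration only, the short sketch below reports the MTF at the Nyquist frequency under the simplifying assumption of a Gaussian PSF with a hypothetical width in pixels; operational assessments typically estimate the PSF/MTF from edge or point targets rather than assuming a model.

# Minimal sketch: MTF at Nyquist for an assumed Gaussian PSF, related to the GSD.
import numpy as np

def gaussian_mtf(frequency_cpp, sigma_px):
    """MTF of a Gaussian PSF: exp(-2 * (pi * sigma * f)^2), with f in cycles/pixel."""
    return np.exp(-2.0 * (np.pi * sigma_px * frequency_cpp) ** 2)

sigma_px = 0.6   # hypothetical PSF width in pixels
gsd_m = 10.0     # hypothetical ground sample distance in metres
nyquist = 0.5    # cycles per pixel
print("MTF at Nyquist:", round(float(gaussian_mtf(nyquist, sigma_px)), 3))
print("Nyquist frequency on the ground:", nyquist / gsd_m, "cycles per metre")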
 
Data quality topical areas could be related to new systems and technologies, tools, algorithms, processing, automation, and artificial intelligence.
Show/Hide Description
CCS.99: Remote sensing for coastal sustainability
D/S.7: Societal Engagement and Impacts — Remote Sensing for Sustainable Development
Human activities and climate change have significantly changed the global environment, ecosystems, economy, and society from various perspectives over the past decades, placing a double squeeze and threat on the coastal zone and its sustainable development. According to the United Nations, around 40% of the world’s population lives within 100 km of the coast. Coastal sustainability has become a crucial component of global sustainable development. This session topic is closely related to multiple IGARSS themes, including O.4 Coastal Zones, D/S.7 Remote Sensing for Sustainable Development, and several other technical themes involving new sensors and new methods. However, there is still a lack of focus on coastal sustainability supported by emerging remote sensing technologies. This topic was successfully organized with two sessions (9 accepted oral presentations) at IGARSS 2023 in Pasadena, USA, as well as one oral session and one poster session with a total of 9 presentations at IGARSS 2024 in Athens, Greece. Over 20 participants attended the sessions, with fruitful discussions.

Recent decades have witnessed coastal reclamation and exploitation, coastal ecosystem and environmental evolution, urban population surges, and urban infrastructure expansion, which bring along various social and environmental impacts, e.g., biodiversity loss, ecosystem fragmentation, and climate-change-induced vulnerability for human beings. Sustainable coastal development requires timely and efficient monitoring of the urban, ecological, and environmental processes in coastal regions, together with their related issues, including urban sprawl, transportation systems, green space and wetlands, biodiversity, air/water pollution, reclamation and aquaculture, natural disasters, etc. Advanced multisource remote sensing techniques, including airborne and spaceborne optical, SAR, and LiDAR at different resolutions together with in-situ data, can provide fine to coarse multi-angle, multi-scale, and multi-frequency observations for coastal monitoring, supporting coastal resilience and sustainable development. This session invites original research that presents the advances, methodology, and challenges of monitoring different coastal processes and their related issues in coastal regions using multisource remotely sensed data.
Show/Hide Description
CCS.100: Remote Sensing for Oil & Gas: Exploration, Production, and Environmental Monitoring
T/D.15: Data Analysis — Hyperspectral Data Processing and Analysis
Remote sensing covers a wide range of applications in oil and gas exploration and production and in the environmental monitoring of production, storage, and transportation. Satellite constellations generate a large volume of data and cover the entire Earth's surface: terabytes of new multispectral and hyperspectral data are freely available every single day. In addition, the spatial and spectral resolutions of sensors keep improving, reaching cm-scale resolution for optical images and submetric resolution for radar images. This opens new opportunities for surface characterization, at sea or on land. In onshore areas, remote sensing tools have been used for structural and mineralogical mapping, surface deformation quantification, detection of vegetation modification, and detection of soil alteration associated with hydrocarbon seeps. In offshore areas, remote sensing tools are used to detect oil films and to quantify/characterize the oil type. The identification of oil slicks has strong implications for hydrocarbon exploration, by identifying oil seeps, as well as for emergency response during man-made pollution. Other systems, such as lidar, may be used to measure the Digital Terrain Model (DTM, i.e., ground topography) and to detect gases, also with exploration and environmental applications.
The objectives of this session are to gather the academic and industrial communities working with remote sensing techniques in mineral and petroleum research and to provide an overview of a broad range of applications. The session will cover the following applications: 
Onshore:
•	Hyperspectral and multispectral applications for geologic and mineral mapping,
•	Surface liquid and gaseous hydrocarbon detection and hydrocarbon-related soil alterations,
•	Mineral and O&G operation field development monitoring,
•	Gas storage facilities (CO2, CH4, H2) and plant monitoring,
•	Application of interferometry to quantify surface deformation,
•	Recent advances in Big data analysis: Deep learning technology

Offshore:
•	Surface hydrocarbon detection,
•	Natural oil seep detection for hydrocarbon exploration,
•	Anthropogenic spill monitoring to guide emergency response,
•	Applications of remote sensing for subsurface mapping and monitoring,
•	Recent advances in Big Data analysis: deep learning recognition of oil slick patterns.
Show/Hide Description
CCS.101: Remote Sensing for Urban Resilience: Creating Digital Twins
L.6: Land Applications — Urban and Built Environment
Urban resilience refers to the capacity of cities to withstand and recover from various factors, including natural disasters, climate change, and rapid urbanization. Remote sensing plays a vital role in enhancing urban resilience by providing valuable insights into the structure and dynamics of urban environments. Through advanced technologies such as satellite imagery, LiDAR, and UAVs, remote sensing provides tools for detailed mapping and assessment of built structures, offering key data that can be employed for urban planning, disaster preparedness, and infrastructure management. This session, "Remote Sensing for Urban Resilience: Creating Digital Twins," will explore how remote sensing can be leveraged to enhance urban resilience by providing critical insights into the structural integrity and risk exposure of buildings before, during, and after disasters.
The session will cover key topics, including the use of LiDAR, radar, and multispectral satellite imagery to monitor building damage and assess structural changes over time. This session focuses on the application of remote sensing for building digital twins that support urban resilience. Topics include data collection techniques from various remote sensing platforms, data integration and fusion methods, and advanced analysis using machine learning and artificial intelligence (AI). Key case studies will showcase how digital twins are being used for monitoring infrastructure, modeling disaster impacts such as floods and earthquakes, and planning for urban growth while maintaining resilience. The session will also highlight the importance of processing vast amounts of remote sensing data. 
The importance of this topic to geoscience and remote sensing lies in the rapidly growing urban population and the increasing frequency of natural disasters, worsened by climate change. Accurate and timely assessment of built environments through remote sensing provides essential data for decision-makers, urban planners, and disaster response teams. By improving the resilience of urban infrastructure, remote sensing not only aids in the recovery from disasters but also helps design cities that are more resilient to future shocks. This session will bring together experts from academia, industry, and other sectors to discuss the latest advancements, challenges, and applications of remote sensing in urban resilience, ultimately contributing to safer and more sustainable cities.
Digital twins represent a transformative approach, combining real-time data with predictive analytics, offering a dynamic way to simulate various disaster scenarios. This makes them a crucial tool in urban planning and risk mitigation. In the realm of remote sensing, the ability to create accurate and continuously updated digital twins is a significant advancement for the field, bridging the gap between large-scale earth observation and localized urban management.
Beyond disaster response, remote sensing contributes to long-term urban resilience by supporting smart city planning and sustainable development. This session will offer cutting-edge insights into how digital twins can foster resilience in urban environments and will be of interest to geoscientists, urban planners, policymakers, and engineers working at the intersection of technology and urban sustainability. These insights are critical for building cities that are not only resilient to immediate threats but also sustainable in the long term.
Show/Hide Description
CCS.102: Remote sensing for wetland sustainability
L.9: Land Applications — Wetlands
Wetlands, often referred to as the "kidneys of the Earth," are crucial for maintaining global ecological balance. Not only do they sustain rich biodiversity, but they also play vital roles in water conservation, water purification, flood control, drought mitigation, carbon sequestration, and climate regulation. With the rapid advancement of remote sensing technology, we now possess unprecedented capabilities to monitor and manage these fragile ecosystems. Currently, remote sensing provides robust support for the dynamic monitoring and management of wetlands through high-resolution imaging, drone surveillance, and advanced data processing algorithms. The theme of this session is closely related to IGARSS L.9: Wetlands, SP.3: Remote sensing for sustainable development in the Asia-Pacific region, and several other technical themes involving new sensors and new methods. However, there is still insufficient focus on utilizing emerging remote sensing technologies to support the sustainability of wetlands.
In recent years, wetland ecosystems have been facing unprecedented challenges. Climate change, intensified human activities, and a lack of public awareness have triggered a sharp decline in global wetland areas and biodiversity. Since 1970, inland and coastal wetlands have decreased by approximately 35%, while wetland species populations have dropped by an alarming 84%. If this trend is not promptly reversed, it will pose a serious threat to the ecological security of wetlands. By implementing multi-resolution, multi-temporal, and multi-modal remote sensing monitoring strategies, we can more accurately assess changes in wetlands and provide scientific insights for their conservation and restoration. This session will focus on the application of these innovative methods, exploring the latest advancements, challenges, and future directions in wetland monitoring and protection. We look forward to participants sharing their research findings, working together to advance the field of wetland remote sensing, and contributing to the sustainable development of wetlands.
Show/Hide Description
CCS.103: Remote Sensing Monitoring of Ocean Mesoscale and Submesoscale Processes
O.1: Oceans — Ocean Biology (Color) and Water Quality
The ocean covers more than 70% of the global surface and is a key regulator of the global climate and weather. It provides humans with abundant food and mineral resources and plays a crucial commercial role in the global economy. As a vast and perpetually moving body of water, studying the coherent energy transfer and material transport processes within the ocean is fundamental to understanding the ocean and its global roles. Oceanic processes form a complex network of energy and material transport, affecting not only the local marine environment but also significantly impacting global climate change, ecosystem health, and the sustainable management of marine resources. They are an indispensable field of study within Earth science. Among these, mesoscale processes contain over 90% of the ocean's kinetic energy and are a key link in the ocean's energy cascade. Submesoscale processes, as crucial elements of energy transfer and material transport, finely regulate ocean dynamics and biogeochemical cycles, which are essential for understanding the micro dynamics of matter and energy in the ocean. In-depth research and monitoring of mesoscale and submesoscale processes in the ocean can not only promote the understanding of oceanic physical processes but also provide key data support for climate change studies.
Traditional methods of acquiring oceanic process data are costly and difficult to implement, with many areas around the globe still lacking any in-situ measurements. Remote sensing technology, particularly satellite remote sensing, has become an ideal tool for monitoring mesoscale and submesoscale processes in the ocean due to its ability to provide high-resolution data over a large area and a long time series. The high spatial and temporal resolution of modern remote sensing technology enables us to overcome the limitations of traditional methods and capture these complex, constantly changing marine phenomena.
Furthermore, this proposal will also promote the further development of remote sensing technology in complex marine scenarios. Since mesoscale and submesoscale processes in the ocean contain rich information and interact with each other in various forms, the required data processing is both complex and demanding, revealing the standards and challenges that remote sensing technology faces in niche application areas. This will encourage breakthroughs in data mining and processing capabilities, thereby driving future technological development and evolution.
Thus, we propose the initiative "Remote Sensing Monitoring of Ocean Mesoscale and Submesoscale Processes" and outline three sub-topics: marine remote sensing image processing, marine remote sensing data analysis, and marine remote sensing observation methods. We hope that through the solicitation of papers for this initiative, we can obtain more relevant methodologies that inspire and offer new possibilities for the monitoring of mesoscale and submesoscale oceanic processes, as well as expand the potential and direction for remote sensing image processing.
Show/Hide Description
CCS.104: Remote Sensing of the Cryosphere from Spaceborne Altimeters
D/S.9: Societal Engagement and Impacts — Remote Sensing for Climate Change Impacts
The cryosphere is a key component of the Earth’s climate system and a sensitive indicator of climate change. Satellite altimetry, which provides information on surface topography, has transformed our understanding of the fast-changing cryosphere, especially in the remote polar regions. Since the launches of the first ERS-1 radar altimeter in 1991 and the ICESat laser altimeter in 2003, these observations have revealed the dramatic decline in sea ice thickness in the Arctic and the loss of land ice from glaciers, Greenland, and Antarctica. The CryoSat-2 and ICESat-2 missions, which are currently in orbit, enable the continuity of this now multi-decadal altimetry record and document continual changes in sea ice, terrestrial snow, glaciers, and ice sheets. These essential datasets are complemented by observations from lower-inclination missions such as AltiKa and Sentinel-3A/B. This session aims to highlight recent progress made in the remote sensing of the cryosphere using altimeter observations, including descriptions of the sea ice system (e.g., freeboard, ice thickness, floe-size distribution, melt ponds, snow on sea ice) and land ice properties (e.g., land ice elevation, mass balance change, surface roughness). Some of the key topics of this session include algorithm improvements, novel retracking techniques, and validation with in-situ data. We particularly encourage submissions on:
•	Synergies between different altimeter-sensors to address current data gaps: lidar, Synthetic Aperture Radar (SAR) and interferometric SAR.
•	Data fusion with other instruments such as radiometers and SAR imagers.
•	Investigations using observations from the Cryo2Ice campaign.
•	Potential cryosphere applications using observations from the recently launched SWOT mission.
•	Cal/val and in situ/laboratory investigations to support the interpretation or the design of satellite missions.
•	Activities that are relevant to the upcoming ESA/NASA mission CRISTAL.


Show/Hide Description
CCS.105: Remote sensing of the potential fields
S/M.7: Mission, Sensors and Calibration — New Space Missions
Remote sensing of the Earth's gravitational and magnetic fields is as old as the space age itself. Such measurements are being utilized to understand the character of the Earth, including processes within its hydrosphere, cryosphere, magnetosphere, and geosphere. The US/German GRACE and GRACE Follow-On missions, along with the upcoming GRACE Continuity mission, have provided unprecedented insights through the use of mass change inferred from remote sensing of the gravitational potential. A number of magnetic field measurement missions, such as the German CHAMP mission and the ESA Swarm mission, continue to provide deep insights into the Earth's behavior across all spatio-temporal scales.

Remote sensing of the potential field is unique in that the local manifestation of the planet's potential field is sensed at the location of the satellites and instruments. This is in contrast to remote sensing of other physical properties of the planet, where the variable of interest at the source (e.g., elevation, reflectivity, or composition) is directly sensed through a distant imager or instrument. The methods for solving both the geomagnetic field and the gravitational field inverse problems are closely connected with the technology and quality of measurement. Because the satellite itself is the instrument in the case of sensing from orbit, the flight mechanics and technology are closely connected with the scientific inferences.

As a result, this session will seamlessly connect the fundamental technologies for remote sensing of the potential field; the architecture of the space missions for this purpose; the mathematical and computational methods for solution of inverse problems; and the scientific inference from past and future potential field measurements.

The success of past potential field missions has revealed future community needs (e.g. NASA Earth Science Decadal Survey 2017) for ever greater precision and resolution of global observations, possible only using remote sensing. At the same time, emerging technological innovations - such as quantum sensing - require a science-informed infusion of these innovations into future measurement systems. This session is intended to be a forum where both technology and geoscience communities can jointly develop a roadmap to the future of potential field remote sensing.
Show/Hide Description
CCS.106: Remote sensing to quantify and monitor Earth dynamics in support of the sustainable development goals and climate related impacts: Methods and Applications
D/S.7: Societal Engagement and Impacts — Remote Sensing for Sustainable Development
The 2030 Agenda for Sustainable Development clearly stresses the importance of Geospatial Information and Earth Observations (EO) to monitor progress and achieve the SDG targets. Effective monitoring of the SDG indicators and reporting of progress towards the SDG targets require the use of multiple types of data that go well beyond the traditional socio-economic data that countries have been exploiting to assess their development policies. Hence, it is considered of crucial importance to integrate data coming from technologies new to this domain, such as EO, in order to produce high-quality and timely information, with more detail, at higher frequencies, and with the ability to disaggregate development indicators. EO, together with modern data processing and analytics, offer unprecedented opportunities to make a significant change in the capacities of countries to efficiently track all facets of sustainable development.

Amongst all the SDG targets, those related to a sustainable use of natural resources are of particular importance since pressures on our planet’s environment and finite resources are expected to increase further in the future to support continued economic growth or increased food production and consumption patterns. Recent advances in EO research, both on methodological development and technological solutions, offer promising prospects for helping countries set up informed and evidence-based development policies for an optimum management of terrestrial, coastal and marine resources.

The increasing spatial, temporal and spectral resolutions of EO data offer an invaluable opportunity for better informing development policies and quantifying various SDG indicators. However, those EO advances pose several challenges related to the acquisition, processing, integration, analysis and understanding of the data which need to be tackled by the scientific community in order to ensure operational applicability.

This invited session aims to present and showcase the latest advances in EO solutions for achieving the SDG targets and quantifying climate-related impacts, monitoring progress, and reporting on the SDG global indicator framework. The focus should be on remote sensing methods and applications for deriving information products that potentially have a local, regional, or global impact. The methods could be based on optical, lidar, or SAR systems, including observations in different imaging modes (interferometry, polarimetry, tomography, multi-frequency, optical indices, EM models, lidar waveforms), together with methods to quantify the derived products.

DEIA: Two of the proposers are women from two different continents. We will advertise the session widely and encourage submission by all members of the Earth Science and Technology community.

Show/Hide Description
CCS.107: SAR Image Restoration and Deep Learning: the added value of artificial intelligence in handling noise in different SAR image configurations.
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
In the last forty years, many algorithms and solutions have been developed to fully exploit the potential of SAR images and to extract information from SAR data. As a result, SAR processing has become one of the most important tools for Earth Observation. Thanks to their coherent nature and their ability to operate in any meteorological conditions, SAR data are intensively used for many different applications, such as 3D reconstruction, feature extraction, land cover mapping, target recognition, scatterer detection, data fusion and many others, across different domains (urban, natural, marine, etc.). At the same time, SAR data require a substantial restoration pre-processing effort to facilitate the above-mentioned applications: denoising, co-registration, phase unwrapping, etc.
In recent years, the SAR image processing domain, and SAR image restoration pre-processing in particular, has been shifting from traditional model-based solutions to deep learning (DL) based ones. Indeed, the need for fast and precise processing of the large volumes of data that new advanced sensors provide at short revisit times matches the versatility, efficiency, and wide applicability of DL methods. Their ability to automatically extract features from the data achieves state-of-the-art performance in many restoration tasks across different SAR domains, such as amplitude, interferometry and polarimetry, and fits the need for fast and precise processing of SAR images for continuous Earth Observation.
The results obtained so far are impressive, but many limitations remain to be better explored.
This session addresses the potential and, even more, the current limitations of DL-based methodologies in performing both classical and less straightforward signal processing tasks in the SAR image domain, with particular attention to SAR amplitude, InSAR and PolSAR configurations.
This session aims to stimulate an open discussion on the application of DL methods to different SAR restoration pre-processing tasks, seeking new horizons and shared perspectives.

Show/Hide Description
CCS.108: SAR Interferometry for Assessing Land Surface Deformation at the Local, Regional, and National Scales
T/S.1: SAR Imaging and Processing Techniques — Interferometry: Along and Across
Synthetic Aperture Radar (SAR) interferometry has become a powerful tool for monitoring land surface deformation, providing high-resolution insights into ground motion and surface changes across various scales. This session will explore the diverse applications of SAR Interferometry (InSAR) in assessing land deformation caused by both anthropogenic and natural factors. One critical aspect covered will be monitoring land deformation due to groundwater extraction, which poses a severe risk of damage to urban infrastructure. This session will also explore how underground tunneling for metro, railway, and hydropower projects affects land stability, which can be monitored using InSAR, and emphasize the importance of time-series InSAR-based analysis for tracking deformation around large-scale megaprojects like reservoirs, dams, and airports to ensure their long-term infrastructure safety. Furthermore, the session will examine the role of SAR interferometry (InSAR) in climate impact assessments, focusing on issues like coastal land subsidence, sea-level rise, and their effects on coastal cities, glacier dynamics, and permafrost ground deformation monitoring. The session will also analyze the application of SAR interferometry in actively monitoring land deformation triggered by natural disasters such as landslides, earthquakes, and tectonic movements, which are crucial for ensuring safety and preparedness in seismically active regions. Recent advances in Persistent and Distributed Scatterer detection techniques will also be discussed, along with innovative machine-learning approaches for interpreting InSAR data. These advanced processing techniques are essential for efficiently handling InSAR Big datasets. Lastly, we will explore the national and regional policy implications of land deformation monitoring, stressing the importance of informed decision-making in mitigating risks and improving resilience.

The topics to be covered in this special session include, but are not limited to: 
1.	Tracking Land Deformation in Urban Infrastructure and Built Structures with SAR Interferometry 
2.	Climate change impact assessment using InSAR 
3.	Coastal Subsidence monitoring and Sea Level Rise
4.	Time-Series Analysis of Land Deformation over Megaprojects such as dams and airports
5.	Groundwater-induced land subsidence monitoring over basin and sub-basin scale
6.	Landslides, Earthquake, and Tectonics-induced Land Deformation Monitoring
7.	Land Uplift monitoring due to Sub-surface Fluid Injection processes
8.	Machine Learning Approaches for InSAR Data Processing and Interpretation
9.	Advanced algorithms, Novel Techniques, and modeling frameworks for SAR Interferometry 
10.	Innovative Processing frameworks for Big Data SAR Interferometry
11.	National and Regional Policy Implications of Land Deformation Monitoring
Join us for an in-depth exploration of the latest advancements in SAR interferometry and its critical role in safeguarding urban environments and guiding policy on land management and disaster preparedness.
Show/Hide Description
CCS.109: SAR Monitoring of Hazards on Marine Coastal Environments
SP.1: Special Scientific Themes — Natural disasters and disaster management
Coastal marine environments, being invaluable ecosystems and host to many species, are under increasing pressure caused by anthropogenic impacts such as, among others, growing economic use, coastline changes and recreational activities. Continuous monitoring of those environments is of key importance for the identification of natural and anthropogenic hazards, for an understanding of oceanic and atmospheric coastal processes, and eventually for the sustainable development and use of those vulnerable areas. Here, Synthetic Aperture Radar (SAR), because of its high spatial resolution, its independence from daylight, and its all-weather capability, is a sensor of choice.

This Community Contributed Session (CCS) will focus on the ways in which SAR sensors can be used for the surveillance of changing marine coastal environments, and on how these sensors can detect and quantify processes and phenomena that are of high relevance for the local fauna and flora, for coastal residents and local authorities, and for a better quantification of hazards caused by global change and other anthropogenic impacts. Examples include:

o	Coastline changes and coastal morphodynamics
o	Coastal run-off and marine pollution
o	Wind fields and storm events
o	Surface waves and currents
o	Illegal, unreported and unregulated fishing
o	Operational use of SAR and other sensors for coastal zone applications.

Although the focus of this CCS is on the application of SAR, the CCS organizers encourage submissions of non-SAR presentations that complement SAR and lead to a more comprehensive solution to address multiple end-user needs for sustainable development of the marine coastal zone. 
Show/Hide Description
CCS.110: Satellite gravimetry/GNSS for water resources across Canada
SP.2: Special Scientific Themes — Geoscience and remote sensing in developing countries
The increasing vulnerability of water resources, both surface and groundwater, due to natural and human-induced factors poses a significant challenge in geoscience. As the demand for water rises amid climate change, land use change, and population growth, effective monitoring of these resources is critical. Traditional field-based methods of water monitoring, while valuable, are often limited by spatial constraints, high costs, and labor-intensive data collection processes. Remote sensing and Global Navigation Satellite System (GNSS) technologies, including the Gravity Recovery And Climate Experiment (GRACE), GRACE Follow-On (GRACE-FO), and the Global Positioning System (GPS), have emerged as powerful tools to supplement or replace traditional approaches.

These technologies offer global coverage with consistent spatial and temporal resolutions, enabling researchers and water managers to monitor large-scale water dynamics. GRACE and GRACE-FO, for example, measure changes in Earth's gravity field to infer variations in terrestrial water storage, providing insights into groundwater depletion, droughts, and surface water variations. GPS networks contribute by detecting land surface changes caused by subsurface water fluctuations. Despite the promise of these methods, challenges remain in fully integrating satellite and GNSS data into local and global water management practices.

Key issues include the need for higher-resolution data for small-scale water systems, integrating multi-source data for more accurate assessments, and overcoming uncertainties in remote sensing measurements. Additionally, translating satellite observations into actionable insights for policymakers and water managers remains a challenge. This session will delve into the latest advancements in remote sensing and GNSS applications for water resource monitoring, discussing their growing importance in addressing the increasing vulnerability of water resources. We will also explore current limitations and the need for further innovation to ensure that these technologies can meet the demands of a changing world.
Show/Hide Description
CCS.111: Satellite-Based Remote Sensing for Disaster Response
SP.1: Special Scientific Themes — Natural disasters and disaster management
Disaster response is a critical mission helping to save lives worldwide from emergencies as diverse as droughts, floods, cyclones, earthquakes, and fires. In the last few years, the world has seen a proliferation of satellite-based remote sensing systems that provide a unique tool for supporting the disaster response mission. These commercial and open-source systems provide near real-time, large-scale information that is crucial for early warning systems, damage assessment, and resource allocation. Although these systems are already used in practice, technical improvements are still needed to enhance situational awareness, support decision making, and save lives by enabling a more efficient and effective response to emergencies.

This session will bring together satellite-based remote sensing researchers interested in disaster response by focusing on three technologies:
 
1) Sensing: The session seeks papers that introduce new satellite-based remote sensing systems that are freely available to both researchers and mission teams focused on disaster response. These systems could include newly launched satellite systems as well as newly planned satellite systems.

2) Processing: The session seeks papers that improve the processing of remotely sensed data from optical (Electro-Optical / Infrared / Polarimetric / Multispectral / Hyperspectral) and Synthetic Aperture Radar (SAR) systems. Improvements could include new SAR processing techniques that leverage the magnitude and phase information for better situational awareness or image processing techniques that reduce noise and sharpen the imagery for analysis.

3) Analysis: The session seeks papers that create meaningful information that can be used by disaster response personnel to save lives. For this session, analysis could include new methods to quickly identify damage, provide decisions to emergency managers on where to move personnel and supplies, or fuse multiple sources of information to provide better situational awareness in near real-time. 
Show/Hide Description
CCS.112: Scaling rice Measurement, Reporting, and Verification platforms with Earth Observations
SP.3: Special Scientific Themes — Remote sensing for sustainable development in the Asia-Pacific region
Rice is a staple food and livelihood for over 5 billion people. However, producing rice poses significant sustainability challenges due to its intense water requirements and greenhouse gas (GHG) footprint from traditional cultivation practices. To combat these challenges, crop management strategies such as direct seeded rice (DSR) and alternate wetting and drying (AWD) are being promoted. The remote sensing community is contributing to these advances by tracking the adoption and impacts of sustainable agricultural practices using digital Measurement, Reporting and Verification (dMRV) quantification tools and platforms. A core pillar of next generation dMRV is the use of multi-source Earth Observations to assess management practices, drive process based crop models, and gauge the impacts of management on water use, GHG emissions, and yield. This CCS invites presentations that leverage Earth Observations and satellite remote sensing to assess, track, and quantify rice crop management practices, productivity and sustainability. Of particular interest to the community is advancing the Application Readiness Levels of assessing rice crop phenology at national scales, distinguishing seeding practices, characterizing irrigation and inundation, measuring residue biomass, and quantifying methane and nitrous oxide emissions. Techniques can include time series, physical retrievals, field experiments, multi-frequency SAR & SAR-optical fusion, and data science and machine learning.
Show/Hide Description
CCS.113: Space Lidar: Missions, Technologies, and Observations
S/M.5: Mission, Sensors and Calibration — Spaceborne LIDAR Missions
This session proposal is from the Chairs/Co-Chairs of the Working Group of Active Optical and Lidar, IEEE-GRSS Technical Committee on Instrumentation and Future Technology (IFT). It is in line with the GRSS outreach to Space Agencies and Industry and the IGARSS 2024 special theme on “SP.9: Wave propagation and remote sensing.”

One session (10 papers) is planned on Space Lidar: Missions, Technologies, and Observations, with emphasis on invited papers from international space agencies, industry, and academia on enabling technology developments, space missions, and observations.

This session focuses on research and developments in an important topic in active optical remote sensing: Space Lidar. Lidar's unique capability to observe a diverse variety of geophysical phenomena from orbit around the Earth and planets has stimulated new areas of remote sensing research that now attract the attention of scientists and engineers worldwide. With a number of instruments already operational or pending launch within the coming years, many of the original technological issues have been resolved; still, the long-term reliability of key active components and survival in the harsh space environment require additional efforts and investments. Some of the key topics for this session are:

• Space Agencies (NASA, ESA, JAXA, CNES etc.) on-orbit missions, future missions, technical challenges, observations, and science products
• Continuation of work in the domain of long-lived/ high power UV/visible/infrared lasers and optics, especially contamination and optical damage
• Research to improve the reliability of lasers/diode lasers and high-power optics operated in vacuum 
• Space-qualification of tunable lasers and optics to support trace gas lidars operating in the 1-5 µm region 
• Space-qualification of higher efficiency lasers such as fiber lasers and amplifiers. Radiation hardening is an area of particular concern 
• General power scaling of space-qualified lasers with a focus on improved efficiency and thermal tolerances 
• Improved high gain, low dark noise and low NEP space qualified/qualifiable array/detector at all wavelengths, and in particular in IR bandwidth 

This session will focus on science and applications addressed by space-based lidar, as well as on techniques and supporting technology. Our goal is for this session to provide a stimulating forum where members of the international lidar and related technology communities can present and discuss results, trends, and future directions.


Show/Hide Description
CCS.114: Space missions on the thermal infrared radiometry of the Earth at high spatio-temporal resolution
L.5: Land Applications — Agriculture
Energy transfer and exchanges of water and carbon fluxes in the soil–vegetation–atmosphere system need to be described to enhance the role of environmental biophysics. Climate indicators include water stress, urban heat islands, floods and fires, drought periods, sea level rise, ocean heat, the retreat of glaciers and sea ice extent, and shrinking ice sheets. Most of them can be well depicted provided that realistic information is obtained, which can be derived from the Essential Climate Variables (ECVs) of the Global Climate Observing System (GCOS), among which are the land surface temperature (LST), the land surface emissivity (LSE), and evapotranspiration (ET). LST is defined as the radiative skin temperature and is broadly considered in agriculture (plant growth, water stress, crop yield, precision farming, early warning, freezing scenarios), in hydrology (water cycle, catchment water, etc.), and in meteorology. LSE partitions the surface attributes (vegetation, soil, snow, water, rock, man-made material) that shape the landscape. LST is a proxy for the surface energy budget, urban heat islands, and mixing processes over the coastline and shoreline.
A new generation of spaceborne platforms will measure Thermal InfraRed (TIR) radiometry at a high spatial resolution - typically between 30 m and 100 m - with a frequent revisit at noon and night time. The precursor is ECOSTRESS (ECOsystem Spaceborne Thermal Radiometer Experiment on Space Station), flying today aboard the International Space Station (ISS), followed by TRISHNA (Thermal infraRed Imaging Satellite for High-resolution Natural Assessment), SBG (Surface Biology and Geology) and LSTM (Land Surface Temperature Monitoring). A design driver of these foreseen space missions is water use in agriculture, which represents 70% of global freshwater withdrawals, making sustainable irrigation a key issue. Automatic detection and mapping of irrigated farmland areas is vital for many services in charge of water management. In that respect, the cutting-edge TIR signal will bring new insights, in addition to visible and near-infrared observations, on irrigated areas, which display the lowest LST values at the peak of growth. Indeed, global change imposes the implementation of more efficient irrigation practices at the scale of the agricultural plot for better control. The decrease of soil moisture after water supply can be evaluated from the surface moisture estimated by radar, but TIR observations remain better suited to monitoring vegetation water stress and irrigation at the agricultural plot and to adapting supply to the needs of individual crops. This is why it is important to develop new satellite observation systems in the TIR range that reconcile high spatial resolution with high revisit capability.
This special session will present the specificities of the forthcoming space missions devoted to the measurement of the thermal infrared signal of the entire Earth system, with the successive launches of TRISHNA (2026), SBG (2027) and LSTM (2029, 2030). Since these Earth observing systems will overlap in time, joint efforts are being set up between space agencies (CNES, ISRO, NASA, ESA) to develop mutual research work and harmonize mission characteristics regarding the deliverables and the Cal/Val strategy. This special session will highlight these efforts through mission presentations.
Show/Hide Description
CCS.115: Space-based Imaging Spectrometers: Status of Current Missions and Planned Missions and Sensors
S/M.4: Mission, Sensors and Calibration — Spaceborne Hyperspectral Missions
This session provides the remote sensing community an update on government, commercial, and other space-based imaging spectrometers (hyperspectral imagers). This session creates an opportunity to inform the community on current missions providing data, for example, NASA's EMIT (Earth Surface Mineral Dust Source Investigation), the German Aerospace Center’s DESIS (DLR Earth Sensing Imaging Spectrometer) and EnMAP (Environmental Mapping and Analysis Program), ASI’s PRISMA (PRecursore IperSpettrale della Missione Applicativa), JAXA’s HISUI (Hyperspectral Imager Suite), and China’s GaoFen-5 AHSI (Advanced Hyperspectral Imager). The session is also a venue to highlight newly launched sensors designed to operate alone on small satellites, for example, Planet’s Tanager, on larger independent platforms, on the space station, or as part of a constellation, for example Orbital Sidekick’s GHOSt-1 and -2 (Global Hyperspectral Observation Satellite). This venue also facilitates communication of the latest developments on forthcoming missions, such as ESA’s CHIME (Copernicus Hyperspectral Imaging Mission for the Environment) and NASA’s SBG (Surface Biology and Geology). As data usage and societal impact are key goals of Earth observation missions, this forum also invites presentations on data systems, tools for data visualization and analysis, user needs and experiences, outreach to underserved communities and co-development of research and applications. As the pace of global change has increased the demand for science quality data to address new challenges, this session also seeks presentations on novel and emerging science and applications which have exceeded core objectives of imaging spectrometer missions. Discussions of leveraging national missions of varied measurement modalities, international cooperations, and partnerships between government, commercial, and other groups are encouraged.
CCS.116: Spaceborne bistatic SAR: missions, systems, processing, and applications
T/S.5: SAR Imaging and Processing Techniques — Bistatic SAR
Within the context of microwave Earth Observation, synthetic aperture radar (SAR) systems working in bistatic configurations have been gaining increasing attention from both the scientific community and public and private bodies. Compared with conventional, i.e., monostatic, SAR imaging, bistatic geometries enable innovative acquisition modes and applications based on the increased degrees of freedom of the system. Single-pass interferometry with a single transmitter and two receivers is just one example, but further application opportunities can be envisioned in support of fundamental societal challenges, ranging from climate change to natural and anthropogenic disaster monitoring and mitigation. In 2023, the first demonstration of a spaceborne acquisition with a large bistatic angle was carried out by two satellites of the SAR constellation operated by the Capella Space company. The forthcoming launch of the PLATiNO-1 (PLT-1) mission, operated by the Italian Space Agency, will move spaceborne bistatic SAR from a speculative perspective to an operational phase, in which bistatic SAR imagery will be collected systematically through the cooperation of PLT-1 and COSMO-SkyMed Second Generation satellites. In the context of space-based bistatic SAR, the Harmony project has recently been selected by the European Space Agency as the tenth Earth Explorer mission. Harmony is conceived as a constellation of two receive-only satellites equipped with a C-band SAR payload operating in tandem with a Sentinel-1 satellite, which serves as an illuminator.

Compared to the monostatic geometry, bistatic acquisitions pose numerous challenges, spanning from the design and development phases (where synchronization between the transmitter and the receiver, as well as orbit and attitude control to ensure an adequate coverage area, are of key relevance) to data synthesis (conventional monostatic strategies might be inaccurate), modeling (new bistatic electromagnetic scattering and speckle models might be required) and exploitation (innovative, visionary applications could be driven by the bistatic observation geometry).
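As a point of orientation on the geometry-driven effects mentioned above, a textbook approximation for the achievable slant-range resolution of a bistatic acquisition with bandwidth B and bistatic angle \beta between the two lines of sight is

    \delta_r \approx \frac{c}{2 B \cos(\beta / 2)},

which reduces to the familiar monostatic value c/(2B) for \beta = 0 and degrades as the bistatic angle grows; the exact expression depends on the acquisition geometry and processing choices.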

This Community Contributed Session is aimed at showcasing the latest advances in bistatic SAR systems, with a special focus on systems in which at least one platform is spaceborne. Within this context, we welcome contributions covering diverse aspects related to mission and system design and development, data synthesis algorithms, and applications of bistatic SAR data.
CCS.117: Synergistic Applications of VSWIR hyperspectral and thermal infrared data in support of future Earth science satellite missions
S/M.7: Mission, Sensors and Calibration — New Space Missions
New satellite missions will provide Earth observations of unprecedented combined spectral, spatial, radiometric, and temporal resolution. Contemporaneous observations from visible to short-wave infrared (VSWIR) hyperspectral and multispectral thermal infrared (TIR) sensors can help address complex and unique scientific and application questions. Global VSWIR and TIR measurements are expected from future satellites, such as NASA’s Surface Biology and Geology (SBG) and ESA’s CHIME and LSTM missions, among others. Areas of potential application for joint VSWIR hyperspectral and TIR data include improved measurements of surface mineralogy, ecological biodiversity, agriculture, land cover and land use change, greenhouse gases, water quality, natural hazards such as wildfires and volcanic eruptions, surface energy balance, snowpack conditions, public health, and urban development, to mention a few. This session highlights synergistic use of VSWIR and TIR data that addresses unique questions and improves decisions in the areas above.

There is a large amount of VSWIR and TIR data currently available from global space-based missions, such as EMIT, ECOSTRESS, DESIS, EnMAP, PRISMA, and others. Recent airborne campaigns, such as USGS’s GEM-X and NASA’s BioSCape and SHIFT, have captured high-spatial-resolution VSWIR and TIR data. This session will provide a forum for discussing the development of algorithms that use VSWIR and TIR data for Earth science and applications, as well as the synergistic use of VSWIR- and TIR-derived data products, drawing on this wealth of data. This can include both joint products and joint science (where products are derived independently). In addition, this session aims to strengthen ties between researchers who use VSWIR and TIR data for Earth science and applications and to reveal best practices for fusion of these diverse data sources.
CCS.118: System Engineering and Supporting Technologies for Future Satellite Missions.
S/I.15: Sensors, Instruments and Calibration — Advanced Future Instrument Concepts
There is a need for a deeper understanding of the supporting satellite systems used to enable and accommodate future instruments. Satellite system engineering is a collection of interdisciplinary fields that allows the transformation of observations into a user product. Future sensor systems are expected to have higher spatial and temporal resolution and to generate data volumes at least one order of magnitude higher than today’s systems. More capable on-board propulsion and launch vehicles, plus the ability to shrink sensors and pack them into smaller satellites (such as CubeSats), enable access to more orbits and allow constellations of small satellites to be used to fulfill observing needs. Satellite on-board data handling will face the challenge of collecting and processing these data volumes, potentially applying data compression and in some cases performing on-board data processing to reduce the total downlink volume. Space communication will need to handle the increasing data volume by using all the available bands (S, K, Ka, Q/V). Intersatellite links are becoming available in LEO and higher orbits. Geostationary orbit offers the capability of almost continuous observations as well as data downlink. Modern cloud computing environments offer rapid data processing scalability, but this environment must be guarded with state-of-the-art security apparatus due to continuous and changing security threats. The final challenge is in distributing the processed data to a host of users in a timely and user-friendly manner. System engineering needs to be applied to orchestrate all of this emerging capability in a holistic way to allow maximum impact of future sensing systems.
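To make the scale of the data-handling challenge concrete, the back-of-the-envelope budget below uses purely hypothetical numbers (instrument rate, duty cycle, compression ratio and contact time are assumptions, not figures from any particular mission):

    # Illustrative downlink budget; every number below is an assumption for the example.
    instrument_rate_gbps = 4.0        # raw instrument output, Gbit/s
    duty_cycle = 0.5                  # fraction of each day spent observing
    compression_ratio = 3.0           # on-board compression factor
    contact_seconds_per_day = 4 * 600 # four ground contacts of ~10 minutes each

    daily_volume_tbit = instrument_rate_gbps * duty_cycle * 86400 / 1000 / compression_ratio
    required_rate_gbps = daily_volume_tbit * 1000 / contact_seconds_per_day
    print(f"Compressed daily volume: {daily_volume_tbit:.1f} Tbit")
    print(f"Sustained downlink rate needed during contacts: {required_rate_gbps:.0f} Gbit/s")

Even with these modest assumptions the required contact-time data rate is tens of Gbit/s, which is why on-board processing, Ka/Q/V-band and optical links, relay constellations and additional ground contacts all appear in the trade space.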
CCS.119: Technology Validation and Science Observations from CubeSat and SmallSat missions.
S/M.7: Mission, Sensors and Calibration — New Space Missions
After seven years of successful CubeSat-focused sessions at IGARSS, this year (2025) we plan to organize sessions focusing on the progress of CubeSat and SmallSat missions in validating technology and observation methods critical to the future of remote sensing. Since 2012, the NASA Earth Science Technology Office has run an ongoing research program targeted at technology validation in space, In-Space Validation of Earth Science Technologies (InVEST). This program encourages flying new technologies and new observation methods on CubeSat platforms. Recently, the InVEST program made the leap to technology demonstration on SmallSat platforms. This advance was made possible by commercial industry investments in SmallSat platform development, which have made SmallSat platforms competitive with CubeSat platforms. Recently, many other space agencies, such as ESA, JAXA, ASI, and the Australian Space Agency, have selected or launched instruments on CubeSat platforms. This shows an increased interest in CubeSat-based missions.

Since the community has graduated from amateur experiments of building CubeSats in university settings, it is time to look into possible science applications of these platforms. Recently, NASA and ESA have been able to gather science-grade data from some of their CubeSats. In this session we plan to focus on NASA/InVEST, ESA/Scout, and similar programs in other organizations, including Australian programs. This will give those teams an opportunity to showcase the latest developments in remote sensing through smaller platforms. Principal Investigators will present their latest results from their missions.
This session will be a high-level forum bringing together scientists from all over the world involved in the research, design, and development of CubeSat-based instruments for remote sensing applications. The session will also give the broader community an opportunity to get a quick overview of the latest technology in remote sensing through CubeSats and SmallSats.
This session is organized as part of the GRSS Instrumentation and Future Technologies (IFT) Technical Committee working group on Remote Sensing Instruments for Small Satellites.
CCS.120: Terrestrial Radar/SAR Systems and Applications
S/M.9: Mission, Sensors and Calibration — Ground based Systems
Over the past decade, Synthetic Aperture Radar (SAR) technology has evolved into a mature and indispensable tool for Earth observation, particularly with orbital platforms. These systems have significantly contributed to addressing critical global challenges such as climate change, food security, and the monitoring of natural and anthropogenic hazards. While spaceborne SAR systems will continue to play a pivotal role in the future, terrestrial radar and SAR systems have emerged as a valuable complement. Terrestrial platforms provide advantages for local-scale monitoring and time-sensitive applications where orbital systems face limitations, particularly their comparatively long revisit times. For many applications, such as monitoring subsidence or vegetation dynamics, a temporal resolution of minutes, hours, or days is necessary, which orbital SAR systems cannot always provide.

The growing scientific interest in ground-based radar and SAR systems is mirrored by their increasing appeal to private industry. Commercial applications, particularly in the fields of subsidence monitoring, agricultural crop and vegetation monitoring, and change detection, have driven demand for agile, compact, and high-performance terrestrial SAR systems. These systems not only require innovative SAR hardware designs but also advanced navigation solutions using compact systems, sometimes combined with vision technologies. Moreover, terrestrial SAR applications demand the development of dedicated SAR imaging algorithms and Differential Interferometric SAR (DInSAR) processing chains to address the unique challenges posed by diverse imaging geometries.
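For orientation, the core DInSAR relation exploited by such processing chains is, in its textbook form,

    d_{LOS} \approx -\frac{\lambda}{4\pi}\,\Delta\phi_{disp}

(up to the sign convention of the processor), so that at a wavelength of a few centimetres one radian of differential phase corresponds to only a few millimetres of line-of-sight motion; combined with revisit intervals of minutes rather than days, this is what makes terrestrial systems attractive for subsidence and deformation monitoring.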

This invited session seeks to bring together contributions from both academic research institutions and private companies to showcase the latest advancements in terrestrial radar and SAR systems. Presentations will highlight current state-of-the-art technologies, platforms, and applications, with a particular emphasis on geoscience-related use cases. By providing a platform for cross-sector exchange, this session aims to foster collaboration between research and industry to accelerate the development and deployment of terrestrial SAR technologies for societal benefit.

This session has been proposed and accepted at past IGARSS editions; at IGARSS 2024, for instance, attendance was on the order of 80-90 people.
CCS.121: The concept and new developments in real-time remote sensing
S/M.7: Mission, Sensors and Calibration — New Space Missions
Earth observation satellite development worldwide supports and promotes remote sensing applications in various sectors. For most applications, the logic of retrieving information from satellite imagery is that the user must first access the image before starting to work on it. For most satellites, the latency of accessing an image is hours to days. However, for emergency applications and operational services, a latency of hours or days cannot be tolerated. The concept of real-time remote sensing, proposed by Prof. Wang Qian in China in 2020, therefore aims to develop a new solution that circulates information from the satellite directly to users, instead of transmitting the full satellite image, in emergency applications.

We therefore propose this session to address the speed issue in remote sensing applications. We believe that delivering timely information into users' hands through real-time remote sensing is an important subdirection of remote sensing development. In this session, we want speakers worldwide who are working in the relevant fields to exchange lessons learned, promote the concept of real-time remote sensing, and ultimately build a community around this concept.

We expect the topics for this proposed session to include the following:
 - The theory, concept, and principles of real-time remote sensing
 - New Earth observation schemes for satellites
 - Information retrieval algorithms with onboard computation (see the sketch after this list)
 - The development of onboard computing facilities
 - Communication among satellites, ground stations, and users for information flow
 - Newly developed satellite programs for real-time remote sensing
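As a toy illustration of the information-rather-than-image idea (all thresholds, block sizes and message fields below are invented for the example and do not describe any existing onboard processor), an onboard step might reduce a full scene to a few hundred bytes of georeferenced alerts before downlink:

    import json
    import numpy as np

    # Synthetic onboard scene: a water index derived from hypothetical onboard bands.
    rng = np.random.default_rng(1)
    water_index = rng.normal(0.0, 0.2, (1024, 1024))
    water_index[400:480, 600:700] += 0.9            # injected "flood" patch for the demo

    # Onboard step: threshold the index, summarize flooded blocks, and emit a compact
    # alert message instead of the ~1-Mpixel image itself.
    flooded = water_index > 0.5
    block = 128
    alerts = []
    for i in range(0, flooded.shape[0], block):
        for j in range(0, flooded.shape[1], block):
            frac = float(flooded[i:i + block, j:j + block].mean())
            if frac > 0.05:
                alerts.append({"row_block": i // block, "col_block": j // block,
                               "flooded_fraction": round(frac, 3)})

    message = json.dumps({"product": "flood_alert_demo", "alerts": alerts})
    print(f"Scene pixels: {flooded.size}; downlinked message: {len(message)} bytes")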
CCS.122: The Future of Analysis Ready Data (ARD): Ensuring More Data Delivers Greater Impact
T/A.21: AI and Big Data — Spatio-temporal Data Harmonization
“Analysis-Ready” Earth Observation (EO) satellite data are data that have been processed to a minimum set of requirements and organised into a form that allows immediate analysis with a minimal amount of additional user effort, and interoperability both through time and with other datasets. The Committee on Earth Observation Satellites (CEOS) initiated the “CEOS-ARD” certification process in 2018 to facilitate easier use of EO data, improved automation, and enhanced multi-source applications of observations from public and private sector EO missions.
This session will focus on several key areas that are vital to ARD’s successful evolution. These factors include improved EO data interoperability, the growing role of cloud computing infrastructure for large-scale data processing, Artificial Intelligence (AI) and the integration of commercial and public sector EO datasets. Bringing together these datasets in multi-source decision-support tools is crucial for improving climate adaptation, disaster response, and sustainable development. These applications and others will benefit from the combination of diverse EO data sources including the enhancement and development of reliable and effective tools. 
The great community uptake of CEOS Analysis Ready Data (CEOS-ARD) to date has demonstrated the potential when barriers to entry for large-scale EO data are reduced allowing a broader range of users to more easily access and automate the use of land imaging datasets. The EO community now needs to decide how to take the next step forward. In doing so, it is important to maintain trust, high-quality data, and integrity across multiple data providers and satellite missions.
A key session theme focuses on the potential of ARD to support seamless data interoperability across multi-missions and providers. Enhancing concepts like the CEOS-ARD framework allows diverse datasets from various EO missions, platforms, and providers to integrate and expand our understanding of Earth's systems. ARD also promotes transparency and accountability in AI applications by ensuring data is interoperable, consistent, and traceable. Additionally, the discussion will cover the role of cloud computing infrastructure, which is essential for providing the power and scalability needed to manage and analyse large volumes of EO data.
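As one concrete, deliberately simplified illustration of cloud-native ARD access, the sketch below assumes the open-source pystac-client and stackstac packages and a public STAC catalogue; the endpoint, collection and area of interest are examples only, and CEOS-ARD certified products may be exposed through other catalogues, asset names and tooling:

    # Sketch only: requires `pip install pystac-client stackstac` and network access.
    import pystac_client
    import stackstac

    # Example public STAC API endpoint; ARD collections may live elsewhere.
    catalog = pystac_client.Client.open("https://earth-search.aws.element84.com/v1")

    # Search an analysis-ready surface-reflectance collection over an example area and period.
    search = catalog.search(
        collections=["sentinel-2-l2a"],
        bbox=[150.5, -34.1, 151.0, -33.7],
        datetime="2024-01-01/2024-03-31",
    )
    items = search.item_collection()

    # Lazily stack the items into a (time, band, y, x) datacube for cloud-side analysis.
    cube = stackstac.stack(items, assets=["red", "nir"], resolution=20)
    print(cube.sizes)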
The session will also explore the importance of deeper partnerships between government and commercial EO data providers. Governments provide reliable long-term data, while commercial data providers bring innovation and scalability. As commercial sector involvement in EO increases, integrating these datasets with traditional public sources will enhance ARD’s availability and value. This combination is expected to create new opportunities in areas like precision agriculture, urban planning, climate forecasting, and disaster risk reduction, making high-quality data more accessible and actionable.
Further, this session aims to provide a comprehensive overview of the ARD concept and its adoption by public and private sector EO mission operators, showing how its growth could potentially advance societal benefits at the local, regional, and global scales. With improvements in data interoperability, cloud-based infrastructure, commercial dataset integration, and global collaboration, ARD is set to play a key role in making EO data more accessible, scalable, and effective in tackling environmental, natural resource, and climate challenges of today and tomorrow.
CCS.123: The NISAR Mission: Status and Early Science
S/M.1: Mission, Sensors and Calibration — Spaceborne SAR Missions
The NASA-ISRO SAR (NISAR) Mission will launch in early 2025, delivering a flood of data acquired over land, ice, and coastal oceans.  After launch and a 90-day commissioning period, science operations will begin, where NISAR instruments will collect about 35 Terabits of L-band and 5 Terabits of S-band radar image data each day.  The L-band radar will observe all land and ice-covered surfaces of Earth on the ascending and descending portions of each orbit every 12 days, and the S-band radar will observe India and surrounding areas, Antarctica, and distributed global scientific areas of interest.  Nearly all S-band acquisitions are collected simultaneously with L-band acquisitions, creating a unique globally distributed time-series data set. The commissioning plan calls for early engineering mode acquisitions around one month after launch, some of which may be usable to form images, followed by a period of orbit adjustment and system timing and pointing calibration. To prepare for science operations, the science team has developed a list of areas where early data can be acquired to demonstrate the preliminary quality of the data and to illustrate the science themes NISAR is addressing: solid Earth sciences, ecosystems sciences including global soil moisture, and cryosphere sciences, as well as a host of applications.  Cloud-based tools for image processing and diagnostic analysis, usable by the science team members and science community alike, will be available to examine these early data sets.  The mission plans to release data to the science community at all levels as soon as possible, including preliminary pre-calibration data sets marked appropriately, to allow scientists to begin working with the data and provide feedback to the mission team.  Once the system is operating at its planned cadence, the data will be made available within a few days of acquisition from the Alaska Satellite Facility Distributed Active Archive Center.
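To put the quoted acquisition rates in perspective (a simple unit conversion of the figures above; actual archive growth will differ once processing levels and overheads are included):

    # Rough conversion of NISAR's quoted daily radar data rates into archive growth.
    l_band_tbit_per_day = 35.0
    s_band_tbit_per_day = 5.0
    tbyte_per_day = (l_band_tbit_per_day + s_band_tbit_per_day) / 8.0  # ~5 TB/day of raw bits
    pbyte_per_year = tbyte_per_day * 365.0 / 1000.0                    # ~1.8 PB/year
    print(f"~{tbyte_per_day:.0f} TB/day, ~{pbyte_per_year:.1f} PB/year before processing overheads")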
CCS.124: The Prospect of Remote Sensing Foundation Models: From Generation to Application
T/A.22: AI and Big Data — Data Analytics and AI Techniques in Remote Sensing
Remote sensing foundation models, enabled by massive amounts of Earth Observation data and Artificial Intelligence, are changing how remote sensing data is analyzed, providing deeper insights and efficient solutions for multiple downstream remote sensing tasks. However, there are clouds over the future of remote sensing foundation models. The first concerns the process of generating them. Although a vast amount of remote sensing data is generated every day, ethical issues in data use, such as data sharing, privacy protection, and regulations, as well as the spatial and temporal heterogeneity of the data, pose significant challenges to the architectural design and generalization capability of foundation models, and the training cost, including the cost of data acquisition and model training, is very high. Secondly, many problems still need to be explored regarding the application of foundation models. From the data perspective, it remains unclear whether there are regional biases in the distribution of the pre-training data and whether such biases propagate into the predictions of the foundation model, thus creating geographical inequalities. From the perspective of downstream tasks, the potential of foundation models for urbanization monitoring, environmental protection, poverty eradication, and the achievement of the Sustainable Development Goals (SDGs) still needs to be discussed.

Our session fits well with the theme of IGARSS 2025 and responds positively to the use of remote sensing for collaborative global solutions. We want to bring together researchers from different backgrounds, including remote sensing data providers, remote sensing foundation model algorithm researchers, and researchers with backgrounds in ecology, urbanism, and sustainable development, to discuss and demonstrate their state-of-the-art Earth observation technologies and data, cutting-edge foundation model design and training techniques, and the most influential foundation model applications. The aim is to move towards more equitable and effective global remote sensing foundation models through international cooperation, ethical considerations, technological innovation, and application orientation, thereby contributing to the One Earth goal.
CCS.125: Thermal Infrared Remote Sensing for Global Climate Change Analysis
SP.5: Special Scientific Themes — Global warming, climate records and climate change analysis
Global climate change is among the most urgent challenges facing humanity, with far-reaching impacts on ecosystems, economies, and public health. The need for reliable, comprehensive data on surface and atmospheric hydrothermal dynamics is more critical than ever. Thermal Infrared (TIR) remote sensing has emerged as a vital technology for obtaining both surface thermal properties and energy exchange processes, providing high spatial and temporal resolution data at global and regional scales. TIR remote sensing allows for accurate monitoring of land and sea surface temperatures, evapotranspiration, and other key parameters essential for understanding climate dynamics and their effects across various scales.
This session will focus on advancing global climate change analysis through the use of TIR remote sensing. It will cover breakthroughs in payload design and satellite sensor performance, which have improved data accuracy and collection frequency; new methodologies for retrieving surface temperatures, emissivity, and other climate-relevant variables; innovations in data fusion and product generation that enhance spatial and temporal coverage; and the application of TIR remote sensing to analyze and attribute climate-related phenomena such as thermal anomalies, urban heat islands, droughts, and extreme weather events.
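For readers new to the retrieval side, one widely used family of methods behind the surface-temperature products mentioned above is the split-window approach, whose simplest textbook form is

    LST \approx a_0 + a_1 T_{11} + a_2 (T_{11} - T_{12}),

where T_{11} and T_{12} are brightness temperatures in two adjacent atmospheric-window TIR channels (near 11 and 12 µm) and the coefficients a_i absorb the dependence on atmospheric water vapour and surface emissivity; operational algorithms add further terms and sensor-specific coefficients.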
This session aims to provide a platform for the exchange of knowledge on the applications of TIR remote sensing in global climate change analysis, emphasizing both the technical challenges and the scientific opportunities within this field. By highlighting case studies and theoretical advancements, the session will explore the broader implications of TIR technology for climate modeling, policy-making, and global sustainability efforts. By bringing together researchers and practitioners from diverse disciplines, this session aims to deepen our understanding of global climate change while fostering the development of innovative tools and strategies to mitigate climate risks.
CCS.126: Thermal infrared remote sensing: Techniques and Applications
S/M.8: Mission, Sensors and Calibration — UAV and Airborne Platforms
Thermal infrared remote sensing has rapidly emerged as a crucial tool for observing the Earth's surface and atmospheric conditions. Recent advancements in satellite and UAV technology and sensor capabilities have significantly improved our ability to monitor land surface temperature, soil moisture, and evapotranspiration with greater precision. These technological breakthroughs are enabling scientists and researchers to address pressing environmental and climate challenges by providing more detailed data on energy fluxes and surface-atmosphere interactions. As we enter a new era of Earth observation, spaceborne and UAV thermal infrared remote sensing will play an increasingly important role in understanding and managing environmental dynamics. The importance of thermal infrared remote sensing lies in its ability to provide critical insights into various Earth processes. Spaceborne and UAV thermal infrared data, when integrated with other remote sensing data, offer a more comprehensive view of environmental changes at multiple scales, enhancing our ability to respond to natural disasters, monitor greenhouse gas emissions, and manage ecosystems.
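For context, the evapotranspiration estimates mentioned above typically rest on the surface energy balance closed with the TIR-derived land surface temperature; in its simplest residual form (neglecting storage and advection terms),

    \lambda E = R_n - G - H,

where R_n is net radiation, G the soil heat flux, H the sensible heat flux constrained by the surface-to-air temperature gradient, and \lambda E the latent heat flux, which divided by the latent heat of vaporization (about 2.45 MJ kg^{-1}) gives the evapotranspiration rate.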

This special session will focus on the latest developments in spaceborne and UAV thermal infrared remote sensing, with an emphasis on its expanding applications and future potential. Submissions are invited that explore innovative retrieval algorithms, the integration of thermal infrared data in Earth sciences, sensor technology advancements, and case studies demonstrating real-world applications. We particularly welcome contributions that discuss the synthesis of thermal data with other datasets, as well as forward-looking articles on emerging technologies and methodologies. The session will provide a platform for researchers to share cutting-edge findings and reflect on the future direction of thermal infrared remote sensing.
CCS.127: Towards meta-constellation GNSS/SoOp reflectometry science and applications
S/M.3: Mission, Sensors and Calibration — Spaceborne GNSS-R Missions
In recent years, an increasing number of spaceborne GNSS and other Signal of Opportunity (SoOp) reflectometry measurements have become available. Beyond NASA’s Cyclone GNSS (CYGNSS) mission, many new SoOp data sources are coming online, such as China’s FengYun-3E and Bufeng-1, Taiwan’s TRITON, India’s EOS-08, ESA’s upcoming HydroGNSS satellites, and a significant influx of commercial SoOp satellite systems from Muon Space, Spire, Yunyao, Tianmu, and others. The GNSS and Signals of Opportunity working group of the GRSS Instrumentation and Future Technologies Committee believes that a community-driven approach to examining and exploring the synergies between these many data sources would be greatly beneficial to the community of researchers and scientists. This network of sensors could provide valuable new insights into threats to Earth’s ecosystems and promote collaborative, global efforts to address these threats.
This session will investigate the combination of data from multiple GNSS-R satellite constellations into higher-level data products, along with the associated benefits and difficulties. Evaluations of the data product requirements on current and future technologies will also provide valuable insight into future SoOp mission concepts and designs. Furthermore, beyond addressing inter-constellation operability of GNSS-R products, further integration of additional SoOp sources (e.g., NASA's InVEST P-band SNOOPI, future frequency-scanning Rydberg radar technologies, and/or other instruments) will be examined within the context of data assimilation techniques and of adding to the density of remote sensing measurement networks. To best achieve the goals of this Community Contributed Session, we aim to have participation from industry, academia, and national laboratories.
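A deliberately simplified sketch of the kind of merging discussed above is given below; the column names, quality flag and grid resolution are invented for the example, and real inter-constellation products additionally require careful inter-calibration of the observables:

    import numpy as np
    import pandas as pd

    # Synthetic specular-point observations from two hypothetical GNSS-R constellations.
    rng = np.random.default_rng(2)
    def fake_obs(name, n):
        return pd.DataFrame({
            "constellation": name,
            "lat": rng.uniform(-40, 40, n),
            "lon": rng.uniform(-180, 180, n),
            "reflectivity_db": rng.normal(-12, 3, n),
            "quality_ok": rng.random(n) > 0.1,
        })

    obs = pd.concat([fake_obs("A", 5000), fake_obs("B", 8000)], ignore_index=True)

    # Merge quality-flagged observations from both constellations onto a common 1-degree grid.
    good = obs[obs["quality_ok"]].copy()
    good["lat_bin"] = np.floor(good["lat"]).astype(int)
    good["lon_bin"] = np.floor(good["lon"]).astype(int)
    merged = (good.groupby(["lat_bin", "lon_bin"])["reflectivity_db"]
                  .agg(["mean", "count"]).reset_index())
    print(merged.head())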
CCS.128: Trends in Standardization Supporting the Acquisition, Processing, Analysis and Application of Remote Sensing Data
D/S.8: Societal Engagement and Impacts — Standardization in Remote Sensing
The Institute of Electrical and Electronics Engineers (IEEE) Geoscience and Remote Sensing Society (GRSS) created the GRSS “Standards for Earth Observation Technical Committee” to advance the usability of remote sensing products by experts from academia, industry, and government through the creation and promotion of standards and best practices. The papers in this session describe the many ways in which current and emerging standards are enabling maximum benefit to be derived from remote sensing data. This includes standards development projects that the GRSS has sponsored (P4001, P4002, IEEE 4003, P4005 and P4006) as well as new projects that are being contemplated (Global EO Lidar, Protocol for Validating New Methods for Soil Nutrient Management, Utilization in Developing Countries of Remote Sensing for Disaster Management). In addition, papers are invited that describe less formal, community-driven standards and conventions that support the exploitation of remote sensing data. Standards provide a framework in which technological innovations can achieve their maximum uptake and impact. They are a means by which individuals, institutions and industry from around the world work together to advance science and technology for the benefit of Earth and society. Submissions are welcome from all GRSS fields of interest where technology standards promote innovation. Standards can help in planning future data sets so that valuable and inter-comparable products will result, with a view to enabling long-term stability and retrieval consistency in support of science and operational applications.
CCS.129: UAV/Mobile-Mapping SAR Systems and Applications
S/M.8: Mission, Sensors and Calibration — UAV and Airborne Platforms
SAR systems on UAVs and other mobile mapping platforms, such as cars, have gained increasing attention within the geoscience community. Small SAR systems deployed on such platforms offer complementary properties with respect to revisit time, operational flexibility, and observation capabilities compared to spaceborne and conventional airborne SAR systems. On the other hand, compared to stationary terrestrial radar/SAR systems, the increased synthetic aperture size of UAV/mobile mapping SAR systems allows a higher spatial cross-range resolution to be obtained, also for quasi-terrestrial observation geometries.
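The cross-range advantage noted above follows from the textbook azimuth-resolution relation

    \delta_{az} \approx \frac{\lambda R}{2 L_{SA}},

where \lambda is the wavelength, R the stand-off range and L_{SA} the synthetic aperture length traversed by the UAV or vehicle; a longer driven or flown aperture therefore directly sharpens the cross-range resolution relative to a stationary terrestrial system whose aperture is limited to a short rail or a real antenna.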

These complementary properties of UAV/mobile mapping SAR systems open a large field of potential applications, some of which are addressed within the scope of this session, including high-resolution DInSAR-based measurements of surface displacements, monitoring of vegetation and agricultural crops, and change detection.

From a system point of view, these agile SAR platforms require not only new compact SAR system designs, but also compact, innovative, high-performance navigation using smaller INS/GNSS systems, in some cases combined with vision systems, as well as adequate SAR imaging algorithms and DInSAR processing chains adapted to the potentially non-linear sensor trajectories and partial aperture synthesis common to UAV/mobile mapping SAR systems and applications.

UAV-borne SAR systems allow for experimental formation flying and therefore are an important tool to develop and test bistatic and multistatic SAR mission concepts including synchronization for future spaceborne SAR missions.

This community-contributed session aims to give an insight into recent state-of-the-art UAV/mobile-mapping-based SAR systems and applications, with a focus on the geosciences.
After our successful invited/community-contributed sessions on this topic at IGARSS 2021, 2022, 2023 and 2024, we would like to keep track of this topic, providing insight into the latest technological developments with small SAR systems on UAV/mobile-mapping platforms.

The session typically covers a number of novel systems and UAV/mobile mapping platforms of different sizes and types (fixed-wing and VTOL UAVs, cars), and a range of applications such as repeat-pass differential SAR interferometry for displacement measurements, change detection, and tomographic configurations.

We believe that our session topic, UAV/mobile-mapping-based SAR systems and applications, is already of very high interest to the geoscience and remote sensing community, as confirmed by the more than 90 attendees of this community-contributed session at IGARSS 2024 in Athens, and that this interest will only increase in the future.
CCS.130: UAV-based multi-sensor identification and mapping of surface and buried explosive ordnance
T/D.11: Data Analysis — Object Detection and Recognition
Every year, across the world, millions of civilians bear the brunt of landmines, cluster munitions and other explosive remnants of war (ERW) (e.g., unexploded bombs, grenades, artillery shells, among others).  Lives are lost or irreparably damaged, as survivors and their families struggle with the physical, psychological, social, and economic consequences of accidents.  The presence of explosive ordnance (EO) is not just a threat to the physical safety and wellbeing of people.  It is also a key obstacle to the timely and effective delivery of humanitarian aid, post-conflict stabilization, recovery and reconstruction, peace, and sustainable development. 

Even now, we see this as a major concern given the ongoing conflict between Ukraine and Russia, where land used for agricultural purposes remains unusable due to the presence, or suspected presence, of explosive ordnance, putting pressure on Ukraine’s economy and on global supply chains. With current methods and tools of land release, it will take decades to clear Ukraine’s contaminated land, which is only a fraction of the compounded global contamination.

In 2022, Ukraine joined Afghanistan, Bosnia and Herzegovina, Cambodia, Croatia, Ethiopia, Iraq and Turkey among the countries with more than 100 km2 contaminated by landmines alone.

In 2022, at least 4,710 casualties of mines and ERW were recorded in 49 states (1,661 killed and 3,015 injured), 85% of which were civilians.  At least 60 states and other areas are contaminated by antipersonnel mines. 

Traditional landmine detection methods, such as Electromagnetic Induction (EMI) metal detectors and Ground Penetrating Radar (GPR), have limitations. They can be unsafe, expensive, slow to collect data, and unsuitable for difficult land topography. UN News dated April 2nd, 2023, stated, “With traditional methods, it will take 1,100 years to clear all active landmines in the world if no new mines are laid.”

Thus, there is a very strong need for innovative solutions that are reliable, fast, wide-area applicable, and affordable. Recently, UAVs coupled with sensors have made a large impact in the geoscience and remote sensing community through their numerous applications (e.g., precision agriculture, infrastructure, disaster management, water quality, archaeology, etc.). This session solicits geoscience and remote sensing experts in the areas of UAVs, sensors, sensor fusion, and algorithms (AI/ML and/or others) to share their approaches to the detection of surface, and especially buried, explosive ordnance.
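As a purely illustrative example of the fusion-plus-learning pipelines such contributions might describe (synthetic data, invented channel names and a generic classifier; not a validated demining method):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    # Synthetic, co-registered per-pixel UAV features: magnetometer anomaly,
    # thermal contrast, and GPR return energy (all invented for the example).
    rng = np.random.default_rng(3)
    X = rng.normal(size=(2000, 3))
    # Invented ground truth: "ordnance-like" where magnetic and GPR responses are jointly high.
    y = ((X[:, 0] > 0.8) & (X[:, 2] > 0.5)).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
    print(f"Held-out accuracy on the toy problem: {clf.score(X_test, y_test):.2f}")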
CCS.131: Unlocking the Potential of HAPs for Earth Observation
S/M.10: Mission, Sensors and Calibration — High Altitude Platforms
This session explores the rapidly evolving field of High Altitude Platforms (HAPs) as a powerful new tool for Earth observation. Encompassing a range of technologies such as balloons, airships, and unmanned aerial vehicles operating in the stratosphere, HAPs offer a unique vantage point, effectively bridging the gap between satellite and airborne remote sensing. This provides a compelling alternative for acquiring high-quality geospatial data, addressing a critical need in the geoscience and remote sensing community.

We invite contributions that delve into the specific capabilities of remote sensing from HAPs, showcasing their inherent advantages. These include:
 - Enhanced Spatial Resolution: Because HAPs operate closer to the Earth than traditional satellites, they facilitate higher-resolution imagery for more detailed observations (see the worked example after this list).
 - Increased Temporal Coverage: HAPs can be strategically deployed and repositioned to provide rapid response and increased observation frequency over specific areas of interest, unlike satellites with fixed orbits.
 - Operational Flexibility: HAPs offer adaptable flight paths and payload configurations, allowing for customized data acquisition tailored to specific research or operational needs.
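To quantify the spatial-resolution point in the first bullet above with a worked example (the instantaneous field of view is an assumed value, not a specification of any instrument): since the ground sample distance scales roughly linearly with platform height, GSD ≈ H · IFOV, a detector with an IFOV of 50 µrad yields about 20 km × 50 µrad = 1 m from a stratospheric HAP, versus 500 km × 50 µrad = 25 m from low Earth orbit with the same optics.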

Discussions will encompass cutting-edge sensor payloads designed for HAPs, innovative data processing techniques to handle the unique characteristics of HAP-derived data, and emerging applications across various sectors. This includes environmental monitoring (e.g., atmospheric composition), disaster response (e.g., wildfire tracking, flood monitoring, rapid damage assessment, real-time situational awareness), and infrastructure assessment (e.g., monitoring of critical infrastructure, precision agriculture).

This session aims to highlight the transformative potential of HAPs in acquiring high-quality, timely geospatial data crucial for addressing critical challenges in geoscience and remote sensing. By fostering a dynamic exchange of knowledge and ideas, we aim to advance the understanding and application of this promising technology.
CCS.132: WSF-M Mission Status and Calibration
S/M.2: Mission, Sensors and Calibration — Spaceborne Passive Microwave Missions
The Weather System Follow-on Microwave (WSF-M) satellite is the next generation of the Department of Defense (DoD) operational environmental satellite system. WSF-M was successfully launched on April 11, 2024, into a sun-synchronous low Earth orbit (LEO). WSF-M carries two payloads, a polarimetric Microwave Imager (MWI) and an Energetic Charged Particle (ECP) sensor. The MWI has a total of 17 channels at frequencies of 10, 18, 23, 36 and 89 GHz, of which the 10, 18 and 36 GHz channels are fully polarimetric. WSF-M has been performing very well on orbit since launch. Two calibration maneuver campaigns were conducted, one in April and the other in October. Both geolocation and radiometric performance are well within the specified requirements. This session will provide the mission status and discuss calibration results.
The WSF-M predecessor, the Defense Meteorological Satellite Program (DMSP), has been providing the geoscience and remote sensing community with microwave imager data for more than 30 years. The data from the DMSP passive microwave radiometers have been used for both weather forecasting and climate studies. WSF-M will continue to produce high-resolution microwave imagery and generate essential weather data products such as ocean surface wind speed and direction, snow depth, sea ice characterization, soil moisture, tropical cyclone and other weather imagery, etc. We have received many inquiries about the WSF-M mission and the MWI sensor data sets. This session will provide information to help the geoscience and remote sensing community understand and use the WSF-M MWI data, which will in turn yield better weather products for end users.
CCS.133: AI InSAR
T/S.1: SAR Imaging and Processing Techniques — Interferometry: Along and Across
Interferometric Synthetic Aperture Radar (InSAR) has emerged as a crucial remote sensing tool for monitoring Earth’s surface dynamics, enabling the generation of Digital Elevation Models (DEM) and the extraction of ground deformation data. The integration of Artificial Intelligence (AI) with InSAR technology presents unprecedented opportunities to improve the efficiency, accuracy, and scalability of data processing, analysis, and interpretation in geoscience and remote sensing applications.
The growing body of research highlights AI's potential to address some of the most pressing challenges in InSAR studies, including phase unwrapping, pixel selection for reliable measurements, deformation signal post-processing, and complex terrain mapping. AI-driven methods offer computational advantages, particularly in scenarios involving large datasets, complex geophysical environments, and the need for real-time or near-real-time analysis. This session will explore cutting-edge AI methodologies applied to InSAR, showcasing innovations in detecting volcanic activity, analyzing urban area deformation, monitoring land subsidence, identifying landslide-prone zones, and predicting deformation patterns.
AI tools, such as machine learning and deep learning algorithms, have demonstrated remarkable effectiveness in processing the vast and complex datasets generated by InSAR systems. These tools enhance pixel classification accuracy, phase unwrapping efficiency, and deformation prediction models, making it possible to achieve more precise and reliable ground motion measurements. Furthermore, AI enables better handling of noisy or incomplete data, which is often a limitation in traditional InSAR analysis.
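As one minimal illustration of the deep-learning formulations referred to above (a toy PyTorch sketch; casting phase unwrapping as per-pixel classification of wrap counts is one approach reported in the literature, and the untrained architecture below is invented purely for illustration):

    import torch
    import torch.nn as nn

    # Wrapped phase enters as (cos phi, sin phi) channels; the network predicts a
    # per-pixel wrap-count class (here -2..+2, i.e. 5 classes), so unwrapping is
    # treated as a semantic-segmentation problem.
    n_wrap_classes = 5
    model = nn.Sequential(
        nn.Conv2d(2, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(),
        nn.Conv2d(32, n_wrap_classes, kernel_size=1),
    )

    wrapped_phase = torch.rand(1, 1, 128, 128) * 2 * torch.pi - torch.pi
    x = torch.cat([torch.cos(wrapped_phase), torch.sin(wrapped_phase)], dim=1)
    logits = model(x)                        # (1, 5, 128, 128); model is untrained here
    wrap_counts = logits.argmax(dim=1) - 2   # map class index back to an integer wrap count
    unwrapped = wrapped_phase.squeeze(1) + 2 * torch.pi * wrap_counts
    print(unwrapped.shape)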
The fusion of AI and InSAR is not only pushing the boundaries of earth observation but also plays a pivotal role in addressing global challenges related to natural hazards, infrastructure monitoring, and environmental change. From monitoring volcanic eruptions and urban infrastructure stability to tracking subsidence in coastal areas and assessing slope instability, AI-enhanced InSAR techniques provide critical data for decision-making in disaster risk reduction and sustainable land management.
This session seeks to bring together researchers and practitioners from the fields of geoscience, remote sensing, and artificial intelligence to share recent advancements and discuss the future of AI in InSAR applications. By fostering collaboration and knowledge exchange, this session will help shape the next generation of tools and methodologies for geospatial analysis, ultimately contributing to more resilient and informed responses to natural and anthropogenic processes shaping our planet.