Flood risk and large-scale analysis from space
How SAR data improves the monitoring of global flood events and how this can support global insurance and reinsurance company big data initiatives.
Guest author Gianni Cristian Iannelli, CEO at Ticinum Aerospace llc, provides an overview of how SAR data can produce better and quicker results when monitoring global flood events and how this can support global insurance and reinsurance company big data initiatives.
Statistically, floods are among the most damaging hazards occurring each year. They cause thousands of fatalities and billions of dollars of damage, and affect an ever-increasing number of people. Two summary reports, produced respectively by Munich RE (TOPICS GEO) and Aon (Annual climate catastrophe report), describe the impact of natural hazards in 2016 and highlight the effects of flood events. For example, according to Munich Re’s NatCat SERVICE, 50% of total occurrences were related to hydrological events. The Aon report confirms the same trend, stating that floods were the costliest peril for the fourth consecutive year (USD 62 billion in 2016).
These are just some of the reasons that justify flood detection and monitoring. Reliable figures are needed to reduce uncertainty in many fields, especially for insurance and reinsurance companies. To do this effectively, and most importantly at large scale, remote sensing is a smart solution. Two main types of sensors are usually considered: optical and SAR (Synthetic Aperture Radar). The former is experiencing unprecedented growth: many companies, among them Planet, BlackSky, Satellogic, and Spire, are launching optical constellations able to acquire data over the same area dozens of times a day. While they support the future of new geospatial applications, flood monitoring, unfortunately, is not their best field of application!
The advantages of SAR
SAR systems offer two notable advantages over optical ones: the ability to operate day and night, and to penetrate clouds and heavy rainfall. These features are especially important given the poor weather conditions under which flood events usually occur. SAR satellites have been around since 1991, when ESA launched its first SAR earth-observation satellite, ERS-1. Over the years, many public space agencies have contributed to ensuring good SAR data coverage by launching further satellites and constellations. The graph below highlights that, at any given time since 1991, SAR data has been generated by at least one satellite, without interruption. These satellites are characterized by different spatial and temporal resolutions (i.e. revisit times), enabling different geospatial applications.
SAR, ESA, open data and APIs
A game-changing event took place three years ago when the European Space Agency (ESA) launched the SAR satellite Sentinel-1A, followed by its twin Sentinel-1B in 2016. ESA releases its images under an open-data policy, thus removing the (typically high) data procurement costs from the data exploitation process. The pair of satellites offers short revisit times (approximately three days in most cases) and a spatial resolution of 20 meters, which is ideal for large-scale flood detection. Moreover, ESA offers a set of APIs to automate continuous data processing in an efficient manner.
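As a hedged illustration of what such automation might look like, the sketch below uses the open-source sentinelsat client to query the Copernicus Open Access Hub for Sentinel-1 scenes over an area of interest. The credentials, the GeoJSON file and the date window are placeholders, and the article itself does not prescribe this particular tool.

```python
# Minimal sketch of automated Sentinel-1 data discovery via the Copernicus
# Open Access Hub OpenSearch API, using the open-source sentinelsat client.
# Credentials, the GeoJSON area of interest and the date window are placeholders.
from sentinelsat import SentinelAPI, read_geojson, geojson_to_wkt

api = SentinelAPI("username", "password", "https://scihub.copernicus.eu/dhus")
footprint = geojson_to_wkt(read_geojson("area_of_interest.geojson"))

# Ground Range Detected (GRD) products are a common starting point for flood
# mapping; IW is the default acquisition mode over land.
products = api.query(
    footprint,
    date=("20150701", "20150930"),
    platformname="Sentinel-1",
    producttype="GRD",
    sensoroperationalmode="IW",
)
api.download_all(products)
```

A script like this can be scheduled to run after every new acquisition, which is what makes continuous, large-scale monitoring practical.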
When it comes to floods, the detection of water bodies from SAR data typically exploits the unique characteristics that water exhibits in the microwave spectrum. Calm, open water surfaces act as mirror-like reflectors and bounce incoming radiation away from the sensor, thus producing characteristically low backscatter measurements. Nevertheless, SAR data is not easily managed without strong knowledge of the system: many satellite parameters and processing steps must be taken into account to fully understand the image and to apply the correct processing chain for extracting a reliable flood footprint. One fundamental step is the delineation of ‘permanent water’ (e.g. rivers, lakes). A typical case of permanent water in South East Asia is rice paddies. Thanks to a time series of SAR images, it is possible to identify them easily! Paddy fields as seen by SAR are shown in the figure below, in a time-lapse from January to December. The surface water that characterizes these crops results in low backscatter at the SAR sensor (i.e. darker pixels).
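As a concrete, hedged illustration of this principle, the minimal sketch below thresholds a calibrated backscatter image (sigma nought in dB) to flag water, derives permanent water from a yearly stack of scenes, and keeps only the "new" water as the flood footprint. The file name, the -17 dB threshold, the 80% frequency cut-off and the use of numpy/rasterio are assumptions for illustration, not a description of any specific operational chain.

```python
# Minimal sketch: threshold-based water detection on calibrated SAR backscatter.
# The -17 dB threshold, the 80% permanent-water cut-off and the file name are
# illustrative assumptions, not a prescribed operational chain.
import numpy as np
import rasterio

WATER_THRESHOLD_DB = -17.0  # calm, open water is much darker than land

def water_mask(sigma0_db):
    """Boolean mask of pixels likely covered by calm, open water."""
    return sigma0_db < WATER_THRESHOLD_DB

def permanent_water_mask(stack_db, min_fraction=0.8):
    """Pixels that appear as water in most scenes of a yearly stack
    (rivers, lakes, paddies under water for much of the year)."""
    water_frequency = (stack_db < WATER_THRESHOLD_DB).mean(axis=0)
    return water_frequency >= min_fraction

def flood_footprint(event_db, stack_db):
    """Water present during the event but not classified as permanent."""
    return water_mask(event_db) & ~permanent_water_mask(stack_db)

# Example usage with a (hypothetical) calibrated, terrain-corrected GeoTIFF:
with rasterio.open("S1_event_sigma0_db.tif") as src:
    event_scene = src.read(1).astype("float32")
```

In practice the chain also involves speckle filtering, terrain correction and the handling of look-alikes (e.g. smooth tarmac, radar shadow), which is where the "strong knowledge of the system" mentioned above really matters.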
Ticinum Aerospace flood risk and large-scale analysis
The team at Ticinum Aerospace has developed automatic and semi-automatic methods for delineating flood footprints and monitoring water by means of SAR images. An example of flood footprint extraction was carried out for a region of Myanmar, and the resulting map is shown in the following figure.
The event in question is a flood that occurred between July and September 2015. It affected more than 1,000,000 people and resulted in around 100 casualties. The flood map can also be combined with an urban density layer, itself extracted from Earth-observation satellites, to derive additional information about loss values.
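As a rough, assumption-laden sketch of that last step, the snippet below crosses a flood footprint with an urban density raster, here taken to hold people per pixel (a hypothetical unit), to estimate flooded area and exposed population. Real loss estimation of course relies on far richer exposure, vulnerability and value models than this.

```python
# Hedged sketch: crossing a flood footprint with an urban density layer.
# The people-per-pixel unit is an assumption; the 20 m pixel size matches the
# Sentinel-1 resolution cited above. Real loss models use richer exposure data.
import numpy as np

PIXEL_SIZE_M = 20.0

def flooded_area_km2(footprint):
    """Total flooded area, assuming square pixels of PIXEL_SIZE_M per side."""
    return footprint.sum() * (PIXEL_SIZE_M ** 2) / 1e6

def exposed_population(footprint, people_per_pixel):
    """Sum the population of pixels falling inside the flood footprint."""
    return float(people_per_pixel[footprint].sum())
```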
Currently, the Sentinel-1 constellation's revisit time of about three days is a limitation when monitoring flood events, especially flash floods. In order to increase the temporal resolution, Ticinum Aerospace has already established close links with ‘IceEye’, a Finnish company developing a new SAR microsatellite constellation. The new set of satellites will offer a much higher revisit frequency than the current standard and will ensure exceptionally good real-time coverage for global water and flood monitoring. Moreover, in the near future, more private actors will enter the spaceborne SAR business, further enlarging our opportunities to track flood events effectively. One of the expected newcomers is ‘Capella Space’, which promises a SAR constellation capable of acquiring a new image every hour over any given place on Earth.
Ultimately, the new SAR data represents a further opportunity to produce better, quicker results and, above all, to increase the observation frequency towards a real-time environment. For global insurance and reinsurance companies, access to this level of detail can improve large-scale exposure assessment, enhance risk and pricing models, and speed up claims management and response times. It’s a great time to explore how new data can support these initiatives!