GLOSS Sea Level Applications

Sea level is one of the most useful oceanographic parameters. Sea-level data are vital to scientists for studies of fluctuations in major ocean currents and global climate change, to engineers for the design of coastal installations, to a large community engaged in what is now called ‘operational oceanography’ (e.g. the provision of flood warnings from storm surges or tsunamis), and in local applications such as provision of tide tables for port operations. Some of these different applications are described briefly below.

The applications can differ in their requirements for accuracy, frequency and latency of sea-level data. However, the IOC Manuals [IOC, 2006] have made it clear that a suitable selection of gauge and telemetry hardware can be made for any particular location, such that the requirements of all applications can be accommodated. As a consequence, the number of users of the sea-level data will be as large as possible, and the stakeholder support for the long-term maintenance of the installation will be maximised. Moreover, there could even be cost savings arising from the use of a smaller number of high-quality sensors suitable for all applications, instead of deployment of multiple instruments for separate purposes.

Global sea level is believed to have been rising at a rate of approximately 3 mm/year since 1993, although there are large regional departures from this global-average value [Bindoff et al., 2007]. A sea level record from a single gauge cannot be expected to show such an increase within a single year (the accuracy of an annual mean of sea level from a scientific-quality gauge being centimetric or, in some cases, sub-centimetric). However, over an extended period the gauge should be capable of demonstrating whether the global-average rise applies to that particular location. The requirements for data from such a gauge therefore concern accuracy (i.e. datum control) rather than frequency and latency.
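
As a minimal illustration of why decades of data are needed, the sketch below fits a least-squares trend to annual mean sea levels and estimates its standard error. The annual means are synthetic stand-ins for a real record, and all numbers are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: least-squares trend from annual mean sea levels.
# Synthetic record: a 3 mm/yr rise plus 20 mm of interannual noise,
# roughly illustrating why decades of data are needed to resolve
# a millimetric trend against centimetric year-to-year variability.
rng = np.random.default_rng(0)
years = np.arange(1960, 2011)
annual_means_mm = 3.0 * (years - years[0]) + rng.normal(0.0, 20.0, years.size)

# Fit h(t) = a*t + b; the slope 'a' is the relative sea level trend in mm/yr.
slope, intercept = np.polyfit(years, annual_means_mm, 1)

# Standard error of the slope from the fit residuals.
residuals = annual_means_mm - (slope * years + intercept)
n = years.size
se = np.sqrt(np.sum(residuals**2) / (n - 2) / np.sum((years - years.mean())**2))
print(f"trend = {slope:.2f} +/- {se:.2f} mm/yr")
```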

The PSMSL and BODC hold what are called ‘delayed mode’ data, which in principle have been quality controlled and are the most suitable for a wide range of scientific research, of which sea level rise is one example. Once again, in these applications accuracy is the major consideration.

Sea level data are also used by oceanographers to monitor fluctuations in ocean currents, since changes in currents modify the topography of the sea surface. Particularly important are pairs of gauges on opposite sides of straits, which allow flows through these restricted passages to be monitored; the spatial and temporal sampling of the alternative technique of satellite altimetry is not optimal for such monitoring. Another example is the measurement of sea level for monitoring coastal upwelling [Aman and Testut, 2003]. Requirements for high accuracy are as important in these oceanographic applications as for sea level rise, but with the additional requirement of near-real time reporting (data delivered to a centre within, e.g., one hour).
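
For orientation, the cross-strait sea level difference Δη is related to the along-strait surface geostrophic velocity roughly as v = gΔη/(fL), where f is the Coriolis parameter and L the strait width. The sketch below evaluates this relation with purely illustrative numbers; the latitude, width and sea level difference are assumptions, not values for any real strait.

```python
import math

# Rough geostrophic estimate of along-strait surface velocity from the
# sea level difference between two gauges on opposite sides of a strait:
#   v = g * d_eta / (f * L)
# All numbers below are illustrative, not taken from any real strait.
g = 9.81                      # gravitational acceleration, m/s^2
lat_deg = 36.0                # latitude of the strait (assumed)
omega = 7.2921e-5             # Earth's rotation rate, rad/s
f = 2.0 * omega * math.sin(math.radians(lat_deg))  # Coriolis parameter
L = 20e3                      # strait width, m (assumed)
d_eta = 0.05                  # cross-strait sea level difference, m (assumed)

v = g * d_eta / (f * L)
print(f"geostrophic surface velocity ~ {v:.2f} m/s")
```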

Near-real time (or 'fast') reporting from gauges of comparably high accuracy is also necessary if data are to be used in applications such as the calibration of space-based altimeters [Mitchum, 2000]. Such calibrations must be performed on a timescale comparable to the provision of altimeter data (e.g. days to weeks) if the calibrated altimetric product is to be useful for operational monitoring of the large-scale ocean circulation.
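
A rough sketch of the underlying idea, in the spirit of Mitchum [2000]: the apparent drift of the altimeter is estimated from the trend of the difference between collocated altimeter and tide gauge sea levels. The difference series below is synthetic, with an assumed drift of 0.5 mm/yr.

```python
import numpy as np

# Synthetic altimeter-minus-gauge differences: a 0.5 mm/yr drift (assumed)
# plus 5 mm of noise.  The fitted trend recovers the apparent drift.
t_years = np.arange(0, 10, 0.1)
diff_mm = 0.5 * t_years + np.random.default_rng(2).normal(0.0, 5.0, t_years.size)
drift_mm_per_yr = np.polyfit(t_years, diff_mm, 1)[0]
print(f"apparent altimeter drift ~ {drift_mm_per_yr:.2f} mm/yr")
```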

Estimates of the frequency of extreme coastal sea levels are required by engineers for the design of sea defences and other coastal infrastructure, and by insurance agencies to assess flood risk. Such studies require at least several years of delayed-mode sea-level data (depending on the technique used; see Pugh [1987], chapter 8) with hourly, or ideally more frequent, sampling. The data used in these applications should, as always, be as accurate as possible, although slightly lower accuracy standards can be tolerated.
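
One of several techniques discussed by Pugh [1987] is to fit an extreme-value distribution to annual maxima. The sketch below uses a Gumbel fit, with synthetic maxima standing in for values extracted from hourly records; the location and scale parameters are illustrative assumptions.

```python
import numpy as np
from scipy import stats

# Sketch of a return-level estimate using the annual-maxima method.
# Synthetic annual maximum sea levels (m); in practice these would be
# extracted from several decades of hourly observations.
rng = np.random.default_rng(1)
annual_maxima_m = stats.gumbel_r.rvs(loc=2.5, scale=0.15, size=30,
                                     random_state=rng)

# Fit a Gumbel distribution and read off e.g. the 50-year return level,
# i.e. the level with an annual exceedance probability of 1/50.
loc, scale = stats.gumbel_r.fit(annual_maxima_m)
return_period = 50.0
level = stats.gumbel_r.ppf(1.0 - 1.0 / return_period, loc=loc, scale=scale)
print(f"estimated {return_period:.0f}-year return level: {level:.2f} m")
```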

The floods in the UK and the Netherlands of 1953 might be said to have marked the start of ‘operational oceanography’ in Europe. In the following years, expanded tide gauge networks were established and real-time telemetry was developed, such that information on surge levels across the continental shelf could be used in combination with tide-surge numerical models to provide flood warnings several days ahead. Nowadays, many countries have similar flood forecast schemes [Flather, 2000; Alvarez Fanjul et al., 2000; Woodworth and Horsburgh, 2011]. Near-real time sea level data can also be assimilated into other types of ocean models, providing information on ecosystems and water quality. The requirements in these applications are lower for accuracy (several cm is perhaps adequate, i.e. comparable to or better than surge model precision), with sampling at least hourly and latency typically within one hour (i.e. shorter than the timescale of surge development). It is a regrettable fact that many major population centres in developing countries still do not possess effective warning systems; a list of the main schemes in each region is included in Table 7.1 of Woodworth and Horsburgh [2011].
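
In such schemes the surge signal is usually isolated as the non-tidal residual, i.e. the observed level minus the astronomical tidal prediction. A minimal sketch, with illustrative values only:

```python
import numpy as np

def non_tidal_residual(observed_m, predicted_tide_m):
    """Surge (non-tidal residual): observed sea level minus the
    astronomical tidal prediction, as equally sampled arrays in metres."""
    return np.asarray(observed_m) - np.asarray(predicted_tide_m)

# Illustrative hourly values (assumed, not real data): a growing surge
# superimposed on the predicted tide shows up clearly in the residual.
observed = [1.02, 1.35, 1.71, 1.88]
predicted = [0.98, 1.05, 1.28, 1.45]
print(non_tidal_residual(observed, predicted))  # surge in metres
```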

A more extreme example of operational oceanography concerns tsunami monitoring. Following an alert of a possible tsunami based on seismic information, tide gauge data can be used to confirm the existence of a real tsunami, or to cancel the alert if none is observed. Tide gauge data are necessarily of less relevance to warnings in their own vicinity than to locations further along the tsunami’s path, since the wave has already arrived by the time the gauge records it. Requirements in this example are less for accuracy and datum control than for frequency (e.g. 20-second sampling) and especially latency. The latter has stimulated research into new methods for transmitting tide gauge information to centres as fast as possible (e.g. Holgate et al. [2008]) and the implementation of automatic tsunami detection algorithms [Pérez et al., 2009].
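
Detection algorithms of the kind described by Pérez et al. [2009] typically flag rapid departures of the sea level signal from a short-term background average. The sketch below shows the basic idea only; the window length and threshold are illustrative assumptions, not the tuned values of any published algorithm.

```python
import numpy as np

def tsunami_flag(levels_m, window=30, threshold_m=0.3):
    """Flag samples whose deviation from the mean of the preceding
    'window' samples exceeds 'threshold_m'.  With 20 s sampling,
    window=30 corresponds to a 10-minute background average, over
    which the tide is assumed to vary slowly.  Both parameters are
    illustrative and would be tuned per station."""
    x = np.asarray(levels_m, dtype=float)
    flags = np.zeros(x.size, dtype=bool)
    for i in range(window, x.size):
        background = x[i - window:i].mean()
        flags[i] = abs(x[i] - background) > threshold_m
    return flags
```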

Another important requirement for the use of tide gauge data for assimilation into models or for warnings is the implementation of automatic quality control procedures in near-real time, in order to prevent erroneous values from entering the operational systems. The GLOSS Quality Control document, based on previous European Sea Level Service (ESEAS) work, reflects this need. In Europe, a sea level near-real time quality control product has been approved and implemented within the MyOcean project [Pérez et al., 2010; Pouliquen et al., 2011].
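
Typical near-real time checks include range, spike and stuck-value tests. The sketch below illustrates these three; all thresholds are assumptions that would be tuned per station, and operational systems apply further tests.

```python
import numpy as np

def qc_flags(levels_m, min_m=-5.0, max_m=5.0, spike_m=0.5, stuck_n=10):
    """Simple near-real time quality-control flags (True = suspect):
    out-of-range values, spikes relative to neighbouring samples, and
    'stuck' sensors repeating one value.  All thresholds are
    illustrative assumptions."""
    x = np.asarray(levels_m, dtype=float)
    out_of_range = (x < min_m) | (x > max_m)            # range check
    spikes = np.zeros_like(out_of_range)                # spike check:
    spikes[1:-1] = np.abs(x[1:-1] - 0.5 * (x[:-2] + x[2:])) > spike_m
    stuck = np.zeros_like(out_of_range)                 # stuck-value check
    for i in range(stuck_n, x.size):
        stuck[i] = np.all(x[i - stuck_n:i + 1] == x[i])
    return out_of_range | spikes | stuck
```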

Historically, many national datum levels for land surveys have been based on measurements of mean sea level over some defined period. These levels are often used to define state and national boundaries, for example as specified in the United Nations Convention on the Law of the Sea. Low water levels are used as the datum for tidal predictions and as the datum level in hydrographic charts. Determining these datums requires several years of data, as for the determination of extreme levels.

Tide tables have been the major product of sea level measurements since the first recordings in the 17th and 18th centuries [Cartwright, 1999]. They are used extensively by port operators, fishermen and indeed anyone who uses or enjoys the coast. Although in principle 18.6 years of data (one lunar nodal cycle) are needed to produce the best ‘tidal constants’ employed to make tidal predictions, adequate constants can be derived from one year of data, or sometimes less. For example, tide tables have been amongst the first demonstrable products of the recent ODINAFRICA III project, which has included several new installations [Woodworth et al., 2007]. Long tide gauge records are also useful for monitoring changes in tidal constituents over time, due to both natural and human causes (e.g. dredging, harbour and coastal construction).
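
Tidal predictions are formed from the harmonic constants as h(t) = Z0 + Σ_i H_i cos(ω_i t − g_i), where Z0 is mean sea level and each constituent i has amplitude H_i, speed ω_i and phase lag g_i. The sketch below evaluates this sum for two constituents; the M2 and S2 speeds are the standard astronomical values, while the amplitudes and phase lags are assumptions, not real constants for any port.

```python
import numpy as np

def tidal_prediction(t_hours, z0_m, constituents):
    """Evaluate h(t) = Z0 + sum_i H_i * cos(omega_i * t - g_i) from
    amplitudes H (m), speeds omega (degrees/hour) and phase lags g
    (degrees), with time t in hours."""
    t = np.asarray(t_hours, dtype=float)
    h = np.full_like(t, z0_m)
    for H, speed_deg_per_hr, g_deg in constituents:
        h += H * np.cos(np.radians(speed_deg_per_hr * t - g_deg))
    return h

# Standard speeds for M2 and S2; amplitudes and phases are illustrative
# assumptions, not derived from any real harmonic analysis.
constituents = [(1.20, 28.9841042, 120.0),   # M2
                (0.40, 30.0000000, 150.0)]   # S2
print(tidal_prediction([0, 1, 2, 3], 2.0, constituents))
```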

Nevertheless, port operations will always benefit from real-time display of sea level instead of, or alongside, tidal predictions. Consequently, port operations might also be considered part of operational oceanography. Similar considerations apply to the operation of sluices and barrages.

It is important to emphasize that the uses of sea level data for science and for practical purposes are interdependent. For example, knowledge of long-term relative sea level rise needs to be input into the engineering design of coastal structures, many of which will have a lifetime of many decades or a century. Another example is that access to the real-time data needed for operational oceanography will tend to result eventually in higher-quality delayed-mode information, as faults can be identified and remedied immediately. Consequently, it is worth repeating the point made above: a tide gauge installation is most efficiently installed and maintained if all applications are considered together.

In many of these applications the rapid exchange of reliable data, nationally, regionally and even globally, can increase the value of the work. This exchange of data and expertise is something which GLOSS can actively encourage.