Data integrity assessment, or quality control, involves deciding whether a particular feature in a measured or modelled data set is genuine or an artefact. Such decisions can have considerable engineering implications. An informed decision relies upon a detailed understanding of the data source, plus the metocean processes at the site in question.

Cases arise in which there is insufficient information to make an objective data integrity decision. A sound understanding of the end user's application is then required to make a pragmatic, informed choice. Assumptions and uncertainties must be clearly communicated to ensure the data are used appropriately. Gus has been at the sharp end of these activities for close to 15 years, often making difficult pragmatic decisions on a daily basis.

Gus has personally undertaken intensive quality control of several large, complex datasets, some with major engineering implications. His extensive experience of solitons and squalls is frequently applied to distinguish these genuine metocean features from apparent spikes in measured data.
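The distinction above matters in practice: a naive spike filter can erase a genuine transient event. As an illustrative sketch only (Gus's actual methods are not described here, and his work is MATLAB-based), the Python function below implements a common robust despiking approach, flagging points that deviate from a rolling median by more than a threshold number of MAD-based standard deviations. The key design point is that the window must be longer than any genuine transient, such as a soliton-driven current pulse, or the feature itself will be flagged.

```python
import numpy as np

def despike_mad(x, window=11, threshold=5.0):
    """Flag samples deviating from a rolling median by more than
    `threshold` robust standard deviations (MAD-based).
    Returns a boolean mask of suspect points."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    half = window // 2
    flags = np.zeros(n, dtype=bool)
    for i in range(n):
        lo, hi = max(0, i - half), min(n, i + half + 1)
        seg = x[lo:hi]
        med = np.median(seg)
        mad = np.median(np.abs(seg - med))
        sigma = 1.4826 * mad  # MAD -> std dev, assuming Gaussian noise
        if sigma > 0 and abs(x[i] - med) > threshold * sigma:
            flags[i] = True
    return flags

# Synthetic example: a one-sample spike is flagged, but a broad
# soliton-like pulse (wider than half the window) is retained.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 0.05, 100)
x[50] += 5.0        # isolated spike
x[70:80] += 5.0     # broad transient feature
flags = despike_mad(x)
```

Because the rolling median tracks the pulse level inside the broad feature, only the isolated spike exceeds the robust threshold; shrinking the window or lengthening the spike would change that, which is exactly where expert judgement enters.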

Oceanalysis is undertaken primarily in the MATLAB environment, under a rigorous digital workflow system. This ensures critical data manipulation is fully traceable and can be efficiently reproduced following changes at any stage of the analysis. A proprietary toolbox is under continuous development to deliver generic functionality, including operational statistics and extreme value analysis.
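The proprietary toolbox itself is not described here, but a generic extreme value calculation of the kind mentioned can be sketched as follows (a hedged Python illustration, not the toolbox's method): fit a Gumbel distribution to annual maxima by the method of moments, then read off the return value exceeded with probability 1/T in any one year.

```python
import numpy as np

def gumbel_return_value(annual_maxima, return_period):
    """Method-of-moments Gumbel fit to annual maxima, then the
    return value exceeded with probability 1/return_period per year."""
    maxima = np.asarray(annual_maxima, dtype=float)
    # Moment estimators for the Gumbel scale and location parameters
    scale = maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = maxima.mean() - np.euler_gamma * scale
    p = 1.0 / return_period           # annual exceedance probability
    return loc - scale * np.log(-np.log(1.0 - p))

# Hypothetical annual-maximum significant wave heights (m), for illustration.
hs_annual_max = [7.0, 8.0, 9.0, 10.0, 11.0]
hs_100yr = gumbel_return_value(hs_annual_max, 100)
```

A production analysis would typically use maximum likelihood fitting, confidence intervals, and goodness-of-fit checks; the sketch above only shows the shape of the calculation.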