Quality control and pre-analysis treatment of 5-year long environmental datasets collected by an Internet Operated Deep-sea Crawler
- Damianos Chatzievangelou, Jacopo Aguzzi, Laurenz Thomsen
- Abstract:
- As technological advances nowadays allow for long-term, high-frequency deep-sea monitoring studies, the collected datasets are increasing in size and diversity. Consequently, along with the need for larger-scale data management, questions of standardized data collection and treatment, and of comparability between datasets from distinct sources, are being raised. This study presents examples of the data treatment steps followed to ensure that the datasets collected over a period of 5 years by the Internet Operated Deep-sea Crawler "Wally" meet high quality standards and are adequate for the production of reliable results for the monitoring of the Barkley Canyon methane hydrates site, off Vancouver Island (BC, Canada). In addition to internationally established automated procedures, different standardizing, normalizing and detrending methods can be used on a case-by-case basis, depending on the nature of the treated oceanographic variable and the range and scale of the values provided by each sensor.
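- The paper itself does not publish code, but the standardizing and detrending steps it mentions can be sketched generically. The following is a minimal illustration (not the authors' implementation), assuming a 1-D time series of sensor readings; the function names and the synthetic temperature series are hypothetical:

```python
import numpy as np

def standardize(x):
    """Z-score standardization: rescale a series to zero mean, unit variance,
    so values from sensors with different ranges become comparable."""
    x = np.asarray(x, dtype=float)
    return (x - x.mean()) / x.std()

def detrend_linear(x):
    """Remove a least-squares linear trend (e.g. slow sensor drift)
    from a 1-D series, returning the residuals."""
    x = np.asarray(x, dtype=float)
    t = np.arange(x.size)
    slope, intercept = np.polyfit(t, x, 1)  # fit x ~ slope*t + intercept
    return x - (slope * t + intercept)

# Hypothetical example: a temperature record with a slow drift plus noise
rng = np.random.default_rng(0)
temp = 4.0 + 0.001 * np.arange(1000) + rng.normal(0.0, 0.05, 1000)
z = standardize(temp)          # dimensionless, zero mean, unit variance
resid = detrend_linear(temp)   # drift removed, fluctuations preserved
```

Which transformation is appropriate depends, as the abstract notes, on the variable and sensor in question; standardization suits cross-sensor comparison, while detrending isolates short-term variability from long-term drift.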
- Download:
- IMEKO-TC19-METROSEA-2019-30.pdf
- DOI:
- -
- Event details
- IMEKO TC:
- TC19
- Event name:
- MetroSea 2019
- Title:
- TC19 International Workshop on Metrology for the Sea
- Place:
- Genoa, Italy
- Time:
- 03 October 2019 - 05 October 2019
- Event details
- IMEKO TC:
- TC8
- Event name:
- Special session at MetroSea 2019
- Title:
- TC19 International Workshop on Metrology for the Sea
- Place:
- Genoa, Italy
- Time:
- 03 October 2019 - 05 October 2019