Title: New techniques in sediment core analysis: an introduction
Publication Type: Book Chapter
Year of Publication: 2006
Authors: Rothwell, G, Rack, FR
Book Title: New Techniques in Sediment Core Analysis. Geological Society, London, Special Publications
Marine sediment cores are the fundamental source of information on seabed character, depositional history and environmental change. They provide raw data for a wide range of research, including studies of global climate change, palaeoceanography, slope stability, oil exploration, pollution assessment and control, and sea-floor surveys for laying cables and pipelines and for siting sea-floor structures. During the last three decades, a varied suite of new technologies has been developed to analyse cores, often non-destructively, producing high-quality, closely spaced, co-located downcore measurements that characterize sediment physical properties, geochemistry and composition in unprecedented detail. Distributions of a variety of palaeoenvironmentally significant proxies can now be logged at decadal and, in some cases, even annual or subannual scales, allowing detailed insights into the history of climate and associated environmental change. These advances have had a profound effect on many aspects of the Earth Sciences, particularly palaeoceanography. In this chapter, we review recent advances in analytical and logging technology and their application to the analysis of sediment cores. We also discuss developments in providing access to core data and associated datasets, and in data-mining technology, which allow new and legacy datasets to be integrated and interpreted within the wider context of sea-floor studies. Despite the great advances in this field, however, challenges remain, particularly in developing standard measurement and calibration methodologies and robust methods of data analysis. New data visualization tools and techniques are needed to optimize the interpretation process and maximize scientific value, and enhanced collaborative environments and tools are needed to capitalize on our capacity to analyse and interpret large, multi-parameter datasets.
Sophisticated yet simple-to-use searchable Internet databases, with universal access and secure long-term funding, together with data products supporting user-defined data-mining queries and display, so far pioneered in the USA and Australia, provide robust models for efficient and effective core data stewardship.