Last week I attended a workshop on Research Data Management (RDM) run by the Digital Curation Centre (DCC) at Warwick. The following post summarises my notes, giving some background on RDM and introducing some of the DCC’s tools for assessing current practice at your institution.
What is research data management (RDM)?
Research data management is the collection and preservation of research data. It encompasses a range of practices, from file-naming conventions and creating local back-ups to archiving data and making it available through a repository. It also takes into account issues surrounding the long-term accessibility and value of data.
Why is RDM important?
The size and nature of the data used and generated in modern research mean that it can no longer be presented alongside its analysis in, for example, a journal article. We therefore need new ways to make data accessible and to link it to the associated analysis.
In addition, funders are increasingly setting data management conditions on the research grants that they award. The DCC provides an overview of the data policies of funding organisations on its website.
Whose responsibility is RDM?
The short answer is everyone’s. Below I’ve tried to match responsibilities to the key groups involved; there is obviously overlap, and each group can provide input in all areas:
- Senior management; policy, strategy and best practice
- Support staff; resourcing, training and infrastructure
- Researchers; documentation, metadata and best practice
How prepared is your institution?
The DCC provides support and resources to assess RDM provision and practice at your institution. Below are outlines of the two tools presented to us at the workshop:
Data Asset Framework (DAF)
DAF is a methodology and online tool that can be used to identify the broad challenges of RDM at your institution.
There are four steps to any DAF implementation:
- Planning the audit
- Identifying and classifying assets
- Assessing management of datasets
- Reporting and recommendations
Pilots have been run at the University of Edinburgh, Imperial College London, UCL and King’s College London. Reports of each pilot are available on the JISC pilot project’s website.
Information for the audit can be gathered through a variety of means:
- Desk research
- Focus groups
Throughout the audit, senior management, support staff and researchers should all be consulted. The questions asked should cover the following topics, but should not seek to go into any great detail at this stage:
- Creation process
Collaborative Assessment of Research Data Infrastructures and Objectives (CARDIO)
CARDIO can be used to identify existing RDM infrastructure and future requirements. It looks into three distinct areas:
- Organisation; policy, legal issues, transparency
- Resources; staffing, finance, risk
- Technology; storage, data validation, security
Much like DAF, CARDIO is a multi-stage process:
- Survey of various facets of RDM
- Completed data presented back to the group for review, update and consensus-building
- Produce plan to improve
The full version of CARDIO assesses current capabilities and capacities across 30 organisational, resourcing and technological facets. For each, respondents are asked to rate their institution on a maturity scale of 1-5, with descriptive text provided as guidance.
CARDIO can also be implemented in a reduced or modified format, for example:
- as an online quiz giving a quick snapshot of how an institution is currently performing
- as a lite version in which fewer facets are assessed in each area
- with the maturity scale replaced by a Likert scale
CARDIO seeks to establish an agreed view among the group of participants. Individuals are first asked to complete the survey; the results are then discussed and a consensus formed. This process highlights successes and shortcomings, and informs and directs investment.
More information about the tool can be found on the CARDIO pages of the DCC website.