Frances Boyle (Executive Director of the Digital Preservation Coalition), who took part in the panel discussion at the LIFE2 Conference, has written up some key points from the conference.
A few thoughts on the recent LIFE2 conference
Frances Boyle, DPC
Money matters and all things economic are hitting the headlines both in the real world and in that most glamorous of niches, the digital preservation community. In recent months a number of reports have been published looking at particular aspects of digital preservation costs (e.g. the JISC LOCKSS & Research Data reports). It was against this backdrop that the timely LIFE2 conference was held at the BL last week.
The initial sessions, presented by members of the project team, outlined the roadmap of the LIFE project in its various incarnations. The team had clearly engaged with the community throughout the project phases, with the dialogue resulting in refinements to the model. Elements which had changed included the addition of a distinct metadata component – something which engendered debate throughout the day. Another refreshing aspect of the work was the extent to which lessons learned from case studies had been used to tweak and improve the model.
The keynote speaker was the enigmatic Paul Courant (Dean of Libraries, University of Michigan), who addressed the most fundamental of issues - what are the benefits of digital preservation and why should we be spending money on this? The benefits identified were efficacy, accessibility and reliability – i.e. you know it is there, you can find it and you can use it – that perfect information retrieval moment! In his view the easy stuff to deal with is text, e.g. output from digitisation projects, journals etc; the harder stuff is multimedia, material with embedded functionality and links etc; while the hardest stuff of all is the cultural record itself.
Again getting to the crux of the matter, Paul raised the tricky question - who will pay for universal access for all when there is no palpable return on the investment?
As is often the case when interested parties from the community gather in an orderly fashion, the value and the benefit to both project members and the gathered audience was realised in the interactive sessions sprinkled throughout the day. LIFE2 is a project which addresses practical generic issues which resonate not only with the digital preservation community but also with the ‘powers that be’ who occupy the surrounding space and who ultimately pick up the bills. Indeed, the expressed aim of the original LIFE project was ‘to make a major contribution in understanding the long-term costs of digital preservation’. This deceptively simple aim remains a ‘biggie’ in the digital preservation space.
Some take-home messages from the day:
- Progress had been made towards a better understanding of the costing model components.
- The model was workable in real libraries and archives. It was preservation strategy neutral and provided a comprehensible checklist – both important for practitioners.
- At the methodology level it was mooted, in my view persuasively, that some level of discounting should be included in the model.
- Roles & responsibilities – a recurrent theme which surfaces at many digital preservation events. Perhaps this is indicative that digital preservation is a relatively new area which is not only in transition itself as a discipline but also impacts on organisations which are themselves in a state of flux.
- The model’s flexible approach in isolating and articulating the elements and sub-elements would enable practitioners to judge where spend is best directed to fit their particular needs. For example, decreasing the number of file formats received at ingest would reduce the cost of future technology watch activities.
- How might consideration and management of significant properties be balanced against efficiency and cost savings? How might organisations achieve this balance?
- An obvious issue which is not always taken into account is that digital preservation costs vary according to when in the lifecycle they are assessed. Retrospective costings add complications to the process.
- Looking specifically at research data costs, issues to consider include economies of scale, first-mover innovation costs and efficiency curve effects.
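The discounting suggestion above lends itself to a quick illustration. The sketch below is a minimal net-present-value calculation in Python; the flat annual storage cost and the 3.5% discount rate are purely hypothetical figures of my own, not numbers from the LIFE2 model.

```python
def discounted_cost(annual_costs, rate):
    """Net present value of a stream of future preservation costs.

    annual_costs: cost incurred in each future year, year 0 first.
    rate: annual discount rate, e.g. 0.035 for 3.5% (hypothetical here).
    """
    return sum(cost / (1 + rate) ** year
               for year, cost in enumerate(annual_costs))

# Ten years of a flat 1,000-per-year storage cost, discounted at 3.5%.
# The discounted total comes out noticeably below the undiscounted 10,000.
flat = [1000] * 10
print(f"Undiscounted: {sum(flat)}  Discounted: {discounted_cost(flat, 0.035):.2f}")
```

The point the discounting advocates were making is visible immediately: a cost incurred in year nine weighs less in today's terms than the same cost incurred at ingest, so a lifecycle costing that ignores discounting overstates the burden of far-future preservation actions.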
Tools that were discussed which would help back at the ranch:
- A predictive costing tool.
- Tools which would allow institutions to compare the cost of doing DP in-house with the cost of a shared service, and in turn with the cost of outsourcing to a third-party service. Indeed LIFE2 is in a great position to take this on by extending some of the case studies.
- Scenario-building examples which would allow the LIFE2 model to be contextualised for an institution’s own business priorities.
Model refinements suggested during the day included:
- Mapping to OAIS terminology. In this jargon-packed field it would improve clarity and embed the proposed models in the ‘lingua preservation’.
- Assign metadata costs to the functions they relate to, rather than treating metadata as a discrete stage as it currently stands.
- Call for more case studies – and indeed some thought on how these might be best packaged and presented to the community.
- Add elements for appraisal, deselection & disposal activities.
- An indication of the frequency of preservation actions would help to specify their costs.
- Express the model in terms of activity-based costing and consider using FEC (full economic costing).
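To make the activity-based costing suggestion concrete, here is a minimal sketch of the idea: cost each lifecycle activity separately, then roll the per-object figures up for a collection. The stage names and all the figures are hypothetical illustrations of mine, not LIFE2 elements or data.

```python
# Activity-based costing sketch: per-object cost attributed to each
# lifecycle activity. Stage names and figures are hypothetical, chosen
# only to show the roll-up; they are not drawn from the LIFE2 model.
per_object_cost = {
    "acquisition": 2.50,
    "ingest": 1.75,
    "metadata": 3.00,
    "bit storage (per year)": 0.40,
    "preservation action": 1.20,
    "access": 0.60,
}

objects = 10_000
total = sum(per_object_cost.values()) * objects

for activity, cost in per_object_cost.items():
    print(f"{activity:>25}: {cost * objects:>10,.2f}")
print(f"{'lifecycle total':>25}: {total:>10,.2f}")
```

Breaking the total down by activity is exactly what makes the trade-offs discussed during the day visible - e.g. whether heavier spend on metadata at ingest buys cheaper preservation actions later.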
Thoughts which will linger, a few days after the event, are that we shouldn’t forget why we’re interested in this – we need to consider (at regular intervals) the value of the assets, material, stuff, or whatever we might want to call it. Without recognition of that value, how can we make informed cost-benefit analyses, opportunity cost calculations or indeed business cases? How we measure value and impact is perhaps another area which might be of interest to project funders.
Extending the LIFE2 model might also afford opportunities to develop material for organisations to benchmark their activity – the costs of specific components within the LIFE2 model will vary enormously by organisation, collection, activity aims and priorities etc. A set of ‘typical’ models reflecting a range of scenarios would provide a useful gauge for organisations. Is there any mileage in suggesting that a transparent costing model should be part of any trustworthy digital repository scheme? I would certainly welcome piloting the model with a digital preservation service.
So an interesting event and a project which will hopefully go on to develop further tools within the LIFE portfolio.