Allow me to chime in not from the music/performing arts side but from the archival/restoration/preservation side.
From what I see there are a few main touch points: on the software side, the program (meaning the code for a specific work) and the operating system; on the hardware side, the actual physical hardware of the computer and its interfaces (potentially custom built); then the volatility of “non-volatile memory”; and finally the longevity (“shortevity”) of data storage media, including bit rot.
Just buying many computers of a specific hardware/OS combination will not do it. The boot code a computer needs to start up and load the OS has moved from ROM to EPROM to EEPROM and now to flash-based chips, often integrated deeply with the CPU. In the old days, the boot sequence was “etched” into the physical layers of ROM - most likely the most durable way to store such information. Starting with programmable and erasable-programmable chips, things moved towards having an “expiration date” because of the physical properties of the storage, and with flash-like memory it most likely got worse. (I am not referring to solid-state mass storage with its read/write-cycle limits; I am only talking about the chips holding the initial boot sequence.)
Manufacturers give these solid-state chips a life-span “guarantee” of only about 10 years. That means a computer that has been sitting on a shelf for 15 years is not guaranteed even to start up. This is certainly also true for all other digital devices - DVD/Blu-ray players, video monitors, etc. And this is independent of soldering issues, capacitors giving up, and so on; I am just talking about getting the machine to boot and then take the next steps.
So if one thought (like me, as I did...) that buying a stack of computers and a stack of optical players would allow retrieving data for the next 30-50 years - well, that does not seem to be the case. I have not yet been able to find information on flash memory: whether the cause of its losing data (the charge at the floating gates leaking away) can be reset by reprogramming the chip, or whether the electrical properties cannot be “flushed”. If the gates can indeed be reset, there would be a way to cycle through all hardware and refresh them, say, every 8-10 years. But I have not been able to confirm this.
Allow me to point out that I am speaking only of “bits” on the physical level. This applies both to “stored user data” and to “programs to execute”. Currently, the only path to store such data for the “longest time” is optical storage media that do not contain readily oxidizing metals. Such discs may last 50 to 500 years (depending on whom you believe). But then you again need devices to read these discs - devices whose boot chips might in turn be prone to bit loss.
This is quite a cycle.
Considering that conventional hard discs are given a life expectancy of 5-8 years, and tape (with all its weaknesses of material, lamination, etc.) 15-20 years, the situation is indeed much worse than almost anyone wants to see.
Even if you want to rely on OS emulators and on porting the programs - this requires funds, people and continuous maintenance. And constant checking of the bits on the respective media, whether they are still “there”. Which in turn requires machines that can read the bits. Which in turn requires boot devices that are still fine.
And this is not only true for the programs that specifically drive, say, an artwork. It is also true for all the documentation you might gather in the form of text, images, video and audio. Even if you base your archive on the probably most feasible assumption that a handful of formats will always be kept accessible - because banks, the military and companies hold troves of information in these formats - the question of bit rot, of losing more bits than the error-correction algorithms can repair, is highly important.
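To make “loss of bits beyond error correction” concrete, here is a minimal sketch (my illustration, not anything from the archives discussed here) using the classic Hamming(7,4) code: a single rotted bit in a codeword is silently repaired, but two rotted bits overwhelm the code and the decoder returns wrong data without any warning.

```python
def hamming74_encode(d1, d2, d3, d4):
    """Encode 4 data bits into a 7-bit codeword (positions 1..7)."""
    p1 = d1 ^ d2 ^ d4          # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(code):
    """Correct at most one flipped bit, then return the 4 data bits."""
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3     # syndrome = 1-based error position, 0 = "clean"
    if pos:
        c[pos - 1] ^= 1            # flip the suspected bit back
    return [c[2], c[4], c[5], c[6]]

word = hamming74_encode(1, 0, 1, 1)

# one bit rots: the decoder recovers the original data
one_flip = list(word); one_flip[2] ^= 1
assert hamming74_decode(one_flip) == [1, 0, 1, 1]

# two bits rot: the decoder "corrects" the wrong bit and returns bad data
two_flips = list(word); two_flips[2] ^= 1; two_flips[4] ^= 1
assert hamming74_decode(two_flips) != [1, 0, 1, 1]
```

Real storage media use far stronger codes (e.g. Reed-Solomon or LDPC), but the principle is the same: every code has an error budget, and bits lost beyond it are lost silently.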
Even “regular scientists” (say, at the university level) do not want to be confronted with the question of how they maintain their data - the actual bits. They may point to “the cloud”, which then depends on how long your money lasts, on what happens if/when the cloud company goes bankrupt, and on whether you are fine with not “owning” the data in the cloud - and it does not solve the hardware issue: the bits need machines through which they can come back to “life”. And then there are the big exceptions like CERN - take a look at their backup system and what it costs (and again, that covers only “data”, the “bits”, not the hardware).
At this point in time, I see only one path (hopefully I will find another one next year :)
If you want to keep data in general, you have to check the media regularly and be ready to copy “before it is too late”. How often depends on the storage medium.
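In practice this regular checking is usually done with fixity checksums: record a digest for every file once, then recompute and compare on a schedule, and copy to fresh media as soon as anything mismatches. A minimal sketch of the idea (the function names and the choice of SHA-256 are mine, not a standard tool):

```python
import hashlib
import pathlib

def checksum(path, algo="sha256", chunk=1 << 20):
    """Hash a file in 1 MiB chunks so large media images fit in memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def build_manifest(root):
    """Record a checksum for every file under root (done once, at ingest)."""
    return {str(p): checksum(p)
            for p in pathlib.Path(root).rglob("*") if p.is_file()}

def verify(manifest):
    """Return the files whose bits no longer match the stored checksums."""
    return [p for p, digest in manifest.items()
            if not pathlib.Path(p).is_file() or checksum(p) != digest]
```

An empty list from `verify` means the bits are still “there”; any listed file is your signal to restore it from a second copy before that copy rots too. Dedicated tools (and formats like BagIt) do essentially this, plus scheduling and reporting.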
This requires time and money and continuity in staff and knowledge within an archive. I do not see any institution in the arts communities that has the funding and the continuous support for this (maybe Cornell's archives - see the contribution in this thread from Tim Murray - have the best chance, being integrated with a university-wide archival system).
And if you want to keep programs (generative/interactive/etc.) alive, the only chance is to have the funds and people to port them constantly, to maintain a suitable emulator environment, and to adapt everything to changes in hardware and OS technology.
We applied for a grant to create an “all-digital archive” in the late nineties - and the application was declined because, at that time, an “all-digital archive” was deemed unfeasible.
In the subsequent years and up to today, a lot of funding has been given to many initiatives, archives and institutions to create digital repositories or archives. The funding is usually given only for the first generation. And once everything is digitized or stored, there is no more funding - and say your hard discs just give up after 8.5 years… and you were sick around that time and were not able to check the bits and make a new backup.
I gave a talk at ZKM, among other places, about these issues: how the actual integrity of the bits of all their digitized and archived videos is monitored, and what the strategy for the second generation of an archive looks like. Just this small aperture into the “arts bits” - leaving out all programming/interactive/web/etc. art - might shed some light on the actual situation.
This is a long answer to why “just setting up a lab with lots of machines” will not do it. You need many experts to port, maintain, check, build, update and program. Which means a lot of money that needs to be dedicated for at least one generation - which, let's say in human terms, is 30-40 years. Which in hard-disc terms is 5 generations. Or in tape-based storage, 2 generations. Or in hardware terms, from chips to protocols - well, not quite known, but definitely more like 4 generations.
I would be very, very happy to learn if anyone else is actually considering archiving on the bit level.
From: "Curating digital art - www.crumbweb.org" <[log in to unmask]<mailto:[log in to unmask]>> on behalf of David Rokeby <[log in to unmask]<mailto:[log in to unmask]>>
Reply-To: David Rokeby <[log in to unmask]<mailto:[log in to unmask]>>
Date: Tuesday, December 19, 2017 at 10:37 AM
To: "[log in to unmask]<mailto:[log in to unmask]>" <[log in to unmask]<mailto:[log in to unmask]>>
Subject: Re: [NEW-MEDIA-CURATING] Thought on time, temporality and new media public artwork
May I ask a stupid question?
(being myself involved in programmed generative pieces and in the process of updating an interactive one)
Why is it so difficult to put financial and technical means into collecting/restoring/maintaining computers and graphical operating systems over years, decades...? Did not ZKM or the Fondation Langlois or other places start to do that?
I remember having a discussion many years ago with Jean Gagnon, who was then the founding (and only?) director of the Fondation Langlois, about this very topic. He was asking what the fondation could do to assist with the challenges of conserving new media, and stockpiling machines was one of the options he said he was considering. I do not know whether the fondation took any steps in this direction. I suspect not, and at any rate, as with most things of this sort, the fondation now exists mostly in name and website only - so what would the fate of such a collection of machines have been? It is an unsexy task to devote money to, and it needs someone with a mission, a long-term vision, and long-term funding to pull it off. Institutions are the obvious answer, but the ones that have the funding to do it are usually the last ones to embrace new media artworks.
That said, I have been talking to various major museums lately, and have found that curators are often initially hesitant about opening this can of worms (conservation of new media), but the conservators and media departments are overjoyed that the discussion is being initiated, because they are increasingly aware of looming challenges that few institutions are prepared for. There is a consciousness dawning within institutions that something needs to be done, but we have to get past the current situation where curators, dealers, collectors and artists have all looked past the conservation issue, out of the honourable desire to actually get this work into collections. This has led to a lot of vagueness about the nitty-gritty details and challenges, for fear of blowing the deal (on all sides).
135 Manning Avenue
Toronto, Ontario M6J 2K6 Canada
[log in to unmask]