How to Escape from a Digital Black Hole?


Some archivists are sure of it: to protect emails for the next 500 years, it would be best to save them on microfilm. They have little trust in digital media for anything that must be preserved for more than five years: the computer could break down, storage systems could become outdated, file formats unreadable. Their solution consists of giving a physical form to something immaterial, in order to escape a digital black hole that would plunge the world into a “digital dark age”: a near future in which entire segments of knowledge become inaccessible because both the storage media and their digital contents have become obsolete.

Science, The First Victim

Few organizations and companies feel concerned by this subject, as the public awareness project CASPAR, supported by the European Commission, made clear. Who would think that NASA, the space agency that pilots robots on Mars, could be one of the first victims of this digital dark age? At the edge of our solar system, Pioneer 10 and 11, launched in the 1970s, experienced acceleration anomalies. Program managers could not easily bring themselves to contradict the laws of physics to explain the phenomenon. The teams at the Jet Propulsion Laboratory wanted to understand what was wrong with their calculations, or with the engine design. To do so, they had to dig up the data recorded during the probes’ construction and flight. But digital obscurantism had already done its work: it was impossible to read the old punch cards or to find the plans. Considerable effort over several years was needed to recover the information buried in antiquated and sometimes defective media.

Digital Art, a Victim of Its Own Zeal

If NASA was affected, it is no surprise that other institutions have allowed part of their memory to deteriorate. Beginning with IRCAM: when the publisher of Marc-André Dalbavie’s Diadèmes (1986) wanted to revive the work for a concert in the United States, the computer music designer Serge Lemouton discovered that the Yamaha synthesizers originally used were no longer functional. Because the technological parts of the work had been built on old versions of software and on old machines, it could not be performed in its existing state, and weeks of work were needed to bring the piece up to date on modern tools. Music, like science, is outpaced by the incredible speed of digital technology.

Unreadable CD-ROMs of contemporary art installations, created with old versions of software, fill the drawers of the FRACs. A 10,000-year-old cave painting is still visible, but a work commissioned 10 years ago is barely accessible. The artist must be contacted for the source code so that it can be brought up to date on recent software; the purchase agreement does not always contain this clause, and the artist, attached to the initial form, may not want to adapt it.

The same digital quagmire can be seen in video games, an industry convinced that conserving its heritage can only come at the expense of innovation. Passionate enthusiasts have taken it upon themselves to revive these works: a French association, MO5.Com, specializes in the conservation of game titles and hardware while maintaining their original interaction. Thanks to their work, it is now possible to play Frédérick Raynal’s Alone in the Dark on a keyboard from 1992, provided you can get to one of their exhibits. It is much easier to watch a film by Marcel Carné on Blu-ray: the film industry has begun its transition to digital, but in France it is still compulsory to deposit a film reel with the CNC, where it is archived.

Good Digital Technology Practices

While digital technology poses a danger for the near future, it is still a good way to create, disseminate, and store. The problem must be addressed and careful strategies must be put in place today. The first strategy consists of programming in languages that will still be readable in the future. There will always be a clever engineer capable of waking up a square of silicon; we must ensure that the stored lines of code remain understandable and compilable. For this, nothing is better than a mature language fairly close to the machine, yet well structured. C and its descendant C++ are excellent candidates: born in the 1970s and 1980s respectively, they have weathered the technological upheavals since.
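As a minimal sketch of this idea (the file name, field names, and record layout below are invented for illustration), a portable C program relying only on the standard library can write its data as plain, human-readable text, so that both the source and its output stay legible long after the original toolchain is gone:

```c
#include <stdio.h>

/* A record we might want to preserve; the fields are hypothetical. */
struct archive_entry {
    const char *title;
    const char *author;
    int         year;
};

int main(void)
{
    /* Plain ASCII, one "key: value" pair per line: no proprietary
       container, so the file stays readable even if this program
       and its compiler disappear. */
    struct archive_entry entry = { "Diademes", "Marc-Andre Dalbavie", 1986 };
    FILE *out = fopen("entry.txt", "w");

    if (out == NULL) {
        perror("entry.txt");
        return 1;
    }

    fprintf(out, "title: %s\n",  entry.title);
    fprintf(out, "author: %s\n", entry.author);
    fprintf(out, "year: %d\n",   entry.year);

    fclose(out);
    return 0;
}
```

Nothing here depends on a particular operating system, library version, or vendor: any future C compiler should accept it, and any text editor can open what it produces.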

Concerning the documents themselves, it is essentially a question of format. Software programs die at the rhythm of their business cycles, taking their proprietary locks with them. The .doc format, long tied to Microsoft Word, is a case in point: innumerable documents have been written and saved in it, yet the Redmond company has since moved away from it because its backward compatibility is too difficult to maintain. It is therefore preferable to favor open formats with public specifications. Separating content from formatting is another avenue for preservation. This is one of the principles of LaTeX, a typesetting system highly successful with the scientific community, which values it for the transparency and functionality that come with free software.
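To illustrate that separation of content from formatting (this is a minimal, made-up document, not one drawn from the article), a LaTeX source keeps all presentation decisions in the preamble, while the body remains plain, durable text:

```latex
\documentclass{article}

% Presentation choices live here; changing them never touches the text below.
\usepackage[margin=2.5cm]{geometry}
\newcommand{\worktitle}[1]{\textit{#1}}

\begin{document}

\section{Preservation notes}

The score of \worktitle{Diad\`emes} was revived on modern tools.
The source of this document is plain text: it can be read,
compared, and recompiled decades from now.

\end{document}
```

Because the source is an ordinary text file with a public specification, it remains inspectable even if no tool is left to typeset it.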

The famous “open source” model seems to offer a kind of guarantee of continuity. Projects are carried out by large communities, knowledge is shared, and the work is accessible to everyone; a user holds all the keys to a product, which is no longer a black box. The GitHub hosting service is a favored place for discussing these programs: source code is deposited in this worldwide catalogue, on servers located all over the Web. This is the storage principle that holds everyone’s attention today, known as “the cloud”: immense data centers spread throughout the world, whose servers are linked together so they can receive and distribute data in step with the traffic.

By placing their information in such systems, users free themselves from material constraints: they simply deposit their objects on these servers via the Internet, and the hosting company looks after the maintenance of the machines, guaranteeing permanent availability. Questions of confidentiality and of the volatility of data are often raised as criticisms of this type of storage. While waiting for a more reassuring national cloud, bioinformaticians are carrying out experiments in which documents are encoded as DNA. After all, these molecular chains, the carriers of the information of living organisms, are a simple format that is easy to produce and to store. The few billion years that separate us from the first cells are a reassuring message for conservators.
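As a rough sketch of how such an encoding can work (a naive mapping invented for illustration, not the scheme used in the actual experiments, which add error correction and avoid long runs of identical bases), each byte can be rewritten as four nucleotides by mapping every pair of bits to one of the letters A, C, G, T:

```c
#include <stdio.h>
#include <string.h>

/* Naive illustration: map each 2-bit group of a byte to one base.
   Real DNA-storage schemes add redundancy and sequence constraints
   that this sketch deliberately omits. */
static const char BASES[4] = { 'A', 'C', 'G', 'T' };

/* Encode `len` bytes into `out`, which must hold 4*len + 1 chars. */
static void encode_dna(const unsigned char *data, size_t len, char *out)
{
    size_t i, j = 0;
    for (i = 0; i < len; i++) {
        out[j++] = BASES[(data[i] >> 6) & 0x3];
        out[j++] = BASES[(data[i] >> 4) & 0x3];
        out[j++] = BASES[(data[i] >> 2) & 0x3];
        out[j++] = BASES[ data[i]       & 0x3];
    }
    out[j] = '\0';
}

int main(void)
{
    const char *message = "archive";
    char dna[4 * 7 + 1];   /* 4 bases per byte, plus terminator */

    encode_dna((const unsigned char *)message, strlen(message), dna);
    printf("%s -> %s\n", message, dna);   /* 'a' (0x61) becomes CGAC */
    return 0;
}
```

Decoding simply reverses the mapping. The appeal for conservators is that the alphabet itself, unlike a file format, is not going to be deprecated.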
