
Bridges to the past; links to the future

Below is the edited text of a paper I presented at a conference approx.
eleven years ago; most of the references to slides have been cut.  The
facts and figures are still accurate.

But first, I'd like to respond to some comments from Harmon and Luis.
Newspapers printed on groundwood pulp *are* among our least durable papers,
but my collection includes many newspapers which are over 100 years old.
Yes, they are fragile, but far from crumbling to dust.  And how much more
strongly would future researchers curse us if most of the publications that
remained from our time were contained on unreadable CDs?

We are encouraged to see the internet as a dynamic communication and
redundant storage tool, but this is not the case.  One of the lists to
which I subscribe (Medtext-l) changed list owners over a year ago.  The
archives did not make the change to the new platform.  The new list owner
indicated for some months that the problem was being looked into;
eventually the list owner asked the subscribers to report how many of the
old list files were stored on individual hard drives.  This does nothing to
convince me of the safety, security, or accessibility of digital
information.

And the term "man-hours" has no significance when discussing these issues.
When the Library of Congress conducted a study some years ago to determine
how long it would take to restore that fraction of their collection which
required restoration at that time, they used the term "man-centuries".

My son is a DJ at a radio station and he told me this afternoon that in the
past year one of their "jukeboxes" (a programmable CD player containing 30
CDs) ate one CD and could not "read" two others.  That's a 10% failure
rate.

Luis mentions the possibility of data storage at the atomic level, and that
will come.  At that level (and, indeed, at the chip level today) cosmic
rays are a problem.  No easy answers; only choices.

PNLA Conf., Vancouver, B.C.  [PNLA = Pacific Northwest Library Association]
14 August, 1986
Jack C. Thompson


In 1936, the American Library Association published a book about data
preservation through replication in other formats.  One possibility was the
Fisk Reader.  As you can see, the size of the data was certainly reduced.
There was the problem of a suitable device for reading the data; this may
be why we haven't heard much about the Fisk Reader since 1936 [The Fisk
reader was a small version of a stereo photograph viewer].  Earlier, it was
thought that photographic copies of important research material would
permit the proliferation of data, giving researchers remote access to
material which would advance their studies.  This slide of one of Abraham
Lincoln's notebooks gives an idea of the fidelity of black and white
photographic reproduction.  The information is clearly there, but without
color, there is no sense of iron gall ink; with flat lighting there is no
sense of the texture of hand made paper.  In short, the data are available,
but the sense of time and place is not.  This form of photographic
reproduction has not solved the problem of preservation and transmission of
data.

Microfilm is the only reproduction method discussed in the 1936 volume
which is still with us, to any extent.  Microfilm, with all of its
problems, like acidic storage boxes, inadequate processing, spotting,
sensitivity to temperature and humidity, is still with us.  I would like to
say at this time that I support the use of microfilm as a tool for
preserving and transmitting data.

I would also like to mention that I know of a few university and public
libraries, and state archives, which have cut books apart so that they
could be microfilmed.  The reason that I know about this is that I have
been approached to provide a solution to the problem of putting some of
these volumes back together.  Apparently, there is something in the
character of a librarian which trusts the bound book and does not trust a
reel of microfilm.

In 1962, the Princeton University Press published a book by Fritz Machlup,
entitled _The Production and Distribution of Knowledge in the United
States_.  The book is very interesting, and I recommend it to you.  Some of
the information in this book is dated: for instance, chapter 8,
"Information Services," mentions lawyers, doctors, bankers, and such, but
there is no mention of librarians.

There is a table on page 319, an estimate of the number of computers
delivered, 1954-1959.  At that time, a large computer cost more than
$500,000, a medium computer cost between $100,000 and $500,000, and a small
computer cost less than $100,000.  In the six year period covered by the
table, 149 large computers, 709 medium computers, and 311 small computers
were delivered; a total of 1169 computers.  That was 26 years ago.  More
computers than that were sold in North America this morning before we
finished breakfast.

Something else Machlup said interested me.  "Automatic data processing
changes not only the methods by which information is processed but also the
speed and perhaps the accuracy, and therefore permits us to want much more
such processing."


Last August [1985], the Public Archives of Canada hosted the Society of
Photographic Scientists and Engineers for their Second International
Symposium: on the Stability and Preservation of Photographic Images.  One
of the speakers, John C. Mallinson, from the Center for Magnetic Recording
Research; UC San Diego, delivered a paper entitled "Archiving Human and
Machine Readable Records for the Millennia."

The substance of his paper was drawn from a report made to the Archivist of
the United States in 1984.  After examining a variety of media for archival
storage, the team he was on realized that durability of the media was not
the issue; availability of equipment to read the data was.  This
realization allowed the team to reduce the options for long term
utilization of data to two formats: paper and microfilm.  The capacity to
use magnetic or optically stored data depends on the availability of
machines to read the data and translate it into human readable form.

Mallinson gives the following figures for mean time before failure:
professional video recorders, 2,000 hours; IBM half inch computer tape
drives, 5,000 hours; and the gallium arsenide laser diodes for audio disc
players, between 1,000 and 1,500 hours.  Putting those figures in
perspective, 2,000 hours is one working year, assuming 2 weeks vacation.
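
Mallinson's working-year conversion can be sketched as a short script.  The
device names and hours come from the figures just quoted; I take the
midpoint of the 1,000-1,500 hour range for the laser diodes, and the
2,000-hour year (50 weeks of 40-hour weeks) is the same assumption made in
the talk.

```python
# Convert mean-time-before-failure figures (quoted above from
# Mallinson's paper) into equivalent working years, where one
# working year = 50 weeks x 40 hours = 2,000 hours.

WORK_YEAR_HOURS = 50 * 40  # 2,000 hours: a year with 2 weeks vacation

mtbf_hours = {
    "professional video recorder": 2000,
    "IBM half-inch computer tape drive": 5000,
    "GaAs laser diode (audio disc player)": 1250,  # midpoint of 1,000-1,500
}

for device, hours in mtbf_hours.items():
    years = hours / WORK_YEAR_HOURS
    print(f"{device}: {years:.2f} working years")
```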

Another way to look at the matter is that the industry has gone through 8
digital tape formats since 1952, 9 video tape formats since 1956, and as
new as optical discs are, that industry is in its 3rd generation now.

Fortunately, the data recorded in one format can generally be transferred
to the new format when the time comes.  Unfortunately, that means that all
of the previously recorded data will have to be transferred to the next
generation of equipment every 5-10 years.  The first time won't be so bad,
but after a while, a substantial percentage of resources will have to be
allocated to a continuous process of transferring from one format to
another.

That brings me back to paper and microfilm, as the most stable usable
formats for preserving data for our own use and as our legacy to the future.
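
To make the cumulative burden concrete, here is a minimal sketch.  The 5-10
year cadence is the one quoted above; the time horizons are my own
illustrative choices.

```python
# Count the wholesale format migrations a digital collection would need
# over a given horizon, if every 5-10 years the data must be moved to
# the next generation of equipment.

def migrations_needed(years, cadence_years):
    """Whole migrations required over a horizon at a fixed cadence."""
    return years // cadence_years

for horizon in (50, 100, 500):
    low = migrations_needed(horizon, 10)   # best case: every 10 years
    high = migrations_needed(horizon, 5)   # worst case: every 5 years
    print(f"{horizon} years: {low}-{high} migrations")
```

Every one of those migrations must touch every record in the collection,
which is where the "substantial percentage of resources" goes.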
The problems and potentials of microfilm are fairly well known.  With good
quality control, from manufacture through exposure and processing, and with
good storage conditions, microfilm will last a very long time.  However, it
does not survive fire well, and mold can have a devastating effect.

That leaves paper, in libraries and archives, to carry our tale to the
future.  I don't know what impelled you to spend your lives with books.
For me, they are intellectual and technological artifacts.  I suppose that,
as a conservator, the technological aspect is most interesting to me;
although I have spent more time reading some books than restoring them.

A book is a collection of documents in the history of technology.  Vellum
leaves or paper; wooden boards or paste boards; inks, pigments, and gold;
iron, brass and bronze.  Each element represents the activity of one or
more trades.


Now I would like to discuss some of the artifacts of the future; the books
we have in our libraries right now.

As you know, the Library of Congress is developing a mass deacidification
system.  What you may not know is that the system LC is working on is
intended to be used for new accessions only, and that their projection is
that they will not need to use the system for more than approximately 20
years, because by that time virtually all publishers will be printing on
acid-free paper.  That may be true.  There is ample evidence that
environmental quality restrictions are making alkaline chemistry cheaper
than acid chemistry in the paper making industry.

Does that mean that the billions of books printed on poor quality paper,
the vast majority of books published since 1800, are lost to the future?

Will the 19th and 20th century be missing links in the future's past?  A
few books will survive, just as a few Dead Sea Scrolls have survived, and a
few Sumerian clay tablets have survived.  We have a sense of those people,
but they are dimly felt.

During the 1960's, research coming out of the Barrow lab indicated that
library collections in North America were deteriorating at a rate of
approximately 5% per year.

Dr. Richard Smith, President of Wei T'O Associates, has recently completed
his own survey of the rate of deterioration of library collections, before
and after deacidification with Wei T'O.  His findings, which will be
published soon, substantiate Barrow's findings and extend them.


The percentages which follow come from Dr. Smith's research.

Without deacidification, books lose strength at the rate of 4.8% per year.
Deacidified books lose strength at the rate of only 1.2% per year.

Put another way, a million dollar library collection is depreciating at
the rate of $48,000 per year; a deacidified collection depreciates at the
rate of only $12,000 per year.
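
Dr. Smith's percentages map onto the dollar figures by simple
multiplication; a minimal sketch, assuming a flat straight-line model
(the talk quotes constant annual rates):

```python
# Annual depreciation of a collection at a flat percentage rate,
# using Dr. Smith's figures: 4.8%/year untreated, 1.2%/year
# deacidified.  The straight-line model is an assumption.

def annual_loss(collection_value, annual_rate):
    """Dollars of value lost per year at a flat percentage rate."""
    return collection_value * annual_rate

untreated = annual_loss(1_000_000, 0.048)    # $48,000 per year
deacidified = annual_loss(1_000_000, 0.012)  # $12,000 per year

print(f"Untreated:   ${untreated:,.0f} per year")
print(f"Deacidified: ${deacidified:,.0f} per year")
```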



>Date:    Sun, 11 May 1997 09:36:29 -0500
>From:    Harmon Seaver <hseaver@ZEBRA.NET>
>Subject: Re: Bridges to the past; links to the future
>Jack C. Thompson wrote:
>> The MTBF of competently manufactured paper is measured in centuries and
>> millenia.
>   Then what you propose is that we take all our newspapers which are
>now being microfilmed or digitized and instead photocopy them on
>archival papers??? You are aware, I am sure, that newsprint has a
>shelf-life of only a few years at most????
>   Great idea, Jack! And how many man-hours would that take -- not to
>mention where the heck are we going to keep those mountains of
>paper????? And the enduring hatred of future researchers who would then
>have to wade thru those mountains, looking for those little articles,
>cursing us all the while, knowing how simple their work could have been,
>just doing a keyword search on an electronic archive. They curse us
>badly enough now having to blind themselves trying to read the microform

>Harmon Seaver hseaver@zebra.net http://www.zebra.net/~hseaver

>Luis Nadeau
>Fredericton, New Brunswick, Canada


>At one point, over the next decades, it should be possible to store data at
>the atom level on something as stable as stainless steel. Some patents have
>already been granted regarding this technology. Then we'll be talking about
>*massive* storage and hopefully means of massive digitization and
>transfers. This ain't here yet and I'm not holding my breath. The Library
>of Congress and other major organizations are keeping an eye on these
>issues. The Go Ahead signal will come from them.

Jack C. Thompson
Thompson Conservation Lab
7549 N. Fenwick
Portland, OR  97217

