The Return of the Past That Never Was
Sitting on our old car seat on the front porch last night, as we do most evenings, watching the sun drift down, I got to thinking about the function of memory as the key to the past.
Our generation, and even my elder daughters’ generation, tends to wonder how, and whether, the kids who’ve grown up with instant “presence” – the ability to immediately share the sights, sounds and broad spectrum of present reality – will be able to understand a world (our world) where events could not be shared as a matter of course.
But what I was thinking about at twilight was the fact that we and all the generations of roughly the last two centuries suffer from a similar inability to imagine the hundreds of thousands of years when the past existed only as personal memory and communal consensus.
Until the invention of photography in the early 19th century, all events died without physical trace. Yes, an artist might attempt to reproduce a face, a set of attributes, a battle scene, but the result was filtered through conflicting memories and the unique, often idiosyncratic attitude of the artist.
Printing was the first development to “freeze” an aspect of the past: For the first time, a specific collection of words could be reproduced exactly, in quantity, without the inevitable human lapses and errors of a Bartleby. As a result, oral poetry – the recited epic of Homer – already in decline, largely disappeared. Such prodigious arabesques of memory were no longer needed. (I’ve often wondered what new ways of cadging change the literate blind devised.)
But photography upended memory itself. Not just written words, but the exact lineaments of everything visual could be recorded (if initially without color), fixed in place, and reproduced essentially without limit. Uncle Eustace, in his twenties, could live not only into his old age, but into the old age of his grand- and great-grandchildren.
The camera, the mechanical child of physics, has no attitude, no personal investment, no “outlook.” You may quibble, of course, that the image depends on the angle of shooting, the stance of the photographer, the limitations of materials and manufacture. But any two cameras, of the same type and quality, will produce essentially the same picture when the conditions are equal. With photography, individual memory and common consensus were trumped by an independent record that could be pointed to and declared definitive.
Later in the 19th century, sound recording produced almost the same change. The ability to achieve accurate reproduction was initially limited, but the effect was the same – the recorded song or sonata reproduced precisely the particular rendition of that song or sonata on a given occasion, unlike all others. Previously, you could describe a voice like Caruso’s, but you could in no way present it. The difference was one between imagination and immediate perception.
So, for none of us now alive – and none of our ancestors of at least the last century and a half – has the past been confined to personal memory. We can thumb our photo albums to pull Uncle Eustace from his grave or place a needle in a groove to accurately relive the music of our youth. Can any of us truly wrap our minds around life with an irreproducible past?
I delight in the online world and, like most today, feel angry (even cheated) when all information does not swoop instantly to hand. (And god forbid my ISP should tumble into the ditch, as it did last week: “A tornado two miles down the road? Why the hell does that mean I can’t access news from Ukraine?”)
Yet sometimes I have this gnawing sensation that life loses something when factual mystery has vanished. Will my brain atrophy if I’m no longer forced to trace myriad possibilities back to a most likely probability? In the midst of nailing down some obscure bit of etymology, I’ll be hit with a pang of loss: A lingering quizzicality that had been my cloaked friend all these years has been executed by the Google mafia. And would we really want to recover the lost eight hours of Greed only to find them mundane?
(My reluctance to use a camera – shared by my eldest brother, Rod – may stem in part from a desire to keep my memories firmly mine, even if that leaves them inaccurate and subject to the muddling meddling of my mind.)
As printing reduced the need for hand-copying and oral recitation, so has the ubiquitous presence of recorded music obliterated the small communal music gathering. Up here in the mountains of north central Pennsylvania, the tunes of the settlers have all but disappeared under the attack of generic country radio – not only for the younger generation, but for just about everyone. The very oldest residents express nostalgia for the progressive parties that took neighbors from one farm to another in horse-drawn wagons (summer) or sleds (winter) to share live jigs and polkas while dancing across living rooms where the furniture had been pushed back against the walls.
How have our brain patterns changed through the reduced need for retentive memory combined with the increased interaction of information from vastly wider and more varied sources? And does external, nominally accurate reproduction make memory stronger or more feeble?
Yet, another trend has come along in the last 20-30 years that’s at least a partial negation of the formerly inexorable movement to capture and imprison reality.
In the late '60s and early '70s, I had long phone conversations with my other brother, Vic, about “high-fidelity” (later, “stereo”) music systems. Since the time of Edison, the push had been to reproduce sound as closely as possible to how a listener would hear it if present at the live (auditorium or studio) performance. High Fidelity magazine was the bible of such things, constantly testing amplifiers and receivers for distortion, and turntables and tape decks for “wow” and “flutter.” (I always pictured those last two terms as accompanied by flourishes from a '20s vamp.)
I remember their test which, at the time, nearly put the whole issue to rest: A new Sony amp had come out with a distortion level below what the testing devices could register (never fear – the testers went on to develop more sensitive apparati). Vic was particularly into this aspect of sound analysis. I think he listened to the distortion more than the music (his record collection was mostly Mantovani, for crap’s sake).
But what’s happened since the coming of iTunes? Music, immediately available, is downloaded as mp3s and listened to with earbuds. The whole concept of “fidelity,” except among an aural elite, has been abandoned. On the one hand, this is a pretty decent trend, since it indicates a more direct interest in the music itself, rather than in the mechanism of reproduction. On the other, it reflects a death of discernment.
Photoshop has had a similar effect on visual images: We can no longer trust (and most times no longer care about) fidelity to the object photographed. Uncle Eustace has lost the scar across his left cheek and his redneck cowlick. And when did his delivery boy’s cap turn into a Stetson?
This helical kind of progress – everything advancing while everything cycles – may have invaded all of our culture. As Google and others work to develop driverless cars, in a funny sense they’re reproducing the horse. Before automobiles, the carriage driver was free to fall into a reverie while the horse plopped along on its own, needing only an occasional light tug of the reins to suggest a change in direction. With driverless cars, GPS becomes the reins, while the “driver” examines the online New York Times, rather than the internal reaches of his mind.
by Derek Davis