LONDON – The communications theorist Marshall McLuhan observed that “we look at the present through a rear-view mirror.” And that “we march backwards into the future.” Amen. Remember the horseless carriage? Not to mention the fact that we still measure the oomph of a Porsche 911 in, er, brake horsepower.
But the car industry is a ferment of modernism compared with the computer business. When the bitmapped screen and the Wimp (windows, icons, menus, pointer) interface first surfaced in the early 1970s at Xerox Parc, its geeks searched for a metaphor that would make this new way of relating to computers intelligible to human beings. So they came up with the “desktop” on which were displayed little images (icons) of documents and document folders, just like you’d find on an actual desktop. Well, on the desktop of an efficient bureaucrat anyway.
But then they ruined everything by putting a trash can on the desktop. And Bill Gates & Co compounded the offence when they released Windows 95, which also had a Start button on the desktop. The result was that, for a time, when most of the world’s computer users wanted to switch off their machines they had to click Start. Even the car industry thought that was weird.
The problem with metaphors is that they are double-edged swords (as it were). On the one hand, we need them because they help us make sense of the new, which is where the horseless carriage, “coachbuilt” limousines and so on came from. Metaphors “carry explanatory structures from a familiar domain of experiences into another domain in need of understanding or restructuring,” as the theorist Klaus Krippendorff has written.
But at the same time as they help us make sense of something, metaphors also constrain our thinking by locking us into the past. When the Web first appeared in 1991, for example, the obvious metaphor was that of a global library — a vast treasure house of digital artefacts held in repositories (sites) that could be accessed by anybody.
So although the Web has changed out of all recognition in two decades, our underlying metaphor for it probably hasn’t changed that much. And this has the downside that we’re effectively blind to what is actually happening, which is that we are moving from a world of sites and visits to one that is increasingly dominated by streams. The guy who articulates this best is a computer scientist at Yale University named David Gelernter.
The title of his latest essay on the subject — “The End of the Web, Search, and Computer as We Know It” — conveys the basic idea. “The space-based Web we currently have will gradually be replaced by a time-based worldstream,” he writes. “This lifestream — a heterogeneous, content-searchable, real-time messaging stream — arrived in the form of blog posts and RSS feeds, Twitter and other chatstreams and Facebook walls and timelines. Its structure represented a shift beyond the ‘flatland known as the desktop’ (where our interfaces ignored the temporal dimension) towards streams, which flow and can therefore serve as a representation of time.
“It’s a bit like moving from a desktop to a magic diary: picture a diary whose pages turn automatically, tracking your life moment to moment. … Until you touch it and then the page-turning stops. The diary becomes a reference book: a complete and searchable guide to your life. Put it down and the pages start turning again.”
Gelernter thinks that this diary-like structure is supplanting the spatial one as the dominant metaphor for cyberspace — and he may well be right. At any rate, he’s not the only geek thinking like this. The social-media expert Danah Boyd has also written perceptively along the same lines. As a metaphor, it certainly provides a way of making sense of the attractions of Facebook — now dominated by its timeline technology — and of Twitter, especially as seen through a stream-browser such as Hootsuite. So it will do for now, so long as we remember John Locke’s warning: that metaphors “must be made use of to illustrate ideas that we already have, not to paint to us those which we yet have not.” He wrote that in the 1690s.