This year’s New Year’s cleaning was quick: Pull out the file of Y2K clippings and dump all the doom and gloom in the trash with nary a backward glance. That got me digging through other files, and I spent a merry half hour reliving the Internet’s infancy: the prospect of genuinely mobile computing (shades of i-Mode), the revolutionary notion of fixed access fees and — gasp — free PCs, warnings that businesses had to embrace e-commerce or die. With prodigious mental effort, I can even remember life before there was the Web, a veritable digital Precambrian era.
Memory is a curious commodity in the digital world. In one sense — the physical sense — it’s just about free. There are gigabytes at our fingertips, yet a mere decade ago we were lopping two digits off dates to save precious space. At the same time, memory — the ability to look back and assess the past — is shrinking.
My file cleaning and the downward spiral in the cost of memory are a stark reminder of just how fast Internet time is. A dog year is the equivalent of seven human years, but that’s slug time compared to Net speed.
That matters for businesses contemplating the leap into cyberspace. Business theorists call it the “first-mover advantage”: being first brings exponential gains that are difficult, if not impossible, for subsequent market entrants to make up. The problem, of course, is that the business model keeps changing. Today’s brilliant strategy is tomorrow’s dud; pure fancy right now becomes fact overnight.
Another measure of Net speed is the exploding number of Web pages out there. A recent study concluded that there are over 1 billion of them, a 25 percent increase in one year. It’s estimated that the total could reach 8 billion within three years.
This situation has created a huge new industry: makers of search engines. Last summer, researchers at the NEC Research Institute showed that even the best search engine covered only 16 percent of the Net; all the major search engines combined index only 42 percent. (Curiously, a study done 18 months earlier concluded that a total of 60 percent of the Web’s 320 million pages were indexed, and the biggest engine, HotBot, covered about a third.)
The findings have sparked all sorts of boasting and breast-beating about whose engine is better and why. Fast Search and Transfer claims bragging rights with access to 300 million pages, although Excite@Home says its service covers nearly 1 billion.
Pages come and go. URLs expire, publishers lose interest, accounts change.
The Net is acquiring its own memory as folks press the Web into service. Brewster Kahle, a computing pioneer, has developed an online archive of the Web. It has received a lot of attention, and for good reason. Among other things, he used it to develop Alexa, a search engine that Mark Thompson wrote about a few years back; it remains one of the best on the market.
The archive creates problems of its own, however, one in particular that is rarely explored — the loss of Memory. That sounds odd at a time when memory — digital storage capacity — is virtually infinite. The Internet Archive is proof that we can, like Borges, build a map with a one-to-one correspondence to the real world.
But those archives are creating instant history, a real-time record of this moment, a vast scrapbook of the eternal now.
That’s troubling on two counts. First, an obsession with “now” denies us the perspective that is needed to live well and truly enjoy the present. “Carpe diem” rolls nicely off the tongue, but it encourages us to adopt an empty, unreflective existence. History and memory are necessary ingredients if we are to get through today and build a better tomorrow.
But creating memories and putting them to use is work. It requires effort and practice. And this is the second beef: Technology encourages us to leave the thinking to the machines. A scrapbook may help, but it doesn’t qualify as memory: At best, it’s a spur to help us remember; at worst, it is a substitute for memory.
A record is only a marker; it has no meaning shorn of the context in which it was created. Those vast storehouses of information need human effort to make sense of their contents. But their size encourages us to let the bot do the heavy lifting. Sure, we’re only supposed to let the software do the mechanical work, but we’re also lazy. Why bother chewing over this or that when it is recorded for all posterity? Lemme get back to you on that one…
The brain is a muscle, and like all muscles, it atrophies without use. Machines were conceived as labor-saving devices that would let us devote more energy to thinking. But now they think for us too. Take keitai, Japan’s mobile phones. There is something odd — and disturbing — about people who don’t know their coordinates in the real world (their own phone numbers). Call it trivial, but it portends greater losses in the future. (Remember the debate about letting kids use calculators instead of learning rudimentary math skills?) This is an age-old debate, but it has taken on a new urgency in the information revolution.
We continue to farm more of our memories out to services in the name of distributed computing, thin servers and PDAs. Is it a serious problem? Ask Jeeves.
Sheer accumulation is no substitute for the process of reflection — sorting, evaluating, ranking — that creates memory. In fact, the volume of data at our fingertips means the sifting requires even more effort. Thus we face the supreme irony that just when technology puts more memory than ever at our disposal, our Memory risks being swamped.
It doesn’t have to be this way. Technology is seductive in the way that it induces passivity, but the trick is to ensure that it serves our needs, and not vice versa.
This issue raises another, equally squishy question: Does transience create value? When survival is in doubt, survival means something. We say that great art is immortal, but what happens when everything is immortal? The big question is whether our vast storehouses of data — which, given new digitization technologies, include even real people — will rob us of our past.