Writing the other day in Quartz, an admirable sister publication of The Atlantic magazine, the experienced technology watcher Christopher Mims struck a gloomy note. Under the headline “2013 was a lost year for tech,” he lamented that “all in, 2013 was an embarrassment for the entire tech industry and the engine that powers it — Silicon Valley. Innovation was replaced by financial engineering, mergers and acquisitions, and evasion of regulations. Not a single breakthrough product was unveiled.”

Warming to his gloomy theme, Mims argued that innovation in smartphones had stalled (“2013 was the year smartphones became commodities, just like the PCs they supplanted”); that “smart-watches were easily the biggest letdown of the year”; that “former giants” (i.e., Microsoft, Intel and BlackBerry) had continued their “inglorious decline”; that “mergers and acquisitions had replaced innovation”; that social media became “profitable if not compelling”; that mainstream media’s appetite for sensational stories made them vulnerable to “techno-hype” about stuff such as Bitcoin; and that, of course, the NSA revelations cast a chilly spell over all things technological.

As an end-of-year retrospective piece, Mims’ essay was perfectly workmanlike. After all, a glass can be half empty or half full, depending on what point of view one wishes to uphold. But it had a predictably annoying impact on people in Silicon Valley, who tend to think of Palo Alto as the centre of the known universe. One complainant was Om Malik, who is at least as experienced a tech watcher as Mims. “Dear Quartz,” he wrote, “maybe it’s Quartz that needs new glasses and a map. 2013 was not a lost year for tech.”

The essence of Malik’s argument was that it all depends what you mean by “technology.” If you mean the flashy, consumer-product stuff, then Mims’ dismissive view of 2013 may indeed be valid (though Malik disagrees with him about the iPhone 5s, citing its M7 chip as a development with major disruptive capabilities). But if you think of “technology” as the deep structure that eventually enables all kinds of disruptive developments, then it’s meaningless to talk about stops and starts in innovation, because the really big stuff is also on a slow burn. Even in a fast-moving industry such as computing, it can sometimes take 25 years before a major technological breakthrough starts to show results in terms of products, services and major industrial disruption.

As an example, Malik cites Amazon’s launch of Amazon Web Services (its cloud-computing operation) in 2006. Back then, he writes, “there weren’t very many of us who had an idea that it would one day become the key component of an economic engine that would jump-start entrepreneurial activity across the planet. No one thought that (cloud computing) was sexy. Today, if you ask Dropbox CEO Drew Houston, he will have a few billion reasons to think of AWS as the greatest thing since sliced bread. Yeah, that joke of a service will soon be a multibillion-dollar business that has put everyone from Oracle, Dell and HP on thin ice.”

I’m with Malik on this. Cloud computing is a good illustration of why much media commentary about — and public perceptions of — information technology tends to miss the point. By focusing on tangible things — smartphones, tablets, Google Glass, embedded sensors, wearable devices, social-networking services and so on — it portrays technology as gadgetry, much as earlier generations misrepresented (and misunderstood) the significance of solid-state electronics by calling portable radios “transistors.”

What matters, in other words, is not the gadget but the underlying technology that makes it possible. Cloud computing is what turns the tablet and the smartphone into viable devices. And underpinning cloud computing and most of the shiny stuff we take for granted — from the Web to Skype to Facebook to the iTunes Store to eBay to Amazon to Google — is the good ol’ Internet, which was created in the 1960s and ’70s with public money and no expectation of profit. Without the Net, none of what we take for granted today would have been possible. And yet when the Net first appeared, almost nobody understood its significance — and one of Mr. Mims’ predecessors might have been complaining in December 1983 (11 months after the network had been switched on for public use) that it had been “a lost year for tech.” Plus ça change!
