Samuel Arbesman on the importance of long-term data

Posted on Thursday, January 31st, 02013 by Austin Brown
Categories: Long Term Science, Long Term Thinking

Digital data is exploding in volume, and there’s enough money in making sense of it all that it has lately garnered its own buzzword: big data. In an increasingly measurable world, datasets of unprecedented size and comprehensiveness are turning up new and genuinely exciting insights. Applied mathematician Samuel Arbesman points out, though, that many of these datasets are merely snapshots, when what we really need to understand something is a time-lapse video:

Why does the time dimension matter if we’re only interested in current or future phenomena? Because many of the things that affect us today and will affect us tomorrow have changed slowly over time: sometimes over the course of a single lifetime, and sometimes over generations or even eons.

Datasets spanning long timescales help us understand not only how the world is changing, but also how we, as humans, are changing it. Without this awareness, we fall victim to shifting baseline syndrome: the tendency to reset our “baseline,” or what we consider “normal,” to the conditions of our own generation, which blinds us to shifts that occur across generations.
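
To make the idea concrete, here is a minimal sketch (not from Arbesman’s article) of how a shifting baseline can hide a long, steady decline: if each generation measures change only against the conditions it was born into, every generation perceives a small loss, while the cumulative loss since the start goes unnoticed. The 1% annual decline and 25-year generations are purely illustrative assumptions.

```python
# Illustrative sketch of shifting baseline syndrome (assumed numbers).
# A resource declines 1% per year for 100 years. Each 25-year "generation"
# compares the present only to the level it was born into, so the decline
# it perceives is far smaller than the true decline since year 0.

annual_decline = 0.01    # assumed 1% loss per year
years = 100
generation_length = 25

levels = [(1 - annual_decline) ** t for t in range(years + 1)]

true_loss = 1 - levels[-1]
print(f"True loss since year 0: {true_loss:.0%}")  # ~63%

for birth_year in range(0, years, generation_length):
    baseline = levels[birth_year]  # what this generation takes as "normal"
    perceived = 1 - levels[birth_year + generation_length] / baseline
    print(f"Generation born in year {birth_year}: perceives only {perceived:.0%} loss")
```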

Arbesman spoke last year at a Salon event at The Long Now Foundation on his book, The Half-Life of Facts. He explained that there are patterns in the ways our scientific knowledge changes over time. Much of what we take to be true today has a half-life: it will decay at a predictable rate as new science overturns our current understanding. Long data, of the type he champions in this recent article, is essential to unearthing these types of insights and avoiding a static understanding of a dynamic world.
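
As a back-of-the-envelope illustration of the metaphor (my sketch, not Arbesman’s model, with a made-up 45-year half-life), a half-life is just exponential decay: after one half-life, roughly half of a field’s current findings have been revised or overturned.

```python
# Toy exponential-decay model of the "half-life of facts".
# The 45-year half-life is an illustrative assumption, not a measured value.
half_life = 45.0

def fraction_still_accepted(years: float) -> float:
    """Fraction of today's facts still considered true after `years`."""
    return 0.5 ** (years / half_life)

for t in (10, 45, 90):
    print(f"After {t:3d} years: {fraction_still_accepted(t):.0%} still accepted")
# After  10 years: 86% still accepted
# After  45 years: 50% still accepted
# After  90 years: 25% still accepted
```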

(The image above is a page from a notebook of Isaac Newton’s.)

  • Doug Laney (http://twitter.com/doug_laney)

    Great concept of an information “half-life,” but we data warehouse professionals have architected in the original tenets of “time variance” and “non-volatility” since Bill Inmon (father of the data warehouse concept) first codified them in the late 80s. Understandably, this gets difficult in the realm of Big Data, when the notion of physically moving and integrating high-volume, high-velocity, and high-variety data streams becomes untenable. –Doug Laney, VP Research, Gartner, @doug_laney

  • HT

    We now have ‘shifting baseline syndrome’ to loathe. Today’s norms are slowly decaying and being altered, much like physical landscapes. So this is ‘applied mathematics’?

  • HT

    I think what’s important here is that long-term data, or the sheer quantity of data, has reached a zenith only to keep expanding, enabling a larger scope of data across a greater range of sources without excluding any previous data banking and storage. This is certainly more than a concept.

