Sunday, July 06, 2008

End of Science? End of Theory?

Chris Anderson has done it again, this time with an article about the end of theory. How? In short: raw data. In End of Theory, Anderson argues that with massive data, the centuries-old scientific model of hypothesize, model, test is becoming obsolete.
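
Anderson's core claim is that at petabyte scale, correlation supersedes causation: you mine the numbers first and theorize later, if at all. A toy sketch of that "data first, no hypothesis" workflow might look like the following (my own illustration in Python with NumPy, not anything from Anderson's article; the variables and the planted relationship are made up):

    import numpy as np

    # Fabricated raw data: 10,000 observations of 5 unnamed variables.
    rng = np.random.default_rng(0)
    data = rng.normal(size=(10_000, 5))
    # Plant a hidden relationship between variables 1 and 3.
    data[:, 3] = 0.9 * data[:, 1] + rng.normal(scale=0.1, size=10_000)

    # No model, no hypothesis: compute every pairwise correlation
    # and report the strongest one. The data "speak for themselves".
    corr = np.corrcoef(data, rowvar=False)
    i, j = np.unravel_index(np.abs(np.triu(corr, k=1)).argmax(), corr.shape)
    print(f"strongest pair: variables {i} and {j}, r = {corr[i, j]:.2f}")

The point, on Anderson's view, is that the print line is where this kind of science stops: the correlation itself is the finding, and no mechanistic model need follow.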


Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture, but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses: the energies are too high, the accelerators too expensive, and so on.

And according to Anderson, biology is heading in the same direction. What does this say about science and humanity? In February, the National Science Foundation announced the Cluster Exploratory, a program that funds research designed to run on a large-scale distributed computing platform developed by Google and IBM in conjunction with six pilot universities. The cluster will consist of 1,600 processors, several terabytes of memory, and hundreds of terabytes of storage, along with the software, including IBM's Tivoli and open-source versions of Google File System and MapReduce.
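
For readers who haven't seen it, the MapReduce model behind that cluster is simple to sketch. Here is a minimal in-memory word count in Python (a hypothetical toy of mine, not the Google/IBM cluster software; on the real platform the map and reduce functions run in parallel across those 1,600 processors):

    from collections import defaultdict

    def map_phase(document):
        # Map: emit a (key, value) pair for every word seen.
        for word in document.split():
            yield (word.lower(), 1)

    def shuffle(pairs):
        # Shuffle: group all emitted values by key.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(key, values):
        # Reduce: aggregate each key's values; here, a count.
        return key, sum(values)

    documents = ["the end of theory", "the end of science"]
    pairs = (p for doc in documents for p in map_phase(doc))
    counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
    print(counts)  # {'the': 2, 'end': 2, 'of': 2, 'theory': 1, 'science': 1}

The open-source versions mentioned above (the Hadoop project) follow this same map-shuffle-reduce shape, just distributed across many machines and disks instead of one process.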

Anderson has been right before; see The Long Tail and Free. But this one is still speculation, of course. Perhaps one commentator hit the mark when he said, "Yeah, whatever. We still can't get a smart phone with all the bells and whistles to be able to use any where in the world with over 2 hours worth of use and talk time...so get back to me when you've perfected all of that." Well said. Let's wait and see.
