According to this WIRED blog entry, IBM and Cray have both cracked the petaflop barrier.
Computer scientist Mark Seager of Lawrence Livermore National Laboratory claims that this will change the scientific method “for the first time since Galileo invented the telescope” in 1609.
The reason is that simulation and approximation can produce accurate models of complex phenomena, instead of relying solely on theoretical reasoning about formulas and on experiments to confirm them.
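To make the contrast concrete, here is a minimal sketch of simulation-based approximation (my own toy example, not from the article): estimating π by random sampling. The same Monte Carlo idea, scaled up enormously, underlies much of what petaflop machines do with climate or molecular models.

```python
import random

def estimate_pi(samples: int) -> float:
    """Monte Carlo estimate of pi: the fraction of random points in the
    unit square that land inside the quarter circle approaches pi/4."""
    inside = 0
    for _ in range(samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / samples

print(estimate_pi(1_000_000))  # converges toward 3.14159... as samples grow
```

No closed-form derivation is needed; the answer emerges from brute sampling, which is exactly the trade of computation for theory the quote alludes to.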
With 362 terabytes of memory and a sustained 1.059 quadrillion floating-point operations per second, Jaguar at Oak Ridge National Laboratory is tuned for scientific workloads such as climate and energy models, drug discovery, and new materials.
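To put that figure in perspective, a back-of-the-envelope comparison (the desktop estimate is my own assumption, not from the article):

```python
jaguar = 1.059e15   # Jaguar's sustained flop/s, as reported
desktop = 1e10      # assumed ~10 gigaflop/s for a 2008 desktop CPU

seconds = jaguar / desktop          # desktop time per Jaguar-second
print(f"{seconds:,.0f} s ≈ {seconds / 86400:.1f} days")
# one second of Jaguar work ≈ 1.2 days of desktop computation
```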
The question arises whether these levels of speed and data processing could one day break a fundamental assumption: that some problems will always be beyond discovery through calculation. Neurology, psychology, sociology, economics and cultural studies are fields where quantitative science has barely begun. Large-scale simulation could be the scientific method they are missing, provided the methods of observation deliver enough data to model upon.
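As a toy illustration of what simulation in the social sciences can look like, here is a sketch of Schelling's classic segregation model (the model choice and all parameters are my illustration, not from the article): agents with only mild preferences about their neighbors produce strong large-scale segregation, a result far easier to see by simulating than to derive from theory.

```python
import random

SIZE = 30          # grid side length
THRESHOLD = 0.3    # min. fraction of like neighbors an agent tolerates
STEPS = 20_000     # relocation attempts

# 0 = empty cell, 1 / 2 = agent of group 1 / 2 (roughly 10% empty)
grid = [[random.choices([0, 1, 2], weights=[1, 4.5, 4.5])[0]
         for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(r, c):
    """An agent is unhappy if too few occupied neighbors share its group."""
    group, like, occupied = grid[r][c], 0, 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            n = grid[(r + dr) % SIZE][(c + dc) % SIZE]  # wrap at edges
            if n:
                occupied += 1
                like += n == group
    return occupied > 0 and like / occupied < THRESHOLD

for _ in range(STEPS):
    r, c = random.randrange(SIZE), random.randrange(SIZE)
    if grid[r][c] and unhappy(r, c):
        # Move the unhappy agent to a randomly probed empty cell.
        er, ec = random.randrange(SIZE), random.randrange(SIZE)
        if grid[er][ec] == 0:
            grid[er][ec], grid[r][c] = grid[r][c], 0

# Print the emergent clusters: mild preferences yield visible segregation.
for row in grid:
    print("".join(" .#"[cell] for cell in row))
```

This runs in seconds on a laptop with a 30×30 grid; the point of petaflop machines is running the same kind of model with millions of far richer agents.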
If simulation does reach that far, there is a danger that even governmental policies may one day be driven by probability rather than ethics.