The question has been raised: what is distributed computing, and why is it used? We hope today's entry helps flesh out what happens between our home computers and the larger project.
Computer processors work on data. When we use our computers independently, it is like one horse pulling a cart: the power is limited, but the control is (potentially) complete.
When a load is heavier than the strongest available horse can pull, a team is needed. This presents a challenge: how are those horses coordinated? The problem is harder still if we replace some of the horses with mules, donkeys or elephants -- and this is exactly the challenge faced by distributed computing projects, where the computers are independent and far from identical.
Whether a team is pulling a load of 'stuff' or of data, there must be control. Distributed computing projects employ a load manager, much as a team of horses has one driver holding the reins. The load manager divides the workload among the 'clients' (the animals of the team), making sure no one is overburdened while overall control is maintained. To accomplish this, software called middleware acts as a translator between the different types of 'client' computers and the resources they work on. Just as a successful team's force is coordinated in one direction, our computers can work together meaningfully.
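To make the driver-and-team idea concrete, here is a minimal sketch (not the project's actual code -- the `Client` class, `benchmark` rating and `assign` function are all invented for illustration) of how a load manager might hand each work unit to whichever client will be free soonest, so faster machines naturally pull more of the load:

```python
# Illustrative sketch of greedy work-unit scheduling across unequal clients.
# All names here (Client, benchmark, assign) are assumptions for this example.
from dataclasses import dataclass, field
import heapq

@dataclass(order=True)
class Client:
    expected_finish: float                   # hours until this client is next free
    name: str = field(compare=False)
    benchmark: float = field(compare=False)  # work units completed per hour

def assign(work_units, clients):
    """Give each work unit to whichever client will finish soonest."""
    heap = list(clients)
    heapq.heapify(heap)                      # min-heap keyed on expected_finish
    schedule = []
    for unit in work_units:
        client = heapq.heappop(heap)         # least-loaded client right now
        client.expected_finish += 1.0 / client.benchmark
        schedule.append((unit, client.name))
        heapq.heappush(heap, client)
    return schedule

# A mixed 'team': the elephant is four times as fast as the mule.
clients = [Client(0.0, "horse", 4.0),
           Client(0.0, "mule", 2.0),
           Client(0.0, "elephant", 8.0)]
plan = assign(range(7), clients)
```

Run on seven work units, the greedy rule sends four to the elephant, two to the horse and one to the mule, so every client finishes at the same time and none is overburdened.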
What is the larger effort that climateprediction.net works towards? Initially, climate models could only be run on supercomputers. Yet even then, the most powerful supercomputer could only solve a "simple" model -- one with a narrow range for a given variable (atmospheric carbon dioxide, sulphur, et cetera). Often these models were limited to a single value, the one judged most likely. This project uses distributed computing to run far more challenging models, in which human impacts on the climate are explored both more broadly and more finely. When our computers work together, they can collectively solve larger problems than many supercomputers.
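The broadening described above can be pictured as a parameter sweep: instead of one run at the single "most likely" value, every combination of values across the ranges becomes its own independent model run, and each run can be handed out as one work unit. The parameter names and values below are invented purely for illustration:

```python
# Illustrative only: a range of values per model input multiplies into
# many independent runs. These numbers are made up for the example.
from itertools import product

co2_levels = [350, 450, 550, 650]   # hypothetical CO2 values, ppm
sulphur_levels = [0.5, 1.0, 1.5]    # hypothetical sulphur scaling factors

work_units = [{"co2": c, "sulphur": s}
              for c, s in product(co2_levels, sulphur_levels)]

# 4 values x 3 values = 12 model runs instead of a single run,
# each one independent and therefore easy to distribute.
```

Because every combination is independent of the others, this kind of sweep is a natural fit for distributed computing: no horse in the team needs to wait on any other.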