A new algorithm, applicable to a wide range of complex problems treated with computer models, speeds up processing so dramatically that calculations which would normally take several days can be completed in a matter of hours.
Working with computer models means working in a world of unknowns: models that simulate complex physical processes, from Earth's changing climate to the performance of hypersonic combustion engines, are extraordinarily complex, sometimes incorporating hundreds of parameters, each of which describes part of a larger process.
These parameters are often question marks within the models, and their individual contributions remain largely unknown. To estimate the value of each unknown parameter, researchers must plug in hundreds, if not thousands, of candidate values and run the model each time, getting closer and closer to the exact value, a computing process that can take days and sometimes weeks.
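The brute-force procedure described above can be sketched with a toy example. The one-parameter "model" and the true value below are entirely hypothetical stand-ins for the large simulations the article discusses:

```python
import numpy as np

# Hypothetical stand-in for an expensive simulation with one unknown parameter.
def model(param):
    return param ** 2 + 3.0 * param

observed = model(1.7)  # pretend 1.7 is the true, unknown parameter value

# Brute-force sweep: run the model once for every candidate value
# and keep the one whose output best matches the observation.
candidates = np.linspace(0.0, 5.0, 1001)
errors = [abs(model(p) - observed) for p in candidates]
best = candidates[int(np.argmin(errors))]

print(f"best estimate: {best:.2f}")  # recovers a value near 1.7
```

For a real simulation, each call to `model` might take minutes or hours, which is why running it thousands of times becomes a multi-day computation.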
Now a team led by Youssef Marzouk at the Massachusetts Institute of Technology (MIT) in Cambridge, USA, has developed a new algorithm that greatly reduces the computation required by almost any computer model.
Rather than systematically testing every value, the new algorithm weighs which candidates are most likely to be worth computing, becoming increasingly well informed as successive runs of the model accumulate. Using this strategy, in combination with the relevant data, it closes in on its goal: a probability distribution of values for each unknown parameter.
In tests, the researchers found that this method arrives at the same answers as classical computing approaches, but roughly 200 times faster.
Early in the analysis, the algorithm draws what could be compared to large, vague targets spread across the entire terrain of possible values. After successive runs with the model and the data, the targets gradually narrow, focusing on the peaks of that terrain, the regions of values most likely to represent the unknown parameter.
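The behavior described above, starting from broad guesses and progressively concentrating on high-probability peaks, is characteristic of Markov chain Monte Carlo sampling. The minimal Metropolis-Hastings sketch below illustrates the general idea only; it is not the MIT team's algorithm, and the toy model, noise level, and tuning constants are all assumptions:

```python
import math
import random

# Hypothetical toy model with one unknown parameter (illustration only).
def model(param):
    return param ** 2 + 3.0 * param

observed = model(1.7)   # pretend the true parameter value is 1.7
noise = 0.5             # assumed measurement noise (standard deviation)

def log_posterior(param):
    # Gaussian likelihood of the data given the parameter, with a flat prior.
    return -((model(param) - observed) ** 2) / (2.0 * noise ** 2)

random.seed(0)
current = 0.0           # start from a broad, uninformed guess
samples = []
for _ in range(20000):
    proposal = current + random.gauss(0.0, 0.3)  # propose a nearby value
    # Always accept moves toward higher-probability regions; occasionally
    # accept worse ones, so the chain explores rather than getting stuck.
    if math.log(random.random()) < log_posterior(proposal) - log_posterior(current):
        current = proposal
    samples.append(current)

burned = samples[5000:]          # discard the warm-up phase
mean = sum(burned) / len(burned)
print(f"posterior mean: {mean:.2f}")  # typically concentrates near 1.7
```

The retained samples approximate the probability distribution of the parameter: early in the chain they wander widely, and as the run progresses they pile up around the peak, mirroring the narrowing targets described above.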