Sometimes complex events cannot be simulated with standard computing facilities, for one of two reasons: either the computational performance is insufficient to process the required data, or, quite the reverse, there are not enough input data to predict all possible outcomes.
In such cases, predictive modelling is usually applied. The process starts with loading the data onto a supercomputer, which builds a model: a virtual replica of a real situation. Using the results of the data processing, experts design various scenarios. This does not mean that the system predicts the outcome; rather, it identifies the conditions that influence it.
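As a rough illustration (not the ITMO system itself), this kind of scenario-based modelling can be sketched in a few lines of Python: a toy storm-surge formula is run over thousands of sampled environmental conditions, and the output is a range of scenarios rather than a single prediction. The formula and all numbers are invented for the example.

```python
import random

def storm_surge(wind_speed, pressure_drop):
    """Toy model: surge height (m) as a simple function of conditions."""
    return 0.05 * wind_speed + 0.02 * pressure_drop

# Sample many plausible input conditions ("scenarios") instead of
# trying to predict a single outcome.
scenarios = []
for _ in range(10_000):
    wind = random.gauss(20, 5)        # m/s, assumed distribution
    pressure = random.gauss(30, 10)   # hPa drop, assumed distribution
    scenarios.append(storm_surge(wind, pressure))

# The model does not predict "the" outcome; it maps out the range of
# outcomes and the conditions that drive them.
scenarios.sort()
print(f"median surge: {scenarios[len(scenarios) // 2]:.2f} m")
print(f"95th-percentile surge: {scenarios[int(0.95 * len(scenarios))]:.2f} m")
```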
“For example, when analyzing a natural disaster, the experts take into account such environmental changes as volcanic eruptions or even nuclear explosions in order to cover all possible scenarios. Using these predictions, we can analyze the behavior of an “ocean-atmosphere” system,” said Alexander Boukhanovsky.
The research team at ITMO University used predictive modelling to develop the mathematical support and software for the dam control system of St. Petersburg.
These technologies make it possible to simulate all kinds of urban processes, such as crowd behavior and traffic flows, as well as infection transmission routes and the behavior of criminal organizations. The specialists use migration service data, surveillance camera records, geolocation app data, and other kinds of information. The same method also makes it possible to determine how many sensing devices are required for accurate estimates.
On the one hand, it may seem that supercomputers can cope with any task; on the other, the technology is not autonomous enough to manage all urban processes without human support. Predictive modelling systems do not make final decisions; all they do is offer possible solutions. Moreover, human behavior is so complex that it is very hard to predict.
When dealing with complex systems, one also faces the problem of parallelizing the computations. Experts usually run a simulation on several computers with different architectures and levels of computational performance, and making them work in sync is a major challenge. It is also important to use computational resources rationally, because supercomputers consume large amounts of energy. To address this, the professor proposed using distributed cloud computing, as sketched below.
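A minimal sketch of this kind of task parallelism, assuming that individual simulation runs are independent of one another: here the runs are farmed out to worker processes on a single machine, but the same pattern extends to nodes in a cluster or cloud. The simulation function and parameter values are invented for illustration.

```python
from concurrent.futures import ProcessPoolExecutor

def run_simulation(params):
    """Stand-in for one expensive, independent simulation run."""
    wind, pressure = params
    return 0.05 * wind + 0.02 * pressure

if __name__ == "__main__":
    # Each parameter set is an independent task, so the runs can be
    # distributed across workers -- processes here, but the same idea
    # applies to machines in a distributed cloud.
    tasks = [(w, p) for w in range(10, 31, 5) for p in range(10, 51, 10)]
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(run_simulation, tasks))
    print(f"worst-case result across {len(tasks)} runs: {max(results):.2f}")
```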
Mr. Boukhanovsky also talked about evolutionary modelling, which is used to simulate a new situation when a researcher cannot devise the scenarios on his own, or to refine a model by feeding it new information.
“Using a certain framework, the computer develops a new model. This process is based on machine learning. The method makes it possible to create a “population” of various models and choose the best ones, after which the system continues to produce new models. One also uses ensemble forecasting, in which supercomputers develop new samples by combining different solutions,” said Alexander Boukhanovsky.
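A toy sketch of the evolutionary loop the professor describes, under strongly simplified assumptions: the “models” are just pairs of linear coefficients, fitness is mean squared error on synthetic data, and the final forecast averages the best survivors in the spirit of ensemble forecasting. None of this is the ITMO team's actual algorithm.

```python
import random

# Toy data: samples from the "real" process we want a model of.
data = [(x, 3.0 * x + 1.0 + random.gauss(0, 0.5)) for x in range(20)]

def fitness(model):
    """Mean squared error of a candidate linear model (a, b) on the data."""
    a, b = model
    return sum((a * x + b - y) ** 2 for x, y in data) / len(data)

def mutate(model):
    """Produce a slightly perturbed offspring model."""
    a, b = model
    return (a + random.gauss(0, 0.2), b + random.gauss(0, 0.2))

# Evolutionary loop: maintain a "population" of models, keep the best
# ones, and breed new candidates from them.
population = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(30)]
for _ in range(300):
    population.sort(key=fitness)
    survivors = population[:10]                                   # selection
    offspring = [mutate(random.choice(survivors)) for _ in range(20)]
    population = survivors + offspring                            # next generation

# Ensemble forecasting: combine several good models instead of
# trusting any single one.
population.sort(key=fitness)
best = population[:10]
x_new = 25
forecast = sum(a * x_new + b for a, b in best) / len(best)
print(f"ensemble forecast at x = {x_new}: {forecast:.2f}")
```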