UDC 004.8
MODIFICATION OF ASSOCIATIVE GAUSSIAN MIXTURE MODEL IN EXPERT ASSESSMENTS
D. V. Gorbunov, post-graduate student, BMSTU, Moscow, Russia;
orcid.org/0000-0002-4646-2636.
K. L. Tassov, lecturer, BMSTU, Moscow, Russia.
S. V. Telegin, bachelor, BMSTU, Moscow, Russia;
orcid.org/0009-0000-6637-7124.
The idea of ensemble learning is to organize a pool of experts so that they can be combined into a common system. Any prediction algorithm, including a neural network, can act as an expert. The essence of ensemble methods is that each expert provides its own response, which is then used by a generalizing algorithm. One such generalizing algorithm for classification is the associative Gaussian mixture model. Its drawback is that, for a given class, one expert determines the result in only one region. A modified associative Gaussian mixture model is therefore proposed, in which one expert determines the result in several regions at once. Within the framework of the method, an algorithm is proposed that includes the following steps: normalization of the input data, training the model on the normalized data, and computation of the output values. The method can be applied in any area where "local experts" are used and where experts must be ranked. The article discusses the combination of the associative Gaussian mixture model with the Rumelhart perceptron, preprocessing of the input data, and training and prediction with the modified associative Gaussian mixture model, and presents the obtained output values.
Key words: modified associative Gaussian mixture model, associative Gaussian mixture model, weak experts, neural networks, Rumelhart perceptron.
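As a rough illustration of the three steps named in the abstract (input normalization, model training, computation of outputs), the sketch below models each expert as a single Gaussian region and gates expert predictions by their posterior responsibilities. All function names, the z-score normalization scheme, and the diagonal-covariance simplification are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

def normalize(X):
    # Step 1: z-score normalization of the input data (assumed scheme)
    mu, sigma = X.mean(axis=0), X.std(axis=0)
    return (X - mu) / np.where(sigma == 0, 1.0, sigma)

def fit_gaussians(X, labels, n_experts):
    # Step 2: fit one Gaussian (mean, diagonal variance) per expert region;
    # `labels` assigns each training point to an expert
    params = []
    for k in range(n_experts):
        Xk = X[labels == k]
        params.append((Xk.mean(axis=0), Xk.var(axis=0) + 1e-6))
    return params

def responsibilities(x, params):
    # Gating weights: how strongly each expert's Gaussian "owns" point x
    dens = np.array([
        np.exp(-0.5 * np.sum((x - m) ** 2 / v)) / np.sqrt(np.prod(2 * np.pi * v))
        for m, v in params
    ])
    return dens / dens.sum()

def combine(x, params, expert_outputs):
    # Step 3: output value as the responsibility-weighted sum of expert predictions;
    # `expert_outputs(x)` returns one prediction per expert (hypothetical interface)
    w = responsibilities(x, params)
    return float(w @ expert_outputs(x))
```

In this simplification each expert still owns one region; the modification described in the abstract would let a single expert contribute through several Gaussian components rather than one.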