Wydawnictwa Uczelniane / TUL Press

Permanent URI for this collection: http://hdl.handle.net/11652/17


Search results

Now showing 1 - 2 of 2
  • Item
    Improvement of Attention Mechanism Explainability in Prediction of Chemical Molecules’ Properties
    (Wydawnictwo Politechniki Łódzkiej, 2023) Durys, Bartosz; Tomczyk, Arkadiusz
    In this paper, an analysis of selected graph neural network operators is presented. The classic Graph Convolutional Network (GCN) was compared with methods containing trainable attention coefficients: the Graph Attention Network (GAT) and the Graph Transformer (GT). Moreover, as an original contribution of this work, GT training was modified with an additional loss-function component that makes the produced model easier to explain. The experiments were conducted on datasets of chemical molecules, covering both classification and regression tasks. The results show that the additional constraint not only does not worsen the results but, in some cases, improves predictions.
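    The trainable attention coefficients mentioned in the abstract can be illustrated with a minimal NumPy sketch of single-head GAT-style attention, together with a hypothetical entropy penalty of the kind that could be added to the loss to encourage attention concentrated on few neighbours (and hence easier to explain). The function names, shapes, and the penalty itself are illustrative assumptions, not the paper's actual implementation.

    ```python
    import numpy as np

    def gat_attention(h, W, a, adj):
        """Single-head GAT-style attention coefficients.
        h: (N, F) node features, W: (F, Fp) projection,
        a: (2*Fp,) attention vector, adj: (N, N) 0/1 adjacency.
        Returns alpha: (N, N), each row a softmax over that node's neighbours."""
        z = h @ W                                       # projected features
        N = z.shape[0]
        # e[i, j] = LeakyReLU(a^T [z_i || z_j]) for every ordered pair (i, j)
        pairs = np.concatenate(
            [np.repeat(z, N, axis=0), np.tile(z, (N, 1))], axis=1)
        e = (pairs @ a).reshape(N, N)
        e = np.where(e > 0, e, 0.2 * e)                 # LeakyReLU, slope 0.2
        e = np.where(adj > 0, e, -1e9)                  # mask non-edges
        e = e - e.max(axis=1, keepdims=True)            # numerically stable softmax
        alpha = np.exp(e)
        return alpha / alpha.sum(axis=1, keepdims=True)

    def attention_entropy(alpha, adj):
        """Hypothetical explainability penalty: mean per-node entropy of the
        attention distribution. Lower entropy = attention mass on fewer
        neighbours = easier to interpret which edges drove the prediction."""
        safe = np.where(adj > 0, alpha, 1.0)            # log(1) = 0 off-edges
        return float(-(alpha * np.log(safe)).sum(axis=1).mean())
    ```

    Adding a term like `lambda * attention_entropy(alpha, adj)` to the task loss is one simple way such a constraint can coexist with classification or regression objectives, which matches the abstract's observation that the constraint need not hurt predictive quality.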
  • Item
    Contextual ES-adRNN with Attention Mechanisms for Forecasting
    (Wydawnictwo Politechniki Łódzkiej, 2023) Smyl, Sławek; Dudek, Grzegorz; Pełka, Paweł
    In this study, we propose a hybrid contextual forecasting model with attention mechanisms for generating context information. The model combines exponential smoothing and a recurrent neural network to extract and synthesize information at both the individual-series and collective-dataset levels. It is composed of two simultaneously trained tracks: a context track and a main track. The main track generates forecasts and predictive intervals, while the context track generates additional inputs for the main track based on representative time series. Attention mechanisms are integrated into the model in six different variants to adjust the context information to the forecasted series and thus increase the model's predictive power.
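    The interaction the abstract describes, a context track that attends over representative series and feeds the result to the main track, can be sketched in a few lines of NumPy. The scaled dot-product scoring, the embedding shapes, and the simple concatenation into the main track's input are illustrative assumptions for this sketch, not the model's actual architecture.

    ```python
    import numpy as np

    def context_vector(query, reps):
        """Hypothetical context track: softmax attention over embeddings of
        representative series `reps` (K, D), queried by the forecasted
        series' own embedding `query` (D,). Returns a (D,) context vector."""
        scores = reps @ query / np.sqrt(len(query))     # scaled dot-product
        scores = scores - scores.max()                  # numerical stability
        w = np.exp(scores)
        w = w / w.sum()                                 # attention weights, sum to 1
        return w @ reps                                 # weighted mix of representatives

    def main_track_input(window, query, reps):
        """Mimic the context track feeding the main track: concatenate the
        series' input window with the attention-adjusted context vector."""
        return np.concatenate([window, context_vector(query, reps)])
    ```

    In this reading, the six attention variants of the paper would correspond to different choices of how `query`, the scoring function, and the injection point into the main track are defined; the sketch fixes one arbitrary combination.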