# Predicting Timed Traces With Neural Networks

Ayham Omar

Recently, deep learning has been applied to the field of logical reasoning, delivering promising results as a complement to classical algorithms. In particular, the Transformer architecture has been shown in prior work to be proficient not only at predicting the satisfiability of propositional and Linear-time Temporal Logic (LTL) formulas, but also at constructing satisfying solutions for them. However, applying deep learning to formal specifications in the more complex continuous-time domain has not yet been explored.

In this work, we pose the problem of predicting a satisfying timed trace for a Metric Interval Temporal Logic (MITL) formula to a state-of-the-art Transformer neural network. Specifications in MITL contain explicit time intervals for reasoning about the behavior of real-time systems, thus requiring the Transformer to predict richer, time-annotated traces.
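To illustrate what such a prediction target looks like, consider a minimal example (not taken from the thesis; formula and trace are chosen purely for exposition): the MITL formula

```latex
\mathbf{F}_{[1,3]}\, a \;\wedge\; \mathbf{G}_{[0,2]}\, \neg b
```

requires the proposition $a$ to hold at some point within the time interval $[1,3]$ and $b$ to be false throughout $[0,2]$. A satisfying timed trace pairs each observation with an explicit time stamp, e.g. $(\emptyset, 0)\,(\{a\}, 2)$, where the second pair records that $a$ holds at time $2$. It is these time stamps that the model must predict in addition to the propositional content.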

We describe the Transformer architecture and explain the methods used to generate meaningful training data for a supervised training approach. Furthermore, we conduct several experiments to determine to what extent the model learns the semantics of MITL. To this end, we differentiate between the semantic and the syntactic accuracy of the solutions predicted by the model.

We find that Transformers prove proficient at solving MITL formulas, reaching over 90% accuracy in some experiments. We also observe that a trained Transformer can predict correct solutions that deviate from the ones constructed by the data generator, showing signs of generalizing to the semantics of the logic. This generalization was evident even when we challenged the Transformer with formulas much longer than those encountered during training, or with formulas on which the data generator timed out. When studying the effect of interval size on the Transformer, we find that it continues to deliver good results even when faced with much larger intervals. This is a notable result, since solving formulas containing large intervals is particularly expensive for most classical approaches.

Bachelor Thesis.