Generating and Solving Temporal Logic Problems with Adversarial Transformers

Jens Kreber

The application of deep learning models has recently advanced into symbolic domains that require complex reasoning and were previously handled only by classical algorithmic approaches. One such domain is hardware and software verification, where linear-time temporal logic (LTL) is commonly used as a specification language. Recent work used the neural Transformer architecture to obtain satisfying traces for given LTL formulas. Firstly, this thesis continues this line of work and tackles the LTL satisfiability problem with both standard and universal Transformer encoders, the latter of which achieve remarkable accuracy. To enable training, a new data generation process is presented that produces balanced classes of LTL formulas based on flexible patterns.

Secondly, this thesis addresses a fundamental problem of learning in symbolic domains: obtaining a representative training dataset when no rule-based generation process is available and only a small set of collected real-world instances exists. To this end, a new type of generative adversarial network (GAN) for symbolic domains, built from Transformer encoders, is presented. It synthesizes large numbers of syntactically correct and diverse LTL formulas without autoregression. Two variants, a standard GAN and a Wasserstein GAN with gradient penalty, are compared with respect to their performance and training requirements. On the LTL satisfiability problem, it is demonstrated that GAN-generated data can substitute for real training data when training a classifier and, in particular, that training data can be generated from a dataset that is too small to be trained on directly. The GAN setting also allows the target distribution to be altered: by adding a classifier-uncertainty term to the generator objective, a new dataset is obtained that is even harder to solve than the original one.
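For concreteness, the following is a minimal sketch of the two loss ingredients named above: the standard WGAN gradient penalty (Gulrajani et al., 2017) and a generator objective with an added uncertainty bonus. PyTorch, all identifiers, the weight beta, and the entropy-based reading of "classifier uncertainty" are illustrative assumptions, not details taken from the thesis.

    import torch

    def gradient_penalty(critic, real, fake, lambda_gp=10.0):
        # WGAN-GP: softly constrain the critic to be 1-Lipschitz by pushing
        # its gradient norm towards 1 on interpolated samples.
        real, fake = real.detach(), fake.detach()
        eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)),
                         device=real.device)
        interp = (eps * real + (1.0 - eps) * fake).requires_grad_(True)
        scores = critic(interp)
        # Gradient of the critic scores w.r.t. the interpolated inputs.
        grads, = torch.autograd.grad(
            outputs=scores, inputs=interp,
            grad_outputs=torch.ones_like(scores),
            create_graph=True)
        norm = grads.flatten(start_dim=1).norm(2, dim=1)
        return lambda_gp * ((norm - 1.0) ** 2).mean()

    def generator_loss(critic, classifier, fake, beta=0.1):
        # Hypothetical uncertainty-augmented generator step: besides fooling
        # the critic, reward generated formulas on which a pretrained
        # satisfiability classifier is maximally unsure, here measured as the
        # entropy of its predictive distribution.
        probs = torch.softmax(classifier(fake), dim=-1)
        entropy = -(probs * probs.clamp_min(1e-8).log()).sum(dim=-1).mean()
        return -critic(fake).mean() - beta * entropy

In this reading, the gradient penalty replaces the weight clipping of the original Wasserstein GAN, and raising beta shifts the generated distribution towards formulas the classifier finds hard.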

Master's thesis.

(pdf)