Compile time analysis for hardware transactional memory architectures


Abstract

Transactional Memory is a parallel programming paradigm in which tasks are executed concurrently, in the form of transactions, by different resources in a system, and conflicts between them are resolved at run-time. Conflicts, caused by data dependencies, result in aborts and restarts of transactions and thus degrade the performance of the system. If these data dependencies are known at compile time, the transactions can be scheduled so that conflicts are avoided, reducing the number of aborts and significantly improving the system's performance. This thesis presents the Compiler insights to Transactional memory (CiT) tool, an architecture-independent static analyzer for parallel programs that detects all potential data dependencies between parallel sections of a program. It provides feedback about load-store instructions in a transaction, dependencies inside loops and branches, and several warnings about system calls that can affect performance. The effectiveness of the tool was tested on an application with different types of induced data dependencies, as well as on several applications from the STAMP benchmark suite. In the first experiment, a 20% performance improvement was observed when the two versions of the application were executed on the TMFv2 HTM simulator.
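
To make the idea concrete, the sketch below shows in C++ how an analyzer could flag a potential conflict between two transactions by intersecting their read and write sets; two transactions conflict when one writes a location the other reads or writes. This is only an illustration of the underlying principle, not the CiT implementation: the Transaction struct, the conflicts() function, and the string-based variable names are hypothetical.

```cpp
// Illustrative sketch: flag a potential conflict between two transactions
// from their read/write sets. Names and data structures are hypothetical
// and are not taken from the CiT tool.
#include <iostream>
#include <set>
#include <string>

struct Transaction {
    std::set<std::string> reads;   // variables read inside the transaction
    std::set<std::string> writes;  // variables written inside the transaction
};

// True if the two sets share at least one element.
static bool intersects(const std::set<std::string>& a,
                       const std::set<std::string>& b) {
    for (const auto& x : a)
        if (b.count(x)) return true;
    return false;
}

// Standard TM conflict condition: transactions conflict when one writes a
// location the other reads or writes.
static bool conflicts(const Transaction& t1, const Transaction& t2) {
    return intersects(t1.writes, t2.reads) ||
           intersects(t1.writes, t2.writes) ||
           intersects(t2.writes, t1.reads);
}

int main() {
    Transaction a{{"x", "y"}, {"z"}};  // reads x, y; writes z
    Transaction b{{"z"},      {"w"}};  // reads z;    writes w
    std::cout << (conflicts(a, b)
                      ? "potential conflict: schedule serially\n"
                      : "no dependency: may run concurrently\n");
}
```

Pairs flagged in this way would be candidates for serialized scheduling, while non-conflicting pairs could run concurrently without risking aborts.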
