Non-deterministic parallelism considered useful
University of Cambridge, Computer Laboratory
Proceedings of the 13th USENIX conference on Hot topics in operating systems, HotOS’13, 2011
The development of distributed execution engines has greatly simplified parallel programming, by shielding developers from the gory details of programming in a distributed system and allowing them to focus on writing sequential code [8, 11, 18]. The "sacred cow" in these systems is transparent fault tolerance, which is achieved by dividing the computation into atomic tasks that execute deterministically, and which may hence be re-executed if a participant fails or some intermediate data are lost. In this paper, we explore the possibility of relaxing this requirement, on the premise that non-determinism is useful, and sometimes essential, for supporting many programs.
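The fault-tolerance scheme the abstract refers to can be illustrated with a minimal sketch (all names here are hypothetical, not from any of the cited systems): because each atomic task is a deterministic function of its inputs, a lost output can be regenerated by simply running the task again.

```python
# Minimal sketch (hypothetical names) of transparent fault tolerance via
# deterministic re-execution: a task is a pure function of its inputs, so
# any re-execution reproduces the same intermediate data.
def run_with_reexecution(task, inputs, max_attempts=3):
    """Run a deterministic task, re-executing it if an attempt is lost."""
    for _ in range(max_attempts):
        try:
            # Determinism guarantees every attempt yields the same result.
            return task(*inputs)
        except RuntimeError:
            # Stand-in for a failed participant / lost intermediate output.
            continue
    raise RuntimeError("task failed on all attempts")

# Example: a task whose first execution "fails"; re-execution recovers it.
calls = {"n": 0}

def flaky_square(x):
    calls["n"] += 1
    if calls["n"] == 1:
        raise RuntimeError("worker lost")
    return x * x

print(run_with_reexecution(flaky_square, (7,)))  # prints 49
```

Note that this recovery strategy is only safe because the task is deterministic; a non-deterministic task could return a different value on re-execution, which is precisely the tension the paper examines.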
September 21, 2011 by hgpu