Rudy Bunel*, Alban Desmaison*, Pushmeet Kohli, Philip H.S. Torr, M. Pawan Kumar
This paper proposes an adaptive neural-compilation framework to address the problem of efficient program learning. Traditional code optimisation strategies used in compilers are based on applying a pre-specified set of transformations that make the code faster to execute without changing its semantics. In contrast, our work involves adapting programs to make them more efficient while considering correctness only on a target input distribution. Our approach is inspired by recent work on differentiable representations of programs. We show that it is possible to compile programs written in a low-level language to a differentiable representation. We also show how programs in this representation can be optimised to make them efficient on a target distribution of inputs. Experimental results demonstrate that our approach enables learning algorithms for given data distributions with a high success rate.
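To give a flavour of what a differentiable representation of a program looks like, here is a minimal sketch (not the paper's actual framework): registers hold probability distributions over integer values, primitive operations act linearly on those distributions, and an instruction is a convex combination of primitive operations. The op set, modulus `M`, and all names below are illustrative assumptions.

```python
import numpy as np

M = 8  # values are integers mod M, each stored as a distribution over {0..M-1}

def one_hot(v):
    d = np.zeros(M)
    d[v % M] = 1.0
    return d

# Primitive ops act on distributions: INC shifts probability mass by one (mod M),
# ZERO collapses all mass onto 0. Both are linear, hence differentiable.
def op_inc(d):
    return np.roll(d, 1)

def op_zero(d):
    return one_hot(0)

OPS = [op_inc, op_zero]  # illustrative two-op instruction set

def soft_step(reg, instr_weights):
    """Execute a 'soft' instruction: a convex combination over primitive ops.

    instr_weights is a distribution over OPS; the result is the weighted
    mixture of each op's output, so gradients can flow into instr_weights
    and the instruction itself can be learned.
    """
    out = np.zeros(M)
    for w, op in zip(instr_weights, OPS):
        out += w * op(reg)
    return out

# A register certainly holding 3, run through an instruction that is
# 90% INC and 10% ZERO: the result places 0.9 mass on 4 and 0.1 on 0.
reg = one_hot(3)
new_reg = soft_step(reg, np.array([0.9, 0.1]))
```

Because every step is a differentiable function of the instruction weights, one can in principle optimise those weights by gradient descent against a loss measured on a target distribution of inputs, which is the spirit of the approach described above.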
If you don’t want to install the Haskell dependency of the compiler and are only interested in compiling your programs, you can use this binary, a compiled version of this revision.