Presented annually to the author(s) of a paper from the ICFP held 10 years prior to the award year. The award includes a prize of $1,000, to be split among the authors of the winning paper. Papers are judged by their influence over the past decade.
Selection Committee
The award given in year N is for the most influential paper presented at the conference held in year N-10. The SIGPLAN Chair shall adjudicate conflicts of interest, appointing substitutes to the committee as necessary.
(for 2014) Refinement Types for Haskell
Citation:
This paper integrated refinement types into Haskell through Liquid Types, a specific form of refinement types that balances expressiveness with decidability. Liquid Types provide a practical and powerful tool for ensuring program correctness. The paper has had significant impact on both academic research and practical programming, inspiring further work on refinement types in other languages and in applications such as program synthesis and software contracts.
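For flavor, here is a minimal example in the style of LiquidHaskell, the system the paper describes (a sketch, assuming the built-in len measure on lists): the refinement in the special comment confines the argument to non-empty lists, and the obligation is discharged by an SMT solver at compile time.

    {-@ head' :: {v:[a] | len v > 0} -> a @-}
    head' :: [a] -> a
    head' (x:_) = x

    -- Accepted: the argument is provably non-empty.
    ok :: Int
    ok = head' [1, 2, 3]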
(for 2013) Handlers in Action
Citation:
This paper drew the attention of the ICFP community to handlers as an abstraction for effectful computation. Through its exposition, its operational semantics for handlers and effects (the first to be given), and its effect system, it seeded many subsequent works. Pragmatically, it detailed an implementation of handlers in Haskell and outlined the ideas behind OCaml, SML, and Racket implementations, as well as providing experimental results comparing handlers with equivalent monadic code.
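The core idea can be conveyed with a toy Haskell sketch (not the paper's encoding, which builds handlers from continuations and type classes): a computation is a tree of operation requests, and a handler gives those operations their meaning after the fact.

    -- A computation that may request Ints via an abstract Get operation.
    data Comp a = Return a | Get (Int -> Comp a)

    example :: Comp Int
    example = Get (\x -> Get (\y -> Return (x + y)))

    -- One handler answers every Get with a constant...
    handleConst :: Int -> Comp a -> a
    handleConst _ (Return a) = a
    handleConst n (Get k)    = handleConst n (k n)

    -- ...while another threads a counter, answering each Get with its
    -- index. One computation, two interpretations: the essence of handlers.
    handleCount :: Int -> Comp a -> (a, Int)
    handleCount n (Return a) = (a, n)
    handleCount n (Get k)    = handleCount (n + 1) (k n)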
(for 2012) Addressing covert termination and timing channels in concurrent information flow systems
Citation:
This paper is among the first to describe covert channels caused by termination and timing in concurrent programs, and it presents a mitigation strategy. It has received attention in the general security community, beyond the usual range of ICFP papers, and several follow-up papers build on its insights and extend the results to more general settings. The paper presents a dynamic information flow control system that eliminates termination and internal timing covert channels, while mitigating external timing, all without relying on cooperative scheduling. It achieves this by running potentially sensitive actions in separate threads. The system is implemented as a Haskell library using labeled concurrency primitives, and is demonstrated in an example server-side web application framework in which untrusted client applications share a persistent key-value store, showing how the library addresses potential timing and termination leaks.
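The core trick can be sketched in a few lines of Haskell (a deliberate simplification; forkSensitive is an illustrative name, not the library's actual API): an action whose timing or termination may depend on secrets runs in its own thread, so the caller's observable behavior does not.

    import Control.Concurrent (forkIO)
    import Control.Concurrent.MVar (MVar, newEmptyMVar, putMVar)

    -- Run a possibly diverging, secret-dependent action in a separate
    -- thread. The caller returns immediately, so its own timing and
    -- termination do not depend on the secret; the result is claimed
    -- later, at an appropriate security label.
    forkSensitive :: IO a -> IO (MVar a)
    forkSensitive act = do
      box <- newEmptyMVar
      _ <- forkIO (act >>= putMVar box)
      return box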
(for 2011) Frenetic: A Network Programming Language
Citation:
This is the first conference paper in a highly influential line of work creating a bridge between PL and networking. The authors analyzed deficiencies in the state of the art in languages for programming networks, addressed these limitations in a new language design, and described its implementation and performance on a suite of microbenchmarks. The language applied a range of ideas from the functional programming community: a declarative query language for classifying and aggregating network traffic; a reactive combinator library for describing high-level packet-forwarding policies; and a run-time system that automatically generates the low-level packet-processing rules, installing, uninstalling, and querying them on physical switches. This paper inspired a mountain of follow-on work by these authors and others. The practical value of improving network programming, and this paper’s influence on subsequent advances in the state of the art, make it a deserving winner of the award for Most Influential Paper of ICFP 2011.
(for 2010) Abstracting Abstract Machines
Citation:
The idea is to derive, systematically, from a high-level program semantics a nondeterministic finite-state machine that can be implemented not only efficiently but also in a relatively straightforward manner.
The paper showed how this works for a wide variety of language constructs such as mutable state, first-class continuations and laziness, and how careful choices lead to analyses with tunable computational complexity.
A key advantage of the approach is that ideas from programming language design and implementation can be directly imported and applied to the design and implementation of abstract interpreters.
The approach has influenced static analyses for Erlang, JavaScript and other languages, and it has influenced many researchers who work on sound static analyses.
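A minimal Haskell sketch of the recipe (much simplified from the paper): allocate one abstract address per variable and let store entries hold sets of values, so the machine's state space becomes finite and exhaustive exploration becomes a computable analysis.

    import qualified Data.Map as Map
    import qualified Data.Set as Set

    type Var = String
    data Expr = Ref Var | Lam Var Expr | App Expr Expr
      deriving (Eq, Ord, Show)

    -- Monovariant (0-CFA-like) allocation: each variable is its own address.
    type Addr  = Var
    type Env   = Map.Map Var Addr
    data Clo   = Clo Var Expr Env deriving (Eq, Ord, Show)
    type Store = Map.Map Addr (Set.Set Clo)  -- addresses hold sets of values

    -- Joining rather than overwriting is what keeps the abstraction sound
    -- once finitely many addresses must serve unboundedly many bindings.
    bind :: Addr -> Clo -> Store -> Store
    bind a v = Map.insertWith Set.union a (Set.singleton v)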
(for 2009) Runtime Support for Multicore Haskell
Citation:
One might think that purely functional programs should run well on parallel hardware without much extra effort, but at the time the paper was published it had proved hard to realize this potential in practice. This state of affairs changed with the work presented in the paper: thanks to it, multicore runtime support is one of GHC’s distinctive features today. The language and runtime make it very simple to exploit parallelism at a range of granularities. These features are widely used in real-world Haskell: the package implementing the multicore support is itself widely downloaded, and most of the popular Haskell packages underpinning web servers and web frameworks use some form of GHC’s parallel and concurrent runtime. Thanks to the multicore support, Haskell is perhaps one of the few languages where STM is actually widely used in practice. Overall, the paper helped make, and cement, the case that functional programming is a happy match for the multicore era.
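The programming model the runtime supports is famously lightweight. For example, semi-explicit parallelism with par and pseq from the parallel package (compile with GHC's -threaded flag and run with +RTS -N):

    import Control.Parallel (par, pseq)

    -- Sparks the first recursive call for evaluation on another core while
    -- the current thread computes the second; naive, but the classic example.
    pfib :: Int -> Int
    pfib n
      | n < 2     = n
      | otherwise = x `par` (y `pseq` (x + y))
      where
        x = pfib (n - 1)
        y = pfib (n - 2)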
(for 2008) Parametric higher-order abstract syntax for mechanized semantics
Citation:
This paper describes how adding parametricity to higher-order abstract syntax yields a powerful tool for simplifying proofs of programming-language metatheorems. Specifically, parametricity simplifies both the treatment of variables and contexts and the proofs of associated lemmas such as weakening and substitution. The paper applies the technique, known as PHOAS, to machine-checked proofs in Coq, and it spurred many follow-on papers on syntax representation, mechanized metatheory, and certified compilation. Washburn and Weirich had earlier introduced a similar (but not identical) encoding for Haskell, which inspired an implementation compatible with proof assistants based on dependent type theory. A key benefit of PHOAS is that, for many worthwhile examples, it avoids any modeling of contexts. A recent application is the fast, PHOAS-verified cryptographic code used by Chrome, the world’s most popular web browser.
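In Haskell terms (a sketch of the encoding's shape, not the paper's Coq development), a PHOAS term keeps its variable type abstract, so binders are host-language functions and no explicit context is ever built:

    {-# LANGUAGE RankNTypes #-}

    -- Parametricity in v rules out "exotic" terms that inspect variables.
    data Term v
      = Var v
      | Lam (v -> Term v)
      | App (Term v) (Term v)

    type ClosedTerm = forall v. Term v

    -- Instantiating v lets us compute without any notion of context;
    -- here, counting variable occurrences by choosing v = ().
    countVars :: Term () -> Int
    countVars (Var _)   = 1
    countVars (Lam f)   = countVars (f ())
    countVars (App a b) = countVars a + countVars b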
(for 2007) Ott: Effective Tool Support for the Working Semanticist
Citation:
The ICFP paper describes Ott’s goals and design, focusing particularly on its metalanguage, which aims to balance mathematical precision with ease of use. Ott strikes a balance between informal notation and checked structure that is both lightweight and powerful. As its syntax is notationally close to existing vernacular, it is easy for language designers to adopt. At the same time, Ott’s flexible parser and its native support for meta-functions and relations mean that it “type checks” language specifications, catching subtle errors in definitions even without a full encoding in a proof assistant. Over the past ten years, ICFP researchers have benefited tremendously from the open-source tool and the effective design-space exploration that it promotes.
(for 2006) Simple unification-based type inference for GADTs
Citation:
GADTs are a simple but powerful generalization of the recursive algebraic datatypes that were introduced in Hope and are a central feature of functional programming languages like ML and Haskell. GADTs have become an important technique in the programmer’s toolbox, enabling a lightweight style of dependently typed programming similar to the use of inductive families. However, type inference is difficult for GADTs, as they necessarily entail the loss of principal types. This paper shows how to exploit a few type annotations to make inference possible again; the key innovation is the notion of “wobbly types”. It started a lively line of research into new approaches to combining type inference and type checking, in particular System FC, an extension of System F with type-equality coercions that has provided a simple yet powerful internal language for the GHC compiler, and it made many of the benefits of dependent types accessible to more mainstream programmers.
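The canonical example of the feature whose inference this paper tamed: a typed expression language whose evaluator needs no runtime tags, because matching on each constructor refines the result type.

    {-# LANGUAGE GADTs #-}

    data Expr a where
      IntE  :: Int  -> Expr Int
      BoolE :: Bool -> Expr Bool
      Add   :: Expr Int  -> Expr Int -> Expr Int
      If    :: Expr Bool -> Expr a   -> Expr a -> Expr a

    -- The signature on eval is exactly the kind of annotation the paper's
    -- "wobbly types" approach exploits to keep inference tractable.
    eval :: Expr a -> a
    eval (IntE n)   = n
    eval (BoolE b)  = b
    eval (Add x y)  = eval x + eval y
    eval (If c t e) = if eval c then eval t else eval e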
(for 2005) Associated Type Synonyms
Citation:
Building on the POPL 2005 paper “Associated types with class”, Chakravarty, Keller, and Peyton Jones added the ability to write open type functions directly in Haskell. The new opportunities for computation at the type level have become popular with Haskell programmers, and it would be hard to overstate the influence of this work on the community. Of greatest influence have been these three contributions to type-level programming:
Type-level functions offer a qualitative jump in the expressiveness of Haskell’s type system, adding some of the advantages of dependently typed languages while retaining pervasive type inference.
Associated types give type classes a functional notation for enabling one type to depend on another. As opposed to the relational notation provided by functional dependencies, associated types enable programmers to use the same style of programming at the type level that they do at the term level, making type-level programming more accessible.
The supporting language extension includes a type-level equality constraint, which accommodates the non-injective nature of type functions. This extension has had many serendipitous uses, including in the surprisingly expressive low-level intermediate language System FC.
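A small example in modern GHC syntax, which descends directly from the paper’s proposal: a class of collections whose element type is computed by an associated type function.

    {-# LANGUAGE TypeFamilies #-}

    class Collection c where
      type Elem c            -- an open, type-level function
      empty  :: c
      insert :: Elem c -> c -> c

    instance Collection [a] where
      type Elem [a] = a      -- the element type depends on the collection
      empty  = []
      insert = (:)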
(for 2004) Scrap More Boilerplate: Reflection, Zips, and Generalised Casts
Citation:
In their TLDI 2003 paper, Lämmel and Peyton Jones introduced “Scrap Your Boilerplate” (SYB), a simple but powerful design pattern for generic programming in Haskell. SYB introduced a library of strategic combinators which one could use to compose generic custom traversals of complex data structures, such as syntax trees or XML documents. It was not initially clear, however, whether SYB would scale to handle more traditional applications of generic programming, such as serialization, deserialization, and generic equality, which require a form of reflection. The authors’ subsequent ICFP 2004 paper significantly extended their original SYB library with support for reflection, zips, and generalised casts, thus demonstrating the applicability of SYB to a much broader range of generic programming tasks. This broad applicability, combined with the ease with which programmers could pick up and use the library, led to its adoption in numerous Haskell developments, and contributed greatly to the burgeoning interest in statically typed, practically usable libraries for generic programming over the past decade.
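A small example using the syb library that descends from this work: gshow rests on the reflection machinery, and geq on the twin-traversal (“zip”) machinery, that this paper added.

    {-# LANGUAGE DeriveDataTypeable #-}

    import Data.Generics (Data, Typeable, gshow, geq)

    data Expr = Lit Int | Add Expr Expr
      deriving (Data, Typeable)

    -- Generic show and generic equality, with no per-type code written.
    demo :: (String, Bool)
    demo = (gshow e, geq e e)
      where e = Add (Lit 1) (Lit 2)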
(for 2003) MLF: Raising ML to the Power of System F
Citation:
Le Botlan and Rémy’s paper on MLF presented a major technical breakthrough in type inference for first-class polymorphism. Remarkably, by going from ML to a system that is more expressive than System F, principal types could be recovered. It was the first paper, and remains the only one, to present type inference and unification algorithms with all the “good” features of ordinary ML inference and unification, but over a richer universe of types and constraints that allows impredicative instantiation. The MLF system enjoys most general unifiers, completeness properties, and robustness under program transformations, with user type annotations providing the oracles needed to reach the expressiveness of System F and beyond. The MLF paper has been highly influential on subsequent work in type inference for first-class polymorphism, and is a must-read for anyone interested in the topic.
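The flavor of program at stake, rendered in Haskell with a rank-2 annotation; in MLF, annotating only the argument f suffices, and poly receives a principal type.

    {-# LANGUAGE RankNTypes #-}

    -- Untypable in ML without the annotation: f is used polymorphically,
    -- at both Int and Bool.
    poly :: (forall a. a -> a) -> (Int, Bool)
    poly f = (f 0, f True)

    use :: (Int, Bool)
    use = poly id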
(for 2002) Contracts for higher-order functions
Citation:
Assertion-based contracts have proven very useful for dynamically enforcing first-order program invariants in procedural languages, but until 2002 they had not been supported in languages with higher-order functions. Findler and Felleisen’s paper filled this gap, presenting “the first assertion-based contract checker for languages with higher-order functions.” The paper also spawned a great deal of follow-on work on such topics as the semantics of blame assignment and the integration of static typing and dynamic contract checking. Higher-order contracts have become a central component of the Racket (formerly PLT Scheme) system – one of the major software artifacts of the functional-programming community – which serves as both a research testbed and an effective tool for programming-language education.
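The paper’s key insight, blame assignment for function contracts, can be sketched in a few lines of Haskell (illustrative only; the paper’s system is for Scheme and tracks blame more carefully):

    -- A contract checks a value, given a party to blame on failure.
    type Contract a = String -> a -> a

    flat :: (a -> Bool) -> String -> Contract a
    flat p what who x
      | p x       = x
      | otherwise = error (who ++ " broke the contract: " ++ what)

    -- A function contract cannot be checked immediately: it wraps the
    -- function, checking arguments and results as they flow. Blame flips
    -- contravariantly: a bad argument is the caller's fault, a bad result
    -- the function's.
    func :: Contract a -> Contract b -> Contract (a -> b)
    func pre post who f =
      \x -> post who (f (pre ("caller of " ++ who) x))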
(for 2001) Recursive Structures for Standard ML
Citation:
Various attempts to extend ML modules with recursion were made starting in the early 1990s, but they all ran into a nasty typing issue later dubbed the “double vision problem” by Derek Dreyer. Russo’s ICFP 2001 paper was the first to propose a correct and practical solution to this problem. The solution cleverly exploited the nonstandard formalization of ML modules developed in his thesis. Building on this novel insight, the paper develops a complete design for recursive modules in ML, which Russo implemented in Moscow ML and which is complete enough to handle many desirable examples. Russo’s design was the main source of inspiration for the addition of recursive modules to OCaml, taking recursive modules from an open research issue to a realised language feature that simply works.
(for 2000) QuickCheck: a lightweight tool for random testing of Haskell programs
Citation:
This paper presented a very simple but powerful system for testing Haskell programs that has had significant impact on the practice of debugging programs in Haskell. The paper describes a clever way to use type classes and monads to automatically generate random test data. QuickCheck has since become an extremely popular Haskell library that is widely used by programmers, and has been incorporated into many undergraduate courses in Haskell. The techniques described in the paper have spawned a significant body of follow-on work in test case generation. They have also been adapted to other languages, leading to their commercialisation for Erlang and C.
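The style of specification the paper introduced is still what QuickCheck looks like today: properties are ordinary Haskell functions, and random inputs are conjured from the property’s type.

    import Test.QuickCheck (quickCheck)

    -- A property: reversing a list twice is the identity.
    prop_reverseReverse :: [Int] -> Bool
    prop_reverseReverse xs = reverse (reverse xs) == xs

    -- quickCheck generates random lists and reports a counterexample
    -- on failure.
    main :: IO ()
    main = quickCheck prop_reverseReverse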
(for 1999) Haskell and XML: Generic combinators or type-based translation?
Citation:
Malcolm Wallace and Colin Runciman’s 1999 ICFP paper “Haskell and XML: Generic combinators or type-based translation?” was one of the first papers to spell out the close connection between functional programming languages and XML. It described a typed correspondence between XML’s document type definitions (DTD) and Haskell datatypes. This correspondence leads to a natural representation of XML data in functional languages and permits native functions to operate on this representation. The paper also describes a generic encoding of XML trees together with a combinator language for querying and transforming the XML. The paper led to a widespread awareness of the close connection between XML and functional programming and initiated a flurry of research on functional XML processing. Moreover, the accompanying implementation was widely distributed as part of a CD with various XML tools, thus making an impact in the problem domain beyond the functional programming community: a perfect example of functional programming solving real world problems.
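The generic half of the design can be sketched as follows (names illustrative, not HaXml’s exact API): XML trees as an ordinary datatype, and queries as composable filters taking a tree to a list of trees.

    data XML = Elem String [XML]   -- element name and children
             | Text String

    type Filter = XML -> [XML]

    children :: Filter
    children (Elem _ cs) = cs
    children (Text _)    = []

    elemsNamed :: String -> Filter
    elemsNamed n e@(Elem m _) | m == n = [e]
    elemsNamed _ _                     = []

    -- Filter composition: apply g, then f to each result.
    o :: Filter -> Filter -> Filter
    o f g x = concatMap f (g x)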
(for 1998) Cayenne — a language with dependent types
Citation:
Lennart Augustsson’s 1998 ICFP paper “Cayenne — a language with dependent types” made dependently typed programming accessible to non-type-theorists. The language design was a bold one, both in its use of dependent types and in its adoption of an undecidable type system. Although the idea seemed quite radical at the time, it allowed future work to break out of the straitjacket of decidable type systems. It also pointed the way towards the merger of programming languages and proof systems that we are starting to see today in languages such as Agda.
(for 1997) Functional Reactive Animation
Citation:
“Functional Reactive Animation” by Conal Elliott and Paul Hudak was the first published paper on functional reactive programming. It described a collection of data types and functions that comprised an embedded domain-specific language called Fran for composing interactive, multi-media animations. The key abstractions were first-class behaviors and events. Intuitively, a behavior is a value that varies with continuous time while an event is a discrete counterpart including time-varying predicates. The idea of regarding the entire lifetime of a time-varying quantity as a single first-class value was new and very surprising at the time, but the paper made it seem simple and obvious. The insight in the paper led to a significant number of follow-on projects including FranTk, Fruit, Pidgets, FrTime, Frob, FRP, Frappe, Frag, Fvision, Yampa, Feris, and work on embodying financial contracts in functional terms.
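The conceptual model is small enough to state directly. A sketch of the semantic types (Fran’s actual implementation is more elaborate):

    type Time = Double

    -- A behavior is a value at every point in continuous time.
    newtype Behavior a = Behavior (Time -> a)

    -- An event is a time-ordered stream of discrete occurrences.
    newtype Event a = Event [(Time, a)]

    time :: Behavior Time
    time = Behavior id

    -- Lifting applies an ordinary function pointwise, at every time.
    lift1 :: (a -> b) -> Behavior a -> Behavior b
    lift1 f (Behavior at) = Behavior (f . at)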
(for 1996) Optimality and inefficiency: what isn’t a cost model of the lambda calculus?
Citation:
Julia Lawall and Harry Mairson’s 1996 ICFP paper “Optimality and inefficiency: What isn’t a cost model of the lambda calculus?” exposed a fundamental problem with proposed algorithms for optimal reduction. Starting with Jean-Jacques Lévy’s seminal work in 1978, the goal of optimal reduction was to correctly normalize lambda-calculus terms without duplicating redexes. Various strategies were subsequently devised to realize optimal reduction, notably the solution of John Lamping at POPL 1990, then simplified and improved by Georges Gonthier, Martín Abadi, and Jean-Jacques Lévy at POPL 1992. Each solution used subtle bookkeeping mechanisms to control sharing.
Lawall and Mairson showed that these bookkeeping mechanisms introduced a complexity and inefficiency of their own. They discovered terms that could be normalized in linear time, but whose bookkeeping costs required exponential time. They further showed that Frandsen and Sturtivant’s cost model for lambda-calculus reduction, presented at FPCA 1991, needed to account for the size of intermediate terms, and that optimal-evaluation interpreters were at least exponentially slower than the proposed cost model. Lawall and Mairson concluded that the notion of optimality did not necessarily coincide with that of efficiency. As a consequence, different and possibly optimal evaluation strategies were still needed, as were more realistic cost models. Subsequent work in this area has focused on such cost models, on further analysis of the inherent complexity of optimal reduction, and on relaxing the optimality condition in exchange for lower bookkeeping overhead and greater overall efficiency.