ATS
Paradigms: multi-paradigm: functional, imperative, object-oriented, concurrent, modular
Family: ML
Designer: Hongwei Xi
Developer: Boston University
Latest release: ATS2-0.4.2[1]
Typing: static, dependent
Influenced by: Dependent ML, ML, OCaml, C++
Filename extensions: .sats, .dats, .hats
License: GPLv3
In computing, ATS (Applied Type System) is a multi-paradigm, general-purpose, high-level, functional programming language. It is a dialect of the programming language ML, designed by Hongwei Xi to unify computer programming with formal specification. ATS has support for combining theorem proving with practical programming through the use of advanced type systems.[2] A past version of The Computer Language Benchmarks Game has demonstrated that the performance of ATS is comparable to that of the languages C and C++.[3] By using theorem proving and strict type checking, the compiler can detect and prove that its implemented functions are not susceptible to bugs such as division by zero, memory leaks, buffer overflow, and other forms of memory corruption by verifying pointer arithmetic and reference counting before the program compiles. Also, by using the integrated theorem-proving system of ATS (ATS/LF), the programmer may make use of static constructs that are intertwined with the operative code to prove that a function conforms to its specification.
ATS consists of a static component and a dynamic component. The static component is used for handling types, whereas the dynamic component is used for programs. While ATS primarily relies on a call-by-value functional language at its core, it possesses the ability to accommodate diverse programming paradigms, such as functional, imperative, object-oriented, concurrent, and modular.
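The split between statics and dynamics can be seen in a minimal, hypothetical sketch (the function name succ_int is illustrative, not from the source): the index n lives in the static component, the value x in the dynamic component, and the dependent type int(n) ties the two together.

```ats
(* hypothetical sketch: n belongs to the statics, x to the dynamics;
   the dependent type int(n) links the two components *)
fun succ_int {n:int} (x: int(n)): int(n+1) = x + 1
```

Here the type checker verifies, statically, that the returned value is exactly one greater than the argument.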
According to the author, ATS was inspired by Martin-Löf's constructive type theory, which was originally developed for the purpose of establishing a foundation for mathematics. Xi designed ATS “in an attempt to combine specification and implementation into a single programming language.”[4]
ATS is derived mostly from the languages ML and OCaml. An earlier language by the same author, Dependent ML, has been incorporated into ATS.
The first implementation, ATS/Proto (ATS0), was written in OCaml and was released in 2006. This was the pre-first edition of ATS and is no longer maintained. A year later, ATS/Geizella, the first implementation of ATS1, was released. This version was also written in OCaml and is no longer used actively.[5]
The second version of ATS1, ATS/Anairiats, released in 2008, was a major milestone in the development of the language, as the language was able to bootstrap itself. This version was written almost completely in ATS1. The current version, ATS/Postiats (ATS2), was released in 2013. Like its predecessor, it is also written almost entirely in ATS1. The most recently released version is ATS2-0.4.2.[5]
ATS is used mostly for research; fewer than 200 GitHub repositories contain code written in ATS. This is far fewer than for other functional languages, such as OCaml and Standard ML, which have over 16,000 and 3,000 repositories, respectively. This is likely due to the steep learning curve associated with ATS, which stems from the language's use of dependent type-checking and template instance resolution. These features usually require the use of explicit quantifiers, which demand further learning.[6]
ATS/Xanadu (ATS3) is being developed actively in ATS2, with the hope of reducing the required learning through two main improvements.
With these improvements, Xi hopes for ATS to become much more accessible and easier to learn. The main goal of ATS3 is to transform ATS from a language mainly used for research, into one strong enough for large-scale industrial software development.[5]
The main focus of ATS is to support formal verification via automated theorem proving, combined with practical programming.[2] Theorem proving can prove, for example, that an implemented function produces no memory leaks. It can also prevent other bugs that might otherwise be found only during testing. It incorporates a system similar to those of proof assistants which usually only aim to verify mathematical proofs—except ATS uses this ability to prove that the implementations of its functions operate correctly, and produce the expected output.
As a simple example, in a function using division, the programmer may prove that the divisor will never equal zero, preventing a division-by-zero error. Suppose the divisor x was computed as 5 times the length of list A. One can prove that, for a non-empty list, x is non-zero, since x is the product of two non-zero numbers (5 and the length of A). A more practical example is proving, through reference counting, that the retain count on an allocated block of memory is counted correctly for each pointer. Then one knows, and can quite literally prove, that the object will not be deallocated prematurely, and that memory leaks will not occur.
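The division example can be sketched in ATS as follows. This is a hypothetical illustration (the name safe_div and its signature are not from the source): the static guard d != 0 forces every call site to prove the divisor is non-zero before the program compiles.

```ats
(* hypothetical sketch: callers must satisfy d != 0 statically,
   so division by zero is rejected at compile time;
   g0ofg1 casts the indexed ints to plain ints for the division *)
fun safe_div {n:int} {d:int | d != 0}
  (x: int(n), d: int(d)): int = g0ofg1(x) / g0ofg1(d)
```

A call such as safe_div (10, 2) type-checks, whereas safe_div (10, 0) is rejected by the constraint solver rather than failing at run time.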
The benefit of the ATS system is that since all theorem proving occurs strictly within the compiler, it has no effect on the speed of the executable program. ATS code is often harder to compile than standard C code, but once it compiles, it is certain that it is running correctly to the degree specified by the proofs (assuming the compiler and runtime system are correct).
In ATS proofs are separate from implementation, so it is possible to implement a function without proving it, if desired.
According to the author, ATS's efficiency[7] is largely due to the way that data is represented in the language and tail-call optimizations (which are generally important for the efficiency of functional languages). Data can be stored in a flat or unboxed representation rather than a boxed representation.
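The flat-versus-boxed distinction shows up directly in ATS's tuple syntax. The type names below are illustrative, not from the source:

```ats
(* hypothetical sketch: @(...) declares a flat (unboxed) tuple whose
   components are laid out inline, while '(...) declares a boxed,
   heap-allocated tuple accessed through a pointer *)
typedef point_flat  = @(int, int)  // two ints stored directly
typedef point_boxed = '(int, int)  // a pointer to a heap cell
```

Flat representations avoid the pointer indirection and allocation cost of boxed values, which contributes to the efficiency claimed above.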
The construct dataprop expresses predicates as algebraic types.
Predicates in pseudo‑code somewhat similar to ATS source (see below for valid ATS source):
FACT (n, r)      iff fact (n) = r
MUL (n, m, prod) iff n * m = prod

FACT (n, r) =
    FACT (0, 1)
  | FACT (n, r) iff FACT (n-1, r1) and MUL (n, r1, r)  // for n > 0
// expresses fact (n) = r iff r = n * r1 and r1 = fact (n-1)
In ATS code, FACT (int, int) is a proof type. The example below is a non-tail-recursive factorial, with the proposition, or "theorem", proved through the construct dataprop.
The evaluation of fact(n-1) returns a pair (proof_n_minus_1 | result_of_n_minus_1), which is used in the calculation of fact(n). The proofs express the predicates of the proposition.
[FACT (n, r)] implies [fact (n) = r]
[MUL (n, m, prod)] implies [n * m = prod]

FACT (0, 1)
FACT (n, r) iff FACT (n-1, r1) and MUL (n, r1, r)   forall n > 0
To remember:
{...}        universal quantification
[...]        existential quantification
(... | ...)  (proof | value)
@(...)       flat tuple or variadic function parameters tuple
.<...>.      termination metric[8]
dataprop FACT (int, int) =
  | FACTbas (0, 1)                          // basic case
  | {n:nat} {r:int}                         // inductive case
    FACTind (n+1, (n+1)*r) of (FACT (n, r))
(* note that int(x), also int x, is the monovalued type of the int x value.
   The function signature below says:
   forall n:nat, exists r:int where fact(num: int(n)) returns (FACT (n, r) | int(r)) *)
fun fact {n:nat} .<n>. (num: int(n)): [r:int] (FACT (n, r) | int(r))
This can all be added to a single file and compiled as follows. Compiling should work with various back-end C compilers, e.g., the GNU Compiler Collection (gcc). Garbage collection is not used unless explicitly enabled with -D_ATS_GCATS.[9]
val (predicate_proofs | values) = myfunct params
{...}        universal quantification
[...]        existential quantification
(...)        parenthetical expression or tuple
(... | ...)  (proofs | values)
.<...>.      termination metric
@(...)       flat tuple or variadic function parameters tuple (see the example's printf)
@[byte][BUFLEN]      type of an array of BUFLEN values of type byte[10]
@[byte][BUFLEN]()    array instance
@[byte][BUFLEN](0)   array initialized to 0
sortdef nat = {a: int | a >= 0} // from the prelude: a ∈ int such that a >= 0
typedef String = [a:nat] string(a) // [..]: ∃ a ∈ nat ...
// {a,b:t@ype}: ∀ a,b ∈ type ...
fun {a,b:t@ype} swap_type_type (xy: @(a, b)): @(b, a) = (xy.1, xy.0)
The view T @ L asserts that there is a view of type T at location L.
The type of ptr_get0 (T) is
∀ l : addr . (T @ l | ptr(l)) -> (T @ l | T)
(see the manual, section 7.1, Safe Memory Access through Pointers).[12]
viewdef array_v (a:viewt@ype, n:int, l: addr) = @[a][n] @ l
A trailing plus sign strengthens several keywords, as in case+, val+, type+, viewtype+, ...; for example, case+ requires pattern matching to be exhaustive.
staload "foo.sats"     // foo.sats is loaded and then opened into the current namespace
staload F = "foo.sats" // to use identifiers qualified as $F.bar
dynload "foo.dats"     // loaded dynamically at run-time
Dataviews are often declared to encode recursively defined relations on linear resources.[13]
dataview array_v (a: viewt@ype+, int, addr) =
  | {l: addr}
    array_v_none (a, 0, l)
  | {n: nat} {l: addr}
    array_v_some (a, n+1, l) of (a @ l, array_v (a, n, l + sizeof a))
Datatypes[14]
datatype workday = Mon | Tue | Wed | Thu | Fri
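A small hedged example of consuming such a datatype (the function name starts_week is illustrative): with case+, the compiler reports an error if any workday constructor is left unhandled.

```ats
(* hypothetical sketch: case+ demands that every constructor of
   workday is covered, so a missing branch is a compile-time error *)
fun starts_week (d: workday): bool =
  case+ d of
  | Mon () => true
  | Tue () => false
  | Wed () => false
  | Thu () => false
  | Fri () => false
```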
lists
datatype list0 (a:t@ype) =
  | list0_cons (a) of (a, list0 a)
  | list0_nil (a)
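A recursive function over list0 can be written as a template over the element type. This is a hedged sketch; the name len0 is hypothetical (the ATS prelude provides its own length function):

```ats
(* hypothetical sketch: a template function {a:t@ype} that computes
   the length of a list0 by exhaustive pattern matching *)
fun {a:t@ype} len0 (xs: list0 a): int =
  case+ xs of
  | list0_cons (_, rest) => 1 + len0<a> (rest)
  | list0_nil () => 0
```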
A dataviewtype is similar to a datatype, but it is linear. With a dataviewtype, the programmer is allowed to explicitly free (or deallocate) in a safe manner the memory used for storing constructors associated with the dataviewtype.[15]
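As a hedged sketch of this idea, the linear list below mirrors the list_vt type of the ATS prelude; the name free_list is hypothetical. The ~ in a pattern consumes the matched node, so the type checker verifies that every cell is deallocated exactly once.

```ats
(* sketch of a linear (dataviewtype) list, modeled on the prelude's list_vt *)
dataviewtype list_vt (a:t@ype, int) =
  | list_vt_nil (a, 0) of ()
  | {n:nat} list_vt_cons (a, n+1) of (a, list_vt (a, n))

(* destructuring with ~ frees each node as it is matched *)
fun {a:t@ype} free_list {n:nat} (xs: list_vt (a, n)): void =
  case+ xs of
  | ~list_vt_nil () => ()
  | ~list_vt_cons (_, rest) => free_list<a> (rest)
```

Forgetting to free the list, or freeing it twice, would be rejected by the linear type checker rather than surfacing as a run-time leak or crash.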
local variables:
var res: int with pf_res = 1 // introduces pf_res as an alias of view @ (res)
on-stack array allocation:
#define BUFLEN 10
var !p_buf with pf_buf = @[byte][BUFLEN](0) // pf_buf = @[byte][BUFLEN](0) @ p_buf[16]
See val and var declarations[17]