Wherefore thou art ... Semantics of Computation?
| Content Provider | Semantic Scholar |
|---|---|
| Author | Honsell, Furio |
| Copyright Year | 2015 |
| Abstract | The power of digital simulation combined with the elementary simplicity of Universal Computational Models (e.g. Turing Machines, Church's λ-calculus, Curry's Combinatory Logic, cellular automata, …) is apparently the Pythagorean dream made true, but because of the remoteness and gratuitousness of Computational Models it is also the original sin of Computing. In effect, the dynamics of token/symbol manipulation in such models is too idiosyncratic to be insightful. Hence, the more computers are used in life-critical applications, the more incumbent are digital woes due to potentially incorrect software. To achieve correct software, i.e. software which meets its specifications, we need to establish a formal correspondence between low level peculiarities and higher level conceptual understandings. This amounts to defining formal Semantics of programming languages [26] and addressing the related critical issue of adequacy of formalizations and encodings, which are ultimately irreducible to formalization [11, 12, 18]. I will try to outline a brief history of the quest for a Final Semantics in Computing. The initial paradigm, since the 60's, was that of Denotational Semantics: the meaning of a program is a function (an algorithm being a total function), whose behavior is captured by certain logical invariants called types or by observations. This approach used λ-calculus, which is a theory of functions, as the canonical computational model. Every language component received a functional interpretation. Categorically speaking, this approach is syntax directed and can be termed initial semantics, in that the interpretation function is an initial algebra-morphism which maps uniquely the algebraic structure of the syntax, of the programming language, to a set of abstract entities called denotations. The crucial property of the interpretation function is compositionality, namely the (algebraic) inductive structure of the syntax is reflected by the semantics, which therefore must feature a similar, but conceptually independent, algebraic structure. Hence this semantics is extensional and referentially transparent. Denotations are usually morphisms in suitable categories such as, possibly higher order, topological spaces, or domains [24, 21, 22, 25]. The added value of domains comes from the fact that they are endowed with an enriched structure. This allows for natural definitions of recursive objects, since all endomorphisms have fixed points, and for approximations, and hence for new proof principles for reasoning on programs, such as Fixed Point Induction. Semantics is, ultimately, just an equivalence relation, in fact a congruence relation. The methodology of Program Synthesis through Program Equivalence capitalizes on this understanding of semantics. The … |
| File Format | PDF, HTM/HTML |
| Alternate Webpage(s) | https://hapoc2015.sciencesconf.org/conference/hapoc2015/pages/Honsell.pdf |
| Alternate Webpage(s) | http://hapoc2015.sciencesconf.org/conference/hapoc2015/pages/Honsell.pdf |
| Language | English |
| Access Restriction | Open |
| Content Type | Text |
| Resource Type | Article |
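The abstract's account of Denotational Semantics as a compositional, initial-algebra-style interpretation, with recursion handled by fixed points in domains, can be illustrated by a minimal sketch. The toy language, the names `denote` and `fix`, and the factorial example below are not from the paper; they are assumptions made purely for illustration.

```haskell
module DenotationalSketch where

-- Hypothetical abstract syntax of a tiny expression language (illustration only).
data Expr
  = Lit Int
  | Add Expr Expr
  | Mul Expr Expr

-- Compositional interpretation in the initial-algebra style the abstract describes:
-- the denotation of a phrase is computed only from the denotations of its
-- immediate sub-phrases, by structural induction on the syntax.
denote :: Expr -> Int
denote (Lit n)   = n
denote (Add a b) = denote a + denote b
denote (Mul a b) = denote a * denote b

-- Recursion via a fixed point of an endomorphism on a function space, in the
-- spirit of domain theory, where continuous endomaps have least fixed points.
fix :: (a -> a) -> a
fix f = let x = f x in x

-- Example: factorial defined as the fixed point of its one-step unfolding.
factorial :: Integer -> Integer
factorial = fix (\rec n -> if n <= 0 then 1 else n * rec (n - 1))

main :: IO ()
main = do
  print (denote (Add (Lit 2) (Mul (Lit 3) (Lit 4))))  -- 14
  print (factorial 5)                                  -- 120
```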