Overview
ABSTRACT
A compiler is a software program that translates a source program (written in a programming language) into a target language, most often the processor's machine language, so that it can be executed. This article presents the principles of compilation, the technologies it relies on, and other applications of compilation technology.
AUTHORS
- Henri-Pierre Charles: Research Director, CEA-Grenoble
- Christian Fabre: Senior Research Engineer, CEA-Grenoble
INTRODUCTION
Computer architecture and computer applications have always been in a state of flux. Since compilation is at the frontier of these two rapidly evolving worlds, it's only natural that it should have evolved in tandem.
Broadly speaking, the history of compilation can be divided into four main periods:
1950-1970: During this period, the main notions of hardware architecture were put in place, but the bulky machines of the era were difficult to program.
Regular expressions (1956), based on the theory of finite automata, together with parsers (1965), enabled the creation of tools for programming languages and their compilation. FORTRAN (1957) and COBOL (1959) were the first compiled computer languages.
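The link between regular expressions and lexical analysis mentioned above can be illustrated with a minimal sketch. The token names and the mini-language below are hypothetical, chosen only for illustration; real compiler front ends generate such scanners with tools built on the same automata theory.

```python
import re

# Hypothetical token specification for a tiny language:
# each pair is (token name, regular expression).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),
]

# Combine the patterns into one master expression with named groups.
MASTER_RE = re.compile("|".join(f"(?P<{name}>{rx})" for name, rx in TOKEN_SPEC))

def tokenize(source):
    """Turn a source string into a list of (kind, text) tokens."""
    tokens = []
    for match in MASTER_RE.finditer(source):
        kind = match.lastgroup
        if kind != "SKIP":  # discard whitespace
            tokens.append((kind, match.group()))
    return tokens

print(tokenize("x = 12 + y"))
```

Running this on the string `x = 12 + y` yields the token stream `[('IDENT', 'x'), ('OP', '='), ('NUMBER', '12'), ('OP', '+'), ('IDENT', 'y')]`, which a parser would then assemble into a syntax tree.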
We can say that this period corresponds to the creation of the theoretical and practical foundations of compilation.
1970-1990: It was during this period that consumer electronics became popular. The Intel 4004 (1971) was the first consumer processor, while the Intel 386 (1985) underpinned the popularity of the PC (Personal Computer). The MIPS processor (1984) introduced the notion of RISC architecture. It was also during this period that Gordon Moore revised his famous law, stating that "the number of transistors on a microprocessor chip doubles every two years".
It was during this period that the C (1972) and C++ (1983) languages were created, as well as the first free-software compiler, GCC 1.0 (1987). This compiler was a huge success and was adopted by many computer manufacturers.
The objective of program compilation was "to produce a binary program semantically equivalent to the source program". Optimization was still in its infancy.
1990-2000: This period was rich in new architectural concepts: pipelines, superscalar processors, cache memory, parallel machines.
It was during this period that the Java language was created (1995) and that just-in-time ("on-the-fly") compilation came into widespread use (1999).
The objective of program compilation became "to produce a semantically equivalent binary program that makes the best use of these architectural concepts". Optimization phases grew increasingly complex. The first parallel programming tools were created, such as MPI for inter-processor message passing (1991) and OpenMP for directive-based shared-memory parallelization (1997).
2000-2017: This last period saw an explosion in the complexity of architectures: multicore processors, graphics co-processors (GPUs). The CUDA (2007) and OpenCL (2009) languages appeared during this period.
...
KEYWORDS
instruction level parallelism | processor | programming language | compiler | computer architecture | resource allocation | lexical analysis | syntactic analysis | program optimization