What is a compiler? State the various phases of a compiler and explain them in detail.

A compiler is a translator that translates a high-level language into a low-level language. It displays the errors after scanning the entire program. A compiler operates in phases, each of which transforms the source program from one representation to another.
The following are the phases of the compiler.

Main phases:

Lexical analysis: It is the first phase of the compiler. It gets input from the source program and produces tokens as output. It reads the characters one by one, from left to right, and forms tokens.

Token: It represents a logically cohesive sequence of characters, such as a keyword, operator, identifier or special symbol. The group of characters forming a token is called a lexeme. The lexical analyser not only generates a token but also enters the lexeme into the symbol table if it is not already there.
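As an illustrative sketch of how a lexical analyser forms tokens, consider the following; the token classes and patterns are assumptions for a toy language, not part of the original notes:

```python
import re

# Token classes for a toy language; names and patterns are illustrative
# assumptions, not a real compiler's specification.
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("ID",     r"[A-Za-z_]\w*"),
    ("ASSIGN", r"="),
    ("OP",     r"[+\-*/]"),
    ("SEMI",   r";"),
    ("SKIP",   r"\s+"),
]
PATTERN = "|".join(f"(?P<{name}>{regex})" for name, regex in TOKEN_SPEC)

def tokenize(source):
    """Read characters left to right and form (token type, lexeme) pairs."""
    tokens = []
    for match in re.finditer(PATTERN, source):
        if match.lastgroup != "SKIP":      # whitespace only separates tokens
            tokens.append((match.lastgroup, match.group()))
    return tokens

print(tokenize("rate = rate + 60;"))
```

Running this on the statement above yields the pairs ('ID', 'rate'), ('ASSIGN', '='), ('ID', 'rate'), ('OP', '+'), ('NUMBER', '60'), ('SEMI', ';'): each lexeme paired with its token type.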
Syntax analysis: It is the second phase of the compiler and is also known as the parser. It gets the token stream as input from the lexical analyser and generates a syntax tree as output.

Syntax tree: It is a tree in which the interior nodes are operators and the leaves are operands.

Semantic analysis: It is the third phase of the compiler. It takes the parse tree produced by syntax analysis as input and checks whether constructs that are syntactically correct are also meaningful. It also performs type conversion where needed, for example converting integer operands to real in a mixed-type expression.
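The syntax tree described above (interior nodes operators, leaves operands) can be sketched with a minimal node class; the class and its method names are assumptions for illustration:

```python
class Node:
    """A syntax tree node: interior nodes hold operators, leaves hold operands."""
    def __init__(self, label, children=()):
        self.label = label
        self.children = list(children)

    def preorder(self):
        """List the labels top-down, to show the tree's shape."""
        labels = [self.label]
        for child in self.children:
            labels.extend(child.preorder())
        return labels

# The tree for  a + b * c :  '*' binds tighter, so it sits below '+'.
tree = Node("+", [Node("a"), Node("*", [Node("b"), Node("c")])])
print(tree.preorder())
```

The preorder walk prints '+', 'a', '*', 'b', 'c', reflecting that the addition is the root and the multiplication is its right subtree.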
Intermediate code generation: It is the fourth phase of the compiler. It takes the output of semantic analysis and produces an intermediate representation such as three-address code. Three-address code consists of a sequence of instructions, each of which has at most three operands.

Code optimization: It is the fifth phase of the compiler. It gets the intermediate code as input and produces optimized intermediate code as output.
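The three-address code mentioned above can be sketched for a small assignment; the temporary-name convention (t1, t2, ...) is an assumption:

```python
# Sketch: generate three-address code for  a = b * c + d.
# Each emitted instruction has at most three operands (two sources,
# one destination held in a compiler-generated temporary).
code = []
temp_count = 0

def new_temp():
    global temp_count
    temp_count += 1
    return f"t{temp_count}"

def emit(left, op, right):
    """Emit one instruction and return the temporary holding its result."""
    dest = new_temp()
    code.append(f"{dest} = {left} {op} {right}")
    return dest

t = emit("b", "*", "c")       # t1 = b * c   (innermost operation first)
t = emit(t, "+", "d")         # t2 = t1 + d
code.append(f"a = {t}")       # a = t2
print("\n".join(code))
```

The single source expression is broken into a sequence of simple instructions, which is exactly what makes the representation convenient for the optimization phase that follows.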
This phase reduces redundant code and attempts to improve the intermediate code so that faster-running machine code will result. Code optimization does not change the result of the program. Typical improvements include the detection and removal of dead (unreachable) code.

Code generation: It is the final phase of the compiler.
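One optimization of the kind described above, constant folding, can be sketched on three-address instructions; the (dest, left, op, right) tuple format is an assumed representation:

```python
import operator

# Constant folding: any instruction whose operands are both known at
# compile time is computed now, so no machine code is needed for it.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def fold_constants(instructions):
    """Replace constant-operand instructions with their computed value."""
    folded = []
    for dest, left, op, right in instructions:
        if isinstance(left, int) and isinstance(right, int):
            folded.append((dest, OPS[op](left, right), None, None))
        else:
            folded.append((dest, left, op, right))
    return folded

# t1 = 60 * 2 becomes t1 = 120; t2 still depends on a runtime value.
print(fold_constants([("t1", 60, "*", 2), ("t2", "rate", "+", "t1")]))
```

Note that the program's result is unaffected: only work that could be done at compile time has been moved out of the generated code.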
It gets input from the code optimization phase and produces the target code (object code) as its result. Intermediate instructions are translated into a sequence of machine instructions that perform the same task. Code generation involves: allocation of registers and memory, generation of correct references, generation of correct data types, and generation of any missing code.
Symbol table management: The symbol table is used to store all the information about the identifiers used in the program. It is a data structure containing a record for each identifier, with fields for the attributes of the identifier. It allows the compiler to find the record for each identifier quickly and to store or retrieve data from that record. Whenever an identifier is detected in any of the phases, it is stored in the symbol table.

Error handling: Each phase can encounter errors. After detecting an error, a phase must handle it so that compilation can proceed.
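The symbol table described above can be sketched as a dictionary of per-identifier records; the attribute names used here are assumptions for illustration:

```python
# A toy symbol table: one record per identifier, keyed by name so any
# phase can find or update the record quickly. Field names are assumed.
symbol_table = {}

def enter(name, **attributes):
    """Enter an identifier if not already present, then record attributes."""
    record = symbol_table.setdefault(name, {})
    record.update(attributes)
    return record

enter("rate")                    # lexical analysis first sees the identifier
enter("rate", type="float")      # semantic analysis adds its type
enter("rate", offset=8)          # code generation adds storage information
print(symbol_table["rate"])
```

The same record accumulates attributes as successive phases learn more about the identifier, which is why a single shared table serves the whole compiler.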
In lexical analysis, errors occur during the separation of tokens. In syntax analysis, errors occur during the construction of the syntax tree. In semantic analysis, errors occur when the compiler detects constructs that have the right syntactic structure but no meaning, and during type conversion. In code optimization, errors occur if an optimization would change the result of the program.
In code generation, errors occur when code is missing, when references cannot be generated correctly, and so on. (A figure in the original notes shows the representation of an example statement after each phase.)

What are the cousins of a compiler? Explain them in detail.

The cousins of the compiler are the Preprocessor, Assembler, Loader and Link-editor.

Preprocessor: A preprocessor produces input for the compiler. The output is said to be a preprocessed form of the input data, which is then consumed by the compiler. Preprocessors may perform the following functions:

Macro processing: A preprocessor may allow a user to define macros that are shorthands for longer constructs.

File inclusion: A preprocessor includes header files into the program text.
When the preprocessor finds an include directive, it replaces it with the entire content of the specified file.

Rational preprocessors: These processors augment older languages with more modern flow-of-control and data-structuring facilities.

Language extension: These processors attempt to add capabilities to the language by what amounts to built-in macros. For example, the language Equel is a database query language embedded in C.
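The file-inclusion behaviour described above can be sketched as plain textual replacement; a dictionary stands in for the file system, and the directive syntax follows C:

```python
import re

# Toy preprocessor: every  #include "name"  directive is replaced by
# the full text of that "file". The files dict is a stand-in assumption.
files = {"defs.h": "MAX = 100"}

def preprocess(source):
    """Replace each include directive with the named file's contents."""
    return re.sub(r'#include "([^"]+)"', lambda m: files[m.group(1)], source)

print(preprocess('#include "defs.h"\nprint(MAX)'))
```

After preprocessing, the directive line has been replaced by the included text, so the compiler proper never sees the directive at all.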
Assembler: An assembler translates assembly language into machine code. There are two types of assemblers: one-pass assemblers go through the source code once and assume that all symbols will be defined before any instruction that references them; two-pass assemblers build a table of all symbols and their values in the first pass, and then use that table in a second pass to generate code.

Linker: A linker (or link editor) is a program that takes one or more object files generated by a compiler and combines them into a single executable. The three tasks of the linker are: it searches the program to find the library routines it uses (for example, printf);
it determines the memory locations that code from each module will occupy, relocating instructions by adjusting absolute references; and it resolves references among files.

Loader: A loader is the part of an operating system that is responsible for loading programs into memory and preparing them for execution.

Briefly explain compiler construction tools.
The following are the compiler construction tools:

Parser generators: These produce syntax analyzers, normally from input that is based on a context-free grammar. Syntax analysis consumes a large fraction of the running time of a compiler.

Scanner generators: These generate lexical analyzers, normally from a specification based on regular expressions.
The basic organization of the resulting lexical analyzer is based on finite automata.

Syntax-directed translation engines: These produce routines that walk the parse tree and generate intermediate code. Each translation is defined in terms of the translations at neighbouring nodes in the tree.

Automatic code generators: These take a collection of rules that translate the intermediate language into machine language. The rules must include sufficient detail to handle the different possible access methods for data.
Data-flow engines: These perform code optimization using data-flow analysis, that is, the gathering of information about how values are transmitted from one part of a program to every other part.

What is a lexical analyzer?

A program or function which performs lexical analysis is called a lexical analyzer or scanner. A lexer often exists as a single function which is called by a parser or another function.
Its main task is to read the input characters and produce as output a sequence of tokens that the parser uses for syntax analysis. Lexical analysis is separated from syntax analysis for the following reasons: to make the design simpler; to improve the efficiency of the compiler; and to enhance compiler portability.

A token is a string of characters, categorized according to the rules as a symbol (for example, an identifier or an operator). The process of forming tokens from an input stream of characters is called tokenization. A token can look like anything that is useful for processing an input text stream or text file.
Examples of lexemes and their token types:

    Lexeme    Token type
    =         Assignment operator
    +         Addition operator
    ;         End of statement

The collection or group of characters forming a token is called a lexeme. For identifiers and some other tokens, the pattern is a more complex structure that is matched by many strings.
Attributes for tokens: Some tokens have attributes that can be passed back to the parser. The lexical analyzer collects information about tokens and stores it in their associated attributes. The attributes influence the translation of tokens.
Identifiers: pointer to the corresponding symbol table entry. Explain the error recovery strategies in lexical analysis in detail.
Note for Compiler Design - CD By Dr. D. Jagadeesan
Friday, 21 February. CS2352 Principles of Compiler Design: 2-mark questions.

What is a Compiler?
Cs2352 Pcd Unit1 Notes