Lexical Analysis (MCQs)

What is the primary role of lexical analysis in a compiler?

a) Parsing syntax errors
b) Generating machine code
c) Tokenizing the source code
d) Optimizing intermediate code
Answer: c) Tokenizing the source code
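
As a rough illustration of option (c), here is a minimal tokenizer sketch in Python. The token names and patterns (NUMBER, IDENT, OP, SKIP) are illustrative assumptions, not tied to any particular language.

```python
import re

# Illustrative token specification: (token type, regular expression).
TOKEN_SPEC = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("SKIP",   r"\s+"),          # whitespace is recognized but not emitted
]
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (token type, lexeme) pairs for the given source string."""
    for match in MASTER_RE.finditer(source):
        if match.lastgroup != "SKIP":
            yield match.lastgroup, match.group()

print(list(tokenize("count = count + 1")))
# [('IDENT', 'count'), ('OP', '='), ('IDENT', 'count'), ('OP', '+'), ('NUMBER', '1')]
```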

Which component of the compiler is responsible for lexical analysis?

a) Parser
b) Lexical Analyzer
c) Code Generator
d) Optimizer
Answer: b) Lexical Analyzer

In lexical analysis, what is a “token”?

a) A sequence of characters
b) A basic unit of syntax
c) An intermediate code representation
d) An error message
Answer: b) A basic unit of syntax

What type of information does a lexical analyzer produce?

a) Syntax tree
b) Machine code
c) Token stream
d) Symbol table
Answer: c) Token stream

Which of the following is not typically a function of lexical analysis?

a) Removing comments and whitespace
b) Tokenizing input text
c) Generating assembly code
d) Identifying keywords and identifiers
Answer: c) Generating assembly code

What is a “lexeme”?

a) A type of token
b) A sequence of characters that form a token
c) A syntax error
d) A type of grammar rule
Answer: b) A sequence of characters that form a token
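
To make the token/lexeme distinction concrete: the token is the category, the lexeme is the actual character sequence that was matched. A small hypothetical listing (the token names are assumptions):

```python
# Hypothetical pairs a scanner might produce for: total = total + 42
pairs = [
    ("IDENT",  "total"),   # token IDENT, lexeme "total"
    ("ASSIGN", "="),
    ("IDENT",  "total"),
    ("PLUS",   "+"),
    ("NUMBER", "42"),      # token NUMBER, lexeme "42"
]
for token, lexeme in pairs:
    print(f"{token:>7}  lexeme={lexeme!r}")
```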

Which data structure is commonly used to store tokens during lexical analysis?

a) Stack
b) Queue
c) Array
d) Linked list
Answer: c) Array

What is the purpose of a finite state machine (FSM) in lexical analysis?

a) To generate machine code
b) To parse tokens into syntax trees
c) To recognize patterns and sequences in the source code
d) To optimize intermediate code
Answer: c) To recognize patterns and sequences in the source code
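
A sketch of pattern recognition with a finite state machine: a two-state DFA that accepts identifiers of the assumed form letter-or-underscore followed by letters, digits, or underscores.

```python
def is_identifier(text):
    """Two-state DFA: START -> IN_IDENT on a letter or underscore;
    IN_IDENT loops on letters, digits, and underscores."""
    state = "START"
    for ch in text:
        if state == "START":
            if ch.isalpha() or ch == "_":
                state = "IN_IDENT"
            else:
                return False
        else:  # state == "IN_IDENT"
            if not (ch.isalnum() or ch == "_"):
                return False
    return state == "IN_IDENT"

print(is_identifier("rate1"))   # True
print(is_identifier("1rate"))   # False
```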

Which phase of compilation processes whitespace and comments in the source code?

a) Syntax Analysis
b) Lexical Analysis
c) Semantic Analysis
d) Code Generation
Answer: b) Lexical Analysis

What is the role of regular expressions in lexical analysis?

a) They define the syntax rules of the language
b) They describe the patterns for tokens
c) They generate the abstract syntax tree
d) They optimize code execution
Answer: b) They describe the patterns for tokens
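
Some textbook-style examples of regular expressions describing token patterns; the exact patterns vary by language, so these are assumptions for illustration.

```python
import re

PATTERNS = {
    "IDENTIFIER": r"[A-Za-z_][A-Za-z0-9_]*",
    "INT_LIT":    r"[0-9]+",
    "FLOAT_LIT":  r"[0-9]+\.[0-9]+",
    "STRING_LIT": r'"(\\.|[^"\\])*"',
}

for sample in ["rate", "42", "3.14", '"hi\\n"']:
    kinds = [name for name, pat in PATTERNS.items() if re.fullmatch(pat, sample)]
    print(f"{sample!r} -> {kinds}")
# 'rate' -> ['IDENTIFIER'], '42' -> ['INT_LIT'], '3.14' -> ['FLOAT_LIT'], '"hi\\n"' -> ['STRING_LIT']
```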

Which of the following is an example of a token type?

a) Integer
b) Array
c) Stack
d) Graph
Answer: a) Integer

What does a lexical analyzer use to differentiate between keywords and identifiers?

a) Syntax rules
b) Regular expressions
c) Context-sensitive rules
d) Machine code
Answer: b) Regular expressions
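
In practice, keywords are either given their own higher-priority regular expressions or, as in the sketch below, recognized via the general identifier pattern plus a keyword lookup. The keyword set here is a small assumption for illustration.

```python
import re

KEYWORDS = {"if", "else", "while", "return"}   # hypothetical keyword set
IDENT_RE = re.compile(r"[A-Za-z_]\w*")

def classify(lexeme):
    """Label a lexeme matching the identifier pattern as KEYWORD or IDENTIFIER."""
    if IDENT_RE.fullmatch(lexeme):
        return "KEYWORD" if lexeme in KEYWORDS else "IDENTIFIER"
    return "UNKNOWN"

print(classify("while"))    # KEYWORD
print(classify("whilst"))   # IDENTIFIER
```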

In lexical analysis, what is the purpose of tokenization?

a) To create a syntax tree
b) To generate machine code
c) To break the input source code into meaningful symbols
d) To optimize the performance of the program
Answer: c) To break the input source code into meaningful symbols

Which of the following is a common output of lexical analysis?

a) Abstract syntax tree
b) Token stream
c) Intermediate code
d) Object code
Answer: b) Token stream

How does a lexical analyzer handle multi-character operators like <= or !=?

a) By treating them as separate tokens
b) By combining them into a single token
c) By ignoring them
d) By converting them to machine code
Answer: b) By combining them into a single token
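
The usual mechanism behind this answer is longest match ("maximal munch"): the scanner tries the longest operator first, so <= becomes one token rather than < followed by =. A sketch, assuming a small operator table:

```python
# Longest-match ("maximal munch") over a small, assumed operator table.
OPERATORS = ["<=", ">=", "==", "!=", "<", ">", "=", "!"]   # longest entries first

def scan_operator(text, pos):
    """Return (operator token, new position), or (None, pos) if no operator starts at pos."""
    for op in OPERATORS:
        if text.startswith(op, pos):
            return op, pos + len(op)
    return None, pos

print(scan_operator("a <= b", 2))   # ('<=', 4)  -- one token, not '<' then '='
print(scan_operator("a < b", 2))    # ('<', 3)
```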

Which component is responsible for matching regular expressions to the input stream in lexical analysis?

a) Syntax analyzer
b) Finite state machine
c) Code generator
d) Optimizer
Answer: b) Finite state machine

What does a lexical analyzer do with invalid tokens?

a) Ignore them
b) Report a lexical error
c) Replace them with valid tokens
d) Convert them to machine code
Answer: b) Report a lexical error
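
A minimal sketch of how a scanner might report such an error; the exception type, patterns, and message wording are assumptions for illustration.

```python
import re

TOKEN_RE = re.compile(r"\d+|[A-Za-z_]\w*|[+\-*/=()]|\s+")

class LexicalError(Exception):
    """Raised when no token pattern matches the input."""

def tokenize(source):
    pos = 0
    while pos < len(source):
        match = TOKEN_RE.match(source, pos)
        if not match:
            # No pattern matches: report a lexical error at the offending position.
            raise LexicalError(f"invalid character {source[pos]!r} at position {pos}")
        if not match.group().isspace():
            yield match.group()
        pos = match.end()

print(list(tokenize("x = 1 + 2")))      # ['x', '=', '1', '+', '2']
# list(tokenize("x = 1 @ 2")) would raise: LexicalError: invalid character '@' at position 6
```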

What is the role of a symbol table in lexical analysis?

a) To store token patterns
b) To manage variable names and their attributes
c) To generate intermediate code
d) To parse syntax trees
Answer: b) To manage variable names and their attributes
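
A minimal symbol table sketch: a dictionary from names to attribute records. The attribute fields (kind, type, declaration line) are assumptions for illustration.

```python
# Minimal symbol table: identifier name -> attribute record.
symbol_table = {}

def install(name, kind, typ, line):
    """Insert the identifier if it is not already present and return its entry."""
    if name not in symbol_table:
        symbol_table[name] = {"kind": kind, "type": typ, "declared_at": line}
    return symbol_table[name]

install("count", kind="variable", typ="int", line=3)
install("main",  kind="function", typ="void", line=1)
print(symbol_table["count"])   # {'kind': 'variable', 'type': 'int', 'declared_at': 3}
```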

Which of the following is a common tool used for lexical analysis?

a) Bison
b) Yacc
c) Flex
d) Gcc
Answer: c) Flex

What is the main purpose of removing comments and whitespace during lexical analysis?

a) To improve code readability
b) To simplify the token stream for further processing
c) To optimize the source code
d) To generate machine code
Answer: b) To simplify the token stream for further processing
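
A sketch of removing comments before tokenization, assuming C-style // and /* ... */ comment syntax:

```python
import re

LINE_COMMENT  = re.compile(r"//[^\n]*")
BLOCK_COMMENT = re.compile(r"/\*.*?\*/", re.DOTALL)

def strip_comments(source):
    """Blank out comments so only meaningful characters reach the tokenizer."""
    source = BLOCK_COMMENT.sub(" ", source)
    source = LINE_COMMENT.sub(" ", source)
    return source

print(strip_comments("x = 1; /* init */ y = 2; // trailing note"))
# comments are gone, leaving: 'x = 1;   y = 2;  '
```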

Which phase of lexical analysis is responsible for handling identifiers?

a) Tokenization
b) Pattern matching
c) Symbol management
d) Error handling
Answer: c) Symbol management

What does a lexical analyzer do when it encounters a string literal?

a) Generates a syntax error
b) Converts it into a token representing the string
c) Replaces it with a regular expression
d) Ignores it
Answer: b) Converts it into a token representing the string

How does a lexical analyzer typically handle reserved keywords?

a) Treats them as special tokens
b) Converts them to identifiers
c) Ignores them
d) Translates them into machine code
Answer: a) Treats them as special tokens

Which of the following is a key feature of a lexical analyzer?

a) Syntax checking
b) Token generation
c) Semantic analysis
d) Intermediate code generation
Answer: b) Token generation

What is the role of a regular expression in defining token patterns?

a) To create machine code
b) To specify how tokens should be recognized
c) To generate syntax trees
d) To optimize intermediate code
Answer: b) To specify how tokens should be recognized

What does “lexical scope” refer to in lexical analysis?

a) The region of the program text in which a declared name is visible
b) The depth of the syntax tree
c) The extent of memory usage
d) The range of regular expressions used
Answer: a) The region of the program text in which a declared name is visible

Which of the following is not typically handled by lexical analysis?

a) Identifiers
b) Keywords
c) Syntax trees
d) Operators
Answer: c) Syntax trees

What does the “scanner” refer to in lexical analysis?

a) The component that performs syntax checking
b) The tool that converts tokens into machine code
c) The lexical analyzer that reads and tokenizes the input
d) The phase that performs semantic analysis
Answer: c) The lexical analyzer that reads and tokenizes the input

Which type of language construct is directly managed by lexical analysis?

a) Control flow statements
b) Function definitions
c) Basic tokens like keywords and operators
d) Data structures
Answer: c) Basic tokens like keywords and operators

How does lexical analysis handle multi-line comments?

a) Converts them into tokens
b) Ignores them and removes them from the source code
c) Generates syntax errors
d) Translates them into intermediate code
Answer: b) Ignores them and removes them from the source code

What is the main challenge in lexical analysis when dealing with numeric literals?

a) Differentiating between integer and floating-point literals
b) Generating machine code
c) Parsing complex expressions
d) Handling syntax errors
Answer: a) Differentiating between integer and floating-point literals
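
The usual resolution is pattern ordering with longest match: try the floating-point pattern before the integer pattern so "3.14" is not split apart. A sketch with assumed patterns:

```python
import re

# FLOAT is tried before INT so "3.14" is not scanned as INT '3', '.', INT '14'.
NUMBER_RE = re.compile(r"(?P<FLOAT>\d+\.\d+)|(?P<INT>\d+)")

def classify_number(lexeme):
    match = NUMBER_RE.fullmatch(lexeme)
    return match.lastgroup if match else None

print(classify_number("42"))     # INT
print(classify_number("3.14"))   # FLOAT
```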

What role does a “lexer” play in lexical analysis?

a) It parses tokens into syntax trees
b) It generates intermediate code
c) It performs tokenization and produces tokens
d) It performs code optimization
Answer: c) It performs tokenization and produces tokens

Which of the following is a common output format of lexical analysis?

a) Assembly code
b) Token stream
c) Abstract syntax tree
d) Intermediate code
Answer: b) Token stream

Which type of errors does lexical analysis primarily handle?

a) Syntax errors
b) Semantic errors
c) Lexical errors
d) Runtime errors
Answer: c) Lexical errors

What is a “token stream” in the context of lexical analysis?

a) A sequence of machine code instructions
b) A list of recognized tokens and their attributes
c) A sequence of syntax errors
d) A set of intermediate representations
Answer: b) A list of recognized tokens and their attributes

How does a lexical analyzer treat numerical constants?

a) Converts them into tokens representing their values
b) Ignores them
c) Parses them into syntax trees
d) Generates error messages
Answer: a) Converts them into tokens representing their values

What is the purpose of “token classification” in lexical analysis?

a) To generate syntax trees
b) To categorize tokens into types such as keywords, identifiers, or literals
c) To optimize intermediate code
d) To perform code generation
Answer: b) To categorize tokens into types such as keywords, identifiers, or literals

Which tool generates a lexical analyzer from regular-expression specifications?

a) Yacc
b) Flex
c) Bison
d) Gcc
Answer: b) Flex

What is the main purpose of handling escape sequences in lexical analysis?

a) To handle special characters within string literals
b) To generate machine code
c) To parse complex expressions
d) To create syntax trees
Answer: a) To handle special characters within string literals
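
A sketch of recognizing a string literal whose body may contain escape sequences and decoding a few common ones; the escape table here is a simplified assumption.

```python
import re

# String literal: quote, then escaped characters or anything except quote/backslash, then quote.
STRING_RE = re.compile(r'"((?:\\.|[^"\\])*)"')
ESCAPES = {"n": "\n", "t": "\t", '"': '"', "\\": "\\"}   # simplified escape table

def decode_string(literal):
    body = STRING_RE.fullmatch(literal).group(1)
    return re.sub(r"\\(.)", lambda m: ESCAPES.get(m.group(1), m.group(1)), body)

print(repr(decode_string(r'"line1\nline2"')))   # 'line1\nline2' -- the \n becomes a real newline
```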

Which phase directly precedes syntax analysis in the compilation process?

a) Semantic analysis
b) Code optimization
c) Lexical analysis
d) Code generation
Answer: c) Lexical analysis

In lexical analysis, what is meant by “lexical unit”?

a) A part of the machine code
b) A sequence of characters that form a token
c) A type of syntax error
d) An intermediate code representation
Answer: b) A sequence of characters that form a token

How does lexical analysis handle nested comments in programming languages?

a) Treats them as nested tokens
b) Ignores them as invalid syntax
c) Removes them and processes the remaining code
d) Converts them to machine code
Answer: c) Removes them and processes the remaining code
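
Regular expressions alone cannot match arbitrarily nested comments, so scanners for languages that allow nesting typically keep a depth counter. A sketch, assuming /* ... */ delimiters:

```python
def strip_nested_comments(source, open_tok="/*", close_tok="*/"):
    """Remove nested block comments by tracking nesting depth with a counter."""
    out, depth, i = [], 0, 0
    while i < len(source):
        if source.startswith(open_tok, i):
            depth += 1
            i += len(open_tok)
        elif depth and source.startswith(close_tok, i):
            depth -= 1
            i += len(close_tok)
        else:
            if depth == 0:
                out.append(source[i])
            i += 1
    return "".join(out)

print(strip_nested_comments("a /* outer /* inner */ still outer */ b"))   # 'a  b'
```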

Which aspect of source code is not directly analyzed by the lexical analyzer?

a) Variable names
b) Function names
c) Syntax rules
d) Operators
Answer: c) Syntax rules

What is the primary function of the “token table” in lexical analysis?

a) To store machine code
b) To keep track of recognized tokens and their attributes
c) To generate abstract syntax trees
d) To optimize intermediate code
Answer: b) To keep track of recognized tokens and their attributes

What does a lexical analyzer do with identifiers it encounters?

a) Converts them to numeric values
b) Categorizes them and stores them in a symbol table
c) Generates syntax trees
d) Translates them to machine code
Answer: b) Categorizes them and stores them in a symbol table

What is an example of a token type handled by a lexical analyzer?

a) Variable declaration
b) Expression evaluation
c) Operator
d) Function call
Answer: c) Operator

What does a “lexical error” typically indicate?

a) Incorrect syntax
b) Invalid or unrecognized token
c) Semantic mismatch
d) Runtime exception
Answer: b) Invalid or unrecognized token

How does lexical analysis contribute to error reporting?

a) By generating runtime error messages
b) By providing detailed syntax error reports
c) By identifying and reporting invalid tokens
d) By optimizing code for performance
Answer: c) By identifying and reporting invalid tokens

Which tool is used to specify the lexical rules of a language in a more readable format?

a) Yacc
b) Bison
c) Lex
d) Gcc
Answer: c) Lex

What is the role of “lookahead” in lexical analysis?

a) To predict future tokens
b) To determine the context of the current token
c) To handle multi-character tokens
d) To generate syntax trees
Answer: c) To handle multi-character tokens
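
A concrete case of lookahead, complementing the maximal-munch sketch earlier: after reading < or >, the scanner peeks one character ahead to decide between the one- and two-character operator.

```python
def scan_relational(text, pos):
    """Use one character of lookahead to choose between '<' and '<=' (or '>' and '>=')."""
    ch = text[pos]
    if ch in "<>":
        nxt = text[pos + 1] if pos + 1 < len(text) else ""
        if nxt == "=":            # lookahead confirms a two-character operator
            return ch + "=", pos + 2
        return ch, pos + 1        # otherwise emit the single-character token
    return None, pos

print(scan_relational("a <= b", 2))   # ('<=', 4)
print(scan_relational("a < b", 2))    # ('<', 3)
```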