Lexical analysis is the process of converting the stream of characters that makes up a source program into a sequence of tokens. The lexical analyzer (or scanner) reads the input character by character, groups characters into lexemes, and emits a token for each lexeme. Along the way it typically strips out whitespace and comments, enters identifiers into a symbol table, correlates error messages with source line numbers, and may expand macros. Separating scanning from parsing in this way simplifies compiler design and improves efficiency. Token classes such as keywords, identifiers, numbers, operators and punctuation are specified by patterns (commonly regular expressions), and the scanner must also handle practical issues such as lookahead (for example, deciding whether `>` begins `>=`) and ambiguities (for example, a keyword that also matches the identifier pattern).
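The ideas above can be illustrated with a minimal scanner sketch. This is not the implementation of any particular compiler; the token classes, keyword set, and patterns below are illustrative assumptions. Note how whitespace and comments are matched but discarded, how the keyword/identifier ambiguity is resolved after matching, and how greedy regex matching provides the one-character lookahead needed to prefer `>=` over `>`.

```python
import re

# Illustrative token specification: pattern order matters, so COMMENT is
# tried before OP (otherwise "//" would match as an operator).
TOKEN_SPEC = [
    ("COMMENT",  r"//[^\n]*"),        # comments are stripped
    ("NUMBER",   r"\d+(\.\d+)?"),
    ("IDENT",    r"[A-Za-z_]\w*"),
    ("OP",       r"[+\-*/=<>!]+"),    # greedy match gives ">=" over ">"
    ("PUNCT",    r"[(){};,]"),
    ("SKIP",     r"[ \t\n]+"),        # whitespace is stripped
    ("MISMATCH", r"."),
]
KEYWORDS = {"if", "else", "while", "return"}  # hypothetical keyword set

MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Scan `source`, grouping characters into lexemes and returning tokens."""
    tokens = []
    for m in MASTER.finditer(source):
        kind, lexeme = m.lastgroup, m.group()
        if kind in ("SKIP", "COMMENT"):
            continue  # whitespace and comments never reach the parser
        if kind == "MISMATCH":
            raise SyntaxError(f"unexpected character {lexeme!r}")
        if kind == "IDENT" and lexeme in KEYWORDS:
            kind = "KEYWORD"  # resolve the keyword/identifier ambiguity
        tokens.append((kind, lexeme))
    return tokens

print(tokenize("if (x1 >= 42) return x1; // bounds check"))
```

Running this prints the token stream `[('KEYWORD', 'if'), ('PUNCT', '('), ('IDENT', 'x1'), ('OP', '>='), ('NUMBER', '42'), ('PUNCT', ')'), ('KEYWORD', 'return'), ('IDENT', 'x1'), ('PUNCT', ';')]`, with the comment and all whitespace removed. A production scanner would usually be generated from such patterns by a tool like Lex/Flex rather than written by hand.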