Lexical Analysis
Scanning is the first phase of a compiler, in which the source program is read character by character and grouped into tokens. A token is defined as a sequence of characters with a collective meaning. Tokens include identifiers, keywords, operators, punctuation symbols, constants, and so on. The input is a program written in a high-level language, and the output is a stream of tokens. Regular expressions can be used to implement this token separation.
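The regular-expression approach above can be sketched in Python as follows. The token names and patterns here are illustrative assumptions, not taken from this repository's code: each token class gets a named regex, the alternatives are combined into one master pattern, and matching left to right groups the character stream into tokens.

```python
import re

# Hypothetical token specification (illustrative only): each entry is a
# token class name and the regular expression that recognizes its lexemes.
# Order matters: KEYWORD must come before IDENTIFIER in the alternation.
TOKEN_SPEC = [
    ("KEYWORD",     r"\b(?:int|float|if|else|while|return)\b"),
    ("IDENTIFIER",  r"[A-Za-z_][A-Za-z0-9_]*"),
    ("CONSTANT",    r"\d+(?:\.\d+)?"),
    ("OPERATOR",    r"[+\-*/=<>!]=?"),
    ("PUNCTUATION", r"[(){};,]"),
    ("SKIP",        r"\s+"),   # whitespace: matched but discarded
]

# Combine all patterns into a single master regex with named groups.
MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Read the source character by character (via the regex engine)
    and group the characters into (token_type, lexeme) pairs."""
    tokens = []
    for match in MASTER_RE.finditer(source):
        kind = match.lastgroup
        if kind != "SKIP":  # drop whitespace between tokens
            tokens.append((kind, match.group()))
    return tokens

print(tokenize("int count = 42;"))
```

Running this on the statement `int count = 42;` yields a keyword, an identifier, an operator, a constant, and a punctuation token, in that order.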
UsnikB/Token-Seperation