This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in the Python Developer's Guide.

Author: lukasz.langa
Recipients: lukasz.langa
Date: 2018-04-23.01:05:31
SpamBayes Score: -1.0
Marked as misclassified: Yes
Message-id: <[email protected]>
In-reply-to:
Content
### Diff between files

The unified diff between tokenize implementations is here:
https://gist.github.com/ambv/679018041d85dd1a7497e6d89c45fb86

It clocks in at 275 lines, but only because it includes context lines. The
actual diff is 175 lines long.
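For anyone wanting to reproduce that kind of count, here is a minimal sketch of how context lines inflate a unified diff, using Python's difflib. The file names and contents below are made up for illustration; they are not the actual tokenize sources:

```python
import difflib

# Two tiny illustrative "files" differing in a single line.
a = ["line1\n", "line2\n", "line3\n", "line4\n", "line5\n"]
b = ["line1\n", "line2 changed\n", "line3\n", "line4\n", "line5\n"]

# Default-style diff with 3 lines of context around each change.
with_context = list(difflib.unified_diff(a, b, "a.py", "b.py", n=3))

# n=0 drops the context, keeping only the changed lines themselves.
no_context = list(difflib.unified_diff(a, b, "a.py", "b.py", n=0))

print(len(with_context), len(no_context))
```

The same comparison on the command line would be `diff -u` versus `diff -U0`; the gap between the two counts is exactly the context overhead described above.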

To make it that small, I needed to move some insignificant bits in
Lib/tokenize.py.  This is what the other PR on this issue is about.
History

Date                 User          Action  Args
2018-04-23 01:05:31  lukasz.langa  set     recipients: + lukasz.langa
2018-04-23 01:05:31  lukasz.langa  set     messageid: <[email protected]>
2018-04-23 01:05:31  lukasz.langa  link    issue33338 messages
2018-04-23 01:05:31  lukasz.langa  create