This issue tracker has been migrated to GitHub, and is currently read-only.
For more information, see the GitHub FAQs in Python's Developer Guide.

Author vstinner
Recipients vstinner
Date 2008-10-02.14:11:12
Message-id <[email protected]>
Content
Ah! tokenize already has a detect_encoding() function. My new patch uses
it to avoid code duplication.
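For context, tokenize.detect_encoding() reads the encoding of a Python source file from its PEP 263 coding cookie or UTF-8 BOM, given a readline callable. A minimal sketch of how it can be called on an in-memory buffer (the sample source bytes here are illustrative, not from the patch):

```python
import io
import tokenize

# Illustrative source file with an explicit PEP 263 coding cookie.
source = b"# -*- coding: utf-8 -*-\nprint('hello')\n"

# detect_encoding() consumes at most two lines via the readline callable
# and returns the encoding name plus the raw lines it read.
encoding, consumed_lines = tokenize.detect_encoding(io.BytesIO(source).readline)
print(encoding)  # utf-8
```

Reusing this function, as the patch does, keeps the cookie/BOM parsing logic in one place instead of duplicating it.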
History
Date                 User      Action          Args
2008-10-02 14:11:14  vstinner  set recipients  + vstinner
2008-10-02 14:11:14  vstinner  set messageid   <[email protected]>
2008-10-02 14:11:13  vstinner  link            issue4008 messages
2008-10-02 14:11:13  vstinner  create