IndentationError regression in flake8 v3
processor.py replaced the token handling that was previously done in pycodestyle, but intentionally chose not to catch `SyntaxError`, which means an `IndentationError` now crashes flake8.
```python
def generate_tokens(self):
    """Tokenize the file and yield the tokens.

    :raises flake8.exceptions.InvalidSyntax:
        If a :class:`tokenize.TokenError` is raised while generating
        tokens.
    """
    try:
        for token in tokenize.generate_tokens(self.next_line):
            if token[2][0] > self.total_lines:
                break
            self.tokens.append(token)
            yield token
        # NOTE(sigmavirus24): pycodestyle was catching both a SyntaxError
        # and a tokenize.TokenError. In looking at the source on Python 2
        # and Python 3, the SyntaxError should never arise from
        # generate_tokens. If we were using tokenize.tokenize, we would
        # have to catch that. Of course, I'm going to be unsurprised to be
        # proven wrong at a later date.
    except tokenize.TokenError as exc:
        raise exceptions.InvalidSyntax(exc.message, exception=exc)
```
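The note can be disproven with only the standard library: `tokenize.generate_tokens` can raise `IndentationError`, which is a subclass of `SyntaxError` and not of `tokenize.TokenError`, so the `except` clause above never sees it. A minimal reproduction:

```python
import io
import tokenize

# Source with an inconsistent dedent: the last line's indentation does not
# match any outer indentation level.
bad_source = (
    "def f():\n"
    "        x = 1\n"
    "    y = 2\n"
)

try:
    for token in tokenize.generate_tokens(io.StringIO(bad_source).readline):
        pass
except IndentationError as exc:
    # IndentationError derives from SyntaxError, so catching only
    # tokenize.TokenError lets it escape.
    print(type(exc).__name__, issubclass(type(exc), SyntaxError))
    # prints: IndentationError True
```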
This note is very strange. At least in Python 3, all the work is done in `_tokenize`: `tokenize.generate_tokens` is the thinnest of layers over it, and `tokenize.tokenize` is also a very small wrapper.
```python
# An undocumented, backwards compatible, API for all the places in the standard
# library that expect to be able to use tokenize with strings
def generate_tokens(readline):
    return _tokenize(readline, None)
```
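To illustrate how thin both layers are, the following sketch compares the two public entry points: both drive the same `_tokenize` machinery, and `tokenize.tokenize` merely detects the encoding of a bytes stream first, emitting one extra `ENCODING` token:

```python
import io
import tokenize

src = "x = 1\n"

# generate_tokens takes a readline callable returning str.
str_tokens = list(tokenize.generate_tokens(io.StringIO(src).readline))

# tokenize.tokenize takes a readline callable returning bytes; it detects
# the encoding first, then hands off to the same _tokenize loop.
byte_tokens = list(tokenize.tokenize(io.BytesIO(src.encode()).readline))

print(byte_tokens[0].type == tokenize.ENCODING)   # True
print(len(byte_tokens) == len(str_tokens) + 1)    # True
```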
One kind of `SyntaxError` that is thrown by `_tokenize` is `IndentationError`, which is a subclass of `SyntaxError` rather than of `tokenize.TokenError`. Here is an example backtrace caused by not catching this syntax error:
```
$ flake8 .
Process Process-22:
Traceback (most recent call last):
  File "/opt/python/3.3.5/lib/python3.3/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/opt/python/3.3.5/lib/python3.3/multiprocessing/process.py", line 95, in run
    self._target(*self._args, **self._kwargs)
  File "/home/travis/virtualenv/python3.3.5/lib/python3.3/site-packages/flake8/checker.py", line 222, in _run_checks_from_queue
    checker.run_checks(self.results_queue, self.statistics_queue)
  File "/home/travis/virtualenv/python3.3.5/lib/python3.3/site-packages/flake8/checker.py", line 559, in run_checks
    self.process_tokens()
  File "/home/travis/virtualenv/python3.3.5/lib/python3.3/site-packages/flake8/checker.py", line 534, in process_tokens
    for token in file_processor.generate_tokens():
  File "/home/travis/virtualenv/python3.3.5/lib/python3.3/site-packages/flake8/processor.py", line 238, in generate_tokens
    for token in tokenize.generate_tokens(self.next_line):
  File "/home/travis/virtualenv/python3.3.5/lib/python3.3/tokenize.py", line 553, in _tokenize
    ("<tokenize>", lnum, pos, line))
  File "<tokenize>", line 456
    def test_parser_errors(self):
                                 ^
IndentationError: unindent does not match any outer indentation level
```
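A minimal sketch of the fix: catch `SyntaxError` alongside `tokenize.TokenError`, since `IndentationError` derives from the former. The names below are stand-ins (flake8 would raise its own `exceptions.InvalidSyntax`; a `ValueError` is substituted to keep the example self-contained):

```python
import io
import tokenize


def generate_tokens(readline):
    """Yield tokens, converting any tokenizer failure into one error type.

    Sketch only: flake8 would wrap these in exceptions.InvalidSyntax
    instead of ValueError.
    """
    try:
        for token in tokenize.generate_tokens(readline):
            yield token
    except (SyntaxError, tokenize.TokenError) as exc:
        # SyntaxError covers IndentationError (and TabError), so the crash
        # from the traceback above is now converted instead of escaping.
        raise ValueError("invalid syntax: %s" % (exc,))


bad = "def f():\n        x = 1\n    y = 2\n"
try:
    list(generate_tokens(io.StringIO(bad).readline))
except ValueError as exc:
    print("caught:", exc)
```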