Commits on Source (6)
Metadata-Version: 1.1
Metadata-Version: 2.1
Name: mypy
Version: 0.570
Version: 0.580
Summary: Optional static typing for Python
Home-page: http://www.mypy-lang.org/
Author: Jukka Lehtosalo
Author-email: jukka.lehtosalo@iki.fi
License: MIT License
Description-Content-Type: UNKNOWN
Description: Mypy -- Optional Static Typing for Python
=========================================
......@@ -27,3 +26,4 @@ Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Topic :: Software Development
Provides-Extra: dmypy
mypy (0.580-1) unstable; urgency=medium
* Team upload.
* New upstream version
* debhelper 11
* Add missing Depends: python3-distutils
Closes: #894075
* Docs moved to /usr/share/doc/mypy
-- Andreas Tille <tille@debian.org> Mon, 26 Mar 2018 13:20:59 +0200
mypy (0.570-1) unstable; urgency=medium
* New upstream version.
......
......@@ -3,7 +3,7 @@ Maintainer: Debian Med Packaging Team <debian-med-packaging@lists.alioth.debian.
Uploaders: Michael R. Crusoe <michael.crusoe@gmail.com>
Section: utils
Priority: optional
Build-Depends: debhelper (>= 10),
Build-Depends: debhelper (>= 11~),
dh-python,
flake8,
help2man,
......@@ -56,7 +56,8 @@ Package: python3-mypy
Architecture: all
Section: python
Depends: ${misc:Depends},
${python3:Depends}
${python3:Depends},
python3-distutils
Breaks: mypy (<< 0.540-2)
Replaces: mypy (<< 0.540-2)
Description: public modules for mypy (Python 3)
......
......@@ -5,5 +5,5 @@ Abstract: This is the reference documentation for mypy.
Section: Programming/Python
Format: HTML
Index: /usr/share/doc/mypy-doc/html/index.html
Files: /usr/share/doc/mypy-doc/html/*.html
Index: /usr/share/doc/mypy/html/index.html
Files: /usr/share/doc/mypy/html/*.html
......@@ -11,14 +11,17 @@ flag (or its long form ``--help``)::
usage: mypy [-h] [-v] [-V] [--python-version x.y] [--platform PLATFORM] [-2]
[--ignore-missing-imports]
[--follow-imports {normal,silent,skip,error}]
[--disallow-any-{unimported,expr,decorated,explicit,generics}]
[--disallow-untyped-calls] [--disallow-untyped-defs]
[--disallow-any-unimported] [--disallow-any-expr]
[--disallow-any-decorated] [--disallow-any-explicit]
[--disallow-any-generics] [--disallow-untyped-calls]
[--disallow-untyped-defs] [--disallow-incomplete-defs]
[--check-untyped-defs] [--disallow-subclassing-any]
[--warn-incomplete-stub] [--warn-redundant-casts]
[--no-warn-no-return] [--warn-return-any] [--warn-unused-ignores]
[--warn-incomplete-stub] [--disallow-untyped-decorators]
[--warn-redundant-casts] [--no-warn-no-return] [--warn-return-any]
[--warn-unused-ignores] [--warn-unused-configs]
[--show-error-context] [--no-implicit-optional] [-i]
[--quick-and-dirty] [--cache-dir DIR] [--skip-version-check]
[--strict-optional]
[--quick-and-dirty] [--cache-dir DIR] [--cache-fine-grained]
[--skip-version-check] [--strict-optional]
[--strict-optional-whitelist [GLOB [GLOB ...]]]
[--junit-xml JUNIT_XML] [--pdb] [--show-traceback] [--stats]
[--inferstats] [--custom-typing MODULE]
......@@ -28,9 +31,9 @@ flag (or its long form ``--help``)::
[--shadow-file SOURCE_FILE SHADOW_FILE] [--any-exprs-report DIR]
[--cobertura-xml-report DIR] [--html-report DIR]
[--linecount-report DIR] [--linecoverage-report DIR]
[--memory-xml-report DIR]
[--txt-report DIR] [--xml-report DIR] [--xslt-html-report DIR]
[--xslt-txt-report DIR] [-m MODULE] [-c PROGRAM_TEXT] [-p PACKAGE]
[--memory-xml-report DIR] [--txt-report DIR] [--xml-report DIR]
[--xslt-html-report DIR] [--xslt-txt-report DIR] [-m MODULE]
[-c PROGRAM_TEXT] [-p PACKAGE]
[files [files ...]]
(etc., too long to show everything here)
......@@ -100,7 +103,10 @@ and anything imported from there.) For example::
$ mypy -p html
will type check the entire ``html`` package (of library stubs).
will type check the entire ``html`` package (of library stubs). One can
specify multiple packages and modules on the command line, for example::
$ mypy --package p.a --package p.b --module c
Finally the flag ``-c`` (long form: ``--command``) will take a string
from the command line and type check it as a small program. For
......
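The multi-target support documented above (`--package p.a --package p.b --module c`) can be approximated with argparse's `append` action. This is an illustrative sketch only; mypy's real option parser lives in `mypy/main.py` and differs in detail:

```python
import argparse

# Hypothetical sketch of repeated -p/-m flags accumulating into lists.
parser = argparse.ArgumentParser(prog="mypy")
parser.add_argument('-p', '--package', action='append', default=[], dest='packages')
parser.add_argument('-m', '--module', action='append', default=[], dest='modules')

args = parser.parse_args(['--package', 'p.a', '--package', 'p.b', '--module', 'c'])
print(args.packages, args.modules)
```

Each repeated flag appends to its destination list, which matches the documented behavior of specifying several packages and modules in one run.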
......@@ -4,6 +4,12 @@ Revision history
List of major changes:
- March 2018
* Publish ``mypy`` version 0.580 on PyPI.
* Allow specifying multiple packages on the command line with ``-p`` and ``-m`` flags.
* Clarify that ``SupportsInt`` etc. don't support arithmetic operations.
* Publish ``mypy`` version 0.570 on PyPI.
* Add support for :ref:`attrs_package`.
......
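The changelog entry about `SupportsInt` can be illustrated at runtime: the protocol only promises `__int__`, so conversion works but arithmetic on the object itself need not. A minimal sketch using a hypothetical `Celsius` class:

```python
class Celsius:
    """Convertible to int (the SupportsInt shape) but supports no arithmetic."""
    def __init__(self, degrees: float) -> None:
        self.degrees = degrees

    def __int__(self) -> int:
        return int(self.degrees)

c = Celsius(21.5)
print(int(c) + 1)          # convert first, then add
try:
    c + 1                  # no __add__/__radd__ defined
except TypeError:
    print("Celsius supports int(), not +")
```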
Metadata-Version: 1.1
Metadata-Version: 2.1
Name: mypy
Version: 0.570
Version: 0.580
Summary: Optional static typing for Python
Home-page: http://www.mypy-lang.org/
Author: Jukka Lehtosalo
Author-email: jukka.lehtosalo@iki.fi
License: MIT License
Description-Content-Type: UNKNOWN
Description: Mypy -- Optional Static Typing for Python
=========================================
......@@ -27,3 +26,4 @@ Classifier: Programming Language :: Python :: 3.4
Classifier: Programming Language :: Python :: 3.5
Classifier: Programming Language :: Python :: 3.6
Classifier: Topic :: Software Development
Provides-Extra: dmypy
......@@ -118,6 +118,8 @@ mypy/server/astdiff.py
mypy/server/astmerge.py
mypy/server/aststrip.py
mypy/server/deps.py
mypy/server/mergecheck.py
mypy/server/objgraph.py
mypy/server/subexpr.py
mypy/server/target.py
mypy/server/trigger.py
......@@ -132,7 +134,6 @@ mypy/test/testcheck.py
mypy/test/testcmdline.py
mypy/test/testdeps.py
mypy/test/testdiff.py
mypy/test/testdmypy.py
mypy/test/testerrorstream.py
mypy/test/testextensions.py
mypy/test/testfinegrained.py
......@@ -156,7 +157,6 @@ mypy/test/update.py
scripts/dmypy
scripts/dumpmodule.py
scripts/find_type.py
scripts/finegrained.py
scripts/mypy
scripts/mypy.bat
scripts/myunit
......@@ -232,7 +232,6 @@ test-data/unit/check-classvar.test
test-data/unit/check-columns.test
test-data/unit/check-custom-plugin.test
test-data/unit/check-default-plugin.test
test-data/unit/check-dmypy-fine-grained.test
test-data/unit/check-dynamic-typing.test
test-data/unit/check-enum.test
test-data/unit/check-expressions.test
......@@ -495,16 +494,30 @@ typeshed/stdlib/2/distutils/emxccompiler.pyi
typeshed/stdlib/2/email/MIMEText.pyi
typeshed/stdlib/2/email/__init__.pyi
typeshed/stdlib/2/email/_parseaddr.pyi
typeshed/stdlib/2/email/base64mime.pyi
typeshed/stdlib/2/email/charset.pyi
typeshed/stdlib/2/email/encoders.pyi
typeshed/stdlib/2/email/feedparser.pyi
typeshed/stdlib/2/email/generator.pyi
typeshed/stdlib/2/email/header.pyi
typeshed/stdlib/2/email/iterators.pyi
typeshed/stdlib/2/email/message.pyi
typeshed/stdlib/2/email/parser.pyi
typeshed/stdlib/2/email/quoprimime.pyi
typeshed/stdlib/2/email/utils.pyi
typeshed/stdlib/2/email/mime/__init__.pyi
typeshed/stdlib/2/email/mime/application.pyi
typeshed/stdlib/2/email/mime/audio.pyi
typeshed/stdlib/2/email/mime/base.pyi
typeshed/stdlib/2/email/mime/image.pyi
typeshed/stdlib/2/email/mime/message.pyi
typeshed/stdlib/2/email/mime/multipart.pyi
typeshed/stdlib/2/email/mime/nonmultipart.pyi
typeshed/stdlib/2/email/mime/text.pyi
typeshed/stdlib/2/encodings/__init__.pyi
typeshed/stdlib/2/encodings/utf_8.pyi
typeshed/stdlib/2/multiprocessing/__init__.pyi
typeshed/stdlib/2/multiprocessing/pool.pyi
typeshed/stdlib/2/multiprocessing/process.pyi
typeshed/stdlib/2/multiprocessing/util.pyi
typeshed/stdlib/2/os/__init__.pyi
......
......@@ -3,7 +3,6 @@
import itertools
import fnmatch
from contextlib import contextmanager
import sys
from typing import (
Dict, Set, List, cast, Tuple, TypeVar, Union, Optional, NamedTuple, Iterator
......@@ -11,23 +10,19 @@ from typing import (
from mypy.errors import Errors, report_internal_error
from mypy.nodes import (
SymbolTable, Statement, MypyFile, Var, Expression, Lvalue,
SymbolTable, Statement, MypyFile, Var, Expression, Lvalue, Node,
OverloadedFuncDef, FuncDef, FuncItem, FuncBase, TypeInfo,
ClassDef, GDEF, Block, AssignmentStmt, NameExpr, MemberExpr, IndexExpr,
ClassDef, Block, AssignmentStmt, NameExpr, MemberExpr, IndexExpr,
TupleExpr, ListExpr, ExpressionStmt, ReturnStmt, IfStmt,
WhileStmt, OperatorAssignmentStmt, WithStmt, AssertStmt,
RaiseStmt, TryStmt, ForStmt, DelStmt, CallExpr, IntExpr, StrExpr,
BytesExpr, UnicodeExpr, FloatExpr, OpExpr, UnaryExpr, CastExpr, RevealTypeExpr, SuperExpr,
TypeApplication, DictExpr, SliceExpr, LambdaExpr, TempNode, SymbolTableNode,
Context, ListComprehension, ConditionalExpr, GeneratorExpr,
Decorator, SetExpr, TypeVarExpr, NewTypeExpr, PrintStmt,
LITERAL_TYPE, BreakStmt, PassStmt, ContinueStmt, ComparisonExpr, StarExpr,
YieldFromExpr, NamedTupleExpr, TypedDictExpr, SetComprehension,
DictionaryComprehension, ComplexExpr, EllipsisExpr, TypeAliasExpr,
RefExpr, YieldExpr, BackquoteExpr, Import, ImportFrom, ImportAll, ImportBase,
AwaitExpr, PromoteExpr, Node, EnumCallExpr,
ARG_POS, MDEF,
CONTRAVARIANT, COVARIANT, INVARIANT)
UnicodeExpr, OpExpr, UnaryExpr, LambdaExpr, TempNode, SymbolTableNode,
Context, Decorator, PrintStmt, BreakStmt, PassStmt, ContinueStmt,
ComparisonExpr, StarExpr, EllipsisExpr, RefExpr, PromoteExpr,
Import, ImportFrom, ImportAll, ImportBase,
ARG_POS, ARG_STAR, LITERAL_TYPE, MDEF, GDEF,
CONTRAVARIANT, COVARIANT, INVARIANT,
)
from mypy import nodes
from mypy.literals import literal, literal_hash
from mypy.typeanal import has_any_from_unimported_type, check_for_explicit_any
......@@ -35,7 +30,7 @@ from mypy.types import (
Type, AnyType, CallableType, FunctionLike, Overloaded, TupleType, TypedDictType,
Instance, NoneTyp, strip_type, TypeType, TypeOfAny,
UnionType, TypeVarId, TypeVarType, PartialType, DeletedType, UninhabitedType, TypeVarDef,
true_only, false_only, function_type, is_named_instance, union_items
true_only, false_only, function_type, is_named_instance, union_items,
)
from mypy.sametypes import is_same_type, is_same_types
from mypy.messages import MessageBuilder, make_inferred_type_note
......@@ -60,6 +55,7 @@ from mypy.meet import is_overlapping_types
from mypy.options import Options
from mypy.plugin import Plugin, CheckerPluginInterface
from mypy.sharedparse import BINARY_MAGIC_METHODS
from mypy.scope import Scope
from mypy import experiments
......@@ -136,7 +132,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
# Helper for type checking expressions
expr_checker = None # type: mypy.checkexpr.ExpressionChecker
scope = None # type: Scope
tscope = None # type: Scope
scope = None # type: CheckerScope
# Stack of function return types
return_types = None # type: List[Type]
# Flags; true for dynamically typed functions
......@@ -191,7 +188,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
self.msg = MessageBuilder(errors, modules)
self.plugin = plugin
self.expr_checker = mypy.checkexpr.ExpressionChecker(self, self.msg, self.plugin)
self.scope = Scope(tree)
self.tscope = Scope()
self.scope = CheckerScope(tree)
self.binder = ConditionalTypeBinder()
self.globals = tree.names
self.return_types = []
......@@ -216,6 +214,24 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
# for processing module top levels in fine-grained incremental mode.
self.recurse_into_functions = True
def reset(self) -> None:
"""Clean up stale state that might be left over from a typechecking run.
This allows us to reuse TypeChecker objects in fine-grained
incremental mode.
"""
# TODO: verify this is still actually worth it over creating new checkers
self.partial_reported.clear()
self.module_refs.clear()
self.binder = ConditionalTypeBinder()
self.type_map.clear()
assert self.inferred_attribute_types is None
assert self.partial_types == []
assert self.deferred_nodes == []
assert len(self.scope.stack) == 1
def check_first_pass(self) -> None:
"""Type check the entire file, but defer functions with unresolved references.
......@@ -228,7 +244,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
"""
self.recurse_into_functions = True
with experiments.strict_optional_set(self.options.strict_optional):
self.errors.set_file(self.path, self.tree.fullname())
self.errors.set_file(self.path, self.tree.fullname(), scope=self.tscope)
self.tscope.enter_file(self.tree.fullname())
with self.enter_partial_types():
with self.binder.top_frame_context():
for d in self.tree.defs:
......@@ -250,6 +267,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
self.fail(messages.ALL_MUST_BE_SEQ_STR.format(str_seq_s, all_s),
all_node)
self.tscope.leave()
def check_second_pass(self, todo: Optional[List[DeferredNode]] = None) -> bool:
"""Run second or following pass of type checking.
......@@ -259,7 +278,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
with experiments.strict_optional_set(self.options.strict_optional):
if not todo and not self.deferred_nodes:
return False
self.errors.set_file(self.path, self.tree.fullname())
self.errors.set_file(self.path, self.tree.fullname(), scope=self.tscope)
self.tscope.enter_file(self.tree.fullname())
self.pass_num += 1
if not todo:
todo = self.deferred_nodes
......@@ -274,9 +294,10 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
# print("XXX in pass %d, class %s, function %s" %
# (self.pass_num, type_name, node.fullname() or node.name()))
done.add(node)
with self.errors.enter_type(type_name) if type_name else nothing():
with self.tscope.class_scope(active_typeinfo) if active_typeinfo else nothing():
with self.scope.push_class(active_typeinfo) if active_typeinfo else nothing():
self.check_partial(node)
self.tscope.leave()
return True
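The `tscope` calls threaded through the passes above follow a simple bracket protocol: `enter_file()`/`leave()` bracket a file, and `class_scope()` is a context manager for nested class targets. A toy model (not mypy's actual `Scope`) makes the pairing visible:

```python
from contextlib import contextmanager
from typing import Iterator, List, Tuple

class Scope:
    """Toy scope tracker: pushes and pops (kind, name) frames."""
    def __init__(self) -> None:
        self.stack: List[Tuple[str, str]] = []

    def enter_file(self, name: str) -> None:
        self.stack.append(('file', name))

    @contextmanager
    def class_scope(self, name: str) -> Iterator[None]:
        self.stack.append(('class', name))
        yield
        self.stack.pop()

    def leave(self) -> None:
        self.stack.pop()

scope = Scope()
scope.enter_file('m')
with scope.class_scope('C'):
    assert scope.stack == [('file', 'm'), ('class', 'C')]
scope.leave()
print(scope.stack)
```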
def check_partial(self, node: Union[FuncDef,
......@@ -359,6 +380,10 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
def visit_overloaded_func_def(self, defn: OverloadedFuncDef) -> None:
if not self.recurse_into_functions:
return
with self.tscope.function_scope(defn):
self._visit_overloaded_func_def(defn)
def _visit_overloaded_func_def(self, defn: OverloadedFuncDef) -> None:
num_abstract = 0
if not defn.items:
# In this case we have already complained about none of these being
......@@ -576,9 +601,13 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
return AnyType(TypeOfAny.special_form)
def visit_func_def(self, defn: FuncDef) -> None:
"""Type check a function definition."""
if not self.recurse_into_functions:
return
with self.tscope.function_scope(defn):
self._visit_func_def(defn)
def _visit_func_def(self, defn: FuncDef) -> None:
"""Type check a function definition."""
self.check_func_item(defn, name=defn.name())
if defn.info:
if not defn.is_dynamic():
......@@ -629,15 +658,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
If type_override is provided, use it as the function type.
"""
# We may be checking a function definition or an anonymous function. In
# the first case, set up another reference with the precise type.
fdef = None # type: Optional[FuncDef]
if isinstance(defn, FuncDef):
fdef = defn
self.dynamic_funcs.append(defn.is_dynamic() and not type_override)
with self.errors.enter_function(fdef.name()) if fdef else nothing():
with self.enter_partial_types(is_function=True):
typ = self.function_type(defn)
if type_override:
......@@ -861,12 +883,12 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
body = block.body
# Skip a docstring
if (isinstance(body[0], ExpressionStmt) and
if (body and isinstance(body[0], ExpressionStmt) and
isinstance(body[0].expr, (StrExpr, UnicodeExpr))):
body = block.body[1:]
if len(body) == 0:
# There's only a docstring.
# There's only a docstring (or no body at all).
return True
elif len(body) > 1:
return False
......@@ -875,9 +897,11 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
(isinstance(stmt, ExpressionStmt) and
isinstance(stmt.expr, EllipsisExpr)))
def check_reverse_op_method(self, defn: FuncItem, typ: CallableType,
method: str, context: Context) -> None:
def check_reverse_op_method(self, defn: FuncItem,
reverse_type: CallableType, reverse_name: str,
context: Context) -> None:
"""Check a reverse operator method such as __radd__."""
# Decides whether it's worth calling check_overlapping_op_methods().
# This used to check for some very obscure scenario. It now
# just decides whether it's worth calling
......@@ -890,46 +914,49 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
[None, None],
AnyType(TypeOfAny.special_form),
self.named_type('builtins.function'))
if not is_subtype(typ, method_type):
self.msg.invalid_signature(typ, context)
if not is_subtype(reverse_type, method_type):
self.msg.invalid_signature(reverse_type, context)
return
if method in ('__eq__', '__ne__'):
if reverse_name in ('__eq__', '__ne__'):
# These are defined for all objects => can't cause trouble.
return
# With 'Any' or 'object' return type we are happy, since any possible
# return value is valid.
ret_type = typ.ret_type
ret_type = reverse_type.ret_type
if isinstance(ret_type, AnyType):
return
if isinstance(ret_type, Instance):
if ret_type.type.fullname() == 'builtins.object':
return
if len(typ.arg_types) == 2:
# TODO check self argument kind
# Check for the issue described above.
arg_type = typ.arg_types[1]
other_method = nodes.normal_from_reverse_op[method]
if isinstance(arg_type, Instance):
if not arg_type.type.has_readable_member(other_method):
return
elif isinstance(arg_type, AnyType):
if reverse_type.arg_kinds[0] == ARG_STAR:
reverse_type = reverse_type.copy_modified(arg_types=[reverse_type.arg_types[0]] * 2,
arg_kinds=[ARG_POS] * 2,
arg_names=[reverse_type.arg_names[0], "_"])
assert len(reverse_type.arg_types) == 2
forward_name = nodes.normal_from_reverse_op[reverse_name]
forward_inst = reverse_type.arg_types[1]
if isinstance(forward_inst, TypeVarType):
forward_inst = forward_inst.upper_bound
if isinstance(forward_inst, (FunctionLike, TupleType, TypedDictType)):
forward_inst = forward_inst.fallback
if isinstance(forward_inst, TypeType):
item = forward_inst.item
if isinstance(item, Instance):
opt_meta = item.type.metaclass_type
if opt_meta is not None:
forward_inst = opt_meta
if not (isinstance(forward_inst, (Instance, UnionType))
and forward_inst.has_readable_member(forward_name)):
return
elif isinstance(arg_type, UnionType):
if not arg_type.has_readable_member(other_method):
return
else:
return
typ2 = self.expr_checker.analyze_external_member_access(
other_method, arg_type, defn)
self.check_overlapping_op_methods(
typ, method, defn.info,
typ2, other_method, cast(Instance, arg_type),
defn)
forward_base = reverse_type.arg_types[1]
forward_type = self.expr_checker.analyze_external_member_access(forward_name, forward_base,
context=defn)
self.check_overlapping_op_methods(reverse_type, reverse_name, defn.info,
forward_type, forward_name, forward_base,
context=defn)
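The reverse-operator checking rewritten above exists because of Python's runtime dispatch rule: when the left operand's `__add__` returns `NotImplemented`, the right operand's `__radd__` runs instead. A small runtime illustration with a hypothetical `Meters` class:

```python
class Meters:
    def __init__(self, n: float) -> None:
        self.n = n

    def __radd__(self, other: float) -> "Meters":
        # Reached because float.__add__ returns NotImplemented for Meters.
        return Meters(other + self.n)

total = 2.0 + Meters(1.0)
print(total.n)
```

Because both the forward and reverse methods can claim the same operand pair, mypy checks that their signatures do not overlap unsafely.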
def check_overlapping_op_methods(self,
reverse_type: CallableType,
......@@ -937,7 +964,7 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
reverse_class: TypeInfo,
forward_type: Type,
forward_name: str,
forward_base: Instance,
forward_base: Type,
context: Context) -> None:
"""Check for overlapping method and reverse method signatures.
......@@ -998,8 +1025,8 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
if is_unsafe_overlapping_signatures(forward_tweaked,
reverse_tweaked):
self.msg.operator_method_signatures_overlap(
reverse_class.name(), reverse_name,
forward_base.type.name(), forward_name, context)
reverse_class, reverse_name,
forward_base, forward_name, context)
elif isinstance(forward_item, Overloaded):
for item in forward_item.items():
self.check_overlapping_op_methods(
......@@ -1250,7 +1277,7 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
typ = defn.info
if typ.is_protocol and typ.defn.type_vars:
self.check_protocol_variance(defn)
with self.errors.enter_type(defn.name), self.enter_partial_types(is_class=True):
with self.tscope.class_scope(defn.info), self.enter_partial_types(is_class=True):
old_binder = self.binder
self.binder = ConditionalTypeBinder()
with self.binder.top_frame_context():
......@@ -2567,6 +2594,7 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
return
if self.recurse_into_functions:
with self.tscope.function_scope(e.func):
self.check_func_item(e.func, name=e.func.name())
# Process decorators from the inside out to determine decorated signature, which
......@@ -3677,7 +3705,7 @@ def is_node_static(node: Optional[Node]) -> Optional[bool]:
return None
class Scope:
class CheckerScope:
# We keep two stacks combined, to maintain the relative order
stack = None # type: List[Union[TypeInfo, FuncItem, MypyFile]]
......@@ -3705,7 +3733,7 @@ class Scope:
top = self.top_function()
assert top, "This method must be called from inside a function"
index = self.stack.index(top)
assert index, "Scope stack must always start with a module"
assert index, "CheckerScope stack must always start with a module"
enclosing = self.stack[index - 1]
if isinstance(enclosing, TypeInfo):
return enclosing
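The assertion above relies on an invariant of the scope stack: index 0 is always the module, so any function's enclosing scope lives at a strictly positive index. A toy model (not the real `CheckerScope`):

```python
from typing import List

class CheckerScope:
    """Toy model: the bottom of the stack is always the module, so an
    enclosing-scope lookup for any nested item has index > 0."""
    def __init__(self, module: str) -> None:
        self.stack: List[str] = [module]

    def enclosing(self, item: str) -> str:
        index = self.stack.index(item)
        assert index, "CheckerScope stack must always start with a module"
        return self.stack[index - 1]

scope = CheckerScope('mod')
scope.stack += ['ClassA', 'method']
print(scope.enclosing('method'))
```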
......
......@@ -17,6 +17,7 @@ import time
from typing import Any, Callable, Dict, List, Mapping, Optional, Sequence, Tuple, TypeVar
from mypy.dmypy_util import STATUS_FILE, receive
from mypy.util import write_junit_xml
# Argument parser. Subparsers are tied to action functions by the
# @action(subparse) decorator.
......@@ -29,6 +30,8 @@ subparsers = parser.add_subparsers()
start_parser = p = subparsers.add_parser('start', help="Start daemon")
p.add_argument('--log-file', metavar='FILE', type=str,
help="Direct daemon stdout/stderr to FILE")
p.add_argument('--timeout', metavar='TIMEOUT', type=int,
help="Server shutdown timeout (in seconds)")
p.add_argument('flags', metavar='FLAG', nargs='*', type=str,
help="Regular mypy flags (precede with --)")
......@@ -36,6 +39,8 @@ restart_parser = p = subparsers.add_parser('restart',
help="Restart daemon (stop or kill followed by start)")
p.add_argument('--log-file', metavar='FILE', type=str,
help="Direct daemon stdout/stderr to FILE")
p.add_argument('--timeout', metavar='TIMEOUT', type=int,
help="Server shutdown timeout (in seconds)")
p.add_argument('flags', metavar='FLAG', nargs='*', type=str,
help="Regular mypy flags (precede with --)")
......@@ -50,16 +55,20 @@ check_parser = p = subparsers.add_parser('check',
help="Check some files (requires running daemon)")
p.add_argument('-v', '--verbose', action='store_true', help="Print detailed status")
p.add_argument('-q', '--quiet', action='store_true', help=argparse.SUPPRESS) # Deprecated
p.add_argument('--junit-xml', help="write junit.xml to the given file")
p.add_argument('files', metavar='FILE', nargs='+', help="File (or directory) to check")
recheck_parser = p = subparsers.add_parser('recheck',
help="Check the same files as the most previous check run (requires running daemon)")
p.add_argument('-v', '--verbose', action='store_true', help="Print detailed status")
p.add_argument('-q', '--quiet', action='store_true', help=argparse.SUPPRESS) # Deprecated
p.add_argument('--junit-xml', help="write junit.xml to the given file")
hang_parser = p = subparsers.add_parser('hang', help="Hang for 100 seconds")
daemon_parser = p = subparsers.add_parser('daemon', help="Run daemon in foreground")
p.add_argument('--timeout', metavar='TIMEOUT', type=int,
help="Server shutdown timeout (in seconds)")
p.add_argument('flags', metavar='FLAG', nargs='*', type=str,
help="Regular mypy flags (precede with --)")
......@@ -144,8 +153,9 @@ def do_restart(args: argparse.Namespace) -> None:
def start_server(args: argparse.Namespace) -> None:
"""Start the server from command arguments and wait for it."""
# Lazy import so this import doesn't slow down other commands.
from mypy.dmypy_server import daemonize, Server
if daemonize(Server(args.flags).serve, args.log_file) != 0:
from mypy.dmypy_server import daemonize, Server, process_start_options
if daemonize(Server(process_start_options(args.flags), timeout=args.timeout).serve,
args.log_file) != 0:
sys.exit(1)
wait_for_server()
......@@ -221,7 +231,7 @@ def do_check(args: argparse.Namespace) -> None:
response = request('check', files=args.files)
t1 = time.time()
response['roundtrip_time'] = t1 - t0
check_output(response, args.verbose)
check_output(response, args.verbose, args.junit_xml)
@action(recheck_parser)
......@@ -234,10 +244,10 @@ def do_recheck(args: argparse.Namespace) -> None:
response = request('recheck')
t1 = time.time()
response['roundtrip_time'] = t1 - t0
check_output(response, args.verbose)
check_output(response, args.verbose, args.junit_xml)
def check_output(response: Dict[str, Any], verbose: bool) -> None:
def check_output(response: Dict[str, Any], verbose: bool, junit_xml: Optional[str]) -> None:
"""Print the output from a check or recheck command.
Call sys.exit() unless the status code is zero.
......@@ -252,6 +262,9 @@ def check_output(response: Dict[str, Any], verbose: bool) -> None:
sys.stderr.write(err)
if verbose:
show_stats(response)
if junit_xml:
messages = (out + err).splitlines()
write_junit_xml(response['roundtrip_time'], bool(err), messages, junit_xml)
if status_code:
sys.exit(status_code)
......@@ -277,8 +290,8 @@ def do_hang(args: argparse.Namespace) -> None:
def do_daemon(args: argparse.Namespace) -> None:
"""Serve requests in the foreground."""
# Lazy import so this import doesn't slow down other commands.
from mypy.dmypy_server import Server
Server(args.flags).serve()
from mypy.dmypy_server import Server, process_start_options
Server(process_start_options(args.flags), timeout=args.timeout).serve()
@action(help_parser)
......
......@@ -3,7 +3,7 @@
Highly experimental! Only supports UNIX-like systems.
This manages a daemon process which keeps useful state in memory
rather than having to read it back from disk on each run.
to enable fine-grained incremental reprocessing of changes.
"""
import gc
......@@ -20,11 +20,12 @@ from typing import Any, Callable, Dict, List, Mapping, Optional, Sequence, Tuple
import mypy.build
import mypy.errors
import mypy.main
import mypy.server.update
from mypy.server.update import FineGrainedBuildManager
from mypy.dmypy_util import STATUS_FILE, receive
from mypy.gclogger import GcLogger
from mypy.fscache import FileSystemCache
from mypy.fswatcher import FileSystemWatcher, FileData
from mypy.options import Options
def daemonize(func: Callable[[], None], log_file: Optional[str] = None) -> int:
......@@ -78,34 +79,50 @@ def daemonize(func: Callable[[], None], log_file: Optional[str] = None) -> int:
SOCKET_NAME = 'dmypy.sock' # In current directory.
class Server:
# NOTE: the instance is constructed in the parent process but
# serve() is called in the grandchild (by daemonize()).
def __init__(self, flags: List[str]) -> None:
"""Initialize the server with the desired mypy flags."""
self.saved_cache = {} # type: mypy.build.SavedCache
self.fine_grained_initialized = False
def process_start_options(flags: List[str]) -> Options:
import mypy.main
sources, options = mypy.main.process_options(['-i'] + flags,
require_targets=False,
server_options=True)
self.fine_grained = options.fine_grained_incremental
if sources:
sys.exit("dmypy: start/restart does not accept sources")
if options.report_dirs:
sys.exit("dmypy: start/restart cannot generate reports")
if options.junit_xml:
sys.exit("dmypy: start/restart does not support --junit-xml; "
"pass it to check/recheck instead")
if not options.incremental:
sys.exit("dmypy: start/restart should not disable incremental mode")
if options.quick_and_dirty:
sys.exit("dmypy: start/restart should not specify quick_and_dirty mode")
if options.use_fine_grained_cache and not options.fine_grained_incremental:
sys.exit("dmypy: fine-grained cache can only be used in experimental mode")
# Our file change tracking can't yet handle changes to files that aren't
# specified in the sources list.
if options.follow_imports not in ('skip', 'error'):
sys.exit("dmypy: follow-imports must be 'skip' or 'error'")
return options
class Server:
# NOTE: the instance is constructed in the parent process but
# serve() is called in the grandchild (by daemonize()).
def __init__(self, options: Options,
timeout: Optional[int] = None,
alt_lib_path: Optional[str] = None) -> None:
"""Initialize the server with the desired mypy flags."""
self.options = options
self.timeout = timeout
self.alt_lib_path = alt_lib_path
self.fine_grained_manager = None # type: Optional[FineGrainedBuildManager]
if os.path.isfile(STATUS_FILE):
os.unlink(STATUS_FILE)
if self.fine_grained:
options.incremental = True
options.fine_grained_incremental = True
options.show_traceback = True
if options.use_fine_grained_cache:
options.cache_fine_grained = True # set this so that cache options match
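`process_start_options` above rejects flag combinations that make no sense for a long-lived daemon by calling `sys.exit` with a message. A stripped-down, hypothetical version of that policy check (using a plain dict rather than mypy's `Options`):

```python
import sys

def validate_start_options(opts: dict) -> dict:
    """Hypothetical mini-validator in the spirit of process_start_options:
    the daemon must stay incremental and cannot emit reports."""
    if not opts.get('incremental', True):
        sys.exit("dmypy: start/restart should not disable incremental mode")
    if opts.get('report_dirs'):
        sys.exit("dmypy: start/restart cannot generate reports")
    return opts

validate_start_options({'incremental': True})          # accepted
try:
    validate_start_options({'incremental': False})
except SystemExit as exc:
    print(exc)
```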
......@@ -119,13 +136,23 @@ class Server:
"""Serve requests, synchronously (no thread or fork)."""
try:
sock = self.create_listening_socket()
if self.timeout is not None:
sock.settimeout(self.timeout)
try:
with open(STATUS_FILE, 'w') as f:
json.dump({'pid': os.getpid(), 'sockname': sock.getsockname()}, f)
f.write('\n') # I like my JSON with trailing newline
while True:
try:
conn, addr = sock.accept()
except socket.timeout:
print("Exiting due to inactivity.")
sys.exit(0)
try:
data = receive(conn)
except OSError as err:
conn.close() # Maybe the client hung up
continue
resp = {} # type: Dict[str, Any]
if 'command' not in data:
resp = {'error': "No command found in request"}
......@@ -149,7 +176,7 @@ class Server:
finally:
os.unlink(self.sockname)
exc_info = sys.exc_info()
if exc_info[0]:
if exc_info[0] and exc_info[0] is not SystemExit:
traceback.print_exception(*exc_info) # type: ignore
def create_listening_socket(self) -> socket.socket:
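The `--timeout` plumbing above boils down to one socket idiom: `settimeout()` makes `accept()` raise `socket.timeout`, which the serve loop converts into a clean exit. A self-contained sketch (UNIX-like systems only, like dmypy itself):

```python
import os
import socket
import tempfile

def serve_until_idle(sockname: str, timeout: float) -> str:
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.bind(sockname)
    sock.listen(1)
    sock.settimeout(timeout)   # None (the default) would mean "wait forever"
    try:
        conn, _ = sock.accept()
        conn.close()
        return "client connected"
    except socket.timeout:
        return "Exiting due to inactivity."
    finally:
        sock.close()
        os.unlink(sockname)

path = os.path.join(tempfile.mkdtemp(), 'demo.sock')
print(serve_until_idle(path, timeout=0.1))
```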
......@@ -208,59 +235,25 @@ class Server:
return {'error': "Command 'recheck' is only valid after a 'check' command"}
return self.check(self.last_sources)
# Needed by tests.
last_manager = None # type: Optional[mypy.build.BuildManager]
def check(self, sources: List[mypy.build.BuildSource],
alt_lib_path: Optional[str] = None) -> Dict[str, Any]:
if self.fine_grained:
return self.check_fine_grained(sources)
else:
return self.check_default(sources, alt_lib_path)
def check_default(self, sources: List[mypy.build.BuildSource],
alt_lib_path: Optional[str] = None) -> Dict[str, Any]:
"""Check using the default (per-file) incremental mode."""
self.last_manager = None
with GcLogger() as gc_result:
try:
# saved_cache is mutated in place.
res = mypy.build.build(sources, self.options,
saved_cache=self.saved_cache,
alt_lib_path=alt_lib_path)
msgs = res.errors
self.last_manager = res.manager # type: Optional[mypy.build.BuildManager]
except mypy.errors.CompileError as err:
msgs = err.messages
if msgs:
msgs.append("")
response = {'out': "\n".join(msgs), 'err': "", 'status': 1}
else:
response = {'out': "", 'err': "", 'status': 0}
response.update(gc_result.get_stats())
response.update(get_meminfo())
if self.last_manager is not None:
response.update(self.last_manager.stats_summary())
return response
def check_fine_grained(self, sources: List[mypy.build.BuildSource]) -> Dict[str, Any]:
def check(self, sources: List[mypy.build.BuildSource]) -> Dict[str, Any]:
"""Check using fine-grained incremental mode."""
if not self.fine_grained_initialized:
if not self.fine_grained_manager:
return self.initialize_fine_grained(sources)
else:
return self.fine_grained_increment(sources)
def initialize_fine_grained(self, sources: List[mypy.build.BuildSource]) -> Dict[str, Any]:
self.fscache = FileSystemCache(self.options.python_version)
self.fswatcher = FileSystemWatcher(self.fscache)
# The file system cache we create gets passed off to
# BuildManager, and thence to FineGrainedBuildManager, which
# assumes responsibility for clearing it after updates.
fscache = FileSystemCache(self.options.python_version)
self.fswatcher = FileSystemWatcher(fscache)
self.update_sources(sources)
if not self.options.use_fine_grained_cache:
# Stores the initial state of sources as a side effect.
self.fswatcher.find_changed()
try:
# TODO: alt_lib_path
result = mypy.build.build(sources=sources,
options=self.options)
options=self.options,
fscache=fscache,
alt_lib_path=self.alt_lib_path)
except mypy.errors.CompileError as e:
output = ''.join(s + '\n' for s in e.messages)
if e.use_stdout:
@@ -269,73 +262,75 @@ class Server:
out, err = '', output
return {'out': out, 'err': err, 'status': 2}
messages = result.errors
manager = result.manager
graph = result.graph
self.fine_grained_manager = mypy.server.update.FineGrainedBuildManager(manager, graph)
self.fine_grained_initialized = True
self.fine_grained_manager = FineGrainedBuildManager(result)
self.previous_sources = sources
self.fscache.flush()
# If we are using the fine-grained cache, build hasn't actually done
# the typechecking on the updated files yet.
# Run a fine-grained update starting from the cached data
if self.options.use_fine_grained_cache:
if result.used_cache:
# Pull times and hashes out of the saved_cache and stick them into
# the fswatcher, so we pick up the changes.
for meta, mypyfile, type_map in manager.saved_cache.values():
if meta.mtime is None: continue
for state in self.fine_grained_manager.graph.values():
meta = state.meta
if meta is None: continue
assert state.path is not None
self.fswatcher.set_file_data(
mypyfile.path,
state.path,
FileData(st_mtime=float(meta.mtime), st_size=meta.size, md5=meta.hash))
changed, removed = self.find_changed(sources)
# Find anything that has had its dependency list change
for state in self.fine_grained_manager.graph.values():
if not state.is_fresh():
assert state.path is not None
changed.append((state.id, state.path))
# Run an update
changed = self.find_changed(sources)
if changed:
messages = self.fine_grained_manager.update(changed)
self.fscache.flush()
messages = self.fine_grained_manager.update(changed, removed)
else:
# Stores the initial state of sources as a side effect.
self.fswatcher.find_changed()
fscache.flush()
status = 1 if messages else 0
self.previous_messages = messages[:]
return {'out': ''.join(s + '\n' for s in messages), 'err': '', 'status': status}
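
The fswatcher warm-up above boils down to recording (mtime, size, md5) snapshots and later comparing them to detect edits. A self-contained sketch of that idea, assuming illustrative helper names (snapshot and find_changed here are not mypy's real API):

```python
import hashlib
import os
from typing import Dict, NamedTuple, Optional, Set

FileData = NamedTuple('FileData', [('st_mtime', float),
                                   ('st_size', int),
                                   ('md5', str)])

def snapshot(path: str) -> FileData:
    # Record everything the watcher compares: mtime, size and content hash.
    st = os.stat(path)
    with open(path, 'rb') as f:
        digest = hashlib.md5(f.read()).hexdigest()
    return FileData(st.st_mtime, st.st_size, digest)

def find_changed(old: Dict[str, FileData], paths: Set[str]) -> Set[str]:
    # A path counts as changed when its current snapshot differs from the
    # recorded one (or when no snapshot was recorded at all).
    return {p for p in paths if snapshot(p) != old.get(p)}
```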
def fine_grained_increment(self, sources: List[mypy.build.BuildSource]) -> Dict[str, Any]:
assert self.fine_grained_manager is not None
t0 = time.time()
self.update_sources(sources)
changed = self.find_changed(sources)
changed, removed = self.find_changed(sources)
t1 = time.time()
if not changed:
# Nothing changed -- just produce the same result as before.
messages = self.previous_messages
else:
messages = self.fine_grained_manager.update(changed)
messages = self.fine_grained_manager.update(changed, removed)
t2 = time.time()
self.fine_grained_manager.manager.log(
"fine-grained increment: find_changed: {:.3f}s, update: {:.3f}s".format(
t1 - t0, t2 - t1))
status = 1 if messages else 0
self.previous_messages = messages[:]
self.previous_sources = sources
self.fscache.flush()
return {'out': ''.join(s + '\n' for s in messages), 'err': '', 'status': status}
def update_sources(self, sources: List[mypy.build.BuildSource]) -> None:
paths = [source.path for source in sources if source.path is not None]
self.fswatcher.add_watched_paths(paths)
def find_changed(self, sources: List[mypy.build.BuildSource]) -> List[Tuple[str, str]]:
def find_changed(self, sources: List[mypy.build.BuildSource]) -> Tuple[List[Tuple[str, str]],
List[Tuple[str, str]]]:
changed_paths = self.fswatcher.find_changed()
changed = [(source.module, source.path)
for source in sources
if source.path in changed_paths]
modules = {source.module for source in sources}
omitted = [source for source in self.previous_sources if source.module not in modules]
removed = []
for source in omitted:
path = source.path
assert path
# Note that a file could be removed from the list of root sources but have no changes.
if path in changed_paths:
changed.append((source.module, path))
return changed
removed.append((source.module, path))
return changed, removed
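
The new return value of find_changed distinguishes modules whose files were touched from modules that dropped out of the root source list. Stripped of the fswatcher machinery, the partition looks roughly like this (simplified dict-based signature, not the real one):

```python
from typing import Dict, List, Set, Tuple

Module = Tuple[str, str]  # (module id, path)

def split_changed(previous: Dict[str, str],
                  current: Dict[str, str],
                  changed_paths: Set[str]) -> Tuple[List[Module], List[Module]]:
    # "changed": still a root source and its file was modified.
    changed = [(mod, path) for mod, path in sorted(current.items())
               if path in changed_paths]
    # "removed": was a root source before but is no longer listed, even
    # if the file on disk has not changed.
    removed = [(mod, path) for mod, path in sorted(previous.items())
               if mod not in current]
    return changed, removed
```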
def cmd_hang(self) -> Dict[str, object]:
"""Hang for 100 seconds, as a debug hack."""
......
@@ -6,6 +6,7 @@ from contextlib import contextmanager
from typing import Tuple, List, TypeVar, Set, Dict, Iterator, Optional, cast
from mypy.scope import Scope
from mypy.options import Options
from mypy.version import __version__ as mypy_version
@@ -21,7 +22,7 @@ class ErrorInfo:
# related to this error. Each item is a (path, line number) tuple.
import_ctx = None # type: List[Tuple[str, int]]
# The source file that was the source of this error.
# The path to the source file that was the source of this error.
file = ''
# The fully-qualified id of the source module for this error.
@@ -51,7 +52,7 @@ class ErrorInfo:
# Only report this particular message once per program.
only_once = False
# Actual origin of the error message
# Actual origin of the error message as tuple (path, line number)
origin = None # type: Tuple[str, int]
# Fine-grained incremental target where this was reported
@@ -110,12 +111,6 @@ class Errors:
# Path to current file.
file = None # type: str
# Stack of short names of current types (or None).
type_name = None # type: List[Optional[str]]
# Stack of short names of current functions or members (or None).
function_or_member = None # type: List[Optional[str]]
# Ignore errors on these lines of each file.
ignored_lines = None # type: Dict[str, Set[int]]
@@ -136,18 +131,9 @@ class Errors:
# State for keeping track of the current fine-grained incremental mode target.
# (See mypy.server.update for more about targets.)
#
# Current module id.
target_module = None # type: Optional[str]
# Partially qualified name of target class; without module prefix (examples: 'C' is top-level,
# 'C.D' nested).
target_class = None # type: Optional[str]
# Short name of the outermost function/method.
target_function = None # type: Optional[str]
# Nesting depth of functions/classes within the outermost function/method. These aren't
# separate targets and they are included in the surrounding function, but we this counter
# for internal bookkeeping.
target_ignore_depth = 0
scope = None # type: Optional[Scope]
def __init__(self, show_error_context: bool = False,
show_column_numbers: bool = False) -> None:
@@ -165,10 +151,8 @@ class Errors:
self.used_ignored_lines = defaultdict(set)
self.ignored_files = set()
self.only_once_messages = set()
self.scope = None
self.target_module = None
self.target_class = None
self.target_function = None
self.target_ignore_depth = 0
def reset(self) -> None:
self.initialize()
@@ -180,9 +164,7 @@ class Errors:
new.type_name = self.type_name[:]
new.function_or_member = self.function_or_member[:]
new.target_module = self.target_module
new.target_class = self.target_class
new.target_function = self.target_function
new.target_ignore_depth = self.target_ignore_depth
new.scope = self.scope
return new
def set_ignore_prefix(self, prefix: str) -> None:
@@ -198,7 +180,8 @@ class Errors:
return remove_path_prefix(file, self.ignore_prefix)
def set_file(self, file: str,
module: Optional[str]) -> None:
module: Optional[str],
scope: Optional[Scope] = None) -> None:
"""Set the path and module id of the current file."""
# The path will be simplified later, in render_messages. That way
# * 'file' is always a key that uniquely identifies a source file
@@ -208,9 +191,7 @@ class Errors:
# processed.
self.file = file
self.target_module = module
self.target_class = None
self.target_function = None
self.target_ignore_depth = 0
self.scope = scope
def set_file_ignored_lines(self, file: str,
ignored_lines: Set[int],
@@ -219,71 +200,16 @@ class Errors:
if ignore_all:
self.ignored_files.add(file)
def push_function(self, name: str) -> None:
"""Set the current function or member short name (it can be None)."""
if self.target_function is None:
self.target_function = name
else:
self.target_ignore_depth += 1
self.function_or_member.append(name)
def pop_function(self) -> None:
self.function_or_member.pop()
if self.target_ignore_depth > 0:
self.target_ignore_depth -= 1
else:
self.target_function = None
@contextmanager
def enter_function(self, name: str) -> Iterator[None]:
self.push_function(name)
yield
self.pop_function()
def push_type(self, name: str) -> None:
"""Set the short name of the current type (it can be None)."""
if self.target_function is not None:
self.target_ignore_depth += 1
elif self.target_class is None:
self.target_class = name
else:
self.target_class += '.' + name
self.type_name.append(name)
def pop_type(self) -> None:
self.type_name.pop()
if self.target_ignore_depth > 0:
self.target_ignore_depth -= 1
else:
assert self.target_class is not None
if '.' in self.target_class:
self.target_class = '.'.join(self.target_class.split('.')[:-1])
else:
self.target_class = None
def current_target(self) -> Optional[str]:
if self.target_module is None:
return None
target = self.target_module
if self.target_function is not None:
# Only include class name if we are inside a method, since a class
# target also includes all methods, which is not what we want
# here. Instead, the current target for a class body is the
# enclosing module top level.
if self.target_class is not None:
target += '.' + self.target_class
target += '.' + self.target_function
return target
"""Retrieves the current target from the associated scope.
def current_module(self) -> Optional[str]:
If there is no associated scope, use the target module."""
if self.scope is not None:
return self.scope.current_target()
return self.target_module
@contextmanager
def enter_type(self, name: str) -> Iterator[None]:
"""Set the short name of the current type (it can be None)."""
self.push_type(name)
yield
self.pop_type()
def current_module(self) -> Optional[str]:
return self.target_module
def import_context(self) -> List[Tuple[str, int]]:
"""Return a copy of the import context."""
@@ -314,15 +240,21 @@ class Errors:
only_once: if True, only report this exact message once per build
origin_line: if non-None, override current context as origin
"""
type = self.type_name[-1] # type: Optional[str]
if len(self.function_or_member) > 2:
if self.scope:
type = self.scope.current_type_name()
if self.scope.ignored > 0:
type = None # Omit type context if nested function
function = self.scope.current_function_name()
else:
type = None
function = None
if file is None:
file = self.file
if offset:
message = " " * offset + message
info = ErrorInfo(self.import_context(), file, self.current_module(), type,
self.function_or_member[-1], line, column, severity, message,
function, line, column, severity, message,
blocker, only_once,
origin=(self.file, origin_line) if origin_line else None,
target=self.current_target())
@@ -349,6 +281,17 @@ class Errors:
self.only_once_messages.add(info.message)
self._add_error_info(file, info)
def clear_errors_in_targets(self, path: str, targets: Set[str]) -> None:
"""Remove errors in specific fine-grained targets within a file."""
if path in self.error_info_map:
new_errors = []
for info in self.error_info_map[path]:
if info.target not in targets:
new_errors.append(info)
elif info.only_once:
self.only_once_messages.remove(info.message)
self.error_info_map[path] = new_errors
def generate_unused_ignore_notes(self, file: str) -> None:
ignored_lines = self.ignored_lines[file]
if not self.is_typeshed_file(file):
@@ -605,13 +548,6 @@ class CompileError(Exception):
self.module_with_blocker = module_with_blocker
class DecodeError(Exception):
"""Exception raised when a file cannot be decoded due to an unknown encoding type.
Essentially a wrapper for the LookupError raised by `bytearray.decode`
"""
def remove_path_prefix(path: str, prefix: str) -> str:
"""If path starts with prefix, return copy of path with the prefix removed.
Otherwise, return path. If path is None, return None.
......
@@ -22,8 +22,7 @@ def fixup_module_pass_one(tree: MypyFile, modules: Dict[str, MypyFile],
node_fixer.visit_symbol_table(tree.names)
def fixup_module_pass_two(tree: MypyFile, modules: Dict[str, MypyFile],
quick_and_dirty: bool) -> None:
def fixup_module_pass_two(tree: MypyFile, modules: Dict[str, MypyFile]) -> None:
compute_all_mros(tree.names, modules)
......
@@ -30,45 +30,21 @@ advantage of the benefits.
import os
import stat
from typing import Tuple, Dict, List
from typing import Tuple, Dict, List, Optional
from mypy.util import read_with_python_encoding
from mypy.build import read_with_python_encoding
from mypy.errors import DecodeError
class FileSystemCache:
def __init__(self, pyversion: Tuple[int, int]) -> None:
self.pyversion = pyversion
class FileSystemMetaCache:
def __init__(self) -> None:
self.flush()
def flush(self) -> None:
"""Start another transaction and empty all caches."""
self.stat_cache = {} # type: Dict[str, os.stat_result]
self.stat_error_cache = {} # type: Dict[str, Exception]
self.read_cache = {} # type: Dict[str, str]
self.read_error_cache = {} # type: Dict[str, Exception]
self.hash_cache = {} # type: Dict[str, str]
self.listdir_cache = {} # type: Dict[str, List[str]]
self.listdir_error_cache = {} # type: Dict[str, Exception]
def read_with_python_encoding(self, path: str) -> str:
if path in self.read_cache:
return self.read_cache[path]
if path in self.read_error_cache:
raise self.read_error_cache[path]
# Need to stat first so that the contents of the file are from no
# earlier instant than the mtime reported by self.stat().
self.stat(path)
try:
data, md5hash = read_with_python_encoding(path, self.pyversion)
except Exception as err:
self.read_error_cache[path] = err
raise
self.read_cache[path] = data
self.hash_cache[path] = md5hash
return data
self.isfile_case_cache = {} # type: Dict[str, bool]
def stat(self, path: str) -> os.stat_result:
if path in self.stat_cache:
@@ -97,11 +73,40 @@ class FileSystemCache:
return results
def isfile(self, path: str) -> bool:
try:
st = self.stat(path)
except OSError:
return False
return stat.S_ISREG(st.st_mode)
def isfile_case(self, path: str) -> bool:
"""Return whether path exists and is a file.
On case-insensitive filesystems (like Mac or Windows) this returns
False if the case of the path's last component does not exactly
match the case found in the filesystem.
TODO: We should maybe check the case for some directory components also,
to avoid permitting wrongly-cased *packages*.
"""
if path in self.isfile_case_cache:
return self.isfile_case_cache[path]
head, tail = os.path.split(path)
if not tail:
res = False
else:
try:
names = self.listdir(head)
res = tail in names and self.isfile(path)
except OSError:
res = False
self.isfile_case_cache[path] = res
return res
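
Without the cache, the case-exact check reduces to listing the parent directory and requiring the final path component to appear there verbatim. An uncached sketch of the same logic:

```python
import os

def isfile_case(path: str) -> bool:
    # True only if path is a regular file AND its last component matches
    # the on-disk spelling exactly (which matters on case-insensitive
    # filesystems such as the defaults on macOS and Windows).
    head, tail = os.path.split(path)
    if not tail:
        return False
    try:
        return tail in os.listdir(head or '.') and os.path.isfile(path)
    except OSError:
        return False
```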
def isdir(self, path: str) -> bool:
try:
st = self.stat(path)
except OSError:
return False
return stat.S_ISDIR(st.st_mode)
def exists(self, path: str) -> bool:
@@ -111,6 +116,38 @@ class FileSystemCache:
return False
return True
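
FileSystemMetaCache memoizes not just successful stat() results but also the exceptions, so a repeated lookup within the same transaction fails the same way without touching the filesystem again. A condensed sketch of that error-caching pattern (StatCache is an illustrative stand-in, not the real class):

```python
import os
from typing import Dict

class StatCache:
    def __init__(self) -> None:
        self.flush()

    def flush(self) -> None:
        # Start a new "transaction": drop all memoized results and errors.
        self.stat_cache = {}  # type: Dict[str, os.stat_result]
        self.stat_error_cache = {}  # type: Dict[str, OSError]

    def stat(self, path: str) -> os.stat_result:
        if path in self.stat_cache:
            return self.stat_cache[path]
        if path in self.stat_error_cache:
            # Re-raise the cached error instead of hitting the disk again.
            raise self.stat_error_cache[path]
        try:
            st = os.stat(path)
        except OSError as err:
            self.stat_error_cache[path] = err
            raise
        self.stat_cache[path] = st
        return st
```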
class FileSystemCache(FileSystemMetaCache):
def __init__(self, pyversion: Tuple[int, int]) -> None:
self.pyversion = pyversion
self.flush()
def flush(self) -> None:
"""Start another transaction and empty all caches."""
super().flush()
self.read_cache = {} # type: Dict[str, str]
self.read_error_cache = {} # type: Dict[str, Exception]
self.hash_cache = {} # type: Dict[str, str]
def read_with_python_encoding(self, path: str) -> str:
if path in self.read_cache:
return self.read_cache[path]
if path in self.read_error_cache:
raise self.read_error_cache[path]
# Need to stat first so that the contents of the file are from no
# earlier instant than the mtime reported by self.stat().
self.stat(path)
try:
data, md5hash = read_with_python_encoding(path, self.pyversion)
except Exception as err:
self.read_error_cache[path] = err
raise
self.read_cache[path] = data
self.hash_cache[path] = md5hash
return data
def md5(self, path: str) -> str:
if path not in self.hash_cache:
self.read_with_python_encoding(path)
......
@@ -51,6 +51,14 @@ def main(script_path: Optional[str], args: Optional[List[str]] = None) -> None:
args: Custom command-line arguments. If not given, sys.argv[1:] will
be used.
"""
# Check for known bad Python versions.
if sys.version_info[:2] < (3, 4):
sys.exit("Running mypy with Python 3.3 or lower is not supported; "
"please upgrade to 3.4 or newer")
if sys.version_info[:3] == (3, 5, 0):
sys.exit("Running mypy with Python 3.5.0 is not supported; "
"please upgrade to 3.5.1 or newer")
t0 = time.time()
# To log stat() calls: os.stat = stat_proxy
if script_path:
@@ -396,6 +404,7 @@ def process_options(args: List[str],
dest='special-opts:no_fast_parser',
help=argparse.SUPPRESS)
if server_options:
# TODO: This flag is superfluous; remove after a short transition (2018-03-16)
parser.add_argument('--experimental', action='store_true', dest='fine_grained_incremental',
help="enable fine-grained incremental mode")
parser.add_argument('--use-fine-grained-cache', action='store_true',
@@ -411,16 +420,16 @@ def process_options(args: List[str],
code_group = parser.add_argument_group(title='How to specify the code to type check')
code_group.add_argument('-m', '--module', action='append', metavar='MODULE',
default=[],
dest='special-opts:modules',
help="type-check module; can repeat for more modules")
# TODO: `mypy -p A -p B` currently silently ignores A
# (last option wins). Perhaps -c, -m and -p could just be
# command-line flags that modify how we interpret self.files?
code_group.add_argument('-p', '--package', action='append', metavar='PACKAGE',
default=[],
dest='special-opts:packages',
help="type-check package recursively; can be repeated")
code_group.add_argument('-c', '--command', action='append', metavar='PROGRAM_TEXT',
dest='special-opts:command',
help="type-check program passed in as string")
code_group.add_argument('-p', '--package', metavar='PACKAGE', dest='special-opts:package',
help="type-check all files in a directory")
code_group.add_argument(metavar='files', nargs='*', dest='special-opts:files',
help="type-check given files or directories")
@@ -484,14 +493,13 @@ def process_options(args: List[str],
# Check for invalid argument combinations.
if require_targets:
code_methods = sum(bool(c) for c in [special_opts.modules,
code_methods = sum(bool(c) for c in [special_opts.modules + special_opts.packages,
special_opts.command,
special_opts.package,
special_opts.files])
if code_methods == 0:
parser.error("Missing target module, package, files, or command.")
elif code_methods > 1:
parser.error("May only specify one of: module, package, files, or command.")
parser.error("May only specify one of: module/package, files, or command.")
# Set build flags.
if options.strict_optional_whitelist is not None:
@@ -517,19 +525,21 @@ def process_options(args: List[str],
options.incremental = True
# Set target.
if special_opts.modules:
options.build_type = BuildType.MODULE
targets = [BuildSource(None, m, None) for m in special_opts.modules]
return targets, options
elif special_opts.package:
if os.sep in special_opts.package or os.altsep and os.altsep in special_opts.package:
fail("Package name '{}' cannot have a slash in it."
.format(special_opts.package))
if special_opts.modules + special_opts.packages:
options.build_type = BuildType.MODULE
lib_path = [os.getcwd()] + build.mypy_path()
targets = build.find_modules_recursive(special_opts.package, lib_path)
if not targets:
fail("Can't find package '{}'".format(special_opts.package))
targets = []
# TODO: use the same cache as the BuildManager will
cache = build.FindModuleCache()
for p in special_opts.packages:
if os.sep in p or os.altsep and os.altsep in p:
fail("Package name '{}' cannot have a slash in it.".format(p))
p_targets = cache.find_modules_recursive(p, lib_path)
if not p_targets:
fail("Can't find package '{}'".format(p))
targets.extend(p_targets)
for m in special_opts.modules:
targets.append(BuildSource(None, m, None))
return targets, options
elif special_opts.command:
options.build_type = BuildType.PROGRAM_TEXT
@@ -540,6 +550,7 @@ def process_options(args: List[str],
return targets, options
# TODO: use a FileSystemCache for this
def create_source_list(files: Sequence[str], options: Options) -> List[BuildSource]:
targets = []
for f in files:
@@ -766,6 +777,11 @@ def parse_section(prefix: str, template: Options,
continue
if key.startswith('x_'):
continue # Don't complain about `x_blah` flags
elif key == 'strict':
print("%s: Strict mode is not supported in configuration files: specify "
"individual flags instead (see 'mypy -h' for the list of flags enabled "
"in strict mode)" % prefix, file=sys.stderr)
else:
print("%s: Unrecognized option: %s = %s" % (prefix, key, section[orig_key]),
file=sys.stderr)
continue
......
@@ -932,12 +932,12 @@ class MessageBuilder:
'of signature {}'.format(index1), context)
def operator_method_signatures_overlap(
self, reverse_class: str, reverse_method: str, forward_class: str,
self, reverse_class: TypeInfo, reverse_method: str, forward_class: Type,
forward_method: str, context: Context) -> None:
self.fail('Signatures of "{}" of "{}" and "{}" of "{}" '
self.fail('Signatures of "{}" of "{}" and "{}" of {} '
'are unsafely overlapping'.format(
reverse_method, reverse_class,
forward_method, forward_class),
reverse_method, reverse_class.name(),
forward_method, self.format(forward_class)),
context)
def forward_operator_not_callable(
......
@@ -315,16 +315,50 @@ class ImportAll(ImportBase):
"""from m import *"""
id = None # type: str
relative = None # type: int
imported_names = None # type: List[str]
def __init__(self, id: str, relative: int) -> None:
super().__init__()
self.id = id
self.relative = relative
self.imported_names = []
def accept(self, visitor: StatementVisitor[T]) -> T:
return visitor.visit_import_all(self)
class ImportedName(SymbolNode):
"""Indirect reference to a fullname stored in symbol table.
This node is not present in the original program as such. This is
just a temporary artifact in binding imported names. After semantic
analysis pass 2, these references should be replaced with direct
references to real AST nodes.
Note that this is neither a Statement nor an Expression so this
can't be visited.
"""
def __init__(self, target_fullname: str) -> None:
self.target_fullname = target_fullname
def name(self) -> str:
return self.target_fullname.split('.')[-1]
def fullname(self) -> str:
return self.target_fullname
def serialize(self) -> JsonDict:
assert False, "ImportedName leaked from semantic analysis"
@classmethod
def deserialize(cls, data: JsonDict) -> 'ImportedName':
assert False, "ImportedName should never be serialized"
def __str__(self) -> str:
return 'ImportedName(%s)' % self.target_fullname
class FuncBase(Node):
"""Abstract base class for function-like nodes"""
@@ -424,7 +458,7 @@ class Argument(Node):
class FuncItem(FuncBase):
arguments = [] # type: List[Argument]
arguments = [] # type: List[Argument] # Note: Can be None if deserialized (type is a lie!)
arg_names = [] # type: List[str]
arg_kinds = [] # type: List[int]
# Minimum number of arguments
@@ -2080,6 +2114,11 @@ class TypeInfo(SymbolNode):
return (left, right) in self._cache
return (left, right) in self._cache_proper
def reset_subtype_cache(self) -> None:
for item in self.mro:
item._cache = set()
item._cache_proper = set()
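
reset_subtype_cache walks the whole MRO because every base class may hold memoized subtype answers that go stale when the MRO is recalculated. A toy model of that bookkeeping (class and method names are illustrative, not mypy's TypeInfo API):

```python
from typing import List, Set, Tuple

class Info:
    def __init__(self) -> None:
        self.mro = [self]  # type: List['Info']
        self._cache = set()  # type: Set[Tuple[str, str]]

    def record_subtype_cache_entry(self, left: str, right: str) -> None:
        self._cache.add((left, right))

    def is_cached_subtype_check(self, left: str, right: str) -> bool:
        return (left, right) in self._cache

    def reset_subtype_cache(self) -> None:
        # Clear the memoized results for this class and every base in its
        # MRO, since all of them may now be stale.
        for item in self.mro:
            item._cache = set()
```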
def __getitem__(self, name: str) -> 'SymbolTableNode':
n = self.get(name)
if n:
@@ -2090,15 +2129,9 @@ class TypeInfo(SymbolNode):
def __repr__(self) -> str:
return '<TypeInfo %s>' % self.fullname()
# IDEA: Refactor the has* methods to be more consistent and document
# them.
def has_readable_member(self, name: str) -> bool:
return self.get(name) is not None
def has_method(self, name: str) -> bool:
return self.get_method(name) is not None
def get_method(self, name: str) -> Optional[FuncBase]:
if self.mro is None: # Might be because of a previous error.
return None
@@ -2122,6 +2155,7 @@ class TypeInfo(SymbolNode):
self.is_enum = self._calculate_is_enum()
# The property of falling back to Any is inherited.
self.fallback_to_any = any(baseinfo.fallback_to_any for baseinfo in self.mro)
self.reset_subtype_cache()
def calculate_metaclass_type(self) -> 'Optional[mypy.types.Instance]':
declared = self.declared_metaclass
@@ -2206,11 +2240,18 @@ class TypeInfo(SymbolNode):
if isinstance(node, Var) and node.type:
description += ' ({})'.format(type_str(node.type))
names.append(description)
return mypy.strconv.dump_tagged(
['Name({})'.format(self.fullname()),
items = [
'Name({})'.format(self.fullname()),
base,
mro,
('Names', names)],
('Names', names),
]
if self.declared_metaclass:
items.append('DeclaredMetaclass({})'.format(type_str(self.declared_metaclass)))
if self.metaclass_type:
items.append('MetaclassType({})'.format(type_str(self.metaclass_type)))
return mypy.strconv.dump_tagged(
items,
head,
str_conv=str_conv)
......
@@ -52,7 +52,7 @@ class Options:
# -- build options --
self.build_type = BuildType.STANDARD
self.python_version = defaults.PYTHON3_VERSION
self.python_version = sys.version_info[:2] # type: Tuple[int, int]
self.platform = sys.platform
self.custom_typing_module = None # type: Optional[str]
self.custom_typeshed_dir = None # type: Optional[str]
......