Commits on Source (6)
Metadata-Version: 2.1
Name: mypy
Version: 0.750
Version: 0.761
Summary: Optional static typing for Python
Home-page: http://www.mypy-lang.org/
Author: Jukka Lehtosalo
......
......@@ -221,7 +221,7 @@ This submodule contains types for the Python standard library.
Due to the way git submodules work, you'll have to do
```
git submodule update typeshed
git submodule update mypy/typeshed
```
whenever you change branches, merge, rebase, or pull.
......
mypy (0.761-1) unstable; urgency=medium
* Demote riscv64 to -O0
* Demote hppa to -O2
* New upstream version
* Remove stubgen-fix.patch, it was applied upstream
-- Michael R. Crusoe <michael.crusoe@gmail.com> Wed, 01 Jan 2020 22:04:59 +0100
mypy (0.750-1) unstable; urgency=low
* Give up trying to build on the alpha architecture, switch to pure Python
......
......@@ -9,7 +9,7 @@ https://github.com/python/mypy/blob/master/mypyc/doc/dev-intro.md
So it doesn't sound ready for general release yet
--- mypy.orig/setup.py
+++ mypy/setup.py
@@ -182,7 +182,6 @@
@@ -179,7 +179,6 @@
'mypyc', 'mypyc.test',
],
package_data={'mypy': package_data},
......
ignore_mypyc
proper_plugin
verbose
stubgen-fix.patch
Origin: https://github.com/python/mypy/pull/8003
Applied-Upstream: commit:https://github.com/python/mypy/pull/8003/commits/7f0b1ac239512e90d9f4f7700b1ffccac78f203d
From: "Michael J. Sullivan" <sully@msully.net>
Date: Fri, 22 Nov 2019 15:12:24 -0800
Subject: [PATCH] Remove a semi-broken stubgen assert
C modules in the standard library don't seem to have a __file__ in
the Python distributions that ship with Ubuntu. I think maybe they
are built into the binary? I don't know.
---
mypy/test/teststubgen.py | 1 -
1 file changed, 1 deletion(-)
diff --git a/mypy/test/teststubgen.py b/mypy/test/teststubgen.py
index 6a1a287943..e77c83070b 100644
--- a/mypy/test/teststubgen.py
+++ b/mypy/test/teststubgen.py
@@ -861,7 +861,6 @@ def test_c_module(self) -> None:
p = m.get_package_properties('_socket')
assert p is not None
assert p.name == '_socket'
- assert p.file
assert p.path is None
assert p.is_c_module is True
assert p.subpackages == []
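The commit message above can be verified directly: built-in C modules really do lack ``__file__``. A minimal probe, independent of mypy (plain Python, no assumptions beyond the standard library):

```python
import sys

# Built-in modules (e.g. sys, builtins) are compiled into the interpreter
# binary, so they expose no __file__ attribute -- which is why the
# `assert p.file` removed above was too strict on some distributions.
for name in ('sys', 'builtins'):
    mod = sys.modules[name]
    print(name, getattr(mod, '__file__', None))
```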
......@@ -2,7 +2,7 @@ Author: Michael R. Crusoe <michael.crusoe@gmail.com>
Description: make the build more verbose
--- mypy.orig/setup.py
+++ mypy/setup.py
@@ -149,6 +149,7 @@
@@ -146,6 +146,7 @@
# Use multi-file compilation mode on windows because without it
# our Appveyor builds run out of memory sometimes.
multi_file=sys.platform == 'win32' or force_multifile,
......
......@@ -13,9 +13,11 @@ endif
export DEB_BUILD_MAINT_OPTIONS=hardening=+all
include /usr/share/dpkg/default.mk
ifneq (,$(filter $(DEB_HOST_ARCH),armel armhf riscv64))
ifneq (,$(filter $(DEB_BUILD_ARCH),hppa))
export MYPYC_OPT_LEVEL=2
else ifneq (,$(filter $(DEB_BUILD_ARCH),armel armhf))
export MYPYC_OPT_LEVEL=1
else ifneq (,$(filter $(DEB_HOST_ARCH),m68k))
else ifneq (,$(filter $(DEB_BUILD_ARCH),m68k riscv64))
export MYPYC_OPT_LEVEL=0
endif
......@@ -43,7 +45,7 @@ debian/stubgen_options.rst: docs/source/stubgen.rst
sed -n -e '/stubgen --help/,$$ {/stubgen --help/d; p}' $< > $@
override_dh_auto_build: manpages
ifneq (,$(filter $(DEB_HOST_ARCH),mips64el mipsel alpha ia64 powerpc sh4))
ifneq (,$(filter $(DEB_BUILD_ARCH),mips64el mipsel alpha ia64 powerpc sh4))
MYPY_USE_MYPYC=0 dh_auto_build
else
MYPY_USE_MYPYC=1 dh_auto_build
......
......@@ -896,7 +896,7 @@ dictionary value depends on the key:
.. code-block:: python
from mypy_extensions import TypedDict
from typing_extensions import TypedDict
Movie = TypedDict('Movie', {'name': str, 'year': int})
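For illustration, a value of the ``Movie`` type defined above behaves like a plain dict at runtime (a hypothetical usage sketch; ``TypedDict`` is imported from ``typing`` here, which requires Python 3.8+):

```python
from typing import TypedDict  # on older Pythons, use typing_extensions instead

Movie = TypedDict('Movie', {'name': str, 'year': int})

m: Movie = {'name': 'Blade Runner', 'year': 1982}
# TypedDict constrains the static type only; the runtime value is a dict.
print(type(m) is dict, m['year'])
```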
......@@ -972,17 +972,19 @@ a subtype of (that is, compatible with) ``Mapping[str, object]``, since
.. note::
You need to install ``mypy_extensions`` using pip to use ``TypedDict``:
Unless you are on Python 3.8 or newer (where ``TypedDict`` is available in
standard library :py:mod:`typing` module) you need to install ``typing_extensions``
using pip to use ``TypedDict``:
.. code-block:: text
python3 -m pip install --upgrade mypy-extensions
python3 -m pip install --upgrade typing-extensions
Or, if you are using Python 2:
.. code-block:: text
pip install --upgrade mypy-extensions
pip install --upgrade typing-extensions
Totality
--------
......@@ -1071,7 +1073,7 @@ in Python 3.6 and later:
.. code-block:: python
from mypy_extensions import TypedDict
from typing_extensions import TypedDict
class Movie(TypedDict):
name: str
......
......@@ -21,8 +21,8 @@ you'll find errors sooner.
.. note::
The mypy daemon is experimental. In particular, the command-line
interface may change in future mypy releases.
The command-line interface of the mypy daemon may change in future mypy
releases.
.. note::
......@@ -63,8 +63,8 @@ changed a few files. You can use :ref:`remote caching <remote-cache>`
to speed up the initial run. The speedup can be significant if
you have a large codebase.
Additional features
*******************
Daemon client commands
**********************
While ``dmypy run`` is sufficient for most uses, some workflows
(ones using :ref:`remote caching <remote-cache>`, perhaps),
......@@ -87,6 +87,11 @@ require more precise control over the lifetime of the daemon process:
* ``dmypy check <files>`` checks a set of files using an already
running daemon.
* ``dmypy recheck`` checks the same set of files as the most recent
``check`` or ``recheck`` command. (You can also use the :option:`--update`
and :option:`--remove` options to alter the set of files, and to define
which files should be processed.)
* ``dmypy status`` checks whether a daemon is running. It prints a
diagnostic and exits with ``0`` if there is a running daemon.
......@@ -94,6 +99,157 @@ Use ``dmypy --help`` for help on additional commands and command-line
options not discussed here, and ``dmypy <command> --help`` for help on
command-specific options.
Additional daemon flags
***********************
.. option:: --status-file FILE
Use ``FILE`` as the status file for storing daemon runtime state. This is
normally a JSON file that contains information about daemon process and
connection. The default path is ``.dmypy.json`` in the current working
directory.
.. option:: --log-file FILE
Direct daemon stdout/stderr to ``FILE``. This is useful for debugging daemon
crashes, since the server traceback is not always printed by the client.
This is available for the ``start``, ``restart``, and ``run`` commands.
.. option:: --timeout TIMEOUT
Automatically shut down server after ``TIMEOUT`` seconds of inactivity.
This is available for the ``start``, ``restart``, and ``run`` commands.
.. option:: --update FILE
Re-check ``FILE``, or add it to the set of files being
checked (and check it). This option may be repeated, and it's only available for
the ``recheck`` command. By default, mypy finds and checks all files changed
since the previous run and files that depend on them. However, if you use this option
(and/or :option:`--remove`), mypy assumes that only the explicitly
specified files have changed. This is only useful to
speed up mypy if you type check a very large number of files, and use an
external, fast file system watcher, such as `watchman`_ or
`watchdog`_, to determine which files got edited or deleted.
*Note:* This option is never required and is only available for
performance tuning.
.. option:: --remove FILE
Remove ``FILE`` from the set of files being checked. This option may be
repeated. This is only available for the
``recheck`` command. See :option:`--update` above for when this may be useful.
*Note:* This option is never required and is only available for performance
tuning.
.. option:: --fswatcher-dump-file FILE
Collect information about the current internal file state. This is
only available for the ``status`` command. This will dump JSON to
``FILE`` in the format ``{path: [modification_time, size,
content_hash]}``. This is useful for debugging the built-in file
system watcher. *Note:* This is an internal flag and the format may
change.
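The dump format can be reproduced by hand. In this sketch, the hash algorithm (MD5) and the exact field layout are assumptions based on the description above, not mypy internals:

```python
import hashlib
import json
import os
import tempfile

# Build one entry in the {path: [modification_time, size, content_hash]}
# shape described above, for a throwaway file.
with tempfile.NamedTemporaryFile('w', suffix='.py', delete=False) as f:
    f.write('x = 1\n')
    path = f.name

data = open(path, 'rb').read()
snapshot = {path: [os.path.getmtime(path), len(data),
                   hashlib.md5(data).hexdigest()]}
print(json.dumps(snapshot))
os.unlink(path)
```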
.. option:: --perf-stats-file FILE
Write performance profiling information to ``FILE``. This is only available
for the ``check``, ``recheck``, and ``run`` commands.
Static inference of annotations
*******************************
The mypy daemon supports (as an experimental feature) statically inferring
draft function and method type annotations. Use ``dmypy suggest FUNCTION`` to
generate a draft signature in the format
``(param_type_1, param_type_2, ...) -> ret_type`` (types are included for all
arguments, including keyword-only arguments, ``*args`` and ``**kwargs``).
This is a low-level feature intended to be used by editor integrations,
IDEs, and other tools (for example, the `mypy plugin for PyCharm`_),
to automatically add annotations to source files, or to propose function
signatures.
In this example, the function ``format_id()`` has no annotation:
.. code-block:: python
def format_id(user):
return "User: {}".format(user)
root = format_id(0)
``dmypy suggest`` uses call sites, return statements, and other heuristics (such as
looking for signatures in base classes) to infer that ``format_id()`` accepts
an ``int`` argument and returns a ``str``. Use ``dmypy suggest module.format_id`` to
print the suggested signature for the function.
More generally, the target function may be specified in two ways:
* By its fully qualified name, i.e. ``[package.]module.[class.]function``.
* By its location in a source file, i.e. ``/path/to/file.py:line``. The path can be
absolute or relative, and ``line`` can refer to any line number within
the function body.
This command can also be used to find a more precise alternative for an existing,
imprecise annotation with some ``Any`` types.
The following flags customize various aspects of the ``dmypy suggest``
command.
.. option:: --json
Output the signature as JSON, so that `PyAnnotate`_ can read it and add
the signature to the source file. Here is what the JSON looks like:
.. code-block:: python
[{"func_name": "example.format_id",
"line": 1,
"path": "/absolute/path/to/example.py",
"samples": 0,
"signature": {"arg_types": ["int"], "return_type": "str"}}]
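The JSON above is plain data, so a consuming tool needs only the standard library. A sketch of reading the sample payload (the values are copied from the example above, not generated by dmypy):

```python
import json

payload = '''
[{"func_name": "example.format_id",
  "line": 1,
  "path": "/absolute/path/to/example.py",
  "samples": 0,
  "signature": {"arg_types": ["int"], "return_type": "str"}}]
'''
suggestions = json.loads(payload)
sig = suggestions[0]["signature"]
# Reassemble the draft signature string that dmypy suggest prints by default.
print('({}) -> {}'.format(', '.join(sig["arg_types"]), sig["return_type"]))
```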
.. option:: --no-errors
Only produce suggestions that cause no errors in the checked code. By default,
mypy will try to find the most precise type, even if it causes some type errors.
.. option:: --no-any
Only produce suggestions that don't contain ``Any`` types. By default mypy
proposes the most precise signature found, even if it contains ``Any`` types.
.. option:: --flex-any FRACTION
Only allow some fraction of types in the suggested signature to be ``Any`` types.
The fraction ranges from ``0`` (same as ``--no-any``) to ``1``.
.. option:: --try-text
Try also using ``unicode`` wherever ``str`` is inferred. This flag may be useful
for annotating Python 2/3 straddling code.
.. option:: --callsites
Only find call sites for a given function instead of suggesting a type.
This will produce a list with line numbers and types of actual
arguments for each call: ``/path/to/file.py:line: (arg_type_1, arg_type_2, ...)``.
.. option:: --use-fixme NAME
Use a dummy name instead of plain ``Any`` for types that cannot
be inferred. This may be useful to emphasize to a user that a given type
couldn't be inferred and needs to be entered manually.
.. option:: --max-guesses NUMBER
Set the maximum number of types to try for a function (default: ``64``).
.. TODO: Add similar sections about go to definition, find usages, and
reveal type when added, and then move this to a separate file.
Limitations
***********
......@@ -102,3 +258,8 @@ Limitations
limitation. This can be defined
through the command line or through a
:ref:`configuration file <config-file>`.
.. _watchman: https://facebook.github.io/watchman/
.. _watchdog: https://pypi.org/project/watchdog/
.. _PyAnnotate: https://github.com/dropbox/pyannotate
.. _mypy plugin for PyCharm: https://github.com/dropbox/mypy-PyCharm-plugin
......@@ -104,6 +104,9 @@ Details of the options:
You can't mix paths and :option:`-m`/:option:`-p` options in the same stubgen
invocation.
Stubgen applies heuristics to avoid generating stubs for submodules
that include tests or vendored third-party packages.
Specifying how to generate stubs
********************************
......@@ -116,12 +119,13 @@ alter the default behavior:
.. option:: --no-import
Don't try to import modules. Instead use mypy's normal search mechanism to find
Don't try to import modules. Instead only use mypy's normal search mechanism to find
sources. This does not support C extension modules. This flag also disables
runtime introspection functionality, which mypy uses to find the value of
``__all__``. As a result the set of exported imported names in stubs may be
incomplete. This flag is generally only useful when importing a module generates
an error for some reason.
incomplete. This flag is generally only useful when importing a module causes
unwanted side effects, such as the running of tests. Stubgen tries to skip test
modules even without this option, but this does not always work.
.. option:: --parse-only
......@@ -153,6 +157,12 @@ Additional flags
Include definitions that are considered private in stubs (with names such
as ``_foo`` with single leading underscore and no trailing underscores).
.. option:: --export-less
Don't export all names imported from other modules within the same package.
Instead, only export imported names that are not referenced in the module
that contains the import.
.. option:: --search-path PATH
Specify module search directories, separated by colons (only used if
......@@ -171,3 +181,11 @@ Additional flags
``./out`` directory. The output directory will be created if it doesn't
exist. Existing stubs in the output directory will be overwritten without
warning.
.. option:: -v, --verbose
Produce more verbose output.
.. option:: -q, --quiet
Produce less verbose output.
Metadata-Version: 2.1
Name: mypy
Version: 0.750
Version: 0.761
Summary: Optional static typing for Python
Home-page: http://www.mypy-lang.org/
Author: Jukka Lehtosalo
......
......@@ -81,7 +81,6 @@ mypy/gclogger.py
mypy/git.py
mypy/indirection.py
mypy/infer.py
mypy/interpreted_plugin.py
mypy/ipc.py
mypy/join.py
mypy/literals.py
......@@ -237,7 +236,6 @@ mypy/typeshed/stdlib/2/_sre.pyi
mypy/typeshed/stdlib/2/_struct.pyi
mypy/typeshed/stdlib/2/_symtable.pyi
mypy/typeshed/stdlib/2/_threading_local.pyi
mypy/typeshed/stdlib/2/_warnings.pyi
mypy/typeshed/stdlib/2/abc.pyi
mypy/typeshed/stdlib/2/ast.pyi
mypy/typeshed/stdlib/2/atexit.pyi
......@@ -360,6 +358,7 @@ mypy/typeshed/stdlib/2and3/_csv.pyi
mypy/typeshed/stdlib/2and3/_curses.pyi
mypy/typeshed/stdlib/2and3/_heapq.pyi
mypy/typeshed/stdlib/2and3/_random.pyi
mypy/typeshed/stdlib/2and3/_warnings.pyi
mypy/typeshed/stdlib/2and3/_weakref.pyi
mypy/typeshed/stdlib/2and3/_weakrefset.pyi
mypy/typeshed/stdlib/2and3/aifc.pyi
......@@ -410,10 +409,13 @@ mypy/typeshed/stdlib/2and3/keyword.pyi
mypy/typeshed/stdlib/2and3/linecache.pyi
mypy/typeshed/stdlib/2and3/locale.pyi
mypy/typeshed/stdlib/2and3/macpath.pyi
mypy/typeshed/stdlib/2and3/mailbox.pyi
mypy/typeshed/stdlib/2and3/mailcap.pyi
mypy/typeshed/stdlib/2and3/marshal.pyi
mypy/typeshed/stdlib/2and3/math.pyi
mypy/typeshed/stdlib/2and3/mimetypes.pyi
mypy/typeshed/stdlib/2and3/mmap.pyi
mypy/typeshed/stdlib/2and3/modulefinder.pyi
mypy/typeshed/stdlib/2and3/netrc.pyi
mypy/typeshed/stdlib/2and3/nis.pyi
mypy/typeshed/stdlib/2and3/ntpath.pyi
......@@ -585,7 +587,6 @@ mypy/typeshed/stdlib/3/_subprocess.pyi
mypy/typeshed/stdlib/3/_thread.pyi
mypy/typeshed/stdlib/3/_threading_local.pyi
mypy/typeshed/stdlib/3/_tracemalloc.pyi
mypy/typeshed/stdlib/3/_warnings.pyi
mypy/typeshed/stdlib/3/_winapi.pyi
mypy/typeshed/stdlib/3/abc.pyi
mypy/typeshed/stdlib/3/ast.pyi
......
typed_ast<1.5.0,>=1.4.0
typing_extensions>=3.7.4
mypy_extensions<0.5.0,>=0.4.0
mypy_extensions<0.5.0,>=0.4.3
[dmypy]
psutil>=4.0
......@@ -50,7 +50,7 @@ from mypy.parse import parse
from mypy.stats import dump_type_stats
from mypy.types import Type
from mypy.version import __version__
from mypy.plugin import Plugin, ChainedPlugin, plugin_types, ReportConfigContext
from mypy.plugin import Plugin, ChainedPlugin, ReportConfigContext
from mypy.plugins.default import DefaultPlugin
from mypy.fscache import FileSystemCache
from mypy.metastore import MetadataStore, FilesystemMetadataStore, SqliteMetadataStore
......@@ -280,7 +280,6 @@ CacheMeta = NamedTuple('CacheMeta',
('data_mtime', int), # mtime of data_json
('data_json', str), # path of <id>.data.json
('suppressed', List[str]), # dependencies that weren't imported
('child_modules', List[str]), # all submodules of the given module
('options', Optional[Dict[str, object]]), # build options
# dep_prios and dep_lines are in parallel with
# dependencies + suppressed.
......@@ -317,7 +316,6 @@ def cache_meta_from_dict(meta: Dict[str, Any], data_json: str) -> CacheMeta:
int(meta['data_mtime']) if 'data_mtime' in meta else sentinel,
data_json,
meta.get('suppressed', []),
meta.get('child_modules', []),
meta.get('options'),
meta.get('dep_prios', []),
meta.get('dep_lines', []),
......@@ -423,7 +421,7 @@ def load_plugins_from_config(
plugin_error(
'Type object expected as the return value of "plugin"; got {!r} (in {})'.format(
plugin_type, plugin_path))
if not issubclass(plugin_type, plugin_types):
if not issubclass(plugin_type, Plugin):
plugin_error(
'Return value of "plugin" must be a subclass of "mypy.plugin.Plugin" '
'(in {})'.format(plugin_path))
......@@ -1320,7 +1318,6 @@ def validate_meta(meta: Optional[CacheMeta], id: str, path: Optional[str],
'data_mtime': meta.data_mtime,
'dependencies': meta.dependencies,
'suppressed': meta.suppressed,
'child_modules': meta.child_modules,
'options': (manager.options.clone_for_module(id)
.select_options_affecting_cache()),
'dep_prios': meta.dep_prios,
......@@ -1364,7 +1361,7 @@ def json_dumps(obj: Any, debug_cache: bool) -> str:
def write_cache(id: str, path: str, tree: MypyFile,
dependencies: List[str], suppressed: List[str],
child_modules: List[str], dep_prios: List[int], dep_lines: List[int],
dep_prios: List[int], dep_lines: List[int],
old_interface_hash: str, source_hash: str,
ignore_all: bool, manager: BuildManager) -> Tuple[str, Optional[CacheMeta]]:
"""Write cache files for a module.
......@@ -1379,7 +1376,6 @@ def write_cache(id: str, path: str, tree: MypyFile,
tree: the fully checked module data
dependencies: module IDs on which this module depends
suppressed: module IDs which were suppressed as dependencies
child_modules: module IDs which are this package's direct submodules
dep_prios: priorities (parallel array to dependencies)
dep_lines: import line locations (parallel array to dependencies)
old_interface_hash: the hash from the previous version of the data cache file
......@@ -1469,7 +1465,6 @@ def write_cache(id: str, path: str, tree: MypyFile,
'data_mtime': data_mtime,
'dependencies': dependencies,
'suppressed': suppressed,
'child_modules': child_modules,
'options': options.select_options_affecting_cache(),
'dep_prios': dep_prios,
'dep_lines': dep_lines,
......@@ -1688,9 +1683,6 @@ class State:
# Parent package, its parent, etc.
ancestors = None # type: Optional[List[str]]
# A list of all direct submodules of a given module
child_modules = None # type: Set[str]
# List of (path, line number) tuples giving context for import
import_context = None # type: List[Tuple[str, int]]
......@@ -1797,7 +1789,6 @@ class State:
assert len(all_deps) == len(self.meta.dep_lines)
self.dep_line_map = {id: line
for id, line in zip(all_deps, self.meta.dep_lines)}
self.child_modules = set(self.meta.child_modules)
if temporary:
self.load_tree(temporary=True)
if not manager.use_fine_grained_cache():
......@@ -1824,7 +1815,6 @@ class State:
# Parse the file (and then some) to get the dependencies.
self.parse_file()
self.compute_dependencies()
self.child_modules = set()
@property
def xmeta(self) -> CacheMeta:
......@@ -1855,8 +1845,7 @@ class State:
# dependency is added back we find out later in the process.
return (self.meta is not None
and self.is_interface_fresh()
and self.dependencies == self.meta.dependencies
and self.child_modules == set(self.meta.child_modules))
and self.dependencies == self.meta.dependencies)
def is_interface_fresh(self) -> bool:
return self.externally_same
......@@ -2004,9 +1993,11 @@ class State:
self.path, os.strerror(ioerr.errno))],
module_with_blocker=self.id)
except (UnicodeDecodeError, DecodeError) as decodeerr:
raise CompileError([
"mypy: can't decode file '{}': {}".format(self.path, str(decodeerr))],
module_with_blocker=self.id)
if self.path.endswith('.pyd'):
err = "mypy: stubgen does not support .pyd files: '{}'".format(self.path)
else:
err = "mypy: can't decode file '{}': {}".format(self.path, str(decodeerr))
raise CompileError([err], module_with_blocker=self.id)
else:
assert source is not None
self.source_hash = compute_hash(source)
......@@ -2239,7 +2230,7 @@ class State:
"Duplicates in dependencies list for {} ({})".format(self.id, self.dependencies))
new_interface_hash, self.meta = write_cache(
self.id, self.path, self.tree,
list(self.dependencies), list(self.suppressed), list(self.child_modules),
list(self.dependencies), list(self.suppressed),
dep_prios, dep_lines, self.interface_hash, self.source_hash, self.ignore_all,
self.manager)
if new_interface_hash == self.interface_hash:
......@@ -2793,8 +2784,6 @@ def load_graph(sources: List[BuildSource], manager: BuildManager,
assert newst.id not in graph, newst.id
graph[newst.id] = newst
new.append(newst)
if dep in st.ancestors and dep in graph:
graph[dep].child_modules.add(st.id)
if dep in graph and dep in st.suppressed_set:
# Previously suppressed file is now visible
st.add_dependency(dep)
......
......@@ -2042,6 +2042,7 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
self.check_assignment_to_multiple_lvalues(lvalue.items, rvalue, rvalue,
infer_lvalue_type)
else:
self.try_infer_partial_generic_type_from_assignment(lvalue, rvalue)
lvalue_type, index_lvalue, inferred = self.check_lvalue(lvalue)
# If we're assigning to __getattr__ or similar methods, check that the signature is
# valid.
......@@ -2141,6 +2142,44 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
rvalue_type = remove_instance_last_known_values(rvalue_type)
self.infer_variable_type(inferred, lvalue, rvalue_type, rvalue)
def try_infer_partial_generic_type_from_assignment(self,
lvalue: Lvalue,
rvalue: Expression) -> None:
"""Try to infer a precise type for partial generic type from assignment.
Example where this happens:
x = []
if foo():
x = [1] # Infer List[int] as type of 'x'
"""
var = None
if (isinstance(lvalue, NameExpr)
and isinstance(lvalue.node, Var)
and isinstance(lvalue.node.type, PartialType)):
var = lvalue.node
elif isinstance(lvalue, MemberExpr):
var = self.expr_checker.get_partial_self_var(lvalue)
if var is not None:
typ = var.type
assert isinstance(typ, PartialType)
if typ.type is None:
return
# TODO: some logic here duplicates the None partial type counterpart
# inlined in check_assignment(), see #8043.
partial_types = self.find_partial_types(var)
if partial_types is None:
return
rvalue_type = self.expr_checker.accept(rvalue)
rvalue_type = get_proper_type(rvalue_type)
if isinstance(rvalue_type, Instance):
if rvalue_type.type == typ.type and is_valid_inferred_type(rvalue_type):
var.type = rvalue_type
del partial_types[var]
elif isinstance(rvalue_type, AnyType):
var.type = fill_typevars_with_any(typ.type)
del partial_types[var]
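The docstring's example runs as ordinary Python, which shows the code pattern this new inference targets (mypy's analysis itself happens statically; this sketch only illustrates the shape of the code):

```python
def foo() -> bool:
    return True

x = []        # mypy records a partial type here: a list of <nothing>
if foo():
    x = [1]   # with this change, the assignment refines x to List[int]
print(x)
```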
def check_compatibility_all_supers(self, lvalue: RefExpr, lvalue_type: Optional[Type],
rvalue: Expression) -> bool:
lvalue_node = lvalue.node
......@@ -2771,16 +2810,17 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
def infer_partial_type(self, name: Var, lvalue: Lvalue, init_type: Type) -> bool:
init_type = get_proper_type(init_type)
if isinstance(init_type, NoneType):
partial_type = PartialType(None, name, [init_type])
partial_type = PartialType(None, name)
elif isinstance(init_type, Instance):
fullname = init_type.type.fullname
if (isinstance(lvalue, (NameExpr, MemberExpr)) and
(fullname == 'builtins.list' or
fullname == 'builtins.set' or
fullname == 'builtins.dict') and
fullname == 'builtins.dict' or
fullname == 'collections.OrderedDict') and
all(isinstance(t, (NoneType, UninhabitedType))
for t in get_proper_types(init_type.args))):
partial_type = PartialType(init_type.type, name, init_type.args)
partial_type = PartialType(init_type.type, name)
else:
return False
else:
......@@ -2815,10 +2855,14 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
We implement this here by giving x a valid type (replacing inferred <nothing> with Any).
"""
fallback = self.inference_error_fallback_type(type)
self.set_inferred_type(var, lvalue, fallback)
def inference_error_fallback_type(self, type: Type) -> Type:
fallback = type.accept(SetNothingToAny())
# Type variables may leak from inference, see https://github.com/python/mypy/issues/5738,
# we therefore need to erase them.
self.set_inferred_type(var, lvalue, erase_typevars(fallback))
return erase_typevars(fallback)
def check_simple_assignment(self, lvalue_type: Optional[Type], rvalue: Expression,
context: Context,
......@@ -2960,8 +3004,12 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
def try_infer_partial_type_from_indexed_assignment(
self, lvalue: IndexExpr, rvalue: Expression) -> None:
# TODO: Should we share some of this with try_infer_partial_type?
var = None
if isinstance(lvalue.base, RefExpr) and isinstance(lvalue.base.node, Var):
var = lvalue.base.node
elif isinstance(lvalue.base, MemberExpr):
var = self.expr_checker.get_partial_self_var(lvalue.base)
if isinstance(var, Var):
if isinstance(var.type, PartialType):
type_type = var.type.type
if type_type is None:
......@@ -2970,19 +3018,15 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
if partial_types is None:
return
typename = type_type.fullname
if typename == 'builtins.dict':
if typename == 'builtins.dict' or typename == 'collections.OrderedDict':
# TODO: Don't infer things twice.
key_type = self.expr_checker.accept(lvalue.index)
value_type = self.expr_checker.accept(rvalue)
full_key_type = make_simplified_union(
[key_type, var.type.inner_types[0]])
full_value_type = make_simplified_union(
[value_type, var.type.inner_types[1]])
if (is_valid_inferred_type(full_key_type) and
is_valid_inferred_type(full_value_type)):
if (is_valid_inferred_type(key_type) and
is_valid_inferred_type(value_type)):
if not self.current_node_deferred:
var.type = self.named_generic_type('builtins.dict',
[full_key_type, full_value_type])
var.type = self.named_generic_type(typename,
[key_type, value_type])
del partial_types[var]
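The pattern this change newly covers: indexed assignment into an empty ``collections.OrderedDict`` now drives inference the same way ``builtins.dict`` already did. A runnable illustration of the code shape (the inferred types in the comments are what mypy computes statically):

```python
from collections import OrderedDict

d = OrderedDict()   # partial type: OrderedDict of <nothing>, <nothing>
d['answer'] = 42    # mypy can now infer OrderedDict[str, int] from this
print(d['answer'])
```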
def visit_expression_stmt(self, s: ExpressionStmt) -> None:
......@@ -3036,7 +3080,9 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
and not self.current_node_deferred
and not is_proper_subtype(AnyType(TypeOfAny.special_form), return_type)
and not (defn.name in BINARY_MAGIC_METHODS and
is_literal_not_implemented(s.expr))):
is_literal_not_implemented(s.expr))
and not (isinstance(return_type, Instance) and
return_type.type.fullname == 'builtins.object')):
self.msg.incorrectly_returning_any(return_type, s)
return
......@@ -4027,7 +4073,9 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
subtype = get_proper_type(subtype)
supertype = get_proper_type(supertype)
if self.msg.try_report_long_tuple_assignment_error(subtype, supertype, context, msg,
subtype_label, supertype_label, code=code):
return False
if self.should_suppress_optional_error([subtype]):
return False
extra_info = [] # type: List[str]
......@@ -4260,7 +4308,7 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
else:
return Instance(
typ.type,
[AnyType(TypeOfAny.unannotated) for _ in typ.inner_types])
[AnyType(TypeOfAny.unannotated)] * len(typ.type.type_vars))
def is_defined_in_base_class(self, var: Var) -> bool:
if var.info:
......@@ -4294,7 +4342,14 @@ class TypeChecker(NodeVisitor[None], CheckerPluginInterface):
# All scopes within the outermost function are active. Scopes out of
# the outermost function are inactive to allow local reasoning (important
# for fine-grained incremental mode).
scope_active = (not self.options.local_partial_types
disallow_other_scopes = self.options.local_partial_types
if isinstance(var.type, PartialType) and var.type.type is not None and var.info:
# This is an ugly hack to make partial generic self attributes behave
# as if --local-partial-types is always on (because it used to be like this).
disallow_other_scopes = True
scope_active = (not disallow_other_scopes
or scope.is_local == self.partial_types[-1].is_local)
return scope_active, scope.is_local, scope.map
return False, False, None
......
......@@ -537,18 +537,40 @@ class ExpressionChecker(ExpressionVisitor[Type]):
return callee
def get_partial_self_var(self, expr: MemberExpr) -> Optional[Var]:
"""Get variable node for a partial self attribute.
If the expression is not a self attribute, or attribute is not variable,
or variable is not partial, return None.
"""
if not (isinstance(expr.expr, NameExpr) and
isinstance(expr.expr.node, Var) and expr.expr.node.is_self):
# Not a self.attr expression.
return None
info = self.chk.scope.enclosing_class()
if not info or expr.name not in info.names:
# Don't mess with partial types in superclasses.
return None
sym = info.names[expr.name]
if isinstance(sym.node, Var) and isinstance(sym.node.type, PartialType):
return sym.node
return None
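``get_partial_self_var`` exists so that self-attribute assignments participate in partial-type inference. The kind of source code it recognizes looks like this (a hypothetical illustration, runnable on its own):

```python
class Tracker:
    def __init__(self) -> None:
        self.items = []        # partial type: a list of <nothing>

    def add(self, x: int) -> None:
        self.items.append(x)   # refines the self attribute to List[int]

t = Tracker()
t.add(3)
print(t.items)
```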
# Types and methods that can be used to infer partial types.
item_args = {'builtins.list': ['append'],
'builtins.set': ['add', 'discard'],
} # type: ClassVar[Dict[str, List[str]]]
container_args = {'builtins.list': {'extend': ['builtins.list']},
'builtins.dict': {'update': ['builtins.dict']},
'collections.OrderedDict': {'update': ['builtins.dict']},
'builtins.set': {'update': ['builtins.set', 'builtins.list']},
} # type: ClassVar[Dict[str, Dict[str, List[str]]]]
def try_infer_partial_type(self, e: CallExpr) -> None:
if isinstance(e.callee, MemberExpr) and isinstance(e.callee.expr, RefExpr):
var = e.callee.expr.node
if var is None and isinstance(e.callee.expr, MemberExpr):
var = self.get_partial_self_var(e.callee.expr)
if not isinstance(var, Var):
return
partial_types = self.chk.find_partial_types(var)
......@@ -566,10 +588,8 @@ class ExpressionChecker(ExpressionVisitor[Type]):
if (typename in self.item_args and methodname in self.item_args[typename]
and e.arg_kinds == [ARG_POS]):
item_type = self.accept(e.args[0])
full_item_type = make_simplified_union(
[item_type, partial_type.inner_types[0]])
if mypy.checker.is_valid_inferred_type(full_item_type):
var.type = self.chk.named_generic_type(typename, [full_item_type])
if mypy.checker.is_valid_inferred_type(item_type):
var.type = self.chk.named_generic_type(typename, [item_type])
del partial_types[var]
elif (typename in self.container_args
and methodname in self.container_args[typename]
......@@ -578,15 +598,10 @@ class ExpressionChecker(ExpressionVisitor[Type]):
if isinstance(arg_type, Instance):
arg_typename = arg_type.type.fullname
if arg_typename in self.container_args[typename][methodname]:
full_item_types = [
make_simplified_union([item_type, prev_type])
for item_type, prev_type
in zip(arg_type.args, partial_type.inner_types)
]
if all(mypy.checker.is_valid_inferred_type(item_type)
for item_type in full_item_types):
for item_type in arg_type.args):
var.type = self.chk.named_generic_type(typename,
list(full_item_types))
list(arg_type.args))
del partial_types[var]
def apply_function_plugin(self,
......@@ -1480,8 +1495,12 @@ class ExpressionChecker(ExpressionVisitor[Type]):
target = AnyType(TypeOfAny.from_error)
if not self.chk.should_suppress_optional_error(arg_types):
if not is_operator_method(callable_name):
code = None
else:
code = codes.OPERATOR
arg_messages.no_variant_matches_arguments(
plausible_targets, callee, arg_types, context)
plausible_targets, callee, arg_types, context, code=code)
result = self.check_call(target, args, arg_kinds, context, arg_names,
arg_messages=arg_messages,
......
@@ -3221,6 +3240,7 @@ class ExpressionChecker(ExpressionVisitor[Type]):
         self.chk.return_types.append(AnyType(TypeOfAny.special_form))
         # Type check everything in the body except for the final return
         # statement (it can contain tuple unpacking before return).
+        with self.chk.scope.push_function(e):
             for stmt in e.body.body[:-1]:
                 stmt.accept(self.chk)
         # Only type check the return expression, not the return statement.
......
@@ -3235,6 +3255,7 @@ class ExpressionChecker(ExpressionVisitor[Type]):
         self.chk.return_types.append(inferred_type.ret_type)
         self.chk.check_func_item(e, type_override=type_override)
         if e.expr() not in self.chk.type_map:
+            # TODO: return expression must be accepted before exiting function scope.
             self.accept(e.expr(), allow_none_return=True)
         ret_type = self.chk.type_map[e.expr()]
         if isinstance(get_proper_type(ret_type), NoneType):
......
@@ -4260,3 +4281,13 @@ def type_info_from_type(typ: Type) -> Optional[TypeInfo]:
     # A complicated type. Too tricky, give up.
     # TODO: Do something more clever here.
     return None
+
+
+def is_operator_method(fullname: Optional[str]) -> bool:
+    if fullname is None:
+        return False
+    short_name = fullname.split('.')[-1]
+    return (
+        short_name in nodes.op_methods.values() or
+        short_name in nodes.reverse_op_methods.values() or
+        short_name in nodes.unary_op_methods.values())
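The new `is_operator_method()` helper simply compares the last component of a fully qualified name against mypy's operator-method tables. A standalone sketch of that logic (the toy `OP_METHODS` tables below are an illustrative subset, not mypy's actual `mypy.nodes` mappings):

```python
from typing import Optional

# Hypothetical stand-ins for mypy.nodes.op_methods and friends.
OP_METHODS = {'+': '__add__', '-': '__sub__', '*': '__mul__'}
REVERSE_OP_METHODS = {'+': '__radd__', '-': '__rsub__', '*': '__rmul__'}
UNARY_OP_METHODS = {'-': '__neg__', '+': '__pos__', '~': '__invert__'}


def is_operator_method(fullname: Optional[str]) -> bool:
    # 'builtins.int.__add__' is an operator method because its last
    # dotted component is a known operator dunder name.
    if fullname is None:
        return False
    short_name = fullname.split('.')[-1]
    return (short_name in OP_METHODS.values() or
            short_name in REVERSE_OP_METHODS.values() or
            short_name in UNARY_OP_METHODS.values())


print(is_operator_method('builtins.int.__add__'))  # True
print(is_operator_method('builtins.str.upper'))    # False
print(is_operator_method(None))                    # False
```

This is what lets the overload-mismatch error above carry the `operator` error code only when the failing callee really is an operator method.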
......
@@ -572,6 +572,7 @@ def analyze_var(name: str,
                                 mx.context, name, mx.msg)
         signature = bind_self(signature, mx.self_type, var.is_classmethod)
         expanded_signature = get_proper_type(expand_type_by_instance(signature, itype))
+        freeze_type_vars(expanded_signature)
         if var.is_property:
             # A property cannot have an overloaded type => the cast is fine.
             assert isinstance(expanded_signature, CallableType)
......
@@ -764,8 +765,8 @@ def analyze_class_attribute_access(itype: Instance,
         t = get_proper_type(t)
         if isinstance(t, FunctionLike) and is_classmethod:
             t = check_self_arg(t, mx.self_type, False, mx.context, name, mx.msg)
-        result = add_class_tvars(t, itype, isuper, is_classmethod,
-                                 mx.builtin_type, mx.self_type, original_vars=original_vars)
+        result = add_class_tvars(t, isuper, is_classmethod,
+                                 mx.self_type, original_vars=original_vars)
         if not mx.is_lvalue:
             result = analyze_descriptor_access(mx.original_type, result, mx.builtin_type,
                                                mx.msg, mx.context, chk=mx.chk)
......
@@ -808,9 +809,8 @@ def analyze_class_attribute_access(itype: Instance,
         return typ


-def add_class_tvars(t: ProperType, itype: Instance, isuper: Optional[Instance],
+def add_class_tvars(t: ProperType, isuper: Optional[Instance],
                     is_classmethod: bool,
-                    builtin_type: Callable[[str], Instance],
                     original_type: Type,
                     original_vars: Optional[List[TypeVarDef]] = None) -> Type:
     """Instantiate type variables during analyze_class_attribute_access,
......
@@ -821,17 +821,21 @@ def add_class_tvars(t: ProperType, isuper: Optional[Instance],
             def foo(cls: Type[Q]) -> Tuple[T, Q]: ...
         class B(A[str]): pass
         B.foo()

-    original_type is the value of the type B in the expression B.foo() or the corresponding
-    component in case if a union (this is used to bind the self-types); original_vars are type
-    variables of the class callable on which the method was accessed.
+    Args:
+        t: Declared type of the method (or property)
+        isuper: Current instance mapped to the superclass where method was defined, this
+            is usually done by map_instance_to_supertype()
+        is_classmethod: True if this method is decorated with @classmethod
+        original_type: The value of the type B in the expression B.foo() or the corresponding
+            component in case of a union (this is used to bind the self-types)
+        original_vars: Type variables of the class callable on which the method was accessed
+    Returns:
+        Expanded method type with added type variables (when needed).
     """
-    # TODO: verify consistency between Q and T
-    if is_classmethod:
-        assert isuper is not None
-        t = get_proper_type(expand_type_by_instance(t, isuper))
     # We add class type variables if the class method is accessed on class object
     # without applied type arguments, this matches the behavior of __init__().
     # For example (continuing the example in docstring):
......
@@ -847,13 +851,19 @@ def add_class_tvars(t: ProperType, isuper: Optional[Instance],
     if isinstance(t, CallableType):
         tvars = original_vars if original_vars is not None else []
         if is_classmethod:
             t = freshen_function_type_vars(t)
             t = bind_self(t, original_type, is_classmethod=True)
+            assert isuper is not None
+            t = cast(CallableType, expand_type_by_instance(t, isuper))
+            freeze_type_vars(t)
         return t.copy_modified(variables=tvars + t.variables)
     elif isinstance(t, Overloaded):
-        return Overloaded([cast(CallableType, add_class_tvars(item, itype, isuper, is_classmethod,
-                                                              builtin_type, original_type,
+        return Overloaded([cast(CallableType, add_class_tvars(item, isuper,
+                                                              is_classmethod, original_type,
                                                               original_vars=original_vars))
                            for item in t.items()])
+    if isuper is not None:
+        t = cast(ProperType, expand_type_by_instance(t, isuper))
     return t
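The docstring example for `add_class_tvars()` can be made concrete. Below is a minimal runnable sketch of the pattern it describes, a generic class whose classmethod binds its own `cls` type variable; `make` and the `bound='A'` constraint are hypothetical stand-ins for the docstring's `foo`:

```python
from typing import Generic, Type, TypeVar

T = TypeVar('T')
Q = TypeVar('Q', bound='A')


class A(Generic[T]):
    @classmethod
    def make(cls: Type[Q]) -> Q:
        # add_class_tvars() is the machinery that lets mypy type
        # B.make() as returning B: cls is bound to Type[B], and T is
        # solved from B's base class A[str].
        return cls()


class B(A[str]):
    pass


b = B.make()
print(type(b).__name__)  # B
```

At runtime this just constructs a `B`; the interesting part is the static picture, where the classmethod's own type variable `Q` must be kept while the class's `T` is instantiated via the superclass mapping (`isuper`).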
......
@@ -108,6 +108,14 @@ def _infer_constraints(template: Type, actual: Type,
     template = get_proper_type(template)
     actual = get_proper_type(actual)

+    # Type inference shouldn't be affected by whether union types have been simplified.
+    # We however keep any ErasedType items, so that the caller will see it when using
+    # checkexpr.has_erased_component().
+    if isinstance(template, UnionType):
+        template = mypy.typeops.make_simplified_union(template.items, keep_erased=True)
+    if isinstance(actual, UnionType):
+        actual = mypy.typeops.make_simplified_union(actual.items, keep_erased=True)
+
     # Ignore Any types from the type suggestion engine to avoid them
     # causing us to infer Any in situations where a better job could
     # be done otherwise. (This can produce false positives but that
......
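The `constraints.py` change above normalizes both `template` and `actual` unions before inferring constraints, so that, say, `Union[int, int, bool]` and `Union[bool, int]` drive inference identically. A toy sketch of that simplification over plain type names; the `SUBTYPES` table and these functions are made-up stand-ins for mypy's real `make_simplified_union` and subtype check:

```python
from typing import Dict, List, Set

# Hypothetical subtype table: bool is a proper subtype of int.
SUBTYPES: Dict[str, Set[str]] = {'bool': {'int'}}


def is_subtype(a: str, b: str) -> bool:
    return a == b or b in SUBTYPES.get(a, set())


def make_simplified_union(items: List[str]) -> List[str]:
    # Drop duplicates and any item that is a proper subtype of another
    # item, mirroring the normalization applied to template/actual.
    result: List[str] = []
    for item in items:
        if any(is_subtype(item, other) and item != other for other in items):
            continue
        if item not in result:
            result.append(item)
    return result


print(make_simplified_union(['int', 'int', 'bool']))  # ['int']
print(make_simplified_union(['bool', 'int']))         # ['int']
print(make_simplified_union(['int', 'str']))          # ['int', 'str']
```

The real implementation additionally keeps `ErasedType` items (`keep_erased=True`) so callers can still detect erased components after simplification.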
......
@@ -72,14 +72,14 @@ check_parser = p = subparsers.add_parser('check', formatter_class=AugmentedHelpF
 p.add_argument('-v', '--verbose', action='store_true', help="Print detailed status")
 p.add_argument('-q', '--quiet', action='store_true', help=argparse.SUPPRESS)  # Deprecated
 p.add_argument('--junit-xml', help="Write junit.xml to the given file")
-p.add_argument('--perf-stats-file', help='write telemetry information to the given file')
+p.add_argument('--perf-stats-file', help='write performance information to the given file')
 p.add_argument('files', metavar='FILE', nargs='+', help="File (or directory) to check")

 run_parser = p = subparsers.add_parser('run', formatter_class=AugmentedHelpFormatter,
                                        help="Check some files, [re]starting daemon if necessary")
 p.add_argument('-v', '--verbose', action='store_true', help="Print detailed status")
 p.add_argument('--junit-xml', help="Write junit.xml to the given file")
-p.add_argument('--perf-stats-file', help='write telemetry information to the given file')
+p.add_argument('--perf-stats-file', help='write performance information to the given file')
 p.add_argument('--timeout', metavar='TIMEOUT', type=int,
                help="Server shutdown timeout (in seconds)")
 p.add_argument('--log-file', metavar='FILE', type=str,
......
@@ -88,13 +88,13 @@ p.add_argument('flags', metavar='ARG', nargs='*', type=str,
                help="Regular mypy flags and files (precede with --)")

 recheck_parser = p = subparsers.add_parser('recheck', formatter_class=AugmentedHelpFormatter,
-    help="Re-check the previous list of files, with optional modifications (requires daemon).")
+    help="Re-check the previous list of files, with optional modifications (requires daemon)")
 p.add_argument('-v', '--verbose', action='store_true', help="Print detailed status")
 p.add_argument('-q', '--quiet', action='store_true', help=argparse.SUPPRESS)  # Deprecated
 p.add_argument('--junit-xml', help="Write junit.xml to the given file")
-p.add_argument('--perf-stats-file', help='write telemetry information to the given file')
+p.add_argument('--perf-stats-file', help='write performance information to the given file')
 p.add_argument('--update', metavar='FILE', nargs='*',
-               help="Files in the run to add or check again (default: all from previous run)..")
+               help="Files in the run to add or check again (default: all from previous run)")
 p.add_argument('--remove', metavar='FILE', nargs='*',
                help="Files to remove from the run")
......
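The `dmypy` argument definitions above follow a standard argparse subparser pattern. A trimmed, self-contained sketch of the `check` subparser with the reworded `--perf-stats-file` help (only flags shown in the diff are included; the surrounding formatter class and other subcommands are omitted):

```python
import argparse

parser = argparse.ArgumentParser(prog='dmypy')
subparsers = parser.add_subparsers(dest='command')

# Same p-aliasing idiom the dmypy client uses for each subcommand.
check_parser = p = subparsers.add_parser('check')
p.add_argument('-v', '--verbose', action='store_true', help="Print detailed status")
p.add_argument('--junit-xml', help="Write junit.xml to the given file")
p.add_argument('--perf-stats-file', help='write performance information to the given file')
p.add_argument('files', metavar='FILE', nargs='+', help="File (or directory) to check")

args = parser.parse_args(['check', '--perf-stats-file', 'stats.json', 'foo.py'])
print(args.command, args.perf_stats_file, args.files)  # check stats.json ['foo.py']
```

Note that argparse translates the `--perf-stats-file` flag into the `perf_stats_file` attribute on the parsed namespace, which is why the help-text change is purely cosmetic.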