Commit 8641eb14 authored by Michael Fladischer's avatar Michael Fladischer

New upstream version 2.2.2

parent 0b8b502f
Issues are for **concrete, actionable bugs and feature requests** only - if you're just asking for debugging help or technical support we have to direct you elsewhere. If you just have questions or support requests please use:
- Stack Overflow
- The Django Users mailing list django-users@googlegroups.com (https://groups.google.com/forum/#!forum/django-users)
We have to limit this because volunteer time to respond to issues is limited!
Please also try to include, if you can:
- Your OS and runtime environment, and browser if applicable
- A `pip freeze` output showing your package versions
- What you expected to happen vs. what actually happened
- How you're running Channels (runserver? daphne/runworker? Nginx/Apache in front?)
- Console logs and full tracebacks of any errors
@@ -7,3 +7,5 @@ build/
.hypothesis
.cache
.eggs
test_layer*
test_consumer*
sudo: false
dist: trusty
language: python
python:
- "2.7"
- "3.4"
- "3.5"
- "3.6"
- '3.5'
- '3.6'
install: pip install tox tox-travis
env:
- TWISTED="twisted==17.5.0"
- TWISTED="twisted"
script: tox
install:
- pip install $TWISTED isort unify flake8 -e .[tests]
- pip freeze
script:
- pytest
- flake8
- isort --check-only --diff --recursive daphne tests
- unify --check-only --recursive --quote \" daphne tests
jobs:
include:
- stage: release
script: skip
deploy:
provider: pypi
user: andrewgodwin_bot
on:
tags: true
distributions: sdist bdist_wheel
password:
secure: IA+dvSmMKN+fT47rgRb6zdmrExhK5QCVEDH8kheC6kAacw80ORBZKo6sMX9GQBJ3BlfhTqrzAhItHkDUxonb579rJDvmlJ7FPg7axZpsY9Fmls6q1rJC/La8iGWx20+ctberejKSH3wSwa0LH0imJXGDoKKzf1DLmk5pEEWjG2QqhKdEtyAcnzOPnDWcRCs+DKfQcMzETH7lMFN8oe3aBhHLLtcg4yA78cN5CeyyH92lmbaVp7k/b1FqXXFgf16bi5tlgLrb6DhmcnNjwLMSHRafNoPCXkWQOwh6gEHeHRR3OsHsBueyJHIikuHNrpmgpAqjYlVQ5WqmfgMlhCfRm9xL+G4G+KK9n8AJNGAszUfxVlPvMTw+nkOSd/bmxKrdCqqYnDIvDLucXJ86TstNzklfAwr3FL+wBlucRtOMLhQlHIaPTXYcNpOuh6B4ELjC+WjDGh8EdRKvcsZz7+5AS5ZaDDccuviMzQFsXVcE2d4HiosbARVrkxJ7j3MWp0OGgWVxXgRO2EQIksbgGSIjI8PqFjBqht2WT6MhVZPCc9XHUlP2CiAR5+QY8JgTIztbEDuhpgr0cRAtiHwJEAxDR9tJR/j/v4X/Pau2ZdR0C0yW77lVgD75spLL0khAnU7q+qgiF0hyQ7gRRVy0tElT0HBenVbzjzHowdJX8lSPjRg=
1.4.2 (2018-01-05)
2.2.2 (2018-08-16)
------------------
* Bugfix for WebSocket protocol when X-Forwarded-For is turned on.
* X-Forwarded-Proto support is now present and enabled if you turn on the
--proxy-headers flag
* ASGI applications are no longer instantiated in a thread (the ASGI spec
was finalised to say all constructors must be non-blocking on the main thread)
1.4.1 (2018-01-02)
2.2.1 (2018-07-22)
------------------
* Python 3.7 compatibility is flagged and ensured by using Twisted 18.7 and
above as a dependency.
* The send() awaitable in applications no longer blocks if the connection is
closed.
* Fixed a race condition where applications would be cleaned up before they
had even started.
2.2.0 (2018-06-13)
------------------
* HTTP timeouts have been removed by default, as they were only needed
with ASGI/Channels 1. You can re-enable them with the --http-timeout
argument to Daphne.
* Occasional errors on application timeout for non-fully-opened sockets
and for trying to read closed requests under high load are fixed.
* X-Forwarded-For headers are now correctly decoded in all environments
and no longer have unicode matching issues.
2.1.2 (2018-05-24)
------------------
* Fixed spurious errors caused by websockets disconnecting before their
application was instantiated.
* Stronger checking for type-safety of headers as bytestrings
2.1.1 (2018-04-18)
------------------
* ASGI application constructors are now run in a threadpool as they might
contain blocking synchronous code.
2.1.0 (2018-03-05)
------------------
* Removed subprotocol support from server, as it never really worked. Subprotocols
can instead be negotiated by ASGI applications now.
* Non-ASCII query strings now raise a 400 Bad Request error rather than silently
breaking the logger
2.0.4 (2018-02-21)
------------------
* Ping timeouts no longer reset on outgoing data, only incoming data
* No more errors when connections close prematurely
2.0.3 (2018-02-07)
------------------
* Unix socket listening no longer errors during startup (introduced in 2.0.2)
* ASGI Applications are now not immediately killed on disconnection but instead
given --application-close-timeout seconds to exit (defaults to 10)
2.0.2 (2018-02-04)
------------------
* Bugfix for a bad merge of HTTPFactory for X-Forwarded-Proto causing Daphne
to not start.
* WebSockets are no longer closed after the duration of http_timeout
1.4.0 (2018-01-02)
2.0.1 (2018-02-03)
------------------
* The X-Forwarded-Proto header can now be used to pass along protocol from
a reverse proxy.
* Updated logging to correctly route exceptions through the main Daphne logger
2.0.0 (2018-02-01)
------------------
* WebSocket headers are now correctly always passed as bytestrings.
* Major rewrite to the new async-based ASGI specification and to support
Channels 2. Not backwards compatible.
1.3.0 (2017-06-16)
......
@@ -13,4 +13,4 @@ endif
git tag $(version)
git push
git push --tags
python setup.py sdist bdist_wheel upload
#python setup.py sdist bdist_wheel upload
@@ -8,32 +8,36 @@ daphne
:target: https://pypi.python.org/pypi/daphne
Daphne is a HTTP, HTTP2 and WebSocket protocol server for
`ASGI <https://channels.readthedocs.io/en/latest/asgi.html>`_, and developed
to power Django Channels.
`ASGI <https://github.com/django/asgiref/blob/master/specs/asgi.rst>`_ and
`ASGI-HTTP <https://github.com/django/asgiref/blob/master/specs/www.rst>`_,
developed to power Django Channels.
It supports automatic negotiation of protocols; there's no need for URL
prefixing to determine WebSocket endpoints versus HTTP endpoints.
*Note:* Daphne 2 is not compatible with Channels 1.x applications, only with
Channels 2.x and other ASGI applications. Install a 1.x version of Daphne
for Channels 1.x support.
Running
-------
Simply point Daphne to your ASGI channel layer instance, and optionally
Simply point Daphne to your ASGI application, and optionally
set a bind address and port (defaults to localhost, port 8000)::
daphne -b 0.0.0.0 -p 8001 django_project.asgi:channel_layer
daphne -b 0.0.0.0 -p 8001 django_project.asgi:application
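Here ``django_project.asgi:application`` is just the Python path to a module
attribute holding the ASGI application. As a rough sketch (module and settings
names are illustrative), a Channels 2 project's ``asgi.py`` typically looks
something like this::

    import os

    import django

    from channels.routing import get_default_application

    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "django_project.settings")
    django.setup()
    application = get_default_application()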
If you intend to run daphne behind a proxy server you can use UNIX
sockets to communicate between the two::
daphne -u /tmp/daphne.sock django_project.asgi:channel_layer
daphne -u /tmp/daphne.sock django_project.asgi:application
If daphne is being run inside a process manager such as
`Circus <https://github.com/circus-tent/circus/>`_ you might
If daphne is being run inside a process manager, you might
want it to bind to a file descriptor passed down from a parent process.
To achieve this you can use the --fd flag::
daphne --fd 5 django_project.asgi:channel_layer
daphne --fd 5 django_project.asgi:application
If you want more control over the port/socket bindings you can fall back to
using `twisted's endpoint description strings
@@ -42,33 +46,33 @@ by using the `--endpoint (-e)` flag, which can be used multiple times.
This line would start an SSL server on port 443, assuming that `key.pem` and `crt.pem`
exist in the current directory (requires pyopenssl to be installed)::
daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:channel_layer
daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:application
Endpoints even let you use the ``txacme`` endpoint syntax to get automatic certificates
from Let's Encrypt, which you can read more about at http://txacme.readthedocs.io/en/stable/.
To see all available command line options run daphne with the *-h* flag.
To see all available command line options run daphne with the ``-h`` flag.
HTTP/2 Support
--------------
Daphne 1.1 and above supports terminating HTTP/2 connections natively. You'll
Daphne supports terminating HTTP/2 connections natively. You'll
need to do a couple of things to get it working, though. First, you need to
make sure you install the Twisted ``http2`` and ``tls`` extras::
pip install -U 'Twisted[tls,http2]'
pip install -U Twisted[tls,http2]
Next, because all current browsers only support HTTP/2 when using TLS, you will
need to start Daphne with TLS turned on, which can be done using the Twisted endpoint syntax::
daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:channel_layer
daphne -e ssl:443:privateKey=key.pem:certKey=crt.pem django_project.asgi:application
Alternatively, you can use the ``txacme`` endpoint syntax or anything else that
enables TLS under the hood.
You will also need to be on a system that has **OpenSSL 1.0.2 or greater**; if you are
using Ubuntu, this means you need at least 16.04.
using Ubuntu, this means you need at least Ubuntu 16.04.
Now, when you start up Daphne, it should tell you this in the log::
@@ -105,13 +109,13 @@ WSGI ``SCRIPT_NAME`` setting, you have two options:
The header takes precedence if both are set. As with ``SCRIPT_ALIAS``, the value
should start with a slash, but not end with one; for example::
daphne --root-path=/forum django_project.asgi:channel_layer
daphne --root-path=/forum django_project.asgi:application
Dependencies
------------
All Channels projects currently support Python 2.7, 3.4 and 3.5. `daphne` requires Twisted 17.1 or
greater.
Python Support
--------------
Daphne requires Python 3.5 or later.
Contributing
@@ -119,7 +123,12 @@ Contributing
Please refer to the
`main Channels contributing docs <https://github.com/django/channels/blob/master/CONTRIBUTING.rst>`_.
That also contains advice on how to set up the development environment and run the tests.
To run tests, make sure you have installed the ``tests`` extra with the package::
cd daphne/
pip install -e .[tests]
pytest
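pytest's usual test selection options also work here if you only want to run
part of the suite; for example (the keyword expression is purely illustrative)::

    pytest -k "websocket"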
Maintenance and Security
......
__version__ = "1.4.2"
__version__ = "2.2.2"
@@ -17,34 +17,34 @@ class AccessLogGenerator(object):
# HTTP requests
if protocol == "http" and action == "complete":
self.write_entry(
host=details['client'],
host=details["client"],
date=datetime.datetime.now(),
request="%(method)s %(path)s" % details,
status=details['status'],
length=details['size'],
status=details["status"],
length=details["size"],
)
# Websocket requests
elif protocol == "websocket" and action == "connecting":
self.write_entry(
host=details['client'],
host=details["client"],
date=datetime.datetime.now(),
request="WSCONNECTING %(path)s" % details,
)
elif protocol == "websocket" and action == "rejected":
self.write_entry(
host=details['client'],
host=details["client"],
date=datetime.datetime.now(),
request="WSREJECT %(path)s" % details,
)
elif protocol == "websocket" and action == "connected":
self.write_entry(
host=details['client'],
host=details["client"],
date=datetime.datetime.now(),
request="WSCONNECT %(path)s" % details,
)
elif protocol == "websocket" and action == "disconnected":
self.write_entry(
host=details['client'],
host=details["client"],
date=datetime.datetime.now(),
request="WSDISCONNECT %(path)s" % details,
)
......
def build_endpoint_description_strings(
host=None,
port=None,
unix_socket=None,
file_descriptor=None
):
"""
Build a list of twisted endpoint description strings that the server will listen on.
This is to streamline the generation of twisted endpoint description strings from easier
to use command line args such as host, port, unix sockets etc.
"""
socket_descriptions = []
if host and port is not None:
host = host.strip("[]").replace(":", r"\:")
socket_descriptions.append("tcp:port=%d:interface=%s" % (int(port), host))
elif any([host, port]):
raise ValueError("TCP binding requires both port and host kwargs.")
if unix_socket:
socket_descriptions.append("unix:%s" % unix_socket)
if file_descriptor is not None:
socket_descriptions.append("fd:fileno=%d" % int(file_descriptor))
return socket_descriptions
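# A quick illustration of the mapping this helper performs (examples only, not
# part of the module API):
#
#   build_endpoint_description_strings(host="127.0.0.1", port=8000)
#   -> ["tcp:port=8000:interface=127.0.0.1"]
#
#   build_endpoint_description_strings(unix_socket="/tmp/daphne.sock")
#   -> ["unix:/tmp/daphne.sock"]
#
#   build_endpoint_description_strings(file_descriptor=5)
#   -> ["fd:fileno=5"]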
import logging
import multiprocessing
import os
import pickle
import tempfile
import traceback
from concurrent.futures import CancelledError
from twisted.internet import reactor
from .endpoints import build_endpoint_description_strings
from .server import Server
class DaphneTestingInstance:
"""
Launches an instance of Daphne in a subprocess, with a host and port
attribute allowing you to call it.
Works as a context manager.
"""
startup_timeout = 2
def __init__(self, xff=False, http_timeout=None):
self.xff = xff
self.http_timeout = http_timeout
self.host = "127.0.0.1"
def __enter__(self):
# Clear result storage
TestApplication.delete_setup()
TestApplication.delete_result()
# Optional Daphne features
kwargs = {}
# Optionally enable X-Forwarded-For support.
if self.xff:
kwargs["proxy_forwarded_address_header"] = "X-Forwarded-For"
kwargs["proxy_forwarded_port_header"] = "X-Forwarded-Port"
kwargs["proxy_forwarded_proto_header"] = "X-Forwarded-Proto"
if self.http_timeout:
kwargs["http_timeout"] = self.http_timeout
# Start up process
self.process = DaphneProcess(
host=self.host,
application=TestApplication,
kwargs=kwargs,
setup=self.process_setup,
teardown=self.process_teardown,
)
self.process.start()
# Wait for the port
if self.process.ready.wait(self.startup_timeout):
self.port = self.process.port.value
return self
else:
if self.process.errors.empty():
raise RuntimeError("Daphne did not start up, no error caught")
else:
error, traceback = self.process.errors.get(False)
raise RuntimeError("Daphne did not start up:\n%s" % traceback)
def __exit__(self, exc_type, exc_value, traceback):
# Shut down the process
self.process.terminate()
del self.process
def process_setup(self):
"""
Called by the process just before it starts serving.
"""
pass
def process_teardown(self):
"""
Called by the process just after it stops serving
"""
pass
def get_received(self):
"""
Returns the scope and messages the test application has received
so far. Note you'll get all messages since scope start, not just any
new ones since the last call.
Also checks for any exceptions in the application. If there are,
raises them.
"""
try:
inner_result = TestApplication.load_result()
except FileNotFoundError:
raise ValueError("No results available yet.")
# Check for exception
if "exception" in inner_result:
raise inner_result["exception"]
return inner_result["scope"], inner_result["messages"]
def add_send_messages(self, messages):
"""
Adds messages for the application to send back.
The next time it receives an incoming message, it will reply with these.
"""
TestApplication.save_setup(
response_messages=messages,
)
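# Rough usage sketch for DaphneTestingInstance (illustrative, not part of the
# module): start a throwaway server, queue up ASGI response messages, drive it
# with any HTTP client pointed at (instance.host, instance.port), then inspect
# what the test application received.
#
#   with DaphneTestingInstance() as instance:
#       instance.add_send_messages([
#           {"type": "http.response.start", "status": 200, "headers": []},
#           {"type": "http.response.body", "body": b"hi"},
#       ])
#       ...  # make an HTTP request against instance.host / instance.port
#       scope, messages = instance.get_received()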
class DaphneProcess(multiprocessing.Process):
"""
Process subclass that launches and runs a Daphne instance, communicating the
port it ends up listening on back to the parent process.
"""
def __init__(self, host, application, kwargs=None, setup=None, teardown=None):
super().__init__()
self.host = host
self.application = application
self.kwargs = kwargs or {}
self.setup = setup or (lambda: None)
self.teardown = teardown or (lambda: None)
self.port = multiprocessing.Value("i")
self.ready = multiprocessing.Event()
self.errors = multiprocessing.Queue()
def run(self):
try:
# Create the server class
endpoints = build_endpoint_description_strings(host=self.host, port=0)
self.server = Server(
application=self.application,
endpoints=endpoints,
signal_handlers=False,
**self.kwargs
)
# Set up a poller to look for the port
reactor.callLater(0.1, self.resolve_port)
# Run with setup/teardown
self.setup()
try:
self.server.run()
finally:
self.teardown()
except Exception as e:
# Put the error on our queue so the parent gets it
self.errors.put((e, traceback.format_exc()))
def resolve_port(self):
if self.server.listening_addresses:
self.port.value = self.server.listening_addresses[0][1]
self.ready.set()
else:
reactor.callLater(0.1, self.resolve_port)
class TestApplication:
"""
An application that receives one or more messages, sends a response,
and then quits the server. For testing.
"""
setup_storage = os.path.join(tempfile.gettempdir(), "setup.testio")
result_storage = os.path.join(tempfile.gettempdir(), "result.testio")
def __init__(self, scope):
self.scope = scope
self.messages = []
async def __call__(self, send, receive):
# Receive input and send output
logging.debug("test app coroutine alive")
try:
while True:
# Receive a message and save it into the result store
self.messages.append(await receive())
logging.debug("test app received %r", self.messages[-1])
self.save_result(self.scope, self.messages)
# See if there are any messages to send back
setup = self.load_setup()
self.delete_setup()
for message in setup["response_messages"]:
await send(message)
logging.debug("test app sent %r", message)
except Exception as e:
if isinstance(e, CancelledError):
# Don't catch task-cancelled errors!
raise
else:
self.save_exception(e)
@classmethod
def save_setup(cls, response_messages):
"""
Stores setup information.
"""
with open(cls.setup_storage, "wb") as fh:
pickle.dump(
{
"response_messages": response_messages,
},
fh,
)
@classmethod
def load_setup(cls):
"""
Returns setup details.
"""
try:
with open(cls.setup_storage, "rb") as fh:
return pickle.load(fh)
except FileNotFoundError:
return {"response_messages": []}
@classmethod
def save_result(cls, scope, messages):
"""
Saves details of what happened to the result storage.
We could use pickle here, but that seems wrong, still, somehow.
"""
with open(cls.result_storage, "wb") as fh:
pickle.dump(
{
"scope": scope,
"messages": messages,
},
fh,
)
@classmethod
def save_exception(cls, exception):
"""
Saves details of what happened to the result storage.
We could use pickle here, but that seems wrong, still, somehow.
"""
with open(cls.result_storage, "wb") as fh:
pickle.dump(
{
"exception": exception,
},
fh,
)
@classmethod
def load_result(cls):
"""
Returns result details.
"""
with open(cls.result_storage, "rb") as fh:
return pickle.load(fh)
@classmethod
def delete_setup(cls):
"""
Clears setup storage files.
"""
try:
os.unlink(cls.setup_storage)
except OSError:
pass
@classmethod
def delete_result(cls):
"""
Clears result storage files.
"""
try:
os.unlink(cls.result_storage)
except OSError:
pass
from hypothesis import HealthCheck, settings
settings.register_profile(
'daphne',
settings(suppress_health_check=[HealthCheck.too_slow]),
)
settings.load_profile('daphne')
# coding=utf-8
channel_layer = {}
from __future__ import unicode_literals
import six
from six.moves.urllib import parse
from asgiref.inmemory import ChannelLayer
from twisted.test import proto_helpers
from daphne.http_protocol import HTTPFactory
def message_for_request(method, path, params=None, headers=None, body=None):
"""
Constructs a HTTP request according to the given parameters, runs
that through daphne and returns the emitted channel message.
"""
request = _build_request(method, path, params, headers, body)
message, factory, transport = _run_through_daphne(request, 'http.request')
return message
def response_for_message(message):
"""
Returns the raw HTTP response that Daphne constructs when sending a reply
to a HTTP request.
The current approach actually first builds a HTTP request (similar to
message_for_request) because we need a valid reply channel. I'm sure
this can be streamlined, but it works for now.
"""
request = _build_request('GET', '/')
request_message, factory, transport = _run_through_daphne(request, 'http.request')
factory.dispatch_reply(request_message['reply_channel'], message)
return transport.value()
def _build_request(method, path, params=None, headers=None, body=None):
"""
Takes request parameters and returns a byte string of a valid HTTP/1.1 request.
We really shouldn't manually build a HTTP request, and instead try to capture
what e.g. urllib or requests would do. But that is non-trivial, so meanwhile
we hope that our request building doesn't mask any errors.
This code is messy, because urllib behaves rather differently between Python 2
and 3. Readability is further obstructed by the fact that Python 3.4 doesn't
support % formatting for bytes, so we need to concat everything.
If we run into more issues with this, the python-future library has a backport
of Python 3's urllib.
:param method: ASCII string of HTTP method.
:param path: unicode string of URL path.
:param params: List of two-tuples of bytestrings, ready for consumption for
urlencode. Encode to utf8 if necessary.
:param headers: List of two-tuples of ASCII strings (HTTP header name, value).
:param body: ASCII string of request body.
ASCII string is short for a unicode string containing only ASCII characters,
or a byte string with ASCII encoding.
"""
if headers is None:
headers = []
else:
headers = headers[:]
if six.PY3:
quoted_path = parse.quote(path)
if params:
quoted_path += '?' + parse.urlencode(params)
quoted_path = quoted_path.encode('ascii')
else:
quoted_path = parse.quote(path.encode('utf8'))
if params:
quoted_path += b'?' + parse.urlencode(params)
request = method.encode('ascii') + b' ' + quoted_path + b" HTTP/1.1\r\n"
for name, value in headers:
request += header_line(name, value)
request += b'\r\n'
if body:
request += body.encode('ascii')
return request
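# For example, under Python 3 (illustrative only):
#
#   _build_request("GET", "/test/", params=[(b"a", b"b")])
#   -> b"GET /test/?a=b HTTP/1.1\r\n\r\n"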
def build_websocket_upgrade(path, params, headers):
ws_headers = [
('Host', 'somewhere.com'),
('Upgrade', 'websocket'),
('Connection', 'Upgrade'),
('Sec-WebSocket-Key', 'x3JJHMbDL1EzLkh9GBhXDw=='),