This commit is contained in:
Víðir Valberg Guðmundsson 2023-05-17 14:48:39 +02:00
parent 783c4d9203
commit c8259c3743
3 changed files with 214 additions and 46 deletions


And the documentation has been expanded with the following [1]:
> When serving under ASGI, however, a [`StreamingHttpResponse`](https://docs.djangoproject.com/en/4.2/ref/request-response/#django.http.StreamingHttpResponse "django.http.StreamingHttpResponse") need not stop other requests from being served whilst waiting for I/O. This opens up the possibility of long-lived requests for streaming content and implementing patterns such as long-polling, and server-sent events.
Being a sucker for simplicity I got quite intrigued by the possibility to serve
server-sent events (also known as SSE) from Django in an asynchronous manner.

So I set out to write a small, drumroll please, chat application!
The code for the chat application can be found at
[github.com/valberg/django-sse](https://github.com/valberg/django-sse).
### What are server-sent events and why do we want to use them?
Server-sent events are "old tech", as in they have been supported in major
browsers since around 2010-2011 [2]. The idea is that the client "subscribes" to
an HTTP endpoint, and the server can then issue data to the client as long as
the connection is open. This is a great performance boost compared to other
techniques such as polling the server.
_But wait, isn't websockets "shinier"?_
It depends. In many situations when it comes to developing web applications, we
just want a way to push data to the client, and here a bidirectional
connection like websockets feels like overkill. Also, I would argue that using
POST/PUT requests from the client and SSE to the client might be "just enough"
compared to websockets.
SSE also has the added benefit of having a built-in reconnection mechanism,
which is something we would have to implement ourselves with websockets.
All in all SSE is a much simpler solution than websockets, and in many (most?)
cases that is all we need.

### A simple implementation
So let's get to some code!
First we need our model for storing the chat messages:
:::python
class ChatMessage(models.Model):
    user = models.CharField(max_length=255)
    text = models.CharField(max_length=255)
With the model defined we can write our view to stream the messages.
The following is something along the lines of my initial attempt. First we have
to define the view, which in fact will not change for the remainder of this
blog post. The juicy bits are in the `stream_messages()` function. Note that
the view is an async view, denoted by the `async` keyword.
:::python
async def stream_messages_view(request: HttpRequest) -> StreamingHttpResponse:
    return StreamingHttpResponse(
        streaming_content=stream_messages(),
        content_type="text/event-stream",
    )
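For completeness, the view needs a route. A minimal `urls.py` might look like the following (a sketch of my own; the app and module names are assumptions, though the curl example later in this post does use the `/messages/` path):

```python
# urls.py (hypothetical wiring; adjust module names to your project)
from django.urls import path

from chat.views import stream_messages_view

urlpatterns = [
    path("messages/", stream_messages_view),
]
```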
We tell the `StreamingHttpResponse` class to get its streaming content from the
`stream_messages` function. I implemented this as follows initially:
:::python
async def stream_messages() -> AsyncGenerator[str, None]:
    latest_message = None
    while True:
        current_message = await ChatMessage.objects.order_by("-id").afirst()
        # If we have a new message, yield it
        if latest_message != current_message:
            yield f"data: {current_message.text}\n\n"
            latest_message = current_message
        await asyncio.sleep(5)
So we've gotten rid of the HTTP overhead of polling by not having to do a
request from the client every 5 seconds. But we are still doing a query to the
database every 5 seconds, and that for each client. This is not ideal and is
probably something we could have done with a synchronous view.
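As an aside, the `"data: ...\n\n"` string above is the SSE wire format: one or more `data:` lines terminated by a blank line. A tiny helper (my own illustration, not part of the repo) makes the framing explicit:

```python
def sse_frame(data: str) -> str:
    """Frame a string as a server-sent event.

    Each line of the payload becomes its own "data:" line, and the
    event is terminated by a blank line, per the SSE format.
    """
    lines = "".join(f"data: {line}\n" for line in data.splitlines())
    return lines + "\n"
```

So `sse_frame("hello")` yields `"data: hello\n\n"`, and a multi-line payload becomes several `data:` lines in a single event.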
Let's see if we can do better. But first we'll have to talk about how to run
this code.
#### Aside: Use an ASGI server for development
One thing that took me some time to realise is that the Django runserver is not
capable of running async views returning `StreamingHttpResponse`.
Running the above view with the runserver results in the following error:
So I had to resort to installing uvicorn and running my project like so:
:::bash
$ uvicorn --log-level debug --reload --timeout-graceful-shutdown 0 project.asgi:application
The `--reload` part is particularly important when doing development, but it
does not work well when working with open connections since the server will
wait for the connection to close before reloading. This is why
`--timeout-graceful-shutdown 0` is needed.
### More old tech to the rescue: PostgreSQL LISTEN/NOTIFY
This is where we could reach for more infrastructure which could help us give
the database a break. This could be listening for data in Redis (this is what
LISTEN to a channel and then anyone can NOTIFY on that same channel.
This seems like something we can use - but psycopg2 isn't async, so I'm not
even sure if `sync_to_async` would help us here.
### Enter psycopg 3
I had put the whole thing on ice until I realized that another big thing (maybe
a bit bigger than StreamingHttpResponse) in Django 4.2 is the support for
So I went for a stroll in the psycopg 3 documentation and found this gold[3]:
gen.close()
print("there, I stopped")
This does almost what we want! It just isn't async and isn't getting connection
info from Django.
So by combining the snippet from the psycopg 3 documentation and my previous
`stream_messages` I came up with this:
I was almost about to give up again, since this approach didn't work initially.
All because I for some reason had removed the `autocommit=True` in my attempts
to async-ify the snippet from the psycopg 3 documentation.
#### Aside: Difference between 4.2 and 4.2.1
The code worked initially in 4.2, but 4.2.1 fixed a regression regarding
setting a custom cursor in the database configuration.
So that now looks like so:
async for notify in gen:
    yield f"data: {notify.payload}\n\n"
### Test the endpoint with curl
So now we've got the `LISTEN` part in place.
If we connect to the endpoint using curl (`-N` disables buffering and is a way to consume streaming content with curl):
:::console
$ curl -N http://localhost:8000/messages/
And connect to our database and run:
:::sql
NOTIFY new_message, 'Hello, world!';
We, excitingly, get the following result:
:::text
data: Hello, world!
Amazing!
### Issuing the NOTIFY
But we want the `NOTIFY` command to be issued when a new chat message is submitted.
For this we'll have a small utility function which does the heavy lifting. Note
that this is just a very simple synchronous function since everything is just
happening within a single request-response cycle.
:::python
import json

from django.db import connection

def notify(*, channel: str, event: str, payload: str) -> None:
    payload = json.dumps({
        "event": event,
        "data": payload,
    })
    with connection.cursor() as cursor:
        cursor.execute(
            f"NOTIFY {channel}, '{payload}'",
        )
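Interpolating the payload directly into the SQL string is fine for a proof of concept, but Postgres also exposes `NOTIFY` as the built-in function `pg_notify(channel, payload)`, whose arguments can be passed as ordinary query parameters. A sketch (the helper name is mine, not from the repo; the JSON keys mirror what `stream_messages` expects):

```python
import json

def pg_notify_query(channel: str, event: str, payload: str) -> tuple[str, list[str]]:
    """Build a parameterized pg_notify() call.

    pg_notify(channel, payload) is a built-in Postgres function, so both
    arguments can be bound as query parameters instead of being
    interpolated into the SQL string.
    """
    body = json.dumps({"event": event, "data": payload})
    return "SELECT pg_notify(%s, %s)", [channel, body]
```

The result would then be run as `cursor.execute(*pg_notify_query(...))`, which sidesteps any quoting issues with the payload.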
And then we can use this in our view (I'm using `@csrf_exempt` here since this is just a quick proof of concept):
:::python
@csrf_exempt
@require_POST
def post_message_view(request: HttpRequest) -> HttpResponse:
    message = request.POST.get("message")
    user = request.POST.get("user")
    message = ChatMessage.objects.create(user=user, text=message)
    notify(
        channel="lobby",
        event="message_created",
        payload=json.dumps({
            "text": message.text,
            "user": message.user,
        }),
    )
    return HttpResponse("OK")
The keen observer will notice that we are storing the payload content as a JSON string within a JSON string.
This is because we have two recipients of the payload. The first is the `stream_messages` function, which is going to
send the payload to the client with an `event`, and the second is the browser, which is going to parse the payload and use
the `event` to determine what to do with it.
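The double encoding is easier to see in isolation. Here is a round trip I've written purely for illustration (the key names follow the `stream_messages` side):

```python
import json

# Inner layer: what the browser will eventually JSON-parse from evt.data.
inner = json.dumps({"text": "Hello!", "user": "alice"})

# Outer layer: what travels through NOTIFY to stream_messages.
outer = json.dumps({"event": "message_created", "data": inner})

# stream_messages unwraps one layer to route on the event name...
decoded = json.loads(outer)
assert decoded["event"] == "message_created"

# ...and the browser parses the remaining layer.
assert json.loads(decoded["data"])["user"] == "alice"
```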
For this we'll have to update our `stream_messages` function as follows:
:::python
async def stream_messages() -> AsyncGenerator[str, None]:
    connection_params = connection.get_connection_params()

    # Remove the cursor_factory parameter since I can't get
    # the default from Django 4.2.1 to work.
    # Django 4.2 didn't have the parameter and that worked.
    connection_params.pop('cursor_factory')

    aconnection = await psycopg.AsyncConnection.connect(
        **connection_params,
        autocommit=True,
    )
    channel_name = "lobby"
    async with aconnection.cursor() as acursor:
        await acursor.execute(f"LISTEN {channel_name}")
        gen = aconnection.notifies()
        async for notify in gen:
            payload = json.loads(notify.payload)
            event = payload.pop("event")
            data = payload.pop("data")
            yield f"event: {event}\ndata: {data}\n\n"
Everything is the same except that we now parse the payload from the `NOTIFY` command and construct the SSE payload with
an `event` and a `data` field. This will come in handy when dealing with the frontend.
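The translation at the end of the generator can also be pulled out as a pure function (a hypothetical refactor, not how the repo structures it), which makes it easy to exercise without a database:

```python
import json

def notify_to_sse(raw: str) -> str:
    """Translate a NOTIFY payload into an SSE frame with event and data fields."""
    payload = json.loads(raw)
    return f"event: {payload['event']}\ndata: {payload['data']}\n\n"
```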
Another way to do this would be to use Django's
[signals](https://docs.djangoproject.com/en/4.2/topics/signals/) or even
writing a PostgreSQL
[trigger](https://www.postgresql.org/docs/15/plpgsql-trigger.html) which issues
the `NOTIFY` command.
### Frontend stuff
Now that we've got the backend in place, we can get something up and running on
the frontend.
We could use HTMX's [SSE
extension](https://htmx.org/extensions/server-sent-events/) but for this
example we'll just use the
[EventSource](https://developer.mozilla.org/en-US/docs/Web/API/EventSource) API
directly.
:::html
<template id="message">
<div style="border: 1px solid black; margin: 5px; padding: 5px;">
<strong class="user"></strong>: <span class="message"></span>
</div>
</template>
<div id="messages"></div>
<script>
const source = new EventSource("/messages/");
// Note that the event we gave our notify utility function is called "message_created"
// so that's what we listen for here.
source.addEventListener("message_created", function(evt) {
// Parse the payload
let payload = JSON.parse(evt.data);
// Get and clone our template
let template = document.getElementById('message');
let clone = template.content.cloneNode(true);
// Update our cloned template
clone.querySelector('.user').innerText = payload.user;
clone.querySelector('.message').innerText = payload.text;
// Append the cloned template to our list of messages
document.getElementById('messages').appendChild(clone);
});
</script>
And that's it! We can now open two browser windows and see the messages appear in real time.
Check out the repo for the full code where I've also added a simple form for submitting new messages.
### Conclusion
Django might not be the shiniest framework out there, but it is solid and boring - which is a good thing!
And with the continued work on async support, it is becoming a viable option for doing real time stuff, especially when paired with other solid and boring tech like PostgreSQL and SSE!
[0]: https://docs.djangoproject.com/en/4.2/releases/4.2/#requests-and-responses
[1]: https://docs.djangoproject.com/en/4.2/ref/request-response/#django.http.StreamingHttpResponse


<h2 class="entry-title"> <h2 class="entry-title">
<a href="{{ SITEURL }}/{{ article.url }}" rel="bookmark" <a href="{{ SITEURL }}/{{ article.url }}" rel="bookmark"
title="Permalink to {{ article.title|striptags }}">{{ article.title }}</a></h2> title="Permalink to {{ article.title|striptags }}">{{ article.title }}</a></h2>
</header>
<footer class="post-info">
<time class="published" datetime="{{ article.date.isoformat() }}">
{% endif %}
</footer><!-- /.post-info -->
<div class="entry-content"> <div class="entry-content">
{{ article.content }} {{ article.content }}
</div><!-- /.entry-content --> </div><!-- /.entry-content -->


</style>
</head>
<body id="index" class="home">
<header class="container d-flex justify-content-center"> <header class="container d-flex justify-content-center">
<h1> <h1>
{% block jumbotron %}
{% endblock %}
<div class="container h-100">
<div class="row"> <div class="row">
<div class="col-8 offset-2 pt-5 p-3"> <div class="col-8 offset-2 pt-5 p-3">