diff --git a/content/django-server-sent-events.md b/content/django-server-sent-events.md
index 93f869b..616bd63 100644
--- a/content/django-server-sent-events.md
+++ b/content/django-server-sent-events.md
@@ -19,61 +19,86 @@ And the documentation has been expanded with the following [1]:
> When serving under ASGI, however, a [`StreamingHttpResponse`](https://docs.djangoproject.com/en/4.2/ref/request-response/#django.http.StreamingHttpResponse "django.http.StreamingHttpResponse") need not stop other requests from being served whilst waiting for I/O. This opens up the possibility of long-lived requests for streaming content and implementing patterns such as long-polling, and server-sent events.
Being a sucker for simplicity, I got quite intrigued by the possibility of serving
-server-sent events (also known as SSE) directly from Django, with no need for
-additional infrastructure like Redis.
+server-sent events (also known as SSE) from Django in an asynchronous manner.
-## What are server-sent events and why do we want to use them?
+So I set out to write a small, drumroll please, chat application!
+
+The code for the chat application can be found at
+[github.com/valberg/django-sse](https://github.com/valberg/django-sse).
+
+### What are server-sent events and why do we want to use them?
Server-sent events are "old tech", in that they have been supported in major
browsers since around 2010-2011 [2]. The idea is that the client "subscribes" to
-a HTTP endpoint, and the server can then issue data to the client as long as
+an HTTP endpoint, and the server can then issue data to the client as long as
the connection is open. This is a great performance boost compared to other
techniques such as polling the server.
_But wait, isn't websockets "shinier"?_
It depends. In many situations when developing web applications, we
-just want a way to push data to the client, and here a bi-directional
-connection like websockets feel like an overkill. Also I would argue that using
+just want a way to push data to the client, and here a bidirectional
+connection like websockets feels like overkill. Also, I would argue that using
POST/PUT requests from the client and SSE to the client might be "just enough"
compared to websockets.
-## A simple implementation
+SSE also has the added benefit of having a built-in reconnection mechanism,
+which is something we would have to implement ourselves with websockets.
-So lets get to some code! The following is something along the lines of my
-initial attempt. First we have to define the view, which in fact will not
-change for the remainder of this blog post. The juicy bits are in the next
-part.
+All in all, SSE is a much simpler solution than websockets, and in many (most?)
+cases that is all we need.
+
+### A simple implementation
+
+So let's get to some code!
+
+First we need our model for storing the chat messages:
:::python
- async def stream_foo_view(request: HttpRequest) -> StreamingHttpResponse:
+ class ChatMessage(models.Model):
+ user = models.CharField(max_length=255)
+ text = models.CharField(max_length=255)
+
+With the model defined we can write our view to stream the messages.
+
+The following is something along the lines of my initial attempt. First we have
+to define the view, which in fact will not change for the remainder of this
+blog post. The juicy bits are in the `stream_messages()` function. Note that
+the view is an async view, denoted by the `async` keyword.
+
+ :::python
+ async def stream_messages_view(request: HttpRequest) -> StreamingHttpResponse:
return StreamingHttpResponse(
- streaming_content=stream_foos(),
+ streaming_content=stream_messages(),
content_type="text/event-stream",
)
We tell the `StreamingHttpResponse` class to get its streaming content from the
-`stream_foos` function. I implemented this as follows initially:
+`stream_messages` function. I implemented this as follows initially:
:::python
- async def stream_foos() -> AsyncGenerator[str, None]:
- latest_foo = None
+ async def stream_messages() -> AsyncGenerator[str, None]:
+ latest_message = None
while True:
- current_foo = await Foo.objects.order_by("-id").afirst()
+ current_message = await ChatMessage.objects.order_by("-id").afirst()
# If we have a new message, yield it
- if latest_foo != current_foo:
- yield "data: {current_foo.text}\n\n"
- latest_foo = current_foo
+ if latest_message != current_message:
+            yield f"data: {current_message.text}\n\n"
+ latest_message = current_message
await asyncio.sleep(5)
So we've gotten rid of the HTTP overhead of polling by not having to do a
request from the client every 5 seconds. But we are still doing a query to the
-database every 5 seconds, and that for each client.
+database every 5 seconds, and we do so for each connected client. This is not ideal,
+and it is something we could just as well have done with a synchronous view.
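+
+For comparison, here is roughly what that synchronous version might look like
+(a sketch only; under WSGI every open connection would tie up a worker thread
+for its whole lifetime):
+
+    :::python
+    import time
+    from collections.abc import Generator
+
+
+    def stream_messages_sync() -> Generator[str, None, None]:
+        latest_message = None
+        while True:
+            current_message = ChatMessage.objects.order_by("-id").first()
+            if latest_message != current_message:
+                yield f"data: {current_message.text}\n\n"
+                latest_message = current_message
+            time.sleep(5)
+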
-### Aside: Use an ASGI server for development
+Let's see if we can do better. But first we'll have to talk about how to run
+this code.
+
+#### Aside: Use an ASGI server for development
One thing that took me some time to realise is that the Django runserver is not
capable of running async views returning `StreamingHttpResponse`.
@@ -88,11 +113,14 @@ Running the above view with the runserver results in the following error:
So I had to resort to installing uvicorn and running my project like so:
:::bash
- $ uvicorn --log-level debug --reload project.asgi:application`
+    $ uvicorn --log-level debug --reload --timeout-graceful-shutdown 0 project.asgi:application
-The `--reload` part is particulary important when doing development.
+The `--reload` part is particularly important when doing development, but it
+does not work well when working with open connections since the server will
+wait for the connection to close before reloading. This is why
+`--timeout-graceful-shutdown 0` is needed.
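+
+For reference, `project.asgi:application` points at the stock ASGI module that
+`django-admin startproject` generates, which looks roughly like this:
+
+    :::python
+    import os
+
+    from django.core.asgi import get_asgi_application
+
+    os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")
+
+    application = get_asgi_application()
+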
-## More old tech to the rescue: PostgreSQL LISTEN/NOTIFY
+### More old tech to the rescue: PostgreSQL LISTEN/NOTIFY
This is where we could reach for more infrastructure which could help us give
the database a break. This could be listening for data in Redis (this is what
@@ -107,7 +135,7 @@ LISTEN to a channel and then anyone can NOTIFY on that same channel.
This seems like something we can use - but psycopg2 isn't async, so I'm not
even sure if `sync_to_async` would help us here.
-## Enter psycopg 3
+### Enter psycopg 3
I had put the whole thing on ice until I realized that another big thing (maybe
a bit bigger than StreamingHttpResponse) in Django 4.2 is the support for
@@ -126,7 +154,8 @@ So I went for a stroll in the psycopg 3 documentation and found this gold[3]:
gen.close()
print("there, I stopped")
-This does almost what we want! It just isn't async and isn't getting connection info from Django.
+This does almost what we want! It just isn't async and isn't getting connection
+info from Django.
So by combining the snippet from the psycopg 3 documentation and my previous
`stream_messages` I came up with this:
@@ -155,19 +184,7 @@ I was almost about to give up again, since this approach didn't work initially.
All because I for some reason had removed the `autocommit=True` in my attempts
to async-ify the snippet from the psycopg 3 documentation.
-### Issuing the NOTIFY
-
-- using a signal handler
-- setting up triggers manually - django-pgtrigger is psycopg2 only
-
-### Frontend stuff
-
-- Simple `EventSource` example
-- Use HTMX
-
-
-
-### Difference between 4.2 and 4.2.1
+#### Aside: Difference between 4.2 and 4.2.1
The code worked initially in 4.2, but 4.2.1 fixed a regression regarding
setting a custom cursor in the database configuration.
@@ -260,6 +277,161 @@ So that now looks like so:
async for notify in gen:
yield f"data: {notify.payload}\n\n"
+### Test the endpoint with curl
+
+So now we've got the `LISTEN` part in place.
+
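+The curl command below assumes the streaming view is routed at `/messages/`,
+for instance with a URLconf along these lines (the module path is made up):
+
+    :::python
+    from django.urls import path
+
+    from chat.views import stream_messages_view  # hypothetical module path
+
+    urlpatterns = [
+        path("messages/", stream_messages_view),
+    ]
+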
+If we connect to the endpoint using curl (`-N` disables buffering and is a way to consume streaming content with curl):
+
+ :::console
+ $ curl -N http://localhost:8000/messages/
+
+And connect to our database and run:
+
+ :::sql
+ NOTIFY new_message, 'Hello, world!';
+
+
+We, excitingly, get the following result:
+
+ :::text
+ data: Hello, world!
+
+Amazing!
+
+### Issuing the NOTIFY
+
+But we want the `NOTIFY` command to be issued when a new chat message is submitted.
+
+For this we'll have a small utility function which does the heavy lifting. Note
+that this is just a very simple synchronous function since everything is just
+happening within a single request-response cycle.
+
+    :::python
+    import json
+
+    from django.db import connection
+
+
+    def notify(*, channel: str, event: str, payload: str) -> None:
+        # Wrap the payload so listeners also know which event happened.
+        wrapped = json.dumps({
+            "event": event,
+            "data": payload,
+        })
+        with connection.cursor() as cursor:
+            # NOTIFY does not accept bind parameters, so the values are
+            # interpolated directly -- acceptable for this proof of concept.
+            cursor.execute(
+                f"NOTIFY {channel}, '{wrapped}'",
+            )
+
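+The helper can be exercised by hand from `python manage.py shell` (the import
+path below is made up; adjust it to wherever the function actually lives):
+
+    :::python
+    # Hypothetical location of the notify() helper.
+    from chat.utils import notify
+
+    notify(
+        channel="lobby",
+        event="message_created",
+        payload='{"text": "Hello from the shell", "user": "val"}',
+    )
+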
+And then we can use this in our view (I'm using `@csrf_exempt` here since this is just a quick proof of concept):
+
+ :::python
+ @csrf_exempt
+ @require_POST
+ def post_message_view(request: HttpRequest) -> HttpResponse:
+        text = request.POST.get("message")
+        user = request.POST.get("user")
+        message = ChatMessage.objects.create(user=user, text=text)
+        notify(
+            channel="lobby",
+            event="message_created",
+            payload=json.dumps({
+                "text": message.text,
+                "user": message.user,
+            }),
+        )
+ return HttpResponse("OK")
+
+The keen observer will notice that we are storing the payload content as a JSON string within a JSON string.
+
+This is because we have two recipients of the payload. The first is the `stream_messages` function, which is going to
+forward the payload to the client together with an `event`, and the second is the browser, which is going to parse the
+payload and use the `event` to determine what to do with it.
+
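+Concretely, the nesting looks like this (the values are just examples):
+
+    :::python
+    import json
+
+    inner = json.dumps({"text": "Hello, world!", "user": "val"})     # parsed by the browser
+    outer = json.dumps({"event": "message_created", "data": inner})  # carried by NOTIFY
+    # stream_messages() unwraps the outer layer and forwards the inner
+    # string untouched as the SSE "data:" field.
+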
+For this we'll have to update our `stream_messages` function as follows:
+
+ :::python
+ async def stream_messages() -> AsyncGenerator[str, None]:
+ connection_params = connection.get_connection_params()
+
+ # Remove the cursor_factory parameter since I can't get
+ # the default from Django 4.2.1 to work.
+ # Django 4.2 didn't have the parameter and that worked.
+ connection_params.pop('cursor_factory')
+
+ aconnection = await psycopg.AsyncConnection.connect(
+ **connection_params,
+ autocommit=True,
+ )
+ channel_name = "lobby"
+ async with aconnection.cursor() as acursor:
+ await acursor.execute(f"LISTEN {channel_name}")
+ gen = aconnection.notifies()
+ async for notify in gen:
+ payload = json.loads(notify.payload)
+ event = payload.pop("event")
+ data = payload.pop("data")
+ yield f"event: {event}\ndata: {data}\n\n"
+
+Everything is the same except that we now parse the payload from the `NOTIFY` command and construct the SSE payload with
+an `event` and a `data` field. This will come in handy when dealing with the frontend.
+
+Another way to do this would be to use Django's
+[signals](https://docs.djangoproject.com/en/4.2/topics/signals/) or even
+writing a PostgreSQL
+[trigger](https://www.postgresql.org/docs/15/plpgsql-trigger.html) which issues
+the `NOTIFY` command.
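+
+A signal-based variant could look roughly like this (a sketch only, reusing the
+`notify()` helper from above; the receiver still needs to be imported at
+startup, for example from the app's `AppConfig.ready()`):
+
+    :::python
+    from django.db.models.signals import post_save
+    from django.dispatch import receiver
+
+
+    @receiver(post_save, sender=ChatMessage)
+    def announce_new_message(sender, instance, created, **kwargs):
+        # Only announce newly created messages, not updates.
+        if created:
+            notify(
+                channel="lobby",
+                event="message_created",
+                payload=json.dumps({"text": instance.text, "user": instance.user}),
+            )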
+
+### Frontend stuff
+
+Now that we've got the backend in place, we can get something up and running on
+the frontend.
+
+We could use HTMX's [SSE
+extension](https://htmx.org/extensions/server-sent-events/) but for this
+example we'll just use the
+[EventSource](https://developer.mozilla.org/en-US/docs/Web/API/EventSource) API
+directly.
+
+ :::html
+
+