After seeing how Supabase Realtime allows you to listen to changes in your PostgreSQL database, I wanted to learn how this actually works.
The first step is to figure out how we can actually observe events happening inside our database from the outside without simply spamming SELECT queries.
PostgreSQL provides several features we can use instead: triggers, channels and notifications. Triggers allow us to react to an event on a specific database table, like an INSERT, UPDATE or DELETE. A trigger runs a Postgres function, which in turn can send a notification into a channel. These channels can then be subscribed to by other software - like an API server.
Setting up a Postgres database
First thing we need is a PostgreSQL database. I'm a fan of simply using Docker Compose to quickly define and run databases for my tinkering projects. With the simple docker-compose.yml file defined below we can quickly spin up a database running at localhost:5432 using docker compose up.
# ./docker-compose.yml
version: "3.8"
services:
  db:
    image: postgres:14-alpine
    restart: always
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=notify
    ports:
      - 5432:5432
    volumes:
      - db:/var/lib/postgresql/data
volumes:
  db:
    driver: local
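To spin it up in the background and check that the container is actually running:

docker compose up -d
docker compose ps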
Now that we have an empty database, let's set up a table - and of course we use the canonical todo list as an example. The SQL command below defines the table todo_item with an auto-incrementing id column as well as text, completed and createdAt columns.
CREATE TABLE todo_item (
  id SERIAL PRIMARY KEY,
  text VARCHAR NOT NULL,
  completed BOOLEAN NOT NULL DEFAULT FALSE,
  createdAt TIMESTAMP NOT NULL DEFAULT NOW()
);
You can run this using any software that can connect to your database and run commands - I frequently make use of the open-source Beekeeper Studio or JetBrains DataGrip. But you can just as well use the official Postgres interactive terminal psql.
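Since our database runs in Docker, one convenient way to get a psql session is through the container itself - using the service name db, user and database from our docker-compose.yml:

docker compose exec db psql -U postgres -d notify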
With our database set up, let's set up the triggers.
Event-driven Postgres
Postgres triggers can be defined to listen to INSERT, UPDATE and DELETE events on any table in our database. They can run either before or after the triggering event, and either once for each affected row or once for the entire statement that has been executed. This means that we can, for example, react individually to multiple inserts happening in one statement, but also to the collective insert event.
To define a trigger we need to provide it with a callback function that it executes when triggered. So we start by defining our function using the CREATE FUNCTION command. What we want our function to do is send a notification to a channel, which we can then listen to from the outside. For this Postgres provides the NOTIFY channel [ , payload ] command. The docs also mention the pg_notify(text, text) function, which takes the channel name as its first argument and a payload as its second. Either would be fine for our purpose, so I went with the latter and have the function PERFORM the call pg_notify('todo_item_created_event', 'Hello World!');. This sends a notification to the channel todo_item_created_event with the arbitrary payload Hello World!.
CREATE OR REPLACE FUNCTION notify_new_todo()
  RETURNS trigger AS
$psql$
BEGIN
  PERFORM pg_notify('todo_item_created_event', 'Hello World!');
  RETURN NULL;
END;
$psql$
LANGUAGE plpgsql;
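As an aside, the payload doesn't have to be a static string. Inside a row-level trigger function, plpgsql exposes the inserted row as NEW, so a variant of the function above could send the row itself as JSON (keeping in mind that notification payloads must stay under 8000 bytes by default):

CREATE OR REPLACE FUNCTION notify_new_todo()
  RETURNS trigger AS
$psql$
BEGIN
  -- NEW is the row that was just inserted; serialize it into the payload
  PERFORM pg_notify('todo_item_created_event', row_to_json(NEW)::text);
  RETURN NULL;
END;
$psql$
LANGUAGE plpgsql;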
Now that we have our callback function defined, we can define our trigger using the CREATE TRIGGER command. I defined it to run AFTER an INSERT statement on our todo_item table - and to EXECUTE our above defined PROCEDURE notify_new_todo() FOR EACH ROW that is being inserted.
CREATE TRIGGER insert_todo_item
  AFTER INSERT
  ON todo_item
  FOR EACH ROW
EXECUTE PROCEDURE notify_new_todo();
And that's about everything we have to do within PostgreSQL. Now we need to actually listen to the channel for events.
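Before wiring up an external listener, though, we can sanity-check the setup from within psql itself, since psql prints any notification it receives after each command completes (the exact output wording and PID will differ):

LISTEN todo_item_created_event;
INSERT INTO todo_item (text) VALUES ('Test notification');
-- psql should then print something like:
-- Asynchronous notification "todo_item_created_event" with payload "Hello World!"
-- received from server process with PID 42.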
Listening to Postgres notifications from JavaScript
So let's create a tiny Node project using yarn init -y. We need a way to interact with our PostgreSQL database from JavaScript. We can use node-postgres for this by installing it using yarn add pg. Afterwards, let's create a listener.js file and import pg. We need to provide it with a connection string to connect to our database. The username, password and database name we have to use are based on the environment variables POSTGRES_USER, POSTGRES_PASSWORD and POSTGRES_DB defined in our docker-compose.yml file. We then instantiate a client instance we can use to connect to the database and execute queries.
// ./listener.js
const pg = require("pg");
const connectionString = "postgres://postgres:postgres@localhost:5432/notify";
const client = new pg.Client(connectionString);
After connecting to our database using client.connect, we can use the LISTEN command provided by PostgreSQL to listen to any events in our channel todo_item_created_event. By calling client.query("LISTEN todo_item_created_event") we will get a notification every time the pg_notify function sends a notification to our channel - so every time we insert a new todo item.
client.connect((err) => {
  // the connect callback only receives an error;
  // the client itself is already in scope from above
  if (err) throw err;
  client.query("LISTEN todo_item_created_event");
  client.on("notification", () =>
    console.log("A new todo item has been inserted.")
  );
});
We can use client.on("notification", () => {}) to run a callback function in response to any notification coming into the channel. In our case a simple console.log statement suffices to test the system. Now we can run our little program using node listener.js. But to actually see anything we need to start inserting rows into our table.
INSERT INTO todo_item (text) VALUES ('Write blog post');
Whenever we now run the above SQL statement, we insert a new row into our table. This causes our trigger to run and send a notification to the channel we are listening to from our little JavaScript program - which logs "A new todo item has been inserted." in our console every time. All done!
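One detail worth knowing: node-postgres passes a message object to the notification handler, which includes the channel name and the payload we sent via pg_notify. So instead of a static log line, the callback could print the actual data (a small variant of the handler above):

client.on("notification", (msg) => {
  // msg.channel holds the channel name, msg.payload the string from pg_notify
  console.log(`Received on ${msg.channel}: ${msg.payload}`);
});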
Of course Supabase Realtime does much more, like using WebSockets to push the notifications onward to clients to provide real-time updates.