How I Solved Background Jobs Using Supabase Tables and Edge Functions

For a long time, I struggled with how to handle "heavy" tasks in my apps. I’m talking about things like generating AI images, stitching videos, or even just sending a massive batch of emails.
Early on, I did what most people do: I just tried to run the code inside my API route. I’d hit "Submit," the browser would spin, I’d pray the connection didn't drop, and usually... it timed out. Then I tried adding setTimeout hacks, and eventually, I looked at complex tools like Redis or RabbitMQ, but they felt like trying to use a sledgehammer to hang a picture frame.
After a lot of trial and error (and many crashed functions), I found a "Goldilocks" solution using just Supabase. It’s simple, it’s robust, and it uses tools you already have.
The Mental Model: The Kitchen Ticket
Before we code, let’s get on the same page. I like to think of this system like a restaurant.
- The User is the hungry customer.
- The API (Your App) is the waiter.
- The Background Job is the order ticket.
- The Worker is the chef in the kitchen.
When a customer orders a steak (a heavy task), the waiter doesn't run to the kitchen and cook it themselves while the customer waits. No, they write a ticket (Create Job), stick it on the rail, and go back to help other customers.
In the back, the chef (Worker) sees the ticket, cooks the steak (Processes the Job), and rings a bell when it's done.
We are going to build exactly this.
Step 1: Create the Job (The Ticket)
First, we need a place to store our "tickets." In Supabase, this is just a standard Postgres table. We don't need fancy queue software yet; a table is perfectly fine for most use cases.
I usually create a table called background_jobs. The exact SQL varies a little per project, but the shape is always the same.
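A minimal version looks like this. The status column (with its pending / processing / completed values) is the only part the rest of this post relies on; job_type, payload, and result are just conventions I find handy, so rename or drop them freely:

```sql
create table background_jobs (
  id uuid primary key default gen_random_uuid(),
  job_type text not null,                  -- e.g. 'generate_pdf_summary'
  payload jsonb not null default '{}',     -- whatever the worker needs to do the job
  status text not null default 'pending',  -- pending | processing | completed | failed
  result jsonb,                            -- where the worker can store its output
  error text,                              -- error message if something goes wrong
  created_at timestamptz not null default now(),
  updated_at timestamptz not null default now()
);

-- Lets the worker grab the oldest pending job without scanning the whole table.
create index on background_jobs (status, created_at);
```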
How to "Create a Job"
When your user wants to do something heavy—say, "Generate an AI summary of this PDF"—you don't call the AI API immediately. Instead, you just insert a row into this table.
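With supabase-js, that is a single insert. The job_type and payload values below are purely illustrative; use whatever your worker will need later:

```js
import { createClient } from '@supabase/supabase-js';

// Assumes your project URL and anon key live in environment variables.
const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

async function createSummaryJob(pdfUrl) {
  const { data, error } = await supabase
    .from('background_jobs')
    .insert({
      job_type: 'generate_pdf_summary', // illustrative job type
      payload: { pdf_url: pdfUrl },     // everything the worker needs later
    })
    .select()
    .single();

  if (error) throw error;

  // Hand the job id back so the frontend can watch for status changes.
  return data.id;
}
```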
That’s it. The "waiter" has taken the order. The user gets an immediate response like "We're working on it!", and they are happy.
Step 2: The Worker (The Chef)
Now we need the chef. In the Supabase world, our chef is an Edge Function.
This function needs to do three things:
- Fetch the oldest "pending" job.
- Process it (do the heavy AI work).
- Update the status to "completed".
Here is a simplified sketch of the Edge Function (supabase/functions/process-job/index.ts). I use Deno.serve, which is the modern standard for Supabase functions.
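In this sketch the heavy lifting is stubbed out as a doHeavyWork helper (that name is mine, not anything from Supabase); everything else is plain supabase-js against the table from Step 1:

```ts
// supabase/functions/process-job/index.ts
import { createClient } from "https://esm.sh/@supabase/supabase-js@2";

// The service role key lets the worker bypass RLS on the jobs table.
const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);

// Stand-in for the real heavy lifting (AI call, video stitching, bulk email...).
async function doHeavyWork(payload: Record<string, unknown>) {
  return { summary: `processed ${JSON.stringify(payload)}` };
}

Deno.serve(async (_req) => {
  // 1. Fetch the oldest "pending" job.
  const { data: job, error } = await supabase
    .from("background_jobs")
    .select("*")
    .eq("status", "pending")
    .order("created_at", { ascending: true })
    .limit(1)
    .maybeSingle();

  if (error) return new Response(error.message, { status: 500 });
  if (!job) return new Response("No pending jobs", { status: 200 });

  // 2. Mark it as "processing" so the next run skips it.
  await supabase
    .from("background_jobs")
    .update({ status: "processing", updated_at: new Date().toISOString() })
    .eq("id", job.id);

  try {
    // 3. Do the work, then mark the job as "completed".
    const result = await doHeavyWork(job.payload);
    await supabase
      .from("background_jobs")
      .update({ status: "completed", result, updated_at: new Date().toISOString() })
      .eq("id", job.id);
  } catch (err) {
    // If anything blows up, record it instead of leaving the job stuck in "processing".
    await supabase
      .from("background_jobs")
      .update({ status: "failed", error: String(err), updated_at: new Date().toISOString() })
      .eq("id", job.id);
  }

  return new Response(`Processed job ${job.id}`, { status: 200 });
});
```

One honest caveat: fetching the job and then marking it processing in two separate calls is fine while a single cron tick is the only worker. If you ever run several workers in parallel, claim the job in one atomic SQL statement (for example with FOR UPDATE SKIP LOCKED) so two chefs can't grab the same ticket.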
Use the Supabase CLI to deploy this: supabase functions deploy process-job.
Step 3: Polling (The Manager)
Now, we have a ticket system and a chef, but who tells the chef to start cooking? If we just leave the function alone, it never runs.
We need a Trigger.
There are two main ways to do this: Webhooks (Database Triggers) or Polling.
A lot of people jump straight to Webhooks ("Run the function every time a row is inserted!"). I actually recommend Polling for beginners.
Why Polling?
If you use Webhooks and 1,000 users upload a file at the exact same second, your database will try to fire 1,000 Edge Functions instantly. You might hit rate limits, time out, or even crash your database.
Polling is smoother. It’s like a manager walking into the kitchen every minute and saying, "Hey, any tickets on the rail? Grab one." It controls the pace.
We use pg_cron (a built-in Postgres extension in Supabase) to do this.
Go to your Supabase SQL Editor and set up the schedule.
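Something like the following does the trick. It assumes the pg_cron and pg_net extensions are both enabled (you can switch them on from the Supabase dashboard), and the project URL and key are placeholders you need to swap for your own:

```sql
select cron.schedule(
  'process-background-jobs',  -- any name you like for the cron entry
  '* * * * *',                -- every minute
  $$
  select net.http_post(
    url := 'https://<your-project-ref>.supabase.co/functions/v1/process-job',
    headers := jsonb_build_object(
      'Content-Type', 'application/json',
      'Authorization', 'Bearer <your-anon-key>'
    ),
    body := '{}'::jsonb
  );
  $$
);
```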
Now, every single minute, your database will poke your Edge Function. The function will wake up, check if there is a "pending" job, process it, and go back to sleep. If there are no jobs, it just exits. It's efficient and safe.
Recap: The Flow
- User clicks "Generate Video".
- App inserts a row into background_jobs (status: pending).
- pg_cron (every minute) wakes up the Edge Function.
- The Edge Function finds the pending row, changes it to processing, does the work, and changes it to completed.
- The user (on the frontend) can poll the table or listen to a Realtime subscription to see when their job turns to completed (see the sketch below).
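For that last step, a Realtime subscription might look like this. It reuses the supabase client from Step 1; jobId is the id returned when the row was inserted, and the channel name is arbitrary. Realtime also has to be enabled for the background_jobs table in your project settings:

```js
// jobId comes from createSummaryJob() in Step 1; supabase is the same client.
const channel = supabase
  .channel('job-status')
  .on(
    'postgres_changes',
    {
      event: 'UPDATE',
      schema: 'public',
      table: 'background_jobs',
      filter: `id=eq.${jobId}`,
    },
    (payload) => {
      if (payload.new.status === 'completed') {
        console.log('Job finished!', payload.new.result);
        supabase.removeChannel(channel); // stop listening once we have the result
      }
    }
  )
  .subscribe();
```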
Final Thoughts
This pattern has saved me so many headaches. It separates your "fast" user interface from your "slow" heavy logic. Whether you are resizing images, generating embeddings for RAG, or just syncing data to a third-party CRM, this simple table-based queue is often all you need.
Start simple. You don't need Kafka or RabbitMQ yet. Just a table and a cron job will get you a long way!
Happy coding!


