Generate Embeddings

Generate text embeddings using Edge Functions + Transformers.js.

This guide will walk you through how to generate high quality text embeddings in Edge Functions using Transformers.js. Inference is performed directly in Edge Functions using open source Hugging Face models, so no external API is required.

What is Transformers.js?

Transformers.js is a library that lets you run Hugging Face Transformer models for inference directly in a JavaScript runtime. This means you can run Hugging Face embedding models through Transformers.js inside Supabase Edge Functions.
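
For context, here is a minimal sketch of using Transformers.js on its own. It assumes a Node.js or browser environment with the @xenova/transformers npm package installed; the rest of this guide builds the same pipeline inside an Edge Function instead.

import { pipeline } from '@xenova/transformers'

// Download the model and build a reusable embedding pipeline
const extractor = await pipeline('feature-extraction', 'Supabase/gte-small')

// Generate a pooled, normalized sentence embedding for a string
const output = await extractor('hello world', { pooling: 'mean', normalize: true })
const embedding = Array.from(output.data)
console.log(embedding.length) // gte-small produces 384-dimensional embeddings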

Build the Edge Function

Let's build an Edge Function that accepts an input string and generates an embedding for it. Edge Functions are server-side TypeScript HTTP endpoints that run on demand, close to your users.
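
If you haven't written one before, a bare-bones Edge Function is just an HTTP handler. The sketch below (using a hypothetical name parameter, separate from the embed function we build next) shows the basic shape:

import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'

serve(async (req) => {
  // Read the JSON body and return a JSON response
  const { name } = await req.json()
  return new Response(JSON.stringify({ message: `Hello ${name}!` }), {
    headers: { 'Content-Type': 'application/json' },
  })
})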

1. Set up Supabase locally

Make sure you have the latest version of the Supabase CLI installed.

Initialize Supabase in the root directory of your app and start your local stack.


supabase init
supabase start

2. Create Edge Function

Create the Edge Function that we'll use to generate embeddings. We'll call it embed (you can name it anything you like).

This will create a new TypeScript file called index.ts under ./supabase/functions/embed.


supabase functions new embed

3. Import & configure Transformers.js

Let's modify the Edge Function to import the Transformers.js client and configure it for the Deno runtime.

./supabase/functions/embed/index.ts

import { serve } from 'https://deno.land/std@0.168.0/http/server.ts'
import { env, pipeline } from 'https://cdn.jsdelivr.net/npm/@xenova/transformers@2.5.0'

// Configure Transformers.js for the Deno runtime:
// don't use the browser cache and don't load models from the local filesystem
env.useBrowserCache = false;
env.allowLocalModels = false;

4. Construct embedding pipeline

Next, let's construct the pipe() function we'll use to generate embeddings. We create it using the Transformers.js pipeline() function.

./supabase/functions/embed/index.ts

const pipe = await pipeline(
  'feature-extraction',
  'Supabase/gte-small',
);

Note the two arguments we pass to pipeline():

  1. The first argument specifies the type of inference task to perform. feature-extraction is the task used for embedding generation.
  2. The second argument specifies which Hugging Face model to use for the embedding generation. Supabase officially supports the models available in the Supabase organization on Hugging Face (see the sketch below for swapping in a different model).
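
As an illustration only, swapping in a different model is just a matter of changing that second argument in the pipeline() call from the previous step. The model below is an example from the Xenova org rather than the Supabase org (so it isn't covered by the official support mentioned above), but any Transformers.js-compatible feature-extraction model follows the same pattern:

const pipe = await pipeline(
  'feature-extraction',
  // Example only: a model from the Xenova org rather than the Supabase org
  'Xenova/all-MiniLM-L6-v2',
);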

Note: We intentionally construct the pipeline before the serve() function. This allows us to reuse the pre-loaded model on future warm-start requests, significantly speeding up subsequent queries. You'll only notice this speedup after deploying your function to Supabase - requests in local development are always a cold start.

5. Implement request handler

Modify our serve() request handler to accept an input string from the POST request JSON body.

Finally let's generate the embedding by:

  1. Calling pipe() to perform the embedding generation
  2. Extracting the embedding from the output as a number array
  3. Returning it as a JSON response

./supabase/functions/embed/index.ts

serve(async (req) => {
  // Extract input string from JSON body
  const { input } = await req.json();

  // Generate the embedding from the user input
  const output = await pipe(input, {
    pooling: 'mean',
    normalize: true,
  });

  // Extract the embedding output
  const embedding = Array.from(output.data);

  // Return the embedding
  return new Response(
    JSON.stringify({ embedding }),
    { headers: { 'Content-Type': 'application/json' } }
  );
});

Note the two options we pass to pipe():

  • pooling: The first option sets pooling to mean. Pooling refers to how token-level embedding representations are compressed into a single sentence embedding that reflects the meaning of the entire sentence. Mean (average) pooling is the most common choice for sentence embeddings and is what we use here.
  • normalize: The second option tells Transformers.js to normalize the embedding vector so that it can be used with distance measures like dot product. A normalized vector has a length (magnitude) of 1 - also referred to as a unit vector. A vector is normalized by dividing each element by its magnitude, which keeps its direction but scales its length to 1 (see the sketch after this list).
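
As a small illustration of why normalization matters (this is standalone TypeScript, not part of the Edge Function): for unit-length vectors the dot product equals cosine similarity, so either measure can be used to compare the embeddings returned by this function.

// Dot product of two vectors of equal length
function dotProduct(a: number[], b: number[]): number {
  return a.reduce((sum, value, i) => sum + value * b[i], 0)
}

// Two made-up 2D unit vectors standing in for real 384-dimensional embeddings
const a = [0.6, 0.8]
const b = [0.8, 0.6]

// Because both vectors have length 1, this dot product is also their cosine similarity
console.log(dotProduct(a, b)) // ≈ 0.96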

6. Test it!

To test the Edge Function, first start a local functions server.


supabase functions serve

Then, in a new shell, send an HTTP request using cURL, passing your input in the JSON body.


curl --request POST 'http://localhost:54321/functions/v1/embed' \
  --header 'Authorization: Bearer ANON_KEY' \
  --header 'Content-Type: application/json' \
  --data '{ "input": "hello world" }'

Be sure to replace ANON_KEY with your project's anonymous key. You can get this key by running supabase status.
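
Once the function is deployed, you can also invoke it from an app using supabase-js. The sketch below assumes a supabase-js v2 client; the project URL and ANON_KEY are placeholders for your own project's values.

import { createClient } from '@supabase/supabase-js'

// Placeholders: substitute your project URL and anon key
const supabase = createClient('https://your-project.supabase.co', 'ANON_KEY')

// Call the embed function with an input string in the request body
const { data, error } = await supabase.functions.invoke('embed', {
  body: { input: 'hello world' },
})

if (error) {
  console.error(error)
} else {
  console.log(data.embedding.length) // e.g. 384 for gte-small
}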

Next steps