Semantic Image Search

This example shows you how to use Transformers.js to create a semantic image search engine. Check out the demo here.

Semantic Image Search Demo
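
Under the hood, the app embeds your search query with a CLIP text encoder and compares it against precomputed CLIP image embeddings stored in Postgres. The snippet below is a minimal sketch of the text side using Transformers.js; the Xenova/clip-vit-base-patch16 checkpoint (which produces 512-dimensional embeddings, matching the vector(512) column used below) is an assumption, not necessarily the exact model the demo ships with.

    import { AutoTokenizer, CLIPTextModelWithProjection } from '@xenova/transformers';

    // Load the tokenizer and text encoder (downloaded once, then cached).
    const modelId = 'Xenova/clip-vit-base-patch16'; // assumed 512-dim CLIP checkpoint
    const tokenizer = await AutoTokenizer.from_pretrained(modelId);
    const textModel = await CLIPTextModelWithProjection.from_pretrained(modelId);

    // Compute a 512-dimensional embedding for a search query.
    const inputs = tokenizer(['a photo of a dog at the beach'], { padding: true, truncation: true });
    const { text_embeds } = await textModel(inputs);
    console.log(text_embeds.dims); // [1, 512]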

Getting Started

Dataset

This application uses images from The Unsplash Dataset, which you can download here. All you need for this demo is the photos.tsv000 TSV file, which contains the metadata for all the images.

Connecting to Supabase

After creating a new Supabase project, you'll need to:

  1. Create an images table and import the data from photos.tsv000.

  2. Add an image_embedding column (this requires the pgvector extension to be enabled for your project):

    -- Add a new vector column with a dimension of 512
    alter table images add column image_embedding vector(512);
    
  3. Add your SUPABASE_URL, SUPABASE_ANON_KEY, and SUPABASE_SECRET_KEY to a .env.local file (see .env.local.example for a template).

  4. Update the image embeddings in your database by running the following command (a rough sketch of what the script does is shown after this list):

    SUPABASE_URL=your-project-url \
    SUPABASE_SECRET_KEY=your-secret-key \
    node scripts/update-database.mjs
    

    Note: This will take a while. Also, since queries are capped at 1000 returned rows, you'll need to run this command multiple times to update all 25,000 rows.

  5. Create a new match_images database function:

    -- https://supabase.com/blog/openai-embeddings-postgres-vector
    create or replace function match_images (
        query_embedding vector(512),
        match_threshold float,
        match_count int
    )
    returns table (
        photo_id text,
        photo_url text,
        photo_image_url text,
        photo_width int,
        photo_height int,
        photo_aspect_ratio float,
        photo_description text,
        ai_description text,
        blur_hash text,
        similarity float
    )
    language sql stable
    as $$
    select
        photo_id,
        photo_url,
        photo_image_url,
        photo_width,
        photo_height,
        photo_aspect_ratio,
        photo_description,
        ai_description,
        blur_hash,
        1 - (image_embedding <=> query_embedding) as similarity
    from images
    where 1 - (image_embedding <=> query_embedding) > match_threshold
    order by similarity desc
    limit match_count;
    $$;
    
  6. Add a database policy to allow users to read from the images table:

    create policy "policy_name"
    on public.images
    for select using (
        true
    );
    

Development

You can now run the development server with:

npm run dev

Open http://localhost:3000 with your browser to see the result.
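
When you type a query, the app embeds it with the CLIP text encoder (as sketched above) and retrieves the closest images by calling the match_images function from step 5. Below is a minimal sketch of that call, assuming queryEmbedding is the 512-number array produced by the text encoder, with arbitrary example values for the threshold and count:

    import { createClient } from '@supabase/supabase-js';

    const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

    const { data: matches, error } = await supabase.rpc('match_images', {
        query_embedding: queryEmbedding, // 512-dim array from the text encoder
        match_threshold: 0.1,            // example value
        match_count: 24,                 // example value
    });
    if (error) throw error;

    // Each match includes photo_image_url, blur_hash, similarity, etc.
    console.log(matches.map((m) => m.photo_image_url));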