OpenRouter Embeddings [OpenAI Compatible]

openrouter/router/openai/v1/embeddings
The OpenRouter Embeddings API with fal, powered by OpenRouter, provides unified access to a wide range of large language models, including GPT, Claude, Gemini, and many others, through a single API interface.
Inference
Commercial use

You will be charged based on the number of input and output tokens.
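
Because the endpoint is OpenAI-compatible, you can also call it directly over HTTP. The sketch below is pieced together from the base_url and Authorization header used in the client examples that follow, so treat the raw URL and the response layout as assumptions; the response is expected to follow the standard OpenAI embeddings schema, including a usage object with the token counts you are billed for.

python
import os
import requests

# Direct HTTP call to the same endpoint the OpenAI client targets below.
# URL = base_url + "/embeddings"; auth uses fal's "Key <FAL_KEY>" header.
resp = requests.post(
    "https://fal.run/openrouter/router/openai/v1/embeddings",
    headers={
        "Authorization": f"Key {os.environ['FAL_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "model": "openai/text-embedding-3-small",
        "input": "An AI that learns to dream in colors humanity has never seen.",
    },
)
resp.raise_for_status()
payload = resp.json()

print("Embedding length:", len(payload["data"][0]["embedding"]))
# Token counts, if the proxy returns the standard usage object:
print("Usage:", payload.get("usage"))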

🧩 Usage with OpenAI Client (Embeddings API)

python
from openai import OpenAI
import os

# The OpenAI client sends "Authorization: Bearer <api_key>" by default;
# fal expects "Authorization: Key <FAL_KEY>", so the header is overridden
# here and the client's own api_key is just a placeholder.
client = OpenAI(
    base_url="https://fal.run/openrouter/router/openai/v1",
    api_key="not-needed",
    default_headers={
        "Authorization": f"Key {os.environ['FAL_KEY']}",
    },
)

response = client.embeddings.create(
    model="openai/text-embedding-3-small",
    input="An AI that learns to dream in colors humanity has never seen."
)

embedding = response.data[0].embedding
print("Embedding length:", len(embedding))

# Multiple texts example (batch encoding)
texts = [
    "An AI that learns to dream.",
    "A robot that remembers forgotten memories.",
    "A neural network that writes its own mythology.",
]

batch_response = client.embeddings.create(
    model="openai/text-embedding-3-small",
    input=texts,
)

for i, item in enumerate(batch_response.data):
    print(f"Text {i} embedding length:", len(item.embedding))

🧪 Comparing Embeddings (Simple Similarity Example)

python
from openai import OpenAI
import os
import math

client = OpenAI(
    base_url="https://fal.run/openrouter/router/openai/v1",
    api_key="not-needed",
    default_headers={
        "Authorization": f"Key {os.environ['FAL_KEY']}",
    },
)

def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

texts = [
    "An AI that learns to dream.",
    "A machine that hallucinates new worlds.",
    "A recipe for chocolate cake."
]

resp = client.embeddings.create(
    model="openai/text-embedding-3-small",
    input=texts,
)

emb_ai = resp.data[0].embedding
emb_machine = resp.data[1].embedding
emb_recipe = resp.data[2].embedding

print("AI vs machine:", cosine_sim(emb_ai, emb_machine))
print("AI vs recipe:", cosine_sim(emb_ai, emb_recipe))

📚 Documentation

For more details, visit the official docs: