LLM Providers

Configure an AI provider for Anthropic, OpenAI, or Azure Foundry, or build your own custom provider.

Each provider is a separate npm package. Install only what you need.
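For example, the install step might look like this (package names are taken from the sections below; npm is assumed as the package manager):

```shell
# Install the core package plus only the provider you use.
npm install @inferagraph/core

# Pick the provider(s) you need:
npm install @inferagraph/anthropic-provider
npm install @inferagraph/openai-provider
npm install @inferagraph/azure-foundry-provider
```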

Anthropic

@inferagraph/anthropic-provider

import { useState } from 'react';
import { InferaGraph, GraphStore, AIEngine } from '@inferagraph/core';
import { AnthropicProvider } from '@inferagraph/anthropic-provider';

const provider = new AnthropicProvider({
  apiKey: 'sk-ant-...',
  model: 'claude-sonnet-4-20250514',  // default
  maxTokens: 1024,                  // default
});

function App() {
  const [store] = useState(() => new GraphStore());
  const [ai] = useState(() => new AIEngine(store, provider));

  const handleQuery = async () => {
    const result = await ai.query("Who was Abraham's wife?");
    console.log(result);
  };

  return <InferaGraph data={store} layout="graph" />;
}

OpenAI

@inferagraph/openai-provider

import { useState } from 'react';
import { InferaGraph, GraphStore, AIEngine } from '@inferagraph/core';
import { OpenAIProvider } from '@inferagraph/openai-provider';

const provider = new OpenAIProvider({
  apiKey: 'sk-...',
  model: 'gpt-4o',     // default
  organization: 'org-...', // optional
});

function App() {
  const [store] = useState(() => new GraphStore());
  const [ai] = useState(() => new AIEngine(store, provider));

  const handleQuery = async () => {
    const result = await ai.query('Tell me about Moses');
    console.log(result);
  };

  return <InferaGraph data={store} layout="graph" />;
}

Azure Foundry

@inferagraph/azure-foundry-provider

import { useState } from 'react';
import { InferaGraph, GraphStore, AIEngine } from '@inferagraph/core';
import { AzureFoundryProvider } from '@inferagraph/azure-foundry-provider';

const provider = new AzureFoundryProvider({
  endpoint: 'https://your-resource.services.ai.azure.com',
  apiKey: '...',           // or use credential
  deploymentName: 'gpt-4o', // optional
});

function App() {
  const [store] = useState(() => new GraphStore());
  const [ai] = useState(() => new AIEngine(store, provider));

  const handleQuery = async () => {
    const result = await ai.query('Describe the Exodus');
    console.log(result);
  };

  return <InferaGraph data={store} layout="graph" />;
}

Custom Provider

Extend the LLMProvider base class to integrate any other model or API.

import { useState } from 'react';
import { LLMProvider, InferaGraph, GraphStore, AIEngine } from '@inferagraph/core';

class MyProvider extends LLMProvider {
  readonly name = 'my-provider';

  async complete(request) {
    // Call your model API
    return { content: '...' };
  }

  isConfigured() { return true; }
}

const provider = new MyProvider();

function App() {
  const [store] = useState(() => new GraphStore());
  const [ai] = useState(() => new AIEngine(store, provider));

  const handleQuery = async () => {
    const result = await ai.query('Tell me about Isaac');
    console.log(result);
  };

  return <InferaGraph data={store} layout="graph" />;
}
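In practice, complete() usually wraps an HTTP call to your model's API. Below is a minimal sketch of one way to do that against an OpenAI-compatible chat endpoint. The request and response shapes here are hypothetical stand-ins for illustration; the real types come from @inferagraph/core.

```typescript
// Hypothetical stand-ins for the library's real request/response types.
type CompletionRequest = { prompt: string; maxTokens?: number };
type CompletionResponse = { content: string };

// Pure helper: builds the JSON body for an OpenAI-compatible
// chat-completions endpoint, defaulting max_tokens to 1024.
function buildBody(req: CompletionRequest, model: string) {
  return {
    model,
    messages: [{ role: 'user', content: req.prompt }],
    max_tokens: req.maxTokens ?? 1024,
  };
}

// Sends the request and unwraps the first choice's message content.
async function completeViaHttp(
  req: CompletionRequest,
  endpoint: string,
  apiKey: string,
  model: string,
): Promise<CompletionResponse> {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildBody(req, model)),
  });
  if (!res.ok) throw new Error(`Completion request failed: ${res.status}`);
  const data = await res.json();
  return { content: data.choices[0].message.content };
}
```

A custom provider's complete() could then delegate to completeViaHttp, keeping credentials and endpoint configuration in the provider's constructor.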