Quickstart

Get FOON running in your project in under 5 minutes.

Installation

npm install foon-sdk

Basic Usage

1. Define Your Target Schema

Start by defining the shape of data you want. FOON uses standard JSON Schema:

const customerSchema = {
  type: 'object',
  properties: {
    name: {
      type: 'object',
      properties: {
        given: { type: 'string' },
        family: { type: 'string' }
      },
      required: ['given', 'family']
    },
    email: { type: 'string', format: 'email' }
  },
  required: ['name', 'email']
};
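FOON validates transformed output against this schema internally. As a rough illustration of what the top-level `required` constraint enforces, here is a hand-rolled sketch (for illustration only; this is not FOON's validator):

```javascript
// Check that every top-level required key is present.
// Real JSON Schema validation also covers types, formats, and nesting.
function checkRequired(schema, data) {
  const missing = (schema.required ?? []).filter((key) => !(key in data));
  return { ok: missing.length === 0, missing };
}

const schema = { required: ['name', 'email'] };
checkRequired(schema, { name: 'Jane' }); // → { ok: false, missing: ['email'] }
```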

2. Transform Data

Import the transform function and a provider, then transform your data:

import { transform, OpenAIProvider } from 'foon-sdk';

// Messy input from a webhook
const webhookPayload = {
  fullName: 'Jane Smith',
  email_addr: 'jane.smith@example.com'
};

// Transform to your schema
const result = await transform(
  webhookPayload,
  {
    schema: customerSchema,
    provider: new OpenAIProvider({
      apiKey: process.env.OPENAI_API_KEY,
      model: 'gpt-5-nano'
    }),
    confidenceThreshold: 0.85
  }
);

if (result.ok) {
  console.log(result.output);
  // {
  //   name: { given: 'Jane', family: 'Smith' },
  //   email: 'jane.smith@example.com'
  // }
} else {
  console.error(result.error);
}
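The `confidenceThreshold` option gates low-confidence mappings. Conceptually, the check behaves like the sketch below, using hypothetical per-field confidence values (this is a simplified illustration, not FOON's internals):

```javascript
// Hypothetical per-field confidences, as a provider might report them.
const fieldConfidences = {
  'name.given': 0.97,
  'name.family': 0.95,
  email: 0.99
};

// A transformation passes only if every field clears the threshold.
function meetsThreshold(confidences, threshold) {
  return Object.values(confidences).every((c) => c >= threshold);
}

meetsThreshold(fieldConfidences, 0.85); // → true
```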

Available Providers

FOON supports multiple AI providers:

import { GeminiProvider, OpenAIProvider, OllamaProvider } from 'foon-sdk';

// OpenAI (Recommended)
const openai = new OpenAIProvider({
  apiKey: process.env.OPENAI_API_KEY,
  model: 'gpt-5-nano'
});

// Google Gemini
const gemini = new GeminiProvider({
  apiKey: process.env.GEMINI_API_KEY,
  model: 'gemini-1.5-flash'
});

// Ollama (local)
const ollama = new OllamaProvider({
  model: 'llama2',
  baseUrl: 'http://localhost:11434'
});

Using a Cache

Avoid redundant AI calls by caching mapping plans:

import { transform, OpenAIProvider, LRUCache } from 'foon-sdk';

const cache = new LRUCache({ max: 100, ttl: 3600000 }); // 100 entries, 1 hour TTL

const result = await transform(
  webhookPayload,
  {
    schema: customerSchema,
    provider: new OpenAIProvider({
      apiKey: process.env.OPENAI_API_KEY,
      model: 'gpt-5-nano'
    }),
    cache
  }
);

console.log('Cache hit:', result.trace.cache.hit);
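The cache follows standard LRU behavior: once `max` entries are stored, the least-recently-used entry is evicted first. A minimal Map-based sketch of that eviction policy (illustrative only; use the `LRUCache` exported by foon-sdk in real code):

```javascript
// Tiny LRU: a Map preserves insertion order, so the first key is the
// least recently used as long as we re-insert on every access.
class TinyLRU {
  constructor(max) {
    this.max = max;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value); // refresh recency
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      this.map.delete(this.map.keys().next().value); // evict oldest
    }
  }
}
```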

Working with Results

Every transformation returns a result object carrying a success flag (ok), the output or error, and an execution trace:

const result = await transform(input, options);

if (result.ok) {
  // Success - use the transformed output
  const data = result.output;
  console.log('Confidence:', result.trace.confidenceSummary);
} else {
  // Handle error
  console.error('Category:', result.error.category);
  console.error('Message:', result.error.message);
  // Trace is still available for debugging
  console.log('Trace:', result.trace);
}
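If you prefer exceptions over checking result.ok at every call site, a small helper can convert the result object into a throw. Note that unwrap below is a hypothetical convenience, not part of foon-sdk:

```javascript
// Return the output on success; otherwise throw an Error carrying
// the category and trace for upstream handlers.
function unwrap(result) {
  if (result.ok) return result.output;
  const err = new Error(result.error.message);
  err.category = result.error.category;
  err.trace = result.trace;
  throw err;
}

// const data = unwrap(await transform(input, options));
```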

Next Steps