Quickstart

Prerequisites

  • A Backboard account with an API key
  • Python 3.7+ or Node.js 14+ (or any HTTP client)

Step 1: Get Your API Key

  1. Log in to Backboard Dashboard
  2. Navigate to Settings → API Keys
  3. Create a new API key and copy it
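
Keep the key out of your source code. A common pattern is to read it from an environment variable; the variable name BACKBOARD_API_KEY below is illustrative, not something the SDK requires:

```python
import os

# Read the API key from the environment instead of hard-coding it.
# "BACKBOARD_API_KEY" is an illustrative variable name, not mandated by the SDK.
api_key = os.environ.get("BACKBOARD_API_KEY", "")
if not api_key:
    print("Set BACKBOARD_API_KEY before running the examples.")
```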

Step 2: Install the SDK (Optional)

If you prefer using our SDK instead of raw HTTP requests, install it:

Python:

pip install backboard-sdk

Step 3: Create an Assistant

An Assistant is an AI agent with specific instructions. Create one with a simple request:

Python SDK:

import asyncio
from backboard import BackboardClient

async def main():
    client = BackboardClient(api_key="YOUR_API_KEY")

    assistant = await client.create_assistant(
        name="My First Assistant",
        system_prompt="You are a helpful assistant that responds concisely."
    )
    print(f"Created assistant: {assistant.assistant_id}")

asyncio.run(main())

Step 4: Create a Thread

A Thread represents a conversation session. Create one for your assistant (this and the remaining snippets run inside the async main() from Step 3):

Python SDK:

thread = await client.create_thread(assistant.assistant_id)
print(f"Created thread: {thread.thread_id}")

Step 5.1: Send a Message (Non-Streaming)

Send a message and wait for the complete response:

Python SDK:

response = await client.add_message(
    thread_id=thread.thread_id,
    content="What is the capital of France?",
    stream=False
)
print(f"Assistant: {response.content}")

Step 5.2: Send a Message (Streaming)

Stream the response in real-time as it’s generated:

Python SDK:

full_content = ""
async for chunk in await client.add_message(
    thread_id=thread.thread_id,
    content="What is the capital of France?",
    stream=True
):
    if chunk.get("type") == "content_streaming":
        content_piece = chunk.get("content", "")
        full_content += content_piece
        print(content_piece, end="", flush=True)
    elif chunk.get("type") == "run_ended":
        break

print(f"\nFull response: {full_content}")

Step 6.1: Continue the Conversation (Non-Streaming)

The thread maintains context, so you can have a natural conversation:

Python SDK:

response = await client.add_message(
    thread_id=thread.thread_id,
    content="What is its population?",
    stream=False
)
print(f"Assistant: {response.content}")  # Will know you're asking about Paris

Step 6.2: Continue the Conversation (Streaming)

As in the non-streaming case, the thread carries context between messages:

Python SDK:

full_content = ""
async for chunk in await client.add_message(
    thread_id=thread.thread_id,
    content="What is its population?",
    stream=True
):
    if chunk.get("type") == "content_streaming":
        content_piece = chunk.get("content", "")
        full_content += content_piece
        print(content_piece, end="", flush=True)
    elif chunk.get("type") == "run_ended":
        break

print(f"\nAssistant: {full_content}")  # Will know you're asking about Paris

Next Steps