Technology | November 15, 2023

Getting Started with the Astra Assistants API


Today we released the Astra Assistants API, a new service that acts as a drop-in replacement for OpenAI's powerful new Assistants API (read more about the news here). The service is designed for users and organizations that want additional control over their data. It uses DataStax Astra DB to persist messages, assistants, threads, runs, and files, and Astra DB's vector functionality to store and query embeddings for retrieval-augmented generation (RAG). It calls out to OpenAI for large language model tasks like embedding generation and chat completion.

Although users interact with the service through the OpenAI SDKs, their proprietary data is stored in their own database and can be managed, accessed, and secured like other Astra DB databases.

Pointing an existing OpenAI app to the Astra Assistants API is as easy as changing a single line of code.

Get started

To grab an Astra token, register for Astra DB and click Generate Token on the main welcome page. Then, in your application, replace the client constructor:

client = OpenAI()

with:

client = OpenAI(
    base_url="https://open-assistant-ai.astra.datastax.com/v1",
    default_headers={
        "astra-api-token": <YOUR_ASTRA_DB_TOKEN>
    }
)

The base_url tells the client to hit Astra instead of OpenAI, and the default header is used to authenticate against Astra. Your OpenAI token, usually passed via an environment variable, is still required because the app still makes calls to OpenAI's LLMs.
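For example, here is a minimal sketch that wires both credentials through environment variables. The SDK picks up OPENAI_API_KEY on its own; the ASTRA_DB_APPLICATION_TOKEN name below is just an illustrative choice, not something the service requires:

import os
from openai import OpenAI

# OPENAI_API_KEY is read by the SDK automatically; only the Astra token
# needs to be forwarded as a header. The env var name here is illustrative.
client = OpenAI(
    base_url="https://open-assistant-ai.astra.datastax.com/v1",
    default_headers={
        "astra-api-token": os.environ["ASTRA_DB_APPLICATION_TOKEN"]
    }
)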

The first time you hit the API, we will stand up your database (with the name `assistant_api_db`) for you, so expect a short delay (a few minutes) on the first request. This will only happen once.
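If your client times out before the database is ready, you can give that first call some extra headroom. A small sketch, assuming the openai Python SDK v1 and its per-request with_options override (600 seconds is an arbitrary choice):

# One-off request with a longer timeout while the database is provisioned;
# subsequent calls can use the client's default timeout.
client.with_options(timeout=600.0).models.list()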

Optionally, if you have an existing Astra DB vector database that you want to use, you can pass your database id in a second header as shown below:

client = OpenAI(
    base_url="https://open-assistant-ai.astra.datastax.com/v1",
    default_headers={
        "astra-api-token": <YOUR_ASTRA_DB_TOKEN>,
        "astra-db-id": <YOUR_ASTRA_DB_ID>
    }
)

Note: when going this route, make sure your database is vector-enabled.

At this point you're ready to create an assistant and start coding:

assistant = client.beta.assistants.create(
  instructions="You are a personal math tutor. When asked a math question, write and run code to answer the question.",
  model="gpt-4-1106-preview",
  tools=[{"type": "code_interpreter"}]
)
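From there, the usual Assistants flow applies: create a thread, add a user message, start a run, and poll it until it completes. A minimal sketch (the prompt is illustrative; every call it makes maps to an endpoint marked as implemented in the table below):

import time

# Create a thread and add a user message to it
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id,
    role="user",
    content="What is 1278 multiplied by 342?"
)

# Kick off a run with the assistant created above and poll until it finishes
run = client.beta.threads.runs.create(thread_id=thread.id, assistant_id=assistant.id)
while run.status in ("queued", "in_progress"):
    time.sleep(1)
    run = client.beta.threads.runs.retrieve(thread_id=thread.id, run_id=run.id)

# Read back the conversation, newest message first
messages = client.beta.threads.messages.list(thread_id=thread.id)
print(messages.data[0].content)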

Coverage

The OpenAI API currently supports 57 endpoints; the current version of the Astra Assistants API supports about 70% of those, and we will continue to add more after this initial release. We'll keep an up-to-date version of the following table on GitHub:

| Endpoint | Implemented | Stateless / Proxy | Roadmap |
| --- | --- | --- | --- |
| /chat/completions - post | X | X | |
| /completions - post | X | X | |
| /edits - post | X | X | |
| /images/generations - post | X | X | |
| /images/edits - post | X | X | |
| /images/variations - post | X | X | |
| /embeddings - post | X | X | |
| /audio/speech - post | X | X | |
| /audio/transcriptions - post | X | X | |
| /audio/translations - post | X | X | |
| /files - get | X | X | |
| /files - post | X | X | |
| /files/{file_id} - delete | X | X | |
| /files/{file_id} - get | X | X | |
| /files/{file_id}/content - get | X | X | |
| /fine_tuning/jobs - post | X | X | |
| /fine_tuning/jobs - get | X | X | |
| /fine_tuning/jobs/{fine_tuning_job_id} - get | X | X | |
| /fine_tuning/jobs/{fine_tuning_job_id}/events - get | X | X | |
| /fine_tuning/jobs/{fine_tuning_job_id}/cancel - post | X | X | |
| /fine-tunes - post | X | X | |
| /fine-tunes - get | X | X | |
| /fine-tunes/{fine_tune_id} - get | X | X | |
| /fine-tunes/{fine_tune_id}/cancel - post | X | X | |
| /fine-tunes/{fine_tune_id}/events - get | X | X | |
| /models - get | X | X | |
| /models/{model} - get | X | X | |
| /models/{model} - delete | X | X | |
| /moderations - post | X | X | |
| /assistants - get | X | | |
| /assistants - post | X | | |
| /assistants/{assistant_id} - get | X | | |
| /assistants/{assistant_id} - post | X | | |
| /assistants/{assistant_id} - delete | X | | |
| /threads - post | X | | |
| /threads/{thread_id} - get | | | X |
| /threads/{thread_id} - post | | | X |
| /threads/{thread_id} - delete | | | X |
| /threads/{thread_id}/messages - get | X | | |
| /threads/{thread_id}/messages - post | X | | |
| /threads/{thread_id}/messages/{message_id} - get | | | X |
| /threads/{thread_id}/messages/{message_id} - post | | | X |
| /threads/runs - post | | | X |
| /threads/{thread_id}/runs - get | X | | |
| /threads/{thread_id}/runs - post | X | | |
| /threads/{thread_id}/runs/{run_id} - get | X | | |
| /threads/{thread_id}/runs/{run_id} - post | | | X |
| /threads/{thread_id}/runs/{run_id}/submit_tool_outputs - post | | | X |
| /threads/{thread_id}/runs/{run_id}/cancel - post | | | X |
| /threads/{thread_id}/runs/{run_id}/steps - get | | | X |
| /threads/{thread_id}/runs/{run_id}/steps/{step_id} - get | | | X |
| /assistants/{assistant_id}/files - get | | | X |
| /assistants/{assistant_id}/files - post | | | X |
| /assistants/{assistant_id}/files/{file_id} - get | | | X |
| /assistants/{assistant_id}/files/{file_id} - delete | | | X |
| /threads/{thread_id}/messages/{message_id}/files - get | | | X |
| /threads/{thread_id}/messages/{message_id}/files/{file_id} - get | | | X |

40 out of 57 endpoints are implemented (70% coverage)
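The endpoints marked Stateless / Proxy are passed straight through to OpenAI, so non-Assistants calls keep working unchanged against the same client. For example (the model name is just an illustration):

# Embeddings are proxied to OpenAI; request and response shapes are unchanged.
embedding = client.embeddings.create(
    model="text-embedding-ada-002",
    input="The quick brown fox jumped over the lazy dog"
)
print(len(embedding.data[0].embedding))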

If you don't already have an AI app you're working on, take a look at OpenAI's fantastic cookbook repository or our version of the Assistants API notebook with the Astra connection strings—and get coding.


By enabling this preview, you agree to the DataStax Preview Terms.
