AIKit - My open source Xojo module for interfacing with LLMs

AIKit is a free, open source Xojo module for interacting with large language models (LLMs) from a variety of open source and proprietary providers.

Repository

Homepage

https://garrypettet.com/projects/aikit.html

About

LLMs are popular in the tech world as of the time of writing (March 2025) and many programmers are building impressive tools on top of them. AIKit provides a way for Xojo programmers to chat (using both text and images) with LLMs from Xojo code both synchronously and asynchronously using a standardised Chat object. The Chat object abstracts away the API complexities of different providers and even allows switching between providers within the same conversation.

Usage

Everything needed is contained within the AIKit module. There are no external code dependencies - the module is 100% native Xojo code and therefore should work on any platform Xojo supports.

To get started, simply copy the AIKit module into your project.

Basic synchronous usage

You can talk with an LLM synchronously like this:

Const API_KEY = "Your Anthropic Key here"
Const MODEL = "claude-3-7-sonnet-20250219"
Var chat As New AIKit.Chat(MODEL, AIKit.Providers.Anthropic, API_KEY, "")
Var response As AIKit.ChatResponse = chat.Ask("What is 1 + 2?")

This will either return an AIKit.ChatResponse object containing the model’s response, token usage, etc., or raise an AIKit.APIException if something went wrong.
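Because Ask can raise an exception, synchronous calls are best wrapped in an exception handler. A minimal sketch (the Message property names on ChatResponse and APIException are assumptions here; check the module for the actual property names):

Try
  Var answer As AIKit.ChatResponse = chat.Ask("What is 1 + 2?")
  // `Message` is an assumed property name for the model's reply text.
  MessageBox(answer.Message)
Catch e As AIKit.APIException
  // The exception's message should describe what went wrong.
  MessageBox("The request failed: " + e.Message)
End Try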

You can follow up the conversation by just continuing to ask questions:

response = chat.Ask("Add 5 to that and give me the answer")

You can even switch models and/or providers mid-conversation and the conversation history will be preserved:

chat.WithModel("o1-mini", AIKit.Providers.OpenAI, OPENAI_API_KEY, "")
response = chat.Ask("Double that value please")

Since LLMs can take a while to respond, it is highly recommended that you use AIKit asynchronously; otherwise your app may hang whilst a response is awaited (unless you use a thread).
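The thread option mentioned above can be sketched like this: run the blocking Ask call on a Xojo Thread so the main event loop keeps running. This assumes a window property called `Chat` holding a configured AIKit.Chat instance:

// Run the blocking call on a worker thread so the UI stays responsive.
Var worker As New Thread
AddHandler worker.Run, AddressOf AskOnThread
worker.Start

// AskOnThread is a method on the window:
Sub AskOnThread(sender As Thread)
  Var response As AIKit.ChatResponse = Chat.Ask("What is 1 + 2?")
  // Don't touch UI controls directly from a thread - hand the result
  // back via Thread.AddUserInterfaceUpdate or a Timer.
End Sub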

Asynchronous usage

When used asynchronously, the AIKit.Chat object will call delegates (also known as callbacks) you provide when certain events occur. Delegates are methods that you “attach” to a Chat object. These methods must have a particular signature. You can read more about Xojo delegates in Xojo’s documentation but an example is provided below:

// Assume this code is in the Opening event of a window and the window has a property called `Chat` of
// type `AIKit.Chat`.

// Create a new chat instance with a local LLM using the Ollama provider.
Const OLLAMA_ENDPOINT = "Your Ollama API endpoint ending with `/`"
Chat = New AIKit.Chat("deepseek-r1:14b", AIKit.Providers.Ollama, "", OLLAMA_ENDPOINT)

// Attach delegates to handle the various events that the chat object will raise.
// You don't have to assign a delegate to all of these. If you don't, you simply
// won't be notified when an event occurs.

// APIError() is a method that will be called when an API error happens.
Chat.APIErrorDelegate = AddressOf APIError

// ContentReceived() is a method that will be called when new message content is received.
Chat.ContentReceivedDelegate = AddressOf ContentReceived

// MaxTokensReached() is a method that will be called when the maximum token limit has been reached.
Chat.MaxTokensReachedDelegate = AddressOf MaxTokensReached

// MessageStarted() is a method that will be called when a new message is beginning.
Chat.MessageStartedDelegate = AddressOf MessageStarted

// MessageFinished() will be called when a message has just finished.
Chat.MessageFinishedDelegate = AddressOf MessageFinished

// ThinkingReceived() will be called by some models as thinking content is generated.
Chat.ThinkingReceivedDelegate = AddressOf ThinkingReceived

// Once the chat is set up, we just ask away and handle the responses in the above methods
// as they are received:
Chat.Ask("Hello")
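For illustration, a handler such as ContentReceived might look something like the sketch below. The parameter list here is an assumption; the exact delegate signatures are defined in the AIKit module, so match your methods to those:

// Hypothetical signature - consult the delegate definitions in AIKit
// for the exact parameters.
Sub ContentReceived(content As String)
  // Append the newly received content to a TextArea on the window.
  ResponseArea.Text = ResponseArea.Text + content
End Sub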

Provider support

AIKit uses the concept of Providers. A Provider is a vendor of an LLM. At present, the following providers are supported:

  • Anthropic (specifically Claude) via AnthropicProvider
  • Ollama (for locally hosted LLMs) via OllamaProvider
  • OpenAI (ChatGPT, o1/o3, etc.) via OpenAIProvider

I may add support for other providers in the future but I encourage you to add your own and create a pull request via GitHub so we can all benefit. Adding new providers is fairly easy. If you look at the included provider classes (e.g. AnthropicProvider) you’ll see they all implement the AIKit.ChatProvider interface. There are a couple of other spots in the code that would need modifying (mostly in constructors and the AIKit.Providers enumeration).

Most people won’t need to interact with provider classes directly as they are abstracted away by the Chat object.

Demo application

Included in the repo is the AIKit module and a demo application that allows you to chat with any of the supported LLMs. You will need to provide your own API keys and Ollama endpoints for the demo to work correctly (since I don’t want to share mine!). To do this, create a folder called ignore in the same directory as the AIKit src folder. In this folder, create a JSON file called private.json with this structure:

{
	"apiKeys" : {
		"anthropic" : "your-key",
		"openai" : "your-key"
	},
	"endPoints" : {
	"ollama" : "the endpoint, e.g. http://localhost:11434/api/"
	}
}

This will provide the KeySafe module in the demo app with access to your API keys. This is not needed when using AIKit in your own projects - just to make the demo work.

On macOS, you’ll need to add an entry to the plist of any project using AIKit to give the app permission to call any URL. To help with this, I have bundled an Info.plist within the resources/ folder of the repo. You can either drop this into your project (Xojo will merge it into the built app’s plist) or, if you’re using Xojo 2025r1 or greater, use the required plist keys I’ve added in the IDE’s plist editor. If you don’t do this you’ll see Xojo network exceptions.


Thanks for this Garry, always excellent contributions.


Amazing. Thank you.


Thanks Garry. Great work as always.


Really very very cool, Garry!
Thank you!


The readme explains what the contents of the private.json needs to be, but I was a bit confused about the location as well.

This is where it needs to live (screenshot in the original post): the ignore folder alongside the src folder.

I also had to comment out the Anthropic and Ollama providers in WinDemo.PopupProvider since I don’t have keys for those and it was trying to initialize by using the Anthropic API.


Fantastic module and already in use, thanks for the good work. I would suggest a simpler, more straightforward approach to key storage.

Everything else = fantastic


Thanks for the feedback and I love the integration of the module into your RichText control!

I agree the handling of API keys is a little clunky for the demo but it’s difficult to make it easy when you don’t want to expose your own keys publicly on GitHub. I will amend the documentation shortly to make it a little clearer.

Question: I have considered moving storage of API keys into the AIKit module itself, which would negate the need to pass API keys to Chat objects. The downside is that it would make it much harder to support the use case where people have multiple API keys or multiple Ollama endpoints.

The reason I engineered it so an endpoint needs to be passed to a Chat object is because I have two instances of Ollama running on different machines and two Chat instances that talk to them. If I centralised the storage of API keys and endpoints it would be clunkier to achieve what I’m doing.


My 10 cents… pros and cons.

Current
Your current architecture of passing the API key and endpoint directly to each Chat object is a sound and scalable approach, especially given your use case with multiple Ollama instances running on separate machines. This pattern lets multiple chat sessions operate independently, keeps the design flexible, and supports future scaling if it comes to that.

My 10 cents :-)

  • Maintain per-instance configuration (as you do now) so you continue to allow Chat objects to be initialized with specific keys and endpoints.
  • BUT, you might add a centralized configuration in AIKit which provides a “default” API key/endpoint store in the module. If no key/endpoint is passed to a Chat object, it uses the centralized ones.
  • Let AIKit support “named” configurations, using a dictionary for example.

So something like this could be used

AIKit.AddConfiguration("default", apiKey, endpoint)
AIKit.AddConfiguration("ollama2", otherApiKey, otherEndpoint)

Then a Chat object could be initialized like:

ChatInstance = New Chat("ollama2")

Internally, the Chat constructor would look up the configuration by name.

… Something of this nature. Like I said… 10 cents. :-)

Svenni MD.

Version 1.1.0

A little update to AIKit to simplify creating new Chat objects by adding support for default keys to the AIKit module.

In addition to being able to pass an API key (or, in the case of Ollama, an API endpoint) to a Chat instance, you can now omit both the API key and endpoint parameters and the Chat object will attempt to use default keys for the requested provider. The default keys are stored in the AIKit module itself:

// Set the keys you want to use in your app. A good place to do this is 
// in `App.Opening` but it'll work so long as they are set before you 
// create any `Chat` instances.
AIKit.Credentials.Anthropic = "your-anthropic-key"
AIKit.Credentials.Ollama = "the-ollama-endpoint-url"
AIKit.Credentials.OpenAI = "the-openai-key"

// Now creating a chat is much cleaner since we only pass two parameters:
Var chatgpt As New AIKit.Chat("o1-mini", AIKit.Providers.OpenAI)
chatgpt.Ask("Hi")

// Switch models mid-conversation:
chatgpt.WithModel("o3-mini", AIKit.Providers.OpenAI)
chatgpt.Ask("Are you smarter now?")

I took a different approach to that suggested by @Sveinn_Runar_Sigurdsson because this approach suits my app a little better.


Fantastic! Using your module in our main app already. Love it!