Did you know that macOS 26 (and iOS 26) ships with a small local LLM from Apple?

Since it runs locally, there is no cost involved and no network access required. So why not tap into this resource, when available, to enhance your Xojo application?
You could use the LLM to:
- Generate text based on keywords
- Summarize texts
- Adjust the tone of a text, e.g. to make it sound more professional
- Translate text
With the FoundationModels classes in MBS Xojo Plugins, you can load a model and send in requests. Let us show you some sample code, which uses our FoundationModelsMBS module to check whether this is available and then starts a session with the system model:
if not FoundationModelsMBS.Available then
  MessageBox "Please run on macOS 26."
  quit
end if

SystemLanguageModels = new SystemLanguageModelsMBS

if SystemLanguageModels.Available then
  'MessageBox "Available"
else
  var u as integer = SystemLanguageModels.UnavailableReason

  Select case u
  case SystemLanguageModels.UnavailableReasonAppleIntelligenceNotEnabled
    MessageBox "Apple Intelligence not enabled."
  case SystemLanguageModels.UnavailableReasonDeviceNotEligible
    MessageBox "Device not eligible."
  case SystemLanguageModels.UnavailableReasonModelNotReady
    MessageBox "Model not ready."
  else
    MessageBox "UnavailableReason: " + u.ToString
  end Select

  quit
end if
// and let's start a session
var instructions as string = "Be polite. You run within a Xojo application. "
session = new LanguageModelSessionsMBS(SystemLanguageModels, instructions)
This loads a model. When picking a model, Apple lets you provide a use case: either the one for content tagging or the general model. There is also the choice of whether to use the default guardrails for the model or the ones for permissive content transformations.
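How you pick the use case and guardrails depends on which SystemLanguageModelsMBS constructor you call; the following is only a rough sketch, where the parameter and constant names are assumptions, not the documented API, so please check the SystemLanguageModelsMBS documentation for the real names:
// hypothetical sketch: parameter and constant names below are assumptions
var general as new SystemLanguageModelsMBS // default: general model with standard guardrails
var tagging as new SystemLanguageModelsMBS(SystemLanguageModelsMBS.UseCaseContentTagging) // content tagging use case
var permissive as new SystemLanguageModelsMBS(SystemLanguageModelsMBS.GuardrailsPermissiveContentTransformations) // permissive content transformations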
Then you can start a session and provide instructions. The idea is that the instructions tell the LLM what to do, while the text to work on is passed as the prompt. You may call the session's Respond method multiple times to ask questions within the session and refer to previous prompts and responses.
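For example, within one session a follow-up prompt can refer to what came before. A minimal sketch, where SourceText is just a placeholder for your own input:
// first request in the session
var first as PromptMBS = PromptMBS.promptString("Summarize the following text: " + SourceText)
session.respond(first, AddressOf Responded)

// later, e.g. after the first response arrived, a follow-up prompt
// can refer to the earlier exchange within the same session
var followUp as PromptMBS = PromptMBS.promptString("Now translate your summary to German.")
session.respond(followUp, AddressOf Responded)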
We pass our input text to the model to get an asynchronous response:
Sub SendRequest()
  AskButton.Enabled = false

  var p as PromptMBS = PromptMBS.promptString(InputField.Text)
  AddText InputField.Text

  session.respond(p, AddressOf Responded)
  InputField.Text = ""
End Sub
Later you get the callback to the Responded method:
Sub Responded(Response as ResponseMBS, error as string, tag as Variant)
  if Response <> nil then
    var content as string = Response.Content
    AddText "> " + content
  end if

  if error <> "" then
    AddText "> " + error
  end if

  AskButton.Enabled = true
End Sub
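AddText is not part of the plugin; we assume it is a small helper method on the window that appends a line to a TextArea, for example:
Sub AddText(text as string)
  // OutputArea is assumed to be a TextArea on the window
  OutputArea.AddText text + EndOfLine
End Sub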
This allows you to run a chat bot. Or you pass in instructions to summarize, translate or complete text and then provide the text to work on as the prompt.
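For example, a session dedicated to summarizing could be set up like this, where ArticleText is a placeholder for your own input and the instruction wording is just a suggestion:
// session with instructions for summarizing
var instructions as string = "Summarize the text given in the prompt in three sentences."
session = new LanguageModelSessionsMBS(SystemLanguageModels, instructions)

// the prompt carries the text to work on
var p as PromptMBS = PromptMBS.promptString(ArticleText)
session.respond(p, AddressOf Responded)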
Please try the new functions in version 15.5 and let us know what you think. And always remember that Apple will probably improve the models over time, so the model shipping today is the worst it will ever be. We look forward to enhancements in upcoming macOS versions.