Keep in mind, though, that this approach is limited to the few languages and regions where Apple Intelligence is available. A better and more universal option is to use a llama.cpp wrapper, which carries no such artificial limitations.