Introducing AnyLanguageModel: One API for Local and Remote LLMs on Apple Platforms

Hugging Face · Nov 20, 2025

Why It Matters

A single, consistent API across local and cloud providers lets Apple developers prototype against one backend and switch to another without rewriting application code, lowering the cost of experimenting with different models and reducing integration friction.

Summary

The post announces AnyLanguageModel, a Swift package that lets Apple developers swap the Foundation Models import for a unified API supporting local (Core ML, MLX, llama.cpp, Ollama) and cloud (OpenAI, Anthropic, Gemini, Hugging Face) LLM providers with minimal code changes. Built on Apple's Foundation Models framework and using Swift 6.1 package traits, the library avoids dependency bloat and makes it easy to experiment across providers, even extending the API to handle image inputs despite current platform limitations. The author emphasizes reducing integration friction, provides a demo chat-ui-swift app, and invites community contributions as the project evolves toward full feature parity and advanced agentic workflows.
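The "swap the import" idea can be sketched roughly as follows. This is an illustrative example rather than code from the post: the type and method names (`SystemLanguageModel`, `LanguageModelSession`, `respond(to:)`) are assumed here to follow Apple's Foundation Models API, which the package is described as mirroring.

```swift
// Before: Apple's framework, limited to Apple's on-device models.
// import FoundationModels

// After: the unified package; the session code below stays the same,
// so other providers can be substituted with minimal changes.
import AnyLanguageModel

let model = SystemLanguageModel.default          // default on-device model
let session = LanguageModelSession(model: model) // conversation state lives here
let response = try await session.respond(to: "Summarize this paragraph.")
print(response.content)
```

Because the surface API stays constant, switching from a local backend to a cloud provider should, per the post's premise, amount to constructing a different model value rather than rewriting the session logic.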
