News

Apple Launches On-Device AI Framework, LLM Tools, and OS Redesign for Developers

Apple on Monday introduced a broad set of developer-focused software tools aimed at enhancing the integration of artificial intelligence and generative models across its ecosystem. The announcement, made during the company's annual Worldwide Developers Conference (WWDC), includes the debut of a new Foundation Models framework, enhanced support for large language models (LLMs) in Xcode 26, an expanded App Intents API, and a redesigned interface called Liquid Glass.

Foundation Models: Local AI with Global Ambitions

At the core of Apple's AI push is the Foundation Models framework, which enables developers to incorporate generative AI features using just a few lines of Swift code. The models operate on-device via Apple silicon, forming part of a broader privacy-first initiative branded as Apple Intelligence. The framework includes capabilities such as tool calling, guided generation, and natural language inference, all without relying on cloud processing or third-party infrastructure.
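In practice, "a few lines of Swift" looks something like the sketch below, which asks the on-device model for a single completion. The type and method names follow the API surface Apple has announced for the framework, but exact signatures may differ across OS betas, so treat this as illustrative rather than definitive:

```swift
import FoundationModels

// Sketch: request a short completion from the on-device model.
// `LanguageModelSession` and `respond(to:)` follow Apple's announced
// Foundation Models API; verify signatures against the current SDK.
func suggestJournalPrompt() async throws -> String {
    let session = LanguageModelSession(
        instructions: "You are a concise journaling assistant."
    )
    let response = try await session.respond(
        to: "Suggest one reflective question about today."
    )
    return response.content
}
```

Because the session runs entirely on Apple silicon, no network call or API key is involved; the trade-off is that output quality is bounded by the on-device model rather than a large cloud model.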

Apple says the framework is free to use and optimized for latency and energy efficiency. Early adopters include Automattic, which is using it to build privacy-aware journaling features into its Day One app.

"The Foundation Model framework has helped us rethink what's possible with journaling," said Paul Mayne, head of Day One at Automattic, in a statement. "Now we can bring intelligence and privacy together in ways that deeply respect our users."

Xcode 26: ChatGPT Comes to Mac

Apple also announced that Xcode 26, its flagship IDE, now includes built-in support for ChatGPT and other large language models. Developers can use these models to auto-generate code, fix bugs, write tests and documentation, or even iterate on designs—all from within the IDE.

Developers have the option to use Apple's integrated ChatGPT interface without creating an OpenAI account, or they can supply their own API keys for external LLMs, including self-hosted ones running on Apple silicon. Xcode's Coding Tools also offer inline prompts, suggestions, and contextual actions such as preview generation and playground setup. Apple says these tools are designed to help developers "stay in flow."

Accessibility also got a boost: developers can now dictate Swift code and navigate Xcode entirely by voice using enhanced Voice Control features.

Liquid Glass: Apple's Design Reboot

Beyond AI, Apple is rolling out a visual refresh across all platforms—iOS 26, iPadOS 26, macOS Tahoe, watchOS 26 and tvOS 26—centered around a new UI material called Liquid Glass. This dynamic interface element mimics the visual texture of glass while enabling fluid motion and refined hierarchy across content layers. Developers can adopt the new design using native frameworks such as SwiftUI.
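For SwiftUI developers, adopting the new material is intended to be a one-modifier change. The sketch below assumes the `glassEffect()` modifier Apple demonstrated for Liquid Glass; its name and parameters may evolve between beta releases:

```swift
import SwiftUI

// Sketch: rendering a view on a Liquid Glass surface.
// `glassEffect()` is the modifier shown in Apple's Liquid Glass
// material demos; availability may vary by OS beta.
struct NowPlayingBadge: View {
    var body: some View {
        Label("Now Playing", systemImage: "music.note")
            .padding()
            .glassEffect() // draws the label on the translucent glass material
    }
}
```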

To support the new design system, Apple also launched an Icon Composer tool to help designers create scalable, high-fidelity app icons with features like layer annotations, blur tuning, and real-time previews.

App Intents and Visual Intelligence

The App Intents API, which connects app actions to system features like Siri, Spotlight, and widgets, is gaining a new capability: Visual Intelligence. This allows apps to participate in image-based search results at the OS level. Retail platform Etsy is among the first developers to adopt it, enabling users to find products through image context even if they don't know what to search for.
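The App Intents API itself is a Swift protocol that apps conform to in order to expose actions system-wide. The minimal sketch below uses the established `AppIntent` protocol; the intent name and dialog are illustrative, not taken from Apple's or Etsy's code:

```swift
import AppIntents

// Sketch: a minimal App Intent exposing an app action to Siri,
// Spotlight, and Shortcuts. The intent and parameter names here
// are hypothetical examples.
struct FindProductIntent: AppIntent {
    static var title: LocalizedStringResource = "Find Product"

    @Parameter(title: "Search Term")
    var query: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // A real app would run its search here and return matches.
        return .result(dialog: "Searching for \(query)…")
    }
}
```

Visual Intelligence builds on this foundation by letting the system surface an app's content in image-based searches, rather than requiring the user to phrase a text query.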

Swift 6.2 and Containerization

Apple also previewed Swift 6.2, which includes improved concurrency support and new compatibility with WebAssembly. Developers can now configure modules to run on the main actor by default, simplifying async programming. Meanwhile, the new Containerization framework allows developers to create and run Linux containers directly on Mac using Apple silicon—offering secure app packaging and resource isolation for advanced dev workflows.
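The main-actor-by-default behavior is opted into per module. In a Swift package manifest, that looks roughly like the excerpt below; the setting name follows the accepted Swift evolution proposal (SE-0466), so check it against your toolchain version:

```swift
// Package.swift excerpt (config fragment, not a runnable program):
// opt a module into main-actor-by-default isolation in Swift 6.2.
.target(
    name: "AppUI",
    swiftSettings: [
        .defaultIsolation(MainActor.self) // unannotated code runs on the main actor
    ]
)
```

For UI-heavy modules this removes much of the `@MainActor` annotation noise, since code is main-actor-isolated unless it explicitly opts out.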

A Growing AI Toolkit—Without the Noise

Across the board, Apple's message was clear: developers are getting more power and flexibility, but with the same emphasis on privacy and efficiency. Apple says its developer ecosystem now includes over 250,000 APIs spanning categories like AR, health, graphics, and ML.

"Developers play a vital role in shaping the experiences customers love across Apple platforms," said Susan Prescott, Apple's vice president of Worldwide Developer Relations. "With access to the on-device Apple Intelligence foundation model and new intelligence features in Xcode 26, we're empowering developers to build richer, more intuitive apps for users everywhere."

The upgrades come amid broader industry shifts toward model transparency and multimodal AI, with Google and Meta offering cloud-first or open-source approaches. Apple's strategy diverges: a closed but highly optimized system focused on local intelligence and developer integration.

The next generation of Apple platforms may look familiar—but they're thinking harder than ever before.

About the Author

John K. Waters is the editor in chief of a number of Converge360.com sites, with a focus on high-end development, AI, and future tech. He's been writing about cutting-edge technologies and the culture of Silicon Valley for more than two decades, and he's written more than a dozen books. He also co-scripted the documentary film Silicon Valley: A 100 Year Renaissance, which aired on PBS. He can be reached at [email protected].