Meta has launched Muse Spark, the foundational model now powering Meta AI across the Meta AI app and meta.ai, with expanded integration into WhatsApp, Instagram, Facebook, Messenger, Threads, and AI glasses. This rollout introduces faster voice responses, smarter shopping assistance, and real-time visual recognition through the device camera.
Today we’re introducing Meta AI Voice Conversations powered by Muse Spark that let you talk naturally to Meta AI (interrupt, switch topics, or swap languages), and as you talk, Meta AI can generate images and pull up recommendations from Reels, maps, and more. We’re also bringing… pic.twitter.com/97PpSEplgo
— Meta Newsroom (@MetaNewsroom) May 12, 2026
Voice conversations let users speak naturally with Meta AI, switch topics or languages mid-discussion, and receive on-the-fly image generation and relevant recommendations. Shopping mode now aggregates Facebook Marketplace and internet-wide listings, offering map-based browsing, price and style filters, and direct access to brand content in a grid layout. Live AI features allow users to point their camera at objects or landmarks for immediate context and help.
The initial rollout targets users in the US and Canada, with gradual expansion to Ray-Ban Meta and Oakley Meta glasses and broader app integration. Muse Spark is designed as a compact, fast model capable of advanced reasoning in fields like science, math, and health, along with visual coding and multimodal perception. This distinguishes it from previous Meta models and from competitors by enabling real-world contextual understanding and multitasking through subagents.
Early reactions from AI experts frame Muse Spark as a step toward more contextual personal assistants. Meta Superintelligence Labs, the team behind the release, rebuilt Meta's AI stack to deliver faster, more capable models, aiming to usher in personal superintelligence while adding safety and privacy safeguards.