Meta to release Muse Spark in Voice Mode and Meta Glasses

What's new? Meta launched Muse Spark to power Meta AI across apps such as WhatsApp, Instagram, Facebook, and Messenger. It offers faster voice replies, live camera recognition, and a shopping mode.


Meta has launched Muse Spark, the foundational model now powering Meta AI across the Meta AI app and meta.ai, with expanded integration into WhatsApp, Instagram, Facebook, Messenger, Threads, and AI glasses. This rollout introduces faster voice responses, smarter shopping assistance, and real-time visual recognition through the device camera.

Voice conversations let users speak naturally with Meta AI, switch topics or languages mid-discussion, and receive on-the-fly image generation and relevant recommendations. Shopping mode now aggregates Facebook Marketplace and internet-wide listings, offering map-based browsing, price and style filters, and direct access to brand content in a grid layout. Live AI features allow users to point their camera at objects or landmarks for immediate context and help.

The initial rollout targets users in the US and Canada, with gradual expansion to Ray-Ban Meta and Oakley Meta glasses and broader app integration. Muse Spark is designed as a compact, fast model capable of advanced reasoning in fields such as science, math, and health, including visual coding and multimodal perception. According to Meta, this distinguishes it from previous Meta models and competitors by enabling real-world contextual understanding and multitasking through subagents.

Early reactions from AI experts frame Muse Spark as a step toward more contextual personal assistants. Meta Superintelligence Labs, the team behind the release, rebuilt the AI stack for faster, more capable models, aiming to usher in personal superintelligence while implementing safety and privacy safeguards.
