Meta Unveils $1,300 AI Smart Glasses to Challenge Apple
Meta Platforms Inc., the parent company of Facebook and Instagram, is accelerating development of a high-end, luxury version of its popular smart glasses. The new model is expected to feature advanced AI capabilities and gesture control, along with a high-definition screen for displaying photos and other applications. According to insiders, Meta plans to launch its first AI smart glasses with a high-definition electronic screen by the end of this year. The product is seen as a crucial step in Meta's challenge to Apple's dominance in the smartphone and mobile consumer electronics market.
The new smart glasses, codenamed "Hypernova," are expected to be priced above $1,000, with a potential range of $1,300 to $1,400. The final price is likely to be confirmed closer to the release date. Meta's current popular AI smart glasses, the Ray-Ban Meta, start at $299 and have exceeded market expectations. Meta plans to continue selling this entry-level product while leveraging its popularity to drive users towards more premium models.
Other tech giants, such as Amazon, are also preparing to release AI-powered smart glasses to compete with Meta in this growing market. The global shift towards digital lifestyles and rising demand for convenient, real-time information services have driven the growth of the smart glasses market. These devices offer immersive, personalized, real-time information services across scenarios including work, travel, entertainment, and health monitoring. The integration of generative AI is expected to further enhance these applications, providing personalized recommendations, real-time translation, and improved overall efficiency.
For Meta, the AI smart glasses market represents a significant opportunity to benefit from advances in large AI models. By combining cloud-based and on-device generative AI models, Meta aims to give users a more intuitive, immersive, and personalized AI experience. This could mark the company's "NVIDIA moment"—a period of rapid growth in both stock price and sales.
The high price of Meta's new smart glasses is largely due to its innovative high-definition screen configuration. This single-eye display is located in the lower right quadrant of the right lens, showing information only in front of the wearer's right eye, with the clearest view when looking slightly downward. Insiders reveal that Meta is already developing a second-generation product, codenamed Hypernova 2, which will feature a more advanced dual-eye display system with two high-definition electronic screens.
Meta's smart glasses are set to become a milestone on the company's path towards true "AR+AI" smart glasses, related technologies for which it previewed last year. The Hypernova prototype offers a glimpse of how the glasses will operate after launch: on startup, the screen displays a boot screen with Meta and partner logos. The main interface features horizontally aligned circular icons, similar to Apple's app dock or the layout of Meta's Quest mixed reality headsets. The glasses will include dedicated applications for photography, viewing photos, and map navigation, with real-time notifications from paired smartphone apps such as Messenger and WhatsApp.
Other features—capturing real-time images and videos, invoking AI functions via the built-in microphone, handling smartphone calls, and playing music—will be similar to the current Wayfarer-style Ray-Ban Meta smart glasses. The new version is expected to rely heavily on the Meta View smartphone application. The glasses will run a deeply customized version of Google's Android operating system, possibly with an exclusive app store. Users will operate the glasses through capacitive touch on the temple, sliding to browse applications or photos and tapping to select specific content.
Meta is also planning to introduce the "Neuro Wristband" series, allowing users to control the smart glasses with gestures, such as rotating the palm to scroll through content or pinching fingers to select options. This accessory is expected to be bundled with the glasses. Meta is also upgrading the glasses' camera system, aiming to match the image quality of the 2021 iPhone 13 with a new 12-megapixel camera. Additionally, a triangular folding storage case, codenamed Heres, is in development.
While the Hypernova smart glasses are still months away from release, current plans may yet change: Meta is known for significantly reworking products in development, or even canceling them outright. About 18 months ago, the company unexpectedly canceled the release of a camera-free version of the Ray-Ban Meta, codenamed Luna, which was designed to reduce costs and enhance privacy. Apart from Hypernova, Meta is also developing screenless smart glasses codenamed Supernova 2, which share the operating logic of the current Ray-Ban Meta but adopt an Oakley design targeting scenarios like cycling. Public environment testing for this product has already begun.
The Hypernova 2, scheduled for release in 2027, will partially overlap with the development of Meta's true "AR+AI" mode smart glasses. These glasses will overlay interactive images, videos, and information onto the real world, requiring significantly higher technology and development costs than Hypernova's simple high-definition display. Meta's AR mode smart glasses prototype, Orion, is currently used for internal software and application development testing and may eventually be opened to developers. The first consumer-facing product, Artemis, is expected to be released no earlier than 2027.
Meta's Reality Labs division, responsible for developing AI smart glasses and AR consumer electronics, is still discussing product development plans and details, including whether to merge the Artemis and Hypernova product lines or release them at different price points. Smart glasses are poised to become the ideal end-side carrier for AI technology, offering real-time data processing and seamless interaction with cloud-based AI models. This capability allows smart glasses to collect real-time environmental data, such as visual, audio, and location information, and process it locally or in conjunction with cloud-based AI models to enable real-time generative AI applications, such as voice interaction, real-time translation, AR navigation, and contextual information overlay.