Zhitong Finance App has learned that Meta Platforms Inc. (META.US), the parent company of Facebook and Instagram, is accelerating development of high-end, luxury versions of its popular smart glasses. The new models are planned to carry a more advanced and more capable AI model system, add gesture controls, and include a high-definition display for photos and other applications. According to people familiar with the matter, Meta will launch its first AI smart glasses with a high-definition electronic display as early as the end of this year; the product is seen as a key step in Meta's effort to challenge the dominance of the iPhone and other Apple-ecosystem mobile devices in the consumer electronics market.
According to reports, the device, codenamed "Hypernova," is expected to be priced above $1,000, and possibly as high as $1,300 to $1,400. People familiar with the matter said the final price may not be set until close to launch. Meta's most popular AI smart glasses model, the Ray-Ban Meta, starts at $299 and has sold better than expected; Meta said it will continue to sell this entry-level model and hopes to use its popularity to steer users toward higher-end products.
In addition, American tech giants such as Amazon (AMZN.US) have also pledged to launch new AI-powered smart glasses to compete with the social media giant in this field.
The global macroeconomy and the shift toward digital lifestyles are jointly driving consumers' growing demand for efficient, convenient intelligent information services anytime, anywhere. As lightweight wearable terminals, smart glasses can deliver immersive, customized, real-time information services in scenarios such as work, travel, entertainment, and health monitoring. The introduction of generative AI further broadens these use cases, for example by offering personalized suggestions based on real-time data analysis and providing accurate real-time translation, thereby improving travel, quality of life, and personal productivity.
In the AI smart glasses industry, which Meta dominates, the market widely views these devices as the edge consumer electronics products best positioned to benefit from the latest generation of large AI models: by drawing on continuously updated cloud-based and on-device generative AI models, they can deliver a more intuitive, immersive, and personalized AI experience. The category may be approaching its "Nvidia moment", the point at which a company's stock price and sales rise in tandem.
Meta's thousand-dollar Hypernova smart glasses revealed: gesture control and a monocular display, taking aim at the Apple ecosystem
The sharp increase in the price of Meta's new smart glasses is almost entirely due to its new high-definition display: a monocular screen located in the lower-right quadrant of the right lens. Information is shown only in front of the wearer's right eye rather than across the entire lens, and it is viewed most clearly when the wearer looks slightly downward.
People familiar with the matter said the Meta smart glasses team has begun developing a second-generation product (codenamed Hypernova 2). The biggest difference is a more advanced, more intuitive binocular display system: two high-definition screens that present information to both eyes. According to the same sources, the second-generation device is tentatively scheduled to launch in 2027.
Smart glasses with displays will mark another milestone on Meta's path toward true "augmented reality (AR) + AI" glasses, technology the company previewed last year.
Hypernova's early prototype offers a look at how the glasses are expected to operate after launch:
On startup: the display shows a boot screen with the logos of Meta and its partners (such as Qualcomm, which supplies the glasses' core chip)
Main interface: a horizontal row of circular icons, similar to the app dock on Apple's smart devices or the application layout of the Meta Quest mixed reality headset
Built-in apps: dedicated glasses apps such as camera, photo viewing, and map navigation. For notifications, the glasses can receive real-time messages from paired smartphone apps, including Messenger and WhatsApp
Other features resemble the current Ray-Ban Meta Wayfarer-style smart glasses, including capturing photos and video, invoking AI functions through the built-in microphone, connecting to a smartphone, and playing music. The new version is expected to continue to rely heavily on the Meta View smartphone app.
Like Meta's other new consumer electronics devices, the glasses will run a deeply customized version of Google's Android operating system; whether they will include a dedicated app store is not yet known. Users can also operate the glasses via capacitive touch on the temple, sliding along it to browse apps or photos and tapping to select content.
Meta also plans to launch a neural wristband (codenamed Ceres) for the first time, letting wearers control the glasses with gestures, such as rotating the hand to scroll through content and pinching the fingers to make selections. The accessory is intended to be sold bundled with the glasses.
Meta also plans to upgrade the camera system: the company considers the current 12-megapixel camera roughly equivalent to that of the 2019 iPhone 11, while the new model aims to be comparable to the 2021 iPhone 13. A foldable, triangular-prism-shaped storage case (codenamed Heres) will also be launched.
The Hypernova smart glasses are still a few months from their official launch, and current plans may change. Meta is known for modifying products and even canceling projects late in development: about 18 months ago, the company scrapped the release of a camera-free version of the Ray-Ban Meta (codenamed Luna) that had been designed to cut costs and improve privacy.
In addition to Hypernova, Meta is reportedly developing screenless smart glasses known as Supernova 2. According to media reports, they operate much like the current Ray-Ban Meta but use an Oakley eyewear design aimed at sports scenarios such as cycling, and have recently begun testing in public.
Hypernova 2, scheduled for launch in 2027, will partially overlap with Meta's development of true "AR + AI" smart glasses. The latter must overlay interactive images, video, and information onto the real world, a far higher technical bar than Hypernova's simple heads-up high-definition display and a much more expensive one to develop.
Media reports previously indicated that Orion, the prototype AR glasses Meta unveiled last year, is currently used only for internal software and application development testing and may eventually be opened to developers; the first product aimed at the broad consumer market is likely its successor, codenamed Artemis, expected no earlier than 2027.
People familiar with the matter said Meta's Reality Labs division, which develops its AI smart glasses and AR consumer electronics, is still debating product plans and related details internally, including whether to eventually merge the Artemis and Hypernova product lines or launch them separately at different price points.
Smart glasses may be the best on-device carrier for artificial intelligence technology
According to market research firm Counterpoint Research's "Global Smart Glasses Model Shipment Tracking Report", global smart glasses shipments grew 210% year on year in 2024, driven mainly by strong demand for the Ray-Ban Meta. As a result, global shipments broke through the 2 million-unit mark for the first time, an unprecedented rate of growth.
By the agency's definition, the Ray-Ban Meta is the first pair of AI smart glasses on the market: it is equipped with an SoC with an integrated NPU plus advanced camera and audio components. These enable core smart functions such as photo and video capture and audio playback, and support a wide range of applications using on-device AI algorithms, smartphone-side AI, and Llama AI models.
The Ray-Ban Meta redefined AI smart glasses in both concept and user experience. Before its release, early smart glasses focused on hardware improvements such as connectivity, audio, and battery life to address sound leakage, inaccurate voice pickup, and limited wearing time; although those upgrades also expanded voice-assistant functionality, they did not fundamentally change the user experience. After the Ray-Ban Meta launched, demand in the global smart glasses market surged: according to Counterpoint's "Global Smart Glasses Model Shipment Tracking Report", shipments grew 156% year on year in 2023 and 210% in 2024, with Meta taking more than 60% of the market in 2024.
With the rapid development of edge computing, 5G networks, and artificial intelligence, on-device consumer electronics are increasingly able to process data in real time and interact seamlessly with cloud AI models. This allows smart glasses not only to collect real-time environmental data (vision, sound, location, and so on) but also to run generative AI features through a split workload: complex AI tasks are offloaded to cloud computing systems, while real-time or sensitive tasks, such as voice interaction, real-time translation, augmented reality navigation, and contextual information overlay, are handled locally after preliminary on-device processing.
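As a rough illustration of that split (this is only a sketch, not based on any published Meta interface; the task names and functions below, such as LOCAL_TASKS, run_on_device, and run_in_cloud, are hypothetical), a device-side dispatcher in Python might route latency-sensitive work to a small local model and heavier generative work to a cloud endpoint:

from dataclasses import dataclass

# Hypothetical task categories for a pair of AI smart glasses.
LOCAL_TASKS = {"voice_command", "live_translation", "ar_navigation"}
CLOUD_TASKS = {"image_captioning", "scene_summary"}

@dataclass
class Task:
    kind: str       # e.g. "live_translation"
    payload: bytes  # raw audio or image frame captured on-device

def run_on_device(task: Task) -> str:
    # Stand-in for a small on-device (NPU) model: low latency, data stays local.
    return f"[local] {task.kind}: {len(task.payload)} bytes processed"

def run_in_cloud(task: Task) -> str:
    # Stand-in for a call to a large cloud model: higher latency, more capable.
    return f"[cloud] {task.kind}: {len(task.payload)} bytes uploaded"

def dispatch(task: Task) -> str:
    # Real-time or privacy-sensitive tasks stay local; heavy generative tasks go to the cloud.
    if task.kind in LOCAL_TASKS:
        return run_on_device(task)
    if task.kind in CLOUD_TASKS:
        return run_in_cloud(task)
    raise ValueError(f"unknown task type: {task.kind}")

print(dispatch(Task("live_translation", b"\x00" * 320)))
print(dispatch(Task("scene_summary", b"\x00" * 4096)))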

Counterpoint said that with the success of the Ray-Ban Meta, a new wave of AI smart glasses has emerged, ranging from reference designs offered by supply chain companies to commercial products launched by eyewear brands since late 2024. The agency expects smartphone leaders such as Xiaomi, Samsung, and Transsion to launch their first AI smart glasses in 2025, with more smart device companies likely to enter the market in 2025 and 2026. Counterpoint projects 60% year-on-year growth for the global smart glasses market in 2025 and a compound annual growth rate above 60% from 2025 to 2029.