Optimistic renders of what Apple Glass could look like - Image Credit: AppleInsider
Apple is reportedly designing specialized chips tailored to the needs of Apple Glass.
Apple's internal silicon design team is behind the creation of the chips used in the Apple Silicon line. After handling processing on iPhone, iPad, and Mac, the team is now working on chips for other major product categories.
People familiar with the plans told Bloomberg on Thursday that the team is making chips not only for new Macs, but also for smart glasses.
Apple Glass processing
The challenge Apple faces with Apple Glass is that its silicon has to fit into a very lightweight design, one with tight constraints on weight, physical size, and power consumption. A custom chip is evidently part of the solution.
According to report sources, the processor for the smart glasses will be based on chips used in the Apple Watch. The versions used in the wearable require less energy than their counterparts in Apple's mobile devices, in part because they have been customized to omit unneeded elements.
The smart glasses chip, meanwhile, will also have to process feeds from multiple outward-facing cameras on the frames.
Mass production of the chips, handled by chip partner TSMC, is expected to start by the summer of 2026, with a hardware debut by the end of 2026 or into 2027.
AR and non-AR
The Apple Glass hardware itself may not offer the highly touted augmented reality functionality, as a first smart glasses release could leave that element out entirely. Even after years of development, augmented reality remains impractical to implement in smart glasses.
Apple is reportedly working on two versions of smart glasses, supposedly under the codename N401. A non-AR version will follow in the footsteps of hardware like Meta's Ray-Ban partnership, which can handle calls and photography, and works with a digital assistant.
For its non-AR glasses, Apple is still exploring the possibility of using cameras on the headwear to scan the environment. With cameras potentially coming to the Apple Watch and AirPods as well, this could help expand the external awareness of Apple's AI.
Chips being made for the AirPods to handle this functionality are allegedly named "Glennie," while a counterpart for the Apple Watch is apparently "Nevis."
A big chip departure
The design of custom chips for smart glasses makes sense, especially when you consider Apple's current chip landscape.
For the Apple Vision Pro, Apple uses a dual-chip design: the headset relies chiefly on the M2 for application processing, while the accompanying R1 is similarly high-powered and processes input from the plethora of onboard cameras, along with positioning data and motion tracking, at extremely high speed with minimal latency.
This is all fine when you're dealing with a massive amount of processing in a not-exactly-small package, but it becomes a problem when you're talking about smart glasses.
Smart glasses are a product category that relies on a small, spectacle-like frame, with little space available for components and a need to minimize bulk. That means any chip has to be small enough to fit into a typical pair of glasses.
Then there's the problem of available resources, which will be considerably scarcer than in a VR headset, simply to maintain the aesthetics of spectacles. With reduced battery capacity and no opportunity for active cooling, a chip's potential is severely constrained.
One way around this is to offload processing to another device, such as an iPhone in the user's pocket. An iPhone has greater processing potential thanks to its much larger battery, and it can feasibly remain cooler for longer, leaving the on-glasses chips to handle other local tasks.