macroraptor

A Screen the Size of the Moon

hero

It took 80 years, but we have reached the point of diminishing returns for rectangular pixel grids. From the mass adoption of cathode ray tubes through LCDs to modern OLED portables, we've witnessed ever bigger, brighter, and more detailed displays. Yet for work, I find myself increasingly craving displays that are more ergonomic and portable.

Our main alternative to dots in fixed planes is virtual reality. VR has pitched itself as the display successor since the 90s, and every few years "VR is taking over" gets shouted from the rooftops. Taking over what? I dream of working from the beach, dip my toes into the latest headset, and come back to my monitor disappointed.

Maybe this time is different. The industry has everything we need, and just needs to put it together.


Everyman's Virtual Office

I'm a bit scared of the future where we all walk around with projectors glued to the inside of our corneas, cyber-wallpapering a dilapidated world.

As a stopgap, I have an aggressively unsexy use case for VR ... replacing a display for work.

My benchmark for "one big monitor wherever I go" is 27 inches at 2560x1440 resolution, a display I could comfortably work at indefinitely. Here are my criteria for replacing that sort of rectangle:

wide

The human eye has a field of view (FOV) of roughly 210x130 degrees (horizontal x vertical). Current commodity headsets cap out at about 100x100. I don't want tunnel vision while getting my work done, and I want to see as much of the world around me as possible in camera passthrough.
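As a rough back-of-envelope, treating FOV as a flat angular rectangle (a simplification, since true coverage is a solid angle), here is how much of the human visual field those headset numbers actually cover:

```python
# Fraction of the human visual field covered by a headset FOV.
# Treats FOV as a flat angular rectangle -- an approximation,
# but good enough for intuition about "tunnel vision".

HUMAN_FOV = (210, 130)  # degrees, horizontal x vertical

def fov_coverage(h_deg, v_deg):
    """Fraction of the human angular field a headset FOV spans."""
    return (h_deg * v_deg) / (HUMAN_FOV[0] * HUMAN_FOV[1])

print(f"commodity headset (100x100): {fov_coverage(100, 100):.0%}")  # ~37%
print(f"Boba 3 prototype (180x120):  {fov_coverage(180, 120):.0%}")  # ~79%
```

So today's commodity headsets show you just over a third of your natural visual field, which is exactly why passthrough feels like looking through a porthole.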

In 2025 Meta demo'd the Boba 3 prototype, providing 180x120 coverage. It uses dual-element pancake lenses with a high-curvature reflective polarizer layer. Internal reflection allows for compact optics that fit the form factor of a shippable headset.

dense

Remember Apple's "retina display", introduced with the iPhone 4? At typical viewing distances, the human eye resolves roughly 60 pixels per degree (PPD). Simulating my preferred monitor calls for about 57 PPD (27" 1440p viewed at 30"), but consumer headsets only provide about 25-40.
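That 57 PPD figure falls out of a small-angle approximation: pixels per degree ≈ pixels per inch × viewing distance × π/180. A sketch (deriving panel width from the diagonal and aspect ratio is my own back-of-envelope, not a vendor spec):

```python
import math

def pixels_per_degree(diag_in, px_w, px_h, view_dist_in):
    """Approximate PPD at screen center, via the small-angle
    approximation: one degree at distance d spans ~d * pi/180 inches."""
    aspect = px_w / px_h
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # ~23.5" for a 27" 16:9 panel
    ppi = px_w / width_in
    return ppi * view_dist_in * math.pi / 180

# 27" 2560x1440 monitor viewed from 30 inches away:
print(round(pixels_per_degree(27, 2560, 1440, 30)))  # ~57
```

Note the dependence on viewing distance: lean in closer and the same panel delivers fewer pixels per degree, which is why a monitor's comfort zone matters as much as its spec sheet.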

Meta's Tiramisu prototype (2025) delivered 90 PPD via micro-OLED panels, but its three-element glass stack limits FOV to a tiny 33 degrees. Pancake optical paths drop contrast and introduce blur at high pixel densities, which forced Tiramisu to swap the pancake for a higher-quality refractive glass stack. The upcoming Tiramisu 2 will chase 60 PPD and a 90x90-degree FOV through a thinner design that bends light with microscopic surface gratings instead of curved glass.

far

Our eyes do two things when focusing on an object. They rotate to point at it (vergence), and our lenses squeeze to match its depth (accommodation). These are neurologically coupled, but contemporary VR headsets break the coupling. Vergence follows virtual depth, but accommodation stays locked at the display's fixed focal plane near 1.5m, so users feel the mismatch as eyestrain. Passthrough makes this worse, since a keyboard close by and a wall far away land on the same focal plane.
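Optometrists measure this mismatch in diopters (1 / distance in meters). The 1.5 m fixed focal plane comes from the text above; the rest is my own back-of-envelope illustration:

```python
FOCAL_PLANE_M = 1.5  # typical fixed focal distance of current headsets

def accommodation_conflict(virtual_depth_m):
    """Diopter gap between where the eyes converge (virtual depth)
    and where the lenses must focus (the fixed display focal plane)."""
    return abs(1 / virtual_depth_m - 1 / FOCAL_PLANE_M)

# A passthrough keyboard at 0.5 m vs a wall at 3 m:
print(f"keyboard: {accommodation_conflict(0.5):.2f} D")  # large conflict up close
print(f"wall:     {accommodation_conflict(3.0):.2f} D")  # milder conflict at range
```

The conflict grows fast at near distances, which is why desk work (keyboard, coffee mug, notepad) is the worst case for fixed-focus headsets and why varifocal hardware matters for an office use case.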

Meta's Butterscotch Varifocal (2023) achieves varifocal behavior by mechanically moving the displays in the style of a DSLR autofocus system. An eye tracking system rapidly detects where the user's eyes converge, and tiny actuators reposition the displays to match that depth.

wide, dense, and far

No prototype yet combines all three axes. Tiramisu hits retinal density but only 33 degrees of FOV. Boba 3 hits FOV at lower density. Butterscotch pairs density with varifocal but tops out at 50 degrees. Integration is the remaining engineering, not new invention.

The first integrated headsets will be expensive and heavy. Boba 3 alone weighs two pounds and tethers to an external computer with the power consumption of a small microwave oven. Stacking retinal-grade displays and wide varifocal optics into the same device only adds mass, heat, and compute load.

Once individual capabilities are proven possible, the inexorable march of miniaturization and commoditization tends to follow, as it has repeatedly in consumer electronics. Meta's research-to-product lag historically runs 5 to 7 years, so wide and dense headsets should ship in consumer form around 2030, and I hope wide-dense-far integration follows around 2032.


Who will get us there first?

Having produced every single one of the frontier-pushing prototypes above, Meta Reality Labs is far ahead of the public-facing competition. They had better be, having burned more than eighty billion dollars since 2020. A close second in my eyes is Apple, who never demos prototypes yet has actually managed to ship (at a ludicrous price) the productivity-focused Vision Pro.

Otherwise, specialist startups around the world push frontier VR capabilities along singular dimensions. My intuition is that their work will be rolled up via acquisition by Apple, Meta, or a leading AI lab on the way to eating the world.

Whoever gets us there first, I can't wait for the day I strap on my headset and type on a faraway screen the size of the moon.

pilots