How augmented reality works: the WaveOptics story
WaveOptics develops high-performance, highly manufacturable waveguides through which people experience augmented reality.
What does this mean in practice?
The company designs and produces waveguides for AR glasses that offer a high field of view, full colour and – one of the key selling points – a scalable, lower-cost manufacturing process.
What are waveguides?
Waveguides are widely recognised as the key technology that makes augmented and mixed reality work. They guide waves of light from a projector to the glass in front of the user’s eye, so that virtual objects can be viewed comfortably in a compact form factor – for example, through a pair of smart glasses.
Last year the company successfully closed a £12m Series B funding round – the year’s biggest UK funding round in AR hardware – attracting investment from Octopus Ventures, Touchstone Innovations plc, Robert Bosch Venture Capital GmbH and China’s Gobi Ventures. The company is also in discussions with several Tier 1 OEMs and ODMs in the US and Asia, though it is not currently divulging names.
Background and history
The company was founded by Sumanta Talukdar and David Grey, and started commercial operations in 2014. The recent high-profile appointment of David Hayes as CEO has created interest in the industry, given his extensive experience in hardware and technology innovation across research, product development and manufacturing, and his previous position at DAQRI, the smart helmet and glasses producer.
Heralded by the industry as a significant development – VRFocus called it “the beginning of a new growth phase for the firm” – the company confirms the appointment marks the start of its move towards wider commercialisation and the strengthening of a management team of AR experts with over five decades of combined experience.
WaveOptics chairman Martin Harriman says: “WaveOptics is reinventing the AR market by developing a series of new AR display tech that enables a wider field of view and brighter full colour images — a unique combination in today’s market.”
How does the technology work?
Marketing Director Suzie Smith explains that the design of the waveguide is of critical importance in creating a very large eye-box that enables a high field of view. “Our eye-box is very large, which means that AR wearables can be made for 98% of the population, covering all head sizes and eye shapes, which also makes the product efficient to manufacture at scale – it really distinguishes our product in the market.”
Within this large eye-box and high field of view, diffractive waveguides combine real and virtual worlds. Text and imagery are projected by the near-eye displays, presenting computer images overlaid on the real world. The light is transmitted along the waveguide, via total internal reflection, towards the output region where it is guided towards the eye. This process, known as 2D pupil expansion, allows a small light engine to support a large eye-box. This enables the AR device using the waveguides to be small and light, with the ability to accommodate different head sizes and eye separations.
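The total internal reflection that traps light inside the waveguide is standard optics: light travelling in glass is completely reflected at the glass–air boundary once its angle of incidence exceeds the critical angle given by Snell’s law. As a minimal illustrative sketch (the refractive index of 1.5 is a typical value for glass, not a published WaveOptics figure):

```python
import math

def critical_angle_deg(n_core: float, n_clad: float = 1.0) -> float:
    """Angle of incidence (in degrees) above which light is totally
    internally reflected at the core/cladding boundary (Snell's law)."""
    if n_clad >= n_core:
        raise ValueError("TIR requires n_core > n_clad")
    return math.degrees(math.asin(n_clad / n_core))

# For a typical glass waveguide (n ~ 1.5) surrounded by air (n = 1.0):
theta_c = critical_angle_deg(1.5)
print(f"critical angle ~ {theta_c:.1f} degrees")  # ~41.8
```

Any ray bouncing along the glass at more than about 42 degrees from the surface normal stays confined, which is what lets the waveguide carry the projected image from the light engine to the output region without any moving parts.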
The waveguide consists of three panes of glass with microns-thick air gaps between each pane. It looks similar to a pair of glasses except that it’s layered. There are no active electronics or moving parts in the actual waveguide.
Combined, two light engines and two waveguides form a binocular module. WaveOptics is developing two variations of these modules, with different light engine technologies, each offering a 40-degree field of view (FoV). The company has also demonstrated a monocular prototype with a higher FoV.
What challenges have they overcome on the journey?
Not only does the waveguide module need to give the user a great viewing experience, it has to do so with a manufacturing process capable of high volumes. WaveOptics’ main challenge has been to address both simultaneously.
What impact has the technology had on the sector?
The product differs fundamentally from others on the market, which has created substantial interest in the sector. Conventional diffractive waveguides have three distinct grating regions – an input grating, a turn grating and an output grating – making them more complex and more expensive.
In contrast, WaveOptics waveguides have a simpler optical structure with just two gratings: an input grating and an output grating. This approach, called two-dimensional pupil expansion, gives WaveOptics the ability to create a large eye-box – the volume of space within which the augmented world is formed. In simple terms, it allows the company to deliver a large eye-box AR experience at a lower cost and a faster rate of manufacture.
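The role of the input grating is to diffract the projected light to an angle steep enough for total internal reflection, so it travels along the glass to the output grating. The governing relation is the standard grating equation, n·sin(θ) = m·λ/d for light arriving at normal incidence. A minimal sketch with illustrative numbers (532 nm green light, a 450 nm grating period and n = 1.5 are hypothetical values chosen for the example, not WaveOptics’ actual design parameters):

```python
import math

def diffraction_angle_deg(wavelength_nm: float, period_nm: float,
                          n: float = 1.5, order: int = 1) -> float:
    """Diffraction angle inside the glass for light at normal incidence,
    from the grating equation: n * sin(theta) = order * wavelength / period."""
    s = order * wavelength_nm / (n * period_nm)
    if abs(s) > 1:
        raise ValueError("no propagating diffraction order")
    return math.degrees(math.asin(s))

# Hypothetical example: green light (532 nm) hitting a 450 nm-period
# input grating on glass with refractive index 1.5
theta = diffraction_angle_deg(532, 450)
print(f"diffracted angle inside glass: {theta:.1f} degrees")
# This exceeds the ~41.8 degree critical angle for a glass/air boundary,
# so the diffracted light is trapped by total internal reflection.
```

The output grating then reverses the process, diffracting the trapped light back out of the glass towards the eye at many points across the surface, which is how a small projected image is expanded into a large eye-box.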
“AR is one of the most exciting growth markets today, however, commercialisation of the market has been very slow to date. WaveOptics’ patented technology is the core ingredient required to unlock mass market adoption of AR. Its diffractive waveguides ensure high-performance, high field of view see-through displays providing for the first time scalability for real world applications across all leading market segments – enterprise, prosumer, and consumer.”
— David Hayes, CEO, WaveOptics
Tech news and review website Tom’s Hardware differentiates the offering from the Vive and Rift headsets as follows: “When you strap on a VR HMD, you’re essentially putting a small monitor right in front of your face. When you run a VR experience for the headset, all the graphics have to be fully rendered at high resolutions and framerates and produced on that little display.”
Comparing the process to what happens between a gaming PC and monitor, the review goes on to explain that the waveguide model works in the opposite way. “Instead of rendering graphics to a hi-res display, it uses what’s called a light engine to produce a tiny image that’s projected into the waveguide, which then allows you to see the image. Because the rendered image is so small, it requires little in the way of compute resources, which enables lightweight and slim eyeglass-type designs.”
When will we see it on the market?
According to Suzie, the technology will initially be adopted by the enterprise sector for industrial use cases. “There are many applications of this technology in a range of sectors, including aviation, manufacturing and construction, where there is a specific ROI for productivity or yield benefit. AR will then start to become more prevalent in the prosumer market, where the focus is likely to be on remote assistance. From 2020 onwards, we’ll see AR in the mass consumer market.” WaveOptics say their customers plan to have end-user products on the market by the end of 2019 into early 2020.
What does the future look like?
If predictions and trends are any indication, the future looks very bright indeed. The company recently featured in Growth Enabler’s list of the top 10 AR and VR start-ups in the UK. In its AR and VR Market Pulse Report, Growth Enabler noted the team’s expertise in AR, design and system engineering, as well as the fact that there is limited competition for these optics in the UK market. It’s an opportunity for the company to accelerate growth, as well as to lead and define industry standards in hardware manufacturing for the AR market.
WaveOptics’ founders Sumanta Talukdar and David Grey were also included in the Fresh Business Thinking Shift 100 list, recognising retail technology entrepreneurs who are bringing new ideas and products to market, creating more integrated customer experiences and leading the move away from traditional “bricks and mortar” retail towards e-commerce.
Fresh Business Thinking says: “These entrepreneurs have been selected because they are dynamic, agile and shifting gears. They have brought an innovative product to market, cleverly harnessed a technology or process, led the revolutionary execution of a new technology and disrupted their markets.”
Wearable tech: where is the market going?
David Hayes explains that even if the most amazing pair of smart glasses were available today, the market wouldn’t take off because the ecosystem – the apps and software – is not yet ready, so the hardware wouldn’t get used.
He believes that the ecosystem – the apps and the hardware – will come together for mass adoption from 2020/2021. Merging mobile and AR using 5G technology is also a really exciting prospect, because the speed of the 5G network will mean that “connected” AR devices will be able to render the content from the cloud and stream services in just two to three years’ time.
Words: Bernadette Fallon