ULTRAHAPTICS: INNOVATIVE TECHNOLOGY TO MAKE VIRTUAL REALITY FEEL MORE REAL
“The purpose of Ultrahaptics is to change the way we interact with machines and the devices around us forever.”
— Tom Carter, CTO and co-founder, Ultrahaptics
The concept behind Ultrahaptics was launched as an undergraduate project at the University of Bristol that Tom Carter, company CTO and co-founder, developed into a PhD thesis. Today he is working on one of the most ground-breaking innovations in the immersive technology world, and the tech company was recently selected by Adweek as one that “brands should know about” at the Consumer Electronics Show (CES) 2018.
Advertising itself as working to create “the most remarkable connection between people and technology”, Ultrahaptics is using ultrasound to project sensations onto users’ hands, pushing through – it says – “an era of change, allowing people to feel and control technology like never before”.
Basically, the technology lets users touch and feel what they can see in augmented and virtual reality and also provides sensations for flat 2D objects. This ranges from interacting with buttons and dials that are felt rather than seen to tracing the outline of invisible 3D objects or feeling the sensation of raindrops falling on your fingertips.
The word “haptic” comes from a Greek verb that means “to contact” or “to touch” and while there are several tech companies in the industry working with haptic feedback simulations, Ultrahaptics says it’s the only one that lets people feel and manipulate virtual objects in the air.
Designing the concept
Looking back to the early days of the project, Tom points to Microsoft Kinect for the Xbox as the pioneering change in the industry that laid the groundwork for his vision – the first time users could control a computer without touching it, for less than £10,000.
“I started to think about how to enable a more fluid and intuitive way of interacting with the world around us. My university supervisor had an idea about using sound-waves to simulate sensations and so I started to investigate the concept.”
Together with that supervisor, computer science professor Sriram Subramanian, and Benjamin Long, a research assistant from the university lab, Carter formed Ultrahaptics in 2013. The first thing they did was book their place at CES in Las Vegas, the global gathering place for innovators and pioneering thinkers. With 4,000 consumer technology companies exhibiting to an audience of just under 200,000, their main objective, explains Tom, was to find out who would be interested in using their technology and how.
The answer, it turned out, was everybody. That didn’t narrow down the market as they’d hoped it would, he admits, but at the same time it was very good news.
“We gained a lot of insights from potential customers,” reveals Tom. “People had problems that we could solve.” Such as the big players in the automotive world who were developing sensor controls for cars. Pressing physical buttons while driving a car is too dangerous but using touch-controls is also tricky, as it takes the driver’s attention off the road. Jaguar Land Rover is now planning to integrate this technology into a gesture-control system for its cars.
Developing the technology
The basic technology is ultrasound, which vibrates at a frequency too high for humans to hear. The innovation lies in combining the sound waves: each one is timed so that they all arrive at the same point in the air at the same instant, where they merge to create enough force to push and displace the user’s skin, producing a discernible skin vibration. The ultrasound waves construct 3D objects in the air that users can feel. “We’re manipulating sound waves to create sensations,” says Tom.
Speaking to press in the past, Tom has admitted that the hardware they use – an array of ultrasound emitters hooked up to an off-the-shelf gesture-control platform – is nothing special. “The clever bit is in our software. It is actually in the algorithms of how you drive the emitters to create the sensations.” This software can be programmed to find the user’s hand and direct the sound to it to create the sensations. It can also create multiple sensations simultaneously, allowing different hands to experience different things.
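The kind of calculation such software performs can be sketched in a few lines. The snippet below is a simplified illustration, not Ultrahaptics’ actual algorithm: it assumes a small grid of 40 kHz emitters (a typical ultrasonic transducer frequency) and computes the phase offset each one needs so that its wave arrives at a chosen focal point in step with the others, where the waves combine into a single pressure spot.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 °C
CARRIER_FREQ = 40_000.0  # Hz; a common ultrasonic transducer frequency

def phase_delays(emitters, focal_point):
    """Phase offset (radians) for each emitter so that all waves
    arrive at focal_point in phase and combine constructively."""
    wavelength = SPEED_OF_SOUND / CARRIER_FREQ  # about 8.6 mm at 40 kHz
    fx, fy, fz = focal_point
    delays = []
    for ex, ey, ez in emitters:
        # distance from this emitter to the focal point
        dist = math.sqrt((fx - ex) ** 2 + (fy - ey) ** 2 + (fz - ez) ** 2)
        # advance the emitter's phase in proportion to the distance
        # its wave must travel, folded into one wavelength
        delays.append(2 * math.pi * (dist % wavelength) / wavelength)
    return delays

# a hypothetical 4x4 emitter grid with 1 cm pitch on the z = 0 plane
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
# focus 20 cm above the centre of the array
delays = phase_delays(grid, (0.015, 0.015, 0.20))
```

Moving the focal point then only requires recomputing the delays, which is why the same array can track a hand or create several sensations at once by interleaving focal points.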
“This elegant and simple technology was created using complex mathematics, yet is based on human nature.”
Their first foray into the public arena also secured the young company the funding it needed to turn its prototype into a robust platform. Gaining £600,000 in seed money from British VC firm IP Group Plc gave them the means to hire current chief executive Steve Cliffe in 2014, and in 2015 they secured a £10.1m Series A investment round, led by Woodford Investment Management. This gave them the chance to work with Neil Woodford, one of the most successful fund managers of the last 20 years.
Further investment came last year in the shape of a £17.9m Series B round of funding, from Dolby Family Ventures, Cornes, IP Group and Woodford – and a funding office in Tokyo has opened up Asian contacts and new market potential.
Starting off in the university in Bristol, then working from people’s houses and occasionally – says Tom – from the pub with a laptop, the company now employs just under 100 people in offices around the world. The headquarters remains in Bristol, with satellite offices in the San Francisco Bay Area, Munich and Singapore, as well as partners in Japan and South Korea.
Finding the right skills
With the majority of the team drawn from an engineering background, they are building the roles from scratch because, as Tom points out, “you would never be able to find an ultrasound haptics engineer in the market, because they didn’t exist until we created them”.
However, the process of recruitment has been less painful than he expected, mainly due to a few factors. “We’re based in Bristol which has traditionally been home to two key sectors, deep technology/engineering and the creative industries – and we draw in people from both of these. Also, we have developed a reputation as a cutting-edge technology company and a stimulating, fun place to work, so people want to join our team.”
Dealing with the challenges
There have, of course, been challenges, mainly born of the pressures of growing a company very quickly – from three to 100 employees in just four years. “We are constantly devising new working processes that then have to be thrown away,” Tom reveals. “It’s been a challenge to go from having everybody in one office to having a team operating out of different time zones around the world. It’s been a big culture shift for the company and it affects how we communicate – I can’t just stand up from my desk and shout anymore.”
Their international customer base, based mainly in the US – and California in particular – gave rise to the possibility of moving the entire operation to the States. However, UK investment, particularly in the form of patient capital, has helped them to remain in the country, despite the fact that the market here is quite small.
Ultrahaptics’ latest collaboration, unveiled at the start of the year at CES 2018, is designed to connect people “more naturally” with digital experiences. Working with technology pioneers Meta Company and ZeroLight, the experience offers users the opportunity to intuitively touch and explore a virtual Pagani Huayra Roadster hypercar through an AR headset.
Combining advanced augmented reality, haptic feedback and real-time 3D visualisation technologies to create a holistic interaction, it demonstrates what the marketplace of the future might look like in the automotive industry, explains Tom. “Potential buyers can touch a virtual car, and can even raise the bonnet and feel the vibration of the V12 AMG engine.” They can also configure the vehicle and suspend individual components of the car for further inspection.
“Touch is intrinsic to our understanding of the world and how we interact with it. This shouldn’t be lost when we interact with digital media and virtual objects.”
— Anders Hakfelt, VP for Product and Marketing, Ultrahaptics
The company has also just signed an agreement with global gaming brand IGT to supply its mid-air haptic feedback solution for implementation in IGT’s TRUE 4D™ games on the CrystalCurve™ TRUE 4D™ cabinet, the first product of its type in the market. “Consumer gaming is huge in Asia and the US,” says Tom. “Using our kit, users can now see 3D objects in front of them without wearing glasses and touch them without wearing haptic gloves.”
A recent project at New Scientist Live 2017 in London showcased an interactive digital Star Wars poster that used ultrasound to let users experience “the Force” by passing their hand over it. This is opening up a new way for brands to connect with and engage consumers on a different level, and has attracted some impressive feedback: “I am used to ignoring adverts but this is different” and “It is having an effect on me, one that an advert wouldn’t have”.
Giving people the ability to “touch” invisible objects has provoked many amazed reactions – such as the responses the team recorded in a science museum in Bristol as people interacted with their technology.
What impact has it had on the industry?
“We do what we do in two sectors,” explains Tom. “We provide feedback for controlling devices without touching them in the real world and, in immersive realities, we allow people to ‘feel’ digital content. We’re working with some very big players in the industry, including Bosch, Harman and several big automotive brands.”
The team’s recent partnership with Dell, Nike and Meta is also proof that the sector is moving away from a siloed way of working and taking a more collaborative approach to creating next generation products, supported by new and emerging technologies. The project combined augmented reality, voice control, a digital canvas and haptic technology to allow designers to engage with a holographic image or 3D model of their design, using the Dell interface, Meta2 glasses and Ultrahaptics haptic feedback technology.
Ultrahaptics has also had a very big impact in the immersive reality entertainment sector. Projects include The Entangled Body, an interactive installation at STRP Biennale Eindhoven 2017, where an invisible second body was constructed from visitors’ brain activity and made tactile using haptic technology, and the Tate Sensorium exhibition at Tate Britain, where four paintings from the gallery’s collection were given touch sensations realised by the Ultrahaptics transducer array. The team also provided the haptics for a Halloween Magic Castle in Hollywood, where visitors fought off an alien invasion by throwing balls of sizzling power at the invaders.
“That’s what we bring to an experience,” says Tom. “You can see digital content but being able to touch and feel it makes you believe it’s real, it takes the immersive experience to a new level. We’re creating quite magical experiences for people, they can literally feel and see lightning coming out of their fingertips.”
What does the future look like?
Ultrahaptics believes that eventually we will all experience this technology as part of daily life, wherever we go – at home or at work, in the car, doing the shopping, chilling out. These experiences will permeate our lives, the company claims.
Speaking at the launch of the virtual Pagani Huayra Roadster hypercar experience at CES 2018, Apps Team Lead at Ultrahaptics Adam Harwood said, “I’m really excited to see where this goes and how we’re using it in five and ten years’ time”.