Lofelt’s framework that brings better haptics to Android is available for OEMs

Haptic feedback on Android phones is usually terrible, at least compared to devices that have Apple’s Taptic Engine. You know it, I know it, and a haptics company called Lofelt apparently knows it, too. Today, though, Lofelt, in partnership with Qualcomm, announced that it’s making its VTX haptic framework available to OEMs, with the hope that these manufacturers will start putting better haptics into more Android devices.

The framework will allow OEMs to deliver high-definition haptic vibration on phones running the latest Snapdragon 8-series and 7-series chipsets without any hardware tweaks. The system is “tightly integrated with Snapdragon and fully validated by Qualcomm,” according to Lofelt. While the framework is technically compatible with older phones that have a Snapdragon 7-series or 8-series chipset, it’s at the discretion of each OEM whether to add support. It seems more likely that companies will put in the effort to implement improved haptics in upcoming phones than to add it to older models. As for which newer models will feature the framework, none have been announced yet.

Ahead of this announcement, I got to feel for myself the difference that Lofelt’s improved haptics can make on a phone through software tweaks alone. The company sent over a custom Google Pixel 4 packed with demos that show off how its vibrations feel compared to the stock Android implementation (the same setup can’t be duplicated on other devices). Unlike phones designed to take full advantage of Lofelt VTX’s integration with the chipset, this Pixel 4 ran the haptic tricks through the application layer as a proof of concept.

Lofelt’s app let me feel the difference between stock Android vibrations and the company’s own — and the difference was noticeable.

Each of the demos featured a toggle that let me switch between the stock Android vibration and Lofelt’s refined haptics, as shown in the bottom right of the picture above. One demo ran through a few scenarios in Call of Duty: Mobile. The vibrations varied in strength and duration depending on the kind of gun being fired, and I was able to feel the distinct pulsations of a helicopter soaring through the sky. A demo for Asphalt 9: Legends let me feel the roar of an engine through haptics, as well as the fast, crunchy pulsing of the vibration motor when the car ran over a dirt median. Even though the Pixel 4 and earlier models don’t natively support VTX, these demos were far more expressive than standard Android phone vibrations. If you’re someone who takes mobile gaming seriously, to the point of owning something like a Razer Kishi controller, haptics could make your favorite games feel more immersive.

The big challenge for Lofelt isn’t proving that these sorts of immersive haptic experiences are worth building (give the Lofelt Studio app for iOS a try if you need convincing). The real work is ensuring that as many phones as possible can eventually get them, and that it won’t take too much effort for developers. That’s where Lofelt’s AX (adaptive experience) signal processing tech built into VTX comes into play.

According to Lofelt, AX converts a universal haptic signal into vibrations that play to the strengths of each individual phone, taking into consideration its haptics driver, actuator hardware, and control algorithms from the manufacturer. The goal is for Android devices to catch up to Apple’s excellent Taptic Engine that’s in modern iPhones.
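To make that adaptation idea concrete, here’s a conceptual sketch in Kotlin. The DeviceHapticProfile type and its fields are invented for illustration; Lofelt’s AX works inside the framework against the real haptics driver, actuator, and manufacturer control algorithms rather than a simple profile like this.

```kotlin
// Conceptual sketch only: DeviceHapticProfile and its fields are invented to show
// what "adapting a universal signal per device" means in principle. Lofelt's AX
// works against the real haptics driver, actuator, and control algorithms inside
// the framework, not a simple profile like this.
data class DeviceHapticProfile(
    val minUsefulAmplitude: Int,  // weakest level this motor renders perceptibly (1-255)
    val maxCleanAmplitude: Int    // strongest level before the actuator distorts (1-255)
)

// Map a normalized, device-agnostic envelope (0.0-1.0) into the amplitude range
// this particular phone's vibration motor can actually reproduce.
fun adaptEnvelope(envelope: FloatArray, profile: DeviceHapticProfile): IntArray =
    IntArray(envelope.size) { i ->
        val v = envelope[i].coerceIn(0f, 1f)
        if (v == 0f) 0
        else (profile.minUsefulAmplitude +
                v * (profile.maxCleanAmplitude - profile.minUsefulAmplitude)).toInt()
    }
```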

Older phones with the Snapdragon 7-series and 8-series can be updated as OEMs see fit, but you’ll likely find better haptics in new phones.
Photo by Amelia Holowaty Krales / The Verge

App and game developers can design advanced haptics using Lofelt Studio, which integrates with Unity, Unreal, and Xcode, and create a universal .haptic file that works across multiple devices. According to Lofelt, these files contain “universal parameters, such as whether the haptics should include smooth, continuous signals or more punchy, dynamic events.” In other words, developers only need to build the haptics once, and the framework can then deliver a consistent experience across different phones.
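As a rough sketch of what that workflow could look like from the developer’s side, here’s a hypothetical Kotlin example. The HapticClip and HapticPlayer names are invented for illustration and are not Lofelt’s published API; the point is that a single authored .haptic asset gets triggered with the same call everywhere, and the framework decides how it renders on each phone.

```kotlin
// Hypothetical sketch only: HapticClip and HapticPlayer are invented names, not
// Lofelt's published API. The point is the promised workflow: author one .haptic
// asset in Lofelt Studio, trigger it from game code, and let the framework decide
// how it renders on each phone's actuator.
data class HapticClip(val assetName: String)

interface HapticPlayer {
    fun load(clip: HapticClip)
    fun play(clip: HapticClip, intensity: Float = 1.0f)  // 0.0-1.0, scaled per device
}

// The same two calls would ship unchanged to every supported phone.
fun onEngineRev(player: HapticPlayer) {
    val engineRoar = HapticClip("engine_roar.haptic")  // authored once in Lofelt Studio
    player.load(engineRoar)
    player.play(engineRoar, intensity = 0.8f)
}
```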

Existing games and apps running on supported Snapdragon hardware won’t be left in the dust. Lofelt claims that the framework will also be able to convert the audio stream coming from an app into vibrations in real time without any hardware or coding modifications. So, you won’t necessarily need to wait for every developer to create bespoke haptics in order to start feeling some tactile enhancements with apps you already know and love.
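For a sense of what audio-driven haptics means at the platform level, here’s a simplified Kotlin sketch that maps an audio buffer’s loudness envelope onto Android’s standard amplitude-controlled vibration API. This is not Lofelt’s conversion (the framework runs below the app layer and is tied into the chipset); it only illustrates the general idea using the public VibrationEffect API.

```kotlin
import android.content.Context
import android.os.VibrationEffect
import android.os.Vibrator
import kotlin.math.roundToInt
import kotlin.math.sqrt

// Simplified illustration of audio-to-haptics: compute a loudness envelope from a
// mono audio buffer (samples normalized to -1.0..1.0) in ~20 ms windows and play it
// back through Android's amplitude-controlled vibration API. Requires the VIBRATE
// permission and a device with amplitude control; Lofelt's real-time conversion is
// far more sophisticated and runs below the app layer.
fun vibrateFromAudio(context: Context, samples: FloatArray, sampleRate: Int) {
    val windowSize = sampleRate / 50  // ~20 ms of audio per vibration step
    val timings = mutableListOf<Long>()
    val amplitudes = mutableListOf<Int>()

    var i = 0
    while (i < samples.size) {
        val end = minOf(i + windowSize, samples.size)
        // RMS loudness of this window, scaled to the 0..255 amplitude range
        var sumSquares = 0.0
        for (j in i until end) sumSquares += samples[j] * samples[j]
        val rms = sqrt(sumSquares / (end - i))
        timings += (end - i) * 1000L / sampleRate
        amplitudes += (rms * 255).roundToInt().coerceIn(0, 255)
        i = end
    }

    val vibrator = context.getSystemService(Vibrator::class.java)
    if (vibrator != null && vibrator.hasAmplitudeControl()) {
        vibrator.vibrate(
            VibrationEffect.createWaveform(timings.toLongArray(), amplitudes.toIntArray(), -1)
        )
    }
}
```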

Lofelt says that its VTX framework comes with no performance trade-offs, and that it aims to deliver better haptics “while minimizing the impact on battery and actuator.” That sounds great, and the demo I tried feels ready for deployment. But now, the wait begins for some phone manufacturers to kickstart what could be a big, positive change for Android.

