Channel: Haptics – Road to VR

Hands-on: Gloveone’s Newest Haptic Glove Has Impressive Tracking Tech


At E3 2016 I got to see the latest prototype version of Gloveone, a haptic motion input glove designed for virtual reality. The prototype now uses its own tracking system which in early testing seems impressively robust.

Gloveone, created by NeuroDigital Technologies, successfully funded just over $150,000 on Kickstarter in July 2015 for the first version of their haptic VR glove which uses tiny vibrators placed at the ends of your fingers to create the sensation of touching virtual objects. But that first version of the glove required a third-party tracking system like Leap Motion in order to translate the movement of the user’s hands into virtual reality.

The latest Gloveone prototype integrates its own tracking system which utilizes IMUs arranged along each finger and along the user’s arm and torso. The result is tracking for not just your fingers, but for your entire arm and torso as well.

Now I know what you’re thinking and it’s the first thing I thought as well: a purely IMU-based tracking system is going to drift; after just a few minutes my hand will be in some crazy position that doesn’t match its real-world orientation at all. So when NeuroDigital Founder Luis Castillo claimed there was “zero drift,” I was extremely skeptical.

Much to my surprise, the tracking held up. Even after I went out of my way to torture test it by spinning my hand around on my wrist continuously for several seconds, the virtual version of my hand remained convincingly close to the orientation of my real hand.

Castillo wouldn’t say what they’re doing differently to mitigate drift, but given that IMUs are only capable of relative tracking, there must be some drift correction happening. At an individual IMU level, a magnetometer is generally used for drift correction, but even then it’s tough to eliminate entirely. The secret, I suspect, is treating all of the IMUs as a self-corrective system; the sensor on my chest is probably an essential reference point for the rest of the array, not to mention the VR headset on my head, which is itself tied to an absolute positioning system and may be involved in the positional determination.
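As a rough illustration of how an absolute reference bounds gyro drift, here is a minimal 1-D complementary filter sketch. The blend factor and bias values are invented for demonstration; this is one common sensor-fusion approach, not NeuroDigital's disclosed method.

```python
# Sketch only: a 1-D complementary filter, one common way to bound
# IMU yaw drift. The blend factor and bias below are invented for
# demonstration; NeuroDigital has not disclosed its actual method.

def complementary_yaw(yaw, gyro_rate, mag_heading, dt, alpha=0.98):
    """Blend the integrated gyro rate (fast but drifting) with an
    absolute magnetometer heading (slow but drift-free). alpha near
    1 trusts the gyro short-term; (1 - alpha) continuously pulls the
    estimate toward the absolute reference, bounding the drift."""
    gyro_yaw = yaw + gyro_rate * dt  # dead-reckoned estimate
    return alpha * gyro_yaw + (1 - alpha) * mag_heading

# A gyro with a constant 0.5 deg/s bias would drift without bound on
# its own; with the correction term the error settles below a degree.
yaw = 0.0
for _ in range(10_000):  # 100 seconds at 100 Hz
    yaw = complementary_yaw(yaw, gyro_rate=0.5, mag_heading=0.0, dt=0.01)
print(round(yaw, 3))
```

The same idea generalizes to a full array of IMUs, where body-relative constraints (finger can't detach from hand, arm connects to the tracked chest sensor) serve as additional absolute references.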


The tracking enabled my hand to be positionally tracked, such that I could reach forward to grab objects, and also track the orientation of my wrist, as well as each of my fingers. My arm and chest were also tracked and accurately reflected the movements of my torso while in VR. Gloveone uses a pinch motion allowing you to grab objects in the virtual world, but instead of the usual optical or gestural detection, the glove smartly uses capacitive sensors to detect the pinch, which is much more robust than other methods.
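A capacitive pinch detector of the kind described can be sketched as a hysteresis threshold on the raw sensor reading, so a noisy value near the trigger point can't make the grab flicker. The threshold values here are hypothetical, not Gloveone's calibration.

```python
# Sketch (hypothetical thresholds): detecting a pinch from raw
# capacitance samples. When thumb and fingertip pads touch, the
# measured capacitance jumps; hysteresis avoids flicker at the edge.

PRESS_THRESHOLD = 80    # counts; assumed calibration values
RELEASE_THRESHOLD = 60

def pinch_states(samples):
    """Yield True while a pinch is held, using hysteresis so a
    noisy reading near one threshold cannot toggle the grab."""
    pinched = False
    for c in samples:
        if not pinched and c > PRESS_THRESHOLD:
            pinched = True
        elif pinched and c < RELEASE_THRESHOLD:
            pinched = False
        yield pinched

readings = [10, 85, 82, 70, 65, 55, 12]
print(list(pinch_states(readings)))
# → [False, True, True, True, True, False, False]
```

Note how the reading of 70, though below the press threshold, keeps the pinch held; only dropping below the release threshold lets go. That robustness is exactly what optical gesture detection struggles to match.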

The latency of the input was also quite impressive. Although in my short hands-on I had no means of measuring latency directly, it felt in the same class as many of the best VR input devices out there, like Leap Motion, Oculus Touch, and HTC Vive controllers.

The haptics on the Gloveone prototype were functional but didn’t quite sell me on the sensation. The glove uses small vibrating motors at the tips of each finger to give the feeling that you are touching objects in the virtual world. The problem is, when I grab a bottle of water in the real world, I feel pressure on my fingers, not vibration—there’s a mismatch between what my brain is expecting to feel and what it actually feels. Granted, there were a few effects—like the jet exhaust from a hovering drone—where the vibrations did create a plausible sensation. Castillo told me that the particular demo I experienced was made to show the tracking and not so much the haptics. He said that other demos the company has built show a wider range of haptic effects, but I unfortunately didn’t have time to dig into them.

The post Hands-on: Gloveone’s Newest Haptic Glove Has Impressive Tracking Tech appeared first on Road to VR.


Tactical Haptics Adapts Prototype Haptic Controller for Oculus Touch


Tactical Haptics, creators of the ‘Reactive Grip’ haptic technology have adapted their latest prototype for use with the Oculus Touch VR controllers.

Reactive Grip is a novel haptic feedback technology that’s unlike anything you’ll find in modern-day controllers, which by and large rely on ERM motors or linear actuators to provide a rumbling sensation. Rather than rumble, Reactive Grip uses sliding bars positioned around the controller’s handle which put pressure on your hands to simulate an object moving within them. For certain use-cases, the effect can be very convincing. This (now quite dated) video does a good job of showing how it works:

To show off their latest prototype, Tactical Haptics adapted it for the Oculus Touch controllers and took them to yesterday’s SVVR Meetup at the NVIDIA campus in Santa Clara, CA. There they let attendees try the controller and the latest demos. While the makeshift mounting makes the controller quite tall, Tactical Haptics founder Will Provancher says that it works quite well because of the Touch controller’s light weight.

See Also: Hands-on – Oculus Touch 2016 Prototype Brings Refinements to an Already Elegant Design

When I last went hands-on with Reactive Grip feedback, the company had adapted their prototype for the HTC Vive controllers. I concluded then, “Forced to choose between the two, I’d easily pick Reactive Grip over rumble, but ultimately the two complement one another, especially if the rumble comes from the more modern linear actuator approach which is great for subtle clicking and tapping effects as well as the usual ‘dumb’ rumble that we associate with the ERM motor rumble common in modern-day gamepads.”

The end goal for Tactical Haptics is of course to have a single, sleek integrated controller that does both tracking and haptics in one, rather than having to mount a VR controller to the haptic controller. That should be possible once Oculus and Valve finally open up their tracking solutions to third parties, but at last check it seems both companies are content to take their time.

The post Tactical Haptics Adapts Prototype Haptic Controller for Oculus Touch appeared first on Road to VR.

Hands-on: 4 Experimental Haptic Feedback Systems at SIGGRAPH 2016


One of the biggest limitations of current VR technology is the inability to convey a realistic and complete sense of touch. So far, the most you would feel from a modern consumer VR system, like the HTC Vive, is vibration through a controller. From what we saw at SIGGRAPH 2016, it seems the logical next step is to produce more kinds of vibrations, at more points around the body, all synced with proper visuals and sound in VR.

Developers, researchers, and companies from around the world presented experimental haptic technologies at SIGGRAPH this week, ranging from novel hardware, to narrative experiences simply enhanced by a few vibrating units on your back.

Synesthesia Suit

One of the first things I tried was the Synesthesia Suit by researchers from Keio University. You may know it already as it was first seen in the demonstration of Rez Infinite on the PlayStation VR back in 2015. Now they’re back again showing off the suit with an additional VR experience they developed.

First I played Rez Infinite with the PSVR and the suit. It did seem to add more immersion, but ultimately did not make the game feel that much more real. Then I tried the other experience with a different suit running on the HTC Vive (an unnamed demo for show purposes) and it did actually impress me with how real some of it felt.

The Synesthesia Suit exhibit

This second suit had its vibrators placed differently than the first one I tried; there were even two on my head! The vibrators are actually voice-coil actuators, according to Kouta Minamizawa, the associate professor who worked on the suit, and are capable of a wide range of vibrations.

The demo put me through several abstract scenes with similarly surreal synthetic sound that reacted to what I did, or what happened to me. There were lasers hitting me, phantom beings passing through me, floating orbs of energy all throughout, and other random things. This time it was painfully clear—almost but not quite literally—that the vibration units on the suit were capable of more complex feedback than you’d think, from smooth buffets of energy to the sharp piercing of lasers. But there’s also another trick.

The configuration screen showing different haptic effects

The trick is in synchronizing vibrations of the right intensity with the right visuals and the right audio—the key lesson of all of these projects. Without those ingredients, the haptics of the suit would simply feel like vibrations from a suit, not something actually touching you. That may be because the brain expects a certain kind of feedback when seeing certain things happen to the body, or likewise expects something visually matching to better understand what’s being felt. Audio can also play a role in those expectations.

So if you’re missing any of those components, haptics can come off as fake. That may have been what was holding back Rez Infinite from showing off the full potential of the suit, since the game, at least in the demo I tried, didn’t have anything visually come into contact with my body in VR, so my brain wouldn’t have any context for what the vibration represents.
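In code terms, the synchronization principle boils down to driving haptics, visuals, and audio from one shared collision event with matched parameters. A minimal sketch; the function name and the 5 m/s speed normalization are assumptions for illustration, not the Synesthesia Suit's real API.

```python
import math

# Illustrative only: the function name and the 5 m/s normalization
# are assumptions, not the Synesthesia Suit's real API.
def dispatch_impact(contact, speed, actuator_positions):
    """Key haptics, visuals, and audio off one collision event:
    pick the actuator nearest the contact point and scale its
    intensity by impact speed, so all three channels agree."""
    intensity = min(1.0, speed / 5.0)
    dists = [math.dist(contact, p) for p in actuator_positions]
    return dists.index(min(dists)), intensity

# A laser strike near the second of three back actuators at 3 m/s:
idx, amp = dispatch_impact((0.1, 1.2), 3.0,
                           [(0.0, 1.0), (0.1, 1.2), (0.2, 1.4)])
print(idx, amp)  # → 1 0.6
```

Visual and audio effects keyed to the same contact point and intensity would then give the brain the matching context it expects.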

See Also: Hands-on – Tactical Haptics’ Vive Demo is Further Proof That VR Needs More Than Rumble

In the unnamed demo they had with the other suit, there was all manner of stuff touching and hitting me and it felt a lot more real; it showed me for the first time the potential for vibration-based body haptics in VR.

Much of the time, however, it actually wasn’t very effective, as what I saw wasn’t exactly in sync with the vibrations. That was mostly because the suit didn’t actually do any body tracking, and just relied on assumptions of where my body would be, based on the position of the head and hands from the Vive. In the future that could change of course, as body tracking technology gets better and more widespread. Nonetheless, when it worked, it was on the verge of impressive. It truly felt like lasers were hitting my noggin at one point, and that’s a good thing, at least for now.

It’s true however that this concept, of higher realism with perfectly synchronized haptics, visuals, and audio, isn’t really new. It’s actually a rather simple and logical idea that has been proven for a long time. But with the rise of consumer VR technology and the new levels of ‘presence’ brought forth, the topic is more relevant than ever, to both researchers and businesses. In any case, many are attempting to reach for the holy grail of VR in their own way, and haptics is an important part in getting there.

Oculus HapticWave


Work from other researchers at SIGGRAPH reinforces this overarching concept of matched feedback—especially Oculus Research’s ‘HapticWave’ project, a multi-sensory VR experiment built around a vibrating metal plate. Users place a hand on the plate, put on an Oculus Rift, and control a bouncing ball or a moving spark on a table. The user feels directional vibrations that mimic, with a high degree of accuracy, what is happening on the table, and hears it happening with positional audio from the Rift’s headphones.
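One plausible way to produce directional vibration on a plate (not necessarily Oculus Research's method) is to amplitude-pan a single waveform across actuators arranged around the rim, weighting each by how closely it faces the virtual object:

```python
import math

# Sketch of cosine amplitude panning (one plausible approach, not
# necessarily Oculus Research's): actuators facing the virtual
# object's direction vibrate hardest; those behind it stay silent.
def pan_weights(source_angle, actuator_angles):
    w = [max(0.0, math.cos(source_angle - a)) for a in actuator_angles]
    total = sum(w)
    return [x / total for x in w]

# Four rim actuators at 0, 90, 180, 270 degrees; the ball rolls
# toward the 90-degree edge, so that actuator takes nearly all
# the energy:
weights = pan_weights(math.pi / 2,
                      [0.0, math.pi / 2, math.pi, 3 * math.pi / 2])
print([round(x, 2) for x in weights])  # → [0.0, 1.0, 0.0, 0.0]
```

A source between two actuators would split its energy across them, which is what lets the plate convey continuous motion rather than discrete buzzes.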

When I tried it, it took a considerable amount of time to get used to the locked-in calibration they had set for efficient show demos, and I had to try it several times. But in the split moments when it worked, the objects again felt physical rather than virtual, supporting the idea that matched haptics, visuals, and audio are incredibly valuable together, while not so much individually.

See Also: NVIDIA Says New Foveated Rendering Technique is More Efficient, Virtually Unnoticeable

We also can’t forget that Oculus, one of the biggest consumer VR players, is involved with HapticWave, and a few have questioned what the company is even doing with such a bulky and seemingly ‘useless’ metal plate. The answer is what most researchers presenting at SIGGRAPH would say: this is research to further our understanding of the field, not necessarily something that will be developed into a consumer product.

In this case, it would be to study the human perception of haptics—the limits, the effective baseline, and so on. The HapticWave technology is partly a tool for making this possible, one that will continue to advance as the research is furthered. It’s also useful to keep in mind that Oculus Research, which created HapticWave, is a division at Oculus focused on technology that may be beneficial in the long term—5 to 10 years out—so what they’re doing with HapticWave might just be a slice of the groundwork they’re laying.

Read the HapticWave Research Paper


The post Hands-on: 4 Experimental Haptic Feedback Systems at SIGGRAPH 2016 appeared first on Road to VR.

Striker VR Shows off Working Prototype of ARENA Infinity Haptic VR Gun


Striker VR’s ARENA Infinity is a haptic VR gun which can simulate various weapon fire modes and other haptic effects. After revealing the design of the accessory back in April, the company is now showing off a working prototype.

Based on the awesome retro-futuristic design by Edon Guraziu, Striker VR is showing off the first working prototype of the Arena Infinity. Aimed at the Digital Out-of-Home VR sector, the wireless peripheral has on-board haptics based on a linear actuator. The gun is capable of an impressively powerful kick, especially for an electronic system, adding a convincing recoil to firing a virtual weapon.


The haptic engine in the gun can give feedback for the usual single, burst, and full-auto firing modes, but can also be used for other effects, like a sci-fi railgun that needs to be charged before firing (shown in the video at 0:28), or a chainsaw for hacking zombies apart (0:10).
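Fire modes like these reduce to impulse schedules for the actuator. A hedged sketch with illustrative values (this is not Striker VR's SDK): each mode becomes a list of timed recoil impulses per trigger pull.

```python
# Illustrative sketch, not Striker VR's SDK: each fire mode becomes
# a schedule of (time_offset_s, strength) recoil impulses for the
# linear actuator, generated per trigger pull.

def fire_schedule(mode, rpm=600, burst_len=3):
    period = 60.0 / rpm  # seconds between rounds
    counts = {"single": 1, "burst": burst_len, "auto": 10}
    if mode not in counts:
        raise ValueError(mode)
    # "auto" is capped here; a real gun repeats while the trigger
    # is held down.
    return [(i * period, 1.0) for i in range(counts[mode])]

print(fire_schedule("burst"))  # three full-strength kicks, 0.1 s apart
```

Effects like a charging railgun or a chainsaw would instead use ramping or continuous low-strength impulse trains rather than discrete kicks.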

Photo courtesy Valve
See Also: Valve Opens Vive’s Tracking Tech to Third-parties for Free, Details Dev Kit for Licensees

The Arena Infinity Prototype is currently using a temporarily affixed tracker, but the company plans to provide formal support for several tracking systems, giving location-based VR firms a choice in which tracking system is best suited for their use. The company says the Arena Infinity currently supports PhaseSpace and Sixense STEM tracking, and is also aiming to integrate Valve’s Lighthouse, Oculus’ Constellation, and PlayStation’s Move tracking systems.

Striker VR says that the first Arena Infinity development kits will be delivered to select partners in Q4 2016, which will include the haptic gun, SDK, and haptic sandbox range as an SDK sample. The company says more broad delivery of the development kit will come “soon after” the initial rollout.

While Striker VR hasn’t announced a consumer-facing version of their haptic VR accessory, they tease, “the Arena Infinity is a first step to a broad solution aimed at peripherals that are easily attached to the virtual environment and afford users an infinite array of possibilities.”

The post Striker VR Shows off Working Prototype of ARENA Infinity Haptic VR Gun appeared first on Road to VR.

PowerClaw is a Haptic Glove Ready To Freeze, Burn, and Shock You (Virtually)


PowerClaw is a haptic glove that lets you feel heat, cold, and a number of sensations on the tips of your fingers. Coming to Gamescom 2016 right after the recent launch of their IndieGogo campaign, the team is now showing off a near-finished version of their hardware in hopes that the burgeoning virtual reality industry will make way for a glove that effectively lets you feel VR.

In the business center of Gamescom 2016, I happened upon PowerClaw, a Mexican startup that has integrated a series of miniature motors and thermoelectric cells into a glove, letting you feel the cold of an ice cube, the heat of a flame, simulated electric shocks, and whatever else can be conveyed via the rapid whirring of the same vibration hardware found in ordinary smartphones.

Putting on the glove and seating the individual haptic units onto the tips of my fingers correctly, I was put through a virtual torture chamber of ice guns, flame throwers, needle machines, and electric shocks to demonstrate the haptic power of the glove while using the Oculus Rift DK2. Interactions like the last two mentioned depend purely on the frequency and duration of the motors’ buzzing – something that, given the right visual cues, can be surprisingly convincing.
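Effects that differ only in frequency and duration can be expressed as square-wave drive patterns on the same motor. The parameter values below are made up for illustration, not PowerClaw's actual tuning:

```python
# Parameter values are made up for illustration; the point is that
# distinct effects can share one motor and differ only in pulse
# frequency, duty cycle, and duration.

EFFECTS = {
    # effect:          (pulse_hz, duty, duration_s)
    "needle":          (2.0,  0.1, 1.0),  # sparse, sharp ticks
    "electric_shock":  (50.0, 0.5, 0.3),  # dense, rapid buzzing
}

def drive_samples(effect, rate_hz=1000):
    """Expand a named effect into a 0/1 motor drive waveform."""
    hz, duty, duration = EFFECTS[effect]
    period = rate_hz / hz  # samples per pulse cycle
    n = int(duration * rate_hz)
    return [1 if (i % period) < duty * period else 0 for i in range(n)]

shock = drive_samples("electric_shock")
print(len(shock), sum(shock))  # → 300 150
```

Paired with the right visual cue, even these coarse on/off patterns can be surprisingly convincing, as described above.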


All of this, however, was done without any sort of tracking, a duty PowerClaw leaves to optical hand tracking devices like Leap Motion.

The gloves each had a thick cable leading to a single 3D printed breakout box that provided the voltage necessary to run the glove’s interactions. A single USB connection snaked back to the computer driving the demo from the box.


After popping out of the gloves and making sure the tips of my fingers were in good shape, project creator Alyed Tzompa told me that until they cranked down the voltage to the current setting, the thermoelectric components could actually burn developers’ fingertips. (Mine were just fine.)

PowerClaw is currently bottoming out at $595 for a pair of gloves (super early bird special), a steep price for something that doesn’t really work out of the box in VR without the aid of a separate and decidedly imperfect tracking solution (although with Leap Motion’s Orion update tracking has improved).

The company will be releasing their SDK and a number of development examples upon release of the haptic gloves, which is slated for delivery in February 2017.

The post PowerClaw is a Haptic Glove Ready To Freeze, Burn, and Shock You (Virtually) appeared first on Road to VR.

Dexta Shows Off Latest Exoskeleton Gloves That Let You Touch VR


Dexta Robotics are looking to create the next-generation VR input device, one that not only lets you interact with virtual objects but also lets you sense their size and solidity, letting you touch virtual reality. This is the Dexmo exoskeleton glove.

The technology enabling the core virtual reality experience is progressing quickly, but as VR headsets become more advanced, companies are looking to where the next generational leap will come from. Arguably, that next advance will involve input. By the end of the year we’ll have three major VR platforms with paired motion controllers: the HTC Vive already provides Lighthouse-tracked controllers out of the box, Oculus’ Touch is set to launch by the end of the year, and PlayStation VR’s Move controllers are set to see a resurgence alongside the headset’s launch in October. So what’s the next step for VR?

If you’re a long time reader of Road to VR, you may recall Dexta Robotics. They’re the company behind Dexmo, a robotic exoskeleton glove that provides force feedback to simulate the act of touching objects in virtual reality. Dexta believe that this advanced form of haptic feedback encapsulates a next step in the VR experience. Well, Dexta are back and they’re ready to show the latest version of their devices.

Dexmo works by capturing the full range of your hand motion and providing what Dexta term “instant force feedback”. With Dexmo, the company claim, you can feel the size, shape, and stiffness of virtual objects: “You can touch the digital world.” When your virtual avatar encounters a virtual object, its physical properties are run through Dexmo’s dynamic grasping software, which computes variable force feedback: the glove applies inverse force to your fingers, allowing virtual objects to provide a sensation of ‘pushing back’ against your fingers, just like a real object.

That’s impressive enough, but thanks to the variable nature of the force applied, the gloves can also translate more subtle physical object properties, the company says, so you can detect the softness of a rubber duck compared to the solidity of a rock, for example. You can see some of the examples of how this works in practice in the demonstration video above. “We conducted many studies where test subjects performed tasks using Vive controller or Dexmo. Tasks such as turning a knob, grasping an odd-shaped object, playing piano, pressing buttons, and throwing a ball. As expected, no contest! Users preferred Dexmo, and reportedly enjoyed a much higher level of immersion.”
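The "pushing back" behavior described here is commonly modeled as a penetration spring, with per-object stiffness distinguishing soft objects from hard ones. A minimal sketch under that assumption (not Dexta's actual algorithm):

```python
# Sketch (not Dexta's algorithm): force feedback as a penetration
# spring. Per-object stiffness is what distinguishes a rubber duck
# from a rock; the clamp models the exoskeleton's force limit.

def feedback_force(penetration_m, stiffness_n_per_m, max_force_n=10.0):
    """Opposing force the glove applies to one finger."""
    if penetration_m <= 0:
        return 0.0  # not touching: the finger moves freely
    return min(max_force_n, stiffness_n_per_m * penetration_m)

# The same 5 mm squeeze yields a gentle push from a soft object and
# the full clamped force from a hard one:
duck = feedback_force(0.005, stiffness_n_per_m=200)     # soft: ~1 N
rock = feedback_force(0.005, stiffness_n_per_m=20_000)  # hard: clamped
print(duck, rock)
```

In this model, a rubber duck yields several millimeters under the same finger force that a rock resists almost instantly, which is plausibly the kind of difference the test subjects could detect.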

Right now, the Dexmo works in tandem with two Vive controllers strapped to the underside of the user’s arms to provide positional tracking of your hands in VR; that tracking is something the company is looking to integrate properly at a later date, either via Lighthouse tracking licensing, recently opened up to developers by Valve, or another method yet to be decided.


In terms of software integration, Dexta provide an SDK which allows developers to assign physical property values to virtual objects, so that the VR experience can send that data to the battery-powered gloves. The gloves communicate wirelessly with the host via NRF radio modules, with Dexta citing Bluetooth and WiFi as “too slow” for their needs and claiming their solution achieves around 25–50ms of input latency.

Dexta are now ready to show the latest version of the Dexmo, which they believe now represents the core of what they intend to offer in a final device; this latest prototype can be seen in the images and videos in this article.

We wanted to dig a little deeper into the history and future of Dexmo, so we spoke to Aler Gu, CEO of Dexta Robotics.


Road to VR: The Dexmo has come a long way since we first saw it. Can you outline some of the key milestones you’ve passed?

Gu: Indeed Dexmo has came a long way. Ever since Oct 2013 when we initiated Project Dexmo, we haven’t stop improving it:

  • In June 2014 we came up with the design of “switching force feedback”, proved the concept and filed a US patent for it.
  • In Oct 2014 we launched and then canceled the KS project. But we found our way of gathering capital after that and pushed Dexmo into Manufacturing afterall.
  • In June 2015 our experiments of the “variable force feedback” design along with some other force feedback attempt was proven to be working. We filed several patents to protect the IP as well.
  • In Feb 2016 our force feedback unit was successfully batch manufactured and then assembled into Dexmo. It was a big deal for us because the FFU was designed from scratch and it took a lot of effort to go from sth that “barely works” to a “manufacturable reliable product”
  • In June 2016, we finished multiple Unity Demos to demonstrate what Dexmo is capable of. And our SDK was finally finalized.

Road to VR: Your promotional video predicts that Dexmo will be beneficial in various industries, but I wonder if you have your eye on a consumer device? How much would such a consumer product cost?

Gu: It’s not the right time to talk about price now. Before actually selling Dexmo to consumers, we have to make sure it works right out of the box. So our plans for now is partner up with talented VR developing firms and deliver the best immersion experience possible. So when people get their Dexmo, they would immediately realize how game-changing this innovation is. We are also interested in talking to the leaders of the VR/MR industries to see possibilities of in-depth cooperation.

Road to VR: You’re using the SteamVR controllers to provide positional tracking. What are your plans for tracking on the final devices? Would licensing Lighthouse tracking from Valve be something you’d look at?

Gu: It’s funny two years ago we have to explain to people “tracking is a solved problem”, and nowadays thanks to Valve we no longer need to do that anymore. People get it. It is very easy to integrate the lighthouse tracking system into Dexmo, and we will definitely look into that. What you see in the video already projects what can be done.

Beside Lighthouse, Dexmo can also work with any tracking methods. For example we tried to use the tracking coordinates that Hololens API provided, and that doesn’t even require the setup of lighthouse. I guess what I want to say is: The problem that Dexmo solved is hand tracking with integrated force feedback, tracking can come from anything because there are a lot of options for developers to choose from.

Road to VR: How much do the current units weigh and what’s the current battery life? Do they use Bluetooth, WiFi?

Gu: Nothing is definite though but I can give you an idea: Dexmo weighs slightly more than an iPhone 6 Plus [circa 172g], and can be made even lighter. The battery can ideally run over 4 hours when fully charged. It uses NRF modules for wireless communication. Wifi and BT was too slow for our applications.

Road to VR: How simple is Dexmo to develop for? For example, if a developer takes your SDK and adds Dexmo integration, what work is involved in making this happen?

Gu: Dexmo is very developer friendly and well documented. With our SDK, libDexmo, any developer should be able to pick up Dexmo in a few hours after reading the documents. We built some Unity plugins that is somewhat similar to the Vive. Developers can just pull out a pair of hands and our grasping algorithm will take in place and automatically handle the grasping for them. Frankly all they need to do is to apply the colliders and stiffness to objects.

Road to VR: Can you give us some of your reasoning behind the cancellation of your 2014 Kickstarter campaign for the Dexmo?

Gu: What we demonstrated on KS back in 2014 was only a concept. When we saw people’s reaction, we were flattered. But we soon realized they expected a lot more than what we could deliver at that time. Most backers wanted a all-in-one final product that they could immediately use, with complete platform support. Back then VR wasn’t as established as it is now. We know that if people didn’t have the knowledge to even set up a tracking system themselves, they probably gonna have a bad time using it. So we actually needed way more than $200k to research and manufacture the product to meet people’s (unrealistic) expectations by then.

So there was a decision for us to make: Either we take the money, pretend none of these issues exist, lie to our backers then deliver a shitty product that nobody is satisfied with (or just not deliver at all like the ControlVR, what a laugh); Or to be responsible, do the right thing, commit to the challenges instead of just talking about it, and come back when we actually have a better product.

Being an engineer, I am not much of a talker. I believe a product will speak for itself. I hope people can feel the time and effort we put into it. It was a tough choice by then, but I am glad we made the right decision. And today, I am happy to say that this technology finally meets every expectation I had for it.


Thanks to Aler Gu for his time. If you’re keen to know more about Dexmo, we’re hoping to bring you hands on experience with the devices soon. In the meantime, check out Dexta Robotics’ website for more information.

The post Dexta Shows Off Latest Exoskeleton Gloves That Let You Touch VR appeared first on Road to VR.

Dexta’s ‘Dextarity Interaction Engine’ Lets Devs Build Tactile VR Worlds


Dexta’s exoskeleton glove is a haptic input system designed to give users a sense of touch in VR. New videos show how the company’s ‘Dextarity Interaction Engine’ gives developers the tools they need to make the virtual world push back.

The subtleties of our sense of touch are taken for granted most of the time; you’re unlikely to have given much thought to just how much information is conveyed through the infinitely subtle interactions our fingers and hands have with real-world objects. Dexta’s Dexmo exoskeleton gloves are designed to emulate those subtleties, not only to add immersion to virtual reality experiences, but to give the user that naturalistic, human, tactile guiding force we take for granted in reality and miss horribly when it’s gone in VR.

We wrote recently about Dexmo’s exoskeleton gloves, quizzing Dexta Robotics’ CEO Aler Gu on what makes Dexmo tick from a hardware perspective, briefly touching upon development APIs. Now, Dexta have released two new videos which demonstrate what they’re calling the ‘Dextarity Interaction Engine’ (DexIE for short), an extended set of developer tools and algorithms defining a sort of haptic language with which devs can communicate events and interactions, and build a virtual physical interface of sorts.

One of the bigger issues: when your mind is immersed in a virtual space, with your hands’ actions faithfully reproduced, it’s jarring when your virtual avatar passes through objects that have no physicality on the virtual plane. DexIE provides options for developers to give solidity to those objects, by ‘preventing’ your fingers’ physical position from penetrating the mesh of a polygonal object.

Dexta CEO Aler Gu, speaking to us via email says “The guiding principle in hand interaction is that the mesh of hands and objects shouldn’t penetrate each other. Once our brain observe that, it will immediately know it’s not real. And that really breaks the immersion,” adding, “There are also a lot of the problems which only appear when you have the right hardware for it. For example, when you are using the Vive controller to pick something up, there is no physics. You are basically ‘sticking’ the object using 2 sets of 3d coordinates.”
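A common way to enforce the no-penetration rule Gu describes is the "god object" (proxy) technique: the rendered fingertip is clamped at the object's surface while the tracked fingertip keeps moving inside it, and the gap between the two drives the opposing force. A minimal 1-D sketch of that idea, which is not necessarily DexIE's implementation:

```python
# Sketch of the common "god object" / proxy technique (not
# necessarily DexIE's implementation): the rendered fingertip is
# clamped at the surface while the tracked fingertip keeps moving
# inside, and the gap between the two drives the opposing force.

def proxy_position(tracked_x, surface_x):
    """1-D case: a wall at surface_x approached from the left.
    The visual proxy never passes the surface."""
    return min(tracked_x, surface_x)

tracked = 0.12  # real fingertip is 2 cm past a wall at 0.10 m
shown = proxy_position(tracked, surface_x=0.10)
penetration = tracked - shown  # would feed a spring-force model
print(shown, round(penetration, 3))  # → 0.1 0.02
```

Keeping the rendered hand on the surface addresses the visual break in immersion, while the penetration depth gives the glove a quantity to push back against.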


With Dexmo, however, Dexta think their device makes all the difference. Gu tells me that “There are multiple contact points that actually performs physical interaction with 3d objects, which is why “switch between hands interaction” is actually very difficult to implement. Fortunately our engineers found a way to solve that problem after all. We also have an “Object selection indicator” and “Invalid interaction Indicator” to help further improve the user experience.”

Dexmo gloves are, naturally, limited in that they can only enact opposing force to your hand’s digits and only within a certain range. But Dexta believe that their “complex algorithms” go a long way to give VR users that instant hand-object collision notification.

Elsewhere the haptic events are more subtle: emulating the physical ‘click’ of a button or switch, or conveying the weight of an object playfully flipped by your fingers, for example. Then there’s the downright useful tactile sizing of objects, for example picking up a screw and sensing its dimensions before fitting it to a nearby table.

Gu tells us what they aimed to convey with the above video:

This is a parallel comparison between Vive and Dexmo for certain common tasks in VR. The user is requested to pull a lever, turn a knob, and press some buttons. Dexmo can simulate the physical presence of the lever, knob and the different layers of stiffness for the buttons. When the force feedback is combined with the sound, graphics, it really leaps to the next level of immersion.

We have had a total of 40 volunteers trying out this demo, and 100% of them agreed that Dexmo provides a more immersive VR experience. Another thing we have observed during our user studies is that, when people are using the Vive, we have to teach them how to use Vive controllers to operate the widgets; with Dexmo, it is very natural and intuitive. People reach out their hands and it just works. We don’t have to say anything. So that was really encouraging feedback for us.

The demo you are seeing in the video is actually made by one of our software engineers who spent only 3 months playing with Unity. You can imagine the magical experiences that developers with 5 years of experience can pull off with this device!

It’s undeniably all very cool. However, we still have questions over cost, tracking and availability to market, some of which we touched upon in our recent interview with Aler Gu. It’s also another example of a peripheral that will require specific software integration in order to reach its potential, which is of course why these videos are important – proving to potential developer partners that Dexta have already done a lot of the hard work for them.

The post Dexta’s ‘Dextarity Interaction Engine’ Lets Devs Build Tactile VR Worlds appeared first on Road to VR.

Microsoft Research Demonstrates VR Controller Prototypes With Unique Haptic Technology


Microsoft Research has devised two novel methods for more realistic haptic feedback on virtual reality controllers. They call it NormalTouch and TextureTouch.

Haptic feedback in general-purpose controllers has been limited to vibration since the introduction of the Rumble Pak for the Nintendo 64 in 1997. Vibration motors come in all shapes and sizes, the most popular being the Eccentric Rotating Mass (ERM) motor, found in most modern gamepads. Mobile phones often use very small ERM motors or, more recently, linear actuators. Linear actuators tend to offer more haptic ‘detail’ and responsiveness, as found in Apple’s ‘Taptic Engine’, the HTC Vive controllers, and the Oculus Touch controllers. While vibration is the current state of the art in consumer haptic feedback, limitations remain.

SEE ALSO
Hands-on: Tactical Haptics' Vive Demo is Further Proof That VR Needs More Than Rumble

Tactile feedback has proven to be effective across a wide variety of applications, but if you’re looking for significant force or resistance in your haptics, you need kinesthetic feedback. This is commonly available through force-feedback controllers, which tend to be designed for a specific task, such as joysticks for flight/space simulators, and wheels for driving simulators. The wealth of powerful haptic hardware on the market is one of the main reasons why flight and driving simulations are already so effective in VR. The closest product to a general-purpose kinesthetic controller is probably still the Novint Falcon, first shown in 2006, but this is also fairly limited, as it needs to be attached to a desk.


Microsoft Research’s new experimental controllers bring kinesthetics into the VR space, offering two types of force-feedback applied to fully-tracked motion controllers. NormalTouch uses three servo motors to operate a small disc with tilt and extrusion movements, and TextureTouch uses a bank of 16 servos to operate a 4×4 pixel array of small blocks that move up and down to correspond to virtual shapes and structures. The result is a feeling of physical resistance as you drag your finger across a virtual shape, with enough fidelity to actually convey a sense of touch and an understanding of an object’s form and texture.
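To make the pin-array idea concrete, here is a toy sketch of how a TextureTouch-style 4×4 array might sample a virtual surface under the fingertip. All names, dimensions, and the normalized 0–1 pin range are invented for illustration; this is not Microsoft Research's actual firmware.

```python
import math

GRID = 4                  # 4x4 bank of servo-driven pins
PIN_PITCH_MM = 2.0        # assumed spacing between pin centers

def surface_height(x_mm, y_mm):
    """Stand-in for sampling the virtual object's surface under a point."""
    return 0.5 + 0.5 * math.sin(x_mm) * math.cos(y_mm)

def pin_targets(finger_x_mm, finger_y_mm):
    """Return a 4x4 grid of normalized pin extensions (0 = retracted, 1 = full)."""
    targets = []
    for row in range(GRID):
        line = []
        for col in range(GRID):
            # Sample the surface at each pin's footprint under the fingertip.
            sx = finger_x_mm + (col - (GRID - 1) / 2) * PIN_PITCH_MM
            sy = finger_y_mm + (row - (GRID - 1) / 2) * PIN_PITCH_MM
            line.append(min(1.0, max(0.0, surface_height(sx, sy))))
        targets.append(line)
    return targets
```

Each frame, the controller would drive its 16 servos toward these extensions, so ridges and curves under the finger translate into raised and lowered pins.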

In both controllers, the feedback surface acts on a single finger or thumb, which may limit the practical use cases. But the key point is that this type of feedback is normally the domain of dedicated devices, elaborate gloves, or exoskeletons, whereas Microsoft Research’s designs are based on a normal handheld controller, which Michael Abrash, Oculus’ Chief Scientist, recently suggested could remain the standard input for VR for decades to come.

Texture is one thing, but offering real resistance (where the virtual world can push back on you) is still a pipe dream, however, as there is nothing preventing the user from clipping through objects with today’s VR controllers. But with more realistic haptics, the desire to clip through something is reduced, in the same way that more realistic VR visuals often prevent people from trying to walk through virtual objects.

SEE ALSO
Microsoft Details HoloLens Streamable Live Action Video Recording Technique

In their testing, the Microsoft Research team developed a ‘penetration compensation’ technique that made it appear the user’s hand was not clipping, by decoupling the virtual hand from the real tracking location. The finger is the most sensitive part of the hand to kinesthetic feedback, so this is effective, although it remains to be seen how this haptic-visual mismatch would work in a less controlled environment.
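The core of that decoupling can be sketched in a few lines, assuming a single "height above the surface" coordinate; the function name and 1D simplification are mine, and the paper's actual implementation is more involved.

```python
def compensated_position(tracked_z, surface_z):
    """Render the fingertip at the surface while the real finger is inside it.

    Coordinates are heights along the surface normal; tracked_z below
    surface_z means the real finger has penetrated the virtual object.
    """
    if tracked_z < surface_z:
        return surface_z          # visually pin the finger to the surface
    return tracked_z              # no contact: render at the true position
```

The rendered hand thus stops at the object while the haptic display pushes back, hiding the fact that the tracked hand has clipped through.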

The research group’s findings are promising even at this early stage. Three tests were run—targeting accuracy, tracing accuracy and fidelity assessment—and both controllers were used, comparing them to vibration-only feedback and visual-only feedback. Both new haptic feedback techniques demonstrated advantages over vibration and visual only tests, despite some of the limitations of the prototypes causing issues.

The designs can no doubt be improved dramatically in terms of ergonomics, range of movement, responsiveness and detail, but already testers reacted positively to the heightened sense of touch. It was noted that the fact it’s already effective demonstrates the overriding power of the visual system, and that perhaps fully detailed or accurate feedback isn’t too critical, as the visual system automatically makes the corrections.


Perhaps the toughest challenge of this project is in improving the physical design. Any device with a large number of mechanical parts always comes at a cost, usually in the form of weight and noise, and that’s certainly the case here. If this technology were utilized for a consumer product, it would need to get smaller while staying quiet and reliable. It’s an area of research that is worth pursuing further, but it’s unclear at this stage how likely these prototype haptic technologies are to find their way into a real product.

The post Microsoft Research Demonstrates VR Controller Prototypes With Unique Haptic Technology appeared first on Road to VR.


Tactical Haptics Raises $2.2 Million to Build Haptic VR Controller Dev Kit


Tactical Haptics, a company pioneering a novel form of haptic feedback which can create compelling sensations that go far beyond rumble, announced today it has raised $2.2 million to create a development kit of a haptic VR controller as a stepping stone to an eventual consumer product.

Tactical Haptics is one of the OGs of the new VR landscape. The company was among just five or so companies exhibiting anything related to VR back at GDC 2013 (the first year Oculus attended the show). At the time they were showing their ‘Reactive Grip’ haptic technology attached to a hacked-up Razer Hydra (a popular VR motion controller in those early days, long before the likes of the Vive controllers and Oculus Touch). This (old) video explains how it works:

Reactive Grip is a novel method of haptic feedback which uses sliding segments in the handle of a controller to create ‘shear’ forces in your hand which mimic an object moving against your palm, like the handle of a gun when it shoots, or the handle of a sword when it comes in contact with an enemy. The effect is unique and impressively convincing for certain interactions, and in many cases feels more authentic than mere rumble.
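As a rough illustration of the concept (not Tactical Haptics' actual control law; the gains, limits, and names below are invented), two sliding handle segments can approximate both linear shear and twist about the grip by moving together or in opposition:

```python
LINEAR_GAIN = 0.02   # slider metres per newton (assumed)
TORQUE_GAIN = 0.05   # slider metres per newton-metre (assumed)
SLIDER_LIMIT = 0.01  # +/- slider travel limit in metres (assumed)

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def slider_offsets(axial_force_n, torque_nm):
    """Return (upper, lower) slider displacements along the grip axis."""
    common = LINEAR_GAIN * axial_force_n        # both sliders, same direction
    differential = TORQUE_GAIN * torque_nm      # sliders oppose each other
    upper = clamp(common + differential, -SLIDER_LIMIT, SLIDER_LIMIT)
    lower = clamp(common - differential, -SLIDER_LIMIT, SLIDER_LIMIT)
    return upper, lower
```

A recoiling gun would feed a brief axial force spike into this mapping, while a sword impact would mostly drive the differential (twist) term.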

Tactical Haptics tried to jumpstart their path to a Reactive Grip developer kit with an unsuccessful 2013 Kickstarter which raised nearly $90,000 but fell well short of its $175,000 goal. That didn’t mean defeat for the company, which has been refining its tech ever since; we’ve seen multiple prototype iterations since that time, including demonstrations tapping into the tracking of the Vive’s controllers and Oculus Touch.

SEE ALSO
Hands-on: Tactical Haptics' Vive Demo is Further Proof That VR Needs More Than Rumble

Now the company has announced that they’ve raised $2.2 million to create a development kit of a Reactive Grip VR controller. $749,000 of the funds come from a National Science Foundation grant, while the other $1.47 million comes as venture capital in a round led by SV Tech Ventures and the Youku Global Media Fund, with participation by SIG Asia Investment Fund, Sand Hill Angels, and the Stanford-StartX Fund. The company says the funds will be used to “[create] a developer kit, including mini-games, for VR game developers so they can integrate the company’s advanced haptic controller with their VR game content.”

The company plans for the development kit to have cross-platform support between the Rift and Vive, and include its own tracking to function as a replacement for the official VR controllers on those platforms.

Tactical Haptics prototype adapted with Oculus Touch
Tactical Haptics prototype adapted with Vive controller

Exactly what form that tracking will take remains unclear, especially given the competing tracking technologies employed by the Rift and Vive. When we reached out to Tactical Haptics CEO, Will Provancher, for comment he shared the following:

…our plan for tracking won’t be to mount an entire Vive or Touch controller on our haptic controller. So we plan to have a more integrated solution than what we currently do (which is mount a Vive or Touch controller on our controller).

I can say that we are attending the SteamVR tracking partner program training, which will create several options for us to implement tracking.

However, the exact implementation of what we do with respect to motion tracking integration will depend on a lot of factors so it’s hard to say more than this at this point.

While Oculus has said previously that they planned to open their tracking API up to third parties, Valve is the only one of the two to have done so thus far, meaning it’s more likely that we’d see a development kit compatible with SteamVR Tracking sooner than with the Rift’s ‘Constellation’ tracking.

SEE ALSO
Striker VR Shows off Working Prototype of ARENA Infinity Haptic VR Gun

Whatever form it comes in, we look forward to haptic tech in VR that goes beyond common rumble, and hope that Tactical Haptics’ approach is just the first of many third-party VR controller choices for users.

The post Tactical Haptics Raises $2.2 Million to Build Haptic VR Controller Dev Kit appeared first on Road to VR.

AxonVR is Building a Haptic Holodeck Powered by NVIDIA PhysX


AxonVR, the company currently working on a full-body haptic solution for VR, are on their way to bringing their technology to an ever-widening audience. Announced today via an official NVIDIA blogpost, AxonVR says that the company’s fledgling haptic engine is actually built on top of NVIDIA’s PhysX, a middleware SDK that provides GPU acceleration for complex, real-time physics simulations.

AxonVR’s all-in-one haptic solution aims to combine an exoskeleton walking platform and a haptic suit that provides simulated pressure and hot and cold sensations, so not only can you walk around in VR, you can feel it too.

This is accomplished by the company’s HaptX textile, which is said to deliver the feel of texture, shape, motion, vibration and temperature of virtual objects. As ambitious and captivating as the early prototypes may be, a hardware platform can only be as good as its software base—and that’s where NVIDIA’s PhysX comes in.

“At its heart, haptic simulation comes down to physics. But the precision and level of detail required for haptic simulation go beyond the capabilities of a typical physics engine. Objects don’t just have to look right. They have to feel right,” write AxonVR CEO Jake Rubin and Zvi Greenstein, General Manager at NVIDIA and head of Business Development for VR.

AxonVR’s prototype exosuit

AxonVR says it’s using PhysX to calculate force feedback by figuring out how objects act during a collision, and then extracting more detail about the object’s surface geometry when touching your skin. According to AxonVR, the information gathered from PhysX can allow them to simulate anything from “gentle curves to micro-textures so small you can’t see them with the naked eye.”

“Based on this data,” says AxonVR CEO Jake Rubin “AxonVR’s software models how your skin will deform when touching the surface to produce tactile and vibrotactile feedback. Finally, the software models the transfer of heat between the surface and your skin to produce thermal feedback.”
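The pipeline Rubin describes—contact data in, tactile and thermal commands out—can be caricatured as two small steps. Every constant and function name below is an assumption for illustration, not AxonVR's code.

```python
HEAT_RATE = 0.1  # assumed heat-transfer coefficient per simulation step

def actuator_pressure(contact_force_n, contact_area_m2):
    """Pressure (Pa) to command at the tactile pixel under a contact."""
    return contact_force_n / max(contact_area_m2, 1e-6)

def step_thermal(display_temp_c, object_temp_c):
    """Move the thermal display toward the virtual object's temperature."""
    return display_temp_c + HEAT_RATE * (object_temp_c - display_temp_c)
```

A real system would run these per contact point at haptic rates, with the contact forces and areas supplied by the physics engine's collision solver.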

AxonVR’s HaptX SDK includes a plug-in for leading game engines that lets developers add touch sensations to their projects. AxonVR maintains that content creators can use existing 3D assets and add haptic properties and effects with little or no modification with the PhysX-based HaptX SDK.

PhysX acceleration is supported across many game engines including Unreal Engine and Unity, and works on GeForce 9‑series, and 100‑series to 900‑series GPUs with a minimum of 256MB dedicated graphics memory.

AxonVR has spent the last 4 years in stealth mode, coming into the public eye back in May. Only a few short months later, the company announced the closure of a $5.8 million seed investment which will be used to further develop their HaptX platform. The company is aiming to license their walking platform and haptics suit directly to businesses such as theme parks and VR arcades.

And if you’re still wondering what all the fuss is about, Road to VR’s Michael Glombicki got his hands on a prototype of the HaptX material at the Immerse Technology Summit (formerly SEA VR) this fall:

“To try out their technology, I placed my hand, palm upwards, in a slot on the side of a large metal box that contained the pneumatic drive system needed to make their haptic skin function. Using a Vive controller in my other hand, I was able to place a variety of objects on my virtual hand and feel the corresponding pressure response from the HaptX skin,” said Glombicki.


“The most impressive part of the demo was when I got to place a virtual deer on my hand. I could feel the individual points of contact as the deer moved its legs around my hand.”

Although we have yet to see the technology worked into a large textile like a full haptic suit, we’ll be following AxonVR’s progress to see just how scalable it turns out to be.

The post AxonVR is Building a Haptic Holodeck Powered by NVIDIA PhysX appeared first on Road to VR.

Tricking the Brain is the Only Way to Achieve a Total Haptics Solution


Deep in the basement of the Sands Expo Hall at CES was an area of emerging technologies called Eureka Park, which had a number of VR start-ups hoping to connect with suppliers, manufacturers, investors, or media in order to launch a product or idea. There was an early-stage haptic start-up called Go Touch VR showing off a haptic ring that simulated the type of pressure your finger might feel when pressing a button. I’d say that their demo was still firmly within the uncanny valley of awkwardness, but CEO Eric Vezzoli has a Ph.D. in haptics and was able to articulate an ambitious vision and technical roadmap towards a low-cost and low-fidelity haptics solution.


LISTEN TO THE VOICES OF VR PODCAST

Vezzoli quoted haptics guru Vincent Hayward as claiming that haptics is an ‘infinite degree of freedom problem’ that can never be 100% solved, but that the best approach to get as close as possible is to trick the brain. Go Touch VR is aiming to provide a minimum viable way to trick the brain starting with simulating user interactions like button presses.

I had a chance to catch up with Vezzoli at CES where we talked about the future challenges of haptics in VR including the 400-800 Hz frequency response of fingers, the mechanical limits of nanometer-accuracy of skin displacement, the ergonomic limitations of haptic suits, and the possibility of fusing touch and vibrational feedback with force feedback haptic exoskeletons.

SEE ALSO
Hands-on: 4 Experimental Haptic Feedback Systems at SIGGRAPH 2016

Support Voices of VR

Music: Fatality & Summer Trip

The post Tricking the Brain is the Only Way to Achieve a Total Haptics Solution appeared first on Road to VR.

Quantifying Touch on 15 Dimensions with SynTouch


SynTouch has created a system, called the SynTouch Standard, that can quantify the sense of touch on fifteen different dimensions, and they’re one of the most impressive haptic start-ups that I’ve seen so far. SynTouch isn’t creating haptic displays per se, but they are capturing the data that will be vital for other VR haptic companies working towards a display that’s capable of simulating a wide variety of different textures. SynTouch lists Oculus as one of their partners, and they’re also providing their data to a number of other unannounced haptic companies.

LISTEN TO THE VOICES OF VR PODCAST

I had a chance to talk with Matt Borzage, head of development and one of the co-founders of SynTouch, at CES, where we talked about the 15 different dimensions of their SynTouch Standard across the five major areas of Texture, Compliance, Friction, Thermal, and Adhesive. This research was originally funded by DARPA in order to add the feeling of touch to prosthetics, and the founders have backgrounds in biomedical engineering. But their mechanical process of objectively measuring the different dimensions of textures has many applications in virtual reality, creating a baseline of input data for haptic displays.

Here’s a comparison of denim and a sponge across the 15 dimensions of the SynTouch Standard:
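To illustrate what a fixed-dimension texture standard enables, here is a toy sketch treating each material as a vector of dimension scores and comparing them numerically. The dimension names and scores below are invented for illustration, not SynTouch's published values.

```python
import math

# Hypothetical profiles on a 0-10 scale (invented numbers)
denim  = {"roughness": 7.2, "compliance": 3.1, "friction": 5.5, "thermal": 4.0}
sponge = {"roughness": 4.8, "compliance": 9.0, "friction": 3.9, "thermal": 6.5}

def texture_distance(a, b):
    """Euclidean distance between two texture profiles: a crude dissimilarity."""
    return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
```

With profiles like these in hand, a haptic display could look up the nearest measured material to whatever a developer asks it to render.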

SynTouch has found a great niche in the haptics space, already providing a lot of insight and value to companies looking at the ergonomics of industrial design, and they’re a company to watch in the VR space as more and more haptics companies try to solve some of the hardest engineering problems around creating a generalized haptic device for VR.


Support Voices of VR

Music: Fatality & Summer Trip

The post Quantifying Touch on 15 Dimensions with SynTouch appeared first on Road to VR.

EXOS Haptic VR Exoskeleton Glove Aims to Deliver Practical Touch Feedback


EXOS is a new haptic-enabled VR glove which uses force feedback to deliver the sensation of physicality inside immersive applications.

Tokyo-based developer Exiii are working on a new VR glove which delivers so-called “reactive force” in response to your actions inside virtual reality – the practical upshot of which is that you’re able to ‘feel’ virtual objects.

Some of you may recall our coverage of Dexta Robotics’ Dexmo haptic feedback solution a little while back and, although perhaps not quite as ambitious, EXOS does look like an interesting approach to the problem of force touch.

Unlike Dexmo, however, EXOS adopts a simpler, less granular approach to the problem. Whereas Dexmo provides incremental resistance and extension tracking for all four fingers and the thumb (per glove), EXOS offers individual thumb movement and force feedback, with the four fingers handled collectively. And whilst this might seem like a regressive step compared to its exoskeleton stablemate, it might turn out to be a smart design choice. By reducing complexity and sacrificing fidelity, EXOS’ design may prove more robust, with fewer moving parts in play and a simpler set of programmatic requirements. This is pure speculation at this stage of course; we’ve not had our hands on the device yet.


More detailed information on the device is scant at this stage, although the developer’s video above does indicate that the devices are at present wired and don’t currently have an integrated tracking solution (note the retrofitted Vive controllers). Demonstrations of how the glove deals with hard and soft surfaces are given, but with no detail as to how much force can be applied or at what granularity, it’s difficult to know how effective the device is.

SEE ALSO
Latest Dexmo Input Glove Features Positional Tracking with Full Finger Input, Claims 5ms Latency

Nevertheless, this sort of 2nd- or 3rd-generation VR-related technology keeps us excited for the future and reminds us that, although VR may be available and in people’s homes, there is a vast array of opportunities and problems still to be solved.

The post EXOS Haptic VR Exoskeleton Glove Aims to Deliver Practical Touch Feedback appeared first on Road to VR.

NullSpace VR’s New ‘Hardlight’ Haptic Suit is Heading to Kickstarter


NullSpace VR are poised to launch a new haptic vest focused on immersive virtual reality gaming via Kickstarter soon. The Hardlight suit integrates 16 haptic pads that let you feel directional impacts linked to actions inside the VR experience.

We’re all for amping up immersion at Road to VR, via whatever means necessary frankly. But our experiences in the world of wearable haptics as a means to do so have not been exactly stellar so far. Nevertheless, the appeal of accurate, directional force feedback which allows your chosen VR experience to punish you for your failures, or indeed merely give you a prod into action, is clear.

NullSpace VR are poised to unleash their solution to this gap in the VR haptics market, and they’re calling it the Hardlight Suit. This upper-body vest contains 16 haptic pads for delivering feedback to your chest, back, arms and shoulders. These pads can be triggered by any software integrated with NullSpace VR’s APIs, and indeed the company (who’ve made substantial progress since we first covered them) have persuaded a number of VR developers, including recent indie favourite Sairento VR, to add Hardlight Suit support.

The team recently took their latest prototype to the World’s Fair ‘Nano’ event to show off their progress, filming attendee reaction for posterity.

The key concern for us is still the accuracy with which the suit can detect your orientation relative to the virtual world. The Hardlight Suit contains inertial sensors, which detect rotational movement, but these sensors are not absolute and can therefore suffer from drift and positional inaccuracies. That said, since we first covered the suit, room-scale positional tracking for both headsets and motion controllers has arrived, adding more data from which to infer the user’s body orientation, but there are still gaps in that data which will need to be filled for the result to be truly immersive.
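One textbook way to use that extra data is a complementary filter: dead-reckon orientation from the gyro, then nudge it toward an absolute reference such as the headset's tracked forward direction. This is a standard sketch of the technique, not NullSpace VR's disclosed implementation; the names and gain are invented.

```python
ALPHA = 0.98  # trust the gyro short-term, the absolute reference long-term

def fuse_yaw(gyro_yaw_deg, reference_yaw_deg):
    """One filter step: mostly gyro, nudged toward the absolute reference."""
    return ALPHA * gyro_yaw_deg + (1.0 - ALPHA) * reference_yaw_deg

def integrate(yaw_deg, gyro_rate_dps, dt_s, reference_yaw_deg):
    """Dead-reckon one step from the gyro, then apply the drift correction."""
    predicted = yaw_deg + gyro_rate_dps * dt_s
    return fuse_yaw(predicted, reference_yaw_deg)
```

However drifted the gyro estimate becomes, the small pull toward the reference bounds the error over time instead of letting it accumulate.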


The vest has been cannily designed, with a simple, open construction and adjustable straps which should allow it to be worn by people of varying shapes and sizes.

The team are adding the finishing touches to their Kickstarter campaign as I write this, and we’ll pass on more details once they go live. In the meantime, if you’ve gone hands-on with the Hardlight Suit in the past, why not share your experiences in the comments section below.

The post NullSpace VR’s New ‘Hardlight’ Haptic Suit is Heading to Kickstarter appeared first on Road to VR.

Hardlight VR $499 Haptic Suit Kickstarter Passes $80k Target


Hardlight VR is a new haptic suit from NullSpace VR that launched its Kickstarter last week. It has already passed its original goal, and the team have announced the project’s first stretch goal.

We wrote recently about NullSpace VR’s haptic suit project Hardlight VR and the team’s intent to bring the product to Kickstarter. Well, the company launched their campaign last week, and Hardlight VR hit its original $80,000 target in under a week.

Hardlight VR is an upper-body vest containing 16 haptic pads that deliver impact feedback to your chest, back, arms and shoulders. The pads can be triggered by any software integrated with NullSpace VR’s APIs with relative rotational information for your body provided by integrated IMUs.

The company (as we mentioned in our last piece) have persuaded a number of VR developers, including recent indie favourite Sairento VR, to add Hardlight Suit support. Joining that are 14 other games, including the likes of futuristic racer Redout and room-scale archery favourite Holopoint. The latest announcement for the project is its first stretch goal.


Early bird Hardlight VR units are already gone, but interested backers can still get their hands on a suit from $499. Note that Hardlight VR is currently tethered, with a USB cable attaching the suit to the PC and providing both power and the input/output feed. The team claim that a wireless add-on is on the roadmap, should they reach the stretch goal.

However, if you can live with those limitations, the team certainly have a large enough selection of software for you to sample on delivery. And, with the campaign’s goal met in under a week, it’ll be interesting to see how much more interest the project garners and how much more can be raised in the time remaining.

The post Hardlight VR $499 Haptic Suit Kickstarter Passes $80k Target appeared first on Road to VR.


Hands-on: StrikerVR’s Latest Prototype Haptic Gun Packs More Than Just Virtual Bullets


At GDC 2017 this week we’ve gone hands-on with the latest prototype of StrikerVR’s ‘Arena Infinity’ haptic VR gun. Built (for now) for the out-of-home sector, the peripheral’s powerful haptics adapt from guns to chainsaws to grenade launchers and more.

Update (3/3/17, 1:14PM PT): The video interview above has an audio issue on some platforms; those listening on mobile devices may not be able to hear the audio or hear a corrupted version. However, desktop playback or mobile with headphones plugged in should work fine. New video uploaded, issue should now be resolved. If you’re still having issues, let us know in the comments below.

In virtual reality you can make the tracked object you’re holding look like anything. So a one-size-fits-all haptic kick isn’t going to cut it when immersion is the goal. StrikerVR knows this, and has created their Arena Infinity haptic gun to be able to output an impressive range of haptic effects which feel significantly different depending upon what virtual weapon you’re firing.

I got to try the latest prototype, which is now fully self-contained, at GDC 2017 this week and was impressed with the extremely solid feel of the Arena Infinity and the powerful and satisfying kick it provides. In the demo I wore a VR backpack and an Oculus Rift, and wielded the Arena Infinity, all tracked by the new active-marker ‘OptiTrack Active’ system.

In the single-fire mode, you get a very satisfying kick with every pull; you can feel the gun move your shoulder, and even see it when other people are using the peripheral. Because of the type of haptics StrikerVR has implemented, the response time is also very tight relative to the in-game visuals and sound effects, and continues to be responsive as fast as I was able to pull the trigger. After pulling the trigger enough times to deplete the virtual clip, successive pulls give only a tiny nudge to indicate that you’re out of ammo. The bottom of the gun has a ‘smack button’, which somewhat recreates the motion of smacking a clip to ensure it’s been securely inserted into the magazine well. That initiates a reload and allows you to continue firing away.

A button on the side of the gun (in this case) was used to toggle between different virtual weapon modes with different haptic effects. One mode was full-auto which gave a satisfying repeating kick as I held down the trigger (this was especially fun for dual wielding the guns, Rambo style). The next mode was a grenade launcher which gave the feeling of a single ‘thump’ followed by a rumble indicating when the weapon was reloaded. And then there was the chainsaw mode, which put revving chains on the end of the gun model in the game (along with sound effects). In this mode, the gun is constantly making a low rumble which picks up speed as you hold the trigger down. When you let the trigger go, the chains slow down after losing their momentum and return to the idle rumble.
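An effect like that chainsaw rumble is essentially a little state machine driven by the trigger. Here is a hedged sketch; the constants and function name are invented for the example, not StrikerVR's effect system.

```python
IDLE = 0.15      # idle rumble intensity (0..1 scale)
RAMP = 0.8       # intensity gained per second while the trigger is held
DECAY = 0.5      # intensity lost per second after release

def step_rumble(intensity, trigger_held, dt):
    """Advance the rumble intensity by one frame of dt seconds."""
    if trigger_held:
        intensity += RAMP * dt      # chains spin up
    else:
        intensity -= DECAY * dt     # chains lose momentum
    return max(IDLE, min(1.0, intensity))
```

Clamping to the idle floor rather than zero is what keeps the weapon humming even when the trigger is untouched.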

SEE ALSO
Hands-on: Striker VR Pistol Prototype Borrows Design From Halo's Famous M6 Magnum

The effect, especially paired with the in-game chainsaw visual and sound effects, made me momentarily fearful of putting my hand on the end of the gun where the virtual chainsaw was.

StrikerVR has also shown other haptic effects that weren’t present in this demo, and developers could make their own effects that specifically fit their in-game weapons.

While the Arena Infinity is made for the out-of-home market and presently tracked by a high-end commercial tracking system, StrikerVR co-founder Martin Holly says that the company wants to prove the device in the commercial market before commoditizing it into a consumer package. When it comes to tracking, Holly says that future versions of the gun will feature a scope rail which will make an easy attachment point for something like the Vive Tracker to enable SteamVR Tracking. Holly also says the company is in contact with Oculus regarding the possibility of using Oculus’ Constellation tracking system.

The post Hands-on: StrikerVR’s Latest Prototype Haptic Gun Packs More Than Just Virtual Bullets appeared first on Road to VR.

Hands-on: Reactive Grip Haptic Controller Prototype with Vive Tracker, Touch, and Custom SteamVR Tracking


Tactical Haptics, developers of the Reactive Grip controller, are showing their latest prototypes now with attachments for the Vive Tracker, Oculus Touch, and a custom-built SteamVR Tracking solution. The controller employs a unique solution to haptic feedback which aims to recreate the feeling of friction against objects in your hands rather than just rumble. The company is moving toward bringing a development kit of the device to developers.

Tactical Haptics has been developing their Reactive Grip haptic technology for several years now. Having shown off some of their earliest prototypes at GDC 2013—years before HTC and Oculus even began talking about VR motion controllers—tracking has remained a hurdle in getting the product ready for consumers. This (old) video shows the foundation of the haptic technology which we’ve said ‘proves VR needs more than rumble‘.

Vive Tracker and Oculus Touch Tracking

Now that both Oculus’ Constellation and Valve’s SteamVR Tracking systems are deployed in users’ homes, the door is open to using those systems as add-ons to track the Reactive Grip controller for use in VR. That means users who already own Touch or a Vive Tracker can attach those peripherals without the need to bear the cost of additional tracking hardware built into the controller.

While the company had previously shown a similar approach by attaching the Vive controllers to their haptic controller, Tactical Haptics founder William Provancher says that between the Vive Tracker and Touch controllers, the lighter weight and more compact profiles make the overall device lighter, more balanced, and more comfortable to use. Though that’s not to say that a Vive controller adapter might not be offered when the Reactive Grip controller becomes available.

Custom SteamVR Tracking

Thanks to Valve opening up their tracking solution to third parties over the past seven months, Tactical Haptics is also experimenting with a custom SteamVR Tracking solution which could be offered for those who want to buy an all-in-one controller. Provancher says the company attended the SteamVR Tracking development course and had created a working SteamVR Tracking integration for the controller in just a few weeks. Though the company is still refining the integration, Provancher says early tests reveal that it tracks just as well as the Vive Tracker.

SEE ALSO
SteamVR Tracking HDK Now Available for Anyone to Buy

– – — – –

At GDC 2017 this week, the company was showing off the new controller prototypes with new mini-games made to show what it’s like to develop for the controller and what sorts of applications the unique haptic feedback can be applied to.

Using the Reactive Grip controller, I played a game that was something like ‘VR Asteroids’, where I used my hand to fly a little ship around to avoid asteroids and incoming fire from enemy ships. Using the orientation of the controller and the trigger, I could fire the ship’s weapons to destroy asteroids and enemy ships. The controller’s haptics gave me a sense of the ship’s momentum in my hand, and feedback as my ship took damage and fired its weapons.

The other game, Cyber Golf, was like a futuristic version of disc golf where the goal was to throw the disc into a goal blocked by obstacles. In the game I held a wand-like tool which could be used to grab the disc. Grabbing the disc’s edge let me throw it like a frisbee, while grabbing the core extended a laser-rope from the wand that let me whirl the disc over my head like a lasso and then throw it for extra distance. While spinning the disc over my head, the controller gave me a sense of the disc’s weight as its momentum pulled the tool in a circular motion in my hand.

Both mini-games were fun and functional, but not the most compelling demos I’ve seen (and felt) from these controllers. Prior demos that I’ve tried using the controller—like gun shooting, sword wielding, and using a ‘Gravity Gun’-like tool to swing boxes around—gave me a more immersive sense of connection between what I was doing and how the haptics felt on my hand. But, importantly, the new mini-games on display at GDC show how the tech can be applied in a more abstract way, which opens the doors to more gameplay possibilities that would make use of the controller’s unique haptics.

SEE ALSO
Hands-on: StrikerVR's Latest Prototype Haptic Gun Packs More Than Just Virtual Bullets

In November, Tactical Haptics announced that they’d raised $2.2 million to finalize a development kit of the Reactive Grip controller, and now the company has begun soliciting developer interest for dev kits. The company suggests reaching out by email to info@tacticalhaptics.com for more details about development kits.

The post Hands-on: Reactive Grip Haptic Controller Prototype with Vive Tracker, Touch, and Custom SteamVR Tracking appeared first on Road to VR.

Hands-on: Go Touch VR’s Haptic Feedback is So Simple You’ll Wonder Why You Didn’t Think of it First


Sometimes, the simplest solutions are also the smartest. Go Touch VR’s approach to VR haptics achieves surprising effectiveness out of small, simple haptic devices that provide stimulation to the end of your fingers.

Call it “obvious,” but this is the first time I’ve seen Go Touch VR’s approach to VR haptics, which provides nothing more than a variable force against the top of your fingertip using a flat piece of plastic that moves back and forth with a little motor. Simple, and yet surprisingly compelling. The sensation is much like what you feel when you press your finger against a flat surface like a desk.

Old-school ERM rumble (like you’ll find in today’s gamepads) and more modern linear-actuator rumble (like you’ll find in VR motion controllers) both offer vibration as an added dimension of feedback on top of visual and audio cues. Sometimes that rumble can be interpreted as direct feedback (i.e. shooting a gun really does make your hand vibrate), but often the sensation is more abstract, like feeling a rumble when you press a button; pressing a button doesn’t cause your hand to ‘rumble’ in real life, so the rumble in this case is abstract rather than direct (i.e. it requires a level of interpretation from your brain to connect the information being conveyed with the sensation).

And while rumble is widely applicable for that abstract approach, it seems best suited for shooting games if you want to make use of the more immersive direct approach. And yet in VR we find lots of experiences where you aren’t shooting, but are instead grabbing, touching, and manipulating objects in VR which wouldn’t vibrate in real life, making it difficult to use rumble to convey meaningful, direct feedback.

Photo by Road to VR

It’s that grabbing, touching, and manipulation where Go Touch VR’s ‘VR Touch‘ haptics hopes to excel. Based on what CEO Eric Vezzoli says is a ‘Real Contact Sensation’ haptic approach, VR Touch is a simple, compact device which straps to the end of your fingers and provides nothing more than a plastic pad which can exert varying levels of force against the top of your fingertip.

That force can create a surprisingly compelling sensation of touching and grabbing objects with your fingers. Rather than abstract rumble, VR Touch gives the illusion of objects pushing back against your fingers directly.
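As a rough mental model (my own illustration, not Go Touch VR’s actual control scheme), this kind of ‘real contact’ feedback can be thought of as a spring: the deeper your virtual fingertip sinks into an object, the harder the pad presses back, up to the actuator’s limit.

```python
def fingertip_force(penetration_mm: float, stiffness_n_per_mm: float = 0.4,
                    max_force_n: float = 1.5) -> float:
    """Map virtual penetration depth to pad force (hypothetical spring model).

    All constants here are illustrative; the real device's force curve is
    not publicly documented.
    """
    if penetration_mm <= 0:
        return 0.0  # no contact: the pad stays retracted
    # Linear spring response, clamped at the actuator's maximum output
    return min(stiffness_n_per_mm * penetration_mm, max_force_n)
```

Under this model a shallow touch yields a light press while a deep grab saturates the actuator, which is plausibly enough to convey ‘solid contact’ even without true force feedback.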

SEE ALSO
Hands-on: Reactive Grip Haptic Controller Prototype with Vive Tracker, Touch, and Custom SteamVR Tracking

For non-controller VR input solutions like hand tracking, VR Touch fills a significant need: informing the user when they have actually initiated a ‘grab’ of a virtual object. Having something other than your own fingers push back against your fingertips as an indication of contact turns out to be far more immersive than the ‘air grab’, where you make a grabbing gesture with your hand but have no idea whether you’re making the ‘correct’ contact with the virtual object, because there’s no real object providing feedback to your fingers. This issue presently plagues non-controller VR input, and it’s one that VR Touch is poised to solve.

Photo by Road to VR

Demonstrating VR Touch haptics at SVVR 2017 this week, Go Touch VR showed the device in action using an Oculus Rift with attached Leap Motion for hand tracking. They placed three of the VR Touch units across my thumb, index, and middle fingers, secured with a small elastic band with velcro.

Through the series of demos, I found that the VR Touch haptics are great for things like pressing buttons and poking & grabbing objects.

Again, the key is providing useful feedback to indicate that your virtual fingers are interacting with the virtual objects. But it isn’t just useful; the sensation is a good stand-in for the forces you expect to feel and the places you expect to feel them. The direct nature of the feedback clicks instantly with your brain which expects to feel a force specifically against your fingertip whenever you touch something.

Among the demos I tried was an abstract usage of the feedback which attempted to convey the heat coming off of a small fire when I placed my hand over top of it. And while the feedback was useful from an informational standpoint (to tell me perhaps that the fire is dangerous), as you might expect, this use of the haptics was much less convincing because fire doesn’t actually push back against your hand.

I also tried using just one VR Touch unit instead of three, though I found that three was far more immersive.

SEE ALSO
Tricking the Brain is the Only Way to Achieve a Total Haptics Solution

For how small the VR Touch device is, I was actually surprised how much force it can apply; it easily provides enough force in its current state to emulate the sensations as you’d expect them when touching and holding small objects.

That’s not to say the device is ready for market, however. The prototype VR Touch units I saw were 3D printed and hand-built. Still, the team says the units can already last for two hours on a single charge (and my guess is that there’s more progress to be made there as the device matures). After about 10 minutes of use, the elastic band securing the units to my fingers caused a reduction in circulation which I could easily feel once I took them off. CEO Eric Vezzoli tells me that the final model will fix this by using materials which provide greater friction between the contact points of the device along your finger, allowing it to rely less on the elastic band to keep the device in place (indeed, the current 3D printed plastic was very smooth and offered little friction).

He also says that the final VR Touch form-factor is expected to become significantly more compact, and will smartly include a few physical controls on the device as well, like buttons, to aid in interaction.

Each unit is also planned to include its own IMU which can be fused with other tracking solutions to enhance the finger-level tracking necessary for VR Touch to work effectively. And while I saw VR Touch demoed using Leap Motion’s hand tracking, Vezzoli says the device can work across a number of tracking technologies, including integration into glove-based systems.
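Fusing a fast-but-drifting IMU with a slower absolute tracker (like optical hand tracking) is commonly done with a complementary filter. Here’s a minimal sketch of that general technique, not Go Touch VR’s actual implementation:

```python
def fuse_orientation(prev_deg: float, gyro_deg_per_s: float, dt_s: float,
                     optical_deg: float, alpha: float = 0.98) -> float:
    """One complementary-filter step for a single joint angle.

    Dead-reckon from the gyro for responsiveness, then nudge the result
    toward the optical estimate so gyro drift can't accumulate.
    All parameter values here are illustrative.
    """
    imu_deg = prev_deg + gyro_deg_per_s * dt_s  # fast, but accumulates error
    return alpha * imu_deg + (1.0 - alpha) * optical_deg
```

The blend factor trades latency against drift correction: values near 1.0 follow the IMU almost exclusively while still pulling the estimate back toward the absolute reference over time.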

Go Touch VR is currently soliciting interest in dev kit pre-orders of the device and is also actively fundraising to continue growing the company.

The post Hands-on: Go Touch VR’s Haptic Feedback is So Simple You’ll Wonder Why You Didn’t Think of it First appeared first on Road to VR.

TouchSense Force Plugin Aims to Help VR Game Devs Make Better Haptics, Faster


Founded in 1993, Immersion Corp. designs and licenses haptic technology that’s come to be used in gamepads you’re well familiar with. Now the company is developing a new haptic programming system which aims to help game developers make better haptics effects for their games, faster.

The latest generation of VR controllers use more advanced haptics than the basic rumble that you find in today’s gamepads. But if you’ve played much VR, you’ve probably found that the capabilities of these haptics have gone largely underutilized by a wide swath of today’s VR games.

Immersion Corp. says this is because programming haptic effects is hard, and today it generally involves code-level input of values like amplitude, frequency, and time signatures in order to trigger the haptics inside of a controller. Now the company says they’ve made a better way, thanks to the TouchSense Force plugin (and API) which makes the creation of haptic effects into a much more intuitive visual process.

SEE ALSO
Hands-on: StrikerVR's Latest Prototype Haptic Gun Packs More Than Just Virtual Bullets

Rather than coding specific timing, amplitude, and frequency values, TouchSense Force (launching initially for UE4) creates a ‘clip’-based timeline interface which will be instantly familiar to anyone who has edited audio or video files. The timeline allows developers to pull an animation into the system, very easily design haptic effects that are finely tuned to the animation, and play those effects back with the animation on the fly for testing and tweaking.


So instead of a reloading gun animation just causing a simple rumble for a second or two, a developer using TouchSense Force could create a complex series of haptic effect clips that closely match every part of the animation for added realism.
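To illustrate the clip idea (a hypothetical data model of my own, not Immersion’s actual API), a timeline can be reduced to a list of timed clips that are sampled as the animation plays:

```python
from dataclasses import dataclass

@dataclass
class HapticClip:
    start_s: float      # offset into the animation, in seconds
    duration_s: float
    amplitude: float    # normalized drive strength, 0.0-1.0
    frequency_hz: float

def sample_amplitude(clips, t: float) -> float:
    """Return the strongest clip amplitude active at time t (0.0 if none)."""
    active = [c.amplitude for c in clips
              if c.start_s <= t < c.start_s + c.duration_s]
    return max(active, default=0.0)

# A reload animation might layer a sharp 'click' over a low 'thud'
reload_fx = [HapticClip(0.00, 0.10, 0.9, 160.0),   # magazine release click
             HapticClip(0.40, 0.30, 0.5, 60.0)]    # magazine seating thud
```

The timeline editor described above would essentially let developers author lists like `reload_fx` visually instead of hand-tuning the numbers in code.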

Immersion Corp was demonstrating TouchSense Force at SVVR 2017 this week where they were showing an animation of a robotic glove enclosing the hand of a VR user (definitely not inspired by Iron Man’s suit-up scenes), which had a lot of intricate detail and moving parts.

In something like 10 minutes, according to the company, they were able to use the TouchSense Force plugin to design a series of varied effects which carefully synced to the activity on the glove; something which traditionally could have taken hours of careful tweaking. Trying the result for myself using a Rift headset and Touch controller, it was indeed very impressive, far beyond the level of haptic detail I’ve seen from any VR game to date.

Actually creating these effects with the plugin is essentially code-free. I have a basic understanding of audio editing and waveforms, and I very quickly understood the process of creating each effect; I’m confident I could create my own using the plugin, which is pretty cool considering that I have no game development experience. That level of intuitiveness means creating such effects is easier and faster, and gives a huge amount of control to developers. The hope is that this will open the door to developers bringing much more attention to detail to their use of haptics in VR games.

And there are a few other cool functions that Immersion Corp. is building into this tool. For one, if the animation ends up getting changed, the haptic effects can change with it automatically. This works by associating specific haptic clips with specific moments in the animation using UE4’s Notifies animation system. Because the animation and the haptic clips are linked, changing the animation will also change the playback timing of the haptic clips, which means developers can tweak the haptics and the animation independently without needing to repeat their work if they decide to make a change after the fact.
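The retiming behavior can be pictured as haptic clips anchored to named notify events rather than to absolute timestamps; when the animation is retimed, only the notify lookup changes. This is illustrative Python of the general idea, not UE4’s actual Notifies API:

```python
def resolve_clip_starts(notify_times: dict, bindings: dict) -> dict:
    """Compute each haptic clip's start time from its animation notify.

    notify_times: notify name -> time in seconds (comes from the animation)
    bindings:     clip name -> (notify name, extra offset in seconds)

    Retiming the animation only changes notify_times; the bindings, and
    therefore the authored haptics, are reused unchanged.
    """
    return {clip: notify_times[event] + offset
            for clip, (event, offset) in bindings.items()}

# Hypothetical reload animation: clips follow the notifies wherever they move
bindings = {"click": ("mag_out", 0.0), "thud": ("mag_in", 0.05)}
starts = resolve_clip_starts({"mag_out": 0.2, "mag_in": 0.8}, bindings)
```

If the animator later stretches the reload so `mag_in` fires at 1.2 s instead of 0.8 s, re-running the resolver moves the ‘thud’ along with it and no haptic work is repeated.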

SEE ALSO
Microsoft Research Demonstrates VR Controller Prototypes With Unique Haptic Technology

The plugin also offers cross-platform support, so that developers can author their haptic effects once, and have those effects play back as closely as possible on other controllers (which could even have different haptic technologies in them) without re-authoring for that controller’s own haptic API.
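One common way to achieve that kind of portability (a generic pattern; Immersion hasn’t published how TouchSense Force does it) is to author effects in normalized units and translate them per device at playback time:

```python
# Hypothetical per-device drive ranges; real controller APIs differ.
DEVICE_MAX_DRIVE = {"lra": 255,   # linear resonant actuator
                    "erm": 100}   # eccentric rotating mass motor

def to_device_drive(amplitude: float, device: str) -> int:
    """Translate a normalized 0.0-1.0 amplitude into a device drive value."""
    if device not in DEVICE_MAX_DRIVE:
        raise ValueError(f"unknown haptic device: {device}")
    clamped = min(max(amplitude, 0.0), 1.0)  # keep authored values in range
    return round(clamped * DEVICE_MAX_DRIVE[device])
```

The authored effect stays in one canonical form, and only this last translation step knows anything about the target controller’s hardware.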

TouchSense Force is now available to select developers in early access as a UE4 plugin (you can sign up for access here). The pricing model is presently unannounced. The company says they also plan to make available a Unity plugin in Q3, and will release an API to allow integration of the same feature set into custom game engines. Presently the system supports Oculus Touch, Nintendo Switch, and unspecified “TouchSense Force-compatible hardware,” though we imagine the company is working to get Vive controllers integrated ASAP.

The post TouchSense Force Plugin Aims to Help VR Game Devs Make Better Haptics, Faster appeared first on Road to VR.

VRgluv Force-feedback Glove Blasts Past $100K Kickstarter Goal in 56 Hours


VRgluv has blown past its $100,000 funding goal in 56 hours on their Kickstarter page, with 27 days still to go. The product is described as the “first affordable force feedback gloves” that feature “total hand tracking, full force feedback, and pressure sensitivity”.

Update (4/28/17, 12:39PM PT): VRgluv has now exceeded its $100,000 Kickstarter goal, presently sitting just over $113,000. The company has announced in an update a $250,000 stretch goal which will add a replaceable battery to the glove. The company has also opened a new $370 pricing tier limited to 400 backers.

Original Article (4/27/17): Going live on March 31st, VRgluv’s website revealed haptic gloves compatible with both HTC Vive and Oculus Rift hardware, said to be comfortable, functional, and affordable (despite looking rather clunky). The appealing (relatively speaking) $300 Super Early Bird price point is the likely reason for the rapid influx of early backers; limited to 100 backers, that tier is already sold out. The next tier at $350 is limited to 200 sets, followed by a ‘Kickstarter special’ price at $400, with the final retail price expected to be $580.


As shown in the Kickstarter video, different adapters allow for Oculus Touch controllers, HTC Vive controllers or Vive Trackers to clip to the sides of the gloves to perform the spatial tracking duties, with the gloves containing proprietary technology to determine finger positions and grip strength. Ideally, Vive Trackers would be used—being the least-bulky attachment to what is already a chunky pair of gloves—although VRgluv describes the units as ‘lightweight’, and the adapters are said to be carefully designed to hold each tracking solution in the most balanced position.

Applying haptic glove support to VR applications involves incorporating the VRgluv SDK, and the team recently created a short video to showcase a few examples of games already compatible.

VRgluv is one of several devices in development that provide a haptic feedback solution for hand interaction in VR. Others include the EXOS, the Dexmo exoskeleton, the temperature-changing Senso and PowerClaw, the Gloveone and Avatar VR from NeuroDigital Technologies, and more. As the haptics challenge is being approached from so many different angles, it’s difficult to predict if one product will rise to the top; this area of VR development is likely to remain experimental and niche, although VRgluv’s price is impressive considering the low volume, wireless technology, rechargeable batteries, and the likely high number of mechanical components involved.

VRgluv is aiming to deliver the first sets to customers in December 2017.

The post VRgluv Force-feedback Glove Blasts Past $100K Kickstarter Goal in 56 Hours appeared first on Road to VR.
