Object grabbing in Virtual Reality

Post Reply
luisoncpp
Posts: 3
Joined: Thu Mar 31, 2016 7:00 pm

Object grabbing in Virtual Reality

Post by luisoncpp »

Hi everyone, I'm working on a virtual reality application that (like many others) has a mechanic for grabbing objects in the environment and moving them with motion controls.
The problem is that I can't find a good way to move the objects. If I move them by setting their positions and rotations directly, the objects don't collide with the environment while I'm grabbing them. If I set their velocities instead, I notice a delay between the movement of my hand and the movement of the object, which reduces the immersion a lot (and I also have a lot of problems with the collisions).
Is there any way to change the position and rotation of an object while still detecting collisions, so that the rest of the objects in the environment keep reacting to it as normal?
Do you have a better idea of how to achieve this task?
User avatar
drleviathan
Posts: 849
Joined: Tue Sep 30, 2014 6:03 pm
Location: San Francisco

Re: Object grabbing in Virtual Reality

Post by drleviathan »

Yes setting the velocity does add a little bit of lag, but it should be possible to do it such that the lag is small. Someone tested various solutions and wrote a blog about it.

A simple way to do this is to set the velocity proportional to the displacement. In code it might look something like this (dunno if this code actually compiles or works, but it should be close):

Code:

// linear
const btScalar TIMESCALE(0.1);
const btScalar CLOSE_ENOUGH_SQUARED(1.0e-6);
btVector3 deltaPosition = targetPosition - currentPosition;
if (deltaPosition.length2() > CLOSE_ENOUGH_SQUARED) {
    // velocity proportional to the remaining displacement --> exponential decay onto the target
    btVector3 decayVelocity = deltaPosition * (1.0 / TIMESCALE);
    btVector3 targetVelocity = computeTargetLinearVelocity(); // you need to implement this
    body->setLinearVelocity(targetVelocity + decayVelocity);
}

// angular
btQuaternion deltaRotation = targetRotation * currentRotation.inverse();
btScalar angle = deltaRotation.getAngle();
if (angle > SIMD_PI) {
    angle -= btScalar(2.0) * SIMD_PI; // take the short way around
}
const btScalar MIN_ANGLE(0.01);
if (btFabs(angle) > MIN_ANGLE) {
    btVector3 decayVelocity = deltaRotation.getAxis() * (angle / TIMESCALE);
    btVector3 targetVelocity = computeTargetAngularVelocity(); // you need to implement this
    body->setAngularVelocity(targetVelocity + decayVelocity);
}
Basically what this does is create a critically damped spring (linear and angular) with zero history. It will slave the object to the targetTransform (which is moving at targetVelocity) such that the delta is reduced to 1/e of what it was in about one TIMESCALE. A benefit of this method is that it is very easy to "tune" the response -- you just dial the TIMESCALE accordingly: if you want it fast, make the timescale small; if you want it slow, make it long. However, the TIMESCALE cannot go arbitrarily small -- it MUST be at least two or three times the expected simulation timeStep period, else you risk instabilities.

For best results set the body's linear and angular damping coefficients to zero --> less lag.

Why does this work?

You assume the displacement has a time evolution of an exponential curve: X(t) = X0 * exp(-t/timescale)

The velocity is the derivative of this: V(t) = (-1/timescale) * X0 * exp(-t/timescale) = (-1/timescale) * X(t)

The velocity is just a negative multiple of the displacement under the above assumption. If you always set the velocity thus then your assumption becomes true --> the displacement will decay exponentially.
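This decay argument is easy to check numerically. Below is a small standalone Python sketch (my own illustration, not Bullet code) that integrates v = -x/TIMESCALE with explicit Euler steps. The displacement drops to roughly 1/e per TIMESCALE when the timestep is small, and it blows up when the timestep exceeds twice the timescale -- which is why the TIMESCALE must stay a few times larger than the simulation step:

```python
import math

def simulate(x0, timescale, dt, steps):
    """Explicit-Euler integration of v = -x / timescale."""
    x = x0
    for _ in range(steps):
        v = -x / timescale  # velocity is a negative multiple of the displacement
        x += v * dt         # one simulation step
    return x

TIMESCALE = 0.1
DT = 1.0 / 240.0                                 # timestep much smaller than TIMESCALE
steps_per_timescale = int(round(TIMESCALE / DT))
x = simulate(1.0, TIMESCALE, DT, steps_per_timescale)
print(x)  # close to 1/e ~ 0.368

# If the timestep exceeds 2 * TIMESCALE, the per-step factor (1 - dt/timescale)
# has magnitude > 1 and the "spring" diverges instead of decaying:
bad = simulate(1.0, TIMESCALE, 0.25, 20)
print(abs(bad) > 1.0)  # True: unstable
```

The per-step update is x *= (1 - dt/timescale), so the scheme is stable only while dt < 2 * timescale; keeping the timescale two to three times the timestep leaves a comfortable margin.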

Edit: fixed a bug in my logic.
Edit: added targetVelocity offset
Last edited by drleviathan on Sat Aug 06, 2016 12:12 am, edited 1 time in total.
Dirk Gregorius
Posts: 861
Joined: Sun Jul 03, 2005 4:06 pm
Location: Kirkland, WA

Re: Object grabbing in Virtual Reality

Post by Dirk Gregorius »

This is quite a difficult problem. In a nutshell, you want zero lag when moving an object in free space without collision. In that case it is best to parent the picked object to the VR hand. Once you detect a collision you need physics, and in Rubikon I switch to a soft weld joint. This works pretty well. The collision handling becomes pretty involved now, since the user can move the picked object out of the world. I use exact CCD (translation + rotation) with substepping to handle this. Simple tunneling prevention methods like speculative contacts or inner spheres don't get the job done.

In general I am seeing that VR raises the demands for higher-quality physics quite a bit, in particular for CCD. Contact and joint softness are pretty visible in VR and also don't help with responsiveness.
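The hybrid scheme described above boils down to a small state machine: kinematic parenting while the object is free, physical attachment once it touches something. A minimal runnable Python sketch of that control flow follows -- all the names here (GrabState, GrabbedObject, has_contact) are my own hypothetical illustration, not Rubikon or Bullet API:

```python
from enum import Enum, auto

class GrabState(Enum):
    KINEMATIC = auto()  # parented to the hand: zero lag, no physics
    PHYSICAL = auto()   # soft weld joint: physics resolves the contact

class GrabbedObject:
    """Hypothetical grab controller: kinematic while free, physical on contact."""
    def __init__(self):
        self.state = GrabState.KINEMATIC

    def update(self, has_contact):
        # Free space: follow the hand exactly. Contact: hand the object over to physics.
        if has_contact and self.state is GrabState.KINEMATIC:
            self.state = GrabState.PHYSICAL      # attach the soft weld joint here
        elif not has_contact and self.state is GrabState.PHYSICAL:
            self.state = GrabState.KINEMATIC     # re-parent to the hand
        return self.state

grab = GrabbedObject()
print(grab.update(False))  # GrabState.KINEMATIC -- free space, zero lag
print(grab.update(True))   # GrabState.PHYSICAL  -- collision detected
print(grab.update(False))  # GrabState.KINEMATIC -- contact resolved
```

The real work (attaching the weld joint, the CCD sweep on each substep) happens inside the two transitions; the sketch only shows when each mode applies.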
luisoncpp
Posts: 3
Joined: Thu Mar 31, 2016 7:00 pm

Re: Object grabbing in Virtual Reality

Post by luisoncpp »

drleviathan wrote:Yes setting the velocity does add a little bit of lag, but it should be possible to do it such that the lag is small. [...]
I already tried that and the delay is very noticeable.
Dirk Gregorius wrote:This is quite a difficult problem. In a nutshell you want zero lag when moving an object in free space without collision. [...]
It's a very interesting suggestion, but something I noticed is that if I change the position of an object directly, Bullet doesn't report collisions for that object.
Is there any way to tell Bullet to keep reporting those collisions, or do I need to run the collision check manually every frame? (I would prefer not to, for performance reasons.)
User avatar
Erwin Coumans
Site Admin
Posts: 4221
Joined: Sun Jun 26, 2005 6:43 pm
Location: California, USA
Contact:

Re: Object grabbing in Virtual Reality

Post by Erwin Coumans »

The latest Bullet version comes with a Virtual Reality demo for HTC Vive and Oculus Rift (based on Valve's OpenVR) with hand controller and grabbing interaction.

See the video of the result: https://www.youtube.com/watch?v=VMJyZtHQL50

The grabbing happens similarly to mouse picking in 2D, using constraints: either a point-to-point constraint or a fixed constraint. You can control the stiffness using the maximum applied force/impulse of the constraint. This way, all interactions are consistent, including collision detection, collision response, joint limits, motors, etc. The VR pybullet environment uses btMultiBody by default, for the rigid joints needed in robotics.

You'll notice some lag in the hand controller, which can be tuned using the max force, the error reduction parameter, the time step, solver iterations, etc.
The user feedback so far confirms that this implementation using constraints is a reasonable compromise as a starting point, but it is by no means finalized.
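The effect of capping the constraint force can be sketched without pybullet. The toy 1-D model below is my own illustration (not Bullet's actual solver): each step the constraint asks for the velocity that removes a fraction (an ERP-style error reduction parameter) of the position error, but the corrective impulse is clamped to max_force * dt, so a lower max force means the grabbed body trails the hand further:

```python
def track(target, max_force, mass=1.0, dt=1.0 / 240.0, erp=0.2, steps=48):
    """Toy 1-D force-limited constraint: each step the solver wants the
    velocity that removes erp * error in one timestep, but the corrective
    impulse is clamped to max_force * dt."""
    x, v = 0.0, 0.0
    max_impulse = max_force * dt
    for _ in range(steps):
        error = target - x
        desired_v = erp * error / dt                    # ERP-style velocity target
        impulse = mass * (desired_v - v)                # impulse that reaches it
        impulse = max(-max_impulse, min(max_impulse, impulse))
        v += impulse / mass
        x += v * dt
    return x

strong = track(1.0, max_force=500.0)  # stiff grab: follows the hand closely
weak = track(1.0, max_force=5.0)      # soft grab: lags far behind
print(strong > weak)  # True
```

Dialing max_force trades responsiveness against how hard the grabbed object can push on the rest of the world, which is exactly the compromise described above.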
The nice part of the Virtual Reality demo is that it is a generic physics sandbox: you can start with an empty world, and use pybullet Python to add objects, control joint motors etc.

At some stage I hope to find time to document this properly, but until then, please try it out, try pybullet, and read these minimal instructions:
https://docs.google.com/document/d/1ac3 ... sp=sharing

The URDF and SDF formats used in VR/pybullet are very simple XML that adds physics parameters and lets you use .obj, .stl or COLLADA .dae graphics/collision representations.
See the ROS tutorials at http://wiki.ros.org/urdf/Tutorials and http://sdformat.org/spec. Although these are robotics oriented, I think they are also suitable for games.
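To give a flavor of how simple the XML is, here is a hand-written sketch of a minimal URDF for a single dynamic box (see the ROS tutorials above for the full format; the dimensions and inertia values are arbitrary):

```xml
<?xml version="1.0"?>
<robot name="simple_box">
  <link name="base_link">
    <inertial>
      <mass value="1.0"/>
      <inertia ixx="0.01" ixy="0" ixz="0" iyy="0.01" iyz="0" izz="0.01"/>
    </inertial>
    <visual>
      <geometry><box size="0.2 0.2 0.2"/></geometry>
    </visual>
    <collision>
      <geometry><box size="0.2 0.2 0.2"/></geometry>
    </collision>
  </link>
</robot>
```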

I'll upload a precompiled Windows executable + Python distribution + pybullet plugin, with the data assets, to make it really easy to try out, given an HTC Vive and/or Oculus Rift + Touch.

A few more interesting details about the Bullet VR demo:
1) physics runs in its own thread, with non-blocking synchronization with the graphics
2) pybullet Python can connect using a script like this:

Code:

import pybullet as p
p.connect(p.SHARED_MEMORY)
posX, posY, posZ = 0, 0, 1  # where to spawn the object
p.loadURDF("r2d2.urdf", posX, posY, posZ)
help(p)
p.disconnect()
3) Physics runs at 240 Hz with 100 constraint solver iterations, and everything is simulated as btMultiBody. This is mainly for robotics quality; it can be tuned toward games too.
4) Graphics runs as fast as possible, with VSYNC disabled. For the best experience use an NVIDIA GTX 1080, and don't spawn too many extra objects.

The VR demo will become better over time, and it also lets you create new scenes (it is a basic physics world editor).
benelot
Posts: 350
Joined: Sat Jul 04, 2015 10:33 am
Location: Bern, Switzerland
Contact:

Re: Object grabbing in Virtual Reality

Post by benelot »

I added this to the physics wiki as documentation on using pybullet and VR: http://bulletphysics.org/mediawiki-1.5. ... g_pybullet
Post Reply