I'm creating a panorama view that lets the user look around in a spherical image by rotating their smartphone. I used Rajawali's skybox for this, together with the TYPE_ROTATION_VECTOR sensor.
I got it working, but only while looking forward: the mapping is literally based on my absolute rotation (yaw).
This is the behavior: facing the original direction, everything tracks correctly, but once I yaw away, pitching the device appears to roll the view.
Now I do have a hunch what is going on: the camera object seems to keep facing the same direction, even though the rendered view suggests otherwise. Its local axes never rotate along with the yaw, so what looks like rolling is actually still pitching around the original, un-rotated axis. I'm comparing it to being in an airplane and looking around: my head turns, but the plane's axes stay where they were.
What do I need to do to solve this?
I have a feeling I'm going to have to rotate the camera with lookAt(), but I'm not sure how.
public class SkyboxFragment extends RajawaliFragment implements SensorEventListener {

    public static final String TAG = "SkyBoxFragment";

    private SensorManager mSensorManager;
    private float[] orientationVals = new float[3];
    private float[] mRotationMatrix = new float[16];
    private Sensor mRotVectSensor;

    @Override
    public View onCreateView(LayoutInflater inflater, ViewGroup container,
                             Bundle savedInstanceState) {
        super.onCreateView(inflater, container, savedInstanceState);
        LinearLayout ll = new LinearLayout(getActivity());
        ll.setOrientation(LinearLayout.VERTICAL);
        ll.setGravity(Gravity.CENTER_HORIZONTAL | Gravity.TOP);

        mSensorManager = (SensorManager) getActivity().getSystemService(
                Context.SENSOR_SERVICE);
        mRotVectSensor = mSensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR);

        mLayout.addView(ll);
        // Sample the rotation vector sensor every 10,000 microseconds.
        mSensorManager.registerListener(this, mRotVectSensor, 10000);
        return mLayout;
    }

    @Override
    public AExampleRenderer createRenderer() {
        mRenderer = new SkyboxRenderer(getActivity());
        return ((SkyboxRenderer) mRenderer);
    }

    @Override
    public void onClick(View v) {
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
            SensorManager.getRotationMatrixFromVector(mRotationMatrix, event.values);
            SensorManager.remapCoordinateSystem(mRotationMatrix,
                    SensorManager.AXIS_X, SensorManager.AXIS_Z, mRotationMatrix);
            SensorManager.getOrientation(mRotationMatrix, orientationVals);
            // Convert the azimuth/pitch/roll radians to degrees; pitch and roll are negated.
            orientationVals[0] = (float) Math.toDegrees(orientationVals[0]);
            orientationVals[1] = (float) Math.toDegrees(orientationVals[1]) * -1;
            orientationVals[2] = (float) Math.toDegrees(orientationVals[2]) * -1;
            //Log.d(TAG, "YAW:" + orientationVals[0] + " PITCH:" + orientationVals[1] + " ROLL:" + orientationVals[2]);
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
    }

    private final class SkyboxRenderer extends AExampleRenderer implements View.OnClickListener {

        private final Vector3 mAccValues;
        boolean odd = true;

        public SkyboxRenderer(Context context) {
            super(context);
            mAccValues = new Vector3();
        }

        @Override
        protected void initScene() {
            getCurrentCamera().setFarPlane(1000);
            /**
             * Skybox images by Emil Persson, aka Humus. http://www.humus.name [email protected]
             */
            try {
                getCurrentScene().setSkybox(R.drawable.posx, R.drawable.negx,
                        R.drawable.posy, R.drawable.negy, R.drawable.posz, R.drawable.negz);
            } catch (ATexture.TextureException e) {
                e.printStackTrace();
            }
        }

        @Override
        protected void onRender(long ellapsedRealtime, double deltaTime) {
            super.onRender(ellapsedRealtime, deltaTime);
            // Apply the latest sensor angles to the camera as Euler rotations.
            getCurrentCamera().setRotation(orientationVals[2], orientationVals[0], orientationVals[1]);
        }

        @Override
        public void onClick(View v) {
            try {
                if (odd) {
                    /**
                     * Skybox images by Emil Persson, aka Humus. http://www.humus.name [email protected]
                     */
                    getCurrentScene().updateSkybox(R.drawable.posx2, R.drawable.negx2,
                            R.drawable.posy2, R.drawable.negy2, R.drawable.posz2, R.drawable.negz2);
                } else {
                    /**
                     * Skybox images by Emil Persson, aka Humus. http://www.humus.name [email protected]
                     */
                    getCurrentScene().updateSkybox(R.drawable.posx, R.drawable.negx,
                            R.drawable.posy, R.drawable.negy, R.drawable.posz, R.drawable.negz);
                }
            } catch (Exception e) {
                e.printStackTrace();
            } finally {
                odd = !odd;
            }
        }

        public void setAccelerometerValues(float x, float y, float z) {
            mAccValues.setAll(-x, -y, -z);
        }
    }
}
You have two problems. The first is the one you are describing, but there is another: the TYPE_ROTATION_VECTOR sensor is affected by nearby magnets, such as those found in phone cases.
A solution could be to use a combination of the accelerometer and the gyroscope. Luckily, the Google Cardboard SDK has already abstracted this away. You can track the current rotation by instantiating a com.google.vrtoolkit.cardboard.sensors.HeadTracker using HeadTracker.createFromContext(this.getActivity()) and calling startTracking() on it; a minimal setup sketch follows below.
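For illustration, this is roughly how that wiring could look inside your fragment, assuming the Cardboard SDK (vrtoolkit) is on the classpath. The headTracker field name and the lifecycle placement are my own choices, not something the SDK mandates:

// field: com.google.vrtoolkit.cardboard.sensors.HeadTracker
private HeadTracker headTracker;

@Override
public View onCreateView(LayoutInflater inflater, ViewGroup container,
                         Bundle savedInstanceState) {
    super.onCreateView(inflater, container, savedInstanceState);
    // Replaces the SensorManager / TYPE_ROTATION_VECTOR registration entirely.
    headTracker = HeadTracker.createFromContext(this.getActivity());
    headTracker.startTracking();
    return mLayout;
}

@Override
public void onDestroyView() {
    super.onDestroyView();
    headTracker.stopTracking(); // stop consuming sensors when the view goes away
}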
Now you don't need onSensorChanged anymore. Instead, in your onRender, you can do this:
float[] R = new float[16];
headTracker.getLastHeadView(R, 0);
to get the rotation matrix. This solves your unstated problem of magnetic fields.
The easiest way to use this rotation matrix to point the camera in the right direction is to convert it to an org.rajawali3d.math.Quaternion and then call getCurrentCamera().setCameraOrientation(quaternion).
The conversion from a float[16] to a quaternion can be difficult to get right, but once again the Google Cardboard SDK has done it for you. In this case, it is in the source code of an old class that is no longer used: HeadTransform. You can easily adapt that code to return new Quaternion(w, x, y, z) instead.
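As a sketch of what such an adaptation could look like, here is the standard trace-based matrix-to-quaternion conversion, following the same branching scheme as HeadTransform.getQuaternion; it assumes m is the float[16] filled by getLastHeadView:

private Quaternion getQuaternion(float[] m) {
    // Branch on the largest diagonal term for numerical stability,
    // mirroring HeadTransform.getQuaternion in the Cardboard SDK.
    float t = m[0] + m[5] + m[10];
    float w, x, y, z;
    if (t >= 0) {
        float s = (float) Math.sqrt(t + 1);
        w = 0.5f * s;
        s = 0.5f / s;
        x = (m[9] - m[6]) * s;
        y = (m[2] - m[8]) * s;
        z = (m[4] - m[1]) * s;
    } else if (m[0] > m[5] && m[0] > m[10]) {
        float s = (float) Math.sqrt(1 + m[0] - m[5] - m[10]);
        x = 0.5f * s;
        s = 0.5f / s;
        y = (m[4] + m[1]) * s;
        z = (m[2] + m[8]) * s;
        w = (m[9] - m[6]) * s;
    } else if (m[5] > m[10]) {
        float s = (float) Math.sqrt(1 + m[5] - m[0] - m[10]);
        y = 0.5f * s;
        s = 0.5f / s;
        x = (m[4] + m[1]) * s;
        z = (m[9] + m[6]) * s;
        w = (m[2] - m[8]) * s;
    } else {
        float s = (float) Math.sqrt(1 + m[10] - m[0] - m[5]);
        z = 0.5f * s;
        s = 0.5f / s;
        x = (m[2] + m[8]) * s;
        y = (m[9] + m[6]) * s;
        w = (m[4] - m[1]) * s;
    }
    // Rajawali's Quaternion constructor takes (w, x, y, z).
    return new Quaternion(w, x, y, z);
}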
Now, this will result in the same issues your current code would have if you did not multiply orientationVals[1] and orientationVals[2] by -1. That problem, however, is easily solved by inverting the rotation matrix. That results in the following code in onRender (assuming getQuaternion(R) returns an org.rajawali3d.math.Quaternion):
@Override
protected void onRender(long ellapsedRealtime, double deltaTime) {
    super.onRender(ellapsedRealtime, deltaTime);
    float[] R = new float[16];
    headTracker.getLastHeadView(R, 0);         // latest head rotation from the Cardboard tracker
    android.opengl.Matrix.invertM(R, 0, R, 0); // invert the head view to get the camera orientation
    Quaternion q = getQuaternion(R);
    getCurrentCamera().setCameraOrientation(q);
}
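One small thing worth noting: this allocates a new float[16] on every frame. If garbage-collection pauses ever show up in the render loop, the array can be hoisted into a field, much like your existing code already does with mRotationMatrix.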