ios · objective-c · core-motion · motion · device-orientation

How to use CoreMotion for getting device orientation in space


I've been thinking and searching about this and can't find a good direction to look in... I am looking to develop an algorithm to pan around a 360° (sphere-like) image using the device motion.

So if the user points the device straight ahead of him, he sees a defined origin point of the image. As he moves the device around him, the panoramic image pans accordingly.

Any ideas or sources I can look into?

Thanks and good luck to everyone with Swift :)


Solution

  • I see two easy ways to implement this without too much math hassle:

    1. Use Euler angles, i.e. the roll property of CMAttitude. Define a mapping between the width of the image and the measured angle, and take care of the singularity at 180° / -180°. Drawbacks of this approach:
      • Possible danger of Gimbal Lock when users start to move their devices in a chaotic way.
      • The same applies to extensibility regarding a full 3D view.
    2. Use the magnetic field from CMDeviceMotion, which is robust against Gimbal Lock. The magnetometer is a little bit slower than the accelerometer / gyro, but I think CoreMotion's sensor-fusion algorithm will provide a reasonable estimate, so this shouldn't be an issue. Drawbacks here:
      • The magnetic field is not always available, or tends to be slightly imprecise.
      • Extending it to 3D view might be a hassle.
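    The angle-to-image mapping from the first approach could be sketched roughly like this; the function name, the image width, and the choice of which attitude angle (roll vs. yaw, depending on how the device is held) to feed in are illustrative assumptions, not part of CoreMotion:

    ```swift
    import Foundation

    /// Maps an attitude angle in radians (-π ... π], e.g. CMAttitude.roll,
    /// to a horizontal pixel offset in a panorama image of the given width,
    /// taking care of the wrap-around singularity at 180° / -180°.
    func panoramaOffset(forAngle angle: Double, imageWidth: Double) -> Double {
        // Shift negative angles into [0, 2π) so the jump from +π to -π
        // becomes a continuous wrap in image coordinates.
        let normalized = angle < 0 ? angle + 2 * .pi : angle
        return normalized / (2 * .pi) * imageWidth
    }

    // In a CMMotionManager device-motion handler you would feed
    // motion.attitude.roll (or yaw) in here and assign the result to
    // something like your scroll view's contentOffset.x.
    ```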

    Both approaches should be pretty easy to implement, and thus I would start with one of them. If you then want a more sophisticated solution, you will need to dive a little bit deeper into the maths. In that case a possible solution is to use the current device normal (see for example Finding normal vector to iOS device), project it onto the earth's surface plane, and take the angles' delta for the cylindrical panorama.

    The sphere projection is even easier in this case, as you can use the normal vector directly.
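    A minimal sketch of that projection and delta step, assuming the normal vector is already expressed in a reference frame whose z axis is vertical (e.g. one of CoreMotion's ...ZVertical reference frames), so that projecting onto the ground plane is just dropping the z component; both function names are illustrative:

    ```swift
    import Foundation

    /// Projects a device-normal vector onto the horizontal (earth-surface)
    /// plane and returns its heading as a continuous angle in (-π, π].
    func headingAngle(normalX: Double, normalY: Double) -> Double {
        // atan2 handles all quadrants and avoids Euler-angle singularities.
        return atan2(normalY, normalX)
    }

    /// The delta between two successive headings drives the cylindrical
    /// panorama; wrap it into (-π, π] so panning stays continuous when the
    /// heading crosses the ±π boundary.
    func headingDelta(from previous: Double, to current: Double) -> Double {
        var delta = current - previous
        if delta > .pi { delta -= 2 * .pi }
        if delta <= -.pi { delta += 2 * .pi }
        return delta
    }
    ```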