I am getting much better results with a UIMotionEffect subclass than by listening to CMMotionManager. Using CoreMotion usually gives me inaccuracies, and many people across the web complain about the same thing. I cannot find any explanation, though, of how the two differ: which sensors each uses, or what processing is applied to the sensor output that would explain the different results I am experiencing.
UIMotionEffect by Apple: https://developer.apple.com/documentation/uikit/uimotioneffect

CMMotionManager by Apple: https://developer.apple.com/documentation/coremotion/cmmotionmanager
Now, while it's obvious that CMMotionManager offers much more functionality (gyroscope, acceleration, and more), I am mostly asking about roll/pitch, which I can either get as a UIOffset by overriding UIMotionEffect or read from CMDeviceMotion.attitude.
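To be concrete, this is the override I mean; a minimal sketch, where the class name and the 20 pt multiplier are mine for illustration only:

```swift
import UIKit

// Subclassing UIMotionEffect exposes the viewer offset that UIKit computes:
// roll/pitch already normalized into a UIOffset in roughly -1...1.
class TiltEffect: UIMotionEffect {
    override func keyPathsAndRelativeValues(forViewerOffset viewerOffset: UIOffset)
        -> [String: Any]? {
        // horizontal tracks roll, vertical tracks pitch
        return ["center.x": viewerOffset.horizontal * 20,
                "center.y": viewerOffset.vertical * 20]
    }
}
```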
This is probably not the right place to ask this question; however, I'll try to answer you.
CoreMotion is an API for accessing raw values from the device's sensors (the accelerometer, the gyroscope, the magnetometer) as well as higher-level, fused indicators such as device attitude. It can be configured with different update profiles, from a very high refresh rate down to an economical, power-saving one.
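For example, here is a minimal sketch of reading roll/pitch from CMDeviceMotion.attitude; the 60 Hz interval and the reference frame are illustrative choices, not requirements:

```swift
import CoreMotion

// Keep the manager alive (e.g. as a stored property); updates stop
// when it is deallocated.
let manager = CMMotionManager()

if manager.isDeviceMotionAvailable {
    // Higher refresh rates are more responsive but cost more power.
    manager.deviceMotionUpdateInterval = 1.0 / 60.0
    manager.startDeviceMotionUpdates(using: .xArbitraryZVertical,
                                     to: .main) { motion, error in
        guard let attitude = motion?.attitude else { return }
        // Sensor-fused roll/pitch, in radians.
        print("roll:", attitude.roll, "pitch:", attitude.pitch)
    }
}
```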
With UIMotionEffect you can move/animate your views along with horizontal and vertical rotations of the device. It's mostly used for creating parallax effects, like the background moving on the springboard (the home screen). To use it, you either subclass it or use the ready-made UIInterpolatingMotionEffect, as sketched below.
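A minimal parallax sketch along those lines, assuming a ±20 pt travel that is purely illustrative:

```swift
import UIKit

// Out-of-the-box parallax: UIKit interpolates the view's center.x/center.y
// between the given bounds as the device tilts.
func addParallax(to view: UIView, amount: CGFloat = 20) {
    let x = UIInterpolatingMotionEffect(keyPath: "center.x",
                                        type: .tiltAlongHorizontalAxis)
    x.minimumRelativeValue = -amount
    x.maximumRelativeValue = amount

    let y = UIInterpolatingMotionEffect(keyPath: "center.y",
                                        type: .tiltAlongVerticalAxis)
    y.minimumRelativeValue = -amount
    y.maximumRelativeValue = amount

    let group = UIMotionEffectGroup()
    group.motionEffects = [x, y]
    view.addMotionEffect(group)
}
```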
UIMotionEffect is made for quick, subtle effects on your UI and can be used out of the box with a small energy footprint, whereas CoreMotion is better suited to intensive gyroscope usage, like games with motion controls.
Oh, and UIMotionEffect permanently re-centers the view: if you rotate your phone and then hold it still in the new position, the view gently slides back to the center to match that position. Not very usable in games.