I have the same setup for imageAnalysis and imageCapture, but I get different values from getRotationDegrees().
ImageAnalysis imageAnalysis = new ImageAnalysis.Builder()
        .setTargetResolution(new Size(1200, 720))
        .setBackpressureStrategy(ImageAnalysis.STRATEGY_KEEP_ONLY_LATEST)
        .setOutputImageFormat(ImageAnalysis.OUTPUT_IMAGE_FORMAT_YUV_420_888)
        .setTargetRotation(rotation)
        .build();
imageAnalysis.setAnalyzer(executor, image -> {
    imageRotationDegrees = image.getImageInfo().getRotationDegrees();
    // ... analyze the frame ...
    image.close();
});
and
// Create image capture
imageCapture = new ImageCapture.Builder()
        .setTargetRotation(rotation)
        .build();

public void takePicture() {
    imageCapture.takePicture(getExecutor(), new ImageCapture.OnImageCapturedCallback() {
        @Override
        public void onCaptureSuccess(@NonNull ImageProxy image) {
            imageRotationDegrees = image.getImageInfo().getRotationDegrees();
            // ...
            image.close();
        }
    });
}
On the emulator:
imageRotationDegrees = 90 in ImageAnalysis;
imageRotationDegrees = 90 in ImageCapture;
On a physical device:
imageRotationDegrees = 90 in ImageAnalysis;
imageRotationDegrees = 0 in ImageCapture;
I can't understand why the rotation degrees differ between the two use cases.
It's because the ImageCapture output is rotated by CameraX, while the ImageAnalysis output is not.
The rotationDegrees value tells the app how much it should rotate the image to get it "upright". The original sensor image usually needs rotation, which is why the ImageAnalysis output comes with a non-zero rotation value.
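In practice, the safe pattern is the same for both use cases: always read getRotationDegrees() and apply it before using the pixels, so the code behaves correctly whether the value is 0 or 90. Here is a minimal sketch for a JPEG ImageProxy delivered by ImageCapture (the helper name toUprightBitmap and the decode step are illustrative, not part of the CameraX API):

import android.graphics.Bitmap;
import android.graphics.BitmapFactory;
import android.graphics.Matrix;
import androidx.camera.core.ImageProxy;
import java.nio.ByteBuffer;

// Hypothetical helper: decodes a JPEG ImageProxy and applies whatever
// rotation CameraX reports, so the result is upright on every device.
static Bitmap toUprightBitmap(ImageProxy image) {
    // For JPEG output, plane 0 holds the encoded bytes.
    ByteBuffer buffer = image.getPlanes()[0].getBuffer();
    byte[] bytes = new byte[buffer.remaining()];
    buffer.get(bytes);
    Bitmap bitmap = BitmapFactory.decodeByteArray(bytes, 0, bytes.length);

    int rotationDegrees = image.getImageInfo().getRotationDegrees();
    if (rotationDegrees == 0) {
        return bitmap; // already upright, nothing to do
    }
    Matrix matrix = new Matrix();
    matrix.postRotate(rotationDegrees);
    return Bitmap.createBitmap(
            bitmap, 0, 0, bitmap.getWidth(), bitmap.getHeight(), matrix, true);
}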
For ImageCapture, some OEMs choose to rotate the output in the hardware layer for efficiency. If the image is already rotated, there is no need for the app to rotate it again, so the reported rotation degrees value becomes 0.
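This is also why applying getRotationDegrees() unconditionally resolves the difference in the question: on the emulator the value is 90 and the bitmap gets rotated, on the device it is 0 and the rotation step is a no-op, so the result is upright either way. A usage sketch inside the capture callback from the question, reusing the illustrative toUprightBitmap helper above:

@Override
public void onCaptureSuccess(@NonNull ImageProxy image) {
    // Works on both the emulator (rotationDegrees = 90)
    // and the device (rotationDegrees = 0).
    Bitmap upright = toUprightBitmap(image);
    image.close();
    // ... use `upright` ...
}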