I am writing a simple video player using ffmpeg for Android. These are the steps I followed:

1. Scale the decoded frame with sws_scale
2. Copy it into a buffer with av_image_copy_to_buffer
3. Render on a SurfaceView by copying the buffer to an ANativeWindow_Buffer
Most videos play fine, but there is an issue with videos whose resolution is lower than the window's. For example, when I play a 656x480 video on my OnePlus 7T (2206x1080), the video looks distorted. The same video plays fine on the emulator (2160x1080).
When I debugged the whole pipeline, I found that on the OP7T, after locking the ANativeWindow, ANativeWindow_Buffer.stride was set to 704 instead of 656. For all videos that play normally, the stride equals the buffer width. On the Android emulator this mismatch does not occur.
I did some trials: when I scaled the width down to 600, the stride jumped to 640 and the video was still distorted. When I scaled the width to 640, only half of the video (vertically) displayed correctly.
Can anyone help me understand how the stride is calculated, and why it differs from the width in these cases?
I found the same problem described here: Simple FFMpeg player for Android. The OP mentions that video works fine for widths of 640, 1280, and 1920.
It seems that since my device is arm64-v8a, the stride is always aligned to 64. To overcome this, I read the stride from the ANativeWindow_Buffer after locking the window, and then use windowBuffer.stride to calculate the destination linesize (dstStride) for sws_scale.
AVFrame dummy;
ANativeWindow_Buffer windowBuffer;
int ret;
if ((ret = ANativeWindow_lock(window, &windowBuffer, nullptr)) < 0) {
    log_error("cannot lock window: %d", ret);
} else {
    dummy.data[0] = (uint8_t *) windowBuffer.bits;
    dummy.linesize[0] = windowBuffer.stride * 2; // RGB565: 2 bytes per pixel
}
And then:
sws_scale(renderer->sws_ctx,
          (const uint8_t* const *) frame->data,
          frame->linesize,
          0,
          codecpar->height,
          dummy.data,
          dummy.linesize);
This renders the scaled frame data directly into the window buffer.