I have an image and I want to perform an OpenCV boxFilter on an ROI of this image.
image = cv::imread(argv[1], cv::IMREAD_COLOR);
cv::Rect roi(10, 10, 64, 64);
cv::Mat output;
cv::boxFilter(image(roi), output, -1, cv::Size(scale_size, scale_size));
I want to know what happens if the kernel extends outside the ROI: does the filter use the pixels outside the ROI but still within the image, or just the values specified by the borderType? Actually, I want the former behavior.
My original answer (which got accepted) was that cv::boxFilter will not use the pixels outside the ROI. That turned out to be wrong. See @Micka's excellent answer, which demonstrates that by default cv::boxFilter does use the pixels outside the ROI. It is also discussed in this GitHub issue.
I just wanted to add that although this behavior is not well documented, it is implied: cv::boxFilter has a flag-like parameter, borderType. Its values should come from the enum cv::BorderTypes (some of which can be combined with |). The last value of the enum is:
The last value of the enum is:
BORDER_ISOLATED = 16 //!< do not look outside of ROI
This implies that if this flag is not set, the function will look outside of the ROI.