
Replacing nn.Upsample with alternative upsample operation


I have a UNet++ model (the code for it is at the bottom of the linked article, which may need to be viewed in a private window) that I'm trying to reconfigure. I'm getting artifacts in some of my output images, so I'm following this article, which suggests doing upsampling followed by a convolution instead.

I'm replacing the upsample layers with the sequential block shown below, but my model isn't learning. I suspect it's to do with how I've configured the channels, so I'd like a second opinion.

Old up-sample operation:

self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True)

New operations:

import torch.nn as nn

class upConv(nn.Module):
    """
    Upsampling/deconv block that doubles the spatial resolution.
    """
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.upc = nn.Sequential(
            nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True),
            nn.Conv2d(in_ch, out_ch * 2, 3, stride=1, padding=1),
            nn.BatchNorm2d(out_ch * 2),
            nn.ReLU(inplace=True)
        )

    def forward(self, x):
        return self.upc(x)
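
As a sanity check on the channel configuration, here is a minimal sketch (the channel counts are made-up example values) that runs a dummy tensor through the block:

import torch

block = upConv(in_ch=64, out_ch=32)
x = torch.randn(1, 64, 16, 16)   # (batch, channels, height, width)
print(block(x).shape)            # torch.Size([1, 64, 32, 32]): H and W doubled,
                                 # but channels come out as out_ch * 2, not out_ch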

My question is: do these two operations have the same output/function within my model?


Solution

  • Do these two operations have the same output/function within my model?

    If out_ch*2 == in_ch, then yes, they produce the same output shape: nn.Upsample leaves the channel count unchanged, so its output has in_ch channels, while the Conv2d in upConv emits out_ch*2 channels. They are not the same function, though; the convolution adds learnable weights, and the BatchNorm and ReLU transform the values further.

    If the input x is the output of a BatchNorm+ReLU op, then the two could behave even more similarly.
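
    A minimal way to verify the shape claim (assuming the upConv class from the question is in scope; the channel counts here are made-up example values):

    import torch
    import torch.nn as nn

    in_ch, out_ch = 64, 32          # chosen so that out_ch * 2 == in_ch
    x = torch.randn(1, in_ch, 16, 16)

    old_up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True)
    new_up = upConv(in_ch, out_ch)

    print(old_up(x).shape)   # torch.Size([1, 64, 32, 32]) -- channels unchanged
    print(new_up(x).shape)   # torch.Size([1, 64, 32, 32]) -- matches only because out_ch * 2 == in_ch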