python, deep-learning, pytorch, conv-neural-network

RuntimeError: stack expects each tensor to be equal size, but got [3, 224, 255] at entry 0 and [3, 224, 351] at entry 1


I am building a CNN model on my custom dataset and I am getting this error. I don't know where I am going wrong. I have put a Resize in the transforms too, but it still isn't working. I saw many other posts regarding this error, but none of them seems to work in my case. What could be the issue?

import os

from torch.utils.data import DataLoader
from torchvision import datasets, transforms

#------------------------train transform----------------------#
train_transform=transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),
    ])
    
   
#------------------------test transform------------------------------#

test_transform=transforms.Compose([
    transforms.Resize(224),
    transforms.ToTensor(),
    transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),
    ])

#----------------------image dataset----------------------------#
train_data=datasets.ImageFolder(os.path.join(root_dir, 'train'),transform=train_transform)
test_data=datasets.ImageFolder(os.path.join(root_dir,'test'),transform=test_transform)

#create data loader
train_loader=DataLoader(train_data,batch_size=16,shuffle=True,drop_last=True)
test_loader=DataLoader(test_data,shuffle=False,batch_size=16)
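
For reference, the error comes from the DataLoader's default collate_fn, which combines the images of a batch with torch.stack; two tensors with the shapes from the traceback reproduce it (hypothetical tensors, just for illustration, not my actual data):

import torch

# Same height (shorter edge resized to 224) but different widths, because
# the source images have different aspect ratios.
a = torch.randn(3, 224, 255)
b = torch.randn(3, 224, 351)

try:
    torch.stack([a, b])   # what the default collate_fn does for each batch
except RuntimeError as err:
    print(err)            # "stack expects each tensor to be equal size, ..."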

Solution

  • Seems like the images in your dataset don't all have the same size. transforms.Resize(224) with a single int only resizes the shorter edge and keeps the aspect ratio, so the resulting tensors still have different widths (here 3×224×255 vs. 3×224×351) and the DataLoader's default collate can't stack them into one batch.
    One way to solve it is to change the transforms.Resize(224) to:

    transforms.Resize(256),
    transforms.CenterCrop(224),

    The CenterCrop guarantees every image ends up 3×224×224; a full sketch of the adjusted transform follows below.
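
A minimal sketch of the adjusted pipeline, assuming the same normalization statistics as in the question (Resize(256) scales the shorter edge to 256 while keeping the aspect ratio, and CenterCrop(224) then cuts a fixed 224×224 patch, so every tensor has the same shape and can be stacked):

from torchvision import transforms

train_transform = transforms.Compose([
    transforms.Resize(256),        # shorter edge -> 256, aspect ratio preserved
    transforms.CenterCrop(224),    # fixed 224x224 crop -> identical tensor shapes
    transforms.ToTensor(),
    transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),
])

transforms.Resize((224, 224)) would also give uniform shapes if distorting the aspect ratio is acceptable; apply the same change to test_transform.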