I'm training a neural network in PyTorch which at some point contains a BatchNorm3d(C) layer. Normally I train with a batch size of 1, so the input to this specific layer has shape (1, C, 1, 1, 1).
Unfortunately, the BatchNorm then fails with the error message:
ValueError: Expected more than 1 value per channel when training, got input size torch.Size([1, 32, 1, 1, 1])
The same happens when I use InstanceNorm3d. Everything works fine with a batch size of two or more (i.e. when the input has shape (2, C, 1, 1, 1)).
Does anyone know a solution to this problem? What am I missing?
The problem can be reproduced with the following snippet:
import torch

x_working = torch.ones([2, 32, 1, 1, 1])      # batch size 2: fine
x_not_working = torch.ones([1, 32, 1, 1, 1])  # batch size 1: fails

norm = torch.nn.InstanceNorm3d(32)
out_working = norm(x_working)
out_not_working = norm(x_not_working)  # raises the ValueError above
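For what it's worth, the error seems tied to training-mode batch statistics: if I switch the module to eval() (so BatchNorm normalizes with its running statistics instead of per-batch ones), a batch of 1 goes through without complaint. A minimal check, assuming a freshly constructed BatchNorm3d:

```python
import torch

norm = torch.nn.BatchNorm3d(32)
norm.eval()  # use running statistics, not per-batch statistics
out = norm(torch.ones([1, 32, 1, 1, 1]))  # no error in eval mode
print(out.shape)  # torch.Size([1, 32, 1, 1, 1])
```

Of course that only sidesteps the problem during inference; during training I still need some form of normalization that copes with a single value per channel.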