I get this error on the line
embeddings = x + self.position_embeddings:
Error:
RuntimeError: The size of tensor a (8) must match the size of tensor b (2744) at non-singleton dimension 1
Can anyone help me fix it? Thanks in advance.
Code snippet:
import torch
import torch.nn as nn
from torch.nn import Dropout


class Channel_Embeddings(nn.Module):
    """Construct the embeddings from patch, position embeddings."""
    def __init__(self, config, patchsize, img_size, in_channels):
        super().__init__()
        img_size = (img_size, img_size, img_size)  # img_size is a scalar; expand to a 3D tuple
        patch_size = (patchsize, patchsize, patchsize)
        n_patches = (img_size[0] // patch_size[0]) * (img_size[1] // patch_size[1]) * (img_size[2] // patch_size[2])
        self.patch_embeddings = nn.Conv3d(in_channels=in_channels,
                                          out_channels=in_channels,
                                          kernel_size=1,
                                          stride=patch_size)
        self.position_embeddings = nn.Parameter(torch.zeros(1, n_patches, in_channels))
        self.dropout = Dropout(config.transformer["embeddings_dropout_rate"])

    def forward(self, x):
        if x is None:
            return None
        x = self.patch_embeddings(x)  # (B, hidden, D', H', W')
        x = x.flatten(2)              # (B, hidden, D'*H'*W')
        x = x.transpose(-1, -2)       # (B, n_patches, hidden)
        embeddings = x + self.position_embeddings
        embeddings = self.dropout(embeddings)
        return embeddings
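For reference, this is a minimal sketch of how the intermediate shapes can be inspected; the DummyConfig stub, the patch/image sizes, and the random input below are placeholder assumptions for illustration, not my real values:

import torch

# Hypothetical stand-in config just to instantiate the module; my real config has more fields
class DummyConfig:
    transformer = {"embeddings_dropout_rate": 0.1}

emb = Channel_Embeddings(DummyConfig(), patchsize=8, img_size=32, in_channels=3)

x = torch.randn(2, 3, 32, 32, 32)               # placeholder (B, C, D, H, W) input
patches = emb.patch_embeddings(x)               # (B, hidden, D', H', W')
tokens = patches.flatten(2).transpose(-1, -2)   # (B, n_patches_actual, hidden)

print("conv output:        ", tuple(patches.shape))
print("tokens:             ", tuple(tokens.shape))
print("position embeddings:", tuple(emb.position_embeddings.shape))

The error occurs when the second dimension of tokens does not match the second dimension of position_embeddings.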
This is the first error I've run into with this code, and I haven't found a way to fix it yet. I'm looking for a solution. Thanks.