Concatenating two tensors with different dimensions in Pytorch
You could do the broadcasting manually (using Tensor.expand()) before the concatenation (using torch.cat()):
import torch

a = torch.randn(15, 200, 2048)
b = torch.randn(1, 200, 2048)

# Number of repeats along the first dimension; -1 keeps the existing size.
repeat_vals = [a.shape[0] // b.shape[0]] + [-1] * (len(b.shape) - 1)
# ... or directly repeat_vals = (15, -1, -1), or (15, 200, 2048) if the shapes are known and fixed

res = torch.cat((a, b.expand(*repeat_vals)), dim=-1)
print(res.shape)
# torch.Size([15, 200, 4096])
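As a side note, Tensor.expand() returns a view over the original storage (only size-1 dimensions can be expanded), so it costs no extra memory, unlike Tensor.repeat(), which materializes real copies. A minimal sketch of the difference:

```python
import torch

b = torch.randn(1, 200, 2048)

# expand: no data copy; the expanded dimension gets stride 0,
# i.e. all 15 "rows" point at the same underlying data
view = b.expand(15, -1, -1)
print(view.shape)        # torch.Size([15, 200, 2048])
print(view.stride()[0])  # 0

# repeat: allocates and fills 15 actual copies of b
copy = b.repeat(15, 1, 1)
print(copy.shape)        # torch.Size([15, 200, 2048])
```

This is why expand() is the cheaper choice here: torch.cat() reads the expanded view and writes the result once, without an intermediate copy of b.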