
about embedding concat #11

Open
chengweige opened this issue Feb 20, 2021 · 3 comments
Comments

@chengweige

layer1 ~ layer3 have different feature map sizes, so how do you concatenate the embeddings from layer1 ~ layer3?
I can't find it in the paper.

@plutoyuxie

layer1 ~ layer3 have different feature map sizes, so how do you concatenate the embeddings from layer1 ~ layer3?
I can't find it in the paper.

The concatenation function is defined here:

def embedding_concat(x, y):

Alternatively, you could upsample the smaller feature maps with torch.nn.functional.interpolate(input, scale_factor=2, mode='nearest') and then concatenate along the channel dimension.
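For reference, here is a minimal sketch of the unfold/fold approach to concatenating two feature maps of different spatial sizes, in the style of the embedding_concat function linked above (the exact implementation in the repo may differ; this assumes the larger map's spatial size is an integer multiple of the smaller one's):

```python
import torch
import torch.nn.functional as F

def embedding_concat(x, y):
    """Concatenate x (B, C1, H1, W1) with the smaller y (B, C2, H2, W2),
    where H1 is an integer multiple of H2."""
    B, C1, H1, W1 = x.size()
    _, C2, H2, W2 = y.size()
    s = H1 // H2                                 # spatial ratio of the two maps
    # Tile x into s*s sub-positions so each aligns with one cell of y.
    x = F.unfold(x, kernel_size=s, stride=s)     # (B, C1*s*s, H2*W2)
    x = x.view(B, C1, -1, H2, W2)                # (B, C1, s*s, H2, W2)
    z = torch.zeros(B, C1 + C2, x.size(2), H2, W2)
    for i in range(x.size(2)):
        # Pair y with every sub-position of the tiled x.
        z[:, :, i, :, :] = torch.cat([x[:, :, i, :, :], y], dim=1)
    z = z.view(B, -1, H2 * W2)                   # (B, (C1+C2)*s*s, H2*W2)
    # Fold back to the larger spatial resolution.
    return F.fold(z, output_size=(H1, W1), kernel_size=s, stride=s)

def embedding_concat_interp(x, y):
    """The simpler alternative: nearest-neighbour upsample y, then concat."""
    y_up = F.interpolate(y, scale_factor=x.size(2) // y.size(2), mode='nearest')
    return torch.cat([x, y_up], dim=1)
```

Both variants produce a (B, C1 + C2, H1, W1) tensor, and with nearest-neighbour upsampling they yield the same values: each pixel of the larger map ends up paired with the coarse cell of y that covers it.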

@Omarelsaadany

Hello,
has anyone gotten this error?
x = x.view(B, C1, -1, H2, W2)
RuntimeError: shape '[20, 256, -1, 50, 50]' is invalid for input of size 200724480

@bjaeger1

@Omarelsaadany you need to change your input image size; with your current input size there is a dimension error: 200724480 / (20 × 256 × 50 × 50) is not an integer.
The dimension "-1" is a PyTorch placeholder for "infer this dimension, given that all the others have been specified" (i.e. the total element count divided by the product of the specified dimensions), so that quotient must be a whole number.
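The divisibility check can be verified directly with the numbers from the traceback above (a quick sanity check, not code from the repo):

```python
# view(B, C1, -1, H2, W2) can only infer the "-1" slot when the total
# element count is divisible by the product of the given dimensions.
numel = 200724480                  # "input of size 200724480"
specified = 20 * 256 * 50 * 50     # B * C1 * H2 * W2 = 12_800_000
print(numel / specified)           # ~15.68 -> not an integer, so view() fails
print(numel % specified)           # nonzero remainder confirms the mismatch
```

Changing the input image size so that the unfolded feature map's element count is an exact multiple of B × C1 × H2 × W2 makes the error go away.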
