
Why do we need self.adjust and self.bias in BoxTower? #14

Open · hajungong007 opened this issue Nov 27, 2022 · 0 comments
    # In BoxTower.__init__: adjust scale
    self.adjust = nn.Parameter(0.1 * torch.ones(1))
    self.bias = nn.Parameter(torch.Tensor(1.0 * torch.ones(1, 4, 1, 1)))

    def forward(self, search, kernel, update=None):
        # encode template (kernel) and search features first;
        # for the cls branch, an updated template may replace the original
        if update is None:
            cls_z, cls_x = self.cls_encode(kernel, search)  # [z11, z12, z13]
        else:
            cls_z, cls_x = self.cls_encode(update, search)  # [z11, z12, z13]

        reg_z, reg_x = self.reg_encode(kernel, search)  # [x11, x12, x13]

        # cls and reg DW (depthwise correlation of template and search features)
        cls_dw = self.cls_dw(cls_z, cls_x)
        reg_dw = self.reg_dw(reg_z, reg_x)

        # regression head: rescale the raw prediction with adjust/bias,
        # then exp so the predicted box offsets are positive
        x_reg = self.bbox_tower(reg_dw)
        x = self.adjust * self.bbox_pred(x_reg) + self.bias
        x = torch.exp(x)

        # cls tower
        c = self.cls_tower(cls_dw)
        cls = 0.1 * self.cls_pred(c)

        return x, cls, cls_dw, x_reg
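For reference, a minimal sketch of what the adjust/bias/exp step computes at initialization (a plausible reading only; `raw` is a hypothetical stand-in for the output of self.bbox_pred, while the parameter initializations are copied from the snippet above):

    import torch
    import torch.nn as nn

    # Hypothetical stand-in for the raw (1, 4, H, W) output of self.bbox_pred
    raw = torch.randn(1, 4, 5, 5)

    # Same initializations as in BoxTower.__init__
    adjust = nn.Parameter(0.1 * torch.ones(1))         # learnable global scale
    bias = nn.Parameter(1.0 * torch.ones(1, 4, 1, 1))  # learnable per-channel shift

    x = torch.exp(adjust * raw + bias)

    # adjust * raw is small at initialization, so every box offset starts
    # near exp(1.0) ≈ 2.72 and is guaranteed positive by the exp,
    # regardless of the sign of the raw network output.
    print(x.min().item() > 0)  # True
    print(x.mean().item())     # close to e ≈ 2.718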