
RuntimeError: The expanded size of the tensor (9068) must match the existing size (453400) at non-singleton dimension 1. Target sizes: [5, 9068]. Tensor sizes: [1, 453400] #64

aby005 opened this issue Dec 4, 2020 · 1 comment

aby005 commented Dec 4, 2020

Traceback (most recent call last):
  File "run_bert.py", line 222, in <module>
    main()
  File "run_bert.py", line 215, in main
    run_train(args)
  File "run_bert.py", line 120, in run_train
    trainer.train(train_data=train_dataloader, valid_data=valid_dataloader)
  File "C:\MyDrive\PythonWs\NextGenWs\harshitha\Bert\IonePatient-our model\pybert\train\trainer.py", line 153, in train
    train_log = self.train_epoch(train_data)
  File "C:\MyDrive\PythonWs\NextGenWs\harshitha\Bert\IonePatient-our model\pybert\train\trainer.py", line 132, in train_epoch
    metric(logits=self.outputs, target=self.targets)
  File "C:\MyDrive\PythonWs\NextGenWs\harshitha\Bert\IonePatient-our model\pybert\train\metrics.py", line 47, in __call__
    correct = pred.eq(target.view(1, -1).expand_as(pred))
RuntimeError: The expanded size of the tensor (9068) must match the existing size (453400) at non-singleton dimension 1. Target sizes: [5, 9068]. Tensor sizes: [1, 453400]

I tried this with my data and it trained fine, but when I tried to compute the accuracy metric during training, it threw this error.
I passed the accuracy metric as Accuracy(topK=5) to the trainer object in run_bert.py.
Any idea what the fix for this is?
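
For context, the numbers in the traceback suggest a shape mismatch between the metric's expectation and the data: 453400 / 9068 == 50, so the target passed to the metric looks like a multi-hot [batch, num_labels] tensor rather than the 1-D class-index tensor this Accuracy implementation expects. A minimal sketch that reproduces the same error under that assumption (the sizes below are hypothetical, inferred only from the error message):

import torch

# Hypothetical sizes inferred from the error message: 453400 / 9068 == 50,
# so target appears to be a multi-hot [batch, num_labels] tensor.
batch, num_labels, topK = 9068, 50, 5

logits = torch.randn(batch, num_labels)
target = torch.randint(0, 2, (batch, num_labels)).float()   # multi-label target

_, pred = logits.topk(topK, 1, True, True)
pred = pred.t()                                   # shape [5, 9068]
# target.view(1, -1) has 453400 elements, so it cannot be expanded to [5, 9068]:
correct = pred.eq(target.view(1, -1).expand_as(pred))        # raises the RuntimeError above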

@BarryRun

I ran into the same problem.
The error occurs in the code below:

class Accuracy(Metric):
    '''
    Computes accuracy.
    Use the topK argument to compute top-K accuracy.
    Examples:
        >>> metric = Accuracy(topK=5)
        >>> for epoch in range(epochs):
        >>>     metric.reset()
        >>>     for batch in batches:
        >>>         logits = model()
        >>>         metric(logits,target)
        >>>         print(metric.name(),metric.value())
    '''
    def __init__(self,topK):
        super(Accuracy,self).__init__()
        self.topK = topK
        self.reset()

    def __call__(self, logits, target):
        # logits: [batch, num_classes]; target is expected to be a 1-D
        # tensor of class indices with shape [batch]
        _, pred = logits.topk(self.topK, 1, True, True)   # top-K predicted class indices, [batch, topK]
        pred = pred.t()                                    # [topK, batch]
        # broadcast the target row against each of the K prediction rows
        correct = pred.eq(target.view(1, -1).expand_as(pred))
        self.correct_k = correct[:self.topK].view(-1).float().sum(0)
        self.total = target.size(0)

    def reset(self):
        self.correct_k = 0
        self.total = 0

    def value(self):
        return float(self.correct_k)  / self.total

    def name(self):
        return 'accuracy'
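
If the targets really are multi-hot, this metric has to be adapted rather than just configured. Below is only a rough sketch of a top-K variant that accepts a [batch, num_labels] multi-hot target; it is not code from the repo, and the class name and the precision@K-style denominator are my own choices:

import torch

class MultiLabelTopKAccuracy(object):
    '''Sketch of a top-K metric for multi-hot targets of shape [batch, num_labels]:
    the fraction of the top-K predicted labels that are true labels.'''
    def __init__(self, topK):
        self.topK = topK
        self.reset()

    def __call__(self, logits, target):
        # logits, target: [batch, num_labels]; target contains 0/1 entries
        _, pred = logits.topk(self.topK, dim=1)     # indices of the K highest logits, [batch, topK]
        hits = target.gather(1, pred)               # 1 where a predicted label is a true label
        self.correct_k = hits.float().sum().item()
        self.total = target.size(0) * self.topK

    def reset(self):
        self.correct_k = 0
        self.total = 0

    def value(self):
        return float(self.correct_k) / self.total

    def name(self):
        return 'multilabel_topk_accuracy'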
