
inconsistent tensor size #6

Open
simmoncn opened this issue Dec 21, 2016 · 6 comments

@simmoncn

Hi, do you know what causes the following error? Thanks.

/root/torch/install/bin/luajit: /root/torch/install/share/lua/5.1/threads/threads.lua:183: [thread 1 callback] .../Photo-Realistic-Super-Resoluton-master/data/dataset.lua:339: inconsistent tensor size at /root/torch/pkg/torch/lib/TH/generic/THTensorCopy.c:7
stack traceback:
[C]: in function 'copy'
.../Photo-Realistic-Super-Resoluton-master/data/dataset.lua:339: in function 'tableToOutput'
.../Photo-Realistic-Super-Resoluton-master/data/dataset.lua:359: in function <.../Photo-Realistic-Super-Resoluton-master/data/dataset.lua:347>
[C]: in function 'xpcall'
/root/torch/install/share/lua/5.1/threads/threads.lua:234: in function 'callback'
/root/torch/install/share/lua/5.1/threads/queue.lua:65: in function </root/torch/install/share/lua/5.1/threads/queue.lua:41>
[C]: in function 'pcall'
/root/torch/install/share/lua/5.1/threads/queue.lua:40: in function 'dojob'
[string " local Queue = require 'threads.queue'..."]:15: in main chunk

@yulunzhang

Hi, I have the same problem. At /data/dataset.lua line 339, I checked the sizes of data[i] and dataTable[i] and found that they really do differ: data[i] is 1x96x96, while dataTable[i] takes four different sizes (e.g., 1x128x96, 1x96x128, 1x143x96, 1x96x144), corresponding to nThreads = 4. As a result, the 'inconsistent tensor size' error is raised.
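To make the mismatch easier to spot, here is a hypothetical guard one could put around that copy; the function name matches the stack trace, but the body and the sizeStr helper are a sketch, not the repository's code:

```lua
-- Sketch of a size check around the failing copy at dataset.lua:339.
-- sizeStr and the exact function body are assumptions for illustration.
local function sizeStr(t)
   local dims = {}
   for d = 1, t:dim() do dims[d] = t:size(d) end
   return table.concat(dims, 'x')
end

local function tableToOutput(dataTable, data)
   for i = 1, #dataTable do
      if not data[i]:isSameSizeAs(dataTable[i]) then
         error(string.format('sample %d: expected %s, got %s',
                             i, sizeStr(data[i]), sizeStr(dataTable[i])))
      end
      data[i]:copy(dataTable[i])  -- the copy that raises 'inconsistent tensor size'
   end
end
```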
The error output is as follows:
lunge@lunge:/media/lunge/Users/projects/SRGAN/Photo-Realistic-Super-Resoluton$ th run_sr.lua
{
ntrain : inf
beta1 : 0.9
name : "super_resolution"
dataset : "folder"
niter : 250
lr : 0.001
model_folder : "model"
gpu : 1
nThreads : 1
t_folder : "/media/lunge/Data/IMAGENET2015/CLS_LOC_dataset/train"
batchSize : 32
loadSize : 96
}
Starting donkey with id: 1 seed: 1
table: 0x41e4bfe0
Loading train metadata from cache
Dataset: folder Size: 1281167
cunn used
/home/lunge/torch/install/bin/luajit: /home/lunge/torch/install/share/lua/5.1/threads/threads.lua:183: [thread 1 callback] ...s/SRGAN/Photo-Realistic-Super-Resoluton/data/dataset.lua:339: inconsistent tensor size at /home/lunge/torch/pkg/torch/lib/TH/generic/THTensorCopy.c:7
stack traceback:
[C]: in function 'copy'
...s/SRGAN/Photo-Realistic-Super-Resoluton/data/dataset.lua:339: in function 'tableToOutput'
...s/SRGAN/Photo-Realistic-Super-Resoluton/data/dataset.lua:359: in function <...s/SRGAN/Photo-Realistic-Super-Resoluton/data/dataset.lua:347>
[C]: in function 'xpcall'
/home/lunge/torch/install/share/lua/5.1/threads/threads.lua:234: in function 'callback'
/home/lunge/torch/install/share/lua/5.1/threads/queue.lua:65: in function </home/lunge/torch/install/share/lua/5.1/threads/queue.lua:41>
[C]: in function 'pcall'
/home/lunge/torch/install/share/lua/5.1/threads/queue.lua:40: in function 'dojob'
[string " local Queue = require 'threads.queue'..."]:15: in main chunk
stack traceback:
[C]: in function 'error'
/home/lunge/torch/install/share/lua/5.1/threads/threads.lua:183: in function 'dojob'
...ects/SRGAN/Photo-Realistic-Super-Resoluton/data/data.lua:76: in function 'getBatch'
run_sr.lua:85: in function 'opfunc'
/home/lunge/torch/install/share/lua/5.1/optim/adam.lua:37: in function 'adam'
run_sr.lua:133: in main chunk
[C]: in function 'dofile'
...unge/torch/install/lib/luarocks/rocks/trepl/scm-1/bin/th:145: in main chunk
[C]: at 0x00405d50
lunge@lunge:/media/lunge/Users/projects/SRGAN/Photo-Realistic-Super-Resoluton$
The size of dataTable[i] should be 1x96x96, I think. Would you please help us solve this problem? Thank you! @leehomyc

@leehomyc
Owner

Hi, I don't have this problem while training.

Yes, data[i] has size 96x96. However, dataTable[i] is filled via sampleHookTrain, which points to the function local trainHook = function(self, path) in donkey_folder_supres.lua. That function returns two outputs (input_y and input_y2), where the first is of size 96x96 and the second is 24x24. I do not know where the other sizes come from.
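Roughly, the hook has this shape (a sketch assuming the standard torch image package, not the exact code in the repo):

```lua
require 'image'

-- Rough sketch of a trainHook as described above: take the Y channel,
-- return a loadSize x loadSize patch plus a 4x-downscaled version.
local loadSize = 96

local trainHook = function(self, path)
   local img = image.load(path, 3, 'float')
   local y = image.rgb2y(img)                                        -- 1 x H x W
   local input_y  = image.scale(y, loadSize, loadSize)               -- 1 x 96 x 96
   local input_y2 = image.scale(input_y, loadSize / 4, loadSize / 4) -- 1 x 24 x 24
   return input_y, input_y2
end
```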

If you know what caused the problem, please let me know and I can update the repository. Thanks!

@zhangqianhui

@leehomyc

I have also run into this problem.

@zhangqianhui

Can you tell us your Torch version and the versions of the other libraries?

@leehomyc
Owner

Hello!

Sorry, I have fixed the issue; it was caused by the image size. When an input image's height and width differ, this error occurs.
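The fix amounts to something like the following sketch (center-crop to a square before scaling, so the patch is always loadSize x loadSize regardless of aspect ratio; this uses the standard image package and is not necessarily the exact committed change):

```lua
require 'image'

-- Center-crop to a square, then scale to loadSize x loadSize, so
-- non-square inputs no longer produce patches of differing sizes.
local function squareCropAndScale(img, loadSize)
   local h, w = img:size(2), img:size(3)
   local s  = math.min(h, w)
   local x1 = math.floor((w - s) / 2)
   local y1 = math.floor((h - s) / 2)
   local cropped = image.crop(img, x1, y1, x1 + s, y1 + s)
   return image.scale(cropped, loadSize, loadSize)
end
```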

@zhangqianhui

@leehomyc thanks.
