Concat layer #125

Merged
merged 5 commits into dev on Feb 28, 2014

Conversation

@sguada (Contributor) commented Feb 18, 2014

Added a layer to concatenate blobs along one dimension. For now it only allows concatenation along the num (concat_dim=0) or channels (concat_dim=1) dimension. This layer can take multiple blobs (at least two) and produce one blob that is their concatenation along one dimension. The other dimensions must agree.

It adds optional uint32 concat_dim = 65 [default = 1] to the list of params in caffe.proto. By default it concatenates blobs along channels.
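To illustrate, here is a minimal sketch of what a concat layer declaration might look like in a network prototxt. The blob names are made up, and the exact surrounding syntax (the layer type name, and whether concat_dim is a direct layer field as described here or wrapped in a concat_param message in later caffe.proto revisions) depends on the proto version, so treat it as a sketch rather than copy-paste config:

layers {
  name: "combined"
  type: CONCAT
  bottom: "branch_a"
  bottom: "branch_b"
  top: "combined"
  concat_dim: 1  # concatenate along channels (the default)
}

The bottoms must agree in every dimension except the one being concatenated; with concat_dim=1 the top blob's channel count is the sum of the bottoms' channel counts.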

@mavenlin (Contributor)

What about not doing a memory copy? In the setup function, replace the bottom data with a contiguous chunk of memory and pack the pointers into the Blob, so the forward and backward functions just do nothing.

@Yangqing (Member)

Might not be very kosher because the bottom blobs are supposed to be treated as constant inputs.

@sguada (Contributor, Author) commented Feb 18, 2014

@mavenlin I'm a bit confused by your comment. I'm not doing any memory copy in the setup, just setting up the top blob.
Also, contiguous memory is not enough to allow concatenation along different dimensions, since the data is not contiguous along every dimension.
Plus, as @Yangqing said, bottom blobs should be treated as constant.
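To make that concrete, here is a small self-contained sketch (not the actual ConcatLayer::SetUp code, just the shape logic it implies) of how the top blob's dimensions can be derived from the bottoms: the entries along concat_dim are summed and every other dimension is required to agree.

#include <cassert>
#include <vector>

// Hypothetical helper: given the (num, channels, height, width) shapes of
// the bottom blobs, compute the shape of the concatenated top blob.
std::vector<int> ConcatShape(const std::vector<std::vector<int> >& bottoms,
                             int concat_dim) {
  assert(bottoms.size() >= 2);
  std::vector<int> top = bottoms[0];
  for (size_t i = 1; i < bottoms.size(); ++i) {
    assert(bottoms[i].size() == top.size());
    for (size_t d = 0; d < top.size(); ++d) {
      if (static_cast<int>(d) == concat_dim) {
        top[d] += bottoms[i][d];          // summed along the concat dimension
      } else {
        assert(bottoms[i][d] == top[d]);  // other dimensions must agree
      }
    }
  }
  return top;
}

// Example: concatenating a 2x3x4x5 blob with a 2x6x4x5 blob along channels
// (concat_dim = 1) yields a 2x9x4x5 top blob.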

@mavenlin (Contributor)

@sguada Sorry, you are right, contiguous memory is not enough for concatenation along different dimensions.

@mavenlin (Contributor)

@Yangqing Sure, it is not an elegant way. I guess some of the memory optimisations can be done at the network level, in a similar way to what Theano does.

@jeffdonahue (Contributor)

Hey @sguada, is this still WIP? Let me know when I should test and possibly merge.

@sguada (Contributor, Author) commented Feb 27, 2014

Rebased against the new dev. @jeffdonahue I think it is now ready for testing and merging, if considered appropriate.

ConcatLayer<TypeParam> layer(layer_param);
GradientChecker<TypeParam> checker(1e-2, 1e-3);
// it is too expensive to call curand multiple times, so we don't do an
// exhaustive gradient check.
Contributor (review comment on the diff above)

I think these comments are copied from some other layer, please remove (unless this layer really does use random in some way that I'm missing)

@jeffdonahue (Contributor)

I verified that the tests for this layer pass - could you remove the comments as noted above and clean up the lint errors?

@shelhamer (Member)

This layer needs to be split into .cpp and .cu per #152.

@sguada (Contributor, Author) commented Feb 28, 2014

@jeffdonahue Removed comments and cleaned up lint errors.
@shelhamer Split into .cpp and .cu as done in #152.

I wonder if we should also split the test code into .cpp and .cu?

@shelhamer (Member)

@sguada Good observation: everything will be split, but it's fine for now. Let's revisit all this post deadline.

jeffdonahue added a commit that referenced this pull request Feb 28, 2014
@jeffdonahue merged commit 82faa9a into BVLC:dev on Feb 28, 2014
@jeffdonahue (Contributor)

Thanks for polishing, @sguada! Merged.

@wkal commented Sep 26, 2014

@sguada Can you give us a more detailed example, so we can know exactly how to use the concat layer? Thanks!

mitmul pushed a commit to mitmul/caffe that referenced this pull request Sep 30, 2014