
Updated Reshape layer #1263

Closed
wants to merge 1 commit into from

Conversation

ssafar
Contributor

@ssafar ssafar commented Oct 11, 2014

Here is a layer similar to sguada's #108. Main differences:

  • like FlattenLayer, it doesn't copy blobs
  • supports the numpy-like "-1 for automatic dimension inference", and the default 0 copies the corresponding dimension from below (just as planned in Reshape layer #108)
  • is using the latest API.

As for unifying FlattenLayer and this one, also discussed in #108: it would technically be doable, but the two need very different defaults. For ReshapeLayer the natural default is 0 0 0 0 (that is, don't change any dimension unless explicitly requested), while FlattenLayer-like default behavior would need 0 -1 1 1 ("keep num, infer channels, flatten everything else"). It might also be confusing to have something called RESHAPE automatically flatten things, or a FLATTEN whose parameters do something entirely different from flattening. At least that was my reasoning behind creating a separate layer; it might still be wrong :)
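The 0 / -1 semantics above can be sketched as follows. This is an illustrative helper only (the name `InferReshape` is hypothetical, not Caffe code): 0 copies the matching bottom dimension, and a single -1 is inferred so the total element count is preserved.

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Hypothetical sketch of the dimension rules described above, not the
// actual Caffe implementation: resolve a reshape spec against a bottom
// shape, where 0 means "copy from below" and -1 means "infer this axis".
std::vector<int> InferReshape(const std::vector<int>& bottom,
                              const std::vector<int>& spec) {
  std::vector<int> top(spec.size());
  int inferred_axis = -1;
  long long total = 1;
  for (int d : bottom) total *= d;  // element count of the bottom blob
  long long known = 1;
  for (std::size_t i = 0; i < spec.size(); ++i) {
    if (spec[i] == 0) {
      top[i] = bottom[i];           // copy dimension from below
    } else if (spec[i] == -1) {
      assert(inferred_axis == -1);  // at most one inferred axis
      inferred_axis = static_cast<int>(i);
      top[i] = 1;                   // placeholder, filled in below
    } else {
      top[i] = spec[i];             // explicit dimension
    }
    known *= top[i];
  }
  if (inferred_axis >= 0) {
    assert(total % known == 0);     // counts must divide evenly
    top[inferred_axis] = static_cast<int>(total / known);
  } else {
    assert(known == total);         // no -1: counts must match exactly
  }
  return top;
}
```

With a bottom of shape 2x3x4x5, the spec 0 -1 1 1 yields 2x60x1x1 (the FlattenLayer-like behavior mentioned above), while 0 0 0 0 leaves the shape unchanged.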

Also, sorry if this is already a work in progress by someone. Looking at the dates on #108, I concluded that I'd probably need such a layer sooner than its expected arrival, and that posting it might be useful since I was going to implement it for myself anyway. Let me know if this wasn't the right way of doing things :)

@shelhamer
Member

Thanks for resurrecting ReshapeLayer! It looks like there are a few issues to work out but this is a good start.

@sguada please review and merge.

@@ -178,6 +178,7 @@ REGISTER_LAYER_CLASS(MEMORY_DATA, MemoryDataLayer);
REGISTER_LAYER_CLASS(MVN, MVNLayer);
REGISTER_LAYER_CLASS(MULTINOMIAL_LOGISTIC_LOSS, MultinomialLogisticLossLayer);
REGISTER_LAYER_CLASS(POWER, PowerLayer);
REGISTER_LAYER_CLASS(RESHAPE, ReshapeLayer);
Member
No action required, but FYI: I am going to change the layer_factory code so that all REGISTER_LAYER_CLASS calls move to their corresponding .cc files instead of all living inside layer_factory.cpp.

Contributor
@Yangqing good, I had meant to ask about that some time ago...

Member
#1270 implements this plan; all you need to do is move the

REGISTER_LAYER_CLASS(RESHAPE, ReshapeLayer);

line to reshape_layer.cpp under the INSTANTIATE_CLASS line. Hope this does not cause much trouble.
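Concretely, the tail of src/caffe/layers/reshape_layer.cpp would then end with the two macro invocations (a sketch of the file layout only; both macros are defined in Caffe's headers, so this fragment is not compilable on its own):

```cpp
// end of src/caffe/layers/reshape_layer.cpp
INSTANTIATE_CLASS(ReshapeLayer);
REGISTER_LAYER_CLASS(RESHAPE, ReshapeLayer);
```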

Contributor Author
Done! (It indeed didn't :))

@shelhamer
Member

Please squash into a single commit with an informative message for merge. Thanks for the layer!

@shelhamer shelhamer mentioned this pull request Oct 16, 2014
@ssafar ssafar force-pushed the reshape_layer_2 branch 2 times, most recently from 0c323c8 to 855709f on October 16, 2014
@ssafar
Contributor Author

ssafar commented Oct 16, 2014

Done! Also added some docs to the layer catalog about how to use it.
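For reference, the descendant of this layer that eventually landed in Caffe (via the later rebase mentioned below in this thread) is specified in a prototxt roughly as follows; the exact field names come from later Caffe versions, not necessarily from this PR's patch:

```
layer {
  name: "reshape"
  type: "Reshape"
  bottom: "input"
  top: "output"
  reshape_param {
    # 0 = copy dim from below, -1 = infer: keep num, infer channels,
    # flatten everything else
    shape { dim: 0 dim: -1 dim: 1 dim: 1 }
  }
}
```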

@shelhamer
Member

This looks good to me but needs a rebase onto master. @ssafar if you have a chance that's ideal, or else we could merge to master from this PR at some point.

@jeffdonahue
Contributor

rebased in #2217

@ssafar
Contributor Author

ssafar commented Mar 28, 2015

@jeffdonahue: thanks for the rebase & fixing it up! (And sorry for not doing it myself and disappearing for so long; I had neither a proper GPU-equipped box nor much time lately. But it's all good in the end :))

6 participants