
Dummy data layer #480

Merged
jeffdonahue merged 2 commits into BVLC:dev from jeffdonahue:dummy-data-layer on Jun 9, 2014
Conversation

jeffdonahue
Contributor

This is a layer that outputs "dummy" data from a Filler, e.g. ConstantFiller or GaussianFiller, to an arbitrary number of blobs of arbitrary sizes. It takes N >= 1 top blobs; 1 or N each of num, channels, height, and width; and 0, 1, or N data_fillers. If there are 0 data_fillers, all top blobs are filled with zeros. If there is exactly 1 of any of num, channels, height, width, or data_filler, that value is used for all top blobs. If there are exactly N, the ith value is used for the ith top blob.

For any ConstantFillers, the data is filled during SetUp and never touched again in Forward. For all other filler types (all of which output random data), the data is refilled on each iteration in Forward.
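To make the parameter semantics concrete, here is a minimal prototxt sketch of a two-top configuration (the DUMMY_DATA type enum and the dummy_data_param message name are assumptions for illustration; num, channels, height, width, and data_filler follow the description above):

    layers {
      name: "dummy"
      type: DUMMY_DATA          # assumed type enum for this layer
      top: "data1"
      top: "data2"
      dummy_data_param {        # assumed parameter message name
        # exactly one of each dimension, so both top blobs get shape 10 x 3 x 8 x 8
        num: 10
        channels: 3
        height: 8
        width: 8
        # exactly one data_filler, so it is shared by both top blobs;
        # a non-constant filler like this is re-run on every Forward pass
        data_filler {
          type: "gaussian"
          std: 0.1
        }
      }
    }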

I can think of a couple use cases for this, including:

  1. unit tests for an entire net, where one might not want to create a temporary LevelDB (I use it for this purpose in later private commits, to be publicized soon).

  2. regularization, by use in combination with an EltwiseLayer (with op == SUM) to add random (e.g., Gaussian) noise to the data at some layer of the net (see the sketch after this list).
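As a rough sketch of use case 2 (again illustrative, not from this PR: the ELTWISE type enum is an assumption, and "conv1" is just a stand-in for whichever blob should be perturbed), a DummyDataLayer filled with Gaussian noise can feed an EltwiseLayer that sums the noise into an existing blob:

    layers {
      name: "noise"
      type: DUMMY_DATA                               # assumed type enum
      top: "noise"
      dummy_data_param {
        # shape must match the blob the noise is added to
        num: 10
        channels: 3
        height: 8
        width: 8
        data_filler { type: "gaussian" std: 0.01 }   # refilled on each Forward
      }
    }
    layers {
      name: "add_noise"
      type: ELTWISE                                  # assumed type enum
      bottom: "conv1"                                # blob to perturb (illustrative)
      bottom: "noise"
      top: "conv1_noisy"
      eltwise_param { operation: SUM }
    }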

@sguada
Contributor

sguada commented Jun 9, 2014

This is great, Jeff,

I have something similar for net_speed, but this is more general.

On Sunday, June 8, 2014, Jeff Donahue notifications@github.com wrote:

(Please merge #479 before this; this is based on #479 so I don't have to reformat DummyDataLayer's header declaration after that one is merged. Sorry for the bad PR manners... I can rebase purely on dev if strongly preferred.)

Sergio

@shelhamer
Member

Nice layer, and a short, to-the-point implementation.

Please add the GPU version, then rebase since #479 was rewritten. Feel free to merge yourself once that's done! This is great otherwise.

jeffdonahue added a commit that referenced this pull request Jun 9, 2014
@jeffdonahue jeffdonahue merged commit 65d80ec into BVLC:dev Jun 9, 2014
@jeffdonahue jeffdonahue deleted the dummy-data-layer branch June 10, 2014 18:39
mitmul pushed a commit to mitmul/caffe that referenced this pull request Sep 30, 2014