
How can I access parameters of a node in nngraph #131

Open
mingstupid opened this issue Oct 24, 2016 · 6 comments

Comments

@mingstupid
Hello!
I am creating an nngraph that combines two embeddings. One is a word embedding and the other is not. For the word embedding, I would like to initialize the weights with pretrained word2vec vectors. Is there a way to do so?
My nngraph looks like the following:

input1 = nn.Identity()()
input2 = nn.Identity()()
embed1 = nn.LookupTableMaskZero(nIndex1, embeddingSize)(input1)
embed2 = nn.LookupTableMaskZero(nIndex2, embeddingSize)(input2)
madd = nn.CAddTable()({embed1, embed2})
madd_t = nn.SplitTable(2)(madd)
embed = nn.gModule({input1, input2}, {madd_t})

Could you suggest a way to, say, set embed1.weight to pretrained word2vec?
Thank you!
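One way to do this is to keep a handle on the LookupTable module before wrapping it as a graph node, then copy the pretrained vectors into its weight matrix. This is a hedged sketch: it assumes `word2vec` is a nIndex1 x embeddingSize torch Tensor that you have already loaded yourself (nngraph itself provides no loader for word2vec files).

```lua
require 'nn'
require 'rnn'
require 'nngraph'

-- Keep a reference to the module itself, not just the graph node.
local lookup = nn.LookupTableMaskZero(nIndex1, embeddingSize)

-- LookupTableMaskZero reserves an extra zero row for index 0, so its
-- weight matrix has nIndex1 + 1 rows; rows 2 .. nIndex1 + 1 hold the
-- real embeddings. Copy the pretrained vectors into that view.
-- (`word2vec` is an assumption: a nIndex1 x embeddingSize Tensor.)
lookup.weight:narrow(1, 2, nIndex1):copy(word2vec)

-- Now wrap the initialized module as a graph node as usual.
local input1 = nn.Identity()()
local embed1 = lookup(input1)
```

Initializing before building the gModule keeps the graph-construction code unchanged; only the module creation line gains a weight copy.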

@taineleau-zz
taineleau-zz commented Oct 28, 2016

Hi,

nngraph follows the design of nn.Module.

Hence you just need to call the :parameters() method (or :getParameters()) to obtain the weights in order.

Try:

require 'nn'
require 'rnn'
require 'nngraph'


nIndex1 = 20
embeddingSize = 128
nIndex2 = 30

input1 = nn.Identity()()
input2 = nn.Identity()()
embed1 = nn.LookupTableMaskZero(nIndex1, embeddingSize)(input1)
embed2 = nn.LookupTableMaskZero(nIndex2, embeddingSize)(input2)
madd = nn.CAddTable()({embed1, embed2})
madd_t = nn.SplitTable(2)(madd)
embed = nn.gModule({input1, input2}, {madd_t})

print(embed:parameters())

The output is:

{
  1 : DoubleTensor - size: 21x128
  2 : DoubleTensor - size: 31x128
}
{
  1 : DoubleTensor - size: 21x128
  2 : DoubleTensor - size: 31x128
}

The first table contains the weights, and the second contains the corresponding weight gradients.

Hope this helps!
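Since :parameters() returns the weight tensors in graph order, the OP's initialization can be done through that table as well. A minimal sketch, again assuming `word2vec` is a nIndex1 x embeddingSize Tensor you loaded yourself:

```lua
-- params[1] is the first LookupTableMaskZero weight matrix, of size
-- (nIndex1 + 1) x embeddingSize (21 x 128 in the printout above);
-- the first row is the reserved zero-index row.
local params, gradParams = embed:parameters()
params[1]:narrow(1, 2, nIndex1):copy(word2vec)
```

The tensors returned by :parameters() are views onto the modules' storage, so copying into them updates the modules directly.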

@chithangduong

I believe you can access it using embed1.data.module.weight
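Expanding on that access path: when you apply a module to a node in nngraph, the returned value (embed1 above) is an nngraph.Node, and the wrapped module lives at .data.module. A short sketch, with `word2vec` again assumed to be a nIndex1 x embeddingSize Tensor of your own:

```lua
-- embed1 is the nngraph.Node created by
-- nn.LookupTableMaskZero(nIndex1, embeddingSize)(input1)
local w = embed1.data.module.weight          -- (nIndex1 + 1) x embeddingSize
w:narrow(1, 2, nIndex1):copy(word2vec)       -- skip the zero-index row
```

This works after the gModule has been built, so no restructuring of the graph code is needed.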

@brisker
brisker commented Mar 16, 2017

How can I copy a particular layer's weights from a pretrained nn.Sequential model into a particular module of an nngraph model?
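One hedged sketch of such a copy, combining :get() on the Sequential side with the annotation lookup described below. The names here are assumptions: `pretrained` is your nn.Sequential, `net` is your gModule, `k` is the index of the source layer, and the target node was built with :annotate{name = 'fc1'}.

```lua
-- Source: layer k of the pretrained Sequential (e.g. an nn.Linear).
local src = pretrained:get(k)

-- Target: walk the gModule's nodes and match on the annotated name.
for _, node in ipairs(net.forwardnodes) do
  if node.data.annotations.name == 'fc1' then
    node.data.module.weight:copy(src.weight)
    node.data.module.bias:copy(src.bias)
  end
end
```

Both modules must have the same weight and bias shapes for :copy() to succeed; Torch raises an error otherwise, which is a useful sanity check.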

@haanvid
haanvid commented May 20, 2017

@brisker I'm dealing with the same problem, and I'm thinking of annotating the layer whose weights and biases I want to check/initialize.

@haanvid
haanvid commented May 20, 2017

@brisker

If you want to check/initialize the parameters of a specific layer, you can use annotation ( annotate() ) to give that layer a name, then find the layer by that name to check/initialize its parameters.

In the code below, I've given the batch normalization layer the name 'BN_1' and initialized the weights and biases of that layer.

require 'torch'
require 'nn'
require 'nngraph'
require 'cunn'

function make_net()
  local x = nn.Identity()()
  local next_act1 = nn.Linear(2,3)(x)
  local next_act2 = nn.BatchNormalization(3,1e-5,0.1,true)(next_act1):annotate{
   name = 'BN_1', description = 'batch normalization'}
  --next_act.weight:fill(0.1)
  --next_act.bias:zero()
  local next3 = nn.Linear(3,1)(next_act2)

  local module = nn.gModule({x}, {next3})
  return module
end


function main()
  local mlp = make_net()

  x = torch.Tensor({10,20})

  print('mlp.forwardnodes : (press enter)')
  print(mlp.forwardnodes)
  io.read()


  -- Find the annotated layer by name and set its parameters.
  for _, node in ipairs(mlp.forwardnodes) do
    if node.data.annotations.name == 'BN_1' then
      node.data.module.weight:fill(1.0)
      print('BN weight: (press enter)')
      print(node.data.module.weight)
      io.read()
      node.data.module.bias:fill(0)
      print('BN bias: (press enter)')
      print(node.data.module.bias)
      io.read()
    end
  end

  --local y = mlp:forward(x)
  --local paramx, paramdx = mlp:parameters()
  --print('paramx: ')
  --print(paramx)
  --print('paramdx: ')
  --print(paramdx)
end

main()

@mamun58
mamun58 commented Mar 13, 2018

How can I add only a weight and a bias term to the neural net graph?
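If the question means a graph whose only learnable parameters are an elementwise weight and a bias, one reading can be sketched with nn.CMul (a learnable elementwise scale) and nn.Add (a learnable bias). This is an assumption about the intent, not a confirmed answer; the size 5 is arbitrary.

```lua
require 'nn'
require 'nngraph'

local x = nn.Identity()()
local scaled  = nn.CMul(5)(x)       -- learnable elementwise weights
local shifted = nn.Add(5)(scaled)   -- learnable bias
local g = nn.gModule({x}, {shifted})

-- :parameters() should return exactly two tensors:
-- the CMul weight vector and the Add bias vector.
print(g:parameters())
```

Both modules are part of stock Torch nn, so the resulting gModule trains like any other nngraph model.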
