@smy69
Actually, `feat_sum` is the sum of the weights of the feature functions (local functions).
In an ME or CRF model, we define a set of feature functions, which are usually binary. If a feature function is activated, its weight is added to an accumulated value, i.e. the feature score.
So this summation can be viewed as multiplying a weight matrix by a sparse vector, where the sparse vector is the sum of one-hot vectors encoding the indices of the activated feature functions.
This is very similar to an embedding lookup. Since sparse matrix multiplication is expensive, I used `embedding_lookup` instead.
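To illustrate the equivalence described above, here is a minimal NumPy sketch. The sizes, variable names, and indices are made up for illustration and are not taken from the repository; it only shows that summing one-hot vectors and multiplying by the weight matrix gives the same score as gathering rows (an embedding lookup) and summing them.

```python
import numpy as np

# Hypothetical sizes: 6 feature functions, a score vector of dimension 3.
num_features, score_dim = 6, 3
rng = np.random.default_rng(0)
# One weight row per feature function (illustrative random weights).
W = rng.standard_normal((num_features, score_dim))

# Indices of the feature functions that fire at this position (made up).
active = np.array([1, 4, 5])

# (a) Dense formulation: sum of one-hot vectors times the weight matrix.
one_hot_sum = np.zeros(num_features)
one_hot_sum[active] = 1.0
score_dense = one_hot_sum @ W

# (b) Lookup formulation: gather the active rows and sum them,
# which is what an embedding lookup followed by a sum computes.
score_lookup = W[active].sum(axis=0)

assert np.allclose(score_dense, score_lookup)
```

Because the one-hot sum vector is extremely sparse, the gather-and-sum form avoids touching the inactive rows entirely, which is why the lookup is the cheaper implementation.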
In `hybrid_model.py`, you add a feature template; if I am right, the implementation is at line 116, where `feat_sum` is defined. It seems that `feat_sum` is just the sum of the embedding vectors, so what is the principle behind this?