numgrad

Something in the middle of Autograd's grad engine and Karpathy's Micrograd engine. Where torch is built around the Torch Tensor, numgrad is built around the numpy array: a Tensor class wraps a numpy array and tracks its gradient. Where karpathy/micrograd supports scalar values and their gradients, numgrad supports both scalars and matrices.
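
To make that concrete, here is a minimal sketch of what such a wrapper looks like. This is a sketch only, assuming float32 storage; the actual class in this repo likely tracks additional state for building the backward graph.

import numpy as np

class Tensor:
    # Sketch: wrap a scalar or numpy array and hold a matching gradient buffer.
    def __init__(self, data):
        self.data = np.asarray(data, dtype=np.float32)  # scalar or ndarray
        self.grad = np.zeros_like(self.data)            # filled in by the backward pass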

Note: neural networks are yet to be implemented, and the project is still a work in progress.

A few Examples

The snippets below assume numpy is imported as np and that Tensor and the ops (dot, mul, add, sum, relu, backward_graph) are in scope from this repo.

1)

x = Tensor(np.arange(-4, 4).reshape(2, 4))  # 2x4 matrix
y = Tensor(np.arange(-2, 2).reshape(4, 1))  # 4x1 matrix
n = dot(x, y)         # matrix product, shape 2x1
n1 = relu(n)
backward_graph(n1)    # reverse-mode pass; populates .grad on x and y
print(x.grad, y.grad)
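
For reference, the gradients above can be checked with plain numpy. This sketch assumes backward_graph seeds the output gradient with ones (an assumption about the engine, in the spirit of micrograd's backward pass):

import numpy as np

x_np = np.arange(-4, 4).reshape(2, 4).astype(np.float32)
y_np = np.arange(-2, 2).reshape(4, 1).astype(np.float32)
n_np = x_np @ y_np                   # forward: dot(x, y)
g = (n_np > 0).astype(np.float32)    # relu'(n) times an all-ones seed
print(g @ y_np.T)   # expected x.grad
print(x_np.T @ g)   # expected y.grad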

2)

x_init = np.random.randn(3, 3).astype(np.float32)
W_init = np.random.randn(3, 3).astype(np.float32)
x = Tensor(x_init)
y = Tensor(W_init)
c = mul(x, y)           # multiply the two 3x3 tensors
out = relu(c)
d = sum(out)            # reduce to a scalar
tr = backward_graph(d)  # run the backward pass from d
print(out, d.data, tr)
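
Assuming mul is elementwise (both operands are 3x3) and sum reduces to a scalar, for d = sum(relu(x * y)) the chain rule gives dd/dx = relu'(x * y) * y and dd/dy = relu'(x * y) * x. Reusing x_init and W_init from above:

mask = (x_init * W_init > 0).astype(np.float32)   # relu'(x * y)
print(mask * W_init)   # expected x.grad
print(mask * x_init)   # expected y.grad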

Scalar values

The input can be either a scalar or an np.ndarray; some scalar-value computations are already included, though the API is not final.


a = Tensor(-8.0)
b = Tensor(9.0)
c = Tensor(-3.0)
outadd = add(a, b)
outm = mul(outadd, c)   # outm = (a + b) * c
d = backward_graph(outm)
print(outm.data, a.grad, b.grad)

# More operations for dd/da and dd/db will be supported
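
As a check: outm.data = (-8.0 + 9.0) * -3.0 = -3.0, and by the chain rule ∂outm/∂a = ∂outm/∂b = c, so both a.grad and b.grad should come out as -3.0.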

