
Fibonacci Initialized Neural Networks

A repository containing experimental results comparing Fibonacci-initialized neural networks to randomly initialized neural networks. This repository collects experiments on Fibonacci initialization applied to different neural networks, or to different modules within those networks. It also explores the golden ratio and its significance in activation layers/functions.

Description and Methodology

Weights are initialized from the Fibonacci sequence, and the golden ratio (φ ≈ 1.618) is used as the learning rate.
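A minimal sketch of what such an initialization could look like. The repository does not show its code here, so the scaling factor, the 17-term cap (chosen so the largest scaled weight is 1.597, the maximum mentioned in the results below), and the cycling scheme are illustrative assumptions, not the repository's actual implementation:

```python
import numpy as np

PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ≈ 1.618

def fibonacci(n):
    """Return the first n Fibonacci numbers (1, 1, 2, 3, ...) as floats."""
    seq = [1.0, 1.0]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return np.array(seq[:n])

def fibonacci_init(shape, scale=1e-3):
    """Fill a weight tensor of `shape` with scaled Fibonacci values,
    cycling through the sequence in row-major order.
    `scale` and the 17-term cap are illustrative choices:
    F(17) = 1597, so the largest weight is 1597 * 1e-3 = 1.597."""
    n = int(np.prod(shape))
    vals = fibonacci(min(n, 17))
    return (np.resize(vals, n) * scale).reshape(shape)

weights = fibonacci_init((5, 5))
learning_rate = PHI  # golden ratio as the learning rate, per the methodology
```

Training would then proceed as usual, with `learning_rate` passed to the optimizer.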

Results

  1. As explained in the article, Fibonacci-based initialization with low variance already performs worse than regular random initialization, let alone Glorot/Xavier initialization.
  2. In the high-variance experiments, results deteriorate further compared to the earlier ones. A reasonable accuracy of 95.33% is still obtained with weights going up to a maximum of 1.597; beyond that point, results decline steadily and are not worth pursuing.
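For reference, the two baselines the results compare against, a plain small-Gaussian "regular" initialization and Glorot/Xavier uniform initialization, can be sketched as follows. The layer sizes and the Gaussian standard deviation are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier_init(fan_in, fan_out):
    """Glorot/Xavier uniform initialization:
    U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

def random_init(fan_in, fan_out, std=0.01):
    """Plain zero-mean Gaussian baseline ("regular" initialization);
    std=0.01 is an illustrative choice."""
    return rng.normal(0.0, std, size=(fan_in, fan_out))

# Illustrative layer shape, e.g. a 784 -> 128 dense layer
w_xavier = xavier_init(784, 128)
w_random = random_init(784, 128)
```

Unlike the Fibonacci scheme, both baselines are zero-centered, which is one plausible reason the strictly positive Fibonacci weights underperform them.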

Conclusion

Fibonacci initialization with the golden ratio as the learning rate shows no improvement over the approach of the study mentioned in the article. Rather, results seem to deteriorate for this kind of regularized initialization.
