
merge master #265

Merged
merged 6 commits into SparkSnail:master on Aug 4, 2020

Conversation

SparkSnail
Owner

No description provided.

suiguoxin and others added 6 commits July 31, 2020 10:54. The squashed commit messages below track the SAPruner, NetAdaptPruner, ADMMPruner, and AutoCompressPruner work; hedged usage sketches for these pruners follow the log.
* init sapruner

* separate sapruners from other one-shot pruners

* update

* fix model params issue

* make the process runnable

* show evaluation result in example

* sort the sparsities and scale it

* fix rescale issue

* fix scale issue; add pruning history

* record the actual total sparsity

* fix sparsity 0/1 problem

* revert useless modif

* revert useless modif

* fix 0 pruning weights problem

* save pruning history in csv file

* fix typo

* remove check perm in Makefile

* use os path

* save config list in json format

* update analyze py; update docker

* update

* update analyze

* update log info in compressor

* init NetAdapt Pruner

* refine examples

* update

* fine tune

* update

* fix quote issue

* add code for imagenet integrity

* update

* use datasets.ImageNet

* update

* update

* add channel pruning in SAPruner; refine example

* update net_adapt pruner; add dependency constraint in sapruner (beta)

* update

* update

* update

* fix zero division problem

* fix typo

* update

* fix naive issue of NetAdaptPruner

* fix data issue for no-dependency modules

* add cifar10 vgg16 example

* update

* update

* fix folder creation issue; change lr for vgg exp

* update

* add save model arg

* fix model copy issue

* init related weights calc

* update analyze file

* NetAdaptPruner: use fine-tuned weights after each iteration; fix modules_wrapper iteration issue

* consider channel/filter cross pruning

* NetAdapt: consider previous op when calc total sparsity

* update

* use customized vgg

* add performance comparison plot

* fix netadaptPruner mask copy issue

* add resnet18 example

* fix example issue

* update experiment data

* fix bool arg parsing issue

* update

* init ADMMPruner

* ADMMPruner: update

* ADMMPruner: finish v1.0

* ADMMPruner: refine

* update

* AutoCompress init

* AutoCompress: update

* AutoCompressPruner: fix issues

* add test for auto pruners

* add doc for auto pruners

* fix link in md

* remove irrelevant files

* Clean code

* code clean

* fix pylint issue

* fix pylint issue

* rename admm & autoCompress param

* use abs link in doc

* reorder import to fix import issue: autocompress relies on speedup

* refine doc

* NetAdaptPruner: decay pruning step

* take changes from testing branch

* refine

* fix typo

* ADMMPruner: check base_algo together with config schema

* fix broken link

* doc refine

* ADMM: refine

* refine doc

* refine doc

* refine doc

* refine doc

* refine doc

* refine doc

* update

* update

* refactor AGP doc

* update

* fix optimizer issue

* fix comments: typo, rename AGP_Pruner

* fix torch.nn.Module issue; refine SA docstring

* fix typo

Co-authored-by: Yuge Zhang <scottyugochang@gmail.com>
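
The log opens with the SAPruner (simulated annealing) work. As a reading aid, here is a minimal usage sketch of the resulting SimulatedAnnealingPruner; the import path, argument names, and values assume the NNI v1.x API and are not taken from this PR's diff.

```python
# Hedged sketch: driving the SimulatedAnnealingPruner developed in this PR.
# Import path and argument names assume the NNI v1.x API; verify against
# your installed version. Evaluator body and sparsity target are placeholders.
from torchvision import models
from nni.compression.torch import SimulatedAnnealingPruner

model = models.vgg16(num_classes=10)  # echoes the "cifar10 vgg16 example" commit

def evaluator(model):
    """Placeholder: return validation accuracy of `model` on held-out data."""
    return 0.0

config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]
pruner = SimulatedAnnealingPruner(
    model, config_list, evaluator=evaluator,
    base_algo='l1',                  # per-iteration pruning criterion
    cool_down_rate=0.9,              # annealing schedule (assumed default)
    experiment_data_dir='./sa_log',  # pruning history saved as CSV, per the log
)
model = pruner.compress()
```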
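A long stretch of the log covers NetAdaptPruner, which prunes iteratively and, per the commits above, reuses fine-tuned weights after each iteration. A comparable sketch under the same NNI v1.x assumptions; both callbacks are placeholders.

```python
# Hedged sketch of NetAdaptPruner usage; argument names assume NNI v1.x.
from torchvision import models
from nni.compression.torch import NetAdaptPruner

model = models.resnet18(num_classes=10)  # echoes the "add resnet18 example" commit

def short_term_fine_tuner(model):
    """Placeholder: briefly fine-tune `model` in place after each pruning step."""

def evaluator(model):
    """Placeholder: return validation accuracy of `model`."""
    return 0.0

config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]
pruner = NetAdaptPruner(
    model, config_list,
    short_term_fine_tuner=short_term_fine_tuner,  # fine-tuned weights are reused
    evaluator=evaluator,                          # after each iteration, per the log
    sparsity_per_iteration=0.05,                  # pruning step (decayed, per the log)
)
model = pruner.compress()
```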
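The ADMMPruner commits describe an ADMM-based pruner whose base_algo is checked together with the config schema. A hedged sketch, again assuming the NNI v1.x signature, in which the pruner drives training rounds through a trainer callback; the callback body is a placeholder.

```python
# Hedged sketch of ADMMPruner usage; signature assumes NNI v1.x.
from torchvision import models
from nni.compression.torch import ADMMPruner

model = models.vgg16(num_classes=10)

def trainer(model, optimizer, criterion, epoch):
    """Placeholder: run one training epoch with the ADMM-augmented `criterion`."""

config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]
pruner = ADMMPruner(
    model, config_list, trainer=trainer,
    num_iterations=30,    # ADMM iterations (assumed default)
    training_epochs=5,    # epochs per iteration (assumed default)
    base_algo='l1',       # validated with the config schema, per the log
)
model = pruner.compress()
```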
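Finally, AutoCompressPruner chains the simulated-annealing and ADMM steps and applies model speedup between iterations, which is why the import order was changed ("autocompress relies on speedup"). A sketch under the same NNI v1.x assumptions; the dummy_input shape is illustrative, not from this PR.

```python
# Hedged sketch of AutoCompressPruner usage; signature assumes NNI v1.x.
# `dummy_input` is traced for the speedup pass between iterations.
import torch
from torchvision import models
from nni.compression.torch import AutoCompressPruner

model = models.vgg16(num_classes=10)

def trainer(model, optimizer, criterion, epoch):
    """Placeholder: one training epoch (used by the inner ADMM step)."""

def evaluator(model):
    """Placeholder: validation accuracy (used by the inner SA step)."""
    return 0.0

config_list = [{'sparsity': 0.5, 'op_types': ['Conv2d']}]
pruner = AutoCompressPruner(
    model, config_list, trainer=trainer, evaluator=evaluator,
    dummy_input=torch.randn(1, 3, 32, 32),
    num_iterations=3,  # outer SA -> ADMM -> speedup rounds (assumed default)
)
model = pruner.compress()
```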
SparkSnail merged commit 68abe2f into SparkSnail:master on Aug 4, 2020.