
add stein methods #2183

Merged: 19 commits merged into pymc-devs:master on May 22, 2017
Conversation

@ferrine (Member) commented May 13, 2017

More and more papers use Stein methods. Here I implement a convenient interface to them.
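For orientation, here is a minimal sketch of this kind of interface, using the SVGD entry point already listed in __all__ below; the toy model and the exact keyword names are illustrative, not taken from this diff:

import numpy as np
import pymc3 as pm

data = np.random.randn(100)  # toy dataset, for illustration only

with pm.Model():
    mu = pm.Normal('mu', mu=0., sd=1.)
    pm.Normal('obs', mu=mu, sd=1., observed=data)
    # SVGD evolves a set of particles toward the posterior
    svgd = pm.SVGD(n_particles=100)
    approx = svgd.fit(n=10000)

trace = approx.sample(1000)  # draws from the fitted particle approximation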

@@ -19,6 +19,7 @@
     'ADVI',
     'FullRankADVI',
     'SVGD',
+    'ASVGD',
Member:

what's ASVGD?

@ferrine (Member, Author):

Amortized Stein Variational Gradient Descent; it can be found here. It is a novel black-box algorithm for wild inference that doesn't require an explicit q(z).


from .stein import Stein
from .approximations import *
Member:

I think we removed all * imports in other parts of the code-base; we shouldn't reintroduce them here. cc @ColCarroll
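For example, the explicit form would look roughly like this; the names imported below are assumptions for illustration, not read off the module:

from .stein import Stein
from .approximations import MeanField, FullRank, Empirical  # illustrative names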

@ferrine (Member, Author) commented May 15, 2017

That's ready for review

@twiecki (Member) commented May 15, 2017

I've assigned @taku-y, in case he has time.

@ferrine (Member, Author) commented May 16, 2017

I think I'll add a note about the behavior of the sampler in high dimensions. After that, I see no problem with merging.
CC @taku-y

@ferrine (Member, Author) commented May 16, 2017

Important note:
For the BEST model with the FullRank approximation I found an exact solution with a large number of particles (1000). With a low number of particles the variance was underestimated, so I'm not sure it really works as proposed in the paper (theoretically). CC @lewisKit

[figure: blue is ASVGD on the FullRank approx, green is FullRankADVI]
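A sketch of the comparison described above, assuming the new ASVGD class follows the interface of the existing pymc3 variational objects; the constructor arguments and iteration counts are illustrative:

import pymc3 as pm

with best_model:  # the BEST model, assumed to be defined elsewhere
    # baseline: full-rank ADVI
    advi_approx = pm.FullRankADVI().fit(n=30000)
    # ASVGD driving a full-rank approximation; the `approx` keyword is an assumption
    asvgd_approx = pm.ASVGD(approx=pm.FullRank).fit(n=30000)

# per the note above, a large number of particles/draws (e.g. 1000)
# was needed before the ASVGD variance matched the exact solution
advi_trace = advi_approx.sample(1000)
asvgd_trace = asvgd_approx.sample(1000)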

@taku-y (Contributor) commented May 20, 2017

Here is the Travis log for TestSVGD.test_optimizer_minibatch_with_generator:

>       np.testing.assert_allclose(np.std(trace['mu']), np.sqrt(1. / d), rtol=0.4)
E       AssertionError: 
E       Not equal to tolerance rtol=0.4, atol=0
E       
E       (mismatch 100.0%)
E        x: array(0.13835382035008825)
E        y: array(0.09476178269858855)

I'm not sure if this error rate is acceptable.
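For reference, assert_allclose checks |actual - desired| <= atol + rtol * |desired|, and this run misses the bound by a small margin:

import numpy as np

x = 0.13835382035008825  # observed: np.std(trace['mu'])
y = 0.09476178269858855  # desired: np.sqrt(1. / d)
print(abs(x - y))    # ~0.0436, the actual deviation
print(0.4 * abs(y))  # ~0.0379, the allowed deviation at rtol=0.4
# 0.0436 > 0.0379, so the assertion fails, though not by a large factor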

@ferrine (Member, Author) commented May 20, 2017

I'm more concerned about poor ASVGD performance on simple problems. I'll increase the number of iterations for these tests and lower the relative tolerance.
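Something along these lines for the affected tests, reusing the names from the quoted assertion; the numbers are hypothetical:

approx = svgd.fit(n=20000)  # hypothetical: more iterations than before
trace = approx.sample(1000)
np.testing.assert_allclose(np.std(trace['mu']), np.sqrt(1. / d), rtol=0.5)  # looser tolerance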

@ferrine (Member, Author) commented May 21, 2017

@taku-y I've fixed the tests; they look better now (passing locally).

@taku-y (Contributor) commented May 22, 2017

The ASVGD results are comparable on the BEST example and the code looks good to me. I think it's ready to merge.

@taku-y merged commit fab6938 into pymc-devs:master on May 22, 2017
@twiecki (Member) commented May 22, 2017

Awesome! We should also add an example with a short description.

@ferrine (Member, Author) commented May 22, 2017

Sure, coming soon on my blog.

@ferrine (Member, Author) commented May 22, 2017

I'll point out the problems one can run into when using ASVGD.

@springcoil (Contributor) commented May 22, 2017 via email
