
Perhaps finally a decent LBFGS? #426

Merged
merged 4 commits into from
Dec 3, 2023

Conversation

henrymoss
Copy link
Contributor

I stole @daniel-dodd's "back-of-the-envelope" implementation of a direct scipy minimize interface and fiddled with it a bit.

It seems to work so much better (e.g. it can now optimize @thomaspinder's graph kernel).

It also means we can remove the jaxopt dependency.
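For readers curious what a "direct scipy minimize interface" looks like, here is a minimal sketch (illustrative names only, not the actual GPJax code): flatten the pytree of parameters into one vector, hand SciPy a jitted value-and-gradient function, then unflatten the optimum.

```python
import jax
import jax.numpy as jnp
import numpy as np
from jax.flatten_util import ravel_pytree
from scipy.optimize import minimize


def fit_scipy(objective, params, method="L-BFGS-B"):
    """Minimise a JAX objective over a pytree of params via scipy.optimize.minimize."""
    # Flatten the parameter pytree into a single 1-D vector for SciPy.
    flat0, unravel = ravel_pytree(params)

    # One jitted function returning both value and gradient w.r.t. the flat vector.
    value_and_grad = jax.jit(jax.value_and_grad(lambda x: objective(unravel(x))))

    def fun(x):
        # SciPy passes float64 numpy arrays; convert both ways at the boundary.
        v, g = value_and_grad(jnp.asarray(x))
        return float(v), np.asarray(g, dtype=np.float64)

    # jac=True tells SciPy that `fun` returns (value, gradient) as a pair.
    result = minimize(fun, np.asarray(flat0, dtype=np.float64),
                      method=method, jac=True)
    return unravel(result.x), result


# Example: minimise a simple quadratic over a dict of parameters.
params = {"a": jnp.array(3.0), "b": jnp.array(-2.0)}
opt, res = fit_scipy(lambda p: p["a"] ** 2 + p["b"] ** 2, params)
```

The appeal over a hand-rolled JAX optimiser is that SciPy's L-BFGS-B handles line search and convergence criteria for you; the only glue needed is the pytree flatten/unflatten and the float64 conversion at the SciPy boundary.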

@henrymoss
Copy link
Contributor Author

Something a bit weird happened when I updated my lock though!

@thomaspinder
Copy link
Collaborator

Looks like something has broken with the decoupled sampler?

@henrymoss
Copy link
Contributor Author

> Looks like something has broken with the decoupled sampler?

I think I managed to change the beartype version (when I deleted jaxopt). I've fiddled the poetry files back to normal now.

@henrymoss
Copy link
Contributor Author

Actually @thomaspinder it's still broken. I am unable to remove the jaxopt dependency in pyproject.toml without massive changes to my poetry.lock (hence the beartype version change).

How should I be doing this? At the moment I do:

  1. poetry remove jaxopt
  2. poetry update
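For context on the two steps above (assuming standard Poetry semantics; check your Poetry version's docs), `poetry remove` on its own already re-locks just the affected packages, while a bare `poetry update` re-resolves everything:

```shell
# Removes jaxopt from pyproject.toml and re-locks only what that change
# requires; unrelated pins (e.g. beartype) should stay put.
poetry remove jaxopt

# With no arguments, re-resolves *every* dependency to the newest version
# allowed by pyproject.toml, rewriting large parts of poetry.lock.
poetry update

# To refresh a single package instead, name it explicitly:
poetry update some-package
```

So the unwanted beartype bump most plausibly comes from the blanket `poetry update`, not from the removal itself.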

@thomaspinder
Copy link
Collaborator

I'm not 100% sure what's going on here. Why are you running `poetry update`, though?

@thomaspinder thomaspinder merged commit 9e4006c into main Dec 3, 2023
14 checks passed
@thomaspinder thomaspinder deleted the henry/new_optim branch December 3, 2023 14:15