
Replicate entire sklearn module structure #7

Open

phausamann opened this issue Dec 4, 2017 · 2 comments

@phausamann (Owner)

To make the package easier to use, it should replicate all sklearn estimators in their respective modules by decorating them with a generalized EstimatorWrapper.

Users could then use all sklearn estimators just by modifying their import statements, for example:

from sklearn_xarray.decomposition import PCA

which would yield an xarray-compatible PCA estimator.
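
A minimal sketch of what such a module could look like, assuming EstimatorWrapper can be applied directly to the sklearn class and lives at the package root (both the signature and the import path are assumptions for illustration):

# sklearn_xarray/decomposition.py -- hypothetical sketch
import sklearn.decomposition

from sklearn_xarray import EstimatorWrapper  # assumed import path

# Wrapping the sklearn class yields an estimator of the same name
# that accepts xarray DataArrays/Datasets.
PCA = EstimatorWrapper(sklearn.decomposition.PCA)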

@phausamann phausamann self-assigned this Dec 4, 2017
@phausamann phausamann added this to the Release 0.2 milestone Dec 4, 2017

@phausamann (Owner, Author) commented Dec 12, 2017

On second thought, class decorators seem like a bad idea, mostly because the resulting object is not picklable. It makes more sense for each estimator to subclass the corresponding wrapper, like:

import sklearn.decomposition

class PCA(TransformerWrapper):
    def __init__(self, **fit_params):
        # Pass the wrapped estimator class on to the wrapper base class.
        super(PCA, self).__init__(sklearn.decomposition.PCA, **fit_params)

and provide the full parameter list explicitly instead of **fit_params, as in the sketch below.
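
A sketch of what the explicit parameter list could look like, mirroring part of sklearn.decomposition.PCA's signature (the parameters shown are illustrative, not the full set):

import sklearn.decomposition

class PCA(TransformerWrapper):
    def __init__(self, n_components=None, whiten=False, svd_solver='auto'):
        # Explicit parameters keep the signature introspectable,
        # e.g. for get_params/set_params and estimator cloning.
        super(PCA, self).__init__(
            sklearn.decomposition.PCA,
            n_components=n_components,
            whiten=whiten,
            svd_solver=svd_solver,
        )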

@phausamann (Owner, Author)

The benefit of this approach would be that each estimator could inherit the methods it needs from the corresponding mixin, e.g.:

class PCA(_CommonEstimatorWrapper, _ImplementsTransformMixin, _ImplementsScoreMixin):
    ...

In some cases, the class could also implement wrappers for 'exotic' methods that don't warrant a dedicated mixin, as in the sketch below.
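
A hypothetical sketch of the composition, assuming _CommonEstimatorWrapper accepts the wrapped estimator class like TransformerWrapper above; inverse_transform stands in for such an 'exotic' method, and _call_fitted is an assumed helper, not an existing API:

import sklearn.decomposition

class PCA(_CommonEstimatorWrapper, _ImplementsTransformMixin, _ImplementsScoreMixin):
    def __init__(self, n_components=None):
        super(PCA, self).__init__(
            sklearn.decomposition.PCA, n_components=n_components)

    def inverse_transform(self, X):
        # Wrapped directly on the class rather than via a dedicated mixin;
        # _call_fitted is a hypothetical helper that delegates to the fitted
        # sklearn estimator and restores the xarray coordinates.
        return self._call_fitted('inverse_transform', X)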
