
Setting dedicated warehouse and metastore temp-directory doesn't work #339

Open
qnob opened this issue May 21, 2021 · 0 comments

Comments

qnob commented May 21, 2021

Hello,
I've noticed that when I run one of my tests, the default warehouse directory (the current directory) is used. The same goes for the metastore directory. These directories are reused by every Spark session, so my tests end up having dependencies between each other.

I've found out that the framework actually tries to handle this problem by setting these directories, among other settings. See

DataFrameSuiteBaseLike#sqlBeforeAllTestCases

However, these settings are never applied, because by the time this configuration is created, the SparkContext already exists, and any settings applied afterwards are ignored. More precisely, SharedSparkContext#beforeAll is called before DataFrameSuiteBaseLike#sqlBeforeAllTestCases.
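To illustrate, here is a minimal sketch of the kind of per-suite configuration that would need to reach the SparkConf before the context is created. `IsolatedWarehouseConf` is a hypothetical helper name, not part of the library; the two config keys are the standard Spark/Hive ones that control the warehouse and the embedded Derby metastore locations.

```scala
import java.nio.file.Files

// Hypothetical helper: builds fresh, per-suite warehouse and metastore
// settings. In a real suite these entries would have to be set on the
// SparkConf BEFORE the SparkContext is created, which is exactly what
// the current beforeAll ordering prevents.
object IsolatedWarehouseConf {
  def build(): Map[String, String] = {
    val warehouse = Files.createTempDirectory("spark-warehouse").toString
    val metastore = Files.createTempDirectory("metastore-db").toString
    Map(
      // fresh warehouse per suite, instead of ./spark-warehouse
      "spark.sql.warehouse.dir" -> warehouse,
      // fresh Derby metastore per suite, instead of ./metastore_db
      "javax.jdo.option.ConnectionURL" ->
        s"jdbc:derby:;databaseName=$metastore;create=true"
    )
  }
}
```

Because `build()` creates new temp directories on every call, two suites configured this way would no longer share state through the warehouse or metastore.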

Unfortunately, I haven't found a way to fix it, so I won't be able to open a PR. Maybe someone else can help out.

Thanks.
Kuno
