
A lot of exceptions during cleanup #307

Open

Salatich opened this issue Nov 26, 2019 · 0 comments

Salatich commented Nov 26, 2019

I've noticed that some extra folders like "spark-uuid" are created in System.getProperty("java.io.tmpdir"), but some of them are never deleted. The problem is somewhere around https://github.com/holdenk/spark-testing-base/blob/79eef40cdab48ee7aca8902754e3c456f569eea6/core/src/main/1.3/scala/com/holdenkarau/spark/testing/Utils.scala#L109: it returns true, yet the file is not deleted.

I read a bit about java.io.File#delete: it seems a file can fail to be deleted for two general reasons, missing permissions or the file being held open by some process. Since some of the folders are deleted, permissions are probably not the cause. It's really annoying to accumulate megabytes in the temp directory.
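One difficulty with java.io.File#delete is that it only returns a boolean, so the actual reason for the failure is lost. As a diagnostic sketch (not part of spark-testing-base; `DeleteDiagnostics` and `deleteRecursively` are hypothetical names), the NIO equivalent java.nio.file.Files.delete throws a descriptive IOException (e.g. AccessDeniedException or DirectoryNotEmptyException), which would at least reveal why a given temp folder survives cleanup:

```scala
import java.nio.file.{Files, Path}
import java.util.Comparator

// Hypothetical helper: delete a directory tree using Files.delete, which
// throws a descriptive IOException instead of File#delete's bare boolean,
// so the failure reason for each path becomes visible.
object DeleteDiagnostics {
  def deleteRecursively(root: Path): Unit = {
    if (Files.exists(root)) {
      // Walk depth-first (reverse order) so children are deleted
      // before their parent directories.
      Files.walk(root)
        .sorted(Comparator.reverseOrder[Path]())
        .forEach { p: Path =>
          try Files.delete(p)
          catch {
            case e: java.io.IOException =>
              // Surface exactly which path failed and why,
              // rather than a silent `false`.
              System.err.println(s"Could not delete $p: $e")
          }
        }
    }
  }
}
```

Running this against one of the leftover "spark-uuid" folders should print, for each undeletable entry, whether the cause is a permission problem or a file still being held open.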

Spark version 2.2.1, Scala version 2.11.

Any thoughts here?
