
Add maximumSampleCount limit to CompositeDeepScanLine #1230

Merged

Conversation

peterhillman (Contributor):

Address https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=44084

This changes existing behavior: by default, CompositeDeepScanLine now disables deep compositing of scanlines containing more than 2^25 samples. This class is used by the InputPart, InputFile and RgbaInputFile APIs to composite deep scanline images automatically, so they can be read as if they were regular scanline images. Compositing scanlines with very large sample counts takes a long time and causes unexpectedly large memory allocations. This change does not affect deep images read using the DeepScanLineInputFile or DeepScanLineInputPart APIs.

The default limit is equivalent to 8k samples in every pixel of a 4k-wide image, which should be high enough not to affect most genuine deep images, yet low enough to prevent excessive memory allocation when reading damaged or malicious deep files.

A new static function, CompositeDeepScanLine::setMaximumSampleCount(), can be used to adjust the limit or disable the check completely.
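For illustration, a minimal usage sketch (not part of the PR): it assumes the standard OpenEXR 3.x headers and the usual RgbaInputFile reading pattern, and the chosen limit value is arbitrary. Note that the default value and the "no limit" sentinel were still under discussion in the review below.

```cpp
// Sketch: raise the compositing sample-count limit before reading a deep
// scanline file through the flat RgbaInputFile API. Header names and the
// limit value here are illustrative assumptions.
#include <ImfCompositeDeepScanLine.h>
#include <ImfRgbaFile.h>
#include <ImfArray.h>
#include <ImathBox.h>
#include <cstdint>

void
readDeepScanLineAsFlat (const char fileName[])
{
    // New static setter introduced by this PR; here the limit is raised
    // well above the default for a file known to be sample-heavy.
    Imf::CompositeDeepScanLine::setMaximumSampleCount (int64_t (1) << 28);

    Imf::RgbaInputFile file (fileName); // deep parts are composited on the fly
    Imath::Box2i dw = file.dataWindow ();
    int width  = dw.max.x - dw.min.x + 1;
    int height = dw.max.y - dw.min.y + 1;

    // Canonical RGBA read: the composited result looks like a flat image.
    Imf::Array2D<Imf::Rgba> pixels (height, width);
    file.setFrameBuffer (&pixels[0][0] - dw.min.x - dw.min.y * width, 1, width);
    file.readPixels (dw.min.y, dw.max.y);
}
```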

Signed-off-by: Peter Hillman <peterh@wetafx.co.nz>

kdt3rd (Contributor) left a comment:


LGTM, but see questions

@@ -427,6 +427,23 @@ LineCompositeTask::execute ()

} // namespace


namespace {
int64_t maximumSampleCount = 1<<25;
kdt3rd (Contributor):

Should we prefer the same behaviour as for the max image / tile size and have this default to 0 to indicate no limit, and in the fuzz test initialize it to 64 or something to keep it low there, since those run on tiny machines, the way we do with the image size?

peterhillman (Contributor, Author):

I'm happy to change the logic if that's the consensus. I had considered the alternative: that the limits are 'off by default' and enabled if reduceMemory or reduceTime are enabled in CheckFile().

My thinking for making it 'on by default' was that it may not be widely known that deep images are supported by the InputFile API, and that certain files would take much more time and memory to read than OpenEXR users might expect. By forcing the change on, we protect code that was never intended to support deep images, and the limit is high enough that it's unlikely to inconvenience any workflows. (If you generally have more than 1<<25 samples on a scanline you probably should read the file using the deep API, and probably should have written the file using small tiles instead of scanlines)

As you say, the disadvantage of the 'on by default' is that it is inconsistent both with the existing image and tile size limits, and with older versions of OpenEXR.
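To make the two conventions concrete, here is a small placeholder illustration (not the OpenEXR implementation): one check treats a negative value as the "no limit" sentinel with the limit on by default, the other treats 0 as "no limit" with the limit off by default, as for the max image / tile size.

```cpp
// Placeholder illustration of the two conventions under discussion.
#include <cstdint>

// Initial patch: the limit defaults to 1 << 25 and a negative value means
// "no limit", so the check is on by default.
bool exceedsLimitDefaultOn (int64_t sampleCount, int64_t limit = int64_t (1) << 25)
{
    return limit >= 0 && sampleCount > limit;
}

// Suggested alternative, matching the max image / tile size checks: the limit
// defaults to 0, 0 means "no limit", and the fuzz tests would set a small
// positive value to keep memory use low on tiny machines.
bool exceedsLimitDefaultOff (int64_t sampleCount, int64_t limit = 0)
{
    return limit > 0 && sampleCount > limit;
}
```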

@@ -508,6 +525,15 @@ CompositeDeepScanLine::readPixels (int start, int end)
overall_sample_count += total_sizes[ptr];
}

if (maximumSampleCount >=0 && overall_sample_count > maximumSampleCount)
kdt3rd (Contributor):

Is 0 a valid max? If you pay attention to my above suggestion, this should probably be > 0.

peterhillman (Contributor, Author):

My reasoning was that setting the limit to 0 would be a way of completely disabling the transparent reading of deep images, and we might want to add better checks for this later on, so exceptions are thrown from the constructor rather than from readPixels if maximumSampleCount==0. The alternative approach might be to encourage code to check the type header attribute before reading pixels.
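As a rough sketch of the behaviour described above (placeholder names, not the merged code): a limit of 0 acts as a switch that disables transparent deep compositing entirely, and the refusal happens at construction time rather than inside readPixels.

```cpp
// Placeholder sketch of the "0 disables transparent deep reading" idea.
#include <cstdint>
#include <stdexcept>

class DeepCompositeSketch
{
public:
    explicit DeepCompositeSketch (int64_t maximumSampleCount)
        : _limit (maximumSampleCount)
    {
        // Refuse up front: code that never intended to read deep data finds
        // out in the constructor, not partway through readPixels().
        if (_limit == 0)
            throw std::invalid_argument (
                "transparent compositing of deep scanline images is disabled");
    }

    void checkSampleCount (int64_t overallSampleCount) const
    {
        // A negative limit means "no limit"; otherwise cap the total number
        // of samples composited for one batch of scanlines.
        if (_limit >= 0 && overallSampleCount > _limit)
            throw std::runtime_error (
                "deep scanline sample count exceeds maximumSampleCount");
    }

private:
    int64_t _limit;
};
```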

cary-ilm (Member):

@kdt3rd and @peterhillman, where do we stand with this? Are we waiting on a change, or did we conclude this is good as is?

peterhillman (Contributor, Author):

I'll make the changes @kdt3rd suggested, unless anyone has strong opinions about doing it differently

cary-ilm (Member):

@kdt3rd, is this OK with you?

cary-ilm (Member) left a comment:

LGTM

cary-ilm (Member):

@kdt3rd says this is good to go.

cary-ilm merged commit 8587f4e into AcademySoftwareFoundation:main on Mar 25, 2022.
cary-ilm mentioned this pull request on Apr 2, 2022.
cary-ilm pushed a commit to cary-ilm/openexr that referenced this pull request Apr 5, 2022
…eFoundation#1230)

* Add maximumSampleCount limit to CompositeDeepScanLine

Signed-off-by: Peter Hillman <peterh@wetafx.co.nz>

* maximumSampleCount checks off by default; enabled in checkOpenEXRFile

Signed-off-by: Peter Hillman <peterh@wetafx.co.nz>
Signed-off-by: Cary Phillips <cary@ilm.com>
cary-ilm pushed a commit that referenced this pull request Apr 7, 2022
* Add maximumSampleCount limit to CompositeDeepScanLine

Signed-off-by: Peter Hillman <peterh@wetafx.co.nz>

* maximumSampleCount checks off by default; enabled in checkOpenEXRFile

Signed-off-by: Peter Hillman <peterh@wetafx.co.nz>
Signed-off-by: Cary Phillips <cary@ilm.com>
cary-ilm added the v3.1.5 label on Jul 9, 2023.