
Introduce RealtimeSanitizer (RADSan) real-time safety checking #121

Conversation

davidtrevelyan (Contributor)

This PR does the following:

  • Introduces a new GitHub Actions workflow that builds and tests RTNeural using RADSan
  • Creates a new config.h header file, in which all RTNeural preprocessor definitions are contained
    • The existing definitions like RTNEURAL_NAMESPACE etc. have also been moved to config.h in an attempt to be clean, but I'm very happy to move them back to the RTNeural.h header if that's preferred!
    • Introduces the RADSan [[clang::realtime]] function attribute, configured behind a new RTNEURAL_REALTIME macro
  • Enables the user (and CI!) to configure whether RADSan is being used with the cmake option -DRTNEURAL_ENABLE_RADSAN=ON
  • Adds the configured attribute RTNEURAL_REALTIME to the real-time methods of Model and ModelT
  • Fixes my earlier mistake that prevented failing tests from being output to the terminal in CI jobs

A couple of questions from me:

  1. Is RTNEURAL_REALTIME a good enough name? Happy to change it to something else :)
  2. Would it be a good idea to add RTNEURAL_REALTIME to the real-time methods of the Layer implementations?
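Enabling the checker from the command line would then look something like the sketch below. The `-DRTNEURAL_ENABLE_RADSAN=ON` option is named in this PR; the assumption here is that a RADSan-enabled clang toolchain is on the PATH and that the project's tests run under ctest.

```shell
# Hypothetical usage sketch: configure and build RTNeural with RADSan
# real-time safety checking enabled (requires a RADSan-enabled clang).
cmake -B build -DRTNEURAL_ENABLE_RADSAN=ON
cmake --build build

# Run the test suite; RADSan reports any real-time violations at runtime.
ctest --test-dir build
```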

codecov-commenter commented Nov 29, 2023

Codecov Report

Attention: 14 lines in your changes are missing coverage. Please review.

Comparison: base (04cb333) 95.70% vs. head (c3103c9) 95.70%.

Files                                                 Patch %   Lines
RTNeural/batchnorm/batchnorm.h                        50.00%    2 Missing ⚠️
RTNeural/batchnorm/batchnorm2d.h                      50.00%    2 Missing ⚠️
RTNeural/batchnorm/batchnorm2d_eigen.h                50.00%    2 Missing ⚠️
RTNeural/batchnorm/batchnorm2d_xsimd.h                50.00%    2 Missing ⚠️
RTNeural/batchnorm/batchnorm_eigen.h                  50.00%    2 Missing ⚠️
RTNeural/batchnorm/batchnorm_xsimd.h                  50.00%    2 Missing ⚠️
RTNeural/conv1d_stateless/conv1d_stateless_eigen.h    50.00%    1 Missing ⚠️
RTNeural/conv1d_stateless/conv1d_stateless_xsimd.h    50.00%    1 Missing ⚠️
Additional details and impacted files
@@           Coverage Diff           @@
##             main     #121   +/-   ##
=======================================
  Coverage   95.70%   95.70%           
=======================================
  Files          58       58           
  Lines        3891     3891           
=======================================
  Hits         3724     3724           
  Misses        167      167           


@jatinchowdhury18 (Owner) left a comment

Looks good to me! Moving the macro handling to config.h is nice, and I think RTNEURAL_REALTIME is probably the right name for the attribute macro.

I guess we could add the realtime attribute to the layer-level forward methods. I don't think it would make much difference for the testing, but maybe it could be useful for folks who are using RTNeural layers without the "Model" API?

@davidtrevelyan (Contributor, Author)

Nice one, thanks @jatinchowdhury18 - agreed! I've just pushed up some additions of the attribute to the Layer methods that I think are intended to be real-time safe. Would you mind quickly double checking that I've understood them correctly, and haven't added the attribute to any methods that aren't intended to be real-time safe?

@jatinchowdhury18 (Owner)

Sure thing! Your additions look good to me. I hadn't really thought much about whether the methods used to set the layer weights were realtime-safe, but it's cool that they are! I guess that could be important if there's a situation where someone wants to "modulate" their layer weights or something like that.

I'm ready to merge this PR if you are!

@davidtrevelyan (Contributor, Author)

Great! Yeah - super cool that the weights can be modulated - kind of like circuit bending for ML 🙃

Also, it could equally be really useful for controlling any highly interpretable differentiable elements that may live in the network, like a differentiable fader or something. Lots to play with, and I think RTNeural is really well placed in offering that option to creative developers!

I'm happy for this PR to be merged if you are! Thanks for the review 🙏

@jatinchowdhury18 jatinchowdhury18 merged commit 0485da9 into jatinchowdhury18:main Dec 1, 2023
21 of 22 checks passed