fix stdev calc for fixed variance #121

Open
rpreen wants to merge 1 commit into master
Conversation


rpreen commented on May 13, 2024

This PR fixes the standard deviation calculation when using a fixed variance.

The existing code is a faithful copy of:
https://github.com/carlini/privacy/blob/afe6ea7699a93899011d47aa13c4bf9a19c0c8ad/research/mi_lira_2021/plot.py#L82-L84

However, the @carlini paper seems to suggest that it should instead be calculated as in this PR?

With a small number of shadow models, we can improve the attack considerably by estimating the variances $\sigma_{\text{in}}^2$ and $\sigma_{\text{out}}^2$ of model confidences in Algorithm 1 globally rather than for each individual example.
That is, we still estimate the means $\mu_{\text{in}}$ and $\mu_{\text{out}}$ separately for each example, but we estimate the variance $\sigma_{\text{in}}^2$ (respectively $\sigma_{\text{out}}^2$) over the shadow models' confidences on all training set members (respectively non-members).

For a small number of shadow models ($<64$), estimating a global variance outperforms our general attack that estimates the variance for each example separately. For a larger number of models, our full attack is stronger: with $1024$ shadow models for example, the TPR decreases from $8.4\%$ to $7.9\%$ by using a global variance.

And, as further stated in the Appendix:

(2) estimate the means $\mu_{\text{in}}, \mu_{\text{out}}$ for each example, but estimate global variances $\sigma_{\text{in}}^2, \sigma_{\text{out}}^2$
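For reference, a minimal sketch of the estimation as the quoted text describes it: per-example means, but global standard deviations in the fixed-variance case. This is not the PR's actual diff; it assumes NumPy arrays of shadow-model confidences and borrows the `dat_in`/`dat_out` naming from the linked `plot.py`.

```python
import numpy as np
from scipy.stats import norm


def lira_scores(dat_in, dat_out, target_conf, fix_variance=False):
    """Per-example likelihood-ratio scores for a target model's confidences.

    dat_in / dat_out: shadow-model confidences, shape (n_examples, n_shadow_models).
    target_conf: target model's confidences, shape (n_examples,).
    """
    # The means are always estimated separately for each example.
    mean_in = np.mean(dat_in, axis=1)
    mean_out = np.mean(dat_out, axis=1)

    if fix_variance:
        # Global (fixed) variance: sigma_in^2 over all members' confidences
        # and sigma_out^2 over all non-members' confidences.
        std_in = np.std(dat_in)
        std_out = np.std(dat_out)
    else:
        # Full attack: per-example variances.
        std_in = np.std(dat_in, axis=1)
        std_out = np.std(dat_out, axis=1)

    # Likelihood ratio of the target model's confidence under the two Gaussians.
    log_p_in = norm.logpdf(target_conf, mean_in, std_in + 1e-30)
    log_p_out = norm.logpdf(target_conf, mean_out, std_out + 1e-30)
    return log_p_in - log_p_out
```

The key point is that in the `fix_variance` branch both standard deviations are taken over the whole array, with $\sigma_{\text{out}}$ computed from the non-members' confidences (`dat_out`) rather than from the members'.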

@rpreen rpreen closed this May 13, 2024
@rpreen rpreen reopened this May 13, 2024