Randomized Smoothing Variations Implementation #2218
Conversation
Signed-off-by: Farhan Ahmed <Farhan.Ahmed@ibm.com>
Codecov Report
@@ Coverage Diff @@
## dev_1.16.0 #2218 +/- ##
==============================================
- Coverage 85.61% 81.36% -4.25%
==============================================
Files 308 313 +5
Lines 27448 27782 +334
Branches 5044 5082 +38
==============================================
- Hits 23499 22606 -893
- Misses 2669 3932 +1263
+ Partials 1280 1244 -36
Hi @f4str Thank you very much for your pull request! The code looks great! I have added a few suggestions for minor improvements, what do you think?
loss: "torch.nn.modules.loss._Loss",
input_shape: Tuple[int, ...],
nb_classes: int,
optimizer: Optional["torch.optim.Optimizer"] = None,  # type: ignore
Is this type ignore required? The None should be covered by the Optional type.
The `type: ignore` is not required. The original `randomized_smoothing/pytorch.py` had it, so I assumed it was needed. It can be removed from all variations, including the original.
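To illustrate the point being made here: a parameter annotated `Optional[T]` already admits `None`, so type checkers such as mypy accept a `None` default without any `# type: ignore` comment. A minimal sketch (the class and parameter are hypothetical stand-ins, using `str` instead of `torch.optim.Optimizer` to stay self-contained):

```python
from typing import Optional

class Classifier:
    """Hypothetical stand-in for a randomized smoothing estimator."""

    # Optional[...] means "str or None", so the None default needs no ignore.
    def __init__(self, optimizer: Optional[str] = None) -> None:
        self.optimizer = optimizer

clf_default = Classifier()        # None default type-checks cleanly
clf_explicit = Classifier("sgd")  # an explicit value also works
```

Running mypy on this file reports no errors, which is why the `# type: ignore` comments can be dropped across all the variations.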
loss: "torch.nn.modules.loss._Loss",
input_shape: Tuple[int, ...],
nb_classes: int,
optimizer: Optional["torch.optim.Optimizer"] = None,  # type: ignore
Suggested change:
- optimizer: Optional["torch.optim.Optimizer"] = None,  # type: ignore
+ optimizer: Optional["torch.optim.Optimizer"] = None,
loss: "torch.nn.modules.loss._Loss",
input_shape: Tuple[int, ...],
nb_classes: int,
optimizer: Optional["torch.optim.Optimizer"] = None,  # type: ignore
Suggested change:
- optimizer: Optional["torch.optim.Optimizer"] = None,  # type: ignore
+ optimizer: Optional["torch.optim.Optimizer"] = None,
) -> Tuple["torch.Tensor", "torch.Tensor"]:
    """
    The authors' implementation of the SmoothMixPGD attack.
    Code modified from https://github.com/jh-jeong/smoothmix/code/train.py
Let's include their MIT License text (https://github.com/jh-jeong/smoothmix/blob/main/LICENSE) at the top of our new file after ART's MIT license text.
Suggested change:
- Code modified from https://github.com/jh-jeong/smoothmix/code/train.py
+ Code modified from https://github.com/jh-jeong/smoothmix/blob/main/code/train.py
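Per the review request above, the new SmoothMix file would carry both MIT license texts at the top. A sketch of how the file header might look (the copyright lines are placeholders; the actual file should reproduce both full license texts verbatim):

```python
# MIT License
#
# Copyright (C) The Adversarial Robustness Toolbox (ART) Authors 2023
# ... (full ART MIT license text) ...
#
# MIT License
#
# Copyright (c) the SmoothMix authors (https://github.com/jh-jeong/smoothmix)
# ... (full SmoothMix MIT license text) ...
```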
Signed-off-by: Farhan Ahmed <Farhan.Ahmed@ibm.com>
Hi @beat-buesser thank you for the review. I've addressed your comments and made the corresponding changes. Feel free to let me know if there are any other issues.
Hi @f4str Thank you very much for extending the randomised smoothing capabilities in ART! The changes look good to me.
Description
Implement the following variations of randomized smoothing:

Additionally implement the following changes:
- Change the tests from `unittest` to `pytest`
- (… `TensorFlowV2DeRandomizedSmoothing`)
- Add a `verbose` parameter to all randomized smoothing classifiers to hide/show the progress bar

This PR is continued and consolidated from
Links to Paper and Authors' code repository:
Type of change
Please check all relevant options.
Testing
Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.
Test Configuration:
Checklist