
RuntimeError in ProjectedGradientDescentPyTorch due to Device Mismatch #2553

Closed
windshadow233 opened this issue Jan 13, 2025 · 1 comment · Fixed by #2558

@windshadow233

Describe the bug
A RuntimeError occurs in the _projection method of ProjectedGradientDescentPyTorch due to a device mismatch: torch.ones(1) (created on the CPU by default) is passed to torch.minimum together with a GPU tensor derived from values_tmp.
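
For illustration, a standalone snippet that reproduces the same mismatch outside ART (values_tmp and eps mirror the names used in _projection but are placeholders here; a CUDA device is assumed):

    import torch

    # Placeholder standing in for the GPU tensor handled by _projection.
    values_tmp = torch.rand(4, 10, device="cuda")
    values_norm = values_tmp.norm(p=2, dim=1, keepdim=True)
    eps = 0.3

    # torch.minimum does not move tensors across devices, so mixing the
    # default-CPU torch.ones(1) with a CUDA tensor raises the RuntimeError.
    torch.minimum(torch.ones(1), torch.tensor(eps).to(values_tmp.device) / values_norm)
    # RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!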

To Reproduce
Steps to reproduce the behavior (a minimal reproduction sketch follows the list):

  1. Create an instance of PyTorchClassifier on the GPU.
  2. Create an instance of ProjectedGradientDescent to attack the PyTorchClassifier model.
  3. Try to generate adversarial examples with the attack.generate method.
  4. The call raises: "RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!"
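
A hedged reproduction sketch; the model and data below are placeholders, and norm=2 is chosen because it appears to exercise the values_norm branch shown in the traceback:

    import numpy as np
    import torch
    import torch.nn as nn

    from art.attacks.evasion import ProjectedGradientDescent
    from art.estimators.classification import PyTorchClassifier

    # Placeholder model; any PyTorch classifier placed on cuda:0 should do.
    model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10)).cuda()

    classifier = PyTorchClassifier(
        model=model,
        loss=nn.CrossEntropyLoss(),
        input_shape=(1, 28, 28),
        nb_classes=10,
        device_type="gpu",  # classifier tensors live on cuda:0
    )

    # L2 attack so that _projection computes values_norm.
    attack = ProjectedGradientDescent(estimator=classifier, norm=2, eps=0.3, eps_step=0.1, max_iter=10)

    x = np.random.rand(8, 1, 28, 28).astype(np.float32)
    x_adv = attack.generate(x=x)  # raises the RuntimeError on ART 1.18.2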

Expected behavior
The adversarial data generation process should complete without errors.

Error Messages

Traceback (most recent call last):                                                                                                                                              
  [...]
  File "adversarial-robustness-toolbox/art/attacks/evasion/projected_gradient_descent/projected_gradient_descent.py", line 202, in generate
    return self._attack.generate(x=x, y=y, **kwargs)
  File "adversarial-robustness-toolbox/art/attacks/evasion/projected_gradient_descent/projected_gradient_descent_pytorch.py", line 223, in generate
    adv_x[batch_index_1:batch_index_2] = self._generate_batch(
  File "adversarial-robustness-toolbox/art/attacks/evasion/projected_gradient_descent/projected_gradient_descent_pytorch.py", line 284, in _generate_batch
    adv_x = self._compute_pytorch(
  File "adversarial-robustness-toolbox/art/attacks/evasion/projected_gradient_descent/projected_gradient_descent_pytorch.py", line 451, in _compute_pytorch
    perturbation = self._projection(x_adv - x_init, eps, self.norm)
  File "adversarial-robustness-toolbox/art/attacks/evasion/projected_gradient_descent/projected_gradient_descent_pytorch.py", line 500, in _projection
    values_norm == 0, torch.minimum(torch.ones(1), torch.tensor(eps).to(values_tmp.device) / values_norm)
RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu!

Suggested solution

Replace line 500 in the _projection function in "adversarial-robustness-toolbox/art/attacks/evasion/projected_gradient_descent/projected_gradient_descent_pytorch.py":

From

values_norm == 0, torch.minimum(torch.ones(1), torch.tensor(eps).to(values_tmp.device) / values_norm)

To

values_norm == 0, torch.minimum(torch.ones(1).to(values_tmp.device), torch.tensor(eps).to(values_tmp.device) / values_norm)
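
For illustration, the corrected expression in isolation, with both torch.minimum operands kept on values_tmp's device (the variables are placeholders, not the full _projection implementation; a CUDA device is assumed):

    import torch

    # Placeholders mirroring the names used in _projection.
    values_tmp = torch.rand(4, 10, device="cuda")
    values_norm = values_tmp.norm(p=2, dim=1, keepdim=True)
    eps = 0.3

    # With torch.ones(1) moved to the same device, torch.minimum no longer mixes cuda:0 and cpu.
    factor = torch.minimum(
        torch.ones(1).to(values_tmp.device),
        torch.tensor(eps).to(values_tmp.device) / values_norm,
    )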

System information (please complete the following information):

  • OS: Linux-5.15.0-105-generic-x86_64-with-glibc2.31
  • Python version: 3.9.18
  • ART version: 1.18.2
  • PyTorch version: 2.5.1+cu118
@beat-buesser beat-buesser self-assigned this Jan 16, 2025
@beat-buesser beat-buesser added the bug Something isn't working label Jan 16, 2025
@beat-buesser beat-buesser added this to the ART 1.19.1 milestone Jan 16, 2025
@beat-buesser (Collaborator)

Hi @windshadow233, thank you very much for reporting this issue. We'll include a fix in ART 1.19.1.

@beat-buesser beat-buesser moved this to In Progress in ART 1.19.1 Jan 16, 2025
@beat-buesser beat-buesser moved this from In Progress to Done in ART 1.19.1 Jan 22, 2025
@beat-buesser beat-buesser closed this as completed by moving to Done in ART 1.19.1 Jan 22, 2025